Welcome to the Science Year Books. This section reviews the science news from 1997 to the present. Each year is broken into months and each month is broken into different stories, each reviewed on its own page. Read about medical, genetic, weather, technology, space, evolution, physics, conservation and earth sciences news. This is a fascinating and absorbing overview for those interested in what is happening in the scientific world of today.
#
"Science Year Books Years",2,0,0,0
\J1997 Science in Review\j
\J1998 Science in Review\j
\J1999 Science in Review\j
\J2000 Science in Review\j
#
"1997 Science in Review",3,0,0,0
\JJanuary 1997 Science Review\j
\JFebruary 1997 Science Review\j
\JMarch 1997 Science Review\j
\JApril 1997 Science Review\j
\JMay 1997 Science Review\j
\JJune 1997 Science Review\j
\JJuly 1997 Science Review\j
\JAugust 1997 Science Review\j
\JSeptember 1997 Science Review\j
\JOctober 1997 Science Review\j
\JNovember 1997 Science Review\j
\JDecember 1997 Science Review\j
#
"January 1997 Science Review",4,0,0,0
\JTime machines get a step closer\j
\JNew drugs\j
\JHumans older than thought?\j
\JEuropa is seismic\j
\JMoons full of life?\j
\JBlack holes in every galaxy?\j
\JNobel winners (1996)\j
\JMathematics update\j
\JObituaries for January 97\j
\JArchaeological finding in Beijing\j
\JShort science updates (Jan 1997)\j
\JGreen cells for a greener Australia?\j
\JPhotovoltaic cell--what is it?\j
\JPhotovoltaic cells--practical applications\j
\JCar navigation system update\j
\JNavigation solutions for the car\j
#
"February 1997 Science Review",5,0,0,0
\JHubble gets a facelift\j
\JA new radio telescope in space\j
\JSheep cloning a success\j
\JLead linked to bad teeth\j
\JTransgenic corn in France\j
\JKeeping the worms down\j
\JPlants can shout for help?\j
\JCancer gene--how it gets nasty\j
\JBirds, dinosaurs, and the big asteroid\j
\JStrong spider web\j
\JSpy satellite photos\j
\JMobile phones and car accidents\j
\JFaster modems on the way\j
\JEncryption\j
\JMorse code disappearing\j
\JCarbon on the ocean floor\j
\JAncient chewing gum\j
\JBlack hole discovered in February\j
\JA new particle and a fifth force?\j
\JObituaries for February 97\j
#
"March 1997 Science Review",6,0,0,0
\JGlobal warming--impact on Antarctica\j
\JCloned monkeys\j
\JArchaeological findings in Germany\j
\JSeeing yellow\j
\JSmall galaxy disappears\j
\JGPS and the Himalayas\j
\JUnder two suns?\j
\JEarth getting hotter?\j
\JIce Age an earlier phenomenon\j
\JLife on Martian meteorite?\j
\JThermoluminescence dating doubts\j
\JWeird flier uncovered\j
\JEye evolution\j
\JHepatitis G--not guilty?\j
\JFlu virus of 1918 identified\j
\JObituary for March 97\j
\JComing events (Mar 97)\j
#
"April 1997 Science Review",7,0,0,0
\JOne hundred years of electrons?\j
\JEuropa's ocean\j
\JAtomic force microscopes\j
\JJapan's earthquake predictors don't work\j
\JPeer pressure and gene pressure?\j
\JA gene for Alzheimer's?\j
\JBreast cancer genes\j
\JCancer gene nabbed\j
\JCrash victims, new method to identify\j
\JFish antifreeze gene\j
\JLife is fractal\j
\JPlants on the march?\j
\JClean cars get further away\j
\JNew ancestral apes?\j
\JAdaptive radiation\j
\JA snake with legs?\j
\JJurassic Park ruled out\j
\JAntibiotic-resistant bug found\j
\JGood bugs in our computer displays?\j
\JBacteria on a blue chip\j
\JKasparov vs Deep Blue\j
\JCheetahs slower\j
\JHubble trouble?\j
\JVolcano erupts in Australia\j
\JIs the universe twisted?\j
#
"May 1997 Science Review",8,0,0,0
\JControversy in space\j
\JGlobal warming theory in doubt\j
\JIce Ages theory gains ground\j
\JShoemaker-Levy--an ominous sign?\j
\JHans Bethe speaks against hydrogen bomb\j
\JAIDS updates\j
\JGenetics news\j
\JGene patents\j
\JA complete genome\j
\JDNA--double double helix\j
\JA gene for ADHD?\j
\JDNA of worms\j
\JEntire human chromosome in a mouse\j
\JMS cure in mice\j
\JAlzheimer remembered\j
\JBreast implants dangerous?\j
\JPrions--how they work\j
\JCockroaches linked to asthma?\j
\JChernobyl still a risk\j
\JDinosaurs ancestors of birds?\j
\JLargest dinosaur unearthed!\j
\JHumans once giants?\j
\JBrain surgery in Stone Age\j
\JModern humans in Europe, 800 000 y.a.\j
\JMapping fossil hot spots\j
\JCorn plant's defence\j
\JNereus, a close look\j
\JHubble looks at M84\j
\JDark matter mystery gets worse\j
\JChicken virus attacks Emperor penguins\j
\JMitochondria, the evolution\j
\JRed algae rewrite family tree\j
\JJupiter update (May 97)\j
\JNew radio telescope\j
\JDeep Blue beats Kasparov\j
\JALH84001 less likely to contain signs of life\j
\J51 Peg's planet in doubt again\j
\JLeptoquarks less likely\j
\JBreast cancer 'gene' not such a sure sign\j
#
"June 1997 Science Review",9,0,0,0
\JFermat's last case?\j
\JBlue Planet Prize\j
\JCrafoord Prize\j
\JMir's problems\j
\JRed Centaur asteroids spark interest\j
\JAsteroid in earth's Lagrangian points\j
\JLost friendly planet?\j
\JHubble sees double vision\j
\JBoomerang nebula--coldest spot?\j
\JRobots on Mars\j
\JCyberlaw code of conduct\j
\JFrench laws and the internet\j
\JCybernews bits\j
\JUrine test of benefit\j
\JGrowing blind\j
\JMarijuana--just as bad\j
\JNeutrinos--do they have a mass?\j
\JNew genomic data\j
\JHUGO news\j
\JGene and Parkinson's disease link\j
\JDown's syndrome and trisomy\j
\JPlants and chromosomes\j
\JOrigins of new species\j
\JDogs evolved earlier than thought\j
\JFossils in the news\j
\JNuclear news\j
\JEl Nino rides again!\j
\JGaia hypothesis update\j
\JWorld population growing more slowly\j
\JRussians dwindling faster\j
\JUnited States 1997 census\j
\JScrapie, an end to it?\j
\JBill to outlaw cloning\j
\JMen are inferior after all?\j
\JArecibo awakes\j
\JWasps help date rock painting\j
\JSolid lubricants\j
\JDNA fingerprints from fingerprints\j
\JEquatorial ice cores\j
\JArk debate, Plimer vs Roberts\j
\JNo ice on the moon after all\j
\JDeep Blue goes data mining\j
\JObituary for June 97\j
#
"July 1997 Science Review",10,0,0,0
\JPathfinder touches down!\j
\JMathilde--not your average asteroid?\j
\JBig holes in Vesta\j
\JDeath by asteroid\j
\JJapan's new telescope in space\j
\JSulfur and earth's outer core\j
\JSulfur in a solar nebula\j
\JEuropa has an atmosphere\j
\JRhea and Dione can have ozone holes\j
\JDating the Old World monkeys\j
\JChimp retirement plan\j
\JNeandertal Man partly cloned\j
\JAIDS transmission by kissing\j
\JLanguage acquisition\j
\JBaby talk language\j
\JFirst gram-positive bacterium sequenced\j
\JClinton supports broad genetic safeguards\j
\JGenetically engineered viruses\j
\JMonkeypox makes a comeback\j
\JSarawak's new killer disease strikes children\j
\JYeast prion model\j
\JNew Zealand rabbits escape calicivirus\j
\JGenes may be patented\j
\JQuantum physics update\j
\JCambrian explosion\j
\JMagnetic fields and cancer\j
\JParadoxical ice shelves\j
\JPenguins and voluntary hypothermia\j
\JWhite House looks at greenhouse\j
\JWhat drives the Ice Ages?\j
\JNew athletic training method?\j
\JChemistry update\j
\JExtinction on high seas?\j
\JEcosystem inter-reliance\j
\JCloned transgenic lamb\j
\JMir--a suitable case for treatment?\j
\JNo watery comets after all?\j
\JResistant antibiotics\j
\JAntibiotics useful for cardiac conditions\j
\JObituary for July 97\j
#
"August 1997 Science Review",11,0,0,0
\JRobots for the 21st century\j
\JGenome news\j
\JCompleted bacterial genome\j
\JWorms in the news\j
\JBirds indicate warmer climate\j
\JGlobal warming--too early to judge\j
\JNew discoveries on the sun\j
\JSupernova, quest to understand\j
\JEx-sun 1987A still going strong\j
\JNew planet?\j
\JHidden fluoride found\j
\JXenon still missing\j
\JMissing dark matter present as hydrogen?\j
\JArctic sea floor uncovered\j
\JVaccinating the sheep\j
\JWeevil news\j
\JPest weevils\j
\JBeetles help fight loosestrife\j
\JBacteria in the wheat blight fight\j
\JColorful fossils\j
\JArgon dating\j
\Jmc2=e\j
\JSojourner completes 30-day mission\j
\JDistant close-up on Mars\j
\JVenus still active?\j
\JExtra plates on earth\j
\JNew Zealand rabbits get calicivirus after all\j
\JBacterial resistance\j
\JVancomycin unravelled\j
\JWatery comets after all?\j
\JBlindness cured by e-mail\j
\JMalaria still under attack\j
\JNew worms\j
\JNew subnuclear particle?\j
\JCannibals back in vogue\j
\JDams cause environmental damage\j
\JSafer superconductors\j
\JArmadillos are unique\j
#
"September 1997 Science Review",12,0,0,0
\JPredicting heart attacks\j
\JComputer diagnosis?\j
\JGene therapy gets smarter\j
\JConcern over gene-therapy experiment\j
\JBritish Columbia traveling north\j
\JLead linked to dental decay\j
\JBuilding better bones\j
\JBroccoli and cancer prevention\j
\JWeeds and depression\j
\JSearch for safer cancer drugs\j
\JGreen tea of benefit\j
\JKombucha tea cause of illness\j
\JMale fireflies eaten for repellent\j
\JSound robot design\j
\JFish and ultrasonic sound\j
\JCiclid fish in danger of extinction\j
\JFirst World AIDS, Third World no aid?\j
\JModified virus to fight AIDS?\j
\JDrug solution to AIDS?\j
\JCPR debate\j
\JSHEBA's winter\j
\JCrystal star?\j
\JQuark star?\j
\JEvolution fraud rediscovered\j
\JLungfish our nearest relative?\j
\JBirds get older\j
\JFallacies about evolution\j
\JMicroprocessor size limit\j
\JFossil contention\j
\JLargest specimen of Tyrannosaur unearthed\j
\JBetter superconductors\j
\JCleaner water, better medicines\j
\JMounds as old as the pyramids?\j
\JNew tumor suppressor gene?\j
\JDNA all wrapped up\j
\JLeft-handers rule in space\j
\JE. coli sequenced\j
\JCassini trials\j
\JScience prize news\j
#
"October 1997 Science Review",13,0,0,0
\JNobel Prize in Physiology or Medicine (1997)\j
\JNobel Prize in Chemistry (1997)\j
\JNobel Prize in Physics (1997)\j
\JEconomics Nobel Prize (1997)\j
\JNobel Prize for Literature (1997)\j
\JNobel Peace Prize (1997)\j
\JGairdner awards for 1997\j
\JIg Nobel awards\j
\JFormer Nobel laureates call for greenhouse cuts\j
\JEmissions trading system proposed\j
\JA million solar roofs\j
\JEl Nino effects--not an ill wind after all?\j
\JNew climate cycle\j
\JAntibiotic resistance in bacteria\j
\JTB from a contaminated endoscope?\j
\JAntibiotic resistance is permanent\j
\JEngineered bacteria to fight tumours\j
\JPathfinder and Sojourner lose contact\j
\JGetting to Mars more cheaply\j
\JCassini on its way\j
\JUranus has two new moons\j
\JConserving Madagascar\j
\JBrazil's new park\j
\JBreast cancer not caused by PCBs\j
\JX-Ray laser in development\j
\JZebra mussels get a setback\j
\JPig organs infected\j
\JX chromosomes mutate more\j
\JGenetic transmission\j
\JUpright ancestor gets older\j
\JShrinking a genus\j
\JMore vitamin C means fewer cataracts?\j
\JHow the universe will end\j
\JTiniest transistor\j
\JNeurochip development\j
\JCollagen brought low\j
\JSparrows not dinosaurs after all\j
\JDiscovering more about ozone holes\j
\JAntimatter disproved?\j
\JNo successor to Clementine mission\j
\JTsunamis prediction\j
\JNew world-record prime number\j
\JSouth-East Asian smoke clouds\j
\JHow second-hand smoke kills\j
\JAspirin and heart disease\j
\JObituary for October 97\j
#
"November 1997 Science Review",14,0,0,0
\JLatex allergy\j
\JCochineal an allergen\j
\JPolyunsaturated oils no guarantee\j
\JSaccharin cancer causing?\j
\JPathogens in food\j
\JOral vaccine against botulism\j
\JTest tube vaccine\j
\JDNA as a vaccine?\j
\JAcupuncture proven effective\j
\JCan computers develop immunity to viruses?\j
\JImmunity to student plagiarism?\j
\JTuberculosis in the news\j
\JWine good for the heart?\j
\JEthanol increases toxin levels\j
\JDigital x-rays cheaper\j
\JAnticancer drugs making liver transplants stick\j
\JSouth American fossils in Madagascar and India\j
\JWater trapped in earth as crystals\j
\JBig quakes may be gentler\j
\JGenome of tuberculosis\j
\JGenome of a spirochaete\j
\JGenome of an archaebacterium\j
\JDog genome gets closer\j
\JTubulin unveiled\j
\JChicken flu virus scare\j
\JWhy red wine is good for the heart\j
\JImmunoglobulin E a killer?\j
\JGenetic mutation responsible for allergies?\j
\JAsbestos transformed\j
\JKyoto Environmental Conference report\j
\JGlobal warming confirmed\j
\JAfrican weather\j
\JCold snap 8200 y.a.?\j
\JNew toxin to combat insect pests\j
\JPill to combat mosquitoes\j
\JGorilla census\j
\JJapan Prize results\j
\JTop 10 scientific breakthroughs (1997)\j
\JFirst transgenic cloned sheep\j
\JPrion chaperone identified\j
\JSmall comets in doubt again\j
\JMathilde pictures released\j
\JMartian life gets more distant\j
\JRoyal Greenwich Observatory closed\j
\JHigh x-ray bursts detected over Sweden\j
\JEnd of a star\j
\JDeath of a planet--or many planets?\j
\JSwitzerland: the Gene Protection Initiative\j
\JGreen tea kills cancer\j
\JPine cone intelligence\j
\JObituary for December 97\j
#
"Time machines get a step closer",16,0,0,0
(Jan '97)
In 1948, Hendrik Casimir predicted a weak force between two plates in a vacuum, a phenomenon we now call the Casimir effect. Steven Lamoreaux at the Los Alamos National Laboratory in New Mexico has just succeeded in measuring the force of the effect, using a torsion \Jpendulum\j. The result: a force within 5% of the predicted level, a very good result indeed.
The importance of this is that any time machine that we can now predict will need to use two sets of plates experiencing the Casimir effect. But until we have a working time machine available to see what happens, the best we can say is that it is early days yet.
The Casimir effect is only measurable when two parallel plates are set up, just a fraction of a \Jmillimetre\j apart in a vacuum; a weak force then operates to push them together. Empty space is not really empty, according to quantum theory. Instead, virtual photons are continually popping into existence and then disappearing again.
In the narrow gap between the plates, the only photons which can exist are those whose half-wavelengths fit a whole number of times into the gap. All other photons are excluded from the gap, and this means there are more photons pressing on the plates from outside than from inside the gap, producing the force we call the Casimir effect. According to Lamoreaux, the force he measured, with a separation of just 0.75 micrometre, was about one billionth of a newton.
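As a rough check on the scale of these numbers, the ideal parallel-plate Casimir pressure falls off as the fourth power of the separation. Here is a minimal sketch in Python (the 1 mm\U2\u plate area is a made-up figure for illustration only; the geometry of the actual experiment was more complicated):

  import math

  hbar = 1.0545718e-34   # reduced Planck constant, J s
  c = 2.99792458e8       # speed of light, m/s
  d = 0.75e-6            # plate separation, m (0.75 micrometre, as in the text)

  # ideal parallel-plate Casimir pressure: pi^2 * hbar * c / (240 * d^4)
  pressure = (math.pi ** 2 * hbar * c) / (240 * d ** 4)
  print("Casimir pressure: %.2e N/m^2" % pressure)      # roughly 4e-3 N/m^2

  area = 1e-6            # a hypothetical 1 mm^2 of facing plate area
  print("force on 1 mm^2: %.2e N" % (pressure * area))  # of the order of a billionth of a newton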
This is the third major breakthrough in physics to be achieved with a torsion \Jpendulum\j, after a gap of almost two centuries. Charles \JCoulomb\j used a torsion \Jpendulum\j to measure the forces between electrical charges in 1785, and soon after, in 1798, Henry Cavendish used a similar device to measure the force of gravitation.
#
"New drugs",17,0,0,0
(Jan '97)
Scientists from Abbott Laboratories announced on January 17 at the 4th Conference on Retroviruses and Opportunistic Infections that they have begun clinical trials on a new anti-\JHIV\j drug, which is called ABT-378. They claim it is 10 times more potent than one of the most powerful AIDS drugs now on the market. Test tube studies of the drug, an inhibitor of the protease \Jenzyme\j that \JHIV\j uses to copy itself, suggest that the AIDS virus has an unusually difficult time developing resistance to it.
Epothilone A, which is produced by a myxobacterium and can be made in comparatively large quantities by fermentation, shows promise as an anti-cancer treatment. Two groups of scientists are working on the synthesis and modification of the molecule.
#
"Humans older than thought?",18,0,0,0
(Jan '97)
One way of defining humans is to call them "tool-making animals". Birds may use thorns, chimpanzees may use sticks, but only humans actually change and modify and make tools from other things, say most scientists.
On that basis, "made" tools are by definition the work of members of the \Jgenus\j \IHomo\i, but now there is evidence of Oldowan-style tools, reliably dated to 2.5 - 2.6 Myr (million years). The Oldowan stone tool industry was named for 1.8-million-year-old (Myr) artifacts found near the bottom of Olduvai Gorge, \JTanzania\j. Later archaeological research in the Omo (\JEthiopia\j) and Turkana (\JKenya\j) regions also yielded stone tools, dated to 2.3 Myr. Until the recent find, this seemed to fit with the earliest dates for bones and teeth which showed all the signs of coming from \JHomo\j, and so all was well.
These new occurrences are now securely dated by several means, making the stone tools the oldest known artifacts from anywhere in the world. The artifacts are described as showing surprisingly sophisticated control of stone fracture mechanics, equivalent to that of much younger Oldowan assemblages.
So either \IHomo\i is older than we thought, or, as some scientists are now suggesting, the Oldowan tools were made by the robust Australopithecines, generally now referred to as \IParanthropus robustus\i and \IParanthropus boisei\i. If that is the case, then humans were not the only tool makers.
#
"Europa is seismic",19,0,0,0
(Jan '97)
In a report released on January 17, NASA announced that \JGalileo\j images taken in December 1996 reveal that ice-spewing volcanoes and the grinding and tearing of tectonic plates have reshaped the chaotic surface of Jupiter's frozen moon Europa. The images were captured when \JGalileo\j flew within just 692 kilometres (430 miles) of Europa, and while they do not show currently active ice volcanoes or geysers, they do reveal flows of material on the surface that probably originated from them.
\JGalileo\j imaging team member Dr. Ronald Greeley commented: "This is the first time we've seen actual ice flows on any of the moons of Jupiter." "These flows, as well as dark scarring on some of Europa's cracks and ridges, appear to be remnants of ice volcanoes or geysers."
As we indicated last month, signs of volcanism on Europa's surface make it more likely that there is warm liquid \Jwater\j somewhere below the ice, \Jwater\j which may just support life. "There are three main criteria to consider when you are looking for the possibility of life outside the \JEarth\j-the presence of \Jwater\j, organic compounds and adequate heat," said Dr Greeley. "Europa obviously has substantial \Jwater\j ice, and organic compounds are known to be prevalent in the \J\Jsolar system\j\j. The big question mark has been how much heat is generated in the interior."
"These new images demonstrate that there was enough heat to drive the flows on the surface. Europa thus has a high potential to meet the criteria for exobiology."
The icy crust of Europa shows signs of having been disrupted by the motion of tectonic plates, with different sorts of movement in different places. This is different from the pattern seen on the surface of Ganymede, leading scientists to speculate that these two moons may have different internal structures.
\JGalileo\j scientists will have a better chance to understand Europa's interior when the \Jspacecraft\j gathers gravity data on another fly-by next November. The gravity field is measured by tracking how the frequency of \JGalileo\j's radio signal changes as it flies past the moon. This was not possible during the recent fly-by because radio conditions were degraded as Jupiter passed behind the \JSun\j from \JEarth\j's point of view.
After a swing past Jupiter, \JGalileo\j's next targeted fly-by will take it again past Europa as it passes within 587 kilometres (364 miles) on February 20.
#
"Moons full of life?",20,0,0,0
(Jan '97)
Possible planetary objects have now been discovered orbiting no less than nine different main-sequence stars. These companion objects, some of which might actually be \Jbrown dwarf\js, all have a mass at least half that of Jupiter, and are unlikely to support \JEarth\j-like life: Jovian planets and \Jbrown dwarf\js possess neither a solid nor a liquid surface near which organisms might dwell.
Smaller rocky moons around these companions are another matter altogether, if the \Jplanet\j-moon system orbits the parent star within the so-called 'habitable zone', where life-supporting liquid \Jwater\j could be present. The companions to the stars 16 Cygni B and 47 Ursae Majoris might satisfy this criterion, say D M Williams, J F Kasting & R A Wade, in a letter to \INature\i this month.
Such a moon would need to be larger than 0.12 \JEarth\j masses, so as to retain a substantial and long-lasting \Jatmosphere\j, and would also need to possess a strong magnetic field in order to prevent its \Jatmosphere\j from being sputtered away by the constant bombardment of energetic ions from the \Jplanet\j's magnetosphere.
#
"Black holes in every galaxy?",21,0,0,0
(Jan '97)
A Hubble Space \Jtelescope\j \Jcensus\j reveals that black holes are common in galaxies, according to a January 13 release on the \JInternet\j. Three black holes have been identified in three normal galaxies, and the team responsible suggests that nearly all galaxies may harbour supermassive black holes which once powered quasars which are now no longer active.
The team took a \Jcensus\j of 27 nearby galaxies with NASA's \JHubble Space \JTelescope\j\j and the ground-based Canada-\JFrance\j-Hawaii \JTelescope\j (CFHT) on Mauna Kea, Hawaii. The two instruments are being used to conduct a spectroscopic and photometric survey of galaxies, looking for black holes which have consumed the mass of millions of \JSun\j-like stars.
The key results are that:
• Supermassive black holes are so common that nearly every large galaxy has one.
• A black hole's mass is proportional to the mass of the host galaxy, so a galaxy twice as massive as another would have a black hole that is also twice as massive. This discovery suggests that the growth of the black hole is linked to the formation of the galaxy in which it is located.
• The number and masses of the black holes found are consistent with what would have been required to power the quasars.
Two of the black holes weigh 50 million and 100 million solar masses, and they lie in the cores of galaxies NGC 3379 (also known as M105) and NGC 3377 respectively. These galaxies are both in the "Leo Spur," a nearby group of galaxies about 32 million light-years away and roughly in the direction of the Virgo cluster. Some 50 million light-years away, in the Virgo cluster itself, NGC 4486B has a 500-million-solar-mass black hole. It is a small \Jsatellite\j of M87, the very bright Virgo cluster galaxy which has an active nucleus and is known to have a black hole of about two billion solar masses.
These new results suggest that smaller galaxies probably have lower-mass black holes, below Hubble's detection limit. The survey shows the black hole's mass is proportional to the host galaxy's mass. Now cosmologists will need to work on explaining why the black holes are so common, and why they seem to be proportional to the masses of the home galaxies.
The Hubble \Jtelescope\j's high resolution allowed the team to measure the velocities of stars orbiting the black hole. A sharp rise in velocity means that a great deal of matter is locked away in the galaxy's core, creating a powerful gravitational field that accelerates nearby stars.
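The principle behind such velocity measurements can be illustrated with a simple Keplerian estimate: stars orbiting at speed v at radius r imply an enclosed mass of roughly M = v\U2\ur/G. A minimal sketch (the speed and radius below are invented for illustration, not figures from the survey):

  G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
  SOLAR_MASS = 1.989e30  # kg
  PARSEC = 3.086e16      # m

  def enclosed_mass(v, r):
      """Keplerian estimate of the mass inside radius r, from circular speed v."""
      return v ** 2 * r / G

  # hypothetical figures: stars moving at 400 km/s, 10 parsecs from the core
  m = enclosed_mass(400e3, 10 * PARSEC)
  print("about %.1e solar masses" % (m / SOLAR_MASS))   # a few hundred million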
The February 1997 servicing mission to the Hubble \Jtelescope\j will involve installing the Space \JTelescope\j Imaging Spectrograph. This spectrograph will greatly increase the efficiency of projects, such as this black hole \Jcensus\j, that require spectra of several nearby positions in a single object.
\BAnd in yet another galaxy . . .\b
The nucleus of the spiral galaxy NGC 1068 has always been obscured from direct observation by gas and dust. But radio images now suggest that it conceals a black hole of 10 to 20 million solar masses, and that the gas around it is swirling into the hole so rapidly that the nucleus is radiating at close to the theoretical limit, according to a report in \INature\i in early January, by Mitchell C. Begelman and Joss Bland-Hawthorn. Black holes, it seems, are all the go.
#
"Nobel winners (1996)",22,0,0,0
The 1996 Nobel Prize Winners were named in October, too late to appear on the \JCD\j-ROM, and somehow were not listed in our December update. So here they are:
PEACE
Carlos Felipe Ximenes Belo and José Ramos-Horta "for their work towards a just and peaceful solution to the conflict in East \JTimor\j."
CHEMISTRY
Robert F. Curl, Jr., Sir Harold W. Kroto, and Richard E. Smalley "for their discovery of fullerenes."
PHYSICS
David M. Lee, Douglas D. Osheroff, and Robert C. Richardson "for their discovery of \Jsuperfluidity\j in \Jhelium\j-3."
ECONOMICS
James A. Mirrlees and William Vickrey "for their fundamental contributions to the economic theory of incentives under asymmetric information."
PHYSIOLOGY or MEDICINE
Peter C. Doherty and Rolf M. Zinkernagel "for their discoveries concerning the specificity of the cell mediated immune defence."
LITERATURE
Wislawa Szymborska "for poetry that with ironic precision allows the historical and biological context to come to light in fragments of human reality."
\BNobel Winners for \JBiology\j (1996)\b
Doherty and Zinkernagel's prize recognises research which they conducted when they worked together in the mid-1970s at the John Curtin School of Medical Research in \JCanberra\j, \JAustralia\j. Their research on animals showed that white blood cells (lymphocytes) must recognise both an invading virus and certain self molecules-the so-called major histocompatibility (MHC) antigens-in order to kill the virus-infected cells. This concept of simultaneous recognition of both self and foreign molecules formed the basis for a new understanding of the specificity of the cellular immune system.
Their research had an immediate and long-lasting effect on immunological and clinical research. Their findings on the specificity of the T-\Jlymphocyte\j-mediated immune response proved to be a fundamental advance in the understanding of how the immune system is able to recognise microorganisms other than viruses, and in the understanding of how the immune system reacts against certain kinds of self tissue. In late January, Doherty was named as "Australian of the Year".
#
"Mathematics update",23,0,0,0
(Jan '97)
It happened in December, but news has only broken slowly about the discovery of another Mersenne prime, a type of number of special interest to people who care about number theory. They get their name from Marin Mersenne (1588-1648) who studied them. These numbers were known long before Mersenne, but he spent a great deal of time trying to find more of them.
Mersenne's name is now associated forever with numbers of the form 2\Un\u-1, and when such a number is found to be prime, it is called a Mersenne prime. When a number in this form is composite (that is, non-prime, having factors other than 1 and itself), it is just called a Mersenne number. The largest Mersenne prime so far known, the 35th found, is a number derived from n = 1398269, discovered in the Great \JInternet\j Mersenne Prime Search.
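The standard way of checking whether 2\Un\u-1 is prime is the Lucas-Lehmer test, which is also the test behind the Great \JInternet\j Mersenne Prime Search. A minimal sketch in Python (real searches use fast multiplication tricks; this plain version is only practical for small exponents):

  def lucas_lehmer(p):
      """True if the Mersenne number 2**p - 1 is prime, for an odd prime exponent p."""
      m = (1 << p) - 1          # the Mersenne number 2^p - 1
      s = 4
      for _ in range(p - 2):
          s = (s * s - 2) % m   # the Lucas-Lehmer iteration
      return s == 0

  for p in (3, 5, 7, 11, 13):
      print(p, lucas_lehmer(p))  # p = 11 fails: 2047 = 23 x 89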
In the past, finding new Mersenne primes involved a great deal of time and effort, or a large computer, but now anybody can join in adding to the list of Mersenne primes by contacting George Woltman's Great \JInternet\j Mersenne Prime Search at http://www.mersenne.org/prime.htm and reading what they find there.
#
"Obituaries for January 97",24,0,0,0
Clyde Tombaugh died on 17 January, 1997, at his home in Las Cruces. He was 90, but in the past year had still attended lectures in a wheelchair fitted with oxygen tanks. Tombaugh will be remembered mainly as the astronomer who discovered Pluto in 1930, using a blink comparator. In the blink comparator, a light would rapidly switch between pairs of photos, taken some time apart. This had the effect of making the stars look fixed, while against this background, a planetary body moving in its \Jorbit\j would appear to jump backwards and forwards as the lights "blinked". On 18 February 1930, Tombaugh found the mystery \Jplanet\j, which the Lowell staff later called Pluto, after the Greek god of the dimly lit underworld, \JHades\j.
Tombaugh was self-taught, and spent much of his youth observing the sky. At the age of 22, he built a 9-inch \Jtelescope\j from old farm machinery and parts salvaged from his father's 1910 Buick. (When the Smithsonian asked Tombaugh, many years later, for his original \Jtelescope\j, he declined the request, saying that he was still using it!) When he sent drawings of Jupiter and Mars to the Lowell Observatory in Flagstaff, \JArizona\j, they offered him a job. Tombaugh helped search beyond Neptune for a "\JPlanet\j X," which astronomer Percival Lowell had predicted based on perturbations in Neptune's \Jorbit\j.
After finding Pluto, Tombaugh discovered six star clusters, two comets, hundreds of \Jasteroids\j, several dozen clusters of galaxies, and one supercluster. After helping test captured German \JV-2\j rockets in 1946 at White Sands Missile Range, he went to New Mexico State University, where he taught until 1973.
#
"Archaeological finding in Beijing",25,0,0,0
(Jan '97)
Chinese scientists in \JBeijing\j have uncovered the first evidence of prehistoric human activity in what we now call \JBeijing\j. The discovery of stone tools and other artifacts, estimated to be around 20 000 years old, is likely to offer clues about how early human settlers populated the 300 000-square-\Jkilometre\j North China Plain.
Workers have spent the past month digging up pieces of flint, charred animal bone fragments, and charcoal from a 150-square-metre site in \JBeijing\j's main business district at the construction site for an office complex called Dongfang Square on Wangfujing Street. The people who left the remains were in the habit of roasting game and knew how to make and use stone implements for cutting and chopping. The Chinese are hailing it as a link to "Peking man", but there is still quite a sizeable gap to fill.
#
"Short science updates (Jan 1997)",26,0,0,0
(Jan '97)
In \JJapan\j, Yasuyuki Shirota of Hirosaki University is trying a smaller form of \IJurassic Park\i: bringing a giant moa, \IDinornis giganteus\i, back to life. Shirota will be using DNA taken from a dead moa's \Jfemur\j, introducing the genes into chicken embryos. He hopes to find some of the homeobox genes that should be present in the DNA, and to use these to generate a moa-like chicken.
In \JAthens\j, archaeologists report that they have discovered the remains of the \JLyceum\j, the school opened by Aristotle.
NASA saw the top stories of 1996 as:
• "Did Mars Once Harbor Life?", a story we have already covered;
• "Shannon Lucid Sets US Space Flight Record", referring to her 181 days on board the Russian Mir craft;
• "A New Wave of Martian Exploration Begins", about the successful launches of the Mars Pathfinder and Mars Global Surveyor missions;
• "\JGalileo\j Unveils Jupiter and its Moons" (described above);
• "\JHubble Space \JTelescope\j\j Continues to Amaze Astronomers", referring to the discovery that old stars go egg-shaped, as well as images of galaxies under construction and stars in their death throes;
• "Next Generation Launch Vehicle Chosen for Development", announcing the X-33 test vehicle, a one-half scale version of the planned Reusable Launch Vehicle, which is scheduled to have made fifteen flights by the end of December 1999. Called "VentureStar," the vehicle will launch vertically like a rocket and land horizontally like an aeroplane.
Wheat was in the news as January ended. An article in \IBioScience\i by David Pimentel showed the \Jwater\j needed to grow a \Jkilogram\j of a number of crops. You need 500 litres (500 kg) of \Jwater\j to grow 1 kg of potatoes; wheat and \Jalfalfa\j both require 900 litres, rice demands 1910 litres, and soya beans 2000 litres. A single \Jkilogram\j of chicken uses up 3500 litres, but the same amount of beef requires 100 000 litres of \Jwater\j, mainly to grow the crops to feed the \Jcattle\j. In the fairly dry environment of the Mir space station, NASA \Jastronaut\j John Blaha harvested some space wheat, but there was no seed. Maybe they didn't give it enough \Jwater\j? Meanwhile, the US Air Force has been looking for a gentle way of stripping paint from Stealth bombers when they need a new coat of paint. The answer, they say, is to use pure starch, extracted from wheat.
\JRussia\j's main component in the International Space Station, the service module, is now some eight months behind schedule. Without this module, the whole space station will be unable to function.
And even more serious, the Trojan swarm, a large population of \Jasteroids\j possibly bigger than the asteroid belt, orbits the \JSun\j out near the \Jorbit\j of Jupiter. A new study of the swarm's orbits suggests that the orbits are not indefinitely stable and that more than 200 'escaped' Trojans over 1 km in diameter are already roaming the \JSolar System\j, a few of them on \JEarth\j-crossing orbits. Reminding us of the Cretaceous-Tertiary impactor, we have to wonder if any of these are coming our way. Another January report suggests that the asteroid belt we see today is a mere remnant of what was once out there, so perhaps the worst is over.
#
"Green cells for a greener Australia?",27,0,0,0
(Jan '97)
\JAustralia\j is one of the worst offenders when it comes to carbon dioxide emissions. It has a first world demand for \Jenergy\j, no nuclear power, limited hydroelectric power, and places little reliance on oil as a fuel for its electricity generation, using the country's abundant \Jcoal\j supplies instead. Successive Australian federal governments have so far done very little beyond the level of \Jrhetoric\j about the problem.
\JCoal\j produces more CO\D2\d per joule when it is burnt than \Jhydrocarbons\j such as oil and \J\Jnatural gas\j\j. These fuels produce some of their \Jenergy\j from the \Joxidation\j of the \Jhydrogen\j in the \Jhydrocarbons\j, yielding harmless \Jwater\j vapour, while \Jcoal\j yields almost nothing but CO\D2\d. So as countries around the world start doing their bit to reduce greenhouse gases, \JAustralia\j's poor record and lack of action is looked at more and more harshly.
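The difference can be seen from textbook combustion chemistry: burning pure carbon releases about 394 kJ for each mole of CO\D2\d produced, while burning methane releases about 890 kJ per mole of CO\D2\d. A rough sketch in Python (treating coal as pure carbon is an approximation; real coals contain some hydrogen and moisture):

  CO2_G_PER_MOL = 44.0   # grams of CO2 per mole

  # approximate heats of combustion, kJ per mole of CO2 released
  fuels = {
      "coal (as pure carbon)": 393.5,
      "methane (natural gas)": 890.0,
  }

  for name, kj_per_mol_co2 in fuels.items():
      grams_per_mj = CO2_G_PER_MOL / kj_per_mol_co2 * 1000
      print("%s: about %.0f g CO2 per MJ" % (name, grams_per_mj))
  # coal ~112 g/MJ, methane ~49 g/MJ: coal emits roughly twice the CO2 per joule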
Alternative \Jenergy\j sources have long been used in small amounts in \JAustralia\j, but the areas of high tidal range are far from the main cities, and so are the most windy areas. Rainfall is generally too low for any increase in hydroelectric power, and \JAustralia\j's old mountains are too low in any case to provide the sorts of levels needed. And while \JAustralia\j has abundant uranium deposits, there is a general public reaction against nuclear power. That leaves just one major resource that \JAustralia\j has in abundance: sunlight.
The problem has been a lack of efficient solar cells, but now that task is coming under control. Situated partly in the tropics, any Australian developments are likely also to be of immediate use across much of the third world, so some of the recent breakthroughs from the international team at the University of \JNew South Wales\j look very exciting.
The need is for a large, efficient, cheap cell with a low maintenance demand. It has to use cheap materials in small amounts, it needs to be made in large units so connection costs are reduced, and it has to be efficient so that a good \Jenergy\j yield is delivered.
The University of \JNew South Wales\j in Sydney has drawn together interested researchers from all around the world, working on a number of promising directions at the same time, under the leadership of Professor Martin Green, who developed the first "Green cells" some years ago. The workers are trying to improve the performance of photovoltaic cells to get them closer to the limits of performance, put thin films of silicon on cheap materials while maintaining performance, and also develop the hardware that is needed to use the cells in practical environments.
#
"Photovoltaic cell--what is it?",28,0,0,0
(Jan '97)
This can be the subject of some mind-bending \Jmathematics\j, or it can be kept simple, and slightly less precise. This is the fuzzy version for beginners, and is really only an approximation, even for silicon cells. A photovoltaic cell (as it is described here) is a semiconductor device, made of silicon which has been carefully purified, and then equally carefully unpurified with selected impurities. Some of the silicon is "doped" with atoms which leave "holes" for electrons to fit into ("p-type"), some is doped with atoms which supply extra electrons ("n-type"). Neither form of silicon is electrically charged, but the additional electrons and "holes" are essential to the operation of a semiconductor like this.
The two types of silicon are combined to make a diode, a device which typically passes electrical current in one direction only. Electrons in the flat thin diode that we call a photovoltaic cell are "excited", or given \Jenergy\j by light shining on the cell, so that they "jump" to a higher \Jenergy\j level, and can then move across the gap or junction between the two types of silicon. The UNSW researchers have also tried adding a second doped surface below, producing a "bifacial design" which delivers even more power-it has, they say, limited but novel applications.
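The size of the "jump" is set by silicon's band gap of about 1.1 electron volts: only photons at least that energetic can lift an electron across the junction. A small sketch of the arithmetic (the band gap figure is the standard textbook value for crystalline silicon):

  H = 6.626e-34    # Planck constant, J s
  C = 2.998e8      # speed of light, m/s
  EV = 1.602e-19   # joules per electron volt

  band_gap_ev = 1.12   # crystalline silicon, room temperature

  # longest wavelength a silicon cell can usefully absorb
  cutoff_nm = H * C / (band_gap_ev * EV) * 1e9
  print("cutoff wavelength: about %.0f nm" % cutoff_nm)   # ~1100 nm, in the near infrared
  # redder photons pass straight through; bluer photons waste their excess energy as heat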
There are other materials which could be used as well, but these all seem to have been eliminated, either as laboratory curiosities unable to compete on grounds of efficiency or cost, or on account of the dangerous nature of the materials used in the cells. Some, like cadmium telluride, show some promise, but cadmium and tellurium are not nice substances to work with, so while there are large-scale manufacturing efforts on cadmium telluride and copper indium selenide thin film cells, the future looks like being largely a silicon one.
\BImproving performance\b
The obvious first step to getting better results is to increase the useful surface area of the cell. You can do this either by increasing the size of the cell, or by reducing the shadows that fall across its surface.
One of the nastier problems about these cells is that you need metal conductors to gather up the electrons and carry them away, and metals do not pass light through. Not, that is, if the metal is thick enough to carry any significant current, and that means the cells begin to defeat their own purpose. The UNSW team have got around this by cutting grooves into the surface of the cells with a \Jlaser\j, and laying the metal strip on its edge. This simple step has raised efficiency (the amount of light converted to electricity) from 15% to 20%.
Already a BP Solar commercial plant in \JSpain\j is producing cells equivalent to 2 megawatts a year. These are commercial cells produced in commercial quantities, but even these have efficiencies of up to 17.5%. Other licensees include Solarex in the USA, Samsung in Korea, Central \JElectronics\j Laboratories in India and Deutsche Aerospace in \JGermany\j.
\BCheaper materials\b
The biggest cost in making silicon cells is making the very pure silicon. But while a solar cell is typically around 200-500 micrometres thick, more than 90% of the output comes from the top 15-20 micrometres. So it makes good sense to try laying down a thin film of the expensive material on some cheaper surface, while (hopefully) maintaining the efficiency. This last requirement has been made just a little bit harder, by trying at the same time to reduce the degree of purity in the silicon, thus making the material cheaper still.
Silicon comes in a number of forms. Monocrystalline silicon is expensive, and has to be sawn, but it offers a very high efficiency. Polycrystalline silicon also has to be sawn, but it is cheaper, being cut from cast ingots of silicon. The cells made from this material are slightly less efficient, and they still need to be thick. Amorphous silicon is grown as a thin film, but it lacks the crystalline structure, so the holes and electrons are more likely to recombine before they can be usefully collected, thus wasting the \Jenergy\j used to separate them and reducing the overall efficiency of the cell.
The present direction of the UNSW team has been to go for a multilayer structure, laid down on the back of a sheet of glass. Their aim is to produce 15% efficient factory-produced solar cells at a cost of US$2 per peak watt within five years. But who knows, they may go even better: in 1995, the team showed what they were made of when a cell made by Dr. Aihua Wang and her husband, Dr. Jianhua Zhao, postdoctoral researchers working with the UNSW Centre for Photovoltaic Devices and Systems, turned in a 21.5% efficiency in independent tests at Sandia National Laboratories in New Mexico.
This performance eclipsed the previous best, also by Australian workers, of 17%. But while that was only a 4 cm\U2\u cell, in 1996, the team demonstrated another record-breaker, a 23.7% efficient large area cell (22 cm\U2\u). They went on to produce a large module which turned in a 22.7% conversion efficiency, proving that they can scale the success up effectively.
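To put those percentages in concrete terms, a cell's peak output is simply irradiance times area times efficiency. A quick illustrative calculation (assuming the standard test irradiance of 1000 W/m\U2\u; the cell figures are the ones quoted above):

  IRRADIANCE = 1000.0   # W/m^2, standard test condition

  def cell_watts(area_cm2, efficiency):
      """Peak electrical output of a cell of the given area and efficiency."""
      return IRRADIANCE * (area_cm2 * 1e-4) * efficiency

  print("4 cm^2 at 21.5%%: %.2f W" % cell_watts(4, 0.215))    # about 0.09 W
  print("22 cm^2 at 23.7%%: %.2f W" % cell_watts(22, 0.237))  # about 0.52 W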
The roof of a house is a wasted space. Every dwelling has one, and few of them make any use of those spare acres and hectares of real estate. The roof keeps the rain out, but it does not actually produce anything. The UNSW team have their eye on all those roof spaces, working on tiles that collect light and deliver it to a smaller cell, mounted on the tile. Called the static concentrator tile, this will be a highly efficient contributor to future Australian power needs, since the typical Australian roof slopes in the right direction to gain the most from power of this sort.
This sort of power may be all very well in the home, but it won't do very much about running your toaster or TV. The power that comes out of these cells is direct current, the sort of stuff we get from car batteries, not the alternating current we get from the mains, and the voltage is all wrong as well. But that need not be a problem: all we need is an inverter, to convert the DC low voltage to AC mains voltage, to synchronise it with your mains supply, and trickle it into your system.
But if you have gone that far, why not go the whole hog? Why not pass the spare power on, so others can use it when you can't? That means "distributed utility connected photovoltaic systems", which use the grid as a storage device for your spare power. You make as much as you can, use what you need, and deliver the rest of the power to, or withdraw power from, the grid.
So while you are out at work, your roof at home can be merrily converting the sunlight on the roof into power needed to meet the peak demands of industry and commerce (including your work-place), all without any carbon cost, any contribution to the \J\Jgreenhouse effect\j\j, once the units are manufactured.
There are a few drawbacks, of course. The equipment is expensive, so users will either need to be subsidised in some way for adding the fittings, or paid a higher rate for the power they generate. One utility in \JGermany\j has offered just under $2 per \Jkilowatt\j-hour, which allows these systems a reasonable financial payback period. Net metering, where you pay (or are paid) for the balance of power used or delivered, may be fairer, but it leads to prolonged payback periods.
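The payback arithmetic itself is straightforward: divide the installed cost by the yearly value of the power generated. A toy example (every figure below is invented for illustration; real tariffs and system costs vary widely):

  system_cost = 15000.0   # installed cost, dollars (hypothetical)
  annual_kwh = 1800.0     # yearly output of a small rooftop array (hypothetical)

  for rate in (0.15, 2.00):   # a net-metering rate versus a subsidised feed-in rate, $/kWh
      years = system_cost / (annual_kwh * rate)
      print("at $%.2f/kWh: payback in about %.0f years" % (rate, years))
  # at $0.15/kWh: ~56 years; at $2.00/kWh: ~4 years - hence the argument over subsidies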
There is also a risk for power workers: if the power is isolated in a section of line, but all the houses in the area are pumping electricity back into the lines, somebody could be hurt or killed. So there will need to be a fail-safe form of monitoring to ensure that all these miniature power stations are turned off when the grid goes down.
One thing is certain: around the world, the demand is there. In 1990, \JGermany\j started its "Thousand Rooftop Program" which saw between 1 and 4 kW of photovoltaic modules installed on more than two thousand residences by 1995, and other countries have also followed suit with residential rooftop systems.
\BOne for the road\b
One of the most promising prospects for the future is the solar car. When the Swiss \ISpirit of Biel II\i won the 1990 World Solar Challenge across \JAustralia\j, it was on Australian technology, using cells made by Telefunken under licence from the UNSW group, and nine of the first ten cars in the 1993 US Sunrayce used cells based on the same UNSW technology.
The large cells which were built in mid-1996 starred again when improved technology was used in producing large batches of cells used by the first and third-place getters in the 1996 World Solar Challenge. For the Sydney team and their licensees, this is now turning into a mature technology, rather than merely being an experimental science.
And just in time, too. Californian legislation required that 2% of all cars sold must, by 1998, have zero emissions, rising to 5% in 2001 and 10% by the year 2003. While the first two dates have now been dropped, the third remains in force, and the effect of the legislation on electric car development has been huge. Other states are sure to follow, and General Motors' new EV\D1\d has a regulated top speed of 130 km/h, can accelerate to 100 km/h in 9 seconds, and has a range of over 100 km between recharges. That should do for a run to and from the workplace-especially if you have a few photovoltaic cells from the University of \JNew South Wales\j on your roof.
Just think of the social changes-undercover parking may even become a thing of the past!
#
"Car navigation system update",30,0,0,0
(Jan '97)
In the 1964 James Bond movie, "Goldfinger", we were introduced to the idea of a mapping system in a car. The fictional spy's device had a circular screen, with the user's position shown in the centre, superimposed on a map, while the evil-doer's car was represented by a "blip".
While we may not yet be able to track the Goldfingers of this world from Britain to \JSwitzerland\j, as James Bond did, a third of a century ago, we are getting closer to a worthwhile in-car navigational system which can tell us where we are, and where to go to reach a selected destination. The easy part, getting people from one town to another along the open road, has been in place for some time. The real problem, providing guidance to an exact destination in a large city, is still being worked on, but the answers are falling into place fast.
General Motors began to offer a route guidance system option called Guidestar in its Oldsmobile line of cars in 1994. At first the option was to be limited to cars sold in the \JSan Francisco\j Bay area, but within a few months, it spread to include Miami, Orlando, Washington and \JBoston\j. Guidestar, being an American system, offers scales from 220 yards across the screen to four miles - about 200 metres up to 6.5 km.
Soon after the GM announcement, the Avis Corporation said that it would be offering the GM Guidestar on some of the Oldsmobiles it rents. Hertz Corporation followed suit with an announcement that it would offer a route guidance system called Neverlost in selected cars and in selected geographic markets. The in-vehicle navigation system had been launched, in a manner of speaking.
Like every early technology, you get what you pay for, but sometimes you pay for more than you get. The American \JAutomobile\j Association's automated version of its Triptik routing product, like the Royal Dutch Touring Association's Routewijzer, can be classified as place-to-place, or inter-town routing. They start at a well-known location in or around the city of origin, provide directions on the main roads to the destination city, and then leave the traveller at a landmark in the destination city. Before and after that, you are on your own.
Now the race is on to provide a system that will allow you to find your way to the first landmark, and then to navigate within the destination city. To do this, navigational systems will need to be able to sense and/or display all of these:
• where the car is right now, and where it is pointing;
• what traffic conditions lie ahead (one-way streets, turning restrictions etc);
• what turn should be taken next, and what landmarks to look for;
• what adjustments to make if the driver takes a different turning.
\BWhere the car is right now, and where it is pointing\b
Position can most easily be measured from the \Jsatellite\j-based Global Positioning System (GPS), a system which has long been available to the US military, \Jaircraft\j and \Jwater\j craft, but which is only just becoming available to the driver of an ordinary civilian motor vehicle. GPS relies on getting signals from several of the 24 satellites in the GPS chain, orbiting the \Jearth\j, by measuring how long it takes for signals at 1575 MHz from the satellites to reach the receiver.
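The positioning principle is trilateration: each timed signal gives a range (distance = speed of light x travel time), and several ranges together pin down the receiver. A toy two-dimensional sketch of the idea (made-up coordinates, no clock error; real GPS solves in three dimensions plus a receiver clock-bias term):

  import math

  # hypothetical 2D "satellite" positions, in metres
  sats = [(0.0, 20.2e6), (15.0e6, 18.0e6), (-12.0e6, 19.0e6)]
  true_pos = (4000.0, 2000.0)

  def dist(ax, ay, bx, by):
      return math.hypot(ax - bx, ay - by)

  # the ranges a receiver would derive from the signal travel times
  ranges = [dist(true_pos[0], true_pos[1], sx, sy) for sx, sy in sats]

  # Gauss-Newton iteration: refine a guessed position until it matches the ranges
  x, y = 0.0, 0.0
  for _ in range(10):
      a11 = a12 = a22 = b1 = b2 = 0.0
      for (sx, sy), r in zip(sats, ranges):
          d = dist(x, y, sx, sy)
          jx, jy = (x - sx) / d, (y - sy) / d   # gradient of range w.r.t. position
          res = r - d
          a11 += jx * jx; a12 += jx * jy; a22 += jy * jy
          b1 += jx * res; b2 += jy * res
      det = a11 * a22 - a12 * a12
      x += (a22 * b1 - a12 * b2) / det
      y += (a11 * b2 - a12 * b1) / det

  print("recovered position: (%.1f, %.1f) m" % (x, y))   # close to (4000.0, 2000.0)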
In the past, GPS has been limited by the deliberate insertion of random errors known only to the US military, so that only the military had half-metre accuracy (from their own encrypted signal), while other users had to be satisfied with hundred-metre accuracy, which was good enough for most purposes, but not good enough for navigation in a city. In March 1996, all of that changed when Vice-President Al Gore and US Transportation Secretary Federico Peña ushered in "a new era of travel, time-saving and communication" when they signed a Presidential Decision Directive allowing the civilian and commercial use of GPS.
"Most people don't know what GPS is," Secretary Peña said in a press release. "Five years from now, Americans won't know how we ever lived without it. GPS will change the way we live in the way cellular phones, \Jfax\j machines and the \JInternet\j have impacted daily life."
There are still limitations to the civilian use of GPS: in a city, the satellites needed for accurate navigation may not be in sight. That is why the car needs to sense where it is going, and how far it has gone since the last accurate "fix". There are also problems with interference, but plans are in place to add a second set of civilian signals in early 1997, at either 1309 MHz or 1207 MHz, which should increase the reliability of GPS.
The databases required for routing traffic the right way down a one-way street, and for recommending legal or physically possible turns at intersections, are gradually spreading across the continent.
\BTraffic conditions ahead: the need for a \Jdatabase\j\b
The roads \Jdatabase\j is like virtual reality without the headsets. Every detail that a driver may need has to be stored in there. Intersections have to be distinguished from overpasses; traffic restrictions, turn restrictions, speed limits, and direction of travel all have to be there. If you may need it to plan the best route, the \Jdatabase\j needs to know it.
This means buying or obtaining maps, and laying down layer after layer of information, linking and cross-checking, to make sure that the user does not find himself looking at the mapper's nightmare: "the business end of a busy freeway off-ramp". But even there, the work has not finished: to be competitive, the \Jdatabase\j will need details on petrol stations, cash machines, hotels, tourist attractions, and more. In the future, the mapmakers may earn more from "Yellow Pages"-style listings than they do from selling the actual maps, with every business needing to be locatable by their customers as they drive by, guided by their all-knowing navigation systems.
Distribution will be a problem. A national \Jdatabase\j is not feasible because it is too large. It won't fit on a \JCD\j or any other media that are in the pipeline. In any case, the \Jdatabase\j is regional, and people will only want the areas they are planning to travel in, when they are going there, for the databases will have an accuracy shelf-life of only one or two years before they need updating. The car rental companies, Hertz and Avis, are using hard disks instead of removable media, but this is a special case.
Perhaps the answer will be to market regional CDs, or something similar to the strip maps put together by \Jautomobile\j associations for long-distance travel, covering a strip which is perhaps 100 km wide, running between two localities specified by inputting a set of postcodes or zip codes. This will become easier to market as systems become more widely used and data formats become standardised and interchangeable.
\BDealing with variations\b
Any system mounted on a car, or any car system relying on a remote system, will need to use "dead reckoning", deductive reckoning of the position, based on the last accurate positional fix, plus movements since that time, assisted by any turns - if the system thinks it is five metres short of a left turn, but the driver has turned, it will need to accept that, in all probability, it is now entering the left turn in question.
Doing this requires, as well as a GPS system, a gyro compass, wheel sensors for map matching, and an on-board computer powerful enough to perform complex route calculations, linked to a mass storage device and a display. The main problem is that cities tend to block access to satellites, hence the need to build on logically from the last accurate fix.
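A dead-reckoning update is simple in principle: from the last fix, advance the position by the distance the wheels report, along the heading the gyro compass reports. A minimal sketch (flat-ground coordinates in metres; the function and figures are invented for illustration):

  import math

  def dead_reckon(x, y, heading_deg, wheel_distance_m):
      """Advance the last known fix by one odometry step."""
      heading = math.radians(heading_deg)   # 0 degrees = north, clockwise positive
      return (x + wheel_distance_m * math.sin(heading),
              y + wheel_distance_m * math.cos(heading))

  # start from a GPS fix, then integrate wheel and gyro readings between fixes
  x, y = 0.0, 0.0
  for heading, step in [(0, 50), (90, 30), (90, 20)]:   # 50 m north, then 50 m east
      x, y = dead_reckon(x, y, heading, step)
  print("estimated position: (%.0f, %.0f) m" % (x, y))  # (50, 50)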
#
"Navigation solutions for the car",31,0,0,0
(Jan '97)
There is more than one way of skinning the navigational cat, and all of these solutions are being explored, or have been explored in the past:
\BDedicated Terminals\b
In the mid-80s, NavTech introduced a pay-per-use terminal called DriverGuide in the \JSan Francisco\j Bay area. Terminals were installed in a few hotels and made available free to hotel guests, as a sort of electronic concierge. The system was made up of a PC and printer built into a specially designed kiosk. Users could obtain address-to-address routes in the form of a narrative printout with turn-by-turn driving instructions.
The system was discontinued after a few hundred units had been installed. The systems, especially the printers, were difficult and costly to maintain and support, and the printed format required a passenger to read the instructions, or made for a risky drive. A later version, also from NavTech, had a touch screen, but this also failed to prove successful.
\BTravel Reservation Systems\b
Some airline booking systems will automatically generate printed information for travellers, outlining routes to be followed in getting to meeting places, but once again, this is usually in the form of narrative instructions, rather than as a map, so the same drawbacks are still apparent as with the dedicated terminals.
\BVoice Operators\b
This American scheme began as a way of promoting cellular phone services, and involves an operator using a computerised system to relay narrative advice on directions, or where convenient, faxing a hard copy to the caller. This has not been a raging success, by all accounts.
\BIn-vehicle Systems\b
These systems can be in the best James Bond tradition, with a map that moves, keeping the car's location in the centre of the screen, with either north being at the top of the screen, or the car's current direction being at the top of the screen. Other systems use voice instructions alone, or voice instructions plus a manoeuvre diagram for turns, entries, exits etc.
The digital map systems obviously require a large amount of storage for the maps, which means the system really needs to have a CD-ROM drive, and appropriate CD-ROMs for the area being travelled. The voice systems need a high level of accuracy, especially in inner city streets, where the "next left" may only be 15 or 20 metres from the previous left turn. That being said, the in-vehicle systems are the ones showing the greatest promise, and the greatest levels of development activity.
\BIn-hand Systems\b
These are dedicated computers, rather like a "Personal Digital Assistant", but with only a single function. These systems are restricted by the storage demands of maps. An area 10 kilometres wide and 6 kilometres long, a typical central slab of a moderately large modern city, will take 10 megabytes of storage, and the \Jdatabase\j of street names, numbers, directions for traffic and legal turns can easily take close to the same amount of storage. It might be a good idea to look very carefully before committing to an investment in this technology.
\BIn-vehicle systems with Pizazz\b
The industry is very much in the shake-out phase, and many of the fancy systems now being planned may find themselves overtaken by products such as GPSS, which is being widely distributed around the PC community. This is a Windows application intended for use within a sound-capable Notebook PC connected to a GPS \Jsatellite\j receiver carried within a car, or within a PC-compatible car computer system. Developed in Britain, it is now becoming available in other parts of the world.
It displays the car's position on a map while a voice speaks the position, saying something like: "We are 25 miles west of London and 1.5 miles south east of Ascot"; and gives guidance: "destination is 300 yards ahead at your 12 o'clock". The direction is given as if to an air pilot, so "6 o'clock" is directly behind the car.
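Converting a destination bearing into this clock convention is a one-line calculation: take the bearing relative to the car's heading and divide the circle into twelve 30-degree sectors. A small sketch (the function name is invented for illustration):

  def clock_direction(heading_deg, bearing_deg):
      """Map a destination bearing to the 'at your N o'clock' convention."""
      relative = (bearing_deg - heading_deg) % 360
      hour = round(relative / 30) % 12
      return 12 if hour == 0 else hour

  # car heading due north (0 degrees), destination due south (180 degrees)
  print(clock_direction(0, 180))   # 6 - directly behind the car
  print(clock_direction(0, 0))     # 12 - dead ahead
  print(clock_direction(90, 120))  # 1 - slightly to the right of dead ahead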
GPSS can be used with mouse and keyboard, but can also work with voice recognition, letting the car driver keep watching the road. The system holds information about eating places, petrol stations and the like, so when the driver says: "Eating Place?", it should answer with, "OK, the nearest eating place is Little Chef, on the A329, 1 mile behind us, at your 7 o'clock". While there are still problems with voice recognition systems which translate an American saying "Recognise speech" as "Wreck a nice beach", the software is getting better all the time.
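The "o'clock" convention is easy to pin down: each hour on the clock face covers 30 degrees of bearing relative to the car's heading, with 12 o'clock dead ahead and 6 o'clock directly behind. A minimal sketch of the idea (our own illustration in Python, not GPSS code):
# Convert a target's compass bearing into a clock-face direction,
# relative to the car's current heading (12 = dead ahead, 6 = behind).
def clock_position(target_bearing_deg, car_heading_deg):
    relative = (target_bearing_deg - car_heading_deg) % 360
    hour = round(relative / 30) % 12
    return 12 if hour == 0 else hour

print(clock_position(90, 90))    # straight ahead -> 12
print(clock_position(270, 90))   # directly behind -> 6
print(clock_position(315, 90))   # behind and to the left -> 8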
This is one area that may well be worth investing in; it will certainly bear watching.
\BMessage systems\b
In a number of countries, low power transmitters can advise drivers of traffic conditions ahead, with enough time to allow drivers to plan and select an alternative route. Systems such as this may be extended further in future, with onboard or roadside equipment: Toyota, for example, is experimenting with CCD cameras to detect lanes on a road automatically.
\BMaking your own\b
Fugawi is a Japanese software system which allows users to scan in any map or add any bitmapped map, add a scale, and insert it into a GPS \Jdatabase\j by identifying three key points with a grid location based on any of a hundred standard systems. The system corrects for skewing or stretching, and works with a variety of projections.
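Mathematically, three control points are exactly what is needed to pin down a two-dimensional affine transform (translation, rotation, scale and skew) from pixel coordinates to grid coordinates. A sketch of that calculation (our own illustration in Python, not Fugawi's code; real map software must also handle projection curvature):
# Solve x' = a*x + b*y + c and y' = d*x + e*y + f from three
# pixel/grid control-point pairs, using Cramer's rule.
def affine_from_points(pixels, grids):
    (x1, y1), (x2, y2), (x3, y3) = pixels
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
    def solve(t1, t2, t3):
        a = (t1 * (y2 - y3) - y1 * (t2 - t3) + (t2 * y3 - t3 * y2)) / det
        b = (x1 * (t2 - t3) - t1 * (x2 - x3) + (x2 * t3 - x3 * t2)) / det
        c = (x1 * (y2 * t3 - y3 * t2) - y1 * (x2 * t3 - x3 * t2)
             + t1 * (x2 * y3 - x3 * y2)) / det
        return a, b, c
    east = solve(*[g[0] for g in grids])   # easting as a function of (x, y)
    north = solve(*[g[1] for g in grids])  # northing as a function of (x, y)
    return east, north

# Three scanned-pixel points matched to grid coordinates (invented values):
east, north = affine_from_points([(0, 0), (1000, 0), (0, 1000)],
                                 [(300.0, 6250.0), (301.0, 6250.0), (300.0, 6249.0)])
print(east, north)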
\BThe future\b
There is more to come: an experimental system is being tested in \JAustralia\j which will allow the navigation program to record times on alternative routes, and learn about the slow roads. While this might be catastrophic if all cars reacted in the same way, \Jday\j by \Jday\j, switching routes in unison, this is unlikely to happen in practice.
Closer to the present, it looks as though 1997 will be the year in which the promises of 1994 to 1996 start to bear fruit, as the databases are completed, and begin to interlock and overlap. With a gigabyte RAM chip on the drawing boards, with digital video disc (DVD) just around the corner, and with the software getting better all the time, self-contained navigation systems should really take off in 1998 - unless the new Nokia telephones with "built-in everything" show that they can access more brute processing force in a distant centre, and provide cheaper guidance from a distance.
As in every other new technology, you pay your money and take your chances. One thing is certain: paper maps will continue, if only because travellers can take them out of the car without losing any information: for now, the paperless car is about as likely as the paperless office.
NavTech did not have a \Jdatabase\j up and functional for \JAtlanta\j, but by the time of the Sydney Olympics in the year 2000, we can reasonably expect overseas visitors to arrive in \JAustralia\j, climb into their cars and navigate effortlessly around Sydney's maze of harbour crossings. And Paris, \JRome\j or London in the peak hour will be a breeze.
Sample map, taken from the GPSS advertising, showing their system in operation. This is one they give away free, and they say that while it may be a large-scale map, voice guidance has a resolution that will allow you to be guided to within 100 yards of your destination.
#
"Hubble gets a facelift",32,0,0,0
(Feb '97)
The second in a planned series of four servicing missions for the \JHubble Space \JTelescope\j\j took off on February 11. The main purpose was to install the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space \JTelescope\j Imaging Spectrograph (STIS), both of which were successfully placed.
The only real concern arose when one of the arrays of solar cells began to flap about, apparently as a result of air being vented from the air lock as shuttle crew moved out into space on one of their EVAs. In addition, one of the tape recorders was replaced with a state-of-the-art Solid State Recorder (SSR) and a gyro Rate Sensing Unit (RSU) was replaced with a Hemispherical Resonator Gyro (HRG) unit.
The Shuttle Discovery came in for a night landing at the Kennedy Space Centre in pre-dawn darkness on February 21, completing a 6.6 million \Jkilometre\j (4.1 million mile) mission of 150 orbits. It was the ninth night landing in the Shuttle program's history. The crew of seven were able to celebrate a highly successful refurbishment of the \JHubble Space \JTelescope\j\j, which will be able to continue its fifteen-year mission. In five space walks, the crew replaced two scientific instruments, and also repaired the tattered thermal \Jinsulation\j of the observatory. As a parting gift, the shuttle boosted Hubble into a higher \Jorbit\j.
This service flight was the second, following the service flight of December 1993, and further missions are planned in mid-1999 and 2002. These past and future servicing missions were always planned, and Hubble has grapple points, as well as 76 handholds to make servicing easier. The two instruments taken out, the Goddard High Resolution Spectrometer and the Faint Object Spectrograph, have been replaced by the Space \JTelescope\j Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS).
STIS is able to view two-dimensional images of spectra in the ultraviolet, visible, and near infrared wavelengths. It gathers 30 times more spectral data and 500 times more spatial data than existing spectrographs on Hubble, which only look at one place at a time.
Because it covers an area, rather than a point, STIS can supply information about the relative movements of stars in a galaxy, which will yield better information about supermassive black holes. NASA suffered a little from public misunderstanding, as the Goddard Spectrometer failed just a few days before the mission, and they have been at pains to point out that the replacement was just part of the standard schedule.
NICMOS is fitted with corrective \Joptics\j to compensate for the spherical aberration in the HST's primary mirror. NICMOS will give us a clear view of the universe at near-infrared wavelengths between 0.8 and 2.5 micrometres, wavelengths beyond the power of our eyes to see. Because distant objects in the universe are red-shifted, we will be able to see further than we have been able to with the other Hubble instruments, which are sensitive to light at optical and ultraviolet wavelengths.
#
"A new radio telescope in space",33,0,0,0
(Feb '97)
This month saw the launch of a new Japanese \Jsatellite\j, a radio \Jtelescope\j created by the Institute of Space and Astronautical Science. The \Jsatellite\j, successfully set into \Jorbit\j after its launch on February 12, is in an elliptical \Jorbit\j, varying between 1,000 and 20,000 kilometres (620 to 12,400 miles) from the \Jearth\j, and takes about six hours to complete each circuit. This \Jorbit\j provides a range of distances between the \Jsatellite\j and ground-based telescopes, which is important for obtaining a high-quality image of the radio source being observed.
The \Jsatellite\j has unfolded into an 8-metre (26-foot) diameter orbiting radio \Jtelescope\j which will observe celestial radio sources together with a number of the world's ground-based radio telescopes. The \Jsatellite\j was launched from ISAS's Kagoshima Space Centre, the first launch with ISAS's new M-5 series rocket.
The principle is a simple one: readings taken at different points around the \Jorbit\j are added together, and the end result is the sort of image you would have if you had a dish the size of the \Jsatellite\j's \Jorbit\j, once the results are combined with the readings from other radio telescopes, all over the face of the globe. Observations will be able to be made at 1.6 GHz (18 cm), 5 GHz (6 cm) and 22 GHz (1.3 cm).
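A hedged back-of-the-envelope calculation shows why the size of the "dish" matters: an interferometer's angular resolution is roughly the observing wavelength divided by the longest baseline. (The baseline figure below is an assumption based on the orbit quoted above, not an official specification.)
import math

# Approximate angular resolution of an interferometer: lambda / D radians.
def resolution_milliarcsec(wavelength_cm, baseline_km):
    theta_rad = (wavelength_cm / 100.0) / (baseline_km * 1000.0)
    return math.degrees(theta_rad) * 3600.0 * 1000.0

# 1.3 cm observations over a baseline comparable to the orbit's span:
print(resolution_milliarcsec(1.3, 25000))   # roughly 0.1 milliarcseconds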
Following its successful launch, the \Jsatellite\j was renamed HALCA, an \Jacronym\j for 'Highly Advanced Laboratory for Communications and \JAstronomy\j'. The alternative \Jtransliteration\j from the Japanese is "Haruka", meaning far-away.
A Russian \Jtelescope\j will be added to the array as well: at last report, it was still scheduled for launch in 1998. The primary targets of the completed system are expected to be active galactic nuclei; \Jwater\j masers, OH masers, radio stars, and pulsars.
#
"Sheep cloning a success",34,0,0,0
(Feb '97)
February brought us news of the world's first sheep cloned from an adult cell. Workers at the Roslin Institute in Edinburgh have succeeded in a task which, up until now, had never been achieved in any mammal; \Jcloning\j had previously worked only with embryonic cells, in mice and sheep. The implication appears to be that what can be done with sheep can probably be done with humans, opening up a whole range of "Brave New World" scenarios, and jokes about Elvis have been resurrected as well (unlike Mr Presley, so far).
Credit is shared between the Roslin Institute, part of Britain's Biotechnology and Biological Sciences Research Council, and PPL Therapeutics, an Edinburgh biotechnology company, whose shares rose in value after the announcement.
Dr Ian Wilmut announced on February 22 that he had taken the DNA from a sheep's \Jovum\j, replaced it with the DNA from an adult sheep, and grown the result into a living lamb. The DNA was taken from the udder of the "donor" sheep, but the major breakthrough was in getting the DNA to become inactive, or quiescent, to use the geneticists' term. He then took an \Jovum\j from another sheep, and removed all of the DNA from it.
The next step was to fuse the \Jovum\j with an adult udder cell, after which the udder cell DNA took over the \Jovum\j, and controlled its growth and division. Wilmut then implanted the embryo in yet a third sheep, who gave birth to a lamb which is genetically identical to the original DNA donor. Dolly was born in July last year, and during the last week of February, was responsible for more headlines bearing the phrases "Hello Dolly" and "Send in the Clones", than \JHollywood\j and Broadway ever inspired.
The next few months should see a flurry of bio-ethicists straining to make themselves heard on the implications of all this, but the guidelines are already in place to deal with issues which surround human clones, so problems are unlikely, at least in the short term. In the longer term, the costs of breeding a master race are likely to be so large that no "secret program" will be possible, but a number of \Jtelevision\j script writers were probably reading the scientific press as February ended.
\BScientific background\b
No fewer than three sheep were produced by much the same \Jcloning\j method, but only Dolly was able to command serious scientific attention, as the other two sheep had been developed from cells taken respectively from a fetus and from an embryo. Given the existence of Dolly, the other sheep were of limited interest.
The genes in a cell become committed at an early stage of development, and while you can take a nucleus from one egg and put it in another egg, with some hope of achieving a result, the standard view has been that you could not take an adult cell and expect the genes to develop in any sort of normal way to grow an individual. When a cell has differentiated, specialised to form some kind of tissue, that should be enough to block it from ever growing into a whole individual.
Against this, single plant cells can be taken and cultured to produce a whole plant, but somehow, plants all seemed to be a whole lot less complicated. Even when Wilmut's group succeeded in establishing clones from cell lines taken from early embryos (blastocysts), the standard barriers to successful \Jcloning\j seemed still to be in place. Mouse embryos could be cloned if you took nuclei from the eight-cell stage, but no later, because the genetic material had been switched to a pathway of development, and could not be switched back.
Another problem was getting the cell division cycles of the donor cell and the host cell lined up-remember that a key feature is putting the donor nucleus into a cell where the surroundings are egg-like-so that later cell divisions do not produce cells with non-standard numbers of chromosomes. Wilmut's team managed to hold the donor cells in an arrested division state called the diploid G0 phase of the cell cycle by serum starvation.
The effect of this was to get a better synchronisation between the host cell and the new nucleus, but Wilmut's group may also have been fortunate in another respect-the sheep embryo does not begin committing cells (called \Itranscription of the \Jgenome\j\i) until the 8-16 cell stage, against the 2 cell stage in mice, so there is more time for synchronisation to develop.
If this factor is important, then it casts doubt over whether or not other \Jmammal\j species can be cloned in the same way, as the start of transcription of the \Jgenome\j varies between species. In the first few days of March, a monkey \Jcloning\j breakthrough was reported which may or may not prove significant in the longer run.
The best analysis suggests that the \Jcloning\j of human beings by this method would be possible in somewhere from one to ten years from now. The first calls for banning actually went out before the story broke in \INature\i, with an email from an unnamed "Harvard academic" who urged that \INature\i not publish details of the procedures until more thought had been given to questions of bio-ethics. Soon after the publication, President Clinton urged a moratorium on all such experiments. In Washington, Jeremy Rifkin demanded that the attempt to clone humans be placed on a par with rape, child abuse and murder.
On the other hand, Axel Kahn, a French geneticist (also in the news this month over transgenic corn) has suggested that a side-issue of the technique might help a woman who had a serious mitochondrial disease to have a healthy child by inserting a nucleus into a donor cell. While this is not \Jcloning\j, it may still be more acceptable than some of the other scenarios which are floating around.
#
"Lead linked to bad teeth",35,0,0,0
(Feb '97)
Lead \Jpollution\j has long been linked to lower IQs in children. Now research by Francisco Gil and colleagues at the University of \JGranada\j in \JSpain\j suggests that lead \Jpollution\j is also linked to bad teeth. The study, based on the analysis of children's "milk teeth", shows that teeth with more lead have more cavities. Following on, they found that children with ten or more cavities had blood lead levels three times as high as children with no decay.
#
"Transgenic corn in France",36,0,0,0
(Feb '97)
Axel Kahn, a leading French geneticist (see also the Dolly story above), has resigned as president of \JFrance\j's Biomolecular \JEngineering\j Commission, which regulates the way in which genetically altered organisms are used, in response to the government's decision to ban the growing of transgenic corn in \JFrance\j. This was an about-turn by the government, which had, just a week earlier, approved the transgenic corn for consumption by animals and humans, and it came just two months after the European Commission approved the corn's sale in Europe, largely at the insistence of \JFrance\j.
A number of transgenic corns exist: the banned hybrid is resistant to the European corn borer, and was being marketed by Novartis. Another company, Mycogen, has an independent application in to sell their own transgenic borer-resistant corn, and both companies expect to sell increased quantities of their new hybrid corns in the US this year, with several other companies also introducing other corn borer-resistant varieties.
#
"Keeping the worms down",37,0,0,0
(Feb '97)
Nematodes can be major pests of plant crops, and researchers have had little success in breeding plants that are resistant. Now a gene for \Jnematode\j resistance has been found. Curiously, it bears some resemblance to previously discovered resistance genes for \Jbacteria\j and other pathogens. The next step: to see if the gene can be spliced into a variety of crop plants. The step after: to get the plants approved for sale to humans.
#
"Plants can shout for help?",38,0,0,0
(Feb '97)
Plants in many groups are able to make oil of wintergreen-what the chemists call methyl salicylate. Evidence is mounting that this is released by diseased plants, and triggers other nearby plants to step up their defences. The latest evidence comes from studies on \Jtobacco\j plants which have been deliberately infected with \Jtobacco\j mosaic virus.
#
"Cancer gene--how it gets nasty",39,0,0,0
(Feb '97)
A gene called src was the first cancer gene to be discovered. Normally, it lives harmlessly in normal cells, but it can suddenly become active if affected by a virus or by mutations. New evidence, reported this month, brings researchers close to identifying the "switch" that turns this gene into its undesirable form.
#
"Birds, dinosaurs, and the big asteroid",40,0,0,0
(Feb '97)
Anybody who has seen \IJurassic Park\i will know that to some biologists, the dinosaurs are still with us, only now we call them birds. But if the birds are just fluffy dinosaurs, how did they get on when the Cretaceous turned into the Tertiary? Quite well, according to a study of the DNA in different bird groups, made public this month. It looks as though many bird lineages, including penguins, parrots, and chickens, arose before the \Jextinction\j events at the K-T boundary and flew, hopped and swam through them unscathed.
#
"Strong spider web",41,0,0,0
(Feb '97)
The black widow \Jspider\j of America (a close relative of the \JNew Zealand\j katipo and the Australian redback), \ILatrodectus mactans\i, produces the strongest silk of any \Jspider\j web yet tested, according to the researcher, Anne Moore of Scripps College in Claremont, \JCalifornia\j.
#
"Spy satellite photos",42,0,0,0
(Feb '97)
Microsoft has arranged to publish very high resolution (to one metre) photos taken from Russian spy satellites during the 1990s. These are claimed to be the first Russian-\Jsatellite\j-origin pictures to appear on the \JInternet\j. The first ones up will show Los Angeles, Washington, D.C., \JSan Francisco\j, \JRome\j and London. They may be downloaded for a charge of $30 per square \Jkilometre\j, with the proceeds to be split between an American company which brokered the deal and the Russian space agency.
#
"Mobile phones and car accidents",43,0,0,0
(Feb '97)
A recent University of Toronto research study shows that drivers who are distracted while talking on a cellular phone are four times more likely to be involved in a car accident. Insurance companies have no plans to raise insurance premiums, because overall accident rates have not increased. There was no real difference between the use of a hand-held receiver or a hands-free model of phone, suggesting that the problem is one of mental, rather than physical, preoccupation.
#
"Faster modems on the way",44,0,0,0
(Feb '97)
Originally planned for delivery in July 1997, then promised for the end of 1996, the new 56 kbps modems are coming Real Soon Now-or soon enough for the advertisers to be starting to promise them. Don't throw out that old modem: Hayes (http://www.hayes.com) are offering trade-in deals, US Robotics are offering a plug-in module to upgrade your modem, and several firms are now selling "upgradeable" 28.8 kbps modems. The new modems, by the way, are unlikely to deliver those speeds for a while yet, at least not in the US, where FCC rules will limit the power of transmissions, and so the speed. Since so many connections on the Web involve the US, this means almost everybody will be affected by a rule that dates from the past, when almost all \Jtelephone\j traffic was analogue.
Late update: US Robotics announced in late February that it had just shipped the first of its 56-kbps modems; the postponement before that resulted from the need to fix some bugs in the software. Motorola said it will introduce equivalent products in a couple of months. The trap is that there will be no internationally agreed standard for this speed of modem until 1998, so manufacturers are said to be trying to sell as many units as possible, in order to pre-empt the market, and establish a \Ide facto\i standard. Watch out for some very good offers. And some very cheap ones.
#
"Encryption",45,0,0,0
(Feb '97)
The number 56 was in the news in another way as well, when Digital Equipment Corporation, Cylink Corporation and Trusted Information Systems got the go-ahead from the Commerce Department to export products using the 56-bit Data Encryption Standard \Jalgorithm\j. Before that, American companies were limited to exporting products with a key length of 40 bits or less, except for applications in the financial services sector, because the US government treats encryption software as munitions. But while the manufacturers are cheering, maybe somebody should tell them that researchers at the Weizmann Institute in Israel were able last November to crack this encryption easily, deciphering the secret key from a 56-bit Data Encryption Standard \Jalgorithm\j, using "differential fault analysis".
A web TV device using the much more secure 128-bit encryption, planned by Philips and Sony, looks as though it cannot be marketed for some time to come, again because it is regarded as a "munition".
#
"Morse code disappearing",46,0,0,0
(Feb '97)
During January, the French coast guard sent their last radio message in \J\JMorse code\j\j, while February saw several other nations, including Britain, announcing that they would be phasing out the use of Morse for long-distance signalling during 1997.
Royal Navy signallers will still learn Morse, so they can use Aldis lamps to signal from ship to ship or ship to shore during times of radio silence, but the traditional "SOS" signal, three dots, three dashes, and three dots, is almost a thing of the past.
#
"Carbon on the ocean floor",47,0,0,0
(Feb '97)
Where would you expect to find most of the world's carbon reserves? As \Jcoal\j? No. As oil and gas? Again, no. The answer is that most of it lies on the sea floors of the world, in the form of a gas \Jhydrate\j, a solid made of \Jmethane\j and \Jwater\j which can only exist at high pressure and low \Jtemperature\j. Scientists looking at global warming effects will need to take this new discovery into account. The hydrates can be recognised by the way they reflect sound waves, and now samples have been retrieved in a pressurised vessel and analysed.
#
"Ancient chewing gum",48,0,0,0
(Feb '97)
Ancient chewing gum, some 6500 years old and found in Sweden, has been analysed. It seems to have been made by heating birch bark inside a sealed container to make a chewy tar. Scientists speculate that children and teenagers may have been the main users, and that it was used for some medicinal purpose, perhaps to help get rid of milk teeth.
#
"Black hole discovered in February",49,0,0,0
(Feb '97)
This time, at the centre of the galaxy M32. This galaxy has no quasar at its centre to give away the presence of a black hole, but space-based spectral studies show stars travelling at speeds which indicate the presence of a large mass of around (3.4 ± 1.6) × 10\U6\u solar masses, contained within a region only 0.3 parsec across. This is most likely to be a black hole, say scientists.
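The underlying estimate is straightforward Newtonian dynamics: for stars orbiting a central mass, the enclosed mass is roughly the orbital speed squared, times the radius, divided by the gravitational constant. A sketch with an illustrative velocity (our assumption, not a figure from the paper):
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
SOLAR_MASS = 1.989e30   # kg
PARSEC = 3.086e16       # m

# Mass (in solar masses) implied by stars circling at a given speed and radius.
def enclosed_mass_solar(speed_km_s, radius_pc):
    v = speed_km_s * 1000.0
    r = radius_pc * PARSEC
    return v * v * r / (G * SOLAR_MASS)

# Stars at ~300 km/s at 0.15 pc (half the 0.3 parsec region) would imply:
print(enclosed_mass_solar(300, 0.15))   # about 3 x 10^6 solar masses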
#
"A new particle and a fifth force?",50,0,0,0
(Feb '97)
In late February, reports were circulating about the possible discovery of a new heavy particle, a "leptoquark", in \JHamburg\j. If it is confirmed, the particle's existence will imply the long-sought "fifth force", and may allow physicists to work towards a unified theory of the natural forces.
Two experiments, ZEUS and H1, have been running for some years, and it appears that the results have come from this work. (You can get beautifully illustrated information on H1 at http://dice2.desy.de:80/h1/www/general/public/detector.html in your choice of French, German or English, and on ZEUS at http://www-zeus.desy.de/publications.html.) The H1 site provides a wide range of information about the project, but at the end of February, still had no news about what, if anything, had been discovered.
#
"Obituaries for February 97",51,0,0,0
(Feb '97)
Melvin Calvin (1911-97). Calvin was the chemist who elucidated the biochemistry of \Jphotosynthesis\j.
Paul Erdős (1913-96). The \Jmathematics\j community paid tribute at a meeting in \JSan Diego\j, \JCalifornia\j in January to one of its legendary figures, who died last September. The Hungarian-born Erdős is considered the most prolific mathematician ever, yet he had no permanent home and no formal job after 1954-only a vast succession of collaborators and a gift for problems that opened productive new areas of research.
Erdős will probably be remembered mainly for the Erdős number, a whimsical measure of collaborative distance. Erdős himself is accorded the Erdős number 0; any person who collaborated with Erdős on a paper has an Erdős number of 1; a mathematician who has collaborated with one of those collaborators has an Erdős number of 2, and so on.
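In modern terms, an Erdős number is just the shortest-path distance to Erdős in the graph of co-authorships, which a breadth-first search finds directly. A small sketch (the toy collaboration graph is invented for illustration):
from collections import deque

# Shortest co-authorship distance from Erdos to everyone reachable.
def erdos_numbers(coauthors, source="Erdos"):
    distance = {source: 0}
    queue = deque([source])
    while queue:
        person = queue.popleft()
        for other in coauthors.get(person, []):
            if other not in distance:
                distance[other] = distance[person] + 1
                queue.append(other)
    return distance

graph = {"Erdos": ["Alice", "Bob"], "Alice": ["Erdos", "Carol"],
         "Bob": ["Erdos"], "Carol": ["Alice"]}
print(erdos_numbers(graph))   # Alice and Bob get 1, Carol gets 2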
#
"Global warming--impact on Antarctica",52,0,0,0
(Mar '97)
Our major theme this month seems to be the \Jweather\j. Not the usual annual change in seasons, but something rather longer-term. In mid-March, \ITime\i magazine ran a piece on the possible risk that the West Antarctic Ice Sheet might break up, causing sea levels across the world to rise. Puzzled as to how you would tell west \JAntarctica\j from east \JAntarctica\j, we began to investigate the science behind the story.
The difference between east and west \JAntarctica\j is most obvious when you stand on the meridian of Greenwich, somewhere in \JAntarctica\j. From where you are, west to the \JInternational date line\j, is the part known as lesser or western \JAntarctica\j, while east of you, all the way around on the other side, is greater or eastern \JAntarctica\j.
If you sit somewhere south of Tahiti on the globe, and look at \JAntarctica\j, you can see west \JAntarctica\j nearest you, with a tail running up towards \JSouth America\j, the Antarctic Peninsula, like a spinal cord leading out of the brain, an effect that is enhanced by the \Jcerebellum\j of east \JAntarctica\j, piled over the top. In our map (reference to the map on the CDROM), east \JAntarctica\j is on the right, and west \JAntarctica\j is on the left.
Now before we look at what is happening to the ice shelves and \J\Jsea ice\j\j of \JAntarctica\j, we need to look at what happens to glaciers when the \Jweather\j gets warmer, and that means we have to move a little north of \JAntarctica\j, to \JNew Zealand\j. Glaciers are very sensitive to slight shifts in the \Jtemperature\j, so what we see in \JNew Zealand\j will help us to understand what is happening further south.
When the \Jearth\j's average \Jtemperature\j rises, these global thermometers withdraw, pulling back into the mountains, and when it cools again, the glaciers descend into the valleys once more. The 127 glaciers of the Southern Alps of \JNew Zealand\j have been shrinking since the end of the Little \JIce Age\j, losing 25% of their area in the last century, and taking their fronts almost 100 metres back up the mountains, suggesting a warming of 0.6°C during that time.
This same warming has led to large pieces of the ice shelves around \JAntarctica\j breaking up. These ice shelves float on the sea, and surround the larger "grounded" masses of ice on and around the continent. In 1986, more than 11 000 square kilometres of the Larsen Ice Shelf and 11 500 square kilometres of the Filchner Ice Shelf broke off and floated out into the Weddell Sea, along with 1600 square kilometres of the Thwaites \JIceberg\j Tongue.
This could be just a natural "calving", but the mass of ice released was equal to three or four years' accumulation of snow and ice across the whole of \JAntarctica\j. More than 1300 square kilometres of the Wordie Ice Shelf have broken away since 1966, and a 1993 Norwegian study revealed more icebergs than usual in southern waters. Then in February 1995, a 2600 square \Jkilometre\j \Jiceberg\j calved away from the Larsen Ice Shelf on the Antarctic Peninsula.
There has certainly been a warming in the area, and this would necessarily push the mean annual air \Jtemperature\j isotherms around \JAntarctica\j further south. As the ice shelves only seem to be stable past the -4°C mean annual isotherm, this could explain why the northern ice shelves are breaking up: they are certainly the most vulnerable. In the past fifty years, temperatures on the Peninsula have risen by some 2.5°C, but it is unclear whether this is a sign of global warming, or just a normal variation.
The famous greenhouse disaster scenario, of course, has all the polar caps melting, flooding the oceans and drowning the rich coastal fringes where most of the world's agriculture happens. Is this what we can see beginning? Not necessarily. The shelf ice that is breaking away was already floating on the sea, and even \JArchimedes\j could tell us that we don't have to worry about floating ice. It melts into the sea, and leaves no trace of a rise. We needn't worry about the ice shelves melting and drowning us in our sleep.
The other ice that is already in the sea, but sitting massively on the bottom of a deep marine basin is another matter. And so is the land ice tied up in ice sheets that are some 4.3 km (about two and a half miles) thick, but close to the sea. If those ice deposits get away, it will be like a very fat person climbing into the bath.
The problem is that there is a large weakness across and through the rock that lies beneath the West Antarctic Ice Sheet (WAIS). This is no mere gap, but a rift that could be as serious as the San Andreas Fault or the Rift Valley of eastern \JAfrica\j. It is an area of regular volcanic activity and perhaps more. Some geologists suspect that there is a plate boundary there, and the recent discovery of a \Jvolcano\j beneath the ice makes them even more certain that their suspicions are correct.
The \Jvolcano\j is rather muffled by the layer of ice, but what would happen if there was a major eruption? Could enough steam and \Jwater\j be released to make the grounded ice unstable? The evidence that other scientists have been gathering from ice cores in \JGreenland\j and \JAntarctica\j are telling us that \Jice age\js may begin and end very quickly. If the climate warms and ice sublimes or evaporates, the blocks will get lighter, they will begin to rise, and sea \Jwater\j can start to flood in beneath the ice, "greasing the skids" and speeding the flow of ice to the edge, increasing the instability. Is that what rings the changes, something as unpredictable as a volcanic eruption?
The east Antarctic ice sheets would produce a sea level rise of some sixty metres (200 feet) if they melted, but that seems a remote chance right now. The WAIS, on the other hand, would only produce a rise of six metres. But even six metres would be catastrophic for large parts of the world. Capital cities like London, protected from high tides only by tidal barrages in the Thames, much of the world's best agricultural land, river estuaries all over the world, and every \Jcoral\j \Jatoll\j, would all be threatened.
For now, the experts say that, while the WAIS is unstable, and probably contains 3.2 million cubic kilometres of ice, all resting in a deep marine basin, the ice is most likely to stay where it is, and the low-lying parts of the world remain safe.
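The six-metre figure can be roughly cross-checked: spread the melted ice over the world's ocean area, allowing for the lower density of ice. The naive sum below (our own sketch, with approximate constants) gives about eight metres; the quoted six metres is lower partly because some of the WAIS already sits below sea level and displaces ocean water now.
OCEAN_AREA_KM2 = 3.61e8   # approximate surface area of the world ocean
ICE_TO_WATER = 0.9        # approximate density ratio of ice to sea water

# Naive ceiling on sea-level rise (metres) from melting a volume of ice.
def sea_level_rise_m(ice_volume_km3):
    water_km3 = ice_volume_km3 * ICE_TO_WATER
    return water_km3 / OCEAN_AREA_KM2 * 1000.0   # depth in km -> metres

print(sea_level_rise_m(3.2e6))   # WAIS: about 8 m as an upper bound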
So if the WAIS is not likely to be a problem, is the \J\Jsea ice\j\j shrinkage also something we can ignore? The qualified answer is yes, but only if we don't appreciate penguins. Some penguins need \J\Jsea ice\j\j so they can get access to the sea, both for the parents to feed, and for the young to launch themselves into the sea when they are ready to take on independence. As the \J\Jsea ice\j\j breaks up earlier in the southern summer, so the chances rise that the new juveniles will be trapped, with no easy jetty of ice to take them clear of the shallows and waves of the shore.
#
"Cloned monkeys",53,0,0,0
(Mar '97)
Following on last month's sensation, the cloned sheep called Dolly, a cloned monkey in the first few days of March was almost an anticlimax. The technique used was similar, but instead of taking the donor DNA from a mature adult cell, it was taken from an early embryo.
Don Wolf of Portland's Oregon Health Sciences University, assisted by colleagues, used standard in vitro fertilisation procedures to produce the embryos, then removed the DNA from the recipient egg cells, and added DNA taken from embryos at the eight-cell stage. Three types of recipient cells were used: those which were newly fertilised, those which were past the best time for fertilisation, and cells which were "ripe" for fertilisation. Only the cells which were ripe for fertilisation were successful as recipient cells.
\JCloning\j progress continues, with reports coming in during March of attempts to clone cows, in a process related to the method used at Roslin to clone Dolly, but with material taken from slaughtered cows. An Australian team has managed to produce 470 embryos from a single blastocyst, and hopes to end up with a method of \Jcattle\j production which is more efficient than the standard methods of artificial insemination.
\JCloning\j regress also continues: the \Jcloning\j of Dolly seems to have provoked a different and rather curious form of \Jcloning\j around the world, as strange laws are thrust before the public. The Norwegian Parliament passed a law banning \Jcloning\j of humans and other "highly developed organisms." The law would have blocked the \Jcloning\j of Dolly herself, and the race must now be on to see which band of enthusiasts can produce the most ridiculous legislation. The only Norwegian researcher who appears to be affected is Stig Omholt, who clones bee embryos!
In America, the enthusiastic bandwagon jumpers in Congress introduced bills to outlaw human \Jcloning\j, even as they were advised that the procedures for human \Jcloning\j were not going to happen overnight, and that the matter should be carefully considered first. Debate on the ethics and \Jbiology\j of \Jcloning\j has become a commonplace of the media during March, and is likely to continue during most of 1997. Information and rational discussion will hopefully follow, not too far behind.
\BAscent of the chimps?\b
Two Australian researchers offered a controversial suggestion in the \IJournal of Molecular \JEvolution\j\i, about the ancestors of chimpanzees and gorillas. Looking at a variety of genes in a number of primates, rats, mice and marsupials, they have recalibrated the "\Jfossil\j clock", and find that the ancestral \Jchimpanzee\j and human stocks divided up less than 4 million years ago.
The problem is that this would mean that the chimps must have descended from \IAustralopithecus afarensis\i, who walked upright on the ground. Simon Easteal and Genevieve Herbert from the Australian National University have also suggested that \IAustralopithecus robustus\i (often called \IParanthropus robustus\i) is the ancestor of the modern gorillas, based on similar evidence. Neither suggestion has been received with joy by anthropologists, so we should not hold our breaths while waiting for universal agreement on this notion.
#
"Archaeological findings in Germany",54,0,0,0
(Mar '97)
At the start of March, news broke of an astonishing archaeological find in northern \JGermany\j, in a \Jcoal\j mine in Schöningen, 100 kilometres east of Hannover. Wooden spears, bones and flint tools were uncovered. The spear shafts were spruce, about two metres long, no more than 5 cm in diameter, and with a balance similar to that of a modern javelin, showing that the spears were intended for throwing; given the large number of horse bones, the nature of the targets seems fairly obvious.
The finds can be accurately dated to an interglacial period 400 000 years ago by correlating the surrounding sediments to well-known geological layers, in an area where the history is well understood. The finds are exciting because they include wood, which normally does not last very long. The oldest previous wooden spear was just 125 000 years old, so this is a massive extension of the time in which clever humans were around in Europe.
In particular, the find casts doubt on the usual view of the early robust humans (these people were probably similar to the rugged human found at Boxgrove), since the spear throwers would have needed to be quite strong.
\BSiberia Findings\b
The following \Jday\j we learned that tools similar to those found in the Olduvai Gorge have been found in \JSiberia\j, and that thermoluminescence dating suggests a date of between 240 000 years and 366 000 years. Thermoluminescence dating is a sure way of exciting strong feelings among archaeologists, and so must be open to doubt, but it still seems that people have been sending themselves to \JSiberia\j for a long time.
#
"Seeing yellow",55,0,0,0
(Mar '97)
Even at the end of the 20th century, we still do not have a clear understanding of how we really "see" colours. The case of yellow is particularly telling, because it bears on whether colour is perceived in the eyes, or in the brain, after separate signals from the eyes have been combined and processed in some way.
If you flash red on one eye and green on the other eye for a human, the subject "sees" yellow, suggesting that yellow is a brain perception, not an eye perception, because the nerve pathways from the eyes do not come together until the \Jcortex\j. Now it appears that the human visual system can also take a "yellow" stimulus and break it down into its red and green components.
If a flashed red line is superimposed on a moving green bar which is only visible for a short period of time, the flashed red line appears yellow. But if the flashed red line lasts for a longer period, it appears to follow after the moving green bar, but it appears red. The researcher responsible believes that this supports the idea that we "see" yellow in the visual \Jcortex\j, as proposed by Thomas Young, Hermann von Helmholtz, and James Clerk Maxwell. Only this model would explain why motion, which is perceived in the \Jcortex\j in primates, can interfere with the perception of yellow.
#
"Small galaxy disappears",56,0,0,0
(Mar '97)
A burst of intense gamma rays was detected by the Italian-Dutch \Jsatellite\j BeppoSAX, coming from the northwestern part of the \Jconstellation\j Orion. These mysterious events, known since the 1960s, have often been assumed to be caused by violent events on the surfaces of neutron stars, or by collisions between black holes and neutron stars, but this latest event was narrowed down to a section of the sky less than one arc minute across. (For comparison, the \Jsun\j and the moon each appear to stretch across 30 arc minutes, as seen from \Jearth\j.)
Careful observation of the area revealed that a small galaxy in the area had dimmed, and by March 8, the galaxy had disappeared altogether. This suggests that we may now be closer to knowing what causes the gamma ray bursts, but it will take more events, all carefully observed, before all of the wilder theories are laid to rest.
#
"GPS and the Himalayas",57,0,0,0
(Mar '97)
Global Positioning Systems (GPS) were the subject of a major article last month. This month, \INature\i has revealed a new use for GPS: calculating how fast the Indian sub-continent is pushing in under \JNepal\j. The best estimate, based on other measurements, is 20.5 ± 2 mm yr\U-1\u, while GPS delivers an overlapping estimate of 17.52 ± 2 mm yr\U-1\u. The researchers believe there may be enough strain already present in the rocks of western \JNepal\j to trigger another \Jearthquake\j like the Bihar/\JNepal\j quake of 1934, which had a \Jmagnitude\j of 8.1.
#
"Under two suns?",58,0,0,0
(Mar '97)
A dusty disc, possibly the precursor of a planetary system, has been identified around a young and massive binary pair of stars, BD+31°643, a thousand light years away, suggesting that planets may be able to experience sunrise and sunset at the same time, a notion which has long been popular with \J\Jscience fiction\j\j writers. The evidence comes in the form of a symmetrical band of light, around two stars which are about 1 million years old. Since 1984, only beta Pictoris, a main-sequence star, has been known to have such a dust disc, but if this suspicion is confirmed, then the search will be on for more "planetesimals".
#
"Earth getting hotter?",59,0,0,0
(Mar '97)
If you study surface air \Jtemperature\j records across the globe, 1995 was the hottest year on record: clear evidence, say some observers, for global warming. Over the period from 1979 to 1995, the trend has been for warming to occur at the rate of 0.13°C per decade.
Against that, \Jsatellite\j data on the lower \Jtroposphere\j imply a cooling trend of -0.05°C per decade, using data from the Microwave Sounder Unit (MSU). This, say the doubters, casts doubt on the validity of the doom-sayers' predictions. Now it appears that the cooling trend has been mostly generated by sudden drops which occurred on two occasions when there was a change in \Jsatellite\j. These drops, arising from a change in the measuring instrument, have probably produced a spurious cooling trend in the data. The real trend in MSU temperatures, say the latest studies, is likely to be positive, but fairly small.
#
"Ice Age an earlier phenomenon",60,0,0,0
(Mar '97)
We tend to think of \Jice age\js as being a recent phenomenon, something which arose in the days of woolly mammoths and hairy humans, but now we have evidence of two much earlier glaciations, close to the then Equator. These two Precambrian glacial periods left deposits of glacial materials beneath accurately dated \Jlava\j beds, but the present picture is of a very long ice-free period, punctuated by two severe \Jice age\js.
#
"Life on Martian meteorite?",61,0,0,0
(Mar '97)
The famous Martian \Jmeteorite\j ALH84001 was, we were told quite recently, formed at temperatures far too hot to have allowed any life forms, although as recently as August 1996, we were assured that it definitely bore traces of Martian organisms.
Now the \Jtemperature\j of formation may be cool enough for life again. The isotopic ratios vary rapidly from point to point through the rock, and this is being interpreted as evidence that the \Jcarbonates\j in the rock must have formed at relatively low temperatures, perhaps less than 100 degrees \JCelsius\j. The reason, say petrologists, is that at high temperatures these rocks would have been subject to more diffusion, more mixing-in of the various \Jisotopes\j, which would have smoothed the ratios out.
Further evidence comes from the magnetic fields in fragments of crushed crystals. The directions of the field appear to reflect those in the original crystal, but if the rock had ever been heated above about 320 degrees \JCelsius\j, its magnetisation would have been wiped out, and the magnetism would then have been re-established in a new common direction.
The argument continues . . . just a few days later, the team making the original claim were back in the fray, claiming to have detected traces of "biofilms", carbon-based coatings more similar to dental plaque than anything else, which may have been left by \Jbacteria\j at some time in the past. As with all of the other claims, the voice of the sceptic is strong in the land.
#
"Thermoluminescence dating doubts",62,0,0,0
(Mar '97)
Thermoluminescence dating was among the methods which came in for some criticism at the Australian Conference on Archaeometry, held in Sydney during February, but several other methods were also questioned. Science is all about having healthy doubts, but some of those attending later reported that the sceptics had more fun than those being exposed to scepticism.
#
"Weird flier uncovered",63,0,0,0
(Mar '97)
Palaeontology is perhaps the last branch of science, in the era of Big Science, where amateurs also have a part to play. German amateur palaeontologists have collected some remarkable new specimens of the world's oldest known vertebrate flying animal. The 250-million-year-old \ICoelurosauravus jaekeli\i was originally known only from partial specimens, but new and more complete fossils have now been collected by dedicated \Jfossil\j hunters who have willingly made their material available for expert study, allowing the discovery that the bones of the wings were quite different from the structures we now think of as standard.
Instead of extensions of the existing bones of the animal's body, bundles of bony rods formed in the wing's skin and opened like an old Japanese fan, radiating outward from the shoulder area. Just for once, \Jevolution\j seems to have made a breakthrough and solved a problem without adapting a pre-existing structure.
#
"Eye evolution",64,0,0,0
(Mar '97)
Wings may have evolved more than once, but the evidence is getting better to say that the eye evolved just once in the history of life on \Jearth\j. Two years ago, it was shown that a mouse-eye gene, spliced into fruit flies, caused them to grow extra eyes in strange places, suggesting that the mouse and the fruit fly inherited their eye from some distant common ancestor.
Now the procedure has been repeated, with equal success, using a \Jsquid\j's eye gene. Two organisms with the same gene might just have been happenstance, but three organisms? The notion of a common ancestor seems rather more attractive.
#
"Hepatitis G--not guilty?",65,0,0,0
(Mar '97)
Researchers in America appear to have ruled out the mystery virus which is referred to as hepatitis G, as a genuine cause of hepatitis. The virus is found in about 1-2% of blood donors, and seemed to be associated with certain cases of hepatitis. But while the virus looks "clean", some researchers are remaining cautious, saying that it is harder to prove that something \Idoesn't\i cause disease than to prove that something \Idoes\i cause disease.
#
"Flu virus of 1918 identified",66,0,0,0
(Mar '97)
The flu virus which killed more people, soon after the First World War, than died in the whole of that war, has now been identified in lung tissue preserved from the corpse of a 21-year-old American private who died of the disease. The viral RNA extracted from the tissues has proven to be remarkably similar to flu viruses found in pigs. Previously, scientists assumed that human flu viruses came from bird populations: if the killer came from a pig, then health authorities will need to monitor pig populations more carefully.
On the other hand, it may soon be all up for the flu virus, as researchers begin to identify the genes that are needed for the flu virus particles to escape from one cell, and break into another. In another report, two such genes have now been identified, coding for the proteins hemagglutinin and neuraminidase, both of which assist the viruses to break through the cell membrane. These two genes will be tempting targets for those looking to create a wonder drug which strikes at the foundations of the attack strategy of this virus.
#
"Obituary for March 97",67,0,0,0
No deaths of eminent scientists have been reported this month. March may, however, have seen the death of one of the world's deadliest killers, the \Jtobacco\j industry. Each year, an estimated three million people die of \Jtobacco\j-related causes, and by the time today's teen smokers are middle-aged, there will be ten million deaths a year from \Jtobacco\j. The cause of the possible death is the defection of a small American \Jcigarette\j manufacturer, the Liggett Group, maker of Chesterfields, from the ranks of the other manufacturers. Liggett has agreed to hand over certain documents, prompting the other \Jcigarette\j companies' lawyers to start restraining actions. This is clear evidence, say observers, that the documents will prove to be most interesting reading.
#
"Coming events (Mar 97)",68,0,0,0
In the past, it has always been the creationists who have taken the evolutionists to court. While the evolutionists have always won the \Jday\j, the creationists have always bounced back again, in a slightly different guise.
Starting April 7, an Australian geologist, Ian Plimer, committed to the ideals of \Jevolution\j, will be pursuing an Australian creationist through the courts, using \JAustralia\j's federal Trade Practices legislation, alleging that a group called Ark Search Incorporated and its head, Allen Roberts, have engaged in "misleading and deceptive conduct".
As side dishes, an American former creationist has joined in the action, claiming that the Ark Search literature breaches his copyright, and Plimer, who holds a chair in \Jgeology\j at Melbourne University (where he is head of the School of \JEarth\j Sciences), will be questioning Roberts' claim to hold a legitimate \JPh\j.D. from a \JFlorida\j (USA) university which has proved rather hard to locate in any physical way.
An ominous move, from Roberts' point of view, is that a number of leading Australian "creation scientists" are distancing themselves from him, at least in the run up to the case. At the same time, Plimer has been criticised by a number of Australian scientists who claim that, even with an apparently watertight case, Plimer may be defeated by legal manoeuvring, giving the creationist cause their first "win", which they would then interpret as court approval for their point of view. Plimer, on the other hand, says that only in a \J\Jcourt of law\j\j can he challenge Roberts without risking a long \Jdefamation\j suit in front of a possibly confused jury-this case will be before a judge alone.
It looks as though controversial trials in the future will not be complete without a Web site or two. The creationists' case will be found at http://www.christiananswers.net/aig/aighome.html, and the Australian Skeptics will present their views at http://www.skeptics.com.au
Given the strong feelings which are likely to emerge during the case, it will be interesting to see how long it takes for one or other of these sites to be the subject of separate court proceedings, although the creationists' site is already outside any Australian \Jjurisdiction\j, and the skeptics are sure to have friends in other countries who can offer them safe haven.
#
"One hundred years of electrons?",69,0,0,0
(Apr '97)
So far as the history books are concerned, April 30 was the centenary of the \Jelectron\j, since it was on April 30, 1897, that Professor J. J. Thomson revealed to the Royal Institution in London his work on the mass/charge (m/e) ratio for the \Jelectron\j.
But the \Jelectron\j actually has a longer history than that. Around 1860, Julius Plücker in \JGermany\j found that he could deflect the rays in one of Johann Geissler's vacuum tubes with a magnet, which suggested that these "\Jcathode\j rays" were actually made up of a stream of particles. Unfortunately, Johann Hittorf, Plücker's student, was able to show that the rays made sharp shadows, which suggested to people that the "rays" were made of waves like light, and in the end, the German physicists went for the wave viewpoint.
In England, Sir William Crookes had a marvellous time playing with the same sorts of tubes, gaining results which strongly favoured the particle view of \Jcathode\j rays. In \JGermany\j, Heinrich Hertz and Philipp Lenard found that the \Jcathode\j rays from the \Jcathode\j ray tube could pass through a thin metal "window" in the end of the tube, clear evidence, they thought, that the rays really were rays. The English particle camp was divided: Arthur Schuster thought that the particles had to be the same sort of thing as the massive ions released in \Jelectrolysis\j-\Jhydrogen\j ions in our terms, while J. J. Thomson thought otherwise, but stayed fairly quiet.
In January 1897, Emil Wiechert argued that the electrons were particles much smaller than \Jhydrogen\j ions, and in April, Walter Kaufmann gained results which were similar to those Thomson was about to announce, but Kaufmann was worried about the tiny mass value that was implied, so he did not make his findings known, leaving the way clear for Thomson at the end of the month.
The most important point Thomson raised was his observation that the value of m/e (or as we say today, e/m) remained the same, regardless of the gas which was left behind when you pumped out the vacuum tube. Getting a perfect vacuum on \Jearth\j is, of course, impossible, and it was accepted that the electrons came from the gas that was left behind when most of the gas was taken out. If the charged particle was one of the gas atoms, then you would expect to get variations in the e/m ratio as the mass varied.
But was April 30 the true centenary? The word "\Jelectron\j" has been around since 1891. The idea had been formally discussed by James Clerk Maxwell in 1873, and had been mooted by Michael \JFaraday\j even earlier than that.
More importantly, while the 1897 result told us that there was a constant ratio of mass to charge, there was still a problem: was the small value of m/e (in Thomson's terms) due to the small mass of the \Jelectron\j, or did it result from a large value for the charge on the \Jelectron\j? In 1899, Thomson would answer this in part when he published a paper with the title \I"On the Existence of Masses Smaller than the Atoms"\i, showing that the mass of the \Jelectron\j (by then the accepted term) was about 2000 times less than the mass of the lightest atom, \Jhydrogen\j.
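With modern values, Thomson's comparison can be redone in a line or two: if the cathode-ray particle carries the same size of charge as a hydrogen ion, then the ratio of the two e/m values is the inverse ratio of the masses. (A sketch with present-day constants, not Thomson's own numbers.)
E_OVER_M_ELECTRON = 1.76e11   # C/kg, modern value for the electron
E_OVER_M_HYDROGEN = 9.65e7    # C/kg, Faraday constant / molar mass of H

# Assuming equal charges, the mass ratio is the inverse of the e/m ratio.
mass_ratio = E_OVER_M_ELECTRON / E_OVER_M_HYDROGEN
print(round(mass_ratio))   # ~1800: the electron is far lighter than hydrogen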
Others might argue that the real birth of the \Jelectron\j only came later, when Robert Andrews Millikan did his famous oil-drop experiment, but we have now known for certain, for just over a hundred years, that the \Jelectron\j was a particle that we could work with.
When you look around the home, we haven't done too badly with that piece of knowledge during the past century. Quantum mechanics, nuclear physics and \Jelectronics\j have all stemmed from that one key measurement. Your computer, \Jtelevision\j, radio, \J\Jmicrowave oven\j\j and mobile phone all rely on discoveries which followed from Thomson's work.
#
"Europa's ocean",70,0,0,0
(Apr '97)
It is almost twenty years since those first hazy \J\JVoyager project\j\j photos suggested to some scientists that there might be liquid \Jwater\j below Europa's icy surface. It was only on April 9 that we got our first clear evidence that there really is liquid \Jwater\j there, \Jwater\j which might support life. The pictures, released by NASA's Jet Propulsion Laboratory, were taken from 586 kilometres (363 miles) above Europa, and they caused immediate excitement.
Yet this is not surface \Jwater\j, crashing and pounding on the Europan shores, but \Jwater\j deep beneath the ice, where it is protected by a blanket of giant icebergs. So how can we be so sure that there is liquid \Jwater\j when we cannot see it? The pictures show us a terrain that scientists think could only have formed if the \Jwater\j had occasionally erupted through, melting part of the surface and churning up the icebergs as it went. The excitement of this discovery comes from the realisation that where there is \Jwater\j, there could be life, though it still remains an outside chance at best.
Europa is the smallest of the four "Galilean" moons of Jupiter, the moons which were seen by \JGalileo\j Galilei when he turned his primitive \Jtelescope\j to the night sky, and found, on January 7, 1610, what he called the "Medician stars", a bit of unsubtle flattery of the powerful Medici family.
Europa has a diameter of 3138 km, a bit smaller than our own moon, and it is the sixth largest moon in the \J\Jsolar system\j\j, and the smoothest object in the \J\Jsolar system\j\j, with no surface features more than 1 km high. With a visual \Jalbedo\j of 0.64, it is about five times as bright as our own moon, as you would expect from a large ice-coated object.
There are two main types of terrain on Europa. One is mottled brown or grey hills, the other is made up of broad plains, criss-crossed with cracks, some curved and others straight. This second type of surface looks remarkably like the surface of our \JArctic\j Ocean.
The jumbled terrain, say scientists, has ridges that have been squeezed up, rifts and cracks, crumpling and other features which are best explained by a thin layer of ice over a liquid ocean. The ice crust may be "no thicker than 150 km", say the scientists, though others favour a model with just a couple of kilometres of ice over a muddy ocean. There are also a few scientists who think the landforms have come from ice shifting around on ice.
The most telling evidence is the surface itself, which has very few craters, and hardly any large ones, supporting the idea that the surface is very young, and that the shaping processes are still going on. But why should there be any liquid under the ice? The answer lies with tidal warming, the same effect which drives the volcanoes of Io. As Europa swings around Jupiter, tidal pulls from the \Jplanet\j and the other satellites release \Jenergy\j which warms or melts material at different places. While the outside ice is chilled by the cold of space, ice is also a good insulator, as every igloo-dweller knows, so there could easily be \Jwater\j down there somewhere.
Present theories assume that the core of Europa is probably mainly iron and sulphur, rather like Io. The density of Europa is 3.01 g/cm\U3\u, so the core is probably smaller than Io's core. There is also a very thin oxygen \Jatmosphere\j over the surface of Europa.
\JGalileo\j has been flying since October 1989, and reached Jupiter in December 1995, but there is more still to come: Europa gets another visit from \JGalileo\j in November this year, and the extended mission announced recently will involve \JGalileo\j in a further eight visits to Europa, as well as extra visits to Callisto and Io. During the later visits, scientists will be looking for the final proof of liquid \Jwater\j, a \Jgeyser\j of \Jwater\j and mud, bursting through to the surface.
That, they say, would be really something.
#
"Atomic force microscopes",71,0,0,0
(Apr '97)
"No one has ever seen, nor probably ever will see, an atom, but that does not deter the physicist from trying to draw a plan of it, with the aid of such clues to its structure as he has." That, at least, was the opinion of Maria Goeppert Mayer, later to share a Nobel Prize in physics for her detailed understanding of atoms, writing in 1953. No scientist of her \Jday\j would have disagreed with that point of view, but now scientists can image single atoms, they can even move them around and organise them to spell out simple messages. The atomic force \Jmicroscope\j is showing all sorts of new promise.
First described in 1986, this \Jmicroscope\j can resolve objects as small as a nanometre across. It does so by touch, feeling the surface it passes over, rather like an old-fashioned stylus on a vinyl record. Now, even DNA can be examined in this way, according to reports at the American Physical Society's annual meeting in March, too late for results to be included in our last update.
The big problem with dragging a probe over a surface is that you may snag or damage something, but you can get around that by vibrating the tip. Neil H. Thomson reported that this vibrating tip allowed him to get images of RNA polymerase working its way along a DNA molecule to make RNA that would later be translated into a protein, just as the process has always been drawn in the \Jgenetics\j textbooks. The difference? Instead of \Ideducing\i what must be going on, Thomson and his colleagues actually \Isaw\i what was going on.
Tip: watch out for more spectacular discoveries from the atomic force \Jmicroscope\j, or its cousin, the magnetic \Jresonance\j force \Jmicroscope\j, in the next year or two.
#
"Japan's earthquake predictors don't work",72,0,0,0
(Apr '97)
\JJapan\j has been working on better forms of \Jearthquake\j prediction since 1965, and currently spends around US$150 million each year on this research, but now the program appears to be in some doubt. The January 1995 Kobe \Jearthquake\j came in an area which had not been closely monitored, it cost many lives, and it was totally unexpected. So now Japanese officials are beginning to ask if there is any useful future in trying to predict earthquakes.
While the report has been partly leaked, it will not be officially released until the northern summer, so the next few months should see some interesting manoeuvring between those who want to stop the prediction studies, and those who wish to continue them.
#
"Peer pressure and gene pressure?",73,0,0,0
(Apr '97)
A recent study of twins in the USA suggests that there may be a genetic tendency to get a "high" from marijuana. Identical twins are much more similar in their responses than are fraternal (non-identical) twins.
So perhaps peer pressure gets people started, but then \Jgenetics\j takes over, encouraging some to stay with the drug, while others try once, and then decide it is not for them. Next step: identifying the gene involved, and working out how it operates.
Meanwhile, an Israeli study of 141 heroin addicts suggests that there may also be a gene for heroin addiction, a gene which produces a mutated form of a \Jdopamine\j receptor molecule known as D4. The gene was already known to be linked to \Jnicotine\j and alcohol abuse, so the finding is not particularly surprising.
#
"A gene for Alzheimer's?",74,0,0,0
(Apr '97)
The unravelling of Alzheimer's disease gets a step closer with the observation that most patients seem to have high levels of a mutant \Jenzyme\j, \Jcytochrome\j oxidase (CO), which is involved in providing \Jenergy\j to the cells. It has been known for some time that Alzheimer's patients have low levels of brain \Jenergy\j, and while the \Jenzyme\j is present in their brain cells, it does not work as it should, suggesting that it may be mutated in some subtle way.
CO is made up of thirteen proteins, and is found in the \Jmitochondrion\j, which has long been suspected as a key element in the inheritance of Alzheimer's. The disease is more commonly found in the children of Alzheimer's women than in the children of Alzheimer's men, and mitochondria are only inherited from the mother-we get our mitochondria only from the \Jovum\j that we grow from, and never from the sperm that fertilises the \Jovum\j. Most importantly, mitochondria carry a small amount of independent DNA which they need to function properly.
When the mitochondria from Alzheimer's patients were cultured in cells lacking any mitochondrial DNA, the cells had lower \Jenergy\j production, and more oxygen free radicals, harmful molecules which are able to damage cells further. The CO mutations may not be the direct or indirect cause of Alzheimer's disease, but there are definite grounds for a strong suspicion.
#
"Breast cancer genes",75,0,0,0
(Apr '97)
How do cancer genes work? How do they cause cancers in the human body? That is a major question right now, and this month saw some clues in two reports on the second human breast cancer gene to be discovered. BRCA2, as it is called, can be found in a mutant form in breast cancers. If we could only find out how the gene operates normally, we would have some clues as to how the mutant form causes cancer.
The reports seem to give different results: the normal BRCA2 gene is shown in one study to be involved in the control of cell proliferation, while a second study suggests that the gene interacts with a protein called Rad51, which is a DNA repair substance. This finding suggests that the normal gene helps to maintain the cell's DNA.
If the mutant gene deprives a cell of its ability to repair damaged DNA, the cancer cells carrying it should be more easily attacked by radiation.
The research involved breeding mice which lacked the gene. If mice lack both copies of the gene, they die in early development, suggesting that the gene's loss prevents cell growth somehow. The association with Rad51 led researchers to treat 3.5-\Jday\j embryos with radiation, a treatment which killed the embryos with no active BRCA2 gene, while those with one or two copies of the normal gene survived.
Currently, the most popular explanation is that the gene acts in a DNA repair role, and that when this repair role is not there, damage in the DNA accumulates and prevents further growth.
#
"Cancer gene nabbed",76,0,0,0
(Apr '97)
A variety of human cancers, especially the aggressive brain tumors called gliomas, involve cells which lack part or all of \Jchromosome\j 10. This suggests that there may be a tumor suppressor on this \Jchromosome\j, and the hunt has been on to identify it.
New research has revealed a single genetic marker which one group of researchers found to be missing in two breast cancer samples, as well as in some prostate and brain tumor cell lines; a second group found the same marker missing in brain tumor cells.
Dubbed PTEN (\Iphosphatase and tensin homolog deleted on \Jchromosome\j 10\i), or MMAC1 (\Imutated in multiple advanced cancers 1\i), this is by no means the first tumor suppressor gene to be discovered-there are about sixteen others already known-but early indications are that this marker is an important one. The two groups, one publishing in \INature \JGenetics\j\i, the other in \IScience\i, say that the gene appears to be associated with some major cancers of the human body.
#
"Crash victims, new method to identify",77,0,0,0
(Apr '97)
The same issue of \INature \JGenetics\j\i also reports on a genetic method of identifying human fragments found in air crashes. In August 1996, a jet carrying Russian and Ukrainian miners and their families to Spitsbergen Island crashed into a mountain, killing all 141 on board.
The body parts collected at the scene were shipped to Oslo, where it was realised that dental records and fingerprints would be of little help. So a different sort of fingerprinting, DNA fingerprinting, was brought into the investigation.
In just three weeks, the investigators sequenced eight stretches of "junk DNA" from each of the 257 body parts collected, finding a total of 141 unique DNA types. This meant they could now link the separated parts to each other, but it still left the actual identification open. By sequencing blood samples from the victims' relatives, they were able to put names to 139 of the victims, leaving just two to be identified in other ways.
The team was lucky, in that the remains were preserved by the cold conditions at Spitsbergen, but their work shows what can be achieved under the right conditions.
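The matching step is, at heart, a simple grouping problem. Here is a minimal sketch in Python (our illustration only; the report does not describe the Oslo team's actual software, markers or identifiers):
  from collections import defaultdict

  def group_by_profile(parts):
      # Each part is (part_id, profile), where the profile is the tuple of
      # results from the eight stretches of non-coding "junk" DNA.
      # Identical profiles mean the parts come from the same victim.
      groups = defaultdict(list)
      for part_id, profile in parts:
          groups[profile].append(part_id)
      return groups

  # Hypothetical data: parts 1 and 2 share a profile, part 3 does not.
  parts = [(1, ("a", "b", "c", "d", "e", "f", "g", "h")),
           (2, ("a", "b", "c", "d", "e", "f", "g", "h")),
           (3, ("a", "x", "c", "d", "e", "f", "g", "h"))]
  for profile, ids in group_by_profile(parts).items():
      print(ids)   # [1, 2] and [3]: two distinct victims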
#
"Fish antifreeze gene",78,0,0,0
(Apr '97)
The gene for fish antifreeze seems to have evolved twice, independently. The fish of the Antarctic developed their version of the gene for an antifreeze protein around fourteen million years ago, just about the time when the Antarctic Ocean dropped below freezing. The gene in Antarctic notothenioid fish is derived from another gene in the same fish, a gene that produces a digestive \Jenzyme\j, but the antifreeze-coding part of the new gene arose from junk DNA, apparently useless stretches of code which are found in most sequences of DNA.
In \JArctic\j cod, there is a very similar protein, but in that fish, the antifreeze gene cannot be linked to any known sequence in the cod, leading scientists to conclude that the gene must have evolved in a different way at a different time, a remarkable piece of convergent \Jevolution\j.
#
"Life is fractal",79,0,0,0
(Apr '97)
To many scientists, the meaning of life is more important than the physics of life. In physics, most measurements are compared by the way they differ in order of \Jmagnitude\j, that is, by powers of ten. From a microbe to a whale, there is a difference of about 21 orders of \Jmagnitude\j-a whale is 10\U21\u times as massive as a microbe.
The importance of this observation is that many measurable features of living things are related mathematically to the body mass of an adult. Metabolic rate varies as the 3/4 power of mass, so that larger creatures have slower metabolisms for their size, while biological times such as life span, age at sexual maturation and length of \Jpregnancy\j vary as roughly the 1/4 power of mass, and rates such as the heartbeat vary as the -1/4 power of mass.
Logically, if these features simply reflected the three dimensions of space, the exponents should be multiples of 1/3-cube and cube-root relationships. Ecologists James Brown and Brian Enquist of the University of New Mexico, and physicist Geoffrey West of Los Alamos National Laboratory, have been exploring this, and find that they can account for all of these quarter-power oddities by modelling an animal's supply system as a fractal network of branching tubes. Life, it seems, is fractal.
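A worked example makes those quarter-power rules concrete. This little Python sketch (ours, using the exponents quoted above and round numbers for the animals) compares a 30 gram mouse with a 3 tonne elephant:
  mass_ratio = 3.0e6 / 30.0    # elephant mass / mouse mass, both in grams: 100 000

  print(mass_ratio ** 0.75)    # metabolic rate: about 5600 times higher
  print(mass_ratio ** 0.25)    # life span and other biological times: about 18 times longer
  print(mass_ratio ** -0.25)   # rates such as the heartbeat: about 18 times slower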
#
"Plants on the march?",80,0,0,0
(Apr '97)
Last month, we looked at the problems which came from changing from one \Jsatellite\j to another. The changes between infrared systems on three satellites led scientists to think there was no apparent change in \Jearth\j temperatures, when it now appears that the \Jearth\j was heating up after all. The differences, it seems, were hidden in the changeovers between satellites.
This month, a report suggests that the \Jearth\j may be getting warmer, but this too is based on several sets of readings taken from different satellites which may or may not have been accurately calibrated. If the results are reliable, they show a huge increase in vegetation during the 1980s.
The National Oceanic and Atmospheric Administration (NOAA) launched a series of satellites, and data from three, NOAA-7, NOAA-9, and NOAA-11, indicate that plant life around the world has increased hugely during the decade. In some cases, the increase appears to be as great as 10% in the areas between 45° and 70° North, where snow has been disappearing earlier in spring.
Given the earlier example, a number of scientists are viewing the whole matter rather cautiously.
#
"Clean cars get further away",81,0,0,0
(Apr '97)
Last month also saw a close look at the clean electric cars which we hope will be around in the future. In that article, we indicated that the targets for electric car developments in the USA would probably be delayed, and now the \JWhite House\j's Partnership for a New Generation of Vehicles (PNGV) is also beginning to have its wheels drop off.
Targeting more efficient use of conventional fuels, the PNGV plans set targets, but a new review suggests that these targets, relating to particles emitted and fuel use, are just not feasible. The quick version: don't hold your breath while you are waiting for clean air.
#
"New ancestral apes?",82,0,0,0
(Apr '97)
April saw two claims for ancient ancestors of the \Jape\j line. \IKenyapithecus\i finds on an island in Lake Victoria, \JKenya\j, point to this previously discovered \Jape\j being on the line leading to modern apes. At 14 million years, though, that claim may matter less, because this month also saw a claim for another \Jape\j ancestor, reliably dated at 20.6 million years.
The new candidate, a large tree-dwelling \Jape\j called \IMorotopithecus\i, was already known from \Jfossil\j pieces found at Moroto in \JUganda\j, but newly described shoulder and leg bones indicate that it had the body form of an \Jape\j, and that it was in the habit of hanging from its arms like modern apes.
#
"Adaptive radiation",83,0,0,0
(Apr '97)
Adaptive radiation is an idea that Charles Darwin developed while he was looking at finches on the Galapagos Islands. The idea is that one species, having arrived on an unexploited island, divided up into several groups which specialised in feeding in different ways, and before you know where you are, new species have arisen.
Nobody has ever actually seen it happen, but we can see traces of it having happened in most island groups. Nobody ever expected to see it happen either, because people have always assumed that the time scale would be too long. All the same, those who considered the issue have generally had strong views on whether genetic drift or natural selection would play a larger part in selecting the new species.
Enter a group of scientists wanting to study \Jextinction\j on a small island. They took a small population of \IAnolis sagrei\i lizards from the \JBahamas\j, natives of Staniel Cay, and transferred some of them to other islands in the area. These lizards live on tree trunks, but their new islands were almost treeless, and this should have doomed them to \Jextinction\j, but this did not happen. So the researchers decided to study the populations, and see how they changed with time.
Anole lizards which live on bushes have shorter hind legs: they lose speed, but they gain agility, and the Staniel Cay lizards were all long-legged. Yet after just 10 to 14 years, the transplanted lizards have not only survived, they now have shorter hind legs than their ancestors. This change is so rapid that some people have been drawn to suggest that the variation is environmental, rather than inherited-a bit like the muscles of a body-builder.
Most importantly, the lizards differed more on those islands where the vegetation was most different, making it more likely that the observed differences were due to inherited changes.
#
"A snake with legs?",84,0,0,0
(Apr '97)
Everybody accepts that the snakes arose from lizard-like ancestors, but up until now, nobody had much idea what the ancestors of the snakes looked like. M W Caldwell and M S Y Lee reported in \INature\i this month that a Cretaceous \Jreptile\j, \IPachyrhachis problematicus\i, previously interpreted as a varanoid lizard like the Australian goannas and the Komodo dragon, is actually a primitive snake with a well developed pelvis and hind limbs.
The fossilised skulls show most of the derived features of modern snakes, and a body that is slender and elongated. But unlike other snakes, \IPachyrhachis\i still had well developed hips and full hind legs.
#
"Jurassic Park ruled out",85,0,0,0
(Apr '97)
The basis of the "Jurassic Park" movies is that DNA from mosquitoes trapped in amber can be used to "clone" dinosaurs. Given the recent news about Dolly the sheep (see last month), the chances looked even better. A detailed study, published in April and carried out at London's Natural History Museum, reveals that no credibly ancient DNA has been extracted from Dominican amber.
On the good news front, a US Patent was announced for a method of collecting ancient \Jbacteria\j and fungi, some of which may be able to provide new variations on the standard \Jantibiotics\j. The process involves cleaning the amber, cracking it under liquid \Jnitrogen\j, and extracting material from stingless bees' stomachs. The bees have been in the amber for 25 million years, so any \Jantibiotics\j would almost certainly be unknown to modern \Jbacteria\j.
#
"Antibiotic-resistant bug found",86,0,0,0
(Apr '97)
A report came in from \JJapan\j this month, indicating the isolation of a strain of \IStaphylococcus aureus\i which is resistant to all known \Jantibiotics\j, including the "last resort" antibiotic, vancomycin. A similar report came from \JCaracas\j three years ago, but that incident was never confirmed; this time, there appears to be no doubt about it. The superbugs are now not just a bacteriologist's nightmare.
The implications are worrying: people can now, for the first time in half a century, get a "staph" infection which cannot be treated.
Paralleling this report, news was received this month of an industrial accident in which a worker in the poultry industry was wounded and infected with vancomycin-resistant \Jbacteria\j. It appears that the \Jbacteria\j may have encountered avoparcin, an antibiotic similar to vancomycin which is routinely fed to chickens, and so become resistant to vancomycin as well.
#
"Good bugs in our computer displays?",87,0,0,0
(Apr '97)
\IHalobacterium salinarium\i is a bacterium which grows in salt marshes. This organism produces an intensely purple protein pigment in its membrane called bacterial rhodopsin or bacteriorhodopsin. This protein is photosynthetic, and can be used by the bacterium to produce \Jenergy\j from sunlight.
It operates by "pumping" protons across a membrane barrier, and this has made the pigment a matter of intense interest for scientists in a number of fields. A thin film of the molecules produces an electrical field when light shines on it, allowing the pigment to be used in light detectors, for example.
In a similar way, an electric field acting on a film of bacteriorhodopsin causes the film to change colour, an effect known as electrochromism. While the usual change is a low-contrast shift from purple to blue, mutants of the bacterium are now known which produce a much greater contrast, going from pale blue to yellow.
The displays in laptop computers chew up most of their power lighting up the screen, so a screen made with material like this, and requiring no lighting, would be remarkably valuable. Already, people are beginning to talk about the mutant bacteriorhodopsin as "electronic ink", and wonder if this may not be the start of the electronic book.
#
"Bacteria on a blue chip",88,0,0,0
(Apr '97)
Getting ahead of ourselves very slightly, as this issue went to publication in mid-May, news came through that genetically altered \Jbacteria\j are being added to chips as a way of detecting impurities. The current bugs glow blue when they are in the presence of \Jnaphthalene\j: when they are placed in a porous matrix on a chip, their glow causes the chip to trigger an alarm.
Perhaps not quite as attractive as the canaries that Welsh miners used to take down the mine with them, but far more specific, these chips could be used to detect a wide range of environmental pollutants.
#
"Kasparov vs Deep Blue",89,0,0,0
(Apr '97)
Garry Kasparov has been playing chess against Deep Blue once again. At first, the two seemed fairly evenly matched, though there are claims that the machine should now have a rating of 3200, against Kasparov's 2800 rating.
We will bring you a full report of this contest next month, but on May 11, with the score at three draws and a win each, Kasparov and Deep Blue faced off for a final game, which Kasparov resigned after just nineteen moves. He later suggested that he would have beaten Deep Blue in tournament play, but nonetheless, a computer has beaten the world's best human chess player in match play for the first time.
#
"Cheetahs slower",90,0,0,0
(Apr '97)
How fast is the fastest animal? According to most references, including the Webster's World Encyclopaedia, the cheetah can turn on an impressive 70 mph, or about 113 kph, but recent studies have shown that this standard estimate is a mild exaggeration.
The cheetah remains the fastest animal on \Jearth\j, but the official record has now been corrected down to 65 mph (all of the measurements taken were in British units), which translates to about 105 kph-still a respectable clip to travel at.
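The conversion itself is simple arithmetic, as this quick Python check shows:
  MPH_TO_KPH = 1.609344          # kilometres in a statute mile
  print(70 * MPH_TO_KPH)         # about 113 kph, the old figure
  print(65 * MPH_TO_KPH)         # about 105 kph, the corrected record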
#
"Hubble trouble?",91,0,0,0
(Apr '97)
Three of the four new instruments installed in the Hubble Space \Jtelescope\j are working fine after the recent upgrade, but NICMOS, the Near Infrared Camera and Multi-Object Spectrometer, has caused focus problems, leading to blurry pictures. If this cannot be fixed soon, it will begin to delay a number of studies, including work on young galaxies and on Pluto's moon Charon.
The problem may lie in the NICMOS cooling system-the camera is supposed to operate at -215°C, and will be sensitive to any slight failing in the cooling. The result would be fuzzy pictures which will still be of some use, though not as much as scientists had hoped.
#
"Volcano erupts in Australia",92,0,0,0
(Apr '97)
\JAustralia\j's second active \Jvolcano\j was discovered this month when a ship was sailing past McDonald Island, 4500 km southwest of Perth. The scientists aboard \IAurora Australis\i report seeing steam rising from the island, about 30 km from \JAustralia\j's other \Jvolcano\j, on Heard Island.
#
"Is the universe twisted?",93,0,0,0
(Apr '97)
One of the most basic rules of modern physics is Einstein's assertion that the laws of science apply in the same way everywhere. That being so, you would expect the radio waves reaching \Jearth\j from other galaxies to be randomly polarized-instead, the 160 known radio-emitters seem to have a bit of a twist to the way they are polarized. For the moment, the general opinion seems to be that this is interesting, but nobody seems to be rushing to bring out a revision of Einstein's work.
The study, by Borge Nodland and John Ralston, is reported in \IPhysical Review Letters\i. For those wishing to seek more information on the Web, key words to use in your search include Socorro and Sextans.
#
"Controversy in space",94,0,0,0
(May '97)
In 1981, a NASA rocket was launched, carrying two Dynamics Explorer satellites into elliptical orbits. One of these, in a polar \Jorbit\j swooping down to 550 kilometres and climbing to 23 500 kilometres, carried an ultraviolet camera designed to gather information about auroral lights at the north and south \Jpoles\j.
Using false-color photography, the camera provided spectacular images of the \Jearth\j's "dayglow", an ultraviolet glow which results from interactions between the light of the \Jsun\j and atomic oxygen, high in the \Jearth\j's \Jatmosphere\j.
The pictures were superb, but marred by small black spots which appeared randomly across the pictures of our \Jplanet\j that were transmitted back to \Jearth\j. At first the speckles were assumed to be "noise", random interference in the camera system, possibly coming from a faulty component, but no cause could be found for them. Louis Frank and an undergraduate student, John Sigwarth, started to investigate more and more carefully.
They decided that the speckles had to be coming from a source on the ground, or a source on the \Jsatellite\j, or from radio interference. After eliminating all of the equipment on the ground, they ruled out radio interference because text information from the \Jsatellite\j came through cleanly. They began to suspect the on-board light counters which slowly built up each picture. There were two of these in the camera, and they transmitted alternate pixels, so maybe there was a fault in one of them that was creating the speckles.
It was a simple enough task to extract the two sets of pixels, and show that the spots appeared equally in both sets, so neither counter alone could be at fault. Next, they looked to see if the spots were constant between shots, and found indications that the spots were still there, but that they moved and changed between one picture and the next.
Sigwarth programmed the camera to concentrate on just a small area, and showed that the spots did in fact remain, that they moved, and that they changed. There was no way that these spots could be set down as "noise" any more: they were real objects, and they needed to be explained. There was something moving between the \Jplanet\j and the \Jsatellite\j.
Now in a movie, this would be the point where a mysterious "expert" identifies the speckles as UFOs, and forbids any further investigation. In real life, scientists work like scientists, and deduce what the speckles must be, free from interference. The main point, Sigwarth and Frank realised, was that the speckles moved in the same direction across the face of the \Jearth\j. Whatever they were, they were moving objects which caused a disturbance that absorbed ultraviolet radiation, and the uniform motion suggested that these objects were small meteors of some sort.
Meteoric dust is generally travelling around the \Jsun\j faster than the \Jearth\j, so it catches up with the \Jearth\j from the "evening" side, showing what astronomers call prograde motion. That was what these things were doing, but what sort of \Jmeteor\j would cause large-scale absorption of ultraviolet?
In May 1983, just after graduating, Sigwarth presented a paper on their findings. The conservatively worded paper, titled "Atmospheric Holes Possibly Associated with Meteors", drew a small audience, but over the next three years they gave three more papers, to progressively greater interest.
As they continued trying to identify the cause, they eliminated dust effects, and they decided that \Jchemical reaction\js removing the atomic oxygen that produces the dayglow were too unlikely, so that meant some other absorber. It had to be a substance common in space, which made \Jwater\j look like a candidate, and they found that \Jwater\j vapor absorbs ultraviolet at just those wavelengths. Each object, they estimated, would have to weigh about 100 tonnes, and about ten million of them would have to arrive each year.
Now here they ran into a problem. A billion tonnes of \Jwater\j a year, raining down on \Jearth\j? No way, said the experts, it just doesn't happen. Of course, this is usually the way in science: somebody comes up with a new idea, the majority oppose it, and over time, either the evidence is gathered to prove the new idea has "legs", or more often, some crucial piece of evidence is produced to rule the new idea out, once and for all. But while the argument progresses, especially if the idea stands up to the first attacks, the exchanges can become more bitter and abrasive.
Louis Frank says that if he knew somebody else with a way-out theory like this, he would urge them to drop it, but being involved, he realised that if he let the idea go, and it turned out in the end to be true, many scientists would have wasted time. He could not simply walk away from the truth as he saw it.
You can point to different scientists and say "They laughed at X", but while we recall the occasional X who was right, we never hear about the five hundred scientists called A, B, C, and so on, who were laughed at, proved wrong, and never heard from again. And worse was to come, for Frank realised that these things they were calling meteors just had to be comets.
This was hot controversy, and the decision to publish the comets claim, and to explore the possibilities, was taken in spite of contrary recommendations from the referees of the journal in which the "story" broke, \IGeophysical Research Letters\i. One of the referees was anonymous (as is usually the case), but the other was known to Frank, and he urged him not to publish: apart from anything else, if the rain had been constant, it would have supplied enough \Jwater\j over the years to fill all of our oceans. And if that was the case, where were the oceans on the moon and the other planets?
The reaction was strong. "I was driving a bulldozer through dozens of the neatly planted fields of science and everyone was upset," said Frank afterwards. After that, the idea just sat there, not going away, but not wildly popular either.
In May 1997, Frank was able to display images which show there really are objects out there, but there is still considerable argument from the experts as to what the objects are. The new images are of much finer resolution, and rather than showing single-pixel spots which might still have been noise, they show features ten to twenty pixels across, visible in consecutive exposures.
If we have twenty comets a minute smashing into the \Jatmosphere\j, say the experts, they should light up the night sky, and they raise other problems as well. But in an interesting variation on the usual model for a paradigm shift in science, where one theory is overthrown by another, the word seems to have got out to the public.
This could be worth watching over the next two or three years: a paradigm shift driven by public opinion would be a paradigm shift in the way science works as well.
For those who wish to see more of Louis Frank's case, including a range of spectacular pictures, he has a Web site at http://smallcomets.physics.uiowa.edu
In the mean time, it is worth noting that the calculated level of \Jwater\j entry would raise the sea level by a \Jcentimetre\j every 4000 years or so. If that is the case, then maybe we are due for sea level rises, even if global warming is not happening!
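That figure is easy to check for yourself. Taking the standard rounded value of about 3.6 x 10\U14\u square metres for the area of the world's oceans, the arithmetic runs like this:
  water_per_year = 1.0e9    # tonnes of water, i.e. a billion cubic metres
  ocean_area = 3.6e14       # square metres of ocean surface
  rise_per_year = water_per_year / ocean_area   # metres per year
  print(0.01 / rise_per_year)   # about 3600 years to rise one centimetre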
#
"Global warming theory in doubt",95,0,0,0
(May '97)
The American Geophysical Union-the society where John Sigwarth presented the first paper on the mysterious comets, and the publisher of \IGeophysical Research Letters\i-met in Baltimore in May, where new measurements on global warming were presented, throwing some minor doubts on our present models for the nature of global warming.
Up until now, atmospheric temperatures have not risen as fast as predictions said they should. One speculation was that a haze of sulphates (produced during the burning of \Jfossil\j fuels) in the upper \Jatmosphere\j was acting as a blocker, reflecting some of the \Jsun\j's light.
Atmospheric scientist Peter Hobbs of the University of Washington, Seattle, has been analysing the chemistry and reflective properties of the particles. He reports that he found more carbon particles than sulphate particles. As carbon absorbs more than it reflects, this should warm the \Jatmosphere\j, rather than cool it. A possible explanation may be that the carbon particles, although they are absorbers, may be acting as seeds for \Jcloud\j formation, and clouds most certainly \Ido\i act as reflectors of light.
#
"Ice Ages theory gains ground",96,0,0,0
(May '97)
Sometimes the news for a theory is good, sometimes it is bad-in either case, it is just another example of the way science works. Scientists may argue passionately for their favorite theory, but in the end, it is facts which will decide the theory that will be accepted.
Once, people accepted quite happily that the main cause of Ice Ages was small variations in the \JEarth\j's \Jorbit\j. Then, over about the last decade, this theory has been in doubt-until now, when some good news has come through for the \Jorbit\j fluctuations, based on more careful checking of the dates of ancient corals.
The problem came from carbonate deposits brought up from Devil's Hole in Nevada in the USA, deposits which seemed to record warming at the wrong times, given what astronomers know of the patterns of the \Jearth\j's \Jorbit\j. Now Lawrence Edwards of the University of \JMinnesota\j and his colleagues have used a new clock, based on the decay of uranium 235 to protactinium 231, counting the individual atoms by mass spectrometry, and they have also dated \Jcoral\j records of sea-level change from \JBarbados\j. Sea levels fall in an \J\Jice age\j\j, as more and more of the \Jearth\j's \Jwater\j is tied up as glaciers over the land.
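As a rough sketch of how such a protactinium clock works, here is a Python fragment (assuming a closed system which starts with no protactinium; the real work, counting individual atoms by mass spectrometry, involves more careful corrections):
  import math

  PA231_HALF_LIFE = 3.28e4    # years; uranium 235's half-life of 7.0e8 years
                              # is so long that its activity barely changes
  lam = math.log(2) / PA231_HALF_LIFE

  def age_from_activity_ratio(ratio):
      # The Pa-231/U-235 activity ratio grows from 0 towards equilibrium
      # at 1 as: ratio = 1 - exp(-lambda * t). Invert this for the age t.
      return -math.log(1.0 - ratio) / lam

  print(age_from_activity_ratio(0.92))   # about 119 000 years: interglacial territory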
The good news is that the \JBarbados\j corals show \Jice age\js at the right place, with the last interglacial at around 129 000 to 120 000 years before the present. The bad news: the Devil's Hole data stand up as well, suggesting that these must have recorded a more local warming, but nobody is sure why this is so.
#
"Shoemaker-Levy--an ominous sign?",97,0,0,0
(May '97)
So we are being pelted with tiny comets, and the \Jweather\j has gone feral on us-what hope is there for us? \JComet\j Shoemaker-Levy, which ploughed so spectacularly into Jupiter in 1994, is estimated to have been some millions of times larger than the small \Jwater\j comets that Louis Frank contemplates, so we should be comparatively safe, unless another Cretaceous-Tertiary boulder hits us, right?
Wrong. There are plenty of other large craters on \Jearth\j, and the estimated 10 km diameter boulder that probably wiped out the dinosaurs is not alone. Now some cheerful workers have used a new supercomputer, under development at the Sandia National Laboratory, to simulate such a hit: the formation of the suspected K-T crater, Chicxulub, on the Yucatán peninsula in Mexico.
Rather than using a Hale-Bopp sized object (weighing in at some ten trillion tonnes), they used a conservative 1 billion tonne rock, the sort that hits the \Jearth\j about once every 300 000 years. After 48 hours of processing, they came up with two animated movies showing the \Jcomet\j sliding in at an angle of 45° and hitting the ocean. It hits with an \Jenergy\j of some 300 gigatonnes of TNT-about ten times more than all of the nuclear weapons in the world at the height of the \J\JCold War\j\j.
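As a cross-check on that figure, here is our own back-of-the-envelope arithmetic in Python, not Sandia's, with the impact speed as an assumption:
  mass = 1.0e12      # kg: a "1 billion tonne" comet
  speed = 5.0e4      # m/s: about 50 km/s, a typical cometary impact speed
  tnt = 4.184e9      # joules released by one tonne of TNT

  energy = 0.5 * mass * speed ** 2            # kinetic energy in joules
  print(energy / tnt / 1.0e9, "gigatonnes")   # about 300 gigatonnes of TNT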
The "splash" involves the vaporisation of between 300 and 500 cubic kilometres of ocean, some of it blasting into space. But if you think that sounds like bad news, there is worse. The same computer code was used two years ago to predict what would happen when Shoemaker-Levy hit Jupiter, and produced a good model of what happened when the \Jcomet\j eventually hit the giant \Jplanet\j. So the simulation is not only spectacular, it is almost certainly highly accurate. Catch the movie if you can: it should be better than the real thing.
#
"Hans Bethe speaks against hydrogen bomb",98,0,0,0
(May '97)
Hans Bethe, one of the original builders of the atomic bomb, is now 90. It is almost sixty years since he worked out how nuclear fusion powers the \Jsun\j, and he is still worrying about the same issue.
The winner of the Nobel Prize for Physics in 1967 says that it is time to stop work on any further nuclear weapons. In a letter sent to President Clinton in April, released to the public in May, Bethe called for the end of all physical nuclear tests, no matter how small, and an end to "computational experiments or even creative thought designed to produce new categories of weapons".
Bethe's letter was released this month by the Federation of American Scientists, as the US Senate was getting ready to debate the ratification of the Comprehensive Test Ban Treaty. The Federation believes that the Treaty, even if ratified, could leave the way open for the military to develop "pure" fusion weapons which do not use a fission device as their trigger.
It is well-known that the standard fusion device uses a fission explosion to get it started. It is less well-known (in fact, many people believe it is still a secret) that the fission explosion produces an intense burst of x-rays, and that these are then focused (in a method which thankfully does remain rather more secret) onto the fusion fuel, which is usually \Jlithium\j deuteride, starting the fusion process which we call a "\Jhydrogen\j bomb". (\JDeuterium\j is an isotope of \Jhydrogen\j with twice the usual mass.)
One of the drawbacks of this form of explosion is that it produces \J\Jradioactive fallout\j\j from the fission trigger, while a "pure fusion" bomb would presumably be a great deal "cleaner". Bethe believes that such a development is unlikely, but he is concerned that if people even contemplate such a study, then nuclear disarmament plans could be placed at risk, right across the world.
Court action was also started against the National Ignition Facility, a facility planned for the Lawrence Livermore National Laboratory which aims to ignite fusion without a fission trigger, and against other nuclear weapons programs under the control of the US Department of \JEnergy\j. The action was launched in the US District Court by the Natural Resources Defense Council and 38 other environmental and activist groups.
Bethe's argument against the National Ignition Facility has also been supported by Herbert York, formerly the director of the Livermore Laboratory. "You can't stop people from thinking," he said, "but you can tell them you're not going to pay them for it."
#
"AIDS updates",99,0,0,0
(May '97)
\BTriple whammy\b
A potent combination of three drugs seems likely to be effective in reducing the levels of \JHIV\j in the \Jlymph\j nodes, a major reservoir for the virus. The aim is to attack two enzymes that the virus requires. One of these, \Ireverse transcriptase\i (RT), is used to convert the viral "\Jchromosome\j" RNA to a DNA form, which allows the virus to lie concealed as DNA. (The "normal" transcription is from DNA to RNA, hence the name given to this \Jenzyme\j.)
The second \Jenzyme\j is \Iprotease\i, an \Jenzyme\j which breaks down proteins. In the treatment, \JHIV\j patients (that is, people who were \JHIV\j-1 seropositive) were given two RT inhibitors, AZT and 3TC, along with a protease inhibitor called ritonavir. In the study, the three-drug mix lowered the viral concentrations by 99% in the \Jtonsils\j, a type of \Jlymph\j node. There are reasonable grounds to think that the same results might be achieved in the major \Jlymph\j nodes, which up until now have been out of reach for treatments.
But while there is hope, nobody has yet been cured, and the lowered levels will probably rely on continuing the treatment. And more importantly, there seems to be no ready way of dealing with the cells which contain \JHIV\j DNA, which can break out at any time, producing more virus particles to reinfect the body.
\BAIDS vaccine plan from Clinton\b
President Clinton, echoing President Kennedy's target for a moon landing, has called for a ten-year timetable to create an AIDS vaccine. Some researchers feel that the timetable is folly, that the vaccine will come when certain unknown discoveries are made at an unknown time, while the moon landing program was a straightforward exercise in \Jengineering\j. Others feel that the timetable adds a sense of urgency to the issue of finding a vaccine.
\BBaltimore challenges the standard view\b
Nobel laureate David Baltimore suggested during May that \JHIV\j may actually disarm the cytotoxic T-lymphocytes (CTLs), white blood cells which are commonly thought to seek out and destroy \JHIV\j-infected cells, all on their own. If Baltimore is right, this could explain why some experimental vaccines work in monkeys.
The immune system can only make CTLs to attack a specific pathogen if bits of that pathogen are held on the surface of the infected cell, acting as a flag to indicate that the cell has been taken over. According to research that Baltimore described, an \JHIV\j protein called nef seems to block the cells from sending this signal.
If monkeys and humans are infected with \JHIV\j strains that lack the nef gene, they do not seem to be affected in the usual way, presumably because the immune system is able to stimulate the production of CTLs which will then seek out and destroy any infected cells.
David Baltimore has announced that he will be moving to the \JCalifornia\j Institute of Technology as its new president. He will, however, remain as the head of the AIDS vaccine advisory committee at the National Institutes of Health, a post which he only recently took up.
#
"Genetics news",100,0,0,0
(May '97)
\BWatson and Hitler\b
Another Nobel laureate in the news addressed a molecular medicine conference in \JGermany\j, and raised howls of annoyance over his comments. James D. Watson was the co-discoverer of the structure of DNA, and the founder of the Human \JGenome\j project, so when he urged \JGermany\j to be less hostile to \Jgenetics\j research and to focus on the great benefits that applying \Jgenome\j research can offer humankind, it might have been better if he had not added that it is time to " . . . put Hitler behind us."
The Nazi regime of Adolf Hitler hijacked the name "\Jeugenics\j", and used it to justify all sorts of atrocities, including the infamous \Jconcentration camp\js, and the German reaction to anything involving \Jgenetics\j has been colored by this ever since. After making even more contentious remarks about the Nazi era, and the fate of those who served the regime, Watson turned to what the Germans are doing about the human \Jgenome\j project. German funding for the project, he suggested, was too low. "Your budget is still totally inadequate for \JGermany\j to have a real impact" he said. "You are putting money in to use the \Jgenome\j, not to get it."
#
"Gene patents",101,0,0,0
(May '97)
European patent law states that an inventor cannot take out a patent on any discovery that has already been made known to the public. One result of this is that DNA data, once published, are no longer patentable under European law. In the United States, discoverers are allowed a one-year "period of grace" in which to prepare and lodge a patent application after the results in question have been reported in the scientific literature.
The members of the international Human \JGenome\j Organisation (\JHUGO\j) have called for a similar provision to be offered to European scientists, so they can report their results just as fast as they get them. Failure to allow this could undermine the whole standard of scientific cooperation that scientists accept as normal.
At the same time, \JHUGO\j's 10-member intellectual-property committee, which made the plea, also criticised the American Patent and Trademark Office, whose director has suggested that patents should be granted on short stretches of genes known as "expressed sequence tags". This, they say, could mean that somebody who merely describes a sequence, without identifying its function, would be able to lay claim to it.
#
"A complete genome",102,0,0,0
(May '97)
As an example of the sort of cooperation which scientists regard as normal, May 29 saw the first publication of a complete \Jgenome\j of a eucaryotic organism: in this case, the yeast \ISaccharomyces cerevisiae\i, which was published as a separate supplement to the week's issue of \INature\i.
\BGenomics\b
But once we have the information, how do we use it? During May, a consortium of companies stitched up a 5-year $US40 million deal to develop what they call "functional genomics" at the \JMassachusetts\j Institute of Technology (MIT).
There are probably 60 000 to 100 000 genes in the human \Jgenome\j, and the Human \JGenome\j Project has brought us to the stage where we have a rough map of the 3 billion nucleotides in human DNA. These maps are studded with thousands of the landmarks called "sequence tagged sites" that the director of the US Patents and Trade Marks Office thinks ought to be patentable.
Now it is time to put that information to work in biomedicine, and to speed up the identification of sequences. The companies in the consortium will get access to the new technologies that will develop during the program. This area of science may well soon be part of industry and business, rather than a part of science.
#
"DNA--double double helix",103,0,0,0
(May '97)
Just when we think we know what it is all about, as we sequence genes, clone sheep, insert genes into other animals and more, up comes a surprise nobody ever expected. The name "double helix" is synonymous with DNA for all scientists, and even for those lay people with some general interest in science, but now DNA has been spotted which does not fit this label. Instead, it is a quadruple helix.
This is not entirely a new phenomenon, because some types of synthetic DNA have been known to form quadruple strands, but these were unusual forms of DNA, and some researchers thought the quadruple strand could have been formed during the preparation of the samples. Now a report from Stephen Salisbury and his colleagues at the Cambridge Crystallographic Data Centre describes a quadruple \Jlinkage\j in natural DNA.
In normal DNA, the two strands are linked when a \Jthymine\j and an \Jadenine\j form an AT \Jlinkage\j, or when a \Jcytosine\j and a \Jguanine\j form a CG \Jlinkage\j. In the quadruple strand, two AT pairs in one double helix have managed to link to two TA pairs in the other double helix. A positive sodium ion "glues" the arrangement by exerting a pull on four negatively charged oxygen atoms on the four \Jthymine\j molecules.
The conditions in which the quadruple strand formed are fairly normal, leading Salisbury to speculate that such arrangements might play a part in real life, perhaps in the process of "crossing over" which occurs as gametes are formed, when \Jchromosome\j pairs swap material with each other. These exchanges of material need to happen at exactly the right place, or faulty chromosomes will be formed. That means that the chromosomes need to be lined up perfectly, so maybe the quadruple strand is something that happens in all sorts of cells, all the time, ensuring the formation of perfect genetic sets of sperm cells and ova.
#
"A gene for ADHD?",104,0,0,0
(May '97)
Cynical teachers have remarked for years that attention-deficit \Jhyperactivity\j disorder (ADHD) is caused by a failure of parents to pay due attention to their offspring, that the problem is hypoactive parents, rather than hyperactive children. This attitude is due in large part to the enthusiasm with which some parents seek to have their children diagnosed as "ADD", but behind that, there is a genuine condition which affects around 4% to 6% of school-age children. These children fidget, fiddle, call out impulsively in class, and have a short attention span.
Previous research has indicated that there could be multiple genes involved, with chromosomes 5, 6 and 11 all being identified as sites for such genes. That research, however, did not indicate whether the condition was an "all or nothing" problem, or whether ADHD came in a variety of levels, possibly requiring different responses or levels of response.
In the June issue of the \IJournal of the American Academy of Child & Adolescent \JPsychiatry\j\i, an Australian researcher, Florence Levy of the Prince of Wales and Sydney Children's \JHospital\j, reports on a survey of almost 2000 families with children aged between 4 and 12. In the survey, Levy and her team asked parents to rate their children on fourteen ADHD symptoms identified by the American Psychiatric Association.
By comparing ordinary siblings, non-identical twins and identical twins, the team were able to separate genetic influences from environmental ones. They conclude that genetic factors account for 75% of the variability, much higher than the levels seen with other behavioral problems that are inherited, like \Jschizophrenia\j or \Jalcoholism\j.
More interestingly, those groups with more of the fourteen symptoms did not seem to show increased levels of heritability, suggesting that ADHD is not a case of an "all or nothing" condition. (If it were an "all or nothing" situation, the probability of inheriting should be increased when those involved carried a high genetic load.)
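For readers curious how twin comparisons can yield such a figure, here is a minimal Python sketch using Falconer's classic estimator; the correlations below are hypothetical, and Levy's team may well have used a more sophisticated model:
  def falconer_h2(r_mz, r_dz):
      # Identical (MZ) twins share essentially all their genes, fraternal
      # (DZ) twins about half, so twice the difference between their trait
      # correlations estimates the genetic share of the variability.
      return 2.0 * (r_mz - r_dz)

  print(falconer_h2(r_mz=0.90, r_dz=0.525))   # 0.75, i.e. 75% heritable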
#
"DNA of worms",105,0,0,0
(May '97)
DNA sequencing has become a powerful tool for taxonomists, the scientists who classify plants and animals. A recent study suggests that the moulting invertebrates have some family resemblances, and that the ability to moult arose once only in the history of life, descending into the different phyla as \Jevolution\j progressed.
On this basis, researchers argued this month for a grouping called the Ecdysozoa, the moulting animals, made up of arthropods, tardigrades, onychophorans, nematodes, nematomorphs, kinorhynchs and priapulids. Interestingly, \Jgenome\j research on the invertebrates has concentrated on the \Jnematode\j \ICaenorhabditis elegans\i and the fruit fly \IDrosophila melanogaster\i (an arthropod), which are generally thought to be very distant relations, but which both fit into the proposed new group.
\BDo worms get cancer, too?\b
More than a third of the human genes which have been linked to different diseases are echoed in the genomes of our most distant relatives. Even worms, yeast and \Jbacteria\j show similarities to ourselves. Scientists are now wondering if this means that they will be able to carry out research and tests on simpler animals to deal with these problem genes, or at least to unravel the biochemistry involved.
The researchers identified a number of genes linked to conditions like bowel cancer and \Jobesity\j, and then scanned for related sequences in databases of gene sequences for the yeast \ISaccharomyces cerevisiae\i, the \IEscherichia coli\i \Jbacteria\j, and the \ICaenorhabditis elegans\i \Jnematode\j worm. While they got hits in the 10-20% range for \Jbacteria\j and yeasts, the worm \Jdatabase\j produced a matching rate of 36%.
#
"Entire human chromosome in a mouse",106,0,0,0
(May '97)
A Japanese team announced that they had succeeded in transferring a whole human \Jchromosome\j, regulatory sequences and all, to a mouse. This is about fifty times more DNA than has ever been transferred to a mouse before-the usual has been just a few isolated genes, which often do not operate fully when they are inserted in isolation. The work was reported at the end of the month in the June issue of \INature \JGenetics\j\i.
#
"MS cure in mice",107,0,0,0
(May '97)
Researchers at Stanford University reported in the \IJournal of Experimental Medicine\i on a form of gene therapy which worked in treating mice with a condition which mimics multiple sclerosis (MS). While this is only a small step, it must offer some hope that there is a future for gene therapy for those with the human MS condition.
\BA model for Alzheimer's in mice\b
The last issue of \INature\i for the month revealed that scientists have developed a strain of mice which appear to model the early stages of Alzheimer's disease. The mice show neuron and memory loss without plaques, which means that they may be suitable test animals for drugs intended to slow the onset of the condition.
#
"Alzheimer remembered",108,0,0,0
(May '97)
Recent studies of 90-year-old \Jmicroscope\j slides have revealed the patient whose case probably got the name of German physician Alois Alzheimer into the medical textbooks. The original case described by Alzheimer, a woman identified as Auguste D, did not suffer from Alzheimer's disease as we now understand it: she had hardening of the arteries.
The second case, Johann F, is still with us in the form of ninety-year-old slides of brain tissue, and these reveal the tell-tale plaques of the disease. The slides and other records have been found in the basements of the University of Munich, where they have been "lost" for decades.
Even so, Johann F was not a typical case, since he lacked a gene, the Alzheimer's susceptibility \Jallele\j apo-epsilon-4, which is carried by two in every three Alzheimer's sufferers. The name of the disease-"Alzheimer'sche Krankheit" in German-was published in 1910, just before Johann F died, and Alzheimer wrote the name of the disease in the autopsy book. The full report appeared in \INeurogenetics\i this month.
#
"Breast implants dangerous?",109,0,0,0
(May '97)
One of the problems with the "scare campaigns" that start up from time to time is that some of them are just that-scare campaigns. And one of the problems with refutations of the scare campaigns is that they can sometimes be wrong.
For some time now, it has been a "well-known fact" that breast implants cause all sorts of problems, and statistics have been produced to demonstrate this, comparing the implantees with other women, and showing that they have a greater incidence of certain problems. Careful analysis has even shown that you could expect problems to arise.
But what if the women who elect to have implants are not typical of women as a whole? Would it be reasonable to compare them with other women? During May, the \IJournal of the American Medical Association (JAMA)\i reported that the women who have implants for cosmetic purposes are very different from their unimplanted sisters.
The study sampled 3570 women, including 80 women who had augmented their breasts for cosmetic reasons. The group of 80 were three times more likely to have seven or more alcoholic drinks in a week, nine times more likely to have had 14 or more sexual partners, and more than twice as likely to have used the \J\Jbirth control\j\j "Pill", or to have had an \Jabortion\j.
They were less likely to be overweight, and used hair dyes more, and these factors could all have a "confounding effect", which is scientist-speak for "could cause some or all of the observed differences".
Against that, a 1995 comparison of women who had reconstructive implants and women who had cosmetic implants showed very similar symptoms, suggesting that "life style" causes could be ruled out. As is so often the case in science, the jury is still out on this one.
#
"Prions--how they work",110,0,0,0
(May '97)
Cells are undoubtedly alive, since they reproduce on their own. Some people argue that viruses are alive, because they can reproduce, even if they only do it in cells, and they have nucleic acid to store a sort of "\Jchromosome\j". What are we to make of prions, the proteins which are believed to cause BSE (mad cow disease), scrapie, and Creutzfeldt-Jakob disease (CJD)? These proteins have no nucleic acids, and we have little idea of how they carry out their role-not until now, anyhow.
At the end of May, \ICell\i reported on work by a group led by yeast cell biologist Susan Lindquist of the University of \JChicago\j, who have been exploring the way in which prions operate. They found that there was a characteristic in yeast which could be passed to new generations of yeast, but which involved no changes in the yeast's DNA, a characteristic which causes the yeast cells to clump together.
The cause seems to be a protein, identified as Sup35. When this protein is kept in the test tube, it polymerises into long fibres like those seen in the prion diseases. If a small amount of the polymerised protein is added to a fresh sample of Sup35, the whole solution polymerises more quickly. This matches what has previously been suggested for the operation of prions, and so makes researchers more confident that they know what the mechanism is. The next step: to try the same experiment with the prion proteins.
#
"Cockroaches linked to asthma?",111,0,0,0
(May '97)
A report in \IThe New England Journal of Medicine\i during May suggests that cockroaches may be a significant cause of \Jasthma\j in disadvantaged households in the US. The study looked at the effects of three common allergens associated with \Jasthma\j, the faeces of dust mites, the faeces of cockroaches, and feline skin flakes, the equivalent of cat \Jdandruff\j.
The research revealed that children who were allergic to cockroaches, and who also had high levels of cockroach allergen in their bedrooms, were more than three times as likely to be admitted to \Jhospital\j for \Jasthma\j attacks as other children. Even after other factors were taken into account, the effect remained significant, but the researchers say that they have not been able to rule out the possibility that fungi, or rat or mouse droppings, may also be involved. In the meantime, it looks as though there is a good market out there for environmentally friendly roach baits.
#
"Chernobyl still a risk",112,0,0,0
(May '97)
It is now more than eleven years since the disastrous explosion at \JChernobyl\j, but the risk is not over yet. An international program costed at $US780 million will be set in operation over the next few months, aimed at reducing the risks of a second explosion.
After the first explosion, a plume of radioactive \Jisotopes\j poured out, and the fuel of the reactor melted and flowed into the rooms below the reactor chamber, where it cooled and solidified. A concrete shell, referred to as a sarcophagus, has been built over the chamber, but gaps have allowed \Jwater\j to leak in - as much as 3000 tonnes of it. This could act as a moderator, slowing neutrons and triggering a faster reaction, or even a full explosion, as a \J\Jchain reaction\j\j takes off.
In a normal reactor, controls are in place to stop any \J\Jchain reaction\j\j running wild, but the uncontained fuel is not in a position to be controlled. The risk of the mass of fuel "going critical" is thought to be low, but without the \Jwater\j, there would be no risk at all. There have been at least three occasions since 1990 when the neutron flux increased inside the sarcophagus, reminding us that a low risk is not the same as no risk.
The rescue program involves fitting new neutron detectors, draining the radioactive \Jwater\j, building a new sarcophagus, and creating robots that can move into the "hottest" parts of the sarcophagus, where humans cannot venture. These robots would be able to take fuel samples and measurements, allowing better planning to be carried out.
#
"Dinosaurs ancestors of birds?",113,0,0,0
(May '97)
Most palaeontologists say that the dinosaurs were the ancestors of the birds, and a few of them will even go so far as to claim that the dinosaurs are still alive today, only we call them birds. (This point of view is represented in the original "Jurassic Park" movie in the scene where the small dinosaurs are observed to be "flocking this way".)
The problem is that there are a few gaps in the record from "\Jdinosaur\j" to "bird", but a Patagonian \Jfossil\j, revealed during May, may help to close the largest of the gaps. A team from the Argentinean Museum of Natural Sciences, led by Fernando Novas, found a \Jfossil\j \Jdinosaur\j, about 1.5 metres tall, with a pelvis which was intermediate between the pelvis of \IDeinonychus\i (picture the \IVelociraptor\i of Jurassic Park) and that of \IArchaeopteryx\i, which had feathers 150 million years ago, and is usually recognised as the "first bird".
The \Jfossil\j theropod, named \IUnenlagia comahuensis\i, is only about 90 million years old - 60 million years younger than \IArchaeopteryx\i - so it is ruled out as a direct ancestor of the birds, but the \Jscapula\j (shoulder blade) of the beast suggests that it ran with a flapping motion, exactly the sort of motion you would expect to see if flight is evolving. The assumption is that some of \IUnenlagia\i's ancestors carried their flapping into the air, while others just continued to use it on the ground.
#
"Largest dinosaur unearthed!",114,0,0,0
(May '97)
Also revealed in May was the skull of the biggest meat-eating \Jdinosaur\j seen on \Jearth\j, dwarfing even \ITyrannosaurus rex\i, but with many of the same unfriendly features (if you are a smaller piece of meat, that is). Along with \IT. rex\i and \ICarcharodontosaurus\i, a giant meat-eater found by Paul Sereno in Morocco, the 100 million year-old \IGiganotosaurus carolinii\i of Rodolfo Coria is in good company, but it retains top position, with a skull 180 cm long.
Palaeontologists are now beginning to consider that there may be more riches to be uncovered in the sediments of \JSouth America\j: after that, it will be time to explore the other parts of \JGondwanaland\j, \JAntarctica\j and \JAustralia\j, to see what secrets they are harbouring.
\BChina in the wings\b
Then the hunt will need to move on to China, where a massive new deposit has been found in the Yixian formation of the Liaoning province of north-east China. Hundreds of early birds and dinosaurs are to be found there, and already a female \Jdinosaur\j called \ISinosauropteryx\i has been excavated which has a fossilised \Jmammal\j carcass in its gut and an egg in its oviduct. The site is on the edge of the Jurassic-Cretaceous boundary, and promises to reveal a great deal more over the next few years.
#
"Humans once giants?",115,0,0,0
(May '97)
As an example of the sort of problem which can only be solved by studying whole populations of fossils, a letter to \INature\i (May 8) estimates that the body mass of humans has dropped 13% over the past two million years, having climbed to a peak about 1.4 million years ago, and only dropping to modern levels about 100 000 years ago. The authors, C B Ruff, Erik Trinkaus and T W Holliday, have developed a new method of estimating body mass, and applied it to a sample of 163 Pleistocene \IHomo\i specimens.
#
"Brain surgery in Stone Age",116,0,0,0
(May '97)
A skull, more than seven thousand years old, has been excavated from a Stone Age burial site in \JAlsace\j. Trepanning, or removing sections of the skull bone, is a common African treatment for a variety of conditions ranging from headache to \Jepilepsy\j, but this skull suggests that the practice of trepanation was known and used in Stone Age Europe. There are two holes in the skull, the larger one being 9 cm in diameter.
#
"Modern humans in Europe, 800 000 y.a.",117,0,0,0
(May '97)
New human remains, believed to be 800 000 years old, have been found at Atapuerca in \JSpain\j. They appear to have had a face rather like that of modern humans, and are being treated by some as the ancestors of both the Neandertal people and \IHomo sapiens\i. But this claim, and the claim for a separate species, \IHomo antecessor\i, are both being questioned by many palaeontologists.
This is hardly surprising given that the palaeontologists are largely divided into "splitters" and "lumpers", those who hail new species at every opportunity, and those who would squeeze everything into a small box with just a single label.
The discoverers, palaeoanthropologists José Bermúdez de Castro and Antonio Rosas of the National Museum of Natural Sciences in Madrid and their colleagues, have uncovered 80 fossils from a boy and five other early humans. They clearly identify themselves as splitters, since their report distinguishes the new species from \IHomo heidelbergensis\i, but the claim, based as it is on the facial features of the boy, intermediate between modern humans and the Neandertal people, is going to need a lot of support before it finds general favor.
#
"Mapping fossil hot spots",118,0,0,0
(May '97)
Across the \JPacific ocean\j, chains of volcanic islands show us where the plates of the crust have dragged across stationary hot spots. As the crust passes over the "hot spot", where a plume of hot rock is pushing up from deep inside the \Jearth\j, the hot rock occasionally bursts through, forming a marker for a former position of the hot spot which remains as the crustal plate trundles on.
Many of the volcanoes exist only as seamounts - volcanic mountains eroded down until they sank beneath the waves, where they remain as comparative shallows, completely hidden under the sea. The recent release of declassified \Jsatellite\j data gave Paul Wessel and Loren Kroenke of the University of Hawaii just what they needed: information on 8800 Pacific seamounts.
Three accepted hot spots are near Hawaii, Rarotonga, and \JLouisville\j, but the aim of the study was to find more of these, including the hot spots which have since "burned out". Using simple \Jgeometry\j, the researchers were able to back-track the three established hot spots, but none of the other dozen or so expected hot spots stood out. This failure has led some researchers to question the plume theory, but it may also stem from something as simple as shallow hot spots which drift over time.
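For readers who want to see what that "simple geometry" looks like, here is a minimal sketch in Python. Motion of a rigid plate over a fixed hot spot is a rotation about an Euler pole, so back-tracking a seamount just means rotating its position the other way, through an angle set by its age. The pole position, rotation rate, and seamount used below are invented for illustration - they are not the real Pacific plate values.

    import numpy as np

    def to_xyz(lat, lon):
        """Convert latitude/longitude in degrees to a unit vector."""
        lat, lon = np.radians(lat), np.radians(lon)
        return np.array([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)])

    def rotate(v, pole, angle_deg):
        """Rotate v about the unit vector pole by angle_deg (Rodrigues' formula)."""
        a = np.radians(angle_deg)
        return (v * np.cos(a) + np.cross(pole, v) * np.sin(a)
                + pole * np.dot(pole, v) * (1 - np.cos(a)))

    pole = to_xyz(60.0, -90.0)        # assumed Euler pole for the plate
    rate = 0.8                        # assumed rotation, degrees per million years
    seamount = to_xyz(-20.0, -150.0)  # assumed present seamount position
    age = 40.0                        # assumed seamount age, millions of years

    # Rotating backwards by the seamount's age recovers where it formed,
    # which should coincide with the hot spot for every seamount in a chain.
    p = rotate(seamount, pole, -rate * age)
    lat = np.degrees(np.arcsin(p[2]))
    lon = np.degrees(np.arctan2(p[1], p[0]))
    print(f"Back-tracked position: lat {lat:.1f}, lon {lon:.1f}")

A chain of seamounts of different ages that all back-track to the same spot is the signature of a fixed plume - and that is the signature the study found for only the three established hot spots.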
Most interestingly, the \JLouisville\j hot spot is now likely to lie on the Hollister Ridge, south of the Eltanin fracture zone.
#
"Corn plant's defence",119,0,0,0
(May '97)
The \Jmetaphor\j of warfare is a popular one to describe \Jevolution\j, but corn plants seem to have elected to take the \Jmetaphor\j seriously. Beet armyworm caterpillars chewing on corn leaves dribble as they go, and the dribble contains a chemical which stimulates the leaf to release chemical vapors. Unfortunately for the caterpillar, the chemical vapor acts as a dinner call for parasitic wasps which fly in, laying eggs in the caterpillar, so the grub eventually dies.
Researchers have shown that mechanical chewing has no effect on the leaves. Without caterpillar spit, there is no reaction at all. But given the spit, which contains a chemical that researchers have dubbed "volicitin", the corn pumps out a mix of terpenoids and indole which calls in an air strike of wasps every time.
Now come the problems: working out why \Jevolution\j has left the caterpillars producing this chemical, which must be important to them, and how this can be used to help control crop pests. There is no future in spraying volicitin on crops, as this would simply confuse the wasps, and prevent them finding places to lay eggs, but somewhere in there, scientists think there are some useful tricks, just waiting to be found.
#
"Nereus, a close look",120,0,0,0
(May '97)
Japanese plans to explore an asteroid looked firmer this month, after NASA indicated that it would supply a robotic rover and ground support to the mission, due for launch in January 2002 on a 20-month mission to \JNereus\j.
In theory, the \Jasteroids\j are original rocks, unchanged since the start of the \J\Jsolar system\j\j - the sort of stuff that we see when meteorites fall to \Jearth\j. In this case, however, the samples will be free of any heating effects from friction, and so will reveal just how close the meteorites we pick up are to the rocks of space.
MUSES-C, the US$104 million \Jspacecraft\j, will drift alongside the asteroid for two months, landing three times to collect samples. The main problem will be collecting rock samples in conditions that are close to zero gravity. The plan is to fire a metal projectile at the asteroid, knocking off fragments of rock, some of which will be caught in a funnel and carried into the \Jspacecraft\j.
\JJapan\j's Institute of Space and Astronautical Science (ISAS) will be working on some exciting new technologies, especially an ion-thruster propulsion system which should reduce the weight of the fuel that rockets must carry by a significant amount.
#
"Hubble looks at M84",121,0,0,0
(May '97)
There was slightly bad news and very good news from Hubble this month. As we indicated in the April update (\IHubble trouble\i?), the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) is causing some worries. Two of the cameras are doing very well indeed, but the continuing loss of coolant means that the camera is likely to have only half its expected life, so schedules have been rejigged to give NICMOS up to 50% of the Hubble's time over the next 18 months. Still, not everything was gloomy.
"You can have any color you like," Henry Ford is supposed to have said of his Model T, "so long as it's black." Logically, a black hole emits no light, and can have no spectrum, but the gas which whirls around, close to, but beyond the black hole's horizon, does indeed have a spectrum. The Hubble Space \JTelescope\j's Space \JTelescope\j Imaging Spectrograph (STIS) has taken a close look at the centre of M84, a galaxy 50 million light-years away, where there is a black hole at the galaxy's centre. They have come up with a spectrum for the light coming from the gas and dust which orbits the black hole at 400 kilometres/second, 1.5 million kilometres or 900 thousand miles an hour.
Why does this matter? Well, given that speed, you can work out the mass of the black hole, which apparently weighs as much as 300 million copies of our \Jsun\j. As we explained in the February update (\IHubble gets a facelift\i), the STIS is able to sample multiple points simultaneously, so it can complete a survey of a black hole in just twenty minutes, and researchers expect to survey hundreds of black holes over the next few years. With a population of black holes to work on, physicists hope to gain a better understanding of the way the universe works.
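For the curious, the arithmetic is ordinary Newtonian dynamics: for gas in a circular orbit, the enclosed mass is M = v\U2\ur/G. A minimal sketch in Python, assuming an orbital radius of about 26 light-years for the measured gas - an illustrative figure chosen because it reproduces the quoted mass, not a value taken from the Hubble report:

    # Estimate a black hole's mass from the orbital speed of nearby gas.
    # For a circular orbit, M = v**2 * r / G; the orbiting mass cancels out.
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30       # mass of the sun, kg
    LIGHT_YEAR = 9.461e15  # metres

    v = 400e3              # measured gas speed: 400 km/s, in m/s
    r = 26 * LIGHT_YEAR    # assumed orbital radius of the measured gas

    mass = v ** 2 * r / G
    print(f"Black hole mass: {mass / M_SUN:.1e} solar masses")  # ~3.0e+08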
#
"Dark matter mystery gets worse",122,0,0,0
(May '97)
There are certain basic assumptions that physicists and astronomers take for granted. One of these is that orbiting bodies which travel incredibly fast can only do so if they are travelling around a very massive object. The mass of the orbiting body itself does not matter - it can be very light, very heavy, or anywhere between, and it will still move in exactly the same \Jorbit\j. A heavier orbiting body feels a greater pull from gravity, but also needs more force to keep pulling it into its \Jorbit\j, and the two effects cancel exactly.
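In symbols, the mass m of the orbiting body cancels out when the gravitational pull is equated with the centripetal force needed to hold a circular orbit:

\[ \frac{GMm}{r^2} = \frac{mv^2}{r} \quad\Longrightarrow\quad v = \sqrt{\frac{GM}{r}} \]

so the speed depends only on the central mass M and the orbital radius r, and a measured speed at a known radius immediately gives the mass inside the orbit.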
Still, if you have very fast orbiting bodies, you can at least tell that there has to be a lot of mass at the centre. This rule applies also to a galaxy where the stars are travelling swiftly around the orbital centre. But what lies at the centre? Some kind of "\J\Jdark matter\j\j" is the best answer we can give right now, making the identification of the "\J\Jdark matter\j\j" very interesting to astrophysicists.
Infrared observations of NGC5907, a spiral galaxy similar to our own, reported in \INature\i at the start of the month, indicate that the centre of that galaxy contains a large number of very small stars which are rich in heavy elements. The only snag is that our galaxy seems to have no such stars, and theory tells us that these stars, deep in the galactic halo, should be poor in heavy elements. Instead of helping, these observations seem only to have made matters worse. It's enough to make a physicist sick.
#
"Penguins at risk",123,0,0,0
(May '97)
The penguins must certainly be feeling sick. A resistant avian virus, infectious bursal disease, appears to have been spread to emperor penguins in the Antarctic. The disease, which is common in domestic chickens, causes haemorrhaging and breathing difficulties, and often kills its victims.
The cause of the spread of the disease is unknown, but it is likely to have been introduced through the careless disposal of the remains of chicken products. In tests, up to 65% of the penguin chicks tested around the Australian Antarctic base at Mawson showed \Jantibodies\j to the disease.
This month, Mark Sobsey told an American Society of Microbiology meeting that sewage pumped from tanks on airliners routinely contained infectious viruses which had survived the disinfectants used in the tanks. Should we be suspicious about airliners flying tourist trips to \JAntarctica\j, and discharging their tanks, or worse, their kitchen refuse? Not so, according to our spy in the airline industry, who says that all wastes are carried back to the airport where the plane lands.
#
"Mitochondria, the evolution",124,0,0,0
(May '97)
How did the mitochondria, the power houses of the cells, evolve? The most popular answer today is that they arose by \Iendosymbiosis\i. Rather than creating such things by themselves, large cells engulfed smaller cells with useful powers, and rather than digesting the smaller cells, they adopted them, setting up a mutually satisfactory arrangement, where the small cells had a safe home, but made their own special contribution to the operation of that home.
One piece of evidence supporting this theory is that the mitochondria have their own DNA, or at least their own scraps of DNA, but up until now, nobody has been sure what the original mitochondria were like, though the rickettsial group of alpha-proteobacteria fills the role of "usual suspect".
A freshwater protozoan, \IReclinomonas americana\i, has a huge helping of mitochondrial DNA, with 69 034 base pairs making up 97 genes, the largest collection of genes so far identified in any mtDNA that has been studied. There are even four genes specifying a multi-subunit, eubacterial-type RNA polymerase. Together, these discoveries (reported in \INature\i this month) suggest that this \Jmitochondrion\j is closer to the original free-living ancestor of the mitochondria than any other studied so far.
#
"Red algae rewrite family tree",125,0,0,0
(May '97)
And in a related story which broke at the very end of April, the red \Jalgae\j were promoted into a new branch of life, at least by some researchers who reported in the \IProceedings of the National Academy of Sciences\i about the family tree of the red \Jalgae\j.
John Stiller and Benjamin Hall of the University of Washington, Seattle have been studying the gene for RNA polymerase in chloroplasts. Like the mitochondria, the chloroplasts probably had an independent origin, and were taken up by other cells in endosymbiosis. The usual assumption is that this invasion, occupation or whatever happened once and once only, and that all chloroplast-bearing cells are descended from that event.
Looking at the gene for RNA polymerase, the researchers find evidence that the simple common-ancestor model has problems. It looks as though one of three scenarios was played out, although it is hard to say which one, since all offer some attractive explanations of other observations:
• an ancestor of the green \Jalgae\j engulfed a \Jchloroplast\j, and so did an ancestor of the red \Jalgae\j, or
• all organisms once had chloroplasts, and many of them subsequently lost them, or
• the ancestor of the green plants engulfed a red algal cell to gain the \Jchloroplast\j through \Isecondary endosymbiosis\i.
Changes to family trees, based on the analysis of single genes, tend to be changed back when a few more genes are examined, but in the meantime, quite a few people are looking at the red \Jalgae\j in a different light.
#
"Jupiter update (May 97)",126,0,0,0
We already knew that Ganymede has a magnetosphere and a magnetic field, and this raised hopes that Callisto might also have an internally generated magnetic field. New reports from the analysis of \JGalileo\j \Jspacecraft\j results tell us that this is not the case. The \Jelectron\j density around Callisto indicates that the moon is a generator of plasma, perhaps from a thin Callistan \Jatmosphere\j, rather like the wispy \Jhydrogen\j \Jatmosphere\j of Ganymede. \JGalileo\j flew just 1100 km from the surface of Callisto.
If (as seems likely, given the magnetometer results so far) Io also has a magnetic field, then it will have a large metallic core. Scientists are now fairly confident that Ganymede is made up of a metallic core, a silicate mantle, and a deep outer layer of ice, while they assume that Callisto is most likely to be a homogeneous object consisting of a mixture of 40% compressed ice and 60% rock, iron and iron sulphide. They speculate that perhaps Ganymede was melted in a period of resonant tidal heating by Jupiter, allowing the different materials to separate out into layers.
#
"New radio telescope",127,0,0,0
(May '97)
The world's biggest radio interferometer is the New Mexico Very Large Array (VLA), but astronomers are never satisfied with what they have. Astronomers in \JAustralia\j, Canada, China, India, the Netherlands and the United States are planning the Square \JKilometre\j Array Interferometer (SKAI). With 75 times the area of the VLA, this will cost an estimated $US150 million, and involve 34 elements.
Each element will have an area the size of a 200 metre dish, and the elements will be in a circle with a radius of 150 kilometres (100 miles). The resolution will be even better than that obtained by the Hubble Space \Jtelescope\j. While it is in the early planning stages, and a site has yet to be chosen, the planners hope to begin construction on SKAI by the year 2005.
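As a rough check on those figures, the collecting areas can be compared directly. A quick sketch in Python, using the VLA's well-known configuration of 27 dishes of 25-metre diameter (the SKAI numbers are the ones quoted above):

    import math

    def dish_area(diameter_m):
        """Collecting area of a circular dish, in square metres."""
        return math.pi * (diameter_m / 2) ** 2

    vla = 27 * dish_area(25.0)     # the VLA: 27 dishes, each 25 m across
    skai = 34 * dish_area(200.0)   # SKAI: 34 elements, each equal to a 200 m dish

    print(f"VLA  area: {vla:11,.0f} m^2")
    print(f"SKAI area: {skai:11,.0f} m^2 (~{skai / 1e6:.2f} square kilometres)")
    print(f"Ratio: about {skai / vla:.0f} to 1")

The total comes out at just over a square kilometre - hence the name - and the ratio lands near 80, in the same ballpark as the 75-fold figure quoted above (the exact value depends on how the element areas are counted).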
#
"Deep Blue beats Kasparov",128,0,0,0
(May '97)
As the shock news broke that Deep Blue had beaten Garry Kasparov, the stories ranged from references to a "Grandmaster machine" through to dismissive headlines along the lines of "giant calculating machine beats Kasparov".
Overall, the consensus was that the machine, while effective in producing winning moves in a chess game, was by no means a thinking machine, and did not play chess in any real sense. Kasparov won the first game, Deep Blue won the second, and the next three games of the six-game match were draws, making the final game a "sudden death" decider, where the winner would take all.
As it turned out, Kasparov lost the game, rather than Deep Blue winning, when he made a blunder on his eighth move. It is a well-documented error, and one that a player of Kasparov's standing would be expected to know. The most charitable interpretation is that he showed his humanity, something that even the strongest supporters of Deep Blue would not claim for the machine.
On the other hand, Deep Blue's play in the other games showed what experts called errors, and yet the machine was able to hang on to an ultimate draw. In part, this may have been a result of Kasparov changing his game in an attempt to exploit what he thought were the machine's weaknesses-it is this which lay behind Kasparov's comment that if he had been playing a human of the same strength, he would have won. In chess at high levels, \Jpsychology\j is all-important, but it looks as though, in this case, Kasparov was largely responsible for his being "psyched out".
#
"ALH84001 less likely to contain signs of life",129,0,0,0
(May '97)
A report in \INature\i this month indicates that the "Martian life \Jmeteorite\j", ALH84001, contains \Jcarbonates\j which formed at high temperatures. The \Jmeteorite\j is undoubtedly of "shocked" origin (that is, it was struck a very hard blow at some point), and it is very probably of Martian origin. What was less certain is whether or not the \Jcarbonates\j in the rock are associated with globules formed by living things. If the \Jcarbonates\j formed at low temperatures, they may well have formed around life forms of some sort.
New petrological studies show that the \Jcarbonates\j, plagioclase and \Jsilica\j were all melted and partly redistributed in the same shock event that crushed the pyroxene in the \Jmeteorite\j. It now looks as though the \Jcarbonates\j crystallised from shock-melted material, in which case, there is little chance that the globules in the rock are anything more than artifacts. It looks as though we will have to look again, before we find Martian fossils.
#
"51 Peg's planet in doubt again",130,0,0,0
(May '97)
The \Jplanet\j of Pegasus 51 (alias 51 Pegasi to the astronomers) was reported in December 1996, and came under attack in February 1997, when the \Jplanet\j signs were dismissed as no more than a sloshing on the star's surface. Now astronomers at the \JCalifornia\j Institute of Technology (Caltech) and NASA's Jet Propulsion Laboratory suggest that the "\Jplanet\j" may in fact be another star - that 51 Peg is, in fact, a binary star.
#
"Leptoquarks less likely",131,0,0,0
(May '97)
Although the discovery of the leptoquark was reported in late February (see February update: \BA new particle and a fifth force?\b), searches since then have found no confirmation for the claim. While this does not rule out the existence of the leptoquarks, it makes them less likely-at least in the simple form described in the claims that were coming out of \JHamburg\j.
#
"Breast cancer 'gene' not such a sure sign",132,0,0,0
(May '97)
Two reports published in the \INew England Journal of Medicine\i during May suggest that the so-called breast cancer genes, BRCA1 and BRCA2, may not be such sure signs of a cancer risk as most people think. (See \BBreast cancer genes,\b April update)
The BRCA mutations are rare, and sampling for the genes has concentrated on families with severe rates of breast and other cancers which break out in young members of the families. In cautious language, the researchers reported that this may well have biased the sample, and any radical action taken with women who carry the gene may be unwise.
More importantly, only 16% of women with a family history of breast cancer proved to have a BRCA1 \Jmutation\j, much lower than the 45% found in previous studies. There is a real risk, say researchers, that if general screening is carried out, women may assume that the absence of the gene means they are not at risk, when a significant risk still remains.
#
"Fermat's last case?",133,0,0,0
(Jun '97)
Pierre de Fermat died 332 years ago, leaving behind him what we now call Fermat's last \Jtheorem\j. When he died, it was just one of the many theorems he left behind, but the others were fairly easy to prove, and it soon became the "last". Andrew Wiles was thought to have proved the last \Jtheorem\j in 1993, but there were flaws in his proof which were not repaired until 1995. Because so many flawed "proofs" have come up over the years, the Wolfskehl Prize committee waited two years before making its award. So it was not until late in June that Professor Wiles finally received the cash prize of a little under $US 100 000, a prize which at one stage was worth almost fifteen times as much.
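For reference, the statement of the theorem itself is simple enough for a schoolchild to read, which is much of its charm. In modern notation:

\[ x^n + y^n = z^n \ \text{has no solutions in positive integers } x, y, z \text{ for any integer } n > 2. \]

Fermat famously scribbled in a margin that he had a marvellous proof which the margin was too small to contain; Wiles's proof runs to well over a hundred pages of modern number theory.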
#
"Blue Planet Prize",134,0,0,0
(Jun '97)
British scientist James Lovelock of Coombe Mill, for many years an independent researcher, and best known as the inventor of the Gaia hypothesis, won this year's $430 000 Blue \JPlanet\j Prize, which is awarded by the Asahi Glass Foundation of \JTokyo\j to people who have helped resolve global environmental problems.
In 1957, Lovelock invented the highly sensitive \Jelectron\j capture detector, which made possible the measurement of ozone-destroying \Jchlorofluorocarbons\j in the \Jatmosphere\j and pesticides in foods, but the award to Lovelock, who is now 77, is rather more for his offbeat Gaia hypothesis. This suggests that the \JEarth\j controls its climate and chemistry for the benefit of life, acting as some sort of super-organism. While most scientists find the Gaia theory hard to swallow, it has helped foster their recognition of the role of living systems in influencing climate, and stimulated a great deal of research.
#
"Crafoord Prize",135,0,0,0
(Jun '97)
Another independently-minded Englishman, Sir Fred Hoyle, has just shared this year's Crafoord Prize with Edwin Salpeter of Cornell University, for work the two have done in the area of \Jcosmology\j. This prize is controlled by the Royal Swedish Academy of Sciences, and it is used to cover areas missed by the \JNobel Prizes\j. Each will receive $US 250 000.
In the 1950s, the two astronomers independently predicted the \Jenergy\j level that must exist in the nucleus of carbon atoms if carbon was to be synthesised in stars. This was one of the first examples of the use of an astrophysical argument to predict a fundamental property of matter. Hoyle will always be remembered as the man who coined the phrase "\J\Jbig bang\j\j", although, as a follower of the competing "steady state" model of the universe, he intended it to be a disparagement of his opponents' ideas. Hoyle has also written \J\Jscience fiction\j\j in the past.
#
"Mir's problems",136,0,0,0
(Jun '97)
The former \JSoviet Union\j's 11-year-old spaceship "Mir" got into serious trouble in late June. A "Progress" cargo carrier, remotely guided by a Mir \Jcosmonaut\j, ran into the Spektr science module of Mir, opening a hole which was estimated from air loss rates to be about the size of a postage stamp. The main problem was that the hole could not be seen, and may be either a puncture or a gash. In either case, the hole needs to be found and patched, if it can be.
So far, nobody has been hurt, but the US \Jastronaut\j, Mike Foale, had a narrow escape from the Spektr module as its air escaped into space after the collision. Early July will see two cosmonauts, Vasili Tsibliev and Aleksandr Lazutkin, going outside the station in pressure suits to try to fix the \J\Jsolar power\j\j supply which normally provides 55% of the station's power, as one of the four solar panels was damaged in the collision. They may have difficulty getting through the 80 cm hatch on the Spektr module in their space suits, which were designed for a 100 cm opening.
This is not the first glitch Mir has encountered this year. A faulty oxygen generator, a major fire and \Jtemperature\j control problems have all plagued the craft, along with a near miss from another supply craft last March, and now the whole future of Mir is up for grabs. The problem is that any decision to shut down Mir will probably lead to delays in the whole Russian space program, including the International Space Station - the first module of which is due for launch, all going well, in just a year from now.
#
"Red Centaur asteroids spark interest",137,0,0,0
(Jun '97)
A group of \JCentaur\j \Jasteroids\j can be found in the outer \J\Jsolar system\j\j, well beyond the \Jorbit\j of Jupiter. Two of these were described at the American Astronomical Society conference during June, because they are of particular interest. The Centaurs in question, 5145 Pholus and 1995 GO, are redder than any other \Jasteroids\j in the entire \J\Jsolar system\j\j.
According to David Weintraub, who led the study of these \Jasteroids\j, this red color is likely to be due to a covering of raw organic matter and minerals, similar to the stuff from which life in the \J\Jsolar system\j\j (wherever it may be found) was assembled. Most \Jasteroids\j have a bluish color, but the orbiting bodies in the Kuiper Belt, out beyond Pluto, are likely to still carry the original material, because there are fewer collisions out there, and less heating to abrade or remove these materials.
Weintraub believes that some of the Centaurs have strayed in towards the orbits of Neptune, Uranus and Saturn, and that the occasional \JCentaur\j can then be slung out into interstellar space, or in towards the centre of the \J\Jsolar system\j\j. The red color, he suggests, comes from compounds rich in carbon and \Jnitrogen\j, pointing to the need for more study.
#
"Asteroid in earth's Lagrangian points",138,0,0,0
(Jun '97)
A new companion of the \Jearth\j has just been unmasked. Asteroid 3753 (1986 TO), discovered in 1986, seems to be trapped in a complex \Jorbit\j that involves the \Jearth\j. The \Jorbit\j swings out towards Mars, and in towards Mercury, but the centre of the \Jorbit\j traces out a horseshoe pattern around the \Jearth\j, with a full orbital cycle of 385 years.
There are many near-\Jearth\j \Jasteroids\j, ranging in size from a few metres to more than 30 km across, many of them on orbits crossing that of the \JEarth\j, but none has previously been identified as having an \Jorbit\j associated with our own. 3753 is just 5 km across, and it moves in a "horseshoe \Jorbit\j", a familiar feature of the gravitational three-body problem, but only known otherwise from two satellites of Saturn, Janus and Epimetheus, both of which have orbits less intricate than that of 3753. The asteroid's \Jorbit\j is an "overlapping horseshoe", dynamically locked to the \JEarth\j.
Apart from the Moon, 3753 is the only known natural companion of the \JEarth\j. While theorists had long recognised the possibility that our \Jplanet\j might have a small partner conducting a complex orbital dance like this, it remained hidden until three astronomers, Paul Wiegert and Kimmo Innanen of York University in \JOntario\j and Seppo Mikkola at the University of Turku in \JFinland\j analysed the path of a small asteroid.
As far back as 1772, Joseph Louis Lagrange demonstrated the existence of what we now call Lagrangian points, positions in space where a third object, like an asteroid, can sit comfortably in a stable \Jorbit\j. While astronomers have found swarms of \Jasteroids\j in Jupiter's Lagrangian points, \Jasteroids\j known as the Trojans, no such companions for \Jearth\j had been found. 3753 follows a circuit which takes it around both of \JEarth\j's Lagrangian points, travelling first around one, then going the long way around the \JEarth\j's \Jorbit\j to the other, circling it, and going back again, to complete the cycle in 385 years.
That is only the asteroid's average path, though: it swoops off to cross the path of Venus at times, before swinging back across our path from time to time. But there is no need to worry: we are unlikely to collide soon, even on the next occasion that the orbits are predicted to intersect, in 2750 years.
#
"Lost friendly planet?",139,0,0,0
(Jun '97)
Nobody has ever really been able to explain how Bode's Law, more correctly called the Titius-Bode Law, works. It successfully predicted the location of Uranus, but has nothing to say about the locations of Neptune and Pluto, besides predicting a \Jplanet\j more or less where the asteroid belt is today. In fact, the asteroid belt was once thought to be made of Bode's missing \Jplanet\j, reduced to a rockpile by some nasty catastrophe.
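The law itself is simple to state: in its usual form, the predicted orbital radius in astronomical units is a = 0.4 + 0.3 × 2\Un\u, with Mercury taken as the special case a = 0.4. A few lines of Python show both the law's successes and the failure beyond Uranus described above (the "actual" distances are the familiar textbook values):

    # Titius-Bode Law: a = 0.4 + 0.3 * 2**n astronomical units,
    # with Mercury as the special case where the 2**n term is dropped.
    actual = [("Mercury", 0.39), ("Venus", 0.72), ("Earth", 1.00),
              ("Mars", 1.52), ("Ceres (asteroids)", 2.77), ("Jupiter", 5.20),
              ("Saturn", 9.54), ("Uranus", 19.2), ("Neptune", 30.1)]

    for n, (name, a_real) in enumerate(actual, start=-1):
        a_bode = 0.4 if n == -1 else 0.4 + 0.3 * 2 ** n
        print(f"{name:<18} Bode: {a_bode:6.2f} AU   actual: {a_real:6.2f} AU")

The predictions track the real orbits remarkably well out to Uranus (19.6 AU predicted against 19.2 actual), then collapse: the formula's next slot, 38.8 AU, is nowhere near Neptune's 30.1 AU, though it is oddly close to Pluto's average distance.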
Working from a different direction, Dimitris Christodoulou, also addressing the American Astronomical Society, has come up with a similar answer to Bode's. Christodoulou was modelling the way in which gas and dust would settle out around the young \Jsun\j, using a Bessel function, a complex equation. Because he did not know where the peaks of the function would arise, he made the reasonable assumption that the third \Jplanet\j lies at the third peak, and looked to see what else would happen.
Mercury, Venus and \JEarth\j neatly fill the first three slots, with the fourth peak empty, Mars close to the fifth, the Hungaria \Jasteroids\j at the sixth, the main asteroid belt across the seventh to tenth peaks, while the outer planets fit neatly into slots, although leaving some free. But what about that empty fourth slot, asks Christodoulou? Does it indicate the \JPlanet\j that Never Was, or does it indicate a \Jplanet\j that formed and went away again? Other astronomers are sceptical for the moment, pointing to the other empty peaks among the outer planets, so for the moment, the Christodoulou model remains in the realm of curious but interesting.
#
"Hubble sees double vision",140,0,0,0
(Jun '97)
At the same meeting of the AAS, researchers announced what they think is the aftermath of a collision between two supernovas. It began with astronomers looking at an object 17 million light-years away in the \Jconstellation\j \JCepheus\j, which turned out to be too bright to be the young \Jsupernova\j they had first suspected. More importantly, a dust \Jcloud\j rushing away from the object implied an age of several thousand years. The Hubble space \Jtelescope\j has now been used to resolve the issue, and it has found not one, but two objects, about 40 light years apart. The shock waves of expanding gas from each star had collided, compressing the shells and providing a stellar light show second to none.
#
"Boomerang Nebula--coldest spot?",141,0,0,0
(Jun '97)
Space is cold at the best of times, but some bits are colder than others. Now astronomers think they have found the universe's coldest spot in the Boomerang Nebula, a \Jcloud\j of gas and dust that is being ejected by an old star before its core collapses into a white dwarf. By studying the signature of carbon monoxide in the gas \Jcloud\j, they find that the expanding and cooling \Jcloud\j is absorbing heat from the background radiation, making it cooler than 3 K. The discovery brings a problem with it: if very old stars are very cold, this may mean that they will be very much harder to detect.
#
"Robots on Mars",142,0,0,0
(Jun '97)
With the landing on Mars the main news as this report is assembled (more on that next month), plans are already under way for bigger and better expeditions. At the high end, NASA is dreaming about people on Mars by 2010, while at the low end, a new robot was let loose on the high Chilean \JAndes\j during June, controlled from a mere 8000 kilometres away, rather easier than the 20-minute delay with a Mars robot.
Scientists at Carnegie Mellon University were behind the project, working from the Carnegie Science Center in \JPittsburgh\j as they ran Nomad. Its predecessors include Dante II, which explored a \Jvolcano\j in \JAlaska\j 2 years ago, after a disastrous earlier attempt in \JAntarctica\j. Nomad is expected to cover 5 kilometres of the Atacama desert a \Jday\j, and to simulate several sorts of search, as well as providing 360° images, stereo color photographs and information from assorted sensors and metal detectors.
#
"Cyberlaw code of conduct",143,0,0,0
(Jun '97)
So far as the USA is concerned, if you want to use secure encryption systems and you are a foreigner, forget it. If you are an American citizen, you may not transmit software which carries out encryption, and even the algorithms for encryption are prohibited exports, being classed as munitions. Why? Well, if Uncle Sam gets into a war with you, Uncle Sam wants to know what you are thinking and saying. And Uncle Sam has decided against "giving guns to the Indians", a lesson they learned in the Wild West, and now apply to the world. So even the most unlikely future enemies are denied access to really secure encryption tools, no matter what their need.
Aside from military considerations, the US government is also determined to stop drug dealers and other criminals getting access to encryption tools, so only encryption systems using a 40-bit key could previously be exported, although this has recently been raised to allow 56-bit key systems. A 40-bit system means that anybody wanting to crack a coded message had to be able to find a number which lies somewhere between 0 and 2\U40\u, or about 1.1 trillion. In a 56-bit system, the number can now be as large as 2\U56\u, or about 72 quadrillion - 72 followed by fifteen zeroes.
Unfortunately, one of these 56-bit systems has already been cracked, so now the US government has offered to allow longer keys, but only under conditions which critics say could leave the whole system open to abuse, since your message must contain its own key, encrypted according to a US government standard that the government can read. This "key recovery" system would create a single point of failure which hackers would fall upon with glee.
The comparative weakness of the 56-bit Data Encryption Standard (DES) was highlighted in January this year, when the secure software maker RSA Data Security set up a $10 000 challenge, with a message posted on the \JInternet\j in encrypted form. Decoded, it reads "Strong cryptography makes the world a safer place". They were less than embarrassed when the challenge, expected to last for years, was met within four months. RSA president Jim Bidzos, who had offered the $10 000 bounty to the successful code-cracker, said "We've been saying for a long time that DES is no longer secure and here is the proof."
The code was broken by Rocke Verser, who recruited up to 14 thousand \JInternet\j users to use their spare computing time to crunch through up to 7 billion keys a second. In the end, Verser was lucky, finding the key after just 17 quadrillion keys had been tried. Slowly, ever so slowly, the barriers are coming down: during June, Pretty Good Privacy Inc. obtained US government approval to export 128-bit encryption technology to foreign subsidiaries and branches of large US companies. This encryption software is free of key recovery features, but it is still being kept away from any "Indians".
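To put those numbers in concrete terms, here is a small Python sketch. The 7 billion keys a second is the peak rate quoted above; everything else follows from the key lengths:

    # Key-space sizes and average brute-force times at a fixed search rate.
    rate = 7e9  # keys per second, the peak rate quoted for the DES challenge

    for bits in (40, 56, 128):
        keys = 2 ** bits
        # On average, a brute-force search tests half the key space.
        years = (keys / 2) / rate / (3600 * 24 * 365)
        print(f"{bits}-bit key: {keys:9.3g} keys, ~{years:9.3g} years on average")

    # The winning DES key turned up after about 17 quadrillion tries -
    # roughly a quarter of the 2**56 key space, which is why Verser was "lucky".
    print(f"Fraction of 56-bit space searched: {17e15 / 2 ** 56:.0%}")

At that rate a 40-bit key falls in about a minute on average, a 56-bit key in about two months, and a 128-bit key in around 10\U21\u years - which is the whole argument for longer keys in a nutshell.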
\BCDA struck down\b
The ill-fated attempt by the US Congress to control what it does not own died quietly during June. The Communications Decency Act was ruled invalid in an unsurprising decision by the US Supreme Court, which found that the law was unconstitutional because it impinged on freedom of speech.
The Congressional supporters of the original bill threaten to come back with a second attempt. According to civil liberties lawyers, the first bill was flawed by the proponents' total misunderstanding of what the \JInternet\j was and is, but they are quietly confident that no bill can be put up which remains constitutional while muzzling the \JInternet\j. At present, the CDA supporters' ploys seem to involve a narrower definition of decency and a mandatory rating system for all Net materials, conveniently forgetting that the US Congress has no power beyond the shores of the United States.
The court's ruling was summed up by Justice John Paul Stevens: "It is true that we have repeatedly recognised the governmental interest in protecting children from harmful materials. But that interest does not justify an unnecessarily broad suppression of speech addressed to adults. The government may not reduce the adult population ... to ... only what is fit for children."
Ahead of the CDA ruling, the \JWhite House\j positioned itself in readiness for a defeat, basically taking the position that regulation of the Net should be left to industry. Commented David Sobel of the Electronic Privacy Information Center: "To come in right after the Supreme Court decides the issue and say we didn't really mean what we said up to now-I can't imagine anything that would be seen as more of a waffle than that. It raises waffling to an art form."
European sources greeted the ruling with relief, suggesting that it would now be possible to get some cooperation under way to regulate \JInternet\j content effectively, and that the Supreme Court's ruling may spur other nations to begin developing their own laws, rather than rely on the CDA to do it all for them.
At almost the same time, state laws in New York and Georgia which put limits on free speech over the \JInternet\j were overturned. The New York legislation was struck out because it sought to regulate transactions outside the state's borders, thus violating the Constitution's interstate commerce clause. In the Georgia case, an injunction was granted against a law which made it illegal to use a name that "falsely identifies" the sender of an electronic message, such as a pseudonym or an anonymous e-mail address.
#
"French laws and the internet",144,0,0,0
(Jun '97)
In \JFrance\j, opinion polls must be kept secret for a week before a parliamentary vote, but before the June elections, poll data were posted anonymously on a number of Web sites, where all French Web users could access them. As \ILe Monde\i (a slightly left-of-centre newspaper) commented: "From here on, it is the globalisation of communications that renders the law obsolete."
The French legal system also lost out against the Georgia Institute of Technology, an American university accused of breaking \JFrance\j's law forbidding the sale of goods and services in a single language unless that language is French. The university has an English-only Web site about its operations at a \Jsatellite\j centre in the town of Metz, which will be allowed to continue. The decision, however, was on narrow legal grounds, and does not really settle whether French law can control the \JInternet\j. Two French language groups which started the case are looking at appealing the decision.
In better news for Francophones, a Quebec computer store, Micro-Bytes Logiciels, has taken most of its English-language site from the Web after receiving notice from the \IOffice de la Langue Française\i that the company is in violation of the French Language Charter, which says that all catalogues, brochures, leaflets, commercial directories and other similar publications must be in French. The following week, Quebec's Culture and Communications Minister, Louise Beaudoin, said that if the matter were left up to the federal Heritage Minister, there would be no French-language content on the \JInternet\j.
We use Edupage as a source of cyberlaw news. To subscribe to Edupage, send mail to listproc@educom.unc.edu with the message: subscribe edupage Louise Beaudoin (if your name is Louise Beaudoin, otherwise, substitute your own name).
#
"Cybernews bits",145,0,0,0
(Jun '97)
\BSpam free?\b
FTC commissioner Christine Varney says the Federal Trade Commission will increase its efforts under existing \Jfraud\j laws to punish e-mail spammers, saying that much of the spam mail is of a fraudulent nature. One available source reports that unsolicited \JInternet\j messages account for 5 to 30% of the 15 million messages received by America Online subscribers every \Jday\j.
\BCheaper chips\b
Intel has found a way to shrink its Pentium and Pentium MMX chips by about 10%, which means Intel can fit more chips on a single silicon wafer, cutting manufacturing costs. In July, the price of a 200 MHz Pentium MMX processor should drop from $US492 to about $US240.
#
"Urine test of benefit",146,0,0,0
(Jun '97)
Most people who are required to undergo a urine test do so unwillingly, but a new test offers the chance to do patients some good, as it will identify damage caused by free radicals after bypass surgery or heart attack therapy. The test shows promise as a way of developing better antioxidant vitamin therapies against free radicals as well, pointing to the best dosage of \Jantioxidants\j that could prevent free-radical damage in cardiac patients.
After bypass surgery and other cardiac therapies, the improved blood flow to the heart carries in extra oxygen and causes the \Joxidation\j of, and damage to, heart tissue. This leaves higher blood levels of oxidised compounds, but the very act of drawing blood samples can oxidise chemicals, leading to false results. The new urine test identifies the presence and levels of a chemical formed after free radicals such as \Jperoxide\j and superoxide attack the common lipid arachidonic acid in the blood.
#
"Growing blind",147,0,0,0
(Jun '97)
Premature babies and people with diabetes often suffer from a form of \Jblindness\j which is caused by the abnormal growth of new blood vessels in the \Jretina\j of the eye. This seems to be triggered by oxygen deprivation. The most common treatment is \Jlaser\j surgery, but prevention would obviously be better, and it now seems that growth hormone (GH) is part of the problem, as is \Jinsulin\j-like growth factor-1 (IGF-I).
In the past, removal of the pituitary \Jgland\j was used to restore vision in some blind diabetics, but this rather drastic step - the removal of a major source of chemicals needed for \Jmetabolism\j, of which growth hormone is only one of many products - makes people tired, intolerant of stress, and susceptible to infections. On the other hand, \Jlaser\j surgery can damage or destroy the \Jretina\j, as well as reducing peripheral vision.
A team led by Lois Smith, a paediatric ophthalmologist, starved mice of oxygen, inducing blood vessel growth near their retinas. The team compared normal mice with mice genetically engineered to inhibit the effects of growth hormone: the transgenic mice showed 34% less growth of the new blood vessels which lead to \Jblindness\j. Normal mice treated with MK678, a growth-hormone-suppressing drug, showed a drop of up to 44% in blood vessel growth. Treating the mice with IGF-I wiped out the protection afforded by the drug, pointing squarely at this hormone. So in the future, rather than ripping out the whole hormone factory, doctors may be able to suppress a handful of \Jhormones\j, and achieve the same cures.
#
"Marijuana--just as bad",148,0,0,0
(Jun '97)
Sensationalists tell us that all drugs are bad, while people on the fringes of the drug scene tend to distinguish between "hard drugs" and the supposedly less harmful "soft drugs" like marijuana. New evidence this month suggests that the sensationalists may have been right all along about the dangers of marijuana. Two separate reports published in \IScience\i this month show disturbing similarities between marijuana's effects on the brain and those produced by highly addictive drugs such as \Jcocaine\j and heroin. Marijuana withdrawal turns on the same stress system in the brain triggered by withdrawal of opiates and alcohol, and a second study points to marijuana activating the same reward pathway as heroin.
The cannabinoid THC (tetrahydrocannabinol) was injected into rats once a \Jday\j over a two-week period, to mimic heavy marijuana use. At the end of that time, the researchers injected the rats with a drug which counteracts THC, producing classic withdrawal symptoms. The rats were found to have two to three times as much of a substance called corticotropin-releasing factor (CRF), which is related to the anxiety and stress felt by people withdrawing from alcohol, \Jcocaine\j, and opiates.
Meanwhile, rats in \JItaly\j were being treated with THC, and checked for increases in \Jdopamine\j levels in a brain region called the nucleus accumbens. Surges in \Jdopamine\j level are found in heroin users, and this brain reaction is usually regarded as the system which reinforces, or encourages, continued drug use.
#
"Neutrinos--do they have a mass?",149,0,0,0
(Jun '97)
At a specialist neutrino conference in the Italian resort of Capri, three groups of physicists reported evidence that neutrinos can "swap their identities". This phenomenon would require that neutrinos have mass, while physicists' current picture of particles and forces assumes that they are massless. So if the reports stand up, they could revolutionise the way we see the universe.
Massive neutrinos would be prime candidates for a role as the universe's missing \J\Jdark matter\j\j, the mass that astronomers believe must be out there but have not been able to pin down, and which is needed to explain simple things like the speed at which our galaxy rotates without flying apart.
If massive neutrinos can "swap", this could explain why the number of neutrinos coming from our \Jsun\j, the solar neutrinos, appears smaller than we should expect from the \Jsun\j's nuclear reactions as we understand them. If they are changing between the three types or "flavors" - \Jelectron\j, tau, and muon neutrinos - some of the solar neutrinos might have eluded detectors in the past.
The evidence comes firstly from neutrinos created by cosmic rays colliding with the \Jatmosphere\j. \JJapan\j's underground Super-Kamiokande detector can detect both atmospheric neutrinos and solar neutrinos, and this detector is seeing fewer muon neutrinos relative to \Jelectron\j neutrinos than expected from the cosmic ray collisions, possibly because muon neutrinos are transforming into another "flavor".
At the Soudan2 experiment, in the Soudan iron mine in northern \JMinnesota\j, researchers have noted a similar apparent skewing of atmospheric neutrinos, as has the Liquid Scintillator Neutrino Detector at Los Alamos in New Mexico, which detects neutrinos produced by an accelerator. It is early days yet, but there is a tantalising pattern here, and if the researchers are correct, they have fingered a point where either there are vital data missing, or there is a serious misunderstanding. In either case, the way forward is clear.
#
"New genomic data",150,0,0,0
(Jun '97)
The partners in a major genomics combination have gone their separate ways, say William Haseltine and J. Craig Venter, hailed as the "Gene Kings" in 1995. Haseltine's medical product development firm, Human \JGenome\j Sciences Inc. (HGS) in Rockville, Maryland, and Venter's non-profit The Institute for Genomic Research (TIGR), also in Rockville, ended business relations on 20 June. TIGR will give up more than $US38 million which it was due to receive from HGS over the next 5 years, and HGS will release TIGR from patent requirements and publishing delays on future research, but HGS retains rights to TIGR's earlier work.
The main result has been a flood of data. TIGR released 40 million base pairs of DNA sequence data from 11 organisms, including portions of the genomes of the microbes responsible for tuberculosis, \Jcholera\j, and \Jsyphilis\j, and \Jchromosome\j 2 of the \Jmalaria\j parasite, within a few days of the break-up.
The problem seems to be that Venter was more interested in sequencing organisms of little medical importance, and in human \Jgenome\j sequencing, also of limited value for a company like HGS which is interested in genes as drug targets, while TIGR found itself unable to release data until all commercial possibilities had been explored. Tensions like this are probably always going to plague every situation where the scientific culture says "publish" and the commercial culture says "patent" or "hide".
#
"HUGO News",151,0,0,0
(Jun '97)
In May, gene mappers at their annual meeting at Cold Spring Harbor Laboratory in New York reported that they are now at the stage of sequencing pieces of the human \Jgenome\j, rather than just mapping the genes. Taken together, work now totals some 52.4 million bases of the 3 billion-base \Jgenome\j. The researchers predict that by May 1998, they will have hit around the 200 million mark - about 7% of the full sequence - keeping them on track for the Human \JGenome\j Project's scheduled completion in 2005.
#
"Gene and Parkinson's disease link",152,0,0,0
(Jun '97)
Scientists have pinpointed the gene which, when defective, causes a hereditary form of Parkinson's disease. A team led by Mihael Polymeropoulos of the National Human \JGenome\j Research Institute in Bethesda, Maryland, studied an Italian family in which the disease develops at an early age, and shows a strong genetic \Jlinkage\j to a region on \Jchromosome\j 4. They identified chromosomal markers which seemed to be inherited with the disease, and pinned the gene down to a section of the \Jchromosome\j some 6 million base pairs long. Within this region, the alpha-synuclein gene stood out as a suspect.
When they sequenced this gene in the Italian family, they found the affected members had a \Jmutation\j not present in unaffected members, and they also found the same \Jmutation\j in three of five affected Greek families, but not in any of 300 controls drawn from \JFrance\j and \JItaly\j, or in 52 Italian patients with sporadic Parkinson's disease. It probably only accounts for a few percent of all the cases of the disease, but by unravelling the way in which the mutated protein causes its damage, researchers will gain a hint as to what kills off crucial neurons in the much larger number of patients with non-hereditary Parkinson's disease.
Early in June, \INature Medicine\i reported on a study which suggested that electrical stimulation of the brain can reduce the symptoms of Parkinsonism. Electrodes were inserted in the portion of the brain, the globus pallidus, which usually acts to slow down the rest of the brain in affected patients. Sending rapidly oscillating electrical pulses to the globus pallidus triggered increased blood flow to the premotor cortical areas, brain regions that are responsible for planning and initiating movement, and at the same time, the patients' movements became faster and less jerky. But while the effect is established, nobody yet knows what causes it.
#
"Down's syndrome and trisomy",153,0,0,0
(Jun '97)
The condition once known as "mongolism", and more commonly called Down's syndrome (or Down syndrome in some parts of the world), arises from a trisomy, the presence of a third copy of a \Jchromosome\j. In Down's syndrome, this is always \Jchromosome\j 21, but nobody knows how the extra copy causes the condition.
Many cases of trisomy lead to early and spontaneous \Jabortion\j, but this trisomy is able to affect some normal functions while leaving others intact, and allowing the person to survive to adulthood. It now appears that a section of the \Jchromosome\j, a 20-million-base-pair stretch on the long arm of \Jchromosome\j 21 called 21q11-21, has a low percentage of cytidine and guanosine nucleotides. This suggests low gene content, in a sea of "junk DNA", which makes the area interesting.
Where only five genes had been found in the area before, Fa-Ten Kao, Jinjwei Yu, and their colleagues at the Eleanor Roosevelt Institute for Cancer Research in \JDenver\j have managed to locate 18 potential genes, at least nine of which had been "switched on".
One of the basic principles of development is that genes are normally switched off, and only switched on when they need to act in some way. Under normal circumstances, some 90% of the DNA in this odd region is methylated, a \J\Jchemical reaction\j\j in which DNA is tagged and deactivated with methyl groups. It now looks as though the extra \Jchromosome\j somehow interferes with the methylation process, leading to the conditions which define the syndrome. Once the human \Jgenome\j sequence for the whole \Jchromosome\j is available in a few years, this will probably prove to be a key discovery.
#
"Plants and chromosomes",154,0,0,0
(Jun '97)
Plants, as opposed to humans, have no equivalent to Down's syndrome, and no real problem with extra chromosomes. Indeed, plants commonly undergo tetraploidy, a doubling of the number of chromosomes, as new species develop, so that taking a karyotype, a \Jchromosome\j count, is often a good hint to the species relationships in some plant genera.
With the release of the entire \Jgenome\j of \ISaccharomyces cerevisiae\i, it is now possible to look for evidence of this yeast having duplicated its entire \Jgenome\j. Why would it bother? Well, when a species has a spare set of genes to monkey around with, evolution can tinker freely, as long as one of each pair of genes can still carry out its old task.
A study reported in June in \INature\i indicates that there must have been a doubling of the whole \Jgenome\j in \ISaccharomyces\i, just after it diverged from \IKluyveromyces\i, and that many of the duplicated genes were later deleted, leaving a significant number of duplicates still to be seen today.
#
"Origins of new species",155,0,0,0
(Jun '97)
Where do new species arise? For some time, theorists have said that new species would not arise in the heartlands, in habitats where everything is just right for the individuals which were there. Rather, the new species would arise in the \Jbadlands\j, where excess individuals had been pushed to live or die as best they could. The edges, the "ecotones", they said, were probably still too close to the original mass of genes, and any new developments would be swamped by newly ejected members of the original species.
Now it appears that ecotones may be the places where new \Jrainforest\j species emerge. Thomas Smith of \JSan Francisco\j State University and his colleagues studied the \JCameroon\j's little greenbul, \IAndropadus virens\i, a small green bird which lives in both the tropical \Jrainforest\j and the ecotone. As a general rule, the ecotone-dwelling little greenbuls are heavier and have deeper bills and longer wings and legs than their \Jrainforest\j cousins. Genetic analysis, however, reveals many similarities between the populations, suggesting a considerable gene flow, equivalent to about one to 10 migrants per generation joining each population of birds from the other "camp".
In other words, the selection pressures in the two zones can overcome any "swamping effect". So if gene flow isn't a barrier to the formation of new species, then perhaps the ecotone may be more important than theorists have thought.
#
"Dogs evolved earlier than thought",156,0,0,0
(Jun '97)
How long have dogs been teamed up with humans, and how important are they to human \Jevolution\j? The quick answers, say some theorists, may be "a very long time" and "very". Archaeologists, on the other hand, can only trace domesticated dogs back to about 14 000 years ago.
A study in \IScience\i this month suggested that the wolf may have become a dog as much as 100 000 years ago, or even more. The same study showed that the dog has only one wild ancestor, the wolf, and revealed that there is no such thing as a "pure breed".
A team led by Robert Wayne, an evolutionary biologist at the University of \JCalifornia\j, Los Angeles, studied samples from 162 wolves from around the world, and 140 dogs from 67 breeds, and five mixed breeds. They sequenced a haplotype from the DNA in the mitochondria, a region which is similar in dogs and wolves, but different in jackals and coyotes. The dog haplotypes did not sort by breed, so individuals of the same breed might carry different haplotypes, indicating a mix of ancient doggy ancestors, even in the "purest" of dog breeds.
Using a "genetic clock" method, Wayne suggests that the dog and the wolf diverged 135 000 years ago, although others are doubtful about this dating, which may have been flawed by selection effects at certain points. But if dogs and humans have been together for that long, this will probably end up having strong implications for the history of human \Jevolution\j-a human's linguistic and toolmaking skills, coupled with the dog's senses of smell, vision and hearing, would make a formidable hunting and fighting team.
#
"Fossils in the news",157,0,0,0
(Jun '97)
\BGetting blood out of a stone\b
It turns out to be possible after all, provided the stone is a \Jfossil\j that was once a bone. And maybe it's not quite blood, but haemoglobin, the oxygen-carrying protein in blood. Researchers from Montana State University, \JIndiana\j University, and the University of Wyoming report that they have found haemoglobin in a 65- to 67-million-year-old \ITyrannosaurus rex\i bone. The team, led by Montana's Mary Schweitzer, extracted material from the \Jfossil\j and tested it with a whole range of techniques, including ultraviolet, visible, and Raman \Jspectroscopy\j, nuclear magnetic \Jresonance\j, and \Jelectron\j spin \Jresonance\j, all of which seem to indicate that they have the core of haemoglobin.
To test the matter further, the team injected some of the material into rats and showed that it caused the formation of \Jantibodies\j. Better still, the haemoglobin resembles bird haemoglobin, exactly what we would expect if today's canary was once yesterday's \Jdinosaur\j.
This could, of course, be the start of something big, not in the \IJurassic Park\i sense, but in the sense of unravelling relationships, using haemoglobin variations to draw up reliable family trees. Others are less hopeful, pointing out that even 50 000 year old haemoglobin would be an unusual survival, let alone protein more than a thousand times older.
\BOldest young \Jdinosaur\j\b
Portuguese scientists report finding, at Lourinha, a \Jdinosaur\j embryo which is 140 million years old. Lourinha is a small seaside town 60 kilometres north of \JLisbon\j, known for decades for its wealth of Jurassic fossils.
The egg cache which contained the embryo was found by two local amateur \Jfossil\j-hounds, Oratio and Isabel Mateus, who have a small \Jdinosaur\j museum in Lourinha. One of the crushed eggs, about 18 cm long, was found to contain bones, possibly of a theropod, the group of three-toed, meat-eating dinosaurs which includes \IT. rex\i, since theropod remains have been found in the area before. Virtually all previous finds have been of Cretaceous \Jdinosaur\j embryos, which makes this much older embryo an exciting one. Aside from anything else, it would allow palaeontologists to start linking isolated eggs to particular species.
\BChitin in ancient fossils\b
\JChitin\j is a common biological molecule which is found in many organisms, notably arthropods such as crabs, \Jcrayfish\j and insects. \JChitin\j has always been thought to degrade rapidly when it is buried in sediments, and so it was considered unlikely to be preserved in the \Jfossil\j record beyond a few hundred thousand years. A June report records the finding of \Jchitin\j in \Jfossil\j insects preserved in shales nearly 25 million years old. So whatever determines \Jchitin\j preservation and degradation, it is not time alone.
#
"Nuclear news",158,0,0,0
(Jun '97)
\BRussian nuclear incident\b
\JChernobyl\j couldn't happen again, could it? Perhaps. Better safety procedures are in place throughout the Russian nuclear industry, aren't they? Not quite, though we don't hear much to the contrary.
At the Federal Nuclear Centre in the town of Sarov, one of \JRussia\j's restricted research cities, some 350 kilometres east of Moscow, a physicist called Alexandr Zakharov was injured by a burst of radiation from a research reactor in June. Researchers are supposed to work in pairs, but Zakharov was working alone when he mistakenly added too much fissile material, exceeding the \J\Jcritical mass\j\j, and was hit with a 1000-rem dose of neutrons, more than twice the maximum safe dose.
Apparently Zakharov, as a well-qualified senior researcher, was allowed to work on his own. Some days later, the only available report indicated that the laboratory was still being hit with a neutron flux too high to allow anybody to enter and take control of the reactor, and plans were in place to send in a robot under remote control to turn off the reactor.
\BSuperphenix sinks back into the ashes?\b
The new French government claimed its first victim in June, when the new prime minister, Lionel Jospin, said that the troubled French Superphenix nuclear reactor was to be abandoned. His socialist-led government is a coalition which includes the greens, one of whom, Dominique Voynet, now the environment minister, called the $10.5 billion reactor in Creys-Malville a "stupid financial waste" and had promised during the election campaign to close it down.
The 1200 megawatt Superphenix was planned 20 years ago as the world's largest fast breeder reactor, but it has produced power only now and then. In 1994 the former government agreed to downgrade it to use as a research reactor for disposing of \Jplutonium\j waste from other reactors, and the conversion began six months ago. The closure could have implications for the whole nuclear power industry in \JFrance\j, especially regarding the disposal of long-lived \Jactinides\j, by-products of nuclear power generation which would have been no problem with Superphenix. As \JFrance\j draws more than half of its power from nuclear sources, this could end up producing major changes in French society. While the first protests about job loss have already taken place, the reactor is widely regarded as a commercial disaster, so its closure will also leave a number of people, some of them in the pro-nuclear camp, feeling more relieved.
#
"El Nino rides again!",159,0,0,0
(Jun '97)
Fasten your seat belts, we're in for a stormy season or two: El Niño is back, bigger and tougher than before. The warming of the tropical \JPacific Ocean\j has been building during the past several months, and it will soon trigger a sequence of drastic shifts in \Jweather\j patterns all the way from India to \JCalifornia\j, say the experts.
After the more chaotic El Niño events of the early 1990s, this one looks like a classical pattern, leading forecasters to suggest that it will be like the El Niño of 1972, so this summer in the Northern Hemisphere should see the Indian monsoon weaken and the Caribbean region dry out. Next southern summer, at the end of this year, \Jdrought\j should strike \JAustralia\j, South \JAfrica\j, and north-east \JBrazil\j. At the same time, storms will bring extra \Jwater\j to \JCalifornia\j and the Southeast, or even much of the southern half of the United States.
#
"Gaia hypothesis update",160,0,0,0
(Jun '97)
The Gaia hypothesis of James Lovelock has led supporters of the idea to suggest that tiny \Jalgae\j, the ocean \Jphytoplankton\j, are in some way involved in acting as a thermostat to regulate the \Jearth\j's \Jtemperature\j. Now it appears, in a report by oceanographers Timothy Bates and Patricia Quinn of the US National Oceanic and Atmospheric Administration's Pacific Marine Environmental Laboratory in Seattle, that the thermostat may not exist. Writing in \IGeophysical Research Letters\i in April, they question the idea that ocean \Jphytoplankton\j produces dimethyl sulphide (DMS), which enters the \Jatmosphere\j and forms tiny particles which accumulate \Jwater\j vapor to form \Jcloud\j particles, reflecting the \Jsun\j and cooling the \Jearth\j.
Their study of a fifteen-year record of DMS in tropical Pacific waters shows little variation across that time, in spite of wide El Niño-induced swings in \Jtemperature\j and \Jcloud\j cover during that period. So if Gaia is interacting with El Niño in some way, it either isn't happening there, or it is happening in some other way.
#
"World population growing more slowly",161,0,0,0
(Jun '97)
In much of the world, \J\Jbirth control\j\j and other family planning measures have reduced fertility rates during the 1990s; but the United Nations still predicts that the world's population of almost 6 billion people will peak at about 11 billion by 2100. Now there is good news: the human breeding rate is falling away. That seems to be the point of an unusual forecast made in \INature\i this month. Rather than giving the usual "low", "middle" and "high" estimates for future population, the researchers have attached probabilities to different population sizes in 13 regions of the world up to 2100, and they find that there is a probability of two-thirds that the world's population will not double in the twenty-first century.
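For readers who wonder how a forecast can come with a probability attached, here is a minimal sketch in Python of the general idea, many trial futures rather than three fixed scenarios. It is not the researchers' model: the starting figure of 6 billion comes from the article, but the growth rate, its random drift, and the number of trial runs are all invented for the illustration.

  import random

  random.seed(1997)

  def one_future(pop=6.0, rate=0.014, years=103):
      """Trace one possible population path from 1997 to 2100 (billions)."""
      for _ in range(years):
          # The growth rate drifts randomly, with a slight downward trend
          # (all of these numbers are assumptions made for the example).
          rate += random.gauss(-0.00015, 0.0008)
          pop *= 1 + rate
      return pop

  runs = [one_future() for _ in range(10000)]
  doubled = sum(1 for final in runs if final >= 12.0)
  print("Chance of doubling by 2100: %.0f%%" % (100.0 * doubled / len(runs)))

The probability of doubling is then simply the fraction of simulated futures in which the population reaches 12 billion, which is how a statement like "a two-thirds probability of not doubling" can be made at all.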
On the other hand, the proportion of elderly people will rise. Some of the poorest parts of the world, like the \JMiddle East\j and sub-Saharan \JAfrica\j, could still see their populations triple over the next 50 years, as India's has done since independence in 1947, while eastern Europe is actually expected to see a drop in population. The increase in the number of aged people, however, is said to be "almost certain".
#
"Russians dwindling faster",162,0,0,0
(Jun '97)
The rise of the grey folk may not be seen in \JRussia\j, it seems. Hard-drinking Russian men continue to die young these days, but \JRussia\j's recent downward spiral in life expectancy hides the fact that life expectancy in the former \JSoviet Union\j (FSU) levelled off more than 30 years ago before falling during the last decade, according to a report released by the US National Research Council (NRC).
The latest figures collected by the US \JCensus\j Bureau reveal that the average life expectancy for men in \JRussia\j, for example, fell from about 65 years in the mid-1980s to 57 years in 1994. Life expectancy numbers began levelling off in the \JSoviet Union\j as early as the 1960s, shot up in the late 1980s, after Gorbachev's short-lived vodka prohibition, but soon plunged again.
So it may be the vodka that is causing it, or then again, it may not. A Russian study of European noble families suggests that daughters of older fathers may have shorter life-spans. The researchers suggest that the finding reflects the accrual of gene mutations as fathers age: specifically, damage to "housekeeping" genes on the paternally transmitted X \Jchromosome\j.
Leonid Gavrilov and his wife, Natalia Gavrilova, studied data from 700 families, including 2159 daughters and 4942 sons born in the 1800s, tallying only children who survived past 30, to minimise the effect of infectious diseases on mortality. Daughters born to fathers in their thirties lived, on average, to an age of 74.5, but if the fathers were in their fifties, daughters' life-spans were about 2 years shorter, at 72.4. When the data are controlled for maternal age, parental longevity, and other variables, the difference is closer to 3 years.
The logic is that ova are manufactured early on, and stored by the mother until they are released in \Jovulation\j, safe from \Jmutation\j. Sperm cells are manufactured throughout adulthood, and so are vulnerable to mutations that accumulate over time. Boys, getting a Y \Jchromosome\j with very little information on it, seem not to be affected, while girls, getting an information-rich X \Jchromosome\j, may be affected by the mutations it carries. The only problem is that the other chromosomes, common to both males and females, should also show an effect as the father ages, but no such effect appears.
#
"United States 1997 census",163,0,0,0
(Jun '97)
The USA is gearing up for its biggest ever \Jcensus\j, and as happened in 1890, they are running into problems. Consider the \JConstitution of the United States\j of America, especially Article One, Section Two of that document. Politicians are a particularly suspicious breed, and the American colonial ones who created the US Constitution were no different. To prevent any single state from dominating Congress, they provided that the membership of the Congress should be in proportion to the \Jcensus\j which was to be taken at ten-year intervals.
This scheme was fine for the first hundred years or so, when there was only a small population, but the United States was opening up fast, and swarms of immigrants were pouring in from Europe. By 1886, there was trouble: the results of the 1880 \Jcensus\j would not be available by the time the 1890 \Jcensus\j came due. In 1880, there were slightly more than fifty million Americans to be counted; by 1890, there were over sixty-three million, and it seemed certain the provisions laid down by the Constitution were about to be defied. Something \Ihad\i to be done!
(In \JAustralia\j, the local politicians were framing the Australian constitution at about the same time, and they were cunning enough to commit themselves only to using the "latest statistics of the Commonwealth", and giving the Commonwealth the power to make laws about censuses and statistics.)
The solution then was the invention of the Hollerith card. Many of the questions on the \Jcensus\j form called for yes/no answers, and these could be represented by holes on a card, just as they were in a Jacquard loom. The more complex questions, such as age, could be coded with a group of holes. Once the cards were punched, they could be read by gently lowering pins onto the surface of the card.
Where there was a hole, the pin made an electrical connection with a pool of mercury below. Where there was no hole, there was no connection. The 1890 \Jcensus\j was completed in a little over two years, and Hollerith had founded one of the companies which was later to become \JIBM\j.
Now there are more than 250 million Americans, and the bulk of data is no problem, but the bulk of missing data is a serious concern. According to the National Research Council (NRC), the US \JCensus\j Bureau should use sampling techniques to estimate the number of people not tallied by traditional surveys.
Last year, the Bureau proposed that door-to-door surveys be carried out for 1% of all US households as part of the 2000 \Jcensus\j. The survey results would be used to confirm the accuracy of the standard mail-in questionnaires and also to estimate data for the uncounted population. The bureau believes that such techniques would save about $700 million. Many Republican members of Congress have objected to such sampling, however, in part because they claim it might boost counts for urban areas, resulting in some congressional districts being redrawn in favor of Democrats.
Now the plan has been given the nod of approval, with a few suggested improvements, by the NRC, who say that sampling would yield a much more accurate national head count at a more reasonable cost. It was the dark suspicions of politicians in the late 18th century that gave us the punched card in the late 19th century, and the computer age of the late 20th century. Who can say where the dark suspicions of today's politicians may lead in 200 years' time? Your reporter thinks it may have something to do with better sampling methods.
#
"Scrapie, an end to it?",164,0,0,0
(Jun '97)
The British government has announced a boost to spending aimed at wiping out scrapie, the mysterious prion disease linked to mad cow disease (bovine spongiform encephalopathy or BSE), which in turn has been linked to a variant of human Creutzfeldt-Jakob disease. Scrapie has been present in British sheep flocks for over 200 years, and the new spending is part of a larger plan to wipe out BSE.
Oddly, scrapie is found in Europe and America but not in \JAustralia\j or \JNew Zealand\j, despite the fact that many of the animals in those countries originally came from UK flocks that may have been infected by the disease, raising the question of how environment (or perhaps a long sea voyage?) influences the way a disease is transmitted and maintained in a population.
#
"Bill to outlaw cloning",165,0,0,0
(Jun '97)
Reacting to the news of Dolly the sheep, President Clinton commissioned a report on \Jcloning\j from the National Bioethics Advisory Commission (NBAC), and now indicates that he will be sending a bill to Congress that would outlaw the \Jcloning\j of humans.
President Clinton had already ordered a moratorium on human \Jcloning\j with federal funds, and asked the NBAC to report in ninety days. The report has avoided ethical questions for the moment, looking instead at the safety issues. It took 277 attempts to get a single healthy lamb, so the NBAC concludes that the method carries too many risks, not only for the "child", but also of psychological harm to infertile couples who might be tempted to pursue it.
While noting that such methods could, in the future, offer ways of saving lives, the President agreed that the time was not right. This may be seen later as a dangerous precedent, since it makes certain types of research a criminal activity, but at least it is not a blanket ban of the sort urged by some of the more uninformed critics of \Jcloning\j.
#
"Men are inferior after all?",166,0,0,0
(Jun '97)
In humans, sex is determined by the chromosomes. Every \Jovum\j contains an X \Jchromosome\j, while sperm cells contain either a Y \Jchromosome\j or an X \Jchromosome\j. The Y \Jchromosome\j, apparently almost free of genes or information, is found in males (identified as XY), while women have two X chromosomes (XX).
A few women have what is known as Turner's syndrome: they have just a single X \Jchromosome\j (X0). The condition affects about one in 2500 females, who are short in stature; their intelligence is usually normal, but they often have social adjustment problems at school and throughout their \Jadolescence\j.
A study from a team led by David Skuse of London's Institute of Child Health, reported in \INature\i during June, compared Turner's women whose X \Jchromosome\j came from the mother (45,Xm) with those whose X was paternal (45,Xp). Members of the 45,Xp group were found to be significantly better adjusted, having superior verbal and higher-order executive function skills, all of which mediate social interactions.
Using genetic markers, the team identified 55 subjects who had received the \Jchromosome\j from their mother and 25 who had it from their father. Drawing on information from school counsellors about any social difficulties experienced by the girls, Skuse's group found that 40% of those with the maternal X \Jchromosome\j had problems at school, against only 16% of the girls with the paternal X. In other words, there was a clear effect, but it is by no means a perfect pattern.
It appears from this that there is a genetic locus for social cognition, which is imprinted, and which is not switched on when the X \Jchromosome\j is maternally derived. Males are known to show a wider range of develop\Jmental disorders\j of language and social functioning, such as autism, than 46,XX females, and males only ever have an X \Jchromosome\j from their mothers, suggesting a possible explanation of this established fact.
In the past, the differences have mostly been blamed on social conditioning or hormonal effects: now it looks as though the causes of these conditions lie far deeper. But why is it only \Isome\i males?
#
"Arecibo awakes",167,0,0,0
(Jun '97)
Good news for astronomers: the Arecibo dish is awake and active once more. Arecibo, built into a mountain sinkhole, is the world's largest radio \Jtelescope\j, the sinkhole having been turned into a dish 305 metres (1000 feet) across. While this dish cannot be directly steered, operators can move the collecting equipment over the dish surface, effectively pointing it at different parts of the sky. This month, after a five-year, $27 million revamp, the \Jtelescope\j is working once more.
#
"Wasps help date rock painting",168,0,0,0
(Jun '97)
How do you date a rock painting on a cave wall or a rock overhang? Carbon dating is often useless, and other dating methods are all open to criticism by those who disagree with you. Perhaps wasps can help, according to a paper in \INature\i this month, co-authored by eleven of \JAustralia\j's top names in the dating field.
Mud-nesting wasps build nests that become petrified after they are abandoned. In northern \JAustralia\j, the nests may lie over paintings, or be covered by them, and these nests contain \Jpollen\j, spores and phytoliths which can yield information about local plants from the time when the nest was made.
The authors used a mix of optically stimulated \Jluminescence\j (OSL) and accelerator mass spectrometry (AMS) 14C dating of \Jpollen\j to determine the ages of mud-wasp nests associated with rock paintings in the Kimberley region of Western \JAustralia\j. They dated individual sand grains in the mud, and showed that some paintings showing human figures are older than 17 000 years.
While this is less exciting than the claim of up to 176 000 years made last December, it offers a technique which is applicable all over the world where the mud wasps are found, and a confirmed date which can now be extended.
#
"Solid lubricants",169,0,0,0
(Jun '97)
When the going gets tough, machines need lubrication. When the going gets even tougher, in heat or in cold, or in the vacuum of space, working parts still need lubricants, but the traditional liquids are no use any more. And so we enter the world of the solid \Jlubricant\j, such as \Jgraphite\j.
Now there is a new kid on the solid \Jlubricant\j block, a hollow tube, rather like the tubular fullerenes, made of \Jtungsten\j disulphide. Instead of slipping between weakly bonded crystal planes, these nanoparticles may lubricate by rolling. The tubes are chemically inert, and they seem to be able to roll because of their peculiar cage-like structure.
Metal dichalcogenides with the formula MX\D2\d (where M is, for instance, molybdenum or \Jtungsten\j and X is sulphur or \Jselenium\j) are widely used as solid lubricants, but the hollow nanoparticles (often referred to as HNs) are a new development.
#
"DNA fingerprints from fingerprints",170,0,0,0
(Jun '97)
A new method has now been found which will allow crime scene examiners to extract DNA from the fingerprints left behind at a crime scene. Swabs taken from objects such as \Jtelephone\j receivers and knife handles yielded enough DNA to produce a genetic profile, sometimes several profiles of several individuals who had been in contact with the object. Now police will need to be even more careful to avoid "contamination" of the crime scene.
#
"Equatorial ice cores",171,0,0,0
(Jun '97)
Ice cores from equatorial glaciers are just as useful as those from high latitudes in providing evidence of ancient climates. The ice core record from the Guliya ice cap (Qinghai-Tibetan Plateau) extends in detail back past 100 000 years ago and may even contain ice deposited 500 000 or more years ago, and this is the subject of serious research at the moment.
\JTibet\j, however, stays cold long enough to allow serious study. The mountains of \JBolivia\j are not such a good place to examine ice, and to extract the air bubbles trapped in it. The cores must be taken down the mountain, but that means descending into tropical heat which melts the ice and destroys the record. The creative solution: hot air balloons, which can be launched from the \Jglacier\j, drift clear of the mountain, and then drop quickly to deliver the ice cores to waiting freezers below!
#
"Ark debate, Plimer vs Roberts",172,0,0,0
(Jun '97)
Professor Ian Plimer had what could only be described as a minimal victory in his case against creationist Allen Roberts in Sydney, \JAustralia\j. While it was described by some as "a rerun of the Scopes trial", the case was launched against Roberts in the Federal Court on the ground that he had breached the Australian Trade Practices Act, in that he had distributed "misleading and deceptive" materials.
Plimer's case, which ran for seven expensive days in April, was that Roberts was selling tapes and publications which made claims about a supposed site for Noah's Ark in Turkey, and that this meant Roberts was engaged in trade or commerce. Roberts' organisation, operating as ArkSearch, had an income in excess of $50 000, but according to the judgement, it "lacked the necessary degree of system and continuity" to be considered a business.
Although Justice Ronald Sackville found in his June 2 verdict that some of Roberts' claims were false, he ruled that they did not constitute trade or commerce, meaning that the matter could not be one for his court. "The courts should not attempt to provide a remedy for every false or misleading statement made in the course of public debate on matters of general interest," said Sackville, whose ruling may be found at http://www.austlii.edu.au/au/other/fca
A case heard jointly, brought by a marine salvage expert against Roberts for breach of copyright, gave that claimant a verdict of $2500 in damages. This was small consolation to Plimer, but he was able to use the court as a forum to place in the public domain a great deal of information damaging to any continuing claims that Noah's ark is waiting for true believers, high in the mountains of eastern Turkey.
The way is open for Plimer to appeal the Sackville ruling that Roberts' actions did not constitute trade or commerce, and on June 20, he went ahead, lodging an appeal that will be heard by three Federal Court judges.
#
"No ice on the moon after all",173,0,0,0
(Jun '97)
Last December, we reported a hopeful account of ice on the moon: a radar signal from Clementine bounced very strongly off an area inside a large crater, the South Pole-Aitken basin. The return was much stronger than could be expected from silicate rocks, and this suggested frozen volatiles of some sort. Because the crater floor is in permanent shadow, ice seemed like a possibility.
Now similar strong reflections have been found in other craters, ones which are not in permanent shadow, making it likely that something else is responsible, possibly surface roughness. Or could it be some of the small comets, proposed by Louis Frank and John Sigwarth? Who knows, maybe they \Iare\i splatting into the moon after all, freezing until they sublime back into space.
#
"Deep Blue goes data mining",174,0,0,0
(Jun '97)
Fresh from its chess win against Garry Kasparov (see May), \JIBM\j's computer has demonstrated the difference between human intelligence and machine power by switching programs and turning to data mining, researching new drugs and maximising stock market returns. This sort of machine is best suited to searching large databases to find new answers.
We have to hope that Deep Blue's owners will keep in mind the old message GIGO, or Garbage In, Garbage Out. An infinite number of monkeys, given sufficient time, will produce anything, even the complete works of Shakespeare, and given enough data mining, you can find any pattern in any data set. A report in \IBusiness Week\i during June revealed that the single best predictor of variations in the Standard & Poor's 500-stock index, drawn from a United Nations \JCD\j-ROM of data, was butter production in \JBangladesh\j.
#
"Obituary for June 97",175,0,0,0
(Jun '97)
\BJacques Cousteau (1910-1997)\b
Jacques Cousteau was co-inventor of the aqualung, and the person who, more than anybody else, made two generations aware of the watery world around them. Ocean explorer, \Jtelevision\j host and confirmed conservationist, his passing will be regretted by many.
Together with engineer Emile Gagnan, he perfected the Aqua-Lung, the breathing device which led to the SCUBA diving system, during World War II, while he was a member of the French Resistance. He also made some of the first underwater films with the help of a waterproof camera case he designed. It was the underwater films that he and his team later made which made his name a household word.
#
"Pathfinder touches down!",176,0,0,0
(Jul '97)
Mars is the \Jplanet\j that features most in \J\Jscience fiction\j\j as a place where life might be or might have been. The world's most expensive beachball landed there, on time, on July 4: the first successful mission to Mars since the two Viking probes landed in 1976. It deployed almost exactly as planned, at the end of a journey which began last December.
Slowed by a parachute and then by retrorockets, the airbag-surrounded craft dropped the last few metres to the ground, bounced, and then settled to the ground in a giant outflow channel called \JAres\j Vallis. Just after reaching the surface, the \Jspacecraft\j began sending its first images of Mars, which were immediately taken up by Web mirror sites all around the world.
On July 5, a six-wheeled "micro-rover" called Sojourner rolled out onto the Martian surface. The size of an ordinary \J\Jmicrowave oven\j\j, Sojourner is fitted with a wide range of scientific equipment, and it is designed to pilot itself between the rocks, study the composition of the soil and rocks, and send the data back to the \JEarth\j.
While Sojourner has been described as an expensive remote-controlled car, this is not really the case. It takes eleven minutes to get a radio signal to the "car", and during that time, even at a rate of 0.4 metre a minute, the "car" could have travelled more than four metres, far enough to run into trouble. Even worse, Sojourner only takes still pictures, and they take eleven minutes to arrive back on \Jearth\j.
So Sojourner has an on-board computer, camera and \Jlaser\j scanner to help it select the safest way to its destination, once the \Jearth\j controllers have indicated where they want it to go. The rover is programmed to detour around any unforeseen obstacles, and because it only has enough power to either move or navigate, it progresses slowly, moving half a wheel-turn forward, then stopping to scan and analyse before moving cautiously forward again.
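The cautious move-stop-scan cycle described above is easy to lay out as a little Python sketch. This is purely illustrative, not JPL flight software: the step length, the 10% hazard rate, and the two-minute re-planning pause are invented for the example, while the 0.4 metres a minute comes from the article.

  import random

  random.seed(4)          # July 4, for luck

  HALF_TURN_M = 0.065     # assumed length of half a wheel-turn, in metres
  SPEED_M_PER_MIN = 0.4   # the rover's stated top speed
  GOAL_M = 5.0            # distance to a target rock chosen back on Earth

  travelled, minutes = 0.0, 0.0
  while travelled < GOAL_M:
      # Power allows the rover to move OR to sense, never both at once:
      travelled += HALF_TURN_M
      minutes += HALF_TURN_M / SPEED_M_PER_MIN
      if random.random() < 0.1:        # the scan finds an unforeseen obstacle
          # Sojourner detours on its own; asking Earth what to do would
          # cost a 22-minute round trip for the radio signals alone.
          travelled -= HALF_TURN_M     # back off...
          minutes += 2.0               # ...and spend time picking a way around
  print("Target reached after about %.0f minutes." % minutes)

The point of the sketch is the structure, not the numbers: every half wheel-turn is followed by a stop to scan and analyse, and any detour is decided on board rather than on \Jearth\j.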
The pictures from the \JAres\j Vallis look broadly like those from the Viking landing site, some 850 kilometres away, two red deserts strewn with assorted rocks, but they are actually quite different. The Viking site was an old \Jlava\j flow, but Sojourner appears to be wandering among rocks that have been carried there from great distances, presumably by the action of past floods. Nearby hills show bands which may be sedimentary layers or benches cut into the valley by rushing \Jwater\j.
Most scientists are hoping for sedimentary layers, where fossils might be found, but this must remain an open question until the next century, as Sojourner is unable to carry out such a search. Early next century, NASA hopes to return Mars rock samples to the \Jearth\j for detailed analysis.
Sojourner transmits its data to Pathfinder, which relays it to the \Jearth\j, as well as sending back meteorological data on the thin Martian \Jatmosphere\j from its built-in \Jweather\j station.
Perhaps the most surprising finding has been that Mars is more like \Jearth\j than anybody expected. A rock, nicknamed "Barnacle Bill" was given a chemical analysis by the rover's alpha proton x-ray spectrometer. The result: "Barnacle Bill" is richer in silicon than any meteorites of known Martian origin, about 10% richer, making it similar to the common \Jearth\j rock, \Jandesite\j.
The standard theory about the \Jearth\j's \Jandesite\j is that the \Jquartz\j (\Jsilica\j) and silicon levels are raised by the action of \J\Jplate tectonics\j\j. High-\Jsilica\j rocks on \JEarth\j generally come from volcanic eruptions fuelled by the sinking of plates of surface rock into the \Jplanet\j's interior, a process thought to be uniquely terrestrial. Now scientists will have to decide whether the theory is right, or whether Mars has aspects that nobody expected, like \J\Jplate tectonics\j\j, or liquid \Jwater\j, which can also be involved in the \Jsilica\j-enrichment process!
There is certainly evidence of liquid \Jwater\j there, and large amounts of it at that. The landforms around the site show evidence of what appears to be flood damage, including 4-metre-high ripples nearby, spaced 20 metres apart, exactly the sort of ripples left by the late stages of a catastrophic flood. The flow direction suggested by the ripples matches the orientation of a stack of boulders piled up against each other, as if by a flood, and the flow direction also matches the direction that orbiter images suggest the flood took.
Sojourner has been studying rocks at close range, and matching the \Jreflectance\j at different wavelengths with their chemical properties. Later, the \JHubble Space \JTelescope\j\j will be able to examine Mars at long range, identifying areas on the surface with similar reflecting properties, so one result of the mission should be a detailed geochemical map of the red \Jplanet\j.
Pathfinder is part of NASA's new Discovery program, which aims to conduct important space science experiments for less than $150 million each. According to NASA, the budget for Pathfinder was less than one fifteenth of the Viking program's, after adjusting for \Jinflation\j over the past twenty years. Another mission in the same series is the NEAR probe, which sent back high-resolution images of the asteroid Mathilde in July, described next.
Pathfinder is the first round of NASA's Mars Surveyor Program, a series of missions to be launched every 26 months over the next decade, so whatever is left unanswered by this mission should be here soon, without any more twenty-year gaps.
Next time, the robots will have some serious clout. The newest version, "Rocky 7", has been going through its paces in the Mojave Desert. Rocky, or its successor, will fly to Mars in 2001 - 2002. The existing Rocky has already travelled more than a \Jkilometre\j, placing scientific instruments and taking 500 photographs, manoeuvring with the help of stereo cameras mounted on its front.
#
"Mathilde--not your average asteroid?",177,0,0,0
(Jul '97)
During July, the Discovery program's Near \JEarth\j Asteroid Rendezvous \Jspacecraft\j (NEAR) passed close to the deeply cratered asteroid Mathilde, and gathered data showing that the asteroid is surprisingly light. Now scientists are beginning to wonder if the \Jasteroids\j are just loose collections of flying rubble. The flyby happened in late June, but the details were only available in the first few days of July, when they were largely submerged in Sojourner fever.
\JAsteroids\j are important to us, here on \Jearth\j, with one blamed for the wiping out of the dinosaurs and another now being linked to the late Eocene \Jextinction\j event. Another suspected asteroid impact destroyed thousands of square kilometres of forest near Tunguska, \JSiberia\j, in 1908. More importantly, the \J\Jsolar system\j\j's \Jasteroids\j, comets, and meteorites, the so-called primitive bodies, are where we can expect to find clues to the processes that controlled the formation of the early \J\Jsolar system\j\j, clues that will long since have been destroyed on larger bodies.
The NEAR mission has been the first close approach to a serious asteroid (Mathilde was estimated before the fly-by to be about 60 km across, larger than Gaspra and Ida, both visited by \JGalileo\j several years ago), but there will be an even better approach when NEAR reaches 433 Eros in early 1999. Mathilde is presently near its perihelion point, q=1.94 AU, which is what provided this flyby opportunity to the NEAR \Jspacecraft\j. Until the Eros encounter, Mathilde is our best "shot".
Mathilde (253 Mathilde, to be precise) is a main-belt asteroid. Discovered in 1885, it is believed to be named after the wife of Moritz Loewy, the then vice-director of the Paris Observatory. Unlike Gaspra and Ida, both S-type (stony) \Jasteroids\j like Eros, Mathilde is a carbonaceous, or C-type, asteroid. The carbonaceous \Jasteroids\j are mainly found in the outer regions of the asteroid belt, and they make up more than 75% of the known \Jasteroids\j. Chemically, they are rather like what the \Jsun\j would be if you boiled off the \Jhydrogen\j, \Jhelium\j, and other volatiles, or easily vaporised elements.
We now know that Mathilde is very lightweight, with a very dark surface, and it is covered with enormous craters, raising the question: how has it survived the huge impacts that must have made those craters? And how or why does it rotate so slowly (just once in every 17.4 days), when those cratering impacts should have left it spinning rapidly?
The observed craters range from over 30 km (18 miles) across to less than 0.5 km (0.3 miles) in diameter. There are more than five craters larger than 20 km in diameter on the 60 percent of Mathilde's surface that NEAR was able to photograph as it raced past at 36 000 km/hr, far beyond Mars' \Jorbit\j, and just 1210 km (750 miles) from the surface of the asteroid. In just 25 minutes, NEAR collected 534 images of Mathilde. The largest crater, more than 30 km across, is 6 km deep.
NEAR's observations tell us that Mathilde is slightly smaller than astronomers thought: only 53 kilometres (32 miles) in diameter, and surprisingly low in density, "weighing in" at around 1.3 grams per cubic \Jcentimetre\j, hardly more than \Jwater\j. The gravity at the surface is correspondingly feeble, roughly a thousandth of the \Jearth\j's, so an ordinary adult standing on the asteroid would weigh no more than the equivalent of about 100 grams here.
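Those two measurements, the 53-kilometre diameter and the density of 1.3, are enough for anyone to check the gravity for themselves with Newton's formula g = GM/r². The Python sketch below does the arithmetic; the 80-kilogram adult is an assumption made for the example.

  import math

  G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
  radius = 26500.0         # metres (half of Mathilde's 53 km diameter)
  density = 1300.0         # kg per cubic metre (1.3 g per cubic centimetre)

  mass = density * (4.0 / 3.0) * math.pi * radius ** 3
  g = G * mass / radius ** 2             # surface gravity in m/s^2

  adult = 80.0                           # kg; an assumed mass for the example
  equivalent = adult * g / 9.81          # what that weight feels like on Earth
  print("Surface gravity: %.4f m/s^2, about 1/%.0f of Earth's" % (g, 9.81 / g))
  print("An 80 kg adult weighs as much as %.0f g does on Earth"
        % (equivalent * 1000))

Run the sketch and the answer comes out near one thousandth of a g, which is why a visitor would need to move very carefully indeed.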
Mathilde remains one of the darkest objects known, reflecting only 4 percent of the incoming light, and the material making it up is surprisingly uniform: even the bottoms of the craters seem to be the same pale grey. When NEAR swung away from Mathilde, it took a series of images that astronomers intend to scour for signs of a tiny moon, but so far, no reports have been heard on this search.
Eros was first observed in August 1898, and it is about 14 x 14 x 40 km, with a rotation period of 5.27 hr. It is thought to be silicate rock, and it has an \Jorbit\j inclined at 10.8 degrees, an aphelion (the point most distant from the \Jsun\j) of 1.78 AU, and a perihelion (the point closest to the \Jsun\j) of 1.13 AU. At some 40 km across at its widest point, it is the largest of the "near-\Jearth\j \Jasteroids\j", objects whose orbits take them menacingly close to our \Jplanet\j.
The NEAR craft is fitted with a multispectral imager system, an x-ray/gamma-ray spectrometer, a near-IR spectrograph, a magnetometer and a \Jlaser\j rangefinder. It will study the size, shape, volume, mass, gravity field, and spin state of Eros, and send back data on surface properties such as the elemental and mineral composition, \Jgeology\j, morphology, and texture, while measuring internal details such as the mass distribution and magnetic field.
Unlike the earliest missions which used tape recorders, NEAR is well-equipped with data storage, having 1.7 gigabytes of solid-state storage. It will need this memory, because it will \Jorbit\j Eros for a year, studying its surface from altitudes as low as 24 kilometres (15 miles). The probe will end its mission on February 6, 2000, with a controlled landing onto Eros' surface.
#
"Big holes in Vesta",178,0,0,0
(Jul '97)
The asteroid Vesta, which is only 525 km in diameter, has a 450 km impact crater on it. Peter Thomas of Cornell University and his colleagues reported at the annual meeting of the Division for Planetary Sciences in July that repeated imaging of Vesta using the \JHubble Space \JTelescope\j\j had revealed a large jagged bite taken out of the asteroid's southern hemisphere.
By recording the silhouette of Vesta, they mapped a classic impact crater 450 km across and 8 km deep, with a 13-km-tall peak in the centre. The so-called basaltic achondrite meteorites are now likely to be bits of Vesta, thrown off in that impact. These meteorites closely resemble some small, 5-km \Jasteroids\j found near the zone in the asteroid belt where Jupiter can fling rocks gravitationally towards the \JEarth\j.
In the past, the puzzle has been to explain the force that tore these pieces away, but an impact capable of making a crater that size would certainly be able to hurl lumps of rock, several km across, out into space, setting them on an eventual collision course with the \Jearth\j.
The final answer will probably depend on sending a probe to Vesta.
#
"Death by asteroid",179,0,0,0
(Jul '97)
"The asteroid that wiped out the dinosaurs" is now widely accepted by scientists, but how many other \Jextinction\j events were caused in similar ways? \JSiberia\j, home of the Tunguska event, is also the home of the Popigai impact structure on the Anabar shield. Popigai is a remnant of a huge crater, 100 km across. The crater was formed some time between 5 and 65 million years ago, although it was probably older than 29 million years. Nobody could be more precise than that, until age determinations were carried out using \Jargon\j isotope ratios.
Now the impact is dated at about 35.7 ± 0.2 million years ago, almost the same time as the Chesapeake Bay crater off the coast of \JNorth America\j. This raises the interesting possibility that the two impacts may have been caused by two parts of the same object, and suggests that the impacts may have caused the mass \Jextinction\j of the late Eocene.
We now know that northern \JItaly\j has an iridium anomaly and shocked \Jquartz\j in Late Eocene deposits, and the dating for these finds is consistent with the Popigai dates. This age is also similar to that of the North American tektites, which have been associated with the Chesapeake Bay impact structure in the eastern United States, making it even more likely that the events were close together, and part of something big.
#
"Japan's new telescope in space",180,0,0,0
(Jul '97)
July saw the release of the first images from a new orbiting radio \Jtelescope\j, called HALCA, whose data were combined with signals from ground-based telescopes to simulate an enormous collecting dish. The images show a powerful jet of subatomic particles spewing from a \Jquasar\j, and demonstrate the power of a system with a hundred times the resolving power of the \JHubble Space \JTelescope\j\j.
HALCA is the first space-based radio antenna designed for interferometry, a technique that combines data from far-flung telescopes to create the equivalent of an enormous collecting dish. In effect, HALCA, once it is linked with other dishes on \Jearth\j, simulates a dish with a diameter greater than 30 000 km.
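The gain from such a huge virtual dish follows from the usual diffraction rule: the smallest angle a telescope can separate is roughly the observing wavelength divided by the dish (or baseline) diameter. The small Python sketch below does the sum; the 6-centimetre wavelength is an assumed radio observing band, and the Hubble figure is an approximation, neither being a detail given in the article.

  RAD_TO_ARCSEC = 206265.0     # arcseconds in one radian

  wavelength = 0.06            # metres; an assumed observing band near 5 GHz
  baseline = 3.0e7             # metres; the simulated 30 000 km dish

  theta = wavelength / baseline * RAD_TO_ARCSEC
  hubble = 0.05                # arcsec; Hubble's approximate resolution

  print("Smallest separable angle: about %.4f arcseconds" % theta)
  print("Roughly %.0f times finer than Hubble" % (hubble / theta))

With those assumed figures the answer lands at around a hundred times Hubble's resolving power, which is just the improvement claimed for the new system.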
HALCA was built by scientists at the Japanese Institute for Space and Astronautical Science. Launched in February, it should allow astronomers to see further than ever before and make precise observations of objects that emit radio waves, such as quasars and black holes. The particle jet in the first images comes from a \Jquasar\j which may contain a black hole within it, though this remains uncertain.
#
"Sulfur and earth's outer core",181,0,0,0
(Jul '97)
Sulfur hit the news in July, when a group of scientists led by Philippe Nasch offered a new view on an old problem. The outer core of \JEarth\j is mainly molten iron, but there must be some lighter alloying elements in the outer core as well, because the measured speeds of the compressional sound waves which travel through the outer core do not match the values expected from experimental determinations and extrapolations of the properties of molten iron.
There are several possible candidates, including sulfur, nickel, oxygen, and \Jhydrogen\j, but it can now be shown, by ultrasonic interferometry, that the sound velocity in an Fe-Ni-S liquid increases with increasing \Jtemperature\j. This is unexpected, and the researchers attribute the anomalous behavior to the presence of sulfur, and believe that the behavior could be used to distinguish sulfur from other candidate core components.
While not being a discovery as such, the finding reflects the way in which science progresses, with a small observation opening the door for further developments.
#
"Sulfur in a solar nebula",182,0,0,0
(Jul '97)
Chondrites are meteorites which contain "chondrules" that are assumed to represent condensates from the earliest formed components of the solar nebula. They also contain a set of sulfide minerals which seem likely to have been formed by an interaction of iron-nickel alloys with \Jhydrogen\j sulfide gas, according to work led by Dante S. Lauretta, a young research scientist who appears to have a bright future in front of him, no word-play being intended.
According to Lauretta and his colleagues in \IScience\i, the chondritic sulphides may be condensates from the solar nebula and so be available to be used as tracers of the \Jtemperature\j in the nebula when these phases formed. The work stems from placing a solar composition Fe-Ni alloy in a \Jhydrogen\j sulfide-rich gas.
#
"Europa has an atmosphere",183,0,0,0
(Jul '97)
Remember \JGalileo\j, our emissary to Jupiter? While \JGalileo\j has been touring the Jovian system, it has been taking the odd glance at Europa, studying occultations, those moments when Europa lies between \JGalileo\j and a star.
The purpose of the sneaky looks: to identify any signs of an \Jatmosphere\j around the \Jsatellite\j. It now appears that there may be a thin \Jionosphere\j, produced by particles impacting on the \Jwater\j ice that covers the \Jsatellite\j. If the \Jatmosphere\j is rich in oxygen or \Jwater\j, as appears to be the case, the \Jtemperature\j of the tenuous gas could be as high as 340 kelvin, about 65 degrees \JCelsius\j or 150 degrees Fahrenheit. Like Ganymede and Callisto, Europa lies inside Jupiter's magnetic field, its magnetosphere, and because of this unusual position, Europa's \Jionosphere\j will be heated by the action of the \Jplanet\j's magnetic field.
#
"Rhea and Dione can have ozone holes",184,0,0,0
(Jul '97)
Similarly, Saturn has two satellites, \JRhea\j and Dione, which \Jorbit\j inside that \Jplanet\j's magnetosphere. This means that they are subjected to continual particle radiation from trapped ions, much as happens on the three Jovian satellites mentioned above.
All five satellites have \Jwater\j-rich surfaces, and the presence of \Jwater\j-oxygen atmospheres was predicted by laboratory studies before the recent discoveries of atmospheres on Ganymede and Europa. Now it appears that there is a measurable build-up of ozone on Dione and \JRhea\j, detectable from a distance by spectrometer, matching the earlier discovery of ozone on Ganymede.
#
"Dating the Old World monkeys",185,0,0,0
(Jul '97)
Among the monkeys, those of \JAfrica\j and Asia are known as the Old World monkeys, while the South American monkeys are the New World monkeys. Until now, we have only been able to hazard a guess that the monkeys split off from the stock which became us, somewhere between the late Oligocene and early Miocene, 28-22 million years ago.
Now the discovery of the oldest known \JOld World monkey\j skull, which is some 15 million years old, provides further fuel for debate about the division, which eventually resulted in the \Jevolution\j of Old World monkeys on the one hand, and the great and lesser apes, and humans, on the other.
Skull structures are commonly used to draw conclusions about the various members of the different \Jape\j genera. Unfortunately, good \Jfossil\j skulls, unbroken and uncrushed by the weight of sediments, are uncommon, and this makes it hard to work out which characteristics are primitive, there from the start, and which are recently derived. Obviously any characteristic which has evolved separately in two groups is not good evidence for any sort of relationship.
The discovery of a complete and undistorted skull of \IVictoriapithecus\i in middle Miocene deposits from Maboko Island, \JKenya\j, provides evidence that some of the characteristics which have been ignored in the past are primitive, allowing the experts to delve a great deal further into the relationships between different groups.
\IVictoriapithecus\i represents a branch of \JOld World monkey\j that appears to be intermediate between today's cercopithecids (Colobinae and Cercopithecinae) and the common ancestor they shared with apes (Hominoidea). This \Jfossil\j possesses a number of key skull features, and if these features were present in the ancestral form, they were probably not derived separately by \Jfossil\j forms such as \ISivapithecus\i and \IDryopithecus\i from Eurasia and the living orang hutan (or \Jorang-utan\j) (\IPongo\i) from \JBorneo\j and \JSumatra\j. You can expect to hear more arguments based on \IVictoriapithecus\i over the next few years.
#
"Chimp retirement plan",186,0,0,0
(Jul '97)
Large populations of chimpanzees were bred in the USA for AIDS research and then left unused, once people realised that the animals were a bad model for the disease. An expert panel has now proposed that the National Institutes of Health acquire the approximately 1000 chimps the government already supports, and shelter them for the rest of their lives. Or will the economic rationalists have them sent to a new theme park?
#
"Neandertal Man partly cloned",187,0,0,0
(Jul '97)
The individuals most of us learned to call \JNeanderthal Man\j, although the modern spelling of "Neandertal" is now more common, have been partly cloned.
That is to say, the mitochondrial DNA of one Neandertal has been extracted and analysed. The results will be less than pleasing to the "lumpers", who see the Neandertals as humans like us, but of a slightly different form, and pleasing to "splitters" like Chris Stringer of London's Natural History Museum, who has long held that the Neandertals were a different species. On this evidence, the Neandertals were not our ancestors, but an evolutionary dead end.
More importantly, the results have been replicated in an independent laboratory, so there now appears to be no doubt that the individual sequenced was well separated from our own line of development. The evidence comes from a small amount of independent DNA, found not in the nucleus, but in small bodies in our cells, the mitochondria, and it says that there is a gap of 500 000 years of independent \Jevolution\j between the Neandertal studied, and modern humans, half a million years of travelling down different evolutionary branches.
We only inherit our mitochondria from our mothers, so while we have no way of knowing much about our fathers or our male ancestors on either side of the family tree, we can trace an individual's female ancestry back to the year Dot through a single unbroken female to female line. Every person alive today has inherited mtDNA through a line of female ancestors that disappears into past ages.
As time goes by, mutations slowly creep in at unimportant points in the DNA, and from this, we can estimate how far back two different lines diverged. The mitochondrial DNA does nothing but sit there like a luggage label, so there is no evolutionary selection operating on it, so far as we know. So when you sequence the mtDNA of two individuals, you can tell, from the differences, how long it is since they had a common female ancestor on that key female to female line.
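The arithmetic behind such a "clock" is simple enough to set out in a few lines of Python. The figures below are invented for the illustration: a divergence of 2% of sites and a rate of 5e-8 substitutions per site per year are assumptions chosen only to show the method, not values from any particular study.

  # Minimal molecular-clock arithmetic, with figures invented for illustration.

  divergence = 0.02      # assumed: the two sequences differ at 2% of sites
  rate = 5e-8            # assumed substitutions per site per year, per lineage

  # Both lineages mutate independently after the split, hence the factor of 2:
  years = divergence / (2 * rate)
  print("Estimated time since the common ancestor: %.0f years" % years)

Note the factor of two: both lines have been accumulating changes since the split, so the differences pile up at twice the single-lineage rate.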
In \IHomo sapiens\i, the mtDNA is simply a marker, and it seems to say that we all had a common ancestor 200 000 years ago, or maybe a bit more, as the "clock" is open to some question. That at least is what the "mitochondrial Eve" supporters say: based on the evidence from mitochondrial DNA, there existed a female in \JAfrica\j, some 200 000 years ago, who was the ancestor of us all. Not surprisingly, the lady has been dubbed mitochondrial Eve.
The opposing theory says humans evolved in \JAfrica\j two million years ago, radiated out as \IHomo erectus\i, and then evolved into \IHomo sapiens\i in their new homes. A variety of racial groups arose, but they were all part of the human story, and all contributed to the modern pool of genes that is humanity. When we look at the \Jfossil\j skulls, and there are several hundred of them available, the evidence seems to point quite strongly to a much longer and more continuous relationship. We came out of \JAfrica\j, say the fossils, and Peking Man became the modern Chinese, \JJava Man\j became the people of the Pacific and \JAustralia\j, and so on.
That argument is still a matter for some dispute, with "Eve" supporters wanting to exclude the Neandertals from our line of development, and the "Eve" opponents wanting to include all of the humans into our general family tree. They point to a typically Neandertal skull characteristic, still found in 4% of modern Europeans, and argue for a Neandertal input into our make-up.
A report in the journal \ICell\i this month would appear, on the surface, to push the Neandertals out of our tree. Scientists in \JGermany\j and the United States worked on a sample from the upper arm bone of the prototype Neandertal skeleton, discovered in \JGermany\j in 1856. Working with extreme care to avoid contamination, Matthias Krings and Svante Pääbo of the University of Munich and Anne Stone and Mark Stoneking of \JPennsylvania\j State University all took part. Krings used the polymerase \J\Jchain reaction\j\j to amplify part of a particular sequence called the control region in the DNA of mitochondria.
The team found variations in stretches of the DNA which never vary in modern human samples, showing the DNA is ancient and uncontaminated. When they compared the Neandertal sequence with 986 distinct sequences from living humans, they found that the ancient DNA was three times more different than any two modern human sequences, putting it outside the statistical range of modern human variation.
The "splitters", of course, are delighted, but there is still room for the lumpers to retain some hope. The mtDNA only tells us about the female to female line of inheritance, and Neandertal genes could have come down to us from a line of descent that passed through a male at some stage. To the splitters, the finding implies that the Neandertal and "sapiens" lineages diverged before the first known Neandertal at about 300 000 years ago, and long before the first modern humans at less than 200 000 years ago.
As with so many other issues in science, there is evidence for each interpretation. Scientists tend to form their views on a selective reading of the data, shaped by their friendships and associations, and they shape those associations in accordance with their views. Each side tries to fault the methodology of the other side, but it is all done in a carefully scientific manner.
But will we see Neandertal Park? It is unlikely in the near future. To have even got this much material was something extraordinary, since DNA degrades over time, and is so easily contaminated. As one palaeoanthropologist commented: "The fact that they managed to find DNA from a region of prime importance is proof that there is a God who likes palaeoanthropology."
#
"AIDS transmission by kissing",188,0,0,0
(Jul '97)
A case was reported in July of \JHIV\j being passed on through kissing. A man who was \JHIV\j-positive passed the virus to his partner, apparently after kissing her just after brushing his teeth and flossing, which caused his gums to bleed. While the virus does not normally survive long in \Jsaliva\j, it apparently lasted long enough to make the transfer. Doctors have ruled out any alternative avenue of infection.
#
"Language acquisition",189,0,0,0
(Jul '97)
Humans have had a speech area in their brains for maybe 1.9 million years-a \IHomo habilis\i skull of that date has the imprint of a frontal-lobe language-sensitive region known generally as "Broca's area", according to Phillip Tobias, one of the world's experts in the arcane art of reading the inner side of skulls to get clues about the brains they once contained. We all use Broca's area, where "grammar knowledge" is stored, to speak our native language, but where is the grammar knowledge of other languages stored?
It seems that it depends when you learn the next language. Children who grow up bilingual process the two languages differently from those who learn a second tongue later in life. The bilingual children store both languages in Broca's area.
In a report in \INature\i during July, Joy Hirsch and her colleagues at the Memorial Sloan-Kettering Cancer Center in New York City described their studies on two groups, "early bilinguals", and "late bilinguals" who learned their second language as adults. The subjects were asked to \Ithink\i of a story in each language while the researchers used functional magnetic \Jresonance\j imaging (fMRI) to monitor brain activity. They had to only "think" the story, as any body movement, even the movements of speech, can distort an fMRI image.
In this "thought tale" in the native tongue, several areas of the brain "lit up" including Broca's area, and in early bilinguals, the same part of Broca's area lit up, while in late bilinguals, the activity centred on a region 9 mm away. In both late and early bilingual subjects, another part of the brain, the temporal-lobe language-sensitive regions (Wernicke's area) showed little or no separation of activity based on the age of language acquisition.
Based on this, it seems possible that adults learn new languages in a different way from the way they learned as young children. Or is the native language region "closed off" to a second tongue after a certain age?
#
"Baby talk language",190,0,0,0
(Jul '97)
Maybe the difference has something to do with the way that a language is passed to the learner-how many adults acquire a second language through baby talk? Patricia Kuhl of the University of Washington in Seattle has been looking at the effects of baby talk, and she believes that baby talk may help infants to learn the key features of vowel sounds.
Kuhl has been looking at "parentese" in three different languages-English, Swedish, and Russian-and with her colleagues, she has found that in all three languages, mothers produced exaggerated versions of vowels when talking to their babies, emphasising the features that distinguish the vowels from each other. Adult speech varies, and children must learn how to identify the differences which are important, and the differences which are unimportant, so any tendency to emphasise the important differences will obviously be useful.
While this observation does not prove that children need this sort of speech to learn language, it does suggest some very interesting lines for future enquiry, such as identifying cultures using different amounts or types of parentese, and the ways children learn in those cultures.
#
"Bacillus subtilis genome sequenced",191,0,0,0
(Jul '97)
Gram-positive \Jbacteria\j are those which are stained by Gram's stain. They all have cell walls made mainly of peptidoglycan, and they lack the outer membrane of lipopolysaccharide, lipoprotein and other large molecules, found in the Gram-negative \Jbacteria\j. \IBacillus subtilis\i is a Gram-positive soil bacterium which grows on simple media. Other Gram-positive \Jbacteria\j include \IStaphylococcus aureus\i, and the \Jbacteria\j causing \Janthrax\j and tetanus.
\IB. subtilis\i is a common laboratory species, and it is often used for studies on protein secretion, and for growing enzymes for industry. It is now the first of its group to have its \Jgenome\j completely sequenced.
After 5 years of work, leaders of a team of 37 laboratories announced the complete sequence of the 4.2-million-base \Jgenome\j in July. EU scientists completed 60% of the sequence, with Japanese researchers doing 30%. The remaining work was done by laboratories in the United States, Korea, and other countries. If you go to http://www.pasteur.fr/Bio/SubtiList.html, you will find 63% of the information stored there in an accessible \Jdatabase\j. More is expected to be online soon.
#
"Genetic discrimination law proposed",192,0,0,0
(Jul '97)
In mid-July, President Clinton urged Congress to pass a new federal law forbidding discrimination based on a person's genes, in an effort to prevent the misuse of human genetic data. Once this law is in place, it will remove a potential roadblock from some types of genetic research, so the proposed law is only partly about safeguarding the individual.
Mr Clinton said he wanted to make it illegal for any health insurance company to deny coverage to a healthy person on the ground that medical data indicate that the person is at risk for an inherited disease. The proposed law will also rule out insurance companies raising premiums on the basis of genetic data. A background statement added that the law would also protect privacy by stopping health plans from releasing or demanding access to genetic information without a subject's consent. At the same time, the law will make provision for responsible use of genetic information for biomedical research.
Broad support appears to be available in Washington, but it remains to be seen how far this can be undermined by lobbying from the insurance companies, or whether other nations will follow suit.
#
"Genetically engineered viruses",193,0,0,0
(Jul '97)
Maybe genetically engineered viruses are not so bad after all. Researchers at the Johns Hopkins Oncology Center in Baltimore, working with a Californian company called Calydon, have produced a mutant strain of the common cold virus which will target and attack the cells of the prostate \Jgland\j. As prostate cancer is the second biggest cancer killer of men in the western world (after lung cancer), this is good news indeed.
The virus can enter any cell, but only becomes active when it finds PSA (prostate specific \Jantigen\j) in the cell. PSA is a protein involved in the formation of semen, and only rarely found in other cells, so its presence is a very effective identifier of prostate cells.
The problem was to make the virus recognise that it is in a prostate cell. This was achieved by finding the human gene that produces PSA, and then identifying the regulatory region which only switches the gene on in prostate cells. The researchers then attached this switch to a gene in the viral \Jgenome\j which is needed for replication, so the switch now triggers viral replication, but only in cells of the prostate.
A report in this month's \ICancer Research\i (57:2559) shows tumors have been reduced to about a sixth of their former size in mice, and human trials are due to start soon. Like other cancers, prostate cancer sloughs off cells that move to other parts of the body in a process called metastasis, and the researchers hope in the future to use intravenous injections to send patrolling viruses around the body, seeking and destroying rogue prostate cells.
There is some evidence of the virus replicating in non-prostate cells, but the researchers say this is unlikely to present problems, pointing out that the underlying virus is one that we all catch and manage to deal with, while the modified form can only grow in a limited range of human cells, rather than in most of them.
#
"Monkeypox makes a comeback",194,0,0,0
(Jul '97)
In the Democratic Republic of \JCongo\j, monkeypox virus appears to be transferring between people more easily than in the past. Smallpox vaccinations used to give resistance to this disease as well, but these were so successful that the smallpox vaccinations were stopped some two decades ago, leaving a large pool of people who are unprotected against the related monkeypox.
This raises the question of whether the world's remaining stocks of smallpox virus should be destroyed. It may be that future researchers will need the smallpox virus as they try to defeat something like the monkeypox outbreak, which causes almost identical symptoms in its victims.
Between February 1996 and February 1997 at least 92 cases of the disease and 3 deaths were reported in a remote region of what was then Zaire. This compares with just 37 recorded cases from 1981 to 1986. Of the 89 survivors, 73% seem to have been infected by other people, while the 1981-1986 studies reported no more than a 30% "secondary contact" rate, according to the Czech epidemiologist and monkeypox expert, Zdenek Jezek, who carried out the study.
The immunity problems may stem mainly from the lack of smallpox \Jvaccination\j, but immunity in the region is further compromised because the Democratic Republic of \JCongo\j has high rates of infection with \JHIV\j, which cripples immune systems. The \J\Jcivil war\j\j may have contributed another factor: villagers, faced with the risk of starvation, may have increased their hunting of animals that carry monkeypox, which include monkeys, squirrels, and rats. The unrest has also made it risky for scientists from the U.S. Centers for Disease Control and Prevention (CDC) to work in the area. Until the area settles down, the way in which the disease is transmitted must remain a mystery, and a slightly worrying mystery at that.
#
"Sarawak's new killer disease strikes children",195,0,0,0
(Jul '97)
A mystery disease killed 31 children in \JSarawak\j from its first appearance in April up to the start of July. The children, twenty boys and eleven girls, were aged from five months to six years, and their symptoms appeared similar to those of "hand, foot and mouth disease", caused by the Coxsackie A16 virus and other enteroviruses, which has been raging on the Malaysian mainland, affecting more than 2000 children. The mystery disease causes the same rashes on the hands, feet and mouth region.
There is one big difference: hand, foot and mouth disease does not normally kill its victims, while the \JSarawak\j disease triggers heart failure; none of the mainland patients has died so far. A team from the recognised centre of expertise, the U.S. Centers for Disease Control and Prevention (CDC), is now in \JMalaysia\j trying to help the government identify the cause.
Malaysian health officials think that Coxsackie B virus, which is linked to myocarditis, could be a possible cause of the heart failure disease. Yet another possibility is enterovirus 71, which is known to cause heart failure and which has been isolated from several of the dead children. So far, researchers have managed to eliminate a number of viruses with serological studies, so they know that they are not looking at Japanese \Jencephalitis\j, \Jdengue\j, yellow fever or a Rickettsial infection, but that still leaves a very wide range of possibilities.
The latest details are to be found by accessing this URL, although it should be noted that some of the press releases are in Malay: http://ftp.sarawak.com.my/org/jkns/outbreak/virus1.htm.
#
"Yeast prion model",196,0,0,0
(Jul '97)
Creutzfeldt-Jakob disease (CJD) in humans, scrapie in sheep, and "mad cow disease" (bovine spongiform encephalopathy or BSE) are all fatal neurological diseases, marked by the accumulation of protein deposits in the brain. While all of these have been blamed on prions or "infectious proteins", evidence has been lacking up until now, and arguments for the theory have appealed rather more to logic than to fact.
During July, a report in \IScience\i described a yeast protein which behaves in the test tube exactly like the hypothesised prions. The idea has been that prion proteins convert from normal helices into durable sheets, and then act as a template, converting other proteins to the same sheet form. The technical details are too complex to describe fully here, but the results seem to support the prion hypothesis.
#
"New Zealand rabbits escape calicivirus",197,0,0,0
(Jul '97)
The \Jrabbit\j calicivirus disease (RCD), also known as \Jrabbit\j haemorrhagic disease, has spread from Asia into Europe over a number of years, and been deliberately introduced into \JAustralia\j in recent times. Trials in \JAustralia\j involved its use at an experimental station on Wardang Island, off South \JAustralia\j, but it escaped (or was deliberately or accidentally carried) to the mainland in 1995. After that, it was artificially carried across the country, both officially, and also unofficially by enthusiastic farmers.
In many areas, the disease has served to reduce drastically the number of rabbits in farming areas. The \Jrabbit\j is a widely recognised pest, introduced by English colonists in the 19th century, and found across most of \JAustralia\j. Many areas of infestation remain, but the virus has reduced \Jrabbit\j numbers in other areas, and accompanied by baiting, ripping and fumigation, has given many Australian farmers their best season for a decade or more. The only Australian complaints have come from \Jrabbit\j trappers who trap rabbits either for food or for their skins, used to make the traditional Australian felt hat of "the bush".
New Zealand, however, has decided to wait and see what happens in \JAustralia\j before trying to unleash the virus on their own introduced rabbits. They say they are still uncertain about how effective the virus would be in New Zealand, which lacks some insects probably responsible for spreading the disease in \JAustralia\j. New Zealand may also be too damp for the virus to spread effectively, say scientists.
Rabbits cause severe damage to about 9% of \JNew Zealand\j land, mainly in the South Island, and farmers in the South Island are less than impressed at the decision, while in \JAustralia\j, a large legal firm which specialises in litigation is planning to file a negligence suit against the CSIRO, the Australian government research body which was running the trials on Wardang Island.
#
"Genes may be patented",198,0,0,0
(Jul '97)
During July, members of the European Parliament approved an outline of legislation to determine what biotechnology inventions can be patented in the European Union. Genes and genetically modified animals may be patented under specific conditions, but plant and animal varieties and techniques directly related to human germ line manipulation or human \Jcloning\j are on the no-go list.
#
"Quantum physics update",199,0,0,0
(Jul '97)
Quantum physics has some most unpleasant and brain-hurting aspects, not only to ordinary mortals, but to people like Albert Einstein as well. He was fond of generating challenges that seemed to "prove" that quantum theory had flaws in it. Now one of Einstein's "impossibilities" appears to be possible.
The problem runs like this: when a particle such as an \Jelectron\j meets its antiparticle and annihilates, generating two photons, these photons are in an indeterminate state. If they separate by several light years, and you then measure one of them, this act of measurement shifts both the measured \Jphoton\j and the other \Jphoton\j, at the same instant, into a determinate state. This means there has been an action at a distance, an action which has travelled far faster than the speed of light. Not possible, said Einstein.
Bad news, Albert. Researchers using the fibre-optic lines of the \Jtelephone\j system in \JSwitzerland\j have shown that the links persist up to 10 kilometres. One cynic has told your reporter that this will lead to a situation where \Jtelephone\j companies will be able to bill you for calls before you make them.
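The strength of these "spooky" correlations can be captured in a few lines of the Python programming language. This sketch uses the textbook quantum prediction for polarisation-entangled photons, not data from the Swiss experiment: any "local" explanation must keep the CHSH quantity S at or below 2, while quantum mechanics predicts about 2.83.

import math

def E(a, b):
    """Quantum correlation for polarisation-entangled photons,
    measured at polariser angles a and b (radians)."""
    return -math.cos(2 * (a - b))

# The standard CHSH angle choices: 0, 45, 22.5 and 67.5 degrees.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # ~2.828, above the "local" limit of 2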
#
"Cambrian explosion",200,0,0,0
(Jul '97)
Something happened around 540 million years ago that triggered an explosion in animal life forms, often referred to as the Cambrian explosion. That something may have been the \Jevolution\j of vision, or the \Jevolution\j of hard parts to protect growing bodies, or these may themselves have been signs of something else. The latest theory is high-speed tectonic plates.
That is, high speed in geological terms, though slow in ours. \JFossil\j magnetism from several continents suggests, for example, that \JAustralia\j rotated 90 degrees in just 30 million years, and this sort of shift would have produced rapid variation in habitats, enough to trigger rather rapid evolutionary events. All of the major continental plates rotated and moved rapidly, to some extent, during the Early and Middle Cambrian, say the researchers.
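A back-of-envelope calculation shows what "high speed" means here. The 3000-kilometre figure below is an assumed distance from the pole of rotation, chosen purely for illustration:

import math

# Back-of-envelope: 90 degrees of rotation in 30 million years.
radius_km = 3000.0            # assumed distance from the rotation pole
angle_rad = math.pi / 2       # 90 degrees
years = 30e6

arc_km = radius_km * angle_rad              # ~4700 km of surface travel
speed_cm_per_year = arc_km * 1e5 / years
print(round(speed_cm_per_year, 1))          # ~15.7 cm/year, against today's typical 2-10 cm/year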
#
"Magnetic fields and cancer",201,0,0,0
(Jul '97)
A major study was reported in the \INew England Journal of Medicine\i, ending the claims that have been made since 1979 that exposure to electromagnetic fields (EMFs) from home wiring causes cancer. A major epidemiological study on 638 children with acute lymphoblastic leukemia, and a set of 620 matched controls in nine midwestern and mid-Atlantic states of the USA has investigated field strengths in each child's bedroom over a period of 24 hours. They also made spot measurements in the kitchen, the family room, and the room where the mother had slept during her \Jpregnancy\j.
The survey found no link between childhood leukemia and EMFs of 0.2 microtesla, the level previously alleged to be enough to trigger cancer. There was a "hint" of an association in homes with field strengths of 0.4 to 0.499 microtesla, but there were only 14 cases and 5 controls in this group, not enough to establish a case, and the "hint" was absent in homes with fields greater than 0.5 microtesla.
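A rough odds-ratio calculation, using the counts reported above, shows why 14 cases and 5 controls settle nothing. This sketch is illustrative arithmetic, not the study's own analysis:

import math

# High-field subgroup: 14 of 638 cases, 5 of 620 controls.
cases_exposed, cases_total = 14, 638
controls_exposed, controls_total = 5, 620

odds_cases = cases_exposed / (cases_total - cases_exposed)
odds_controls = controls_exposed / (controls_total - controls_exposed)
odds_ratio = odds_cases / odds_controls   # ~2.8

# Approximate 95% confidence interval (Woolf's method):
se = math.sqrt(1/cases_exposed + 1/(cases_total - cases_exposed)
               + 1/controls_exposed + 1/(controls_total - controls_exposed))
low = math.exp(math.log(odds_ratio) - 1.96 * se)
high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2), round(low, 2), round(high, 2))
# The interval runs from below 1.0 to nearly 8, so the "hint" could easily be chance.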
The NEJM editorial suggests that it is time to stop wasting research resources on this question.
#
"Paradoxical ice shelves",202,0,0,0
(Jul '97)
Spectacular pictures of ice crashing off towering glaciers on the Antarctic Peninsula have been used to illustrate the threat of global warming. Yet according to a report in \INature\i this month, global warming may actually cause the massive ice shelves of \JAntarctica\j to thicken.
The Antarctic glaciers have flowed out beyond the shore, where they float on the sea as attached ice shelves. In winter, when \J\Jsea ice\j\j forms further out in the oceans, that ice is pure, with no salt, because the salt is forced out of the ice crystals as they form. The remaining unfrozen sea \Jwater\j is thus saltier, and so more dense, than before. This brine is also at the surface freezing \Jtemperature\j of sea \Jwater\j, and as the denser salt \Jwater\j sinks, it holds this \Jtemperature\j.
Flowing below the ice shelves, 1500 metres below the surface, the \Jwater\j is still at the same \Jtemperature\j, because there has been no heat flow in or out. But the brine is now about a degree warmer than the freezing/melting point of the deep ice, because the melting point is lower 1500 metres down, due to the increased pressure. So the "warm" \Jwater\j can carve away ice from below. The flow of this "warm" \Jwater\j is greatest in winter, according to a team from the British Antarctic Survey, led by Keith Nicholls, who have been drilling through the Ronne Ice Shelf and studying the \Jwater\j below.
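The pressure effect can be put into numbers with the standard UNESCO formula for the freezing point of seawater. The Python sketch below illustrates the principle; it is not the Survey's own calculation:

def freezing_point(salinity, pressure_dbar):
    """Freezing point of seawater in degrees C (UNESCO 1983 formula).
    Salinity in practical salinity units, pressure in decibars
    (one decibar is roughly one metre of depth)."""
    s = salinity
    return (-0.0575 * s + 1.710523e-3 * s**1.5
            - 2.154996e-4 * s**2 - 7.53e-4 * pressure_dbar)

surface = freezing_point(34.5, 0)       # about -1.9 C at the surface
deep = freezing_point(34.5, 1500)       # about -3.0 C at 1500 m depth
print(round(surface - deep, 2))         # ~1.1 C: the brine's "spare" degree of warmth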
Global warming, with milder winters, says Nicholls, is likely to produce less \J\Jsea ice\j\j in winter, and that means less brine, and so less melting out of the underside of the ice shelves. But snow is still falling on the upper side, adding to the amount of ice on the ice shelf, and so the ice shelves will get thicker. So it may be that the ice shelves have less to fear from climate warming than we first thought.
#
"Penguins and voluntary hypothermia",203,0,0,0
(Jul '97)
Penguins can stay under \Jwater\j for a remarkably long time, even allowing for their blood being more oxygen-rich than ours. Now it looks as though the birds that fly under \Jwater\j have mastered the art of cutting off circulation to unimportant parts and going into voluntary hypothermia.
Trained human divers can stay down for up to four minutes, and reach depths of 100 metres, but given their body size and oxygen consumption, a penguin should only be good for about two minutes. A king penguin weighing 12 kg can stay down for more than 7 minutes, and reach a depth of 300 metres, while 30 kg emperor penguins get to 534 metres in dives lasting almost 16 minutes. This is well beyond what the penguins should be able to do, given the oxygen they are carrying with them when they dive.
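The "about two minutes" expectation comes from dividing a bird's usable oxygen store by the rate at which it burns oxygen while swimming. The figures below are assumptions chosen for illustration, not measurements from the study:

# Aerobic dive limit sketch: usable oxygen divided by consumption rate.
o2_store_per_kg = 45.0    # ml of usable oxygen per kg of body mass (assumed)
o2_burn_per_kg = 25.0     # ml of oxygen per kg per minute while diving (assumed)

aerobic_limit_min = o2_store_per_kg / o2_burn_per_kg
print(round(aerobic_limit_min, 1))   # ~1.8 minutes, far short of the 7+ minutes observed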
A report in \INature\i this month explains how the mystery was cracked, by implanting \Jtemperature\j sensors in penguins' stomachs and monitoring their feeding habits. They found that the penguins' \Jstomach\j temperatures dropped from a normal homoiothermic 38°C to as low as 19°C. This was unsurprising, since the \Jtemperature\j of the fish the birds consume is only about 4°C. What was surprising was that the \Jtemperature\j of the rest of the \Jabdomen\j dropped to just 11°C, much cooler than could be explained by the penguin's chilly diet.
It seems that the penguins are restricting the oxygen supply to some of their tissues, and this means using less \Jenergy\j and less oxygen, and producing less heat. The effect is not new in one sense: whales cut down on blood flow to their outer layers, but the difference is that the penguins are reducing their core \Jtemperature\j.
#
"White House looks at greenhouse",204,0,0,0
(Jul '97)
Ask most concerned climatologists, and they will tell you despairingly that politicians simply do not care about the \Jgreenhouse effect\j, carbon emissions, or any decisions that might inconvenience the electors, however mildly. July may have seen one small twinkle of hope to counteract that pessimism.
President Bill Clinton and Vice President Al Gore called in a group of seven scientists to address them on the topic, but the duo seemed to have a political agenda as well, to educate the American public in preparation for a major international meeting on climate change in December in \JKyoto\j, \JJapan\j.
"We see the train coming, but most ordinary Americans in their \Jday\j-to-\Jday\j lives can't hear the whistle blowing," said the President. It was a message echoed by one of the speakers, John Holdren, a professor of environmental science and public policy at Harvard University, who commented on the many linkages between the developed world and the developing world in a time of climate change: "you can't sink just one end of a boat".
Commentators say the meeting indicates that Clinton has been convinced by Gore and other environmentalists in his administration to try to build public support for action on climate change, as the environmentalists have been seeking. Unfortunately, the US Senate remains unmoved, unanimously backing a resolution calling on the administration not to agree to anything in \JKyoto\j that would damage the US economy, or that would set emission limits for developed countries while leaving developing countries without restrictions. As the Senate will have to ratify any convention, it appears that maybe the President needs to educate his senators first and foremost-unless he is counting on the public to do it for him.
The same selfish "not-me-first" line has been adopted by a number of countries, especially \JAustralia\j, one of the worst per capita carbon dioxide emitters in the world, due to that nation's reliance on \Jcoal\j-fired power stations.
#
"What drives the Ice Ages?",205,0,0,0
(Jul '97)
The standard model for some decades has had \Jice age\js triggered by cyclical changes in the elliptical shape of \JEarth\j's \Jorbit\j about every 100 000 years. Not so, according to a paper in \IScience\i this month. Muller and MacDonald have been looking at several types of oceanic climate records, and they have questioned this assumption. Instead, they suggest, changes in the inclination of \JEarth\j's \Jorbit\j relative to the plane of the \J\Jsolar system\j\j have taken our \Jplanet\j into a climate-altering \Jcloud\j of cosmic dust.
#
"New athletic training method?",206,0,0,0
(Jul '97)
How do you get ready for the big race? Do you head for the high country to stimulate your body to generate additional red blood cells, running the risk that altitude will cause \Jinsomnia\j or loss of appetite, or that, because there simply isn't enough oxygen in the air, you will not work your muscles fully?
Races are won by very small percentages, and a great deal of research effort goes into finding ways of shading the odds to favor an athlete. Now it appears that runners can shave crucial seconds off their time if they live at high altitudes but train closer to sea level.
Benjamin Levine and James Stray-Gundersen, from the University of \JTexas\j Southwestern Medical Center in \JDallas\j, gathered 39 amateur competitive runners and timed them in a 5-\Jkilometre\j race at sea level before assigning them randomly to one of three groups.
The first group lived and worked out at sea level, a second lived and trained at 2500 metres, and the third group lived at 2500 metres but trained at 1200 metres. After four weeks, the runners were brought together again, and timed once more.
Neither the high-high nor the low-low group showed any real advance, but the high-low group gained about 13 seconds on average, equal to about 100 metres over the 5 km race. How long will it be before athletes start sleeping in sealed low-pressure vessels?
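Converting the 13 seconds into metres is simply a matter of multiplying by race speed, so the distance equivalent depends on how fast the runners are to begin with; figures near 100 metres assume a very fast field. A sketch, with the finishing times as assumptions:

def seconds_to_metres(time_gain_s, race_time_s, distance_m=5000.0):
    """Distance covered in the saved time, at the runner's average race speed."""
    speed = distance_m / race_time_s
    return time_gain_s * speed

print(round(seconds_to_metres(13, 17 * 60)))   # ~64 m for an assumed 17-minute amateur
print(round(seconds_to_metres(13, 13 * 60)))   # ~83 m at an assumed near-elite pace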
#
"Chemistry update",207,0,0,0
(Jul '97)
"How many atoms does it take to do an analysis?" may sound a bit like a lightbulb joke, but when it comes to the transuranic elements, where half-lives are fractions of a minute, this is an important question. More importantly, the heaviest atoms are also the ones in which relativistic effects can make the elements behave in unexpected ways.
The transactinide elements, beginning at element 104, have now been identified with a high level of certainty, up to element 112. The first two of this series, elements 104 and 105 show variations from the chemical behavior that would be expected from a simple glance at the \J\Jperiodic table\j\j and its trends.
From \INature\i this month, we now have some knowledge of element 106, often called Seaborgium. Using just seven atoms, each lasting just a few seconds, scientists have shown that element 106 is clearly in Group 6 of the \J\Jperiodic table\j\j, beneath \Jtungsten\j and molybdenum, rather than continuing the trend of wayward behavior established by elements 104 and 105. But does this make the chemists happy, to see an element performing as it should? No chance: they now want to know why it behaves in such a conventional manner.
#
"Extinction on high seas?",208,0,0,0
(Jul '97)
Some biologists are now suggesting that the oceans are not as resilient as we once thought, and that we may be on the edge of a massive marine \Jextinction\j event. At the same time, while only 275 000 marine species have ever been described, new estimates suggest that \Jcoral\j reefs by themselves support around a million species, and that the deep sea floor, once dismissed as a barren desert, may support another ten million species. For the moment . . .
But are corals as fragile as people think? They seem vulnerable in the short term to all sorts of changes, yet they have persisted over hundreds of millions of years. A new study in \INature\i in July shows that corals enter into symbiotic associations with many different types of \Jalgae\j, and that the \Jalgae\j involved at any one time can vary considerably as corals adjust to changing conditions.
On this basis, the phenomenon of "\Jcoral\j bleaching" may not be the problem it was once considered to be. \JCoral\j bleaching has been known since it was observed on the \JGreat Barrier Reef\j of \JAustralia\j in the 1920s, but studies in the Caribbean have typified the bleaching as a changeover, rather than a death event.
The ecologically dominant Caribbean corals, \IMontastraea annularis\i and \IM. faveolata\i provide a home to a variety of dinoflagellates (\ISymbiodinium\i spp.), and compositions vary with light levels. The dinoflagellates are generally classed as \Jalgae\j, and they are able to carry out \Jphotosynthesis\j, allowing them to "pay" for their accommodation in the \Jcoral\j. The analysis described by Rowan, Knowlton, Baker and Jara showed that the bleaching process they observed involved the elimination of a dinoflagellate which was less able to stand up to high light levels.
And even though reefs in Southeast Asia and the Caribbean are crumbling, most reefs in the Pacific and Indian Ocean appear to be in good health, says a report this month.
#
"Ecosystem inter-reliance",209,0,0,0
(Jul '97)
It is a polite fiction that ecosystems are isolated from each other. In fact, most ecosystems leak significant amounts of \Jnutrients\j and \Jenergy\j into other ecosystems, and receive much the same amount of leakage in return. Because the two effects largely balance out, the interchanges can generally be ignored, but what happens if a stream is unable to gather leaf litter from the ground around it?
During July, a group of scientists reported on a three-year study of the effects of keeping leaf litter out of a stream. Where the stream bottom was composed of cobble, pebble, and gravel substrate, the effects were dramatic: all invertebrates, from detritus eaters to predators, dropped away. In areas with moss-covered bedrock, however, the removal of fallen leaves had no effect, suggesting that different food webs were involved.
The implications for stream restoration efforts are fairly clear: no leaf litter means no \Jinvertebrate\j diversity.
#
"Cloned transgenic lamb",210,0,0,0
(Jul '97)
From the same lab that gave us Dolly the sheep: in July, the first cloned transgenic lamb, a lamb containing foreign genes. This has been hailed as the first step to developing domestic animals which could make, for example, human proteins for therapeutic use.
#
"Mir--a suitable case for treatment?",211,0,0,0
(Jul '97)
There is little to report on the Mir situation. No action was undertaken during July, but a new crew joined the space station in August, and were preparing to make repairs later in the month.
Many of the current experiments on board were halted immediately, as power was lost, and some experiments were irretrievably damaged. Some members of the U.S. Congress are openly questioning whether NASA should continue to participate in experiments aboard Mir, and questions about the future of any international space station were asked. As the crews changed over in mid-August, a new deadline was set, with Mir's \Jwater\j recycling system failing. A new supply of \Jwater\j will need to be taken to Mir by the next shuttle flight, and if this is delayed for any reason, the cosmonauts could be in very serious trouble.
In the event of a disaster, will Mir mean the end of space stations and space flight? If it does, it will only be because people have not properly considered the risks. Space station supporters were busy in July, pointing out that there has been about one death for every 10 000 hours of space flight time, while civil \Javiation\j has one death for every 37 000 flying hours. And if you take it on kilometres flown, they say, space flight is the safest form of transportation around.
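The per-kilometre claim follows from the quoted hours once cruising speeds are assumed. The speeds below are round-number assumptions, orbital velocity for a space station and a typical jet airliner, not figures from the supporters' own sums:

# Fatality rates per hour (quoted) converted to per kilometre (assumed speeds).
space_hours_per_death = 10000
aviation_hours_per_death = 37000
orbital_speed_kmh = 28000.0     # assumed: low-earth-orbit velocity
airliner_speed_kmh = 850.0      # assumed: jet airliner cruise

space_km_per_death = space_hours_per_death * orbital_speed_kmh          # ~2.8e8 km
aviation_km_per_death = aviation_hours_per_death * airliner_speed_kmh   # ~3.1e7 km
print(round(space_km_per_death / aviation_km_per_death, 1))  # ~8.9: space "safer" per km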
#
"No watery comets after all?",212,0,0,0
(Jul '97)
Louis Frank's minicomets (May, "Controversy in Space") may not be the source of our oceans after all. Measurements of the ratio of ordinary \Jwater\j molecules to molecules containing \Jdeuterium\j in the comets Hale-Bopp and Hyakutake, reported at a recent meeting, show a marked departure from the ratio found in seawater.
Tobias Owen and Roland Meier of the University of Hawaii's Institute for \JAstronomy\j and their colleagues used the James Clerk Maxwell \JTelescope\j on Mauna Kea to pick up radio emissions from the \Jdeuterium\j-containing molecules in the \Jcomet\j Hale-Bopp, and found three atoms of \Jdeuterium\j (\Jhydrogen\j with a neutron in the nucleus) for every ten thousand ordinary \Jhydrogen\j atoms, around twice the ratio found in ordinary sea \Jwater\j. This determination agrees with measurements of the \Jcomet\j Hyakutake, made last year and also released recently.
Opponents of Frank say that his theory is in trouble, since we could not make the bulk of \JEarth\j's oceans with \Jwater\j from these sorts of comets, but Frank is unconvinced. He says that large comets like Hale-Bopp have nothing to do with the \Jwater\j in our oceans. His comets, he points out, are "small and fluffy", with a different make-up, quite possibly extending to their \Jdeuterium\j to \Jhydrogen\j ratios. So until somebody manages to capture a minicomet and measure its composition, the question will have to remain open.
#
"Resistant antibiotics",213,0,0,0
(Jul '97)
The antibiotic-resistant \Jbacteria\j (see April) continue to be a problem around the world, and medical researchers have launched a new program, called Sentry. This links up 72 hospitals and clinics worldwide to keep tabs on these threats to human health and well-being. Funded by the drug manufacturer, \JBristol\j-Myers Squibb, Sentry will cost several million dollars over the next three to five years, in return for which, the manufacturer will get access to the bacterial samples involved.
In one aspect of a pilot trial, swabs of South American patients revealed that the \Jbacteria\j that commonly cause urinary tract infections were resistant in about 30% of cases to common cephalosporin \Jantibiotics\j.
#
"Antibiotics useful for cardiac conditions",214,0,0,0
(Jul '97)
A bacterium called \IChlamydia pneumoniae\i has been found in the arterial blockages of 75% of patients with \Jatherosclerosis\j, or partially clogged arteries. A report in the journal \ICirculation\i this month suggests that \Jantibiotics\j may be useful in treating some types of \J\Jheart disease\j\j.
In a trial, 80 of 213 male patients who had previously suffered a heart attack had high antibody levels to the bacterium, suggesting a \Jchlamydia\j infection. Researchers treated half these patients for 3 days with azithromycin, an antibiotic that wipes out \Jchlamydia\j infection, and found that, 18 months later, only 8% of the treated men had suffered another heart attack or other serious cardiac event or died, while 28% of those who were untreated or who were given placebos suffered such a fate.
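The standard risk arithmetic, applied to the percentages quoted above, runs as follows:

treated_rate = 0.08     # serious cardiac events among treated men
untreated_rate = 0.28   # among untreated or placebo patients

absolute_risk_reduction = untreated_rate - treated_rate       # 20 percentage points
relative_risk_reduction = 1 - treated_rate / untreated_rate   # ~71%
number_needed_to_treat = 1 / absolute_risk_reduction          # treat ~5 men to spare one event

print(round(relative_risk_reduction, 2), round(number_needed_to_treat))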
This could be important in developing countries like India, where \J\Jheart disease\j\j rates are rising, but where facilities and costs largely rule out usual western procedures such as bypass surgery and \Jballoon\j angioplasty.
#
"Obituary for July 97",215,0,0,0
Eugene Shoemaker, co-discoverer of the \Jcomet\j Shoemaker-Levy which crashed into Jupiter, died in mid-July in a traffic accident in northern \JAustralia\j, where he was on an annual visit to look for impact craters. His wife, also part of the \Jcomet\j discovery team, was injured but survived the accident.
Shoemaker was 69. His wife and longtime scientific collaborator, Carolyn, suffered broken bones, and is in stable condition. Shoemaker was a front runner among planetary scientists in arguing that the \Jearth\j has long been bombarded by meteors and comets, and identified the Barringer \JMeteor\j Crater, near Winslow, \JArizona\j as the result of a strike by a \Jmeteor\j which exploded upon impact.
The Shoemakers discovered more than 800 \Jasteroids\j, but they have left a lasting mark with their share in the discovery of the \Jcomet\j Shoemaker-Levy which plunged into Jupiter as scientists watched.
#
"Robots for the 21st century",216,0,0,0
(Aug '97)
Quite apart from reports during the month of "robot sheepdogs" practising on ducks ("which behave like sheep, but more slowly"), Sojourner has continued to look around the Martian landscape, while Nomad has been having an extended desert vacation in \JChile\j.
Nomad? The next generation of Martian exploring machines will look much like Nomad, which travelled 215 kilometres in six weeks, found some meteorites which had been planted in the desert for it to find, and even found a type of rock not previously seen in \JChile\j.
A Carnegie Mellon University project, Nomad was controlled from 8000 km away, providing \Jfeedback\j from a camera that could image the 250 kg robot's full horizon. Next year, it will have an Antarctic break, when it is sent to look for meteorites on the ice. Regrettably, Nomad's strange rock, which fitted a profile for sedimentary rocks that might contain fossils, did not produce any traces of life, although the rock was indeed sedimentary. It was, however, a major victory just to have Nomad find the rock and identify it as different.
Robots have an active social life: the month also saw RoboCup '97 in \JNagoya\j, running from August 23 to August 29. RoboCup involves teams of six robots playing a form of \Jsoccer\j against each other. According to the organisers, in order for a robot team to actually perform in their modified \Jsoccer\j game, "various technologies must be incorporated including: design principles of autonomous agents, multi-agent collaboration, strategy acquisition, real-time reasoning, robotics, and sensor-fusion".
The RoboCup site provides a great deal of background information on the competition, and gives links to other pages as well. The organisers have listed a number of expected changes for the future: the walls which surround the playing field are to go soon; the current global vision system, which watches from above the field, is to be banned; and an offside rule is planned, placing more demands on computational power.
No doubt the 21st century will see \Jsoccer\j matches between autonomous teams of robots, playing on the level playing fields of Mars.
#
"Genome news",217,0,0,0
(Aug '97)
Late July saw a report about the complete mapping of human \Jchromosome\j 7, with 170 million base pairs, some 5% of the entire human \Jgenome\j. The X \Jchromosome\j, of course, is already complete, leaving just 21 chromosomes and the Y \Jchromosome\j to go.
Mapping is not the same as sequencing the \Jchromosome\j, which identifies every single base pair: instead, it provides a set of landmarks, with a sequence-tagged site that can be detected by a PCR assay roughly every 100 000 base pairs, right across the human \Jgenome\j.
#
"Completed bacterial genome",218,0,0,0
(Aug '97)
The complete 1 667 867 base-pair sequence of \IHelicobacter pylori\i was published in early August, involving 1603 (or 1590, depending on how you look at it) genes altogether. The new work reveals many features, among them the machinery that the \Jbacteria\j use to exist in the acid environment of the \Jstomach\j. It is estimated that the bacterium infests the stomachs of half the world's population.
This is the sixth bacterium to have had every last \Jnucleotide\j base-pair of its DNA recorded and published, so the excitement is now a little muted, but \IH. pylori\i is an important bacterium. Until recently, gastric ulcers were blamed on diet and a stressful life-style: now we know that ulcers are caused by infections of this bacterium.
The sequences in the circular \Jchromosome\j were calculated by fragmenting the bacterial DNA, sequencing the bits, and then using a very powerful computer to link the sequences back together again, based on duplications and overlaps.
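The linking step is, in outline, an overlap-merging computation. Here is a toy sketch of the idea in Python, a greedy merge of fragments by their longest overlaps; it is emphatically not the program the sequencing consortium used:

def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(fragments):
    """Repeatedly merge the pair of fragments with the largest overlap."""
    frags = list(fragments)
    while len(frags) > 1:
        best_n, best_i, best_j = 0, None, None
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    n = overlap(a, b)
                    if n > best_n:
                        best_n, best_i, best_j = n, i, j
        if best_n == 0:
            break   # no usable overlaps remain
        merged = frags[best_i] + frags[best_j][best_n:]
        frags = [f for k, f in enumerate(frags) if k not in (best_i, best_j)]
        frags.append(merged)
    return frags

print(greedy_assemble(["ATGGCC", "GCCTTA", "TTACGT"]))   # ['ATGGCCTTACGT']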
#
"Worms in the news",219,0,0,0
(Aug '97)
Meanwhile, one of the larger animals under genomic study, the \Jnematode\j \ICaenorhabditis elegans\i (its entire \Jgenome\j is due to be completed soon), can store up fat and enter a state of suspended animation (the "dauer phase") for 2 months or longer. The worms do this by slowing their \Jmetabolism\j when they detect a shortage of food, and now researchers have cloned and sequenced daf-2, the gene which controls this.
Curiously, the protein that the gene encodes appears to be the worm equivalent of the human \Jinsulin\j receptor. This molecule "listens" for the hormone \Jinsulin\j, which is secreted in response to a rise in blood sugar, and passes its \Jmetabolism\j-enhancing signal to our cells' interiors. This raises the question: are changes in \Jglucose\j \Jmetabolism\j the key to slowing the ageing process in higher organisms, including humans? Nobody knows where these studies will lead, but it could even bring us to a better way of treating diabetes.
\BAnd yeasts\b
If the worms don't have the answer, maybe the yeasts will. A rare genetic form of accelerated ageing in humans, Werner's syndrome, has just been reproduced in yeast. The work is likely to help researchers trace out the molecular changes that underlie aging in humans as well as in yeast.
It may seem improbable, but yeast cells age. After dividing about 25 times, the normal yeast cell stops dividing and dies. About halfway through that range, the cells normally cease sexual reproduction, but the new mutant form reaches this point after just five normal divisions.
A team led by Leonard Guarente at the \JMassachusetts\j Institute of Technology studied brewer's yeast, \ISaccharomyces cerevisiae\i, in order to shed light on the function of the Werner's syndrome gene. It looks as though the protein encoded by the normal form of the Werner's gene prevents premature ageing, and that it does this in the nucleolus of the cell, a part of the nucleus where the protein-synthesising ribosomes are assembled. This confirms earlier evidence which linked the nucleolus with the prevention of ageing.
#
"Birds indicate warmer climate",220,0,0,0
(Aug '97)
Welsh \Jcoal\j miners used to take canaries into the mines to warn of gas build-ups, because the birds were more susceptible to gas concentrations. Now wild British birds seem to be sounding a similar note of warning for the whole world, by nesting and laying eggs eight days earlier than they did 25 years ago, just as the growing season for British plants is now eight days earlier, and at a time when British amphibians have been seen to breed earlier.
Britain's compulsive bird watchers have been collecting nesting records, more than a million of them, since 1939, covering 225 species of bird. While some of this information is less available for statistical analysis, some 75 000 records covering 65 species have been extracted by Dr Humphrey Crick and his colleagues from the British Trust for Ornithology in Thetford, Norfolk. In all but one of the species, there is a trend to laying eggs earlier, and in twenty of the species, the difference was enough to pass statistical tests of significance.
The one species which did not fit the trend is the stock dove (\IColumba oenas\i), which nests throughout the year, not just in spring, so that it could hardly have been expected to match the pattern followed by the spring breeders.
Whales may also have good reason to worry about the \Jweather\j-or at least about our concern with it. Gerald Eddlemon is an ecologist at the Oak Ridge National Laboratory in \JTennessee\j who has been looking at \Jweather\j balloons in the Antarctic. About 10 000 of these are launched each year, falling eventually into the sea, where there appears to be about a 7% chance that a random whale will encounter a \Jballoon\j each year while feeding.
The \Jpolythene\j used to make most balloons would last for years in cold Antarctic waters, says Eddlemon, who suggests that it is possible the balloons are having an undesirable effect on some \J\Jendangered species\j\j of whale, especially the filter feeders which "sweep" large volumes of ocean each year.
On the other hand, marine mammals would probably be more worried by the news this month that DNA testing has shown that most of the whale meat sold in \JJapan\j seems to be of species other than those allegedly targeted in \JJapan\j's "scientific whaling". There have been dark allegations from conservationist groups about organised crime syndicates cashing in on the lucrative trade in whale meat. Whether criminals are involved or not, we know the names of the innocent parties: several dolphin species, as well as blue, humpback, fin, Bryde's and Baird's beaked whales were found in the samples of meat alleged to be from minke whales.
Perhaps the whales can find comfort in a report this month, showing that the cetaceans (that is, the whales, dolphins and porpoises) form a clade with the artiodactyls (animals such as pigs, hippos, camels and ruminants), which is science-speak for these groups sharing a common ancestor not shared with any of the other mammals. Perhaps the answer is to modify \Jcattle\j so they can produce something indistinguishable from whale meat?
#
"Global warming--too early to judge",221,0,0,0
(Aug '97)
Back to the \Jweather\j, Europe relies heavily on the North Atlantic Current for warmth, a system which is driven by convective overturning at the northern end of the Atlantic. This consists of a mix of cool and warm currents, but the pattern could easily be about to change in response to just a small change in \Jtemperature\j and rainfall patterns. So the common political stance that we have seen no problems so far may be, as some scientists are now suggesting with increasing stridency, a bit like the person who jumps from the twentieth storey of a building, and reports no problems after completing 95% of the fall.
What they mean by this analogy is that the problem may have more to do with rates of change than degrees of change, that we are venturing into the unknown, and nobody knows what to do to switch the North Atlantic Current back if it \Idoes\i fail. What we do know already is that sea surface \Jtemperature\j oscillations have been observed in the Atlantic which persist for several years, and coupled with surface pressure variations, probably form a whole-ocean \Joscillation\j which may cycle over years or decades.
One of the more useful indicators, a \Jsatellite\j view of the changing color of \JEarth\j's oceans, can reveal the growth and death cycles of microscopic ocean plants across the globe and help establish their role in climate change. With the launch this month of SeaWiFS, the Sea-viewing Wide Field-of-view Sensor, the necessary data will be more freely available.
SeaWiFS will send down a complete color picture of \JEarth\j's surface every two days, giving us a broader and more frequent coverage than any earlier oceanographic \Jsatellite\j. From accurate information on the cycles of plankton growth, scientists should learn how much carbon dioxide the plankton absorb from the \Jatmosphere\j and then be able to refine their models of global warming.
SeaWiFS will also tell us more about how the El Niño current which is now warming the eastern Pacific affects \Jphytoplankton\j growth. As the El Niño hot spot develops, it blocks the upwellings of cold \Jwater\j, packed with the minerals that \Jphytoplankton\j require. This blockage decreases the plankton levels enormously, and without the \Jphytoplankton\j at the base of the food pyramid, everything else declines as well.
\BMonsoons as well\b
When you look at climatic interactions between the ocean and \Jatmosphere\j, the two most important are the Asian monsoon in the Indian Ocean and the El Niño-Southern \JOscillation\j (ENSO) in the \JPacific Ocean\j, so it seems only logical to look for links between those two systems. Evidence reported this month, based on the oxygen isotopic compositions in banded corals from the western Indian Ocean, says there is such a link. The oxygen isotope ratios reflect the sea surface \Jtemperature\j, and the pattern reflects the ENSO pattern. The data, which extend back 150 years, suggest that the \Jlinkage\j has held up even though the ENSO pattern is quite variable.
#
"New discoveries on the sun",222,0,0,0
(Aug '97)
The news stories treated it as "\Jweather\j on the \Jsun\j", but the true story was even more exciting. Data from the Solar and Heliospheric Observatory (SOHO) \Jspacecraft\j tell us there are doughnut-shaped zones of relatively rapid flow, rivers of solar material, near the \Jsun\j's north and south \Jpoles\j.
The flows would never have been seen from \Jearth\j-based observations, but SOHO's Michelson Doppler Imager (MDI), one of 12 instruments on board the European Space Agency craft, watches the \Jsun\j continuously from a point 1.5 million kilometres closer to the \Jsun\j, and makes sensitive measurements of the undulations and wiggles on the \Jsun\j's surface at about three quarters of a million points at once. The wiggles show where sound (acoustic) waves are emerging on the \Jsun\j's surface, and they carry with them tell-tale traces of the conditions in the regions they have passed through.
The "rivers" which have been revealed are about 30 000 km across and 40 000 km below the surface. The flows are hard to explain, but may be similar to the jet streams that operate in the upper reaches of the \Jearth\j's \Jatmosphere\j.
#
"Supernova, quest to understand",223,0,0,0
(Aug '97)
Where would you look for signs of an exploding star? An international team of physicists plans to go underground, almost a \Jkilometre\j down into deep salt deposits, where they will set up a pair of subterranean observatories to capture thousands of the elusive particles called neutrinos, which should come from the core of a \Jsupernova\j.
The project proposal carries the name Observatory for Multiflavour Neutrinos from Supernovae, or OMNIS, and it is to be a set of low-maintenance detectors which could easily wait years to uncover a brief burst of neutrinos that would announce the occurrence of a type II \Jsupernova\j in our galaxy.
A type II \Jsupernova\j happens when a star runs low on fusion fuel. As the star cools, the internal pressures which maintain the star against the pull of gravity get less, and the star collapses inwards before exploding and firing a shock wave from the core, out through the upper layers.
As the shock wave slows, a blast of neutrinos boils up out of the core, blowing the star apart, and driving the outer layers off into space. A \Jsupernova\j is a random event, and for now at least, entirely unpredictable, so it will be necessary to set the system up and wait for the inevitable \Jday\j when the neutrinos pass by.
As the neutrinos race past, a few will crash into nuclei in the salt walls of the chamber, or into nuclei in metal slabs near the detectors. As they do so, neutrons will be thrown off, ready to be detected by plastic or liquid scintillation detectors, where they will produce countable flashes of light. Only a few of the neutrinos will do this, producing about 2000 events in a period of ten seconds, but this sequence and pattern will allow physicists to decide whether or not they understand what happens inside a \Jsupernova\j.
OMNIS would have one major advantage over existing neutrino detectors, since it would be able to detect all three "flavors" of neutrinos. Information gained from this might help to shed light on the question of whether or not the neutrinos have mass (see Neutrinos--Do They Have a Mass?).
The location, deep underground in a salt mine, means that only neutrinos are likely to reach the sensitive detectors.
#
"Ex-sun 1987A still going strong",224,0,0,0
(Aug '97)
There's no \Jsupernova\j like a new \Jsupernova\j-unless it's an old \Jsupernova\j, still going strong. Around 167 000 years ago, the \Jsupernova\j we now call 1987A blew itself apart. Spheres of gas have formed, about half a light year from the centre of the star, and these are moving slowly outwards. Now new material appears to be running into these slow ripples in space, and an exciting light show, across the visible and x-ray spectrum, is expected.
The light from the original \Jsupernova\j ("as bright as 100 million suns") told us a great deal about how stars die, but it also left a great deal unsaid. Seven years after the \Jsupernova\j first came to our attention, the \JHubble Space \JTelescope\j\j revealed a puzzling trio of rings around the exploded star, probably formed by the star some tens of thousands of years before the final explosion. While the gas surrounds the \Jsupernova\j as a set of spheres, \Jearth\j-bound observations see them as a set of rings.
Now the material that was thrown out by the final explosion is close to the gas rings, and the coming collisions should light up the debris, revealing the elements in the debris, their speeds and their directions. To be precise, something which is travelling fast is close to the gas rings-the debris was not expected to catch the rings for about another ten years, but the theory around supernovae is far from perfect. That, after all, is why scientists study them.
#
"New planet?",225,0,0,0
(Aug '97)
A network of telescopes in South \JAfrica\j, \JAustralia\j and \JSouth America\j has detected tell-tale flickers from a star in the centre of the \JMilky Way\j. The PLANET network has been looking for stars becoming apparently brighter, perhaps because a dimmer star, complete with planets, has drifted across the line of sight, bending the more distant star's light with their gravitational fields.
The flickers which have been detected match the signature of a Jupiter-sized \Jplanet\j orbiting a \Jsun\j-sized star, but further analysis will be required to pin down the exact planetary pattern shown by the flickers. One thing is certain: the flickering is confirmed, having been detected by another international group known as GMAN.
It began as a sharp spike around June 19, followed by a slow increase in brightness, and perhaps a strange double hump around 24 July, before it dimmed away again. The pattern is assumed to have arisen when a dim star moved across our line of sight to a brighter, more distant star, acting rather like a magnifying glass passing between us and the star. If there are planets present, they act like specks of \Jwater\j on the \Jlens\j, adding small extra-bright spots to the overall picture.
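The smooth part of such a light curve follows a simple single-lens formula (Paczynski, 1986), sketched below. A bare star produces exactly this symmetric rise and fall, which is why sharp spikes and double humps suggest extra structure such as planets. The dates and lens parameters here are arbitrary:

import math

def magnification(u):
    """Brightening by a point lens at angular separation u
    (in units of the Einstein radius)."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

def separation(day, closest_day=0.0, u_min=0.3, einstein_days=20.0):
    """Lens-source separation as the stars drift past one another."""
    return math.sqrt(u_min**2 + ((day - closest_day) / einstein_days)**2)

for day in range(-30, 31, 10):
    print(day, round(magnification(separation(day)), 2))
# Peaks at ~3.4x brightness, falling smoothly either side of closest approach.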
The pattern will show planets as large as Jupiter, but \Jearth\j-sized planets do not have enough gravitation to show up on this sort of scale. So it will be some time before we know whether the star in question is home to clever vegetables, humanoids, or scaly things with spines, a bad attitude and serious teeth.
#
"Hidden fluoride found",226,0,0,0
(Aug '97)
One thing now appears certain: if there are creatures living at the centre of the \JMilky Way\j, they may well have good sets of teeth. A report which is to appear in the October issue of \IAstrophysical Journal Letters\i suggests that the \Jfluorine\j found on \Jearth\j today is matched by \Jfluorine\j that can be found in the clouds of interstellar dust.
The evidence of absorption spectra, the patterns created when light passes through assorted chemicals, has identified more than a hundred different molecules in space, but \Jfluorine\j compounds have always been hard to detect. This is because each molecule absorbs light radiation at characteristic wavelengths, and our \Jatmosphere\j blocks the frequencies which would come from \Jhydrogen\j \Jfluoride\j, likely to be the most common \Jfluorine\j compound in space. In effect, the air we breathe casts a shadow that hides the signs of \Jfluorine\j compounds. In space, the air is no longer a problem.
David Neufeld and colleagues at Johns Hopkins University used the Infrared Space Observatory, a European Space Agency \Jsatellite\j, to see further and more clearly. A \Jsatellite\j-based instrument called the Long Wavelength Spectrometer on the ISO was pointed at the \JSagittarius\j B2 \Jcloud\j, around 20 000 light-years from \JEarth\j.
The amount of infrared radiation passing through the \Jcloud\j dipped at the wavelength absorbed by \Jhydrogen\j \Jfluoride\j, indicating a \Jfluoride\j concentration of 0.3 parts per billion, just one tenth of the concentration of \Jfluorine\j in our \J\Jsolar system\j\j. Neufeld has suggested that there may be more, but that it is hidden, existing as tiny clumps of frozen \Jfluorine\j, which would not be detected by infrared methods.
Because the \Jfluorine\j has now been found in deep space, astronomers consider it likely that the \Jfluorine\j on \Jearth\j, like the \Jfluorine\j in our non-stick fry-pans and toothpaste, has come from interstellar sources like this.
#
"Xenon still missing",227,0,0,0
(Aug '97)
The \Jsun\j and meteorites have more \Jxenon\j than the \Jearth\j's \Jatmosphere\j, and one explanation offered in the past has been that the \Jearth\j's \Jxenon\j was stored in the \Jearth\j's core, alloyed somehow with iron. But is the inert gas able to form alloys with iron? Until recently, nobody knew, but a study reported in \IScience\i this month describes attempts to combine \Jlaser\j-heated diamond anvil cell experiments with extrapolated thermodynamic calculations. The conclusion of this study is that \Jxenon\j is unlikely to alloy with iron, even at pressures of 100 to 150 gigapascals.
#
"Missing dark matter present as hydrogen?",228,0,0,0
(Aug '97)
Some astronomers have proposed that some of the missing \J\Jdark matter\j\j of the universe may be present as neutral \Jhydrogen\j, lurking in dim, hard-to-spot galaxies, or in the clouds that lie between the galaxies. Unfortunately for that idea, a radio survey of the nearer parts of the universe has failed to turn up any sign of the \Jhydrogen\j.
It is unlikely that the \Jhydrogen\j is hidden in metallic form. Given sufficiently high pressures, molecular \Jhydrogen\j should become a solid metal, but no matter what experimenters have tried, this metallic form of \Jhydrogen\j has never been observed. Now it looks as though maybe this type of \Jhydrogen\j cannot exist, not if a newly predicted spontaneous asymmetry of the \Jhydrogen\j molecule occurs as theorists think it may. We will keep you posted.
#
"Arctic sea floor uncovered",229,0,0,0
(Aug '97)
Back on our own \Jplanet\j, during the \JCold War\j, the coldest part of the war must have been played out by the nuclear submarines that prowled beneath the \JArctic\j ice cap. Unlike the South Pole, which sits on the continent of \JAntarctica\j, the \JNorth Pole\j is a sheet of ice sitting over liquid sea \Jwater\j. Between 1957 and 1982, US Navy submarines travelled 220 000 kilometres under the ice, and now the records of those journeys are to be released for scientific use. The depth charts that the submarines used will now be available to help scientists understand \JArctic\j \Jgeology\j and the transport of contaminants by \JArctic\j currents.
For the first time in many years, scientists will know more about the surface of the \Jearth\j under the north polar ice than they do about the surface of the \Jplanet\j Venus. According to a report in August, the data will be converted from rolls of chart recorder paper into digital form, to be posted on the \JInternet\j, where the information will join other data already supplied by the US Navy, such as \Jtemperature\j and \Jsalinity\j data.
#
"Vaccinating the sheep",230,0,0,0
(Aug '97)
Fly strike is a serious problem for sheep farmers, especially in \JAustralia\j, where the fly \ILucilia cuprina\i is a major pest. The larvae of this fly invade the sheep, feeding on sheep tissues. Now Australian scientists have hit upon a solution which may help farmers to keep down the levels of pesticides in livestock, making the meat from the animals safer to eat.
The larvae have a lining to their gut, the peritrophic membrane (or PM), which serves to keep out parasites and microorganisms, as well as aiding in digestion. The PM also keeps out host \Jantibodies\j, so it seemed that if sheep blood were full of \Jantibodies\j to the PM, these \Jantibodies\j might cling to and clog the larval peritrophic membrane, stopping the larvae from absorbing food.
Ross Tellam and his colleagues at the Australian government's Commonwealth Scientific and Industrial Research Organisation vaccinated sheep with a PM protein. They then took some of the blood of the sheep, separated the serum from the blood, and cultured fly larvae in the serum. They found that the PM \Jantibodies\j in the serum significantly slowed the larval growth, in a dose-dependent way, according to a report published in the \IProceedings of the National Academy of Sciences\i.
This is only a start: future trials will need to involve \Jantibodies\j to several different proteins, as Tellam estimates that the growth rate of the larvae will need to be reduced by at least 80% to kill a significant number of larvae. But with customers around the world more and more aware of the insecticides and other chemicals in their meat, solutions like this are desperately needed.
#
"Weevil news",231,0,0,0
(Aug '97)
Weevils and other beetles have been in the news this month, some of them getting a good press, some of them not being held in very high regard at all.
A Eurasian weevil taken into the US and Canada to deal with invasive introduced thistles has turned out to be a cure worse than the condition it was brought in to deal with. The weevil, \IRhinocyllus conicus\i, was released widely in \JNorth America\j after 1968 to control several species of introduced thistles. Now it turns out that the weevil enjoys the flower heads of five native thistles which are a natural part of ecosystems in that part of the world.
The long-term effect of the weevil may be to prevent the thistles from setting seed: in the 1996 flowering season, up to 70% of all flower heads in the native thistles were infested with weevils, and seed production had dropped in study areas in \JColorado\j and South Dakota.
#
"Pest weevils",232,0,0,0
(Aug '97)
On the other hand, some weevils can be simple pests with no redeeming qualities, such as the weevils which attack commercially important plants. British researchers, working on the vine weevil which destroys strawberry, raspberry and other soft fruit crops, have turned to monoclonal \Jantibodies\j as they seek to identify the most helpful carnivorous beetles to assist their fight.
Sifting through the intestines of various carabid and staphylinid beetles, the scientists used monoclonal \Jantibodies\j to find whether different beetles had been eating weevil eggs, weevil grubs, or adult weevils. While this provides useful information about the species most likely to eat weevil eggs and prevent any damage to crops, the promise is limited, at least until somebody finds a way to rear carnivorous beetles cheaply but in isolation, so they cannot eat each other! For the moment, the researchers are looking at ways of improving the conditions which favor the predators, such as leaving leaf litter on the ground to provide shelter for the hunters, which feed mostly at night.
#
"Beetles help fight loosestrife",233,0,0,0
(Aug '97)
Other beetles are playing an important control role in the wetlands of North America, where \ILythrum salicaria\i, otherwise known as purple loosestrife, is now under attack from beetles imported from its original home in Europe. The American version of the plant seems to have evolved to lose some of its natural defences, transferring its biochemical efforts to reproduction rather than to creating defences against a non-existent enemy. Now the enemy has returned, and the loosestrife is vulnerable. Given a choice, the introduced beetles actually prefer the American variants over the original European forms that the American plants sprang from.
Loosestrife has been a problem mainly because it has forced out bulrushes, sedges, and other edge plants which provide essential shelter for vertebrates such as tortoises, amphibians and waterbirds. Now, with some infestations reduced to just 5% of their former size, and new beetle species still to be set loose, the prospect for restoring North American waterways to their original condition looks good. And in this case, there appear to be no untoward side effects.
#
"Bacteria in the wheat blight fight",234,0,0,0
(Aug '97)
The \Jfungus\j \IFusarium graminearum\i can wipe out a wheat crop, both killing the plants themselves and contaminating the survivors with a toxin, vomitoxin or deoxynivalenol, which sickens humans and animals. In the past, the best solution has been to change wheat varieties often, and vary the time of planting, so as to reduce the very real risks. Now a bacterium has been brought into the fray.
Typical "winter wheat" spends more than 8 months of its life putting down roots and growing leaves as it gets ready for a six-week burst of activity to make grain. If there are just a few days of rain when the wheat is flowering, the \Jfungus\j, also known as wheat scab, can take hold, reducing yields by as much as a third, and devaluing the rest of the crop as a result of the presence of the toxin.
Researchers at the Brazilian Wheat Research Center in Passo Fundo, \JBrazil\j have found three useful \Jbacteria\j in a screening study of thousands of different microbes, and seem to have struck it lucky. Plants which had their heads sprayed with the bacterium \IPaenibacillus macerans\i not only yielded 15% more grain than their untreated counterparts, they also had dramatically lower levels of vomitoxin, less than one-tenth the levels seen in control plants. The next step will be developing effective methods of delivering the \Jbacteria\j to the plants under field conditions.
#
"Colorful fossils",235,0,0,0
(Aug '97)
Andrew Parker, a biophysicist at the Australian Museum, one of the world's great natural history museums, may be about to find out what colors fossil animals really were.
The standard view of fossils is that they are made up of pieces of mineral that have replaced the material that was once a living thing. The old \Jcartoon\j joke of a dog stealing \Jfossil\j bones has no relevance, according to the standard view of fossils, because the fossils are rocks in the shape of bones, not actual bones.
Andrew Parker was looking at ostracods, marine crustaceans, when he realised that the cell structures that give them a sheen today were also present in \Jfossil\j ostracods. Ostracods have hardly changed in their structure over the last 350 million years, but what of their colors? On the evidence of the \Jdiffraction\j gratings found in the fossils and also in modern ostracods, they must have looked pretty much the same.
This finding led Parker to investigate a range of other well-preserved fossils, and he found remnants of the cells which control color in many animal skins, cells called chromatophores. More importantly, he found traces of the \Jpigments\j from the chromatophores, providing a strong suggestion about the colors of some early animals. In one placoderm, an armored fish, Parker found traces of red pigment above, and silvery \Jdiffraction\j gratings below.
Exploring other animal fossils, Andrew Parker and his colleagues found it easy enough to reconstruct the colors of other animals as well.
#
"Argon dating",236,0,0,0
(Aug '97)
When Mount Vesuvius erupted in 79 AD, there may have been no media coverage in our modern sense, but it was widely studied and reported, and we even know that the Roman writer on scientific matters, Pliny the Elder, died when he got too close to the \Jvolcano\j, which also destroyed \JPompeii\j. So if we know the date so accurately, why would anybody bother to try to determine the age of the volcanic debris that flowed forth in that eruption?
The answer is found in a single word: \Jcalibration\j. Accurate \J\Jradiometric dating\j\j of young rocks is essential for scientists in many fields, and where samples from the Holocene, the last ten thousand years, are involved, the primary method has been \J\Jradiocarbon dating\j\j. The problem with relying on just one method is that it may be open to some kind of systematic error, such as the slow seepage of newer carbon into old deposits, topping up the carbon-14 levels and giving us spuriously young ages for material which is more than 40 000 years old. Nobody can say for sure if this is a problem or not, at least until an independent way of assessing ages can be used alongside the carbon method.
So if we can use the \Jargon\j-40/\Jargon\j-39 method to date material in the historical period, this will give us a second version of the dates we have established. Paul Renne, a geochronologist at the University of \JCalifornia\j, Berkeley, and his colleagues have reported in \IScience\i that this is exactly what they can do.
\JArgon\j dating tells us how long it is since an eruption, when \Jlava\j solidified and trapped radioactive elements in its crystal lattice, setting the radio clock to zero. All you have to do is measure the amount of \Jpotassium\j-40, a radioactive isotope with a half-life of 1.25 billion years, against the amount of its daughter product, \Jargon\j-40.
The method has been around for many years, but recent major refinements have made it possible to detect tiny amounts of \Jargon\j, making it possible to calculate the age of younger rocks. Because the \Jdecay rate\j is so slow, quite a few years have to pass before the \Jargon\j levels are high enough to allow accurate measurement, so that until now, the youngest rocks dated this way were more than 5000 years old, and the accuracy was no better than 10%.
By heating samples of volcanic ash with a very precisely controlled \Jlaser\j in careful steps, Renne and his team were able to date the 1918-year-old rock to 1925 years, plus or minus 94 years, a remarkably accurate result. (Note that dates like this always come with a confidence level: there is only a 5% chance that the rock is outside the range 1831 years to 2019 years.)
This may open the way to even more exciting possibilities, such as dating strata a long way from volcanoes by dating volcanic ash that blows into a deposit. More importantly, the occasional fall of volcanic ash onto an ice core may serve to place a set of time stamps on levels in the core where the ash has fallen.
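The arithmetic behind such a date can be sketched in a few lines. The decay constants below are standard textbook values (an assumption on our part, not figures from Renne's paper), and the isotope ratio is invented so as to land near the Vesuvius age:

import math

LAMBDA_TOTAL = 5.543e-10  # total decay constant of potassium-40, per year (textbook value)
LAMBDA_AR = 0.581e-10     # partial constant for the branch that yields argon-40

def k_ar_age(ar40_over_k40):
    """Conventional K-Ar age equation: years since the lava locked in its argon."""
    return (1.0 / LAMBDA_TOTAL) * math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_AR) * ar40_over_k40)

# A rock of Vesuvius age has accumulated only a whisper of radiogenic argon:
print(round(k_ar_age(1.1e-7)))  # roughly 1900 years

The tiny ratio in the example shows why such young rocks defeated the method for so long: the measurable argon signal is vanishingly small until many half-lives' worth of years have passed.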
Volcanoes were also in the news this month. The record for hottest \Jvolcano\j and hottest \Jmagma\j looks to belong to Io, one of Jupiter's satellites. At the start of the month, Alfred McEwen told the American Astronomical Society's Division of Planetary Sciences about the most recent \JGalileo\j results. It seems that the volcanoes on Io have a \Jtemperature\j of around 1800 K, some 200 degrees hotter than on \Jearth\j, and this translates to a \Jmagma\j \Jtemperature\j of about 2000 K, suggesting that the \Jmagma\j is a molten silicate rock which is rich in \Jmagnesium\j.
#
"mc2=e",237,0,0,0
(Aug '97)
People who know nothing else about Einstein are able to parrot "e = mc\U2\u", and many can even explain that mass can be converted into \Jenergy\j, a fact which they invariably relate to "atomic bombs". At the end of August, we learned of a report from physicists at the Stanford Linear Accelerator Center (SLAC) to appear in the 1 September \IPhysical Review Letters\i, describing the inverse process, turning light into matter.
The team collided large numbers of photons together so violently that the interactions spawned particles of matter and antimatter, electrons and positrons (which are anti-electrons). Physicists have long known that this kind of conversion is possible, but they have never observed it directly. The trick was to focus an extremely intense \Jlaser\j beam at a beam of high-\Jenergy\j electrons. When the \Jlaser\j photons collided head-on with the \Jelectron\j beam, they got a huge \Jenergy\j boost, changing them from visible light to very high-\Jenergy\j gamma rays.
These high-\Jenergy\j gamma ray photons then rebounded into the path of incoming \Jlaser\j photons, interacting with them to produce \Jpositron\j-\Jelectron\j pairs. Unlike similar effects seen in the past in \J\Jparticle accelerators\j\j, where at least one of the photons involved is "virtual", the photons here have an independent existence, so this is the first time matter has been created entirely from ordinary photons. The future for this sort of work will lie in using powerful lasers to look at the interactions of photons and electrons as described in the theory known as quantum \Jelectrodynamics\j (QED).
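A hedged back-of-envelope check shows why such intense laser light is needed. Two photons can only make an electron-positron pair if their combined centre-of-mass energy reaches twice the electron's rest energy; the gamma-ray energy used below is an illustrative guess at the scale involved, not a figure from the SLAC paper:

M_E_C2 = 0.511e6  # electron rest energy, in electronvolts

def min_partner_energy(gamma_ev):
    """Minimum energy of a head-on partner photon for pair production.
    For a head-on collision, s = 4*E1*E2, and pairs need s >= (2*m_e*c^2)**2,
    which reduces to E1*E2 >= (m_e*c^2)**2."""
    return M_E_C2 ** 2 / gamma_ev

# A gamma ray of around 30e9 eV (30 GeV, our assumed scale) would need a partner of:
print(min_partner_energy(30e9), "eV")  # about 8.7 eV

Since a visible laser photon carries only a couple of electronvolts, a single laser photon falls short of this threshold, which is one way of seeing why only an extremely intense beam, with photons crowded densely together, makes the conversion observable.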
#
"Sojourner completes 30-day mission",238,0,0,0
(Aug '97)
The robot Sojourner completed its 30-\Jday\j mission on August 3. By then, it had captured far more data on the \Jatmosphere\j, \Jweather\j and \Jgeology\j of Mars than scientists had ever expected. In all, Pathfinder returned 1.2 gigabits (1.2 billion bits) of data and nearly ten thousand pictures of the Martian landscape. Since that time, it has continued to explore an ancient outflow channel in Mars' northern hemisphere, but the information flow on Sojourner quietened as NASA scientists went into analysis mode, and got ready for the arrival of Global Surveyor in September.
The mission of Sojourner was followed with great interest via the World Wide Web. Twenty different Pathfinder mirror sites between them recorded 565 902 373 hits world-wide during the period from July 1 to August 4. The highest volume of hits in one \Jday\j occurred on July 8, when a record 47 million hits were logged, which is more than twice the volume of hits received on any one \Jday\j during the 1996 \JOlympic Games\j in \JAtlanta\j.
#
"Distant close-up on Mars",239,0,0,0
(Aug '97)
Meanwhile, a careful analysis of Martian meteorites (that is, meteorites which are believed to have been blasted out of Mars by major impacts) has just appeared in \INature\i, suggesting that Mars is a peaceful \Jplanet\j, with limited plate motion, no giant impacts, and no large-scale mixing for 4.53 billion years.
The study is based on an analysis of \Jtungsten\j \Jisotopes\j, and the conclusion is that the large-scale convection patterns, which drive plate tectonic motion and mix the \JEarth\j's mantle, appear to have been unimportant during most of the history of Mars. Instead, it looks as though Mars formed fast and differentiated early in the \J\Jsolar system\j\j's history, about 20 to 40 million years faster than the time taken for the \Jearth\j to separate out into core, mantle and crust.
The current best estimate for the time of \Jplanet\j formation is about 4.57 billion years ago, when the planets started to clump together from a huge \Jcloud\j of interstellar gas, dust and debris left over from the formation of the \Jsun\j. As the materials gathered, molten metal would drift to the centre, while silicates stayed on the outside. Since the time when Mars' core formed, some 4.53 billion years ago, the \Jplanet\j seems to have been remarkably stable and free of geological upheaval. Such inferences, drawn from \Jearth\j-bound observations, will be open to much closer assessment once Global Surveyor settles into a circular \Jorbit\j and can begin resolving structures as fine as 1.5 metres across.
The researchers, Alexander N. Halliday and Der-Chuen Lee, based their study on the assumption that any pre-existing \Jtungsten\j would have been enriched by the formation of further \Jtungsten\j-182 as hafnium-182 decayed to that isotope. The hafnium, they say, tends to end up among the silicate rocks, while \Jtungsten\j "sticks" to iron, and so is dragged down into the core. This means that the mantle is rich in hafnium, but deficient in \Jtungsten\j, at least until the hafnium decays, and that is all they needed to set up their "clock".
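The logic of this hafnium-tungsten "clock" can be sketched numerically. Hafnium-182 is an extinct radionuclide with a half-life of roughly 9 million years (a standard value we are assuming here, not one quoted in the paper), so the earlier a planet's mantle separates from its core, the more live hafnium-182 remains behind in the silicates to decay into a measurable excess of tungsten-182:

def hf182_remaining(t_years, half_life=9e6):
    """Fraction of the solar system's original hafnium-182 still alive after t years."""
    return 0.5 ** (t_years / half_life)

# Early core formation leaves plenty of live 182Hf in the silicate mantle;
# late formation leaves almost none, and hence no tungsten-182 excess to measure.
for t in (10e6, 30e6, 100e6):
    print(f"core formed at {t / 1e6:.0f} Myr: {hf182_remaining(t):.3f} of 182Hf still live")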
#
"Venus still active?",240,0,0,0
(Aug '97)
Two researchers, Smrekar and Stofan, have presented a model in \IScience\i in which some Venusian features are formed by upwelling associated with plumes. They suggest that some of the plumes may still be active. The features they point to include circular collections of faults and ridges that range up to 2600 kilometres across, known as coronae.
#
"Extra plates on earth",241,0,0,0
(Aug '97)
Magnetic sea floor anomalies and seismic data indicate that the Indo-Australian plate is not one plate, but three plates with diffuse boundaries. This knowledge will make it easier to understand the deformational history of this region, including the collision of India with Eurasia to produce the \JHimalayas\j.
#
"New Zealand rabbits get calicivirus after all",242,0,0,0
(Aug '97)
New Zealand's decision not to release calicivirus (see New Zealand Rabbits Escape Calicivirus, July) has been overtaken by the actions of New Zealand farmers who, apparently several months ago, imported and distributed infected tissues in New Zealand.
An outbreak of \Jrabbit\j calicivirus disease (RCD) in \JNew Zealand\j was confirmed by the \JNew Zealand\j Ministry of Agriculture (MAF). This report was followed up with threats of prosecution, road-blocks and no-fly zones in the region around Cromwell in the South Island, but to no avail. Plans to shoot and poison the infected rabbits were dropped as gleeful farmers welcomed the news of outbreaks well beyond the five properties initially identified as having RCD.
The MAF already had RCD vaccine ready to be administered to pet rabbits, and could not really have been surprised by the event, which they say is no accident. The outbreak has happened at the wrong time of the year for an insect vector to be involved, and rabbits were infected across a wide area at the same time.
The virus was probably obtained from farmers in \JAustralia\j, and probably carried across the \JTasman Sea\j, some three hours by plane, as infected tissue in a sealed container. Customs and quarantine barriers between \JAustralia\j and \JNew Zealand\j exist, but are not particularly strongly enforced. There are fairly severe penalties for deliberately importing such a species as the RCD virus, but angry farmers, suffering from the effects of serious \Jrabbit\j infestation, were obviously willing to take that risk.
#
"Bacterial resistance",243,0,0,0
(Aug '97)
The past few years have seen an upsurge in bacterial resistance to \Jantibiotics\j, with infections such as tuberculosis, \Jpneumonia\j and meningitis all on the rise, and even serious killers like bubonic plague showing dangerous trends. Now it looks as though there may be some hope for a "fix".
The answer, described by Nobel laureate Professor Sidney Altman and his colleagues Dr Cecilia Guerrier-Takada and Dr Reza Salavati in the \IProceedings of the National Academy of Sciences\i, involves a way of disabling the genes that the disease \Jbacteria\j use to block \Jantibiotics\j like chloramphenicol or ampicillin. Innovir Laboratories already has a licence from Yale University to develop this technology, and they say that the results are promising.
While there may be other \Jantibiotics\j out in the field, just waiting to be discovered, finding them is far from certain, and it would most certainly be hugely expensive. If today's front-line \Jantibiotics\j can be used once more, this will bypass that huge expense, undoing the damage wrought by decades of indiscriminate use of \Jantibiotics\j in many parts of the world.
In simple technical terms, the researchers have produced a marker which can be passed between \Jbacteria\j as a plasmid. This marker attaches itself to a specific part of the bacterial mRNA and then calls in an \Jenzyme\j which is found naturally in all \Jbacteria\j, an \Jenzyme\j which splits the specified gene into two useless pieces. If the marker is targeted on a resistance gene, the resistance gene is destroyed, leaving the bacterium vulnerable to \Jantibiotics\j once again.
The first step in the development involved introducing synthetic genes into a standard \Jgenetics\j workhorse, the common intestinal bacterium \IEscherichia coli\i (usually referred to by its shorthand name of \IE. coli\i). In this case, the strain used is one that is resistant to both ampicillin and chloramphenicol. The synthetic genes code for "external guide sequences", the markers which do the actual damage.
Other groups have tried to make genetic sequences which bind to the gene which gives the \Jbacteria\j resistance, but this approach has had only limited success. Altman's group have used a more cunning approach, fooling the bacterium's own \Jenzyme\j into attacking the resistance genes by fixing a message which says "come and get me!" to the gene, or at least to the copy of it which is used in managing protection.
Next, these new genes were fused into loops of DNA, known to geneticists as plasmids, which have an independent existence inside the \Jbacteria\j. Plasmids are mostly used by \Jbacteria\j to spread and share new genetic material, and are the method used to transfer resistance genes from one cell to another, so the researchers were essentially beating the \Jbacteria\j at their own game: if the \Jbacteria\j destroy this mechanism, they also destroy their ability to pass on resistance.
Once they enter a new bacterial cell, the plasmids use the host's replication machinery to make more plasmids, and so on. And more importantly, the host begins to synthesise the external guide sequence, or EGS, which calls in the \Jenzyme\j RNase P, which then does the real damage.
Putting it in more technical terms, the EGS binds to messenger RNA (mRNA) at a point specified by the sequence on the artificial gene, which is an "antisense" version of the mRNA for a particular gene. RNase P usually works by snipping RNA at a point opposite the genetic sequence ACCA. Altman's strands of antisense RNA bind to the sections of mRNA which represent the resistance gene, while leaving the signal sequence (ACCA) dangling free at one end. The resistance gene would normally code for an \Jenzyme\j which confers resistance to the \Jantibiotics\j chloramphenicol or ampicillin. If the gene is cut in two, it cannot code for that \Jenzyme\j.
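As a toy illustration of the design described above (with an invented sequence, and simplified well past the real molecular detail), an external guide sequence can be thought of as the reverse complement of a stretch of the target mRNA, with the unpaired ACCA tag that summons RNase P left dangling at one end:

# RNA base-pairing rules
PAIR = {"A": "U", "U": "A", "G": "C", "C": "G"}

def toy_egs(target_mrna_segment):
    """Build a schematic external guide sequence for a stretch of target mRNA."""
    antisense = "".join(PAIR[base] for base in reversed(target_mrna_segment))
    return antisense + "ACCA"  # the unpaired signal that calls in RNase P

# A made-up stretch of a resistance-gene transcript:
print(toy_egs("AUGGCUUACCGA"))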
The effect of the marker is rather like that of infantry calling in artillery or an air strike: the RNase P destroys the offending mRNA by cutting it in half, leaving the \Jbacteria\j open to \Jantibiotics\j once more. Better, the EGS emerges unscathed, ready to tag another mRNA molecule, and best of all, it does all this while using the enemy's weapons.
While this method has worked in culture, it still remains to be seen if the defence will work when applied in a sick patient, riddled with rapidly multiplying \Jbacteria\j, where even a few surviving \Jbacteria\j may be able to stage a recovery, but it looks as though the method will work well in cases where infections are found on the lung, in the intestines, or on the skin, three of the situations where some of the toughest of the resistant \Jbacteria\j are usually found.
A spokesman for Innovir says that researchers there have already had some success at EGS treatments of mice infected with a drug-resistant hepatitis virus, so the end of this particular story is by no means in sight at the moment.
The process may also have other uses: knocking out the mRNA responsible for producing the toxins of \Jdiphtheria\j and food-poisoning \Jbacteria\j. All that is needed is a knowledge of a part of one of the sequences in the target mRNA, and the rest is as easy as ABC, or as easy as ACCA. Until, of course, the \Jbacteria\j find a way of evolving around this problem as well . . .
#
"Vancomycin unravelled",244,0,0,0
(Aug '97)
A new software package, called SnB, has allowed scientists to unravel the structure of Vancomycin, providing information needed to synthesise new forms of the drug which should be able to wipe out those \Jbacteria\j which have started to win the battle against this, the last resort antibiotic.
Vancomycin has earned the title of last resort because it is the only antibiotic effective against certain life-threatening infections that resist all other known \Jantibiotics\j, but even it is now losing its effect against some diseases, like the often-fatal \Jstaphylococcus\j infection.
The antibiotic, which has a number of nasty side-effects, has an extremely complex structure, so that it has always been "brewed" from soil microorganisms, cultured in huge vats. Now that the structure is known, it may be possible to synthesise it directly, but more importantly, it will be possible to "ring the changes", varying the structure in small ways which will defeat the existing resistance mechanisms, while still killing \Jbacteria\j. With luck, the new structures may even prove to be free of the side-effects.
SnB, which has solved other complex structures in as little as a few hours or days, took about a month to yield the correct result, but this was nothing compared with the eight months or so of processing time that the team spent trying other methods, all of which failed.
#
"Watery comets after all?",245,0,0,0
(Aug '97)
The watery comets of Louis Frank ("Controversy in Space", May) are still in the public eye, with new evidence arising this month. The \Jmesosphere\j, the atmospheric region lying between 50 and 90 kilometres, is supposed to be almost bone dry, but \Jsatellite\j measurements suggest that the \Jwater\j vapor levels in the \Jmesosphere\j may be as much as 50% greater than predicted.
The results have come from an instrument deployed from the \J\Jspace shuttle\j\j. The Middle \JAtmosphere\j High Resolution Spectrograph Investigation has found an abundance of hydroxyl radicals, a breakdown product of \Jwater\j, at this altitude.
This, says the watery comets "camp", is clear evidence that the outer reaches of the \Jatmosphere\j are being hit by fluffy, house-size comets 20 times a minute, releasing \Jwater\j that damps down the \Jmesosphere\j as it descends to the ground below. Opponents say that the levels detected are still only a quarter of what would be found if Frank's interpretation were correct. Louis Frank's view is that just finding excess \Jwater\j at those levels is a big step.
#
"Blindness cured by e-mail",246,0,0,0
(Aug '97)
It may sound almost like a faith-healer's trick, but Johns Hopkins researchers are serious about using an automated camera and e-mail to identify diabetics who have a potentially blinding eye disease well before it leads to actual \Jblindness\j.
The camera, called a DigiScope, is similar to devices used now by ophthalmologists to photograph retinas, but it can be operated by unskilled staff in a GP's surgery. This means that diabetics are able to have a check done close to their homes, with the images being transferred digitally to a specialist for accurate diagnosis. Doubtful cases may then be referred to an ophthalmologist for further checking.
Diabetic retinopathy is the most frequent cause in the USA of new cases of \Jblindness\j in adults. The screening program will be expanded in the future to include \Jglaucoma\j and age-related macular degeneration, which are also major causes of \Jblindness\j.
#
"Malaria still under attack",247,0,0,0
(Aug '97)
Yet another war has been declared on \Jmalaria\j, just a hundred years after the discovery by Sir Ronald Ross that mosquitoes transmit the disease. In 1880, Charles Laveran had shown that patients with \Jmalaria\j carried a protozoan parasite in their blood, and doctors began speculating that mosquitoes spread the parasite, but it was only in August 1897, when Ross dissected the guts of an \IAnopheles\i mosquito that had just fed on a \Jmalaria\j patient, that the link was confirmed.
The mosquito-\Jmalaria\j connection pointed to an obvious answer to \Jmalaria\j transmission: kill mosquitoes, and prevent them from biting or breeding. When mosquito control measures have been diligently followed, they have succeeded in limiting the disease, but the mosquitoes and the \Jmalaria\j parasites have been fighting back ever since.
To workers in the field, each "war" on \Jmalaria\j just brings a slightly more tired smile to their faces, and each loudly-hailed plan to conquer \Jmalaria\j produces a little more tightening of the jaw muscles, because \Jmalaria\j is more widespread now than it was a hundred years ago. Several million dollars have been set aside for the latest "war", so that \Jmalaria\j, causing 2.5% of the world's illness, is being attacked with 0.05% of all research funds. \JMalaria\j researchers are said to be underwhelmed.
#
"New worms",248,0,0,0
(Aug '97)
New species can turn up in the oddest of places. During July, some polychaete worms of a new species were found in the Gulf of Mexico. Most unusually, they were living on \Jmethane\j ice, and they were at a depth of 700 metres, some 250 km south of \JNew Orleans\j.
\JMethane\j ice is a solid crystal formed by \Jwater\j, \Jmethane\j, and other \Jhydrocarbons\j, which forms under conditions of low \Jtemperature\j and high pressure. Deep-sea biologist Charles Fisher from \JPennsylvania\j State University reported the find during August, saying that the worms appeared to be drawing \Jenergy\j from the \Jmethane\j, though whether they do so directly, or by feeding on \Jbacteria\j which harness the \Jmethane\j, nobody knows. The odds, however, are on \Jbacteria\j playing a role, which means there is probably at least one new species of bacterium still to be discovered.
Sections of the \Jmethane\j ice have been brought to the surface, where scientists have discovered that the worms have burrowed right through the ice.
#
"New subnuclear particle?",249,0,0,0
(Aug '97)
Once upon a time, a single physicist's name would attach to a new particle. Evidence is to be published on September 1 in \IPhysical Review Letters\i of a new sub-nuclear particle, one of the theorised "exotic mesons", and aside from professors Cason, William Shephard, John LoSecco and James Bishop, 47 other investigators are involved in this discovery.
The breakthrough experiment, called E852 and conducted at Brookhaven National Laboratory on Long Island, is reported in the dissertation of Notre Dame doctoral student David Thompson. Mesons are very unstable, medium-mass elementary particles with short life spans. They are similar to, but smaller than, a proton or neutron. All three particle types are made up of the most basic elementary particle, the quark. Protons and neutrons are made up of three quarks, while ordinary mesons are composed of one quark and one antiquark.
An exotic \Jmeson\j, on the other hand, is made up of a quark, an antiquark and a \Jgluon\j, another elementary particle that "glues" together the quark and antiquark. The \Jmeson\j that has been identified is definitely not made up of just a quark and an antiquark, which means it must be an exotic \Jmeson\j. Such particles cannot be used to make up any larger form of matter, because they are so unstable, but the discovery is exciting, since it will help physicists to learn more about the fundamentals of natural forces.
#
"Cannibals back in vogue",250,0,0,0
(Aug '97)
For many years, it has been the height of political incorrectness to suggest that any culture engaged in cannibalism. Any report of cannibalism, no matter how carefully documented, was to be disregarded as travellers' tales at best, and racist propaganda at worst.
Now it seems that the scepticism of the 1970s through to the 1990s is about to be rolled back again. Archaeologists now have a rigorous new set of criteria for identifying the marks of cannibalism on human fossils, and the dead bones are telling their tales. There appears to be a strong case for accepting that there was cannibalism in our family tree as long as 800 000 years ago, that our Neandertal cousins were given to the practice, and that so were the Anasazi, the Aztec of Mexico, and the people of \JFiji\j.
But the cannibals were not around in \JIce Age\j Europe, not unless they hunted their human prey with nets. While we have a picture of those people as big game hunters, new finds in the Czech Republic suggest that the \JGravettian\j people, who lived between \JSpain\j and southern \JRussia\j some 29 000 to 22 000 years ago, used nets rather than speed and might to capture hares, foxes, and other mammals. That would make them the earliest known net hunters, and it may help explain the larger, more settled populations that seem to mark \JGravettian\j settlements.
#
"Dams cause environmental damage",251,0,0,0
(Aug '97)
For many years, hydroelectricity has been seen as the "clean alternative", offering power which is free of the carbon costs associated with burning \Jfossil\j fuels. In reality, these dams need to run for about 300 years to recoup their carbon costs: the making and hauling of the concrete, the \Jphotosynthesis\j lost from drowned ground cover, and the \Jmethane\j production that takes place, deep within the dams, as vegetation rots.
During all of that time, the dam acts as a source of environmental damage, and this has now been recognised for the first time. The US Federal \JEnergy\j Regulatory Commission licenses most of the power-producing dams in the United States: now it has recommended removing a dam on Maine's Kennebec River. The Commission says the removal would be the best and cheapest way to help the Kennebec's struggling migratory fish populations, which have been blocked by the dam from reaching their spawning grounds upriver. Now environmentalists will be seeking to make this a precedent for taking down other dams, right across the United States.
#
"Safer superconductors",252,0,0,0
(Aug '97)
Many of the best superconductors, those with a transition \Jtemperature\j above 120 kelvin, rely on unpleasant elements like thallium and mercury. A new variety has just been discovered, a new family of \Jbarium\j \Jcalcium\j copper \Joxides\j with transition temperatures up to 126 kelvin.
#
"Armadillos are unique",253,0,0,0
(Aug '97)
Elephants' trunks, like human tongues, hold their shape because they are hydrostats. The pressure of the fluid inside the organ is balanced against the tension of fibres in a surrounding sheath, and manipulating these fibres allows the organ to move around. The sheath of fibre has to be strong, and in the past it has always consisted of helically wound fibres, running diagonally around the organ.
Now the pattern has been broken: a new study reveals that the penis of the nine-banded \Jarmadillo\j is reinforced by longitudinal and circumferential fibres, along and around the organ, a pattern that has never been seen before.
#
"Predicting heart attacks",254,0,0,0
(Sep '97)
Would you trust a computer to watch over your heart's health? In an issue of the American Heart Association journal \ICirculation\i this month, researchers report on some early successes in using artificial neural networks to diagnose heart attacks. The computer-based method was more accurate than the cardiologist in reading the electrocardiogram (ECG), a test often used to predict or detect heart problems in patients seen for chest pain in \Jhospital\j emergency departments.
Lars Edenbrandt, M.D., \JPh\j.D., and co-author Bo Heden, M.D., \JPh\j.D., of the University \JHospital\j, Lund, Sweden, say the neural networks performed better than an experienced cardiologist, indicating that they may be useful as decision support.
Neural networks are designed to "think" like humans. They do this by drawing on records of knowledge and decision-making, analysing the experience of large numbers of actual human decisions. To teach a neural network how to recognise handwriting, for example, you might give it ten thousand samples to work from, and to teach a system to recognise the signs of heart attacks, researchers followed the same method. They exposed the computer to thousands of electrocardiogram readings, more than any cardiologist could possibly read in a lifetime.
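The training loop behind such a system can be sketched in miniature. The example below trains a tiny one-hidden-layer network on synthetic stand-in data (the real study used digitised ECG records, which we do not have), simply to show the learn-from-labelled-examples idea:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for ECG feature vectors: 1000 "patients", 20 features each.
n, d, h = 1000, 20, 8
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = (X @ true_w + rng.normal(scale=0.5, size=n) > 0).astype(float)  # 1 = "abnormal"

W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=h); b2 = 0.0

def forward(X):
    a = np.tanh(X @ W1 + b1)              # hidden layer
    p = 1 / (1 + np.exp(-(a @ W2 + b2)))  # probability of "abnormal"
    return a, p

lr = 0.1
for step in range(2000):                  # plain gradient descent on cross-entropy
    a, p = forward(X)
    g = (p - y) / n                       # gradient with respect to the output logit
    W2 -= lr * (a.T @ g); b2 -= lr * g.sum()
    da = np.outer(g, W2) * (1 - a ** 2)   # backpropagate through the tanh layer
    W1 -= lr * (X.T @ da); b1 -= lr * da.sum(axis=0)

a, p = forward(X)
print("training accuracy:", ((p > 0.5) == y).mean())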
In all, the computer "studied" 1120 ECG records of people with heart attacks and 10 452 ECG records that were normal. The neural networks were 10% better at identifying abnormal ECGs than the most experienced cardiologists on staff, say the researchers. As many as 25 percent of ECG readings are "misjudged or overlooked" by the physician, and a person in need of help may be sent home from the \Jhospital\j without a correct diagnosis: any improvement on this average is obviously a benefit.
So far, the machines have one drawback: doctors will still need to talk to patients about their symptoms and medical history, and draw conclusions from that. But it may not be long before even that is solved.
#
"Computer diagnosis?",255,0,0,0
(Sep '97)
One common piece of \JInternet\j "humour" concerns the psychological help line, where a recorded voice tells obsessive compulsives to press 1 repeatedly, dependent personalities are asked to get somebody to press 2 for them, multiple personalities are advised to press 3, 4, 5 and 6, schizophrenics are instructed to listen for the small voice which will tell them what to press, while paranoids are told they need press no button "since we know who you are and what you want", and depressives are not to bother since the service will take no notice anyhow.
The factual problem with this joke is that diagnosis is required first, but now even that may be possible, not only from a computer, but from a computer contacted by \Jtelephone\j. In mid-September, JAMA, the \IJournal of the American Medical Association\i, reported on how a computer which can diagnose common psychiatric disorders could become a helpful aid to busy physicians. Once again, the computer might not be perfect for the task, but it would make screening for \J\Jmental disorders\j\j much more common.
Curiously, with some disorders, patients seem more willing to confide in the \Jtelephone\j-based system which is described than in their personal physicians. Kenneth Kobak, a \JWisconsin\j psychologist who headed the study, set up a computer to ask questions based on a common questionnaire, called the Primary Care Evaluation of Mental Disorders, which is used to diagnose alcohol abuse, major depression, bulimia, and other disorders. His team set up an interactive voice response system, similar to that used by many companies for customer assistance, where the computer asks a question, and the listener responds.
So maybe all the heart people need is a suitable screening test, covering such matters as life-style, family history and diet.
#
"Gene therapy gets smarter",256,0,0,0
(Sep '97)
At least one of the problems of gene therapy now seems to be fairly well under control. Most forms of gene therapy rely on viruses to carry desirable genes into cells. But because a virus usually infects all types of tissues indiscriminately, the gene often appears in cells where it is not wanted. While viruses which only attack certain tissues might be a solution, a better alternative to the old blunderbuss method is now available.
According to a report in the 1 September issue of the \IJournal of Clinical Investigation\i, a group of researchers at the University of \JChicago\j fitted a gene with a "genetic switch" that was able to turn the gene on only in smooth muscle cells in experimental animals.
Cardiologist Michael Parmacek and his colleagues at the University of \JChicago\j deleted two genes from the common cold virus. Without these genes, the virus is unable to cause any sniffling or fever. They then replaced the deleted genes with a marker gene that turns out an easily detected protein, placed under the control of the SM22 promoter, which "switches on" genes in smooth muscle cells that surround arteries. Injecting this virus into rats, they showed that the smooth muscle cells, and only those cells, had the marker gene active.
A gene which stops cell proliferation could be useful in many treatments, but what happens if a blunderbuss virus carries that gene into your liver or your lungs? If the new gene is tied in with the SM22 promoter, then the gene will only be switched on when the virus gets into smooth muscle cells, the tissue that sometimes blocks arteries in heart patients. So even if the gene got into your liver and lungs, the SM22 promoter would not be switched on in those cells, and so the cell proliferation control gene would not operate either.
The test on rats shows that any useful gene, once it is identified, can now be fitted into a virus, and sent in to do battle against smooth muscle cells. While this is only one of hundreds of tissue types in the human body, this single example would help the twenty people in ten thousand who have angioplasties each year, six of whom will need a second procedure after proliferating smooth muscle cells reblock the arteries. Gene therapy has not lived up to many of its promises so far, but an article in \INature\i during the month argued that the main problem has been in designing efficient delivery systems. Nonetheless, say the authors, the prospects are good that by the year 2010, gene therapy may be as routine a practice as heart transplants are today.
#
"Concern over gene-therapy experiment",257,0,0,0
(Sep '97)
There may be stormy days ahead as we move towards the year 2010. A September meeting sponsored by the Recombinant DNA Advisory Committee (RAC) of the (US) National Institutes of Health was warned that within 2 years, a researcher somewhere will propose a gene-therapy experiment that, although it starts out as a cure for disease, could eventually be used to enhance a trait in healthy people, as most discoveries seem eventually to spill out beyond the problem they were first supposed to solve.
The consensus was that decision makers should treat all such proposals with caution until ethical concerns such as fair distribution and the potential for \Jeugenics\j can be addressed. The nature of scientific discovery being what it is, even that sort of caution is unlikely to "keep the lid on" the problem much beyond the year 2000, as even approved experiments will have unforeseen spill-over effects. This is built into the very nature of science. If the results of experiments were predictable, there would be no point in doing them.
#
"British Columbia traveling north",258,0,0,0
(Sep '97)
In a textbook example of the way science develops, we now have a new model for the west coast of \JNorth America\j. Vancouver Island was once at the latitude of Baja \JCalifornia\j, several thousand kilometres to the south, according to a team of geologists and geophysicists in a September issue of the major weekly journal, \IScience\i. A few beautifully preserved fossils have validated traces of ancient magnetism that suggest this piece of crust has trekked north over long distances in the past 70 million years.
The idea is not new, having been around for about twenty years, based on the magnetic fields preserved in rocks. The \JEarth\j's magnetic field is horizontal at the equator but vertical at the \Jpoles\j, so the inclination of a rock's magnetism shows how far north it was when it formed. This value, called the palaeomagnetic inclination, shows lower values along the west coast of \JNorth America\j than should be found, if the rocks were formed at the latitudes where they are now found.
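For a simple dipole field, the standard relation is tan(I) = 2 tan(latitude), where I is the magnetic inclination. A minimal sketch, using that relation and the roughly 25-degree inclination shortfall reported for the Vancouver Island samples later in this story, shows how such measurements translate into a palaeolatitude:

import math

def palaeolatitude(inclination_deg):
    """Geocentric axial dipole relation: tan(I) = 2 * tan(latitude)."""
    return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2))

# Vancouver Island today, near 50 degrees north, should record I of about 67 degrees.
expected_I = math.degrees(math.atan(2 * math.tan(math.radians(50))))
print(round(expected_I))                       # about 67

# An inclination some 25 degrees shallower points to a much more southerly origin:
print(round(palaeolatitude(expected_I - 25)))  # about 24 degrees north, Baja latitudes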
From this, many researchers assumed that these tracts of rock, or terranes, had slid up the coast from far to the south, much as \JCalifornia\j west of the San Andreas fault is sliding now, but others had their doubts about this. The shallow angle of the magnetic inclinations could have been misleading, because most of these measurements came from great masses of frozen \Jmagma\j, which could easily have been tilted from their original orientations.
Sedimentary rock would solve that problem because it is laid down in recognisable horizontal layers. But sedimentary rocks from the largest terrane, the Insular superterrane, which makes up much of the coastal crust from northern Washington state into \JAlaska\j, seemed to have been heated long after they formed, wiping them clean of their original magnetic signature.
So how do you find the rocks that have not been heated, that still retain their original palaeomagnetic inclinations? Joseph Kirschvink of the \JCalifornia\j Institute of Technology realised that \Jtemperature\j-sensitive fossils could identify rock that hadn't been heated and magnetically altered. Palaeontologist Peter Ward of the University of Washington, in turn, knew of fossils from islands off the east coast of Vancouver that could fit the bill.
\JFossil\j ammonites and inoceramids in 131 rock samples from the two islands, Hornby and Texada, off Vancouver Island, retain the pearly lustre of living animals. And the palaeomagnetic inclinations in the surrounding rock, about 25° shallower than expected at Vancouver Island's current latitude, are thus trustworthy.
The pearly lustre of the fossils indicates the presence of the original \Jaragonite\j, or mother of pearl. If the surrounding rocks had been subjected to heating, the \Jcalcium\j carbonate in the shells would have turned to black calcite. Ward reports that all of the ammonites discovered so far on neighbouring Vancouver Island are black.
Ward's calculation of the speed of the northward drift of the rocks is too fast for some. He suggests that the rocks began migrating from Baja \JCalifornia\j perhaps 75 million years ago, arriving at their present position about 60 million years ago. That rate of movement, about 3000 km (2000 miles) in 15 million years, indicates a slippage of about 20 centimetres (about 8 inches) a year, compared with the San Andreas fault's current movement of 4 centimetres a year. "That is fast, but not science-fiction fast," says Ward, who points out that crustal movements around \JIndonesia\j are just as rapid.
The huge landmass, he says, must have slipped its way north along a giant crustal fault stretching from \JCalifornia\j to British Columbia. But, say his critics, there is no evidence of such a fault. Ward believes the fault is certainly extinct, and because it is buried deep in the \JEarth\j's crust, it may never be detected. He also dismisses comments that fossils in the British Columbia rocks show no evidence of tropical marine life, typical of today's Baja \JCalifornia\j. He points out that the rocks were formed in the Cretaceous era when the \JEarth\j was uniformly warm, with very little latitudinal differences in animal and plant life.
Ward also rejects suggestions that the sedimentary beds may have been compacted and flattened, causing the crystal angle (and hence the magnetic inclination angle) to change, saying that if this had happened, the ammonites would also be flat and deformed. "An ammonite is very squishable," he says. "If you had compaction the \Jfossil\j would not be as pristine as our samples."
As with any other scientific discovery, people will continue to probe at it, looking for evidence that Ward's assumptions are wrong. This may seem mean-spirited, but it is the way that science advances. To this outside observer, Ward seems to have made a very strong case.
#
"Lead linked to dental decay",259,0,0,0
(Sep '97)
In February, we reported on an apparent link between lead levels and cavities in teeth (Lead linked to bad teeth, February). Now the same work has been replicated in rats, leading to a report in the September issue of \INature Medicine\i, suggesting that exposure to high amounts of lead is likely one cause of the high rates of tooth decay found among certain groups, such as children raised in the inner city.
It seems that lead makes the rats more susceptible to cavities in some way. William Bowen, who carried out the work, is a Rochester professor of Dental Research, and he says that lead, while it has been removed from most petrol (\Jgasoline\j), is still present in old paint and is common in soil or dust around contaminated buildings in those areas with the highest levels of dental \Jcaries\j.
The researchers found that the pups of lead-exposed rats produced 30 percent less \Jsaliva\j, which protects teeth against cavities by neutralising acids. \JSaliva\j also provides minerals, and helps protect the teeth in other ways. More importantly, they detected levels of lead in the mothers' milk which were 10 times higher than the lead levels in their blood.
In this study, the rats were drinking \Jwater\j that contained a relatively high 34 parts per million (ppm) of lead, giving the mother rats blood lead concentrations quoted as "40 micrograms/deciliter" (0.4 ppm), at the high end of human blood concentrations, but not extremely high in terms of what you could expect to find in humans.
In another article, dentists Martin Curzon and Jack Toumba of the Leeds Dental School in Britain point to breast milk as a likely route of lead transfer from mother to offspring. Their logic is that lead and \Jcalcium\j have similar chemistry, so lead is stored in bones. During \Jpregnancy\j and \Jlactation\j, when the mother breaks down bone to gain extra \Jcalcium\j, the lead is released as well. (People often fail to realise that bone is living tissue, and that bone is continually being broken down and reformed.)
#
"Building better bones",260,0,0,0
(Sep '97)
A study reported in early September indicates that supplements of \Jcalcium\j and vitamin D can significantly reduce bone loss and the risk of fractures in older people. The report, in the \INew England Journal of Medicine\i of September 4, authored by Bess Dawson-Hughes and colleagues at Tufts University, indicates that the treatment is cheap, easy and safe.
The finding applies to both men and women, and with older people living longer than ever, increasing the intake of \Jcalcium\j and vitamin D can be an important lifelong strategy for both sexes. It seems that as people get older, their ability to absorb \Jcalcium\j and vitamin D declines, just as production of vitamin D by the skin drops. This drop in vitamin D production and absorption reduces their ability to take up dietary \Jcalcium\j and contributes to bone loss as people age. This matters because low bone density is an underlying cause of increased hip fracture among the elderly.
Dawson-Hughes studied 389 men and women aged 65 and older for 3 years. The participants kept to their usual diets, in which they were generally getting the old recommended dietary allowances of \Jcalcium\j and vitamin D. At bedtime, about half of the study participants took \Jplacebo\j pills of no nutritional value. The other half took two separate pills, one containing 500 milligrams of \Jcalcium\j in the form of \Jcalcium\j citrate malate and the other pill containing 700 International Units of vitamin D. All participants visited Tufts every 6 months for measurements of bone mineral density and other tests.
Over the 3 years, the \Jcalcium\j/vitamin D group lost significantly less total body bone, and, in some areas, actually gained bone mineral density. In men, where the findings were more clear-cut, those taking placebos lost about one percent of their bone density at the hip over 3 years. Men taking the \Jcalcium\j/vitamin D combination increased their bone density by about 1 percent. The benefit at the hip for men added up to a 2 percent improvement in bone density for the supplemented group. For women, the positive effects were most notable in the total body bone density, with lesser effects at the hip and spine.
The group taking supplements did considerably better in avoiding fractures. Some 5.9 percent of the participants taking the \Jcalcium\j and vitamin D suffered fractures, compared with 12.9 percent of those who did not take the supplements. Most of the fractures occurred in women.
#
"Broccoli and cancer prevention",261,0,0,0
(Sep '97)
President George Bush won popularity with many in the population when he said he did not like broccoli, but his supporters in that matter may be wise to think again, given the plant's ability to help protect us against cancer. Broccoli is good for you in any form, but now it seems that 3-\Jday\j-old sprouted broccoli seeds may be even better at protecting against cancer than the adult plants, according to a paper published in mid-September in the \IProceedings of the National Academy of Sciences\i.
For more than 20 years, epidemiologists have known that broccoli and other vegetables can help prevent cancer, and for the past five years, Johns Hopkins pharmacologists have had a possible biochemical explanation. Broccoli, \Jcauliflower\j, and related cruciferous vegetables contain a chemical called sulforaphane that stimulates a cell's natural anti-cancer machinery. This helps to switch on the so-called Phase 2 enzymes. These in turn attack cancer-causing chemicals such as free radicals and prevent them from damaging DNA, which can lead to cancer.
The problem is that the amount of sulforaphane in broccoli varies widely between strains of the plant. While they were trying to maximise the sulforaphane levels in laboratory broccoli, workers at Johns Hopkins University made a surprising discovery: that the seeds were exceedingly rich in these compounds, and that young sprouts contained between 20 and 50 times higher levels than mature plants.
Tests on rats show that extracts from the sprouts cut cancer rates by about a half. Perhaps more importantly, Paul Talalay, the main researcher, reports that the sprouts have a 20 times lower concentration of related chemicals, called indole glucosinolates. These chemicals can work against sulforaphane and promote the growth of some tumours in laboratory animals, so once again, the sprouts appear to be better for you than the adult plants.
Clinical studies are currently under way to see if eating a few tablespoons of the sprouts daily can supply the same degree of chemoprotection as one to two pounds of broccoli eaten weekly. The sprouts look and taste something like \Jalfalfa\j sprouts, according to Talalay. But while \Jepidemiology\j studies clearly show the benefits of eating mature broccoli, further study will be needed to make sure there are no other hidden effects in the consumption of the broccoli shoots.
But here's the bad news: the broccoli plant \Jgenus\j \IBrassica\i also takes in Brussels sprouts, cabbage, \Jkale\j, \Jcauliflower\j and turnips. What else is going to turn out to be good for us?
#
"Weeds and depression",262,0,0,0
(Sep '97)
To Australian farmers, St John's Wort, \IHypericum perforatum\i, is just a weed, and weeds are things that cause farmers to feel depressed. Now it looks as though they might be able to do something about the depression by eating the weeds.
Well, not really, but the plant is now being looked at as the source of a cure for this problem condition. A three-year study is planned, involving, among other things, 336 patients with major depression who will be randomly assigned to one of three treatment groups for an eight-week trial: one third of the patients will receive a uniform dose of St John's Wort, another third a \Jplacebo\j, and the final third a selective \Jserotonin\j reuptake inhibitor (SSRI), a type of antidepressant commonly prescribed for depression.
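For readers curious about the mechanics, random assignment of this kind is simple to express in code. The Python sketch below is purely illustrative: the function name, the fixed seed and the exactly equal group sizes are our own assumptions, not details taken from the planned study.

import random

PATIENTS = 336
ARMS = ["St John's Wort", "placebo", "SSRI"]

def allocate(n_patients, arms, seed=1997):
    """Randomly assign each patient ID to a treatment arm, keeping
    the arms the same size (336 patients gives 112 per arm)."""
    rng = random.Random(seed)
    per_arm = n_patients // len(arms)
    pool = [arm for arm in arms for _ in range(per_arm)]
    rng.shuffle(pool)
    return dict(enumerate(pool, start=1))

assignments = allocate(PATIENTS, ARMS)
print(assignments[1], assignments[2])  # arms for patients 1 and 2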
St John's Wort is commonly used as a herbal remedy in \JGermany\j, but there have been no long-term and systematic studies of the herb's effectiveness.
#
"Search for safer cancer drugs",263,0,0,0
(Sep '97)
A study at The Rockefeller University in New York City is about to begin on a number of plant compounds which have the potential to be safer than the nonsteroidal antiinflammatory drugs (NSAIDs), widely used aspirin-like drugs known to prevent colorectal cancer and to cut related deaths by half. The NSAIDs have bad side effects, such as irritating the \Jstomach\j lining or damaging the liver or \Jkidneys\j, but the three plant-derived compounds, curcumin, rutin and quercetin, may lack these side effects.
In the study, these substances will be compared with the NSAID sulindac. All three are potent \Jantioxidants\j and antiinflammatories, but it remains to be seen if they can act like sulindac in prompting cells to "turn on" a program of regulated cell death called apoptosis.
Curcumin has long been used as an antiinflammatory agent. It is the pigment that gives the yellow colour to curry, mustard and turmeric; turmeric itself is the powdered root of \ICurcuma longa\i, a member of the ginger family. Quercetin is found naturally in most fruits and vegetables, such as cranberries and onions, as well as in tea. Rutin, when digested in the colon, breaks down into quercetin.
Many colorectal cancers begin as noncancerous growths, called polyps, in the mucosal lining of the colon and \Jrectum\j, the last part of the digestive tract. An inherited defective gene can cause some forms of the disease, but not all. The polyps develop because the normal routine of cell division and apoptosis (controlled death, directed by the p53 protein) goes awry. When apoptosis is disabled, tissues that rely on it no longer have a way to regulate their cell populations and cancer may follow, so the study will be looking for effects of these compounds on polyp development in people with a history of polyp formation.
#
"Green tea of benefit",264,0,0,0
(Sep '97)
"Take a little wine for thy \Jstomach\j's sake", we read in the \JBible\j. Wine drinkers who once had to rely on this admonition have recently had a better case, sipping their red wine for its beneficial \Jantioxidants\j. Now they may wish to switch their beverage of choice to a nice hot cup of green tea.
A report to the American Chemical Society this month showed that an antioxidant found in green tea is 3.5 times as effective as the antioxidant in red wine, and more than 100 times as effective as vitamin C at protecting cells and DNA from damage believed to be linked to cancer. \JAntioxidants\j are generally believed to mop up highly reactive compounds called free radicals before they have a chance to react and tear apart DNA or other cellular components.
These chemicals are common in fruits, vegetables and red wine, and even tea leaves are known to have a similar effect. Lester Mitscher and his colleagues at the University of Kansas have now undertaken a detailed study of the comparative effects of the \Jantioxidants\j in tea and other foods.
They looked at the effects of a number of possible antioxidant compounds in green tea, red wine, and vitamins C and E for their ability to prevent bacterial cells from mutating. This is a standard method of assessing possible anti-cancer chemicals.
They incubated some of their bacterial cultures with \Jhydrogen\j \Jperoxide\j, a powerful free radical producer, along with varying concentrations of one of several \Jantioxidants\j. The most effective of the \Jantioxidants\j turned out to be epigallocatechin gallate (EGCG), which protected about 68% of the cells from oxidative damage; resveratrol, the antioxidant in red wine, protected only 20%, while vitamin E protected 1.5% and vitamin C only 0.6%.
Interestingly, a number of cancers have a much lower incidence in \JJapan\j: this research could help to explain some of these effects, as green tea is commonly used in \JJapan\j.
#
"Kombucha tea cause of illness",265,0,0,0
(Sep '97)
Early October brought some not-so-good news about tea. Kombucha tea is a popular "cure-all" made by steeping Kombucha "mushrooms" (actually an aggregate of yeast and \Jbacteria\j covered by a permeable membrane, available in health food stores) in tea and sugar to create a tonic. It is claimed to be a cure for everything from wrinkles to cancer, as well as lowering blood pressure, increasing vitality, increasing T-cell counts, relieving \Jarthritis\j pain, cleansing the gall bladder, alleviating \Jconstipation\j, fighting acne, and restoring grey hair to its original colour.
The beverage (also known as Manchurian or Kargasok tea) has now been identified as the cause of a number of illnesses in the United States, reported in the \IJournal of General Internal Medicine\i in early October. The symptoms seem to be mainly in the form of allergic reactions, but the report's principal author, Radhika Srinivasan, suggests that there may be some form of toxicity in Kombucha tea. There are, however, only four known cases of problems with the tea so far, and there would be ethical problems in testing the tea for toxic side-effects in humans, so the researchers can do little more than sound a note of warning.
#
"Male fireflies eaten for repellent",266,0,0,0
(Sep '97)
More than 30 years ago, researchers discovered that females of the \Jfirefly\j \Jgenus\j \IPhoturis\i could lure males of another \Jgenus\j, \IPhotinus\i, by faking \IPhoturis\i females' blinking patterns. Both types of fireflies contain chemicals that protect them from predators, but the predatory \IPhoturis\i fireflies have only about one-third as much of the repellent compounds, called lucibufagens, as \IPhotinus\i.
It occurred to Thomas Eisner at Cornell University that perhaps \IPhoturis\i was stealing its protection from the gullible \IPhotinus\i males. The courtship ritual involves a male flashing its "fire", then the female responds with a pattern which identifies her species, and then the male approaches to mate. But when the \IPhotinus\i males approach a \IPhoturis\i female, they end up as dinner.
Eisner reared some larval \IPhoturis\i in Petri dishes. With no other fireflies to eat, the females grew into lucibufagen-free adults, which hungry jumping spiders were quite willing to eat. But captive \IPhoturis\i females that had eaten some \IPhotinus\i males had high levels of lucibufagens and were rejected by jumping spiders. The spiders also avoided \IPhoturis\i females who were fed a solution containing lucibufagens.
That is the neat part: the messy part is that nobody knows where \IPhotinus\i gets the lucibufagens, but they certainly can't manufacture them. A number of predatory species of \IPhoturis\i live across the United States and \JSouth America\j, and all the females respond to the mating flashes of several other species, at least for meals, so now the scientists working on this group will need to re-examine their work and their conclusions.
#
"Sound robot design",267,0,0,0
(Sep '97)
Last August saw a report in the \IJournal of the Acoustical Society of America\i, its cover article, in fact, on Yale University electrical \Jengineering\j professor Roman Kuc's new \Jsonar\j robot. Inspired by the methods used by bats and dolphins to locate prey, the robot uses ultrasonic sound to locate objects and navigate. It is so sensitive that it can tell whether a tossed coin has come up heads or tails.
Getting a camera powerful enough to duplicate human vision is proving difficult, but Rodolph (short for robotic dolphin), Yale's new robot, is equipped with three \JPolaroid\j electrostatic transducers that can act either as transmitters or receivers to serve as the robot's "mouth" and "ears." This \Jsonar\j detection system, says Kuc, could prove easier and less costly than camera vision for identifying an authorised customer at an automated teller machine, detecting production flaws on an assembly line, or helping someone who is paralysed interact with a computer.
The transducers are similar to those used in \JPolaroid\j autofocus cameras to gauge an object's range, and in acoustic digital tape measures that use echoes to measure distances. The centre \Jtransducer\j emits a 60 kHz signal, as often as ten times a second, while the outer transducers, controlled by a Pentium 120 processor in a PC, form rotating "ears" that help pinpoint and amplify the returning echoes. The system has a claimed accuracy of 0.1 mm, good enough for most purposes, says Kuc.
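The underlying ranging principle is ordinary time-of-flight arithmetic: sound travels at roughly 343 metres per second in room-temperature air, and the echo covers the distance twice. The Python sketch below shows only this basic relation; Kuc's actual signal processing, which achieves the 0.1 mm figure, is far more sophisticated and is not described in the report.

SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 degrees C

def range_from_echo(round_trip_seconds):
    """Distance to the target: the ping travels out and back, so halve the path."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# A ping returning after 5.83 milliseconds puts the target about 1 metre away.
print(range_from_echo(5.83e-3))  # ~1.0 m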
So far, only the sensing arms have been built. The next stage will be to mount the detector system on a mobile base, so that it can move around, exploring its environment.
#
"Fish and ultrasonic sound",268,0,0,0
(Sep '97)
It is unlikely that Rodolph will ever go to sea to hunt down a fish meal for us. Many fish will avoid dolphins when they can see them, but a report in \INature\i this month tells us that at least one dolphin-avoiding fish, the shad, can detect ultrasonic tones in the frequency range that dolphins use for \Jecholocation\j.
In the past, fish seemed able to hear only low-frequency sounds, basically below 3000 Hz, yet \Jherring\j will swim away when fishing boats switch on their echo sounders to locate schools of fish, and sardines and shad can be scared off by bursts of ultrasonic sound.
To find out what sounds the fish were actually hearing, University of Maryland neurobiologist Arthur Popper, together with Zhongmin Lu and David Mann, conditioned five American shad to associate a low-frequency sound (one that the researchers knew the fish could detect) with a mild electric shock.
Under normal circumstances, a conditioned fish will lower its heart rate when it hears the sound, in anticipation of the expected shock. While the fish did best at 200 to 800 Hz, they also responded to ultrasound between 25 000 Hz and 130 000 Hz, and the shad responded to simulated dolphin \Jecholocation\j pings of 80 000 Hz.
The study may help acoustic engineers design better underwater sound systems to deter these fish from entering the \Jwater\j-intake pipes of power plants. "Pingers" have been used before to scare away fish, but without any real knowledge of what the fish can actually hear. Now, for the first time, we have some precise knowledge, and more importantly, a proven method for more investigations.
#
"Ciclid fish in danger of extinction",269,0,0,0
(Sep '97)
Pity the poor cichlid, though, who seems not to be all that hot in the hearing game, and now is having trouble seeing. The colourful cichlid fish, which live in \JAfrica\j's Lake Victoria, lying between \JKenya\j, \JUganda\j and \JTanzania\j, are notable for their rapid \Jevolution\j. Now they are also notable for their rapid decline.
Eutrophication of the lake, an increase in nutrient levels, has caused lake turbidity (opaqueness) to increase. This is a problem for fish which maintain their differences by colour-associative mating, which depends on light conditions. The cichlid species will normally mate only with their own kind, but under low light conditions, they are unable to make the necessary visual distinctions, and many closely related species may end up interbreeding themselves out of existence.
The lake was once home to at least 500 species of small fish called haplochromine cichlids. Over the last 70 years, some hundreds of these species have become extinct. Originally, this \Jextinction\j was blamed on the predatory power of Nile perch, introduced into the lake in the 1950s, but it has recently become obvious that even species which the perch did not eat were disappearing.
Researchers then found that there were just five distinct cichlid species in murky \Jwater\j that allowed the transmission of only a 100 nanometre segment of the light spectrum, but about 20 species in clearer \Jwater\j that allowed the transmission of a full 500 nanometre segment. The researchers believe the murky \Jwater\j "turns the lights off," making even the most brightly coloured males appear drab and indistinct to females, leading to breeding mistakes, the loss of distinct species, and an increase in hybrids.
So rather than classical \Jextinction\j, a slight change in the ecological balance is capable of "blinding" the fish to differences which would prevent them mating in clear \Jwater\j, wiping out species after species. In reality, these species were only species-in-waiting, populations which had been separated out into different breeding groups, and which, as separate genetic pools, were ready to embark on an evolutionary voyage which would have seen genetic barriers raised over time. Now, thanks to the turbid waters of the lake, that will never happen.
Eutrophication arises when logging and farming allow greater surface run-off, carrying mineral \Jnutrients\j such as nitrates and \Jphosphates\j into the lake. These \Jnutrients\j support larger concentrations of \Jalgae\j, and so of the plankton that live on the \Jalgae\j. You can see the same "clouding" effect by taking three jars of well-mixed pond \Jwater\j, giving one a small amount of "complete fertiliser" and a second one a large amount of the same fertiliser, while the third remains untouched. Leave them on a sunny window-sill and watch the \Jwater\j go cloudy over a week or two.
#
"First World AIDS, Third World no aid?",270,0,0,0
(Sep '97)
How do you feel about providing expensive AIDS treatment to \JHIV\j sufferers in the Third World? One reaction is to say that AZT cannot be afforded in the Third World, so we should seek more affordable measures. Yet AZT is regarded as the standard in the developed world, and intensive treatment of pregnant \JHIV\j-positive women with AZT can reduce transmission of \JHIV\j from mother to child by nearly 70%. So how can you justify doing anything less in those countries in Asia and \JAfrica\j where \JHIV\j is running wild?
A closed meeting of the United Nations' AIDS program, UNAIDS, discussed this issue and others during September, but no results had been released at the time of writing this report. Pressure groups and the \INew England Journal of Medicine\i argued during the month that any such trials would be unethical, and another problem is looming now, as hopes of an effective vaccine begin to surface.
How do you test an AIDS vaccine if everybody in the trial is also getting powerful anti-AIDS drugs? This was one of the considerations behind a number of American doctors indicating during September that they were prepared to trial AIDS vaccines on themselves.
One thing is certain: \JHIV\j/AIDS will impose a huge financial cost on the world. Even if economists can show that each life saved is worth only a few dollars in economic activity, the cost of leaving each untreated case actively spreading the disease will spiral to a huge amount in a very short time: the world cannot afford not to do what it must to wipe out this disease, even if the victims are too poor to pay for their treatment.
#
"Modified virus to fight AIDS?",271,0,0,0
(Sep '97)
In other \JHIV\j/AIDS news in September, researchers have engineered a vesicular stomatitis virus (VSV), which normally infects \Jcattle\j, to attack the AIDS virus in humans. In \Jcattle\j, the virus causes a mouth infection that stops the \Jcattle\j eating, but in the laboratory, the modified version has been shown to selectively target and destroy \JHIV\j-infected human cells, with no other effects.
After \JHIV\j binds to the \JCD\j4 receptor on a white blood cell, it also must link to another molecule found on the cell's surface, a chemokine receptor, and after this, the \JHIV\j can gain entry to the cell. Pieces of the virus then appear on the cell's surface, flagging the cell as one that has been attacked.
The modified VSV carries genes which code for \JCD\j4 and one of the \JHIV\j's chemokine \Jreceptors\j, CXCR4. This makes the VSV home in on the AIDS-infected cells, which it quickly kills, but the modified VSV lacks the surface protein it would need to attack normal cells, and so it leaves them alone.
This is a novel approach, but only time will tell whether it is a practical one: some workers in the area feel that the VSV levels needed may be too high to ever knock out all of the cells infected with \JHIV\j. Perhaps this treatment would be useful with the three patients described in the next story.
#
"Drug solution to AIDS?",272,0,0,0
(Sep '97)
A conference in Baltimore was told in mid-September about an \JHIV\j-infected German man who has "undetectable" levels of the virus in his blood 9 months after he stopped taking a powerful combination of drugs to tackle the infection. Usually, viral levels rebound when patients stop taking anti-\JHIV\j drugs, but in this case, that seems not to have happened.
The patient sought treatment soon after being infected, and was given indinavir, ddI, and hydroxyurea, drugs that have different mechanisms of action, which reduced his \JHIV\j levels to the point that the most sensitive polymerase \J\Jchain reaction\j\j assays could not detect any virus, a fairly common occurrence with the drugs now available.
Five months after the start of treatment, the man developed hepatitis A, which by itself should have driven up \JHIV\j levels, but this did not happen, and nine months later, the virus still remains undetectable in his blood. All that remains is what is described as a faint signal in his \Jlymph\j nodes, where the bulk of \JHIV\j is usually to be found.
The report is the second anecdotal account of long-term viral suppression in the past few weeks. A paper published in the 30 August \ILancet\i describes two other patients who also have not seen their \JHIV\j levels rebound after being off drugs for 1 year, but some researchers question whether these individuals really were infected, saying that the patients in question had extremely low \JHIV\j levels to begin with.
These patients were, however, also being treated with hydroxyurea, which is not an approved AIDS drug in the United States. It certainly points to a useful direction for further exploration.
#
"CPR debate",273,0,0,0
(Sep '97)
An American expert panel has questioned the usefulness of "mouth to mouth" in cardio-pulmonary resuscitation (CPR). They suggest that in many cases of adult cardiac arrest, mouth-to-mouth ventilation as a part of CPR rarely helps, and may even harm the patient.
Basically, they believe that mouth-to-mouth ventilation can interfere with the rescuer's efforts to perform chest compressions and cause significant adverse effects. It also makes CPR harder to teach, and apparently puts people off getting involved, especially when the patient is a bit "unhygienic", compared with the training dummies or manikins that they learned on.
Exhaled air contains 17% oxygen, less than the 21% of fresh air, and 4% carbon dioxide, which can inhibit cardiac contraction. Studies have found that from 10 to 35 percent of patients who receive CPR inhale \Jstomach\j contents into their lungs, brought up after air is blown into the \Jstomach\j rather than the lungs.
They say that the key factor in survival is the time from the cardiac arrest until defibrillation, when the heart is shocked back into a normal rhythm, but they also say that it is probably too early to do away with mouth-to-mouth ventilation just yet, and that it is still essential in cases of drowning. By the year 2000, they expect to have made a definitive statement on the matter.
#
"SHEBA's winter",274,0,0,0
(Sep '97)
Two Canadian Coast Guard ice-breakers left Tuktoyaktuk, Canada in the middle of the month to establish Ice Station SHEBA in the \JArctic\j Ocean. Getting its name from the Surface Heat Budget of the \JArctic\j Ocean project, SHEBA will see one of the ships frozen into the ice and left as a floating science platform for 13 months.
They will arrive at their station, approximately 75 degrees north and 143 degrees west, around October 1. One ship, \IDes Groseilliers\i, will remain in place until next year, while the other, \ILouis S. St. Laurent\i, will leave the area around October 15. The purpose of the freeze-in is to gather data on \Jweather\j and climate in the \JArctic\j so that forecasts of global climate change can be improved. You can follow SHEBA's progress at http://sheba.apl.washington.edu/default.html, with links taking you to regular reports on the status of the ships, and photographs showing the ice (and, stopping the presses for a moment, even a \J\Jpolar bear\j\j sighted from \IDes Groseilliers\i).
#
"Crystal star?",275,0,0,0
(Sep '97)
Astronomers pre-announced an October article in September, reporting on an interesting star. In theory, white dwarf stars may have a crystallised core. Most white dwarfs are thought to be too hot to crystallise, but now a team has identified a possible exception, a pulsating white dwarf star called BPM 37093, tens of light-years away in the southern \Jconstellation\j \JCentaurus\j.
A two-week study, using the Whole \JEarth\j \JTelescope\j, has been scheduled for next March, which should provide enough data to determine the answer one way or another.
#
"Quark star?",276,0,0,0
(Sep '97)
An even more exciting story broke this month, in the 1 September issue of \IPhysical Review Letters\i, where astronomers argue that the radio pulses from spinning neutron stars may reveal signs of the stars turning into "quark stars", where the core turns into a soup of free quarks.
The effect would show up as changes in density. As the pulsar slows down, the "centrifugal force" weakens, and the star becomes denser. As it crosses a critical density threshold, the neutrons should break down into quarks, which are more compressible, so the star shrinks. This has the same effect as a skater pulling her arms inwards: the star will then spin faster, producing a characteristic radio signal.
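The skater analogy can be made quantitative through the conservation of angular momentum: the product of the moment of inertia and the spin rate stays fixed, and for a sphere the moment of inertia scales with the square of the radius, so a small contraction produces a proportionally larger spin-up. The Python sketch below illustrates the direction and rough size of the effect under the simplifying assumption of a uniform sphere, which a real neutron star is not.

# Angular momentum L = I * omega is conserved; for a uniform sphere,
# I is proportional to M * R**2, so omega scales as (R_old / R_new)**2.
def new_spin(omega, r_old, r_new):
    """Spin rate after the radius changes, with angular momentum conserved."""
    return omega * (r_old / r_new) ** 2

# A 1% contraction in radius gives roughly a 2% spin-up.
print(new_spin(100.0, 1.00, 0.99))  # ~102.03 rotations per second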
#
"Evolution fraud rediscovered",277,0,0,0
(Sep '97)
Once upon a time, one of the cornerstones of \Jevolution\j could be found in a catchphrase to the effect that "ontogeny recapitulates phylogeny". In plain English, the development of "higher" animals was supposed to show the embryo going through all of the stages that it had passed through on its evolutionary path.
While it has long since been rejected as a key element of evolutionary theory, it is still often trotted out in \Jbiology\j textbooks, usually with Ernst Haeckel's drawings, now well and truly free of copyright, to show the effects. Now a British researcher has photographed real embryos of the species Haeckel drew, and reminds us of something which Haeckel apparently admitted at the time: Haeckel had "fudged" the drawings somewhat, drawing them from memory in such a way as to emphasise the alleged recapitulation.
#
"Lungfish our nearest relative?",278,0,0,0
(Sep '97)
"Old Four Legs", the \Jcoelacanth\j, has long been regarded as the closest of our gilled relatives, the species closest to the first fish to take a step out of the \Jwater\j onto the land. Now it seems that those scientists who thought the \Jlungfish\j was our nearest relative were right all along. A new DNA analysis sees the \Jlungfish\j promoted to number one spot in the land vertebrate tree. Of course, we need to avoid Haeckel's error, and remember that the \Jlungfish\j has been evolving as well, but if you want an idea of what our ancestors may have looked like as they came ashore, that is where you need to go.
#
"Birds get older",279,0,0,0
(Sep '97)
Most of the many North American bird species were thought to have originated at the time of the \Jglaciation\j of the Late Pleistocene, during the Ice Ages around 100 000 and 250 000 years ago. Logically, that was the time when individual populations of a species were isolated by barriers of inhospitable territory, either ice or \Jtundra\j, and would be driven to follow their own genetic paths and diverge into separate species.
Logical perhaps, but not factual, not if you take account of the mitochondrial DNA analysis of 35 pairs of \Jsongbird\j species reported in September in \IScience\i. The birds were selected because they were closely related, and so presumably evolved only recently. In theory, birds which diverged in the more recent \JIce Age\j should differ by about 0.2%, while those which diverged at the earlier \JIce Age\j should show a difference of about 0.5%. These differences accumulate as simple random changes in unimportant DNA, and so can usually be relied on as an evolutionary clock.
Instead, many of the bird pairs showed differences suggesting that their last common ancestor was much older, around 2.5 million years old, in fact! The next step in a study of this sort will be for the doubters to question the validity of the "biological clock".
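The arithmetic of the clock is worth spelling out: the figures above imply a rate of about 2% sequence divergence per million years, so a measured difference converts directly into an age. The Python sketch below applies that assumed rate; the rate itself is a calibration of the method, not a measured constant.

# Assumed linear clock implied by the figures above: 0.2% at ~100 000 years
# and 0.5% at ~250 000 years both give about 2% per million years.
RATE_PERCENT_PER_MYR = 2.0

def divergence_age_years(divergence_percent):
    """Convert a percentage mtDNA difference into an age under a linear clock."""
    return divergence_percent / RATE_PERCENT_PER_MYR * 1_000_000

print(divergence_age_years(0.2))  # ~100 000 years, the younger Ice Age
print(divergence_age_years(5.0))  # ~2.5 million years, matching the reported ages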
#
"Fallacies about evolution",280,0,0,0
(Sep '97)
There are two standard fallacies about \Jevolution\j: the first is that small is always replaced by large as \Jevolution\j proceeds, and the second is that every step in \Jevolution\j goes forward.
The second fallacy is easier to deal with, since if life forms always improved, they would never go extinct. The first fallacy was given to us by a 19th century American \Jfossil\j-hunter, Edward Drinker Cope, in a proposition eponymously known as Cope's Rule: over time, the average body size within a \Jgenus\j of animals will tend to increase. Now we have evidence that Cope's Rule has been flouted by a lowly \Jmollusc\j. In a paper in \INature\i this month, David Jablonski reports that molluscs have shrunk in size through the aeons.
Jablonski studied almost 1100 species of clams, oysters, and snails from the Gulf and Atlantic Coastal Plain of \JNorth America\j that evolved within 191 genera over 16 million years. One-third of the genera did indeed get bigger, but one-third got smaller. Smaller is better, it seems, when it is advantageous to reproduce early and often.
#
"Microprocessor size limit",281,0,0,0
(Sep '97)
According to Moore's Law, chips are getting smaller and more powerful all the time. (To be precise, Moore's Law states that the logic density of silicon integrated circuits has closely followed the curve d = 2^(t - 1962), where d is the density in bits per square inch and t is the year.)
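Taken at face value, the formula is easy to evaluate. The short Python sketch below simply computes the quoted curve; treat the 1962 anchor and the exact one-year doubling period as the article quotes them, not as independently verified constants.

# The curve as quoted above: d = 2 ** (t - 1962), density in bits per
# square inch for year t.
def density_bits_per_sq_inch(year):
    return 2 ** (year - 1962)

# Each year doubles the density, so 1997 is 2**35 times the 1962 figure.
print(density_bits_per_sq_inch(1997) / density_bits_per_sq_inch(1962))  # 2**35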
But this era of growth may be coming to an end. The microscopic silicon chips of today are getting so small that eventually they will contain too few atoms to work, say observers.
"The 'road map' says that in about the year 2010 the limit will be reached. Microprocessors will be as small and as fast as they can get. This unhappy news will have an enormous impact on the national economy," said Kevin Jones, professor of materials science and \Jengineering\j and co-director of the University of \JFlorida\j's SoftWare and Analysis of Advanced Material Processing, the SWAMP Center.
If you think you have heard all that before, so has his fellow co-director, Mark Law. "For 30 years, people in the computer industry have predicted the 'just-a-decade-away' demise of the continually shrinking, ever-faster but still inexpensive computer chip," said Law. "But clever people have been able to push that 10-year window ever farther out." The point, say both men, is that the window cannot be pushed out forever.
The heart of the Pentium processor's transistors, a layer that was once thousands of atoms thick, is getting so small that it will soon be only 50 atoms thick, and the Pentium's descendants may eventually shrink themselves out of function when the transistors inside the chips get to be fewer than 10 atoms thick, in just over a decade. Unless there is a revolutionary change in computer technology, the trend toward smaller, faster computers will then have reached its limit.
#
"Fossil contention",282,0,0,0
(Sep '97)
The press in September were rightly excited over a 65-million-year-old \ITyrannosaurus rex\i named Sue (after the discoverer's girl-friend), said to be the largest and most complete theropod ever found. With the skeleton due to be auctioned on October 4, people wondered where the bones would end up, and whether they would still be accessible to science.
Sue was seized by the US government in 1992 precisely to ensure that the skeleton would remain accessible to science. The bones were discovered on a South Dakota Indian reservation by commercial \Jfossil\j hunters. Although they paid the landowner, Maurice Williams, a Cheyenne River Sioux, $5000 to excavate the \Jfossil\j, federal officials seized the bones, claiming that it had not been established that they were rightfully taken. Williams holds his land in a tax-free trust arrangement with the federal government, which means government permission is required to sell the land or anything under it. After confiscating Sue, the government charged the discoverer, Peter Larson, with numerous felonies related to trafficking in illegally excavated fossils; he was convicted of not reporting international financial transactions, for which he received a 2-year sentence last year. Meanwhile, the courts finally decided that Williams was the rightful owner of Sue.
To deflect criticism from scientists, Sotheby's offered museums three interest-free years to pay what was expected to be a very high price, since a less complete skeleton was already on offer for US$12 million. In the end, the price paid in early October by the purchaser, \JChicago\j's Field Museum, was just on US$8.4 million, including the auctioneer's 10% commission.
#
"Largest specimen of Tyrannosaur unearthed",283,0,0,0
(Sep '97)
A fossilised skeleton believed to be the largest specimen of a tyrannosaur ever unearthed was found this summer in Montana by a field crew headed by J. Keith Rigby, a University of Notre Dame palaeontologist. It may be either a \ITyrannosaurus rex\i or something very much like it. It differs from other specimens of \IT. rex\i, of which there are now about fifteen: the pubis, one of three main bones in the pelvis, measures at least 52 inches (132 cm), compared to 48 inches (122 cm) in the largest known \IT. rex\i. The femurs or thigh bones, which palaeontologists normally use to estimate the size of dinosaurs, await excavation at the site, or so said a press release in early September.
Unfortunately, the site near the Fort Peck reservoir was on a \Jcattle\j ranch, and the former owners, who had been forced off the land because of debts, decided to excavate the \Jfossil\j themselves, in order to sell it and pay off those debts. Rigby had employed two members of the family to cook for the crew as work proceeded, so they were well aware of the location when Rigby left to return to teaching at Notre Dame, planning to finish the dig the following year.
Judging from the position of the surface bones and the other bones so far unearthed, Rigby believes the whole bone bed may cover 6 hectares (15 acres), making it one of the largest \Jdinosaur\j graveyards of the Late Cretaceous ever found. The material on the site was due to be placed in a museum planned to open in the area in 2005.
When Rigby returned late in September, he found two-thirds of the left side of the skull had gone, along with both lower jaws, although the jaws have since been recovered.
#
"Better superconductors",284,0,0,0
(Sep '97)
Modern high-\Jtemperature\j copper oxide superconductors have a problem: as currents pass through them, magnetic vortices, eddies of magnetic force, are generated within the material. The movement of these vortices produces a finite electrical resistance and prevents the desired loss-free \Jconduction\j of current through the material. If mercury is introduced into the material, and it is then bombarded with high-\Jenergy\j protons, defects are created in the structure which pin the vortices in place, allowing the material to act as a superconductor at temperatures above 130 K.
#
"Cleaner water, better medicines",285,0,0,0
(Sep '97)
Extracting chemical products or pollutants from \Jwater\j is not an easy task, and often involves the use of toxic organochlorine solvents. Liquid carbon dioxide would be an ideal solvent for the task, as it is cheap and non-polluting, but until now, its use has not been practical. Joe DeSimone and his colleagues at the University of \JNorth Carolina\j, Chapel Hill, have used dendrimers, molecules which branch repeatedly from a central point, to do the cleaning job for them. The dendrimers are soap-like in their action, clinging to the carbon dioxide on one side, and to the target molecule on the other.
In the experiment, conducted under high pressure, a layer of \Jwater\j containing methyl orange was covered with a layer of liquid carbon dioxide containing dendrimers. The process took about three hours, but seemed to remove every trace of the dye from the \Jwater\j. Carbon dioxide is cheap and easy to obtain: this sort of process, or a similar one using another carbon dioxide solvent helper called the reverse micelle, could be used to extract heavy metals from \Jwater\j, or pharmaceuticals from a culture broth, once suitably "targeted" endings have been created for the dendrimers.
#
"Mounds as old as the pyramids?",286,0,0,0
(Sep '97)
Long before Europeans arrived in the Americas, the Native Americans dotted the eastern side of the continent with thousands of huge \Jearth\j mounds. Previously dated to the period from 3700 to 2700 years ago, the mounds have now been dated to about 5300 to 5400 years ago on the basis of radiocarbon dates, putting them in the same age range as the oldest pyramids, at a time when archaeologists believed the agriculture and trade networks necessary to build such structures had not been established in \JNorth America\j.
The dating question arose because the Watson Brake mounds lacked any of the artefacts that were expected, items made by a group called the Poverty Point people. Auger cores revealed ancient soil horizons, suggesting great age, and radiocarbon did the rest.
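For those who wonder how "radiocarbon did the rest", the arithmetic is straightforward: carbon-14 decays with a half-life of about 5730 years, so the fraction remaining in a sample fixes its age. The Python sketch below shows the standard relation; the 52.6% figure is an invented illustration chosen to land near the Watson Brake dates, not a measurement from the study.

import math

HALF_LIFE_C14 = 5730.0  # years

def radiocarbon_age(fraction_remaining):
    """Age in years from the fraction of the original carbon-14 remaining."""
    return -(HALF_LIFE_C14 / math.log(2)) * math.log(fraction_remaining)

# A sample retaining ~52.6% of its original carbon-14 dates to ~5300 years.
print(radiocarbon_age(0.526))  # ~5310 years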
#
"New tumor suppressor gene?",287,0,0,0
(Sep '97)
The p53 protein is found to be mutated in more than 50% of human cancers, earning it the name of "guardian of the \Jgenome\j". It appears that so long as p53 is active, cancers are held at bay, but when it breaks down, the cancers have an open go, because p53 is no longer able to control cell growth and death through the process called apoptosis. A previously unknown protein, called p73, has now been discovered and characterised, and it seems to share structural and functional similarity with p53. More tests need to be done before p73 can be confirmed as a \Jtumour\j suppressor, according to a report in \INature\i this month. Significantly, the gene for p73 lies on \Jchromosome\j 1, already believed to be "home" to one or more \Jtumour\j suppressors. The gene maps to the short arm of the \Jchromosome\j, in a region which is frequently deleted in neuroblastomas.
#
"DNA all wrapped up",288,0,0,0
(Sep '97)
Popular accounts of the \Jchromosome\j usually present it as some sort of long thread, tucked into the nucleus like a ball of string. The problem is that the typical cell has about two metres of DNA in it, and that much string, even in a large pocket, tends to get tangled: how much worse would it be with the super-fine thread that is DNA?
The picture is wrong. The simple DNA double helix is coiled into a super-helix (rather as the twisted wires in a keyboard cable are coiled into a helix), then that is coiled into a super-super-helix, and the whole thing is bound together with proteins called histones. Imagine wrapping a keyboard cable around your arm a few times, slipping your arm out and taping the coils together, and you will have the general idea.
Now, researchers have determined the x-ray crystallographic structure of the molecular machinery responsible for this feat, a fundamental DNA packaging unit called the nucleosome core particle. The structure's resolution of 2.8 angstroms is good enough to distinguish about 80% of the atoms in the protein component of the particle and all of those in the DNA.
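As a check on the two metres of DNA mentioned above, the arithmetic is simple: each base pair adds about 0.34 nanometres along the helix, and a human cell carries two copies of a roughly 3.2-billion-base-pair genome. The Python sketch below runs the numbers; both constants are round textbook figures rather than values from the report.

BASE_PAIR_RISE_M = 0.34e-9   # metres of helix length per base pair
GENOME_BP = 3.2e9            # base pairs per haploid human genome (approximate)

# Two genome copies per diploid cell.
total_length_m = 2 * GENOME_BP * BASE_PAIR_RISE_M
print(total_length_m)  # ~2.2 metres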
#
"Left-handers rule in space",289,0,0,0
(Sep '97)
Proteins are made of amino acids which can occur in two mirror-image forms, just like our hands, DNA, and even computer keyboard cables. On \Jearth\j, nearly all life-based amino acids occur as the left-handed form, as L-enantiomers, rather than right-handed D-enantiomers. When an excess of L-amino acids was discovered on the Murchison \Jmeteorite\j, it seemed that our chemical bias might have had an extraterrestrial origin; but then questions were raised about contamination of the samples.
The bad news for right-handers is that the extraterrestrial nature of this excess has now been confirmed in a \INature\i report this month. The origins are confirmed by a careful study of the amount of \Jnitrogen\j-15 in the amino acids, which was different from that found in terrestrial material.
In a test tube, equal amounts of the two forms of an amino acid will form, but living things use only the L-enantiomers. This finding makes it just a little more likely that life originated with the delivery of large amounts of left-handed material to the \Jearth\j's \Jatmosphere\j and waters. That still leaves open the question: what caused the imbalance in space?
#
"E. coli sequenced",290,0,0,0
(Sep '97)
The standard laboratory strain of \IEscherichia coli\i, known as K-12, was first isolated in 1922, and it has been a standard workhorse for laboratory \Jgenetics\j for more than three decades. This month, researchers reported the complete sequence of the entire 4.6 megabase (4.6 million base pair) \Jgenome\j of the bacterium. Because it has been so intensively studied for so long, researchers will now be able to go back over old work, tying it in to the new information, which includes a fold-out showing the arrangement of putative and known genes, operons, promoters, and protein binding sites.
#
"Cassini trials",291,0,0,0
(Sep '97)
The $3.3 billion Cassini space mission to Saturn was still due to take off on October 15, amid a rising clamour of protest over Cassini's load of \Jplutonium\j in the radioisotope generators designed to power the mission into the next millennium. Cassini first had to overcome \Jinsulation\j and cooling problems, discovered in early September, which caused the launch to be postponed from October 6, first to October 13 and then to October 15.
The problem was with the European Huygens probe, which is attached to the Cassini orbiter, but the whole \Jspacecraft\j had to go back to the Kennedy Space Center for checking, with only a narrow launch window that closes out on 4 November. Any launch after that date would use much more fuel, limiting what the \Jspacecraft\j could do when it got there.
The \Jplutonium\j at the centre of the fuss is contained within several layers of \Jinsulation\j which have been tested and re-tested using explosives. In no test has the \Jplutonium\j been exposed to the \Jatmosphere\j. It is also designed to break into chunks, rather than dissolve into dust, so it cannot be inhaled or carried on the wind, while the protests are based on a worst-case scenario in which all of the \Jplutonium\j is pulverised and then breathed in by humans.
Upon its arrival at Saturn in 2004, the \Jspacecraft\j will spend four years orbiting Saturn and many of its 18 known moons, providing a flood of new data on what many view as a miniature \J\Jsolar system\j\j. Professor Larry Esposito, chief scientist on the Ultraviolet Imaging Spectrograph, or UVIS, said it will be used to study the \Jatmosphere\j of Saturn, the surfaces and atmospheres of its moons and the structure and dynamics of the fabulous ring system.
#
"Science prize news",292,0,0,0
(Sep '97)
The 1997 Nobel Prize for \JPhysiology\j or Medicine has gone to Stanley Prusiner for his work on prions.
In other prize news, this year's Albert Lasker medical research awards, each worth $25 000, have been awarded to two scientists who have done pioneering work in \Jgenetics\j and to a physician who brought vitamin A therapy to children throughout \JAfrica\j and Asia.
The Basic Research Award went to Mark Ptashne, of the Memorial Sloan-Kettering Cancer Research Center in New York City, for his work on the molecular basis of gene regulation, the process that turns genes on and off. Thirty years ago Ptashne isolated the lambda repressor, a protein that binds to a specific DNA location and turns off expression of certain genes of a virus that grows inside \Jbacteria\j. The lambda repressor and its associated proteins became one of the best understood systems of gene regulation and helped Ptashne and other researchers understand the process in many higher organisms.
Victor McKusick, of Johns Hopkins University, received a Special Achievement Award for advancing the study of the genetic basis of disease, work that led to the Human \JGenome\j Project, while the Clinical Medical Research Award went to Alfred Sommer, also of Johns Hopkins, for his work on vitamin A therapy for children in the developing world. Although vitamin A deficiency was already known to cause \Jblindness\j in many developing countries, Sommer showed in 1983 that it also dramatically increased child mortality from other diseases. This finding was applied to aid programs sponsored by the World Bank, which believes that vitamin A supplements for children constitute one of the most cost-effective treatments in medicine.
#
"Nobel Prize in Physiology or Medicine (1997)",293,0,0,0
The Nobel Assembly at the Karolinska Institute has awarded this prize for 1997 to Stanley B. Prusiner for his discovery of prions, a new biological principle of infection, believed to cause the transmissible spongiform encephalopathies (TSEs). These are diseases like scrapie in sheep, "mad cow disease" (BSE or bovine spongiform encephalopathy) and Creutzfeldt-Jakob disease (CJD) in humans. The award represents the first time since 1987 that this prize has been given entirely to a single person. All the other recent prizes have been shared by two or three researchers.
Most infectious diseases are caused by organisms which contain nucleic acids. Even viral diseases (where we can split hairs about whether viruses are really organisms or not) involve viral nucleic acid, either DNA or RNA. Prion disease is different: it appears to be caused by pieces of protein which "flop" into a different shape, and somehow act as a template, causing more of the protein to fall into the different shape.
This is important, because the chemical properties of any molecule depend on, among other things, the shape of the molecule, and the way charge is distributed across it. When the protein "flops", it has a new shape, and the charge distribution is different. According to Prusiner, this is how these diseases are able to be transmitted.
Prusiner himself created the name "prion", an \Jacronym\j derived from "proteinaceous infectious particle", in 1982. But while there is now widespread support for the prion theory, and many scientists are delighted for Prusiner, others say the award may be a little hasty, since prions as the cause of disease remain a theory, not a fact. Although normal prion protein molecules can be distorted into the disease shape in a test tube, simply by adding distorted protein from the brains of people who died of TSEs, this converted protein is not itself infectious.
For a century, disease causes have been identified in accordance with Koch's postulates, a set of rules assembled by Robert Koch. These state that:
1. The "causing" organism must always be found in diseased animals, but never in healthy animals;
2. The organism must be cultured in a pure culture away from the animal body;
3. When this culture is inoculated into a susceptible host, the characteristic symptoms of the disease should appear; and
4. The organisms isolated and recultured from the experimental animals should be the same organism as those originally cultured.
Obviously Koch's postulates were written to fit the standard "germ cause of disease" paradigm or model of the late 19th century, but they can be applied fairly well to the prion hypothesis with just a few changes. The sticking point is postulate 4, where the final protein, if it is really the same, should still be able to cause the disease. It can't, so for the moment, there is something missing, and no one knows what the missing factor is. The \Icomplete\i cause of TSEs has not yet been identified, and this is the justification for the critics to say that Prusiner should not get the award just yet.
Other scientists argue that if you wait for something like this to be finally sorted out, it may never be the subject of a Nobel prize, as these awards are never made posthumously. The missing link, suggests Prusiner, may be a second protein which "chaperones" the conversion of prion molecules from the healthy state to the diseased state. Others want to assume that the "chaperone" is a virus or other traditional disease-causing particle.
The problem began with the mystery of scrapie, a disease in sheep, first known from \JIceland\j in the 18th century, and transferred to Scotland in the 1940s. As early as 1967, Tikvah Alper, an English researcher, performed experiments in which she showed that there seemed to be no genetic material involved in disease transmission. In 1968, John Griffith suggested that TSEs could, in theory, be caused by a single protein, but the idea more or less died at that point. Alper was a radiation biologist and Griffith was a physicist, and their evidence came from the fact that brain tissue remained infectious even after Alper had subjected it to radiation that would destroy any DNA or RNA.
In hindsight, Griffith's insight was a brilliant one: he suggested that perhaps a protein, which would usually prefer one folding pattern, could somehow misfold and then catalyse other proteins to do likewise, but as a physicist, attacking the whole basis of molecular \Jbiology\j, he had little chance of being taken seriously, so scrapie became an interesting minor footnote in \Jbiology\j books.
As the 1970s progressed, Prusiner became interested in the problem when one of his patients died of CJD. As often happens when there is a new idea, the opposition was strong, if only because Prusiner's ideas went against the prevailing paradigm. The accepted model was based on the central dogma of molecular biologists that all self-replicating life-forms had to contain DNA (or at least RNA) in order to reproduce.
Like any perceived "maverick" in science, Prusiner had to argue long and hard, and was often accused of dogmatism himself, so it is little wonder that he commented after the announcement of the prize that he felt vindicated. The critics, though, are still there, and they still argue that vindication is not proof.
Let us review the evidence for Prusiner's ideas. We know the prion protein gene is found in all mammals, and that the "scrapie" form of the protein is more stable than the normal form, remaining stable even when attacked by chemicals or subjected to high temperatures. This suggests that the dangerous form slowly accumulates until damage begins to happen.
This long incubation period makes it hard to collect pure samples of the prion protein, as it took mice some 200 days to develop useful quantities of the protein, a figure that was cut when it was shown that scrapie prion can be grown much faster in hamsters.
We know the normal prion protein is an ordinary component of white blood cells (lymphocytes) and that it is found in many other tissues as well. Prusiner has even shown that the hereditary forms of prion diseases like CJD are caused by mutations in the prion gene. Transgenic mice carrying the mutated form of the gene were shown to develop a scrapie-like disease. More importantly, mice which lack the prion gene altogether (and which thus lack the protein) are immune to the scrapie-like disease, even when they are treated with disease-causing prion protein. Curiously, the mice lacking the prion gene are apparently healthy, suggesting that the normal prion molecule is not an essential protein in mice, making its normal biological role something of a mystery.
Mink, cats, deer and moose are all affected by similar diseases. "Mad cow disease" is now probably the most widely known prion disease, but kuru, the "laughing sickness" of the Fore (pronounced "4ray") people of New Guinea, was also caused by a prion, though at the time when Carleton Gajdusek received the 1976 Nobel Prize in \JPhysiology\j or Medicine, it was thought to be a "slow virus".
Another prion-like disease is Gerstmann-Sträussler-Scheinker (GSS) disease, a hereditary \Jdementia\j resulting from a \Jmutation\j in the gene encoding the human prion protein. Approximately fifty families with GSS mutations have been identified. The illness takes from two to six years to kill its victims after the first symptoms appear. Fatal Familial \JInsomnia\j (FFI) is due to another \Jmutation\j in the gene encoding the human prion protein. Nine families have been found that carry the FFI \Jmutation\j. FFI takes about one year from first symptoms to death, as does CJD.
Creutzfeldt-Jakob Disease affects about one person in a million. In 85% to 90% of cases, CJD occurs spontaneously. The remainder are mainly caused by mutations in the prion protein gene, with a few rare cases caused by infection, like the infections transmitted through growth hormone preparations prepared from the pituitary \Jgland\j of infected individuals, or through brain membrane transplants. About 100 families are known carriers of CJD mutations.
Recently, a new form of CJD has arisen. Here, the likely cause is thought to be related to the transfer of the BSE prions to humans. Since 1995 about twenty British patients have been identified who exhibit CJD-like symptoms, attributed to "vCJD" (variant CJD).
Within days of the Nobel committee's announcement, a report appeared in \IScience\i, indicating that vCJD and BSE in mice appear to be one and the same, and distinguishable from normal forms of CJD. This appears to confirm that the vCJD cases were caused by eating infected beef. While vCJD has similar symptoms to classic CJD, vCJD tends to strike younger people, and it develops much more quickly than the classic form.
Moira Bruce of the Institute of Animal Health in Edinburgh, Scotland, and her colleagues, injected mice with infectious brain samples from cows with BSE, from patients who died of vCJD, and with equally infectious samples from classic CJD patients. A paper in \INature\i at almost the same time came from a different direction, showing that BSE prions can turn normal human prions infectious in mice.
All in all, it does not seem fair to claim that Prusiner's prize came too early. But if it \Iwas\i too early, it was only by a matter of days. And Ralf Pettersson, deputy chair of the Nobel Committee at the Karolinska Institute, says the panel was not bothered by the unanswered questions. He says the prize was awarded for the discovery of the prion and its role in the disease process. "The committee is well aware of where the field stands," said Pettersson. "The details have to be solved in the future."
#
"Nobel Prize in Chemistry (1997)",294,0,0,0
The Royal Swedish Academy of Sciences has decided to award the 1997 Nobel Prize in Chemistry with one half to Professor Paul D. Boyer, University of \JCalifornia\j, Los Angeles, USA, and Dr. John E. Walker, Medical Research Council Laboratory of Molecular \JBiology\j, Cambridge, United Kingdom for working out the enzymatic mechanism underlying the synthesis of \Jadenosine triphosphate (ATP)\j in the cell, the other half going to Professor Jens C. Skou, Aarhus University, Denmark for the first discovery of an ion-transporting \Jenzyme\j, Na+, K+-ATPase.
ATP functions as a carrier of \Jenergy\j in all living organisms from \Jbacteria\j and fungi to plants and animals, including humans. ATP is produced mainly by oxidative phosphorylation, which makes the synthesis of the molecule from adenosine diphosphate (ADP) and inorganic phosphate an important study. Skou's discovery of another \Jenzyme\j, sodium, \Jpotassium\j-ATPase, gives us vital information about how an organism maintains the balance of sodium and \Jpotassium\j ions in the living cell.
ATP captures the chemical \Jenergy\j released by the \Jrespiration\j ("controlled combustion") of \Jnutrients\j and transfers that \Jenergy\j to the reactions that require it. This involves ATP in processes as different as the building up of cell components, muscle contraction and the transmission of nerve messages, to name just a few.
Considerable quantities of ATP are formed and consumed in an organism. At rest, an adult human converts a quantity of ATP corresponding to about one half of his or her body-weight every \Jday\j, and during hard work the quantity can rise to almost a tonne per \Jday\j. This is possible because the products of ATP use, adenosine diphosphate (ADP) and inorganic phosphate, are the raw materials of ATP synthesis. Most of the ATP synthesis is carried out by the \Jenzyme\j ATP synthase, while at rest, Na+, K+-ATPase uses up a third of all ATP formed, making it the body's biggest ATP user.
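Those turnover figures imply furious recycling, which a little arithmetic makes vivid. In the Python sketch below, the 70 kg body weight and the 100 g ATP pool are our own illustrative assumptions (published estimates of the pool vary widely); only the "half of body weight per day" figure comes from the text above.

BODY_WEIGHT_KG = 70.0                      # assumed adult body weight
ATP_TURNED_OVER_KG = BODY_WEIGHT_KG / 2    # "half of body weight per day" at rest
ATP_POOL_KG = 0.1                          # assumed ~100 g of ATP in the body at once

# Each ATP/ADP molecule must be recycled this many times a day at rest.
recycles_per_day = ATP_TURNED_OVER_KG / ATP_POOL_KG
print(recycles_per_day)  # ~350 cycles per day under these assumptions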
ATP was discovered by the German chemist Karl Lohmann in 1929. Around 1939-41, Fritz Lipmann (Nobel laureate 1953) showed that ATP is the universal carrier of chemical \Jenergy\j in the cell and coined the expression "\Jenergy\j-rich phosphate bonds". In 1948, Alexander Todd (Nobel laureate, 1957) chemically synthesised ATP.
By the 1970s, researchers had discovered that ATP synthase is made up of three connected protein assemblies: a wheel-like structure in the inner mitochondrial membrane, a rod with one end fixed to the wheel's hub, and a large cylinder that wraps around the other end of the rod. They already knew that ATP is created at a trio of sites on the cylinder, and that the rod played a key role in turning on the catalytic activity at these sites. But again, the exact mechanism was unclear.
Boyer theorised that the whole system might spin as protons pass through the mitochondrial membrane. This rotation slightly alters the structure of three active sites within the \Jenzyme\j, causing each in turn to catalyse the binding together of building blocks that make up ATP, synthesise a molecule of ATP, and release it. In 1994, Walker and his colleagues at the Medical Research Council Laboratory of Molecular \JBiology\j in Cambridge, United Kingdom, verified Boyer's theory by using x-rays to create an atomic-scale map of the catalytic portion of the \Jenzyme\j.
Skou approached the fuel cycle from the other end. In 1957, he discovered the first \Jenzyme\j that burns up ATP to transport ions across cell membranes. Skou's \Jenzyme\j, known as sodium, \Jpotassium\j-ATPase, uses ATP to \Jpump\j sodium and \Jpotassium\j ions across cell membranes, an activity essential for nerve signal firing as well as a host of other cell functions. Hundreds of related enzymes have since been discovered which have similar functions.
As far back as the 1920s, the ion composition within living cells was known to be different from that in the surroundings. Inside the cells the sodium concentration is lower and the \Jpotassium\j concentration higher than in the liquid outside, so there had to be some sort of one-way \Jpump\j moving the ions. Englishmen Richard Keynes and Alan Hodgkin (Nobel laureate 1963) established at the start of the 1950s that when a nerve is stimulated sodium ions pour into the nerve cell.
The difference in concentration is restored by sodium being transported out once more. It seemed likely that this transport required ATP, since the transport could be inhibited in living cells by inhibiting the formation of ATP. Starting from this point, Jens Skou searched for an ATP-degrading \Jenzyme\j in the nerve membrane that could be associated with ion transport. In 1957 he published the first article on an ATPase, which was activated by sodium and \Jpotassium\j ions (Na+, K+-ATPase).
(In standard biochemist-speak, the -ase ending added onto a chemical name means an \Jenzyme\j which breaks down that chemical, so RNase breaks down RNA, while a protease is an \Jenzyme\j that breaks down protein.)
Jens Skou worked mainly with finely ground crab nerve membranes. The ATP-degrading \Jenzyme\j found in the preparation required the presence of \Jmagnesium\j ions and was stimulated with increasing quantities of sodium ions up to a certain limit. Above this limit, Skou was able to obtain further stimulation if he added small quantities of \Jpotassium\j ions.
Significantly, he observed the largest stimulation at the concentrations of sodium and \Jpotassium\j that normally occur in the nerve, suggesting that the \Jenzyme\j was coupled to the ion \Jpump\j in some way. In his further studies of the \Jenzyme\j mechanism, Skou showed that sodium ions and \Jpotassium\j ions bind with high affinity to different places in the \Jenzyme\j. He also showed that the phosphate group separated from ATP binds to ATPase.
This is described as a phosphorylation of the \Jenzyme\j, and we now know that the \Jenzyme\j is dependent on sodium ions when it is phosphorylated and on \Jpotassium\j ions when it is dephosphorylated. Substances known to inhibit sodium/\Jpotassium\j transport are certain \Jdigitalis\j \Jalkaloids\j such as ouabain, and Skou showed that ouabain interferes with the \Jenzyme\j's activation by sodium.
Skou was thus the first to describe an \Jenzyme\j that can promote directed (vectored) transport of substances through a cell membrane, a fundamental property of all living cells. Many other enzymes have since been demonstrated to have essentially similar functions, but Skou was the first, and so well deserving of his share in the prize.
#
"Nobel Prize in Physics (1997)",295,0,0,0
The Royal Swedish Academy of Sciences has decided to award the 1997 Nobel Prize in Physics jointly to Professor Steven Chu, from the USA, Professor Claude Cohen-Tannoudji, from \JFrance\j, and Dr. William D. Phillips, also from the USA, for developing methods used to cool and trap atoms with \Jlaser\j light.
Chu was born in 1948 in St. Louis, Missouri, USA, and has been the Theodore and Frances Geballe Professor of Humanities and Sciences at Stanford University since 1990. He was awarded the 1993 King Faisal International Prize for Science (Physics) for development of the technique of \Jlaser\j-cooling and trapping atoms.
Cohen-Tannoudji was born in 1933 in Constantine, \JAlgeria\j, is now a French citizen, and has been a Professor at the Collège de \JFrance\j since 1973. Cohen-Tannoudji was awarded the 1996 Quantum \JElectronics\j Prize (European Physical Society) for, among other things, his pioneering experiments on \Jlaser\j cooling and the trapping of atoms.
Phillips was born 1948 in Wilkes-Barre, \JPennsylvania\j, USA. He was awarded the 1996 Albert A. Michelson Medal (Franklin Institute) for his experimental demonstrations of \Jlaser\j cooling and atom trapping. He works at the National Institute of Standards and Technology, Gaithersburg, Maryland, USA.
Hailed as master jugglers and as "masters of manipulating atoms with light", able to "make atoms float in optical molasses", their achievement has been to use \Jlaser\j beams to slow atoms down by bombarding them with \Jlaser\j light, until their \Jtemperature\j is just millionths of a degree above \J\Jabsolute zero\j\j. At the temperatures we live at, gas atoms and molecules rush around in different directions at a speed of about 4000 km/hr. While it may be considered unsporting to shoot a sitting bird, physicists find it far easier to study a sitting atom. The problem was: how do we absorb all that extra \Jenergy\j, and slow the atoms down?
In a real sense, \Jtemperature\j is a measure of speed, so a cooled atom moves more slowly, but even temperatures as low as -270°C involve speeds of about 400 km/hr. Only as the \Jtemperature\j approaches \J\Jabsolute zero\j\j (-273°C) does the speed fall greatly. When the \Jtemperature\j is one-millionth of a degree from this point (1 µK, or 1 microkelvin), free \Jhydrogen\j atoms move at speeds of less than 1 km/hr (about 25 cm/s).
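The link between temperature and speed comes from the standard kinetic theory relation v = sqrt(3kT/m), the root-mean-square speed. Here is a minimal sketch of ours for a hydrogen atom; being the lightest atom, hydrogen comes out faster than the loose averages quoted above, and the precise figure also depends on which average speed is meant.

  import math

  K_B = 1.380649e-23   # Boltzmann constant, J/K
  M_H = 1.674e-27      # mass of a hydrogen atom, kg

  def v_rms(temp_k, mass_kg):
      """Root-mean-square speed from kinetic theory: v = sqrt(3kT/m)."""
      return math.sqrt(3 * K_B * temp_k / mass_kg)

  for temp in (300, 3, 1e-6):   # room temperature, -270 degrees C, 1 microkelvin
      v = v_rms(temp, M_H)
      print(f"T = {temp} K: {v:.3f} m/s ({v * 3.6:.2f} km/hr)")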
The other problem is that gases turn into liquids and solids at very low temperatures, making it harder to pick out individual atoms. The \Jhydrogen\j atoms in normal gas samples are no longer free, so the problem becomes one of cooling very thin gas in a high vacuum to a very low \Jtemperature\j. In a high vacuum, condensation and freezing effects can be minimised, and we have a mass of slowly-moving gaseous atoms.
Chu, Cohen-Tannoudji, and Phillips developed methods of using \Jlaser\j light to cool gases to the µK \Jtemperature\j range and found ways of keeping the chilled atoms floating or captured in different kinds of "atom traps". The \Jlaser\j light functions as a thick liquid ("optical molasses"), in which the atoms are slowed down. Individual atoms can be studied there with very great accuracy and their inner structure can be determined. As more and more atoms are captured in the same volume, a thin gas forms, and its properties can be studied in detail.
As a result of their work, we now know a great deal more about the interactions between radiation and matter. In particular, their work has opened the way to a deeper understanding of the quantum-physical behaviour of gases at low temperatures, where most of our intuitive ideas about the "real world" break down.
In our everyday world, the one that our intuition understands, light is light and matter is matter. Yet inside the world of quantum physics, light may be described as a stream of particles that we give the name \Jphoton\j. These photons have no mass in the normal sense but, just like a rolling billiard ball, they have a certain momentum. A flying stone that collides with an identical rock can transfer all its momentum (mass times velocity) to that second rock, and itself become stationary. In the same way, a \Jphoton\j that collides with an atom can transfer all its momentum to that atom.
This will only happen if the \Jphoton\j has the right \Jenergy\j level, which is the same as saying the light must have the right frequency, or colour. This is because the \Jenergy\j of the \Jphoton\j is proportional to the frequency of the light, which in turn determines the \Jphoton\j's colour, a matter that was first worked out by Max Planck.
Because of this relationship, red light is made up of photons with lower \Jenergy\j than those of blue light. To move beyond this point, though, we need to consider the Doppler effect.
Aside from giving a train whistle a higher pitch when the train is approaching than when it is standing still, the Doppler effect also changes the frequency of light shining on an atom, depending on the way the atom is moving. If the atom is moving towards the light, the light must have a lower frequency than that required for a \Jphoton\j approaching a stationary atom, if it is to affect that atom.
Imagine an atom, moving fairly fast, coming into collision with a collection of photons. If the photons have the right \Jenergy\j the atom will be able to absorb one of them and take over its \Jenergy\j and its momentum. The atom will then be slowed down somewhat, but within a short period of time, around a hundred-millionth of a second, the slowed-down atom emits a \Jphoton\j in a random direction, giving the atom a certain small recoil velocity.
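The size of that recoil is easy to estimate. A photon of wavelength lambda carries momentum h/lambda, so an atom of mass m that emits (or absorbs) one changes speed by h/(lambda x m). A sketch with round values, for the sodium atoms used in these experiments:

  H = 6.62607015e-34   # Planck constant, J s
  WAVELENGTH = 589e-9  # sodium D line, m
  M_NA = 3.82e-26      # mass of a sodium atom (23 u), kg

  # change in the atom's speed from a single photon: momentum h/lambda over mass
  recoil_velocity = H / (WAVELENGTH * M_NA)
  print(f"single-photon recoil: about {recoil_velocity * 100:.0f} cm/s")  # ~3 cm/s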
Because the directions of \Jphoton\j emission are random, while the absorption is all from one direction, the overall effect is to slow the atom down. With just the right arrangement of the \Jlaser\j beam, the slowing effect is comparable to the gravity of a \Jplanet\j with a hundred thousand times the mass of the \JEarth\j. This will only happen if the \Jlaser\j frequency is matched to the atoms it is blasting.
Around 1985, Chu and his colleagues set up three opposing \Jlaser\j beam pairs at right angles to each other at the Bell Laboratories in Holmdel, \JNew Jersey\j. Sodium atoms from a beam in vacuum were first stopped by an opposed \Jlaser\j beam and then conducted to the intersection of the six cooling \Jlaser\j beams.
The light in all six of the \Jlaser\j beams was slightly red-shifted compared with the characteristic colour absorbed by a stationary sodium atom, meaning that the photons would interact with atoms coming towards them, but "ignore" atoms going away from them. So whichever direction the sodium atoms tried to move, they were met by photons of just the right \Jenergy\j and pushed back into the area where the six \Jlaser\j beams intersected.
This led to something that observers said looked like a glowing \Jcloud\j the size of a pea, consisting of about a million chilled atoms. This is called Doppler cooling, and the sodium atoms were found to have a \Jtemperature\j of about 240 µK. This is equivalent to a sodium atom speed of about 30 cm/s, and agreed very well with a theoretically calculated \Jtemperature\j, the Doppler limit, then considered the lowest \Jtemperature\j that could be reached with Doppler cooling.
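The Doppler limit comes from a standard formula, T = hbar x Gamma / 2kB, where Gamma is the natural linewidth of the atomic transition used for cooling. Plugging in the sodium linewidth (about 2 pi x 10 MHz, our round value rather than a figure from the original papers) reproduces the number quoted above:

  import math

  HBAR = 1.054571817e-34   # reduced Planck constant, J s
  K_B = 1.380649e-23       # Boltzmann constant, J/K
  GAMMA = 2 * math.pi * 9.8e6   # natural linewidth of the sodium D line, rad/s

  t_doppler = HBAR * GAMMA / (2 * K_B)   # Doppler limit, T = hbar*Gamma / 2kB
  print(f"Doppler limit for sodium: about {t_doppler * 1e6:.0f} microkelvin")  # ~235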
At this point, the atoms are cooled but not captured: they still fall under the influence of gravity. Here we need to turn to the work carried out by William D. Phillips and his co-workers, who by the beginning of the 1980s were using magnetic traps to slow down and completely stop atoms in slow atomic beams.
They used a "Zeeman slower", a coil with a varying magnetic field that affects the atoms' characteristic \Jenergy\j levels by the Zeeman effect, so that they can be slowed still further by a \Jlaser\j beam. This sort of trap is relatively weak, and though Phillips had "stopped" sodium atoms by 1985, the atoms were hard to retain. Once Chu managed to cool atoms in optical molasses, Phillips designed a similar experiment and started a systematic study of the \Jtemperature\j of the atoms in the molasses. He developed several new methods of measuring the \Jtemperature\j, including one in which the atoms are allowed to fall under the influence of gravity, the curve of their fall being determined with the help of a measuring \Jlaser\j.
By 1987, the researchers had constructed a magneto-optical trap (MOT). This uses six \Jlaser\j beams in the same sort of array as in the first experiment, but it also has two magnetic coils that give a slightly varying magnetic field with a minimum in the area where the beams intersect. A Zeeman force develops which is greater than gravity and which therefore draws the atoms in to the middle of the trap. The atoms are now really caught, and can be studied or used for experiments.
In 1988, Phillips found that a \Jtemperature\j as low as 40 µK could be attained, six times lower than the theoretically calculated Doppler limit. This would later be explained by the recognition that the Doppler limit calculations were based on an unrealistically simple model of the atom.
Meanwhile, Claude Cohen-Tannoudji and his co-workers at the École Normale Supérieure in Paris had already studied more complicated cooling schemes in theory. The recoil velocity an atom gains when it emits a single \Jphoton\j corresponds to a \Jtemperature\j termed the recoil limit, which depends on the atom's mass. For sodium atoms the recoil limit is 2.4 µK and for heavier caesium atoms, about 0.2 µK. Working with Cohen-Tannoudji and his colleagues, Phillips showed that caesium atoms could be cooled in optical molasses to about ten times the recoil limit, i.e. to about 2 µK. Later, they found that under the right conditions, atoms can be cooled to a \Jtemperature\j about five times higher than the recoil limit.
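The recoil limit follows directly from the single-photon recoil described earlier: it is the temperature at which an atom's thermal energy matches its recoil energy, T = (h/lambda)^2 / (m x kB). A quick check that this reproduces both figures in the text, using the usual cooling wavelengths (our assumption):

  H = 6.62607015e-34    # Planck constant, J s
  K_B = 1.380649e-23    # Boltzmann constant, J/K
  AMU = 1.66054e-27     # atomic mass unit, kg

  def recoil_limit_uk(wavelength_m, mass_amu):
      """Recoil-limit temperature T = (h/lambda)^2 / (m*kB), in microkelvin."""
      p = H / wavelength_m   # momentum of a single photon
      return p ** 2 / (mass_amu * AMU * K_B) * 1e6

  print(f"sodium (589 nm): {recoil_limit_uk(589e-9, 23):.1f} microkelvin")    # ~2.4
  print(f"caesium (852 nm): {recoil_limit_uk(852e-9, 133):.2f} microkelvin")  # ~0.20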
The next step was to make stationary atoms go into a "dark" state, where they are unaffected by photons. Between 1988 and 1995, Cohen-Tannoudji and his group developed a method, based on the Doppler effect, which converts the slowest atoms to a dark state. Working with \Jhelium\j atoms and two opposed \Jlaser\j beams, they were able to get as low as 0.25 µK, sixteen times lower than the recoil limit. Then, with six \Jlaser\j beams, they reached a state in which the whole velocity distribution corresponded to a \Jtemperature\j of 0.18 µK. Now the \Jhelium\j atoms were crawling along at a speed of only about 2 cm/s.
Since then, Chu has built an atomic fountain, in which \Jlaser\j-cooled atoms are sprayed up from a trap like jets of \Jwater\j, until the pull of gravity slows them. By illuminating the atoms at the top of their climb with \Jmicrowaves\j, it is possible to sense the atoms' inner structures, hopefully leading to atomic clocks with a hundredfold greater precision than at present. Other applications may include lasers made of coherent "waves" of atoms, and other uses have been identified in areas as far apart as \Jbiology\j, physics, and even sensing gravitational anomalies that can reveal underground oil fields.
In 1995, other researchers used the same methods to create a "Bose-Einstein condensate," in which the cold atoms' quantum-mechanical waves all overlapped to create a new state of matter.
#
"Economics Nobel Prize (1997)",296,0,0,0
Even the "dismal science" has its equivalent of a Nobel Prize these days. The Royal Swedish Academy of Sciences has decided to award the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 1997, to Professor Robert C. Merton, \JHarvard University\j, Cambridge, USA and Professor Myron S. Scholes, Stanford University, Stanford, USA for a new method to determine the value of derivatives. Merton was born in 1944, Scholes was born in 1941. Together with the late Fischer Black, they developed a pioneering formula for the valuation of stock options. Their method has paved the way for economic valuations in many areas and facilitated more efficient risk management in society.
While many see the futures markets as new places and ways for gamblers to lose money, these markets were created to assist producers and manufacturers to manage the risk of huge price shifts up or down. The derivative is a reflection of the market's judgement on future trends, and necessarily rises or falls faster than the commodity price from which it is derived. This is why it was possible for bad bets on derivatives to bring down Barings, Britain's oldest bank, and for equally bad bets to drive Orange County, \JCalifornia\j, into \Jbankruptcy\j.
Markets for options and other so-called derivatives have a legitimate use in daily business. They are important in the sense that agents who anticipate future revenues or payments can ensure a profit above a certain level or insure themselves against a loss beyond a certain level. (Futures carry an obligation to trade at a nominated price and are two-sided, but options allow for \Jhedging\j against one-sided risk: options give the right, but not the obligation, to buy or sell a certain security in the future at a prespecified price.)
Before you can efficiently manage risk, you need to be sure that such instruments are correctly valued, or priced. A new method to determine the value of derivatives is thus one of the most important contributions to economic sciences over the last 25 years. And that is where Merton, Scholes and Black come in, although Black is unable to share in the prize, as he died in mid-1995. In 1973, Black and Scholes published what we call the Black-Scholes formula (which, as Black made clear some years ago, had an input also from Merton). Robert Merton also devised another method to derive the formula that turned out to have very wide applicability; he also generalised the formula in many directions.
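For readers who want to see it, the standard call-option form of the formula prices a European call from five inputs: the share price, the exercise price, the time left to expiry, the risk-free interest rate, and the share's volatility. A minimal rendering in modern code follows; the example numbers at the end are purely illustrative.

  from math import log, sqrt, exp, erf

  def norm_cdf(x):
      """Cumulative distribution function of the standard normal."""
      return 0.5 * (1.0 + erf(x / sqrt(2.0)))

  def black_scholes_call(s, k, t, r, sigma):
      """Black-Scholes (1973) value of a European call option.
      s: share price, k: exercise price, t: years to expiry,
      r: risk-free interest rate, sigma: annualised volatility."""
      d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
      d2 = d1 - sigma * sqrt(t)
      return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

  # A share at $100, exercise price $105, six months to run,
  # 5% interest and 20% volatility (illustrative numbers only):
  print(f"call value: ${black_scholes_call(100, 105, 0.5, 0.05, 0.20):.2f}")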
The \JChicago\j Board Options Exchange introduced trade in options in April 1973, one month before publication of the original Black-Scholes option-pricing formula. By 1975, traders on the options exchange had begun to apply the formula, using especially programmed calculators, to price and protect their option positions. Nowadays, thousands of traders and investors use the formula every \Jday\j to value stock options in markets throughout the world, in a global trade that is believed to be worth some $70 trillion each year.
#
"Nobel Prize for Literature (1997)",297,0,0,0
This has been awarded to Dario Fo "who emulates the jesters of the \JMiddle Ages\j in scourging authority and upholding the dignity of the downtrodden". Fo, a dramatist and actor, was born at Lago Maggiore, and is 71. His education included studies at the Academy of Arts in Milan. He is married to the actress and writer Franca Rame.
#
"Nobel Peace Prize (1997)",298,0,0,0
The Norwegian Nobel Committee has decided to award the Nobel Peace Prize for 1997, in two equal parts, to the International Campaign to Ban Landmines (ICBL) and to the campaign's coordinator Jody Williams for their work for the banning and clearing of anti-personnel mines.
There are at present probably over one hundred million anti-personnel mines scattered over large areas on several continents. Such mines maim and kill indiscriminately and are a major threat to the civilian populations and to the social and economic development of the many countries affected.
#
"Gairdner Awards for 1997",299,0,0,0
Many Nobel Prize winners have earlier won Gairdner Awards, Canada's top medical prize, and this year's four winners, all US researchers, must be hoping for a future call from Scandinavia.
Three awards, valued at $25 000, were presented in Toronto on October 24. Alfred Knudson Jr. of the Fox Chase Cancer Center in Philadelphia won for his contributions to cancer \Jgenetics\j, which led to the idea of \Jtumour\j-suppressor genes; Erkki Ruoslahti of the Burnham Institute, La Jolla, \JCalifornia\j, and Richard Hynes of the \JMassachusetts\j Institute of Technology shared a prize for their research on cell adhesion; and Cory Goodman of the University of \JCalifornia\j, Berkeley, earned his award for contributions to developmental neurobiology.
Knudson found in the 1960s that children who inherit a \Jmutation\j in a particular gene have a higher risk than others of developing a cancer called retinoblastoma. Knudson reasoned that a random \Jmutation\j might knock out the remaining functional copy in any of the millions of dividing retinal cells, unleashing cancerous growth of the cell.
Ruoslahti and Hynes discovered and characterised fibronectins and integrins, molecules responsible for making cells stick to substrates and to each other. They also recognised that cells change in response to the surrounding matrix, a fact that has clinical importance in cancer, blood coagulation, and wound healing.
Goodman has helped to shed light on how the brain wires itself during development.
#
"Ig Nobel awards",300,0,0,0
(Oct '97)
Those who were missed by the Nobel juries, who failed to be considered for the Gairdners, can at least take comfort in the thought that they were not picked for an Ig Nobel award either. These spoof awards, issued each October, reflect the lighter-hearted side of science.
At the seventh "First" Annual Ig Nobel Prize Ceremony, ten individuals or research teams were honoured for achievements that "cannot or should not be reproduced." Unlike the Nobels, posthumous awards are available, and the \Jmeteorology\j prize went posthumously to Bernard Vonnegut of the State University of New York, Albany (novelist Kurt Vonnegut's older brother) who showed that you cannot assess wind speeds by firing a chicken into a \Jtornado\j with a cannon, as it is impossible to tell whether the chicken loses its feathers in being fired, or in the \Jtornado\j.
Vonnegut's tongue may have been in his cheek, but the medical award (for showing that the immune response is stimulated by Muzak) featured a Muzak employee and reputable workers, and appears to be a genuine piece of research.
Only one recipient turned up to receive his award, urban ecologist Mark Hostetler, who scraped dead insects off the windshields of \JGreyhound\j buses all the way from \JMassachusetts\j to British Columbia in researching his Ig Nobel-prizewinning book, \IThat Gunk on Your Car: A unique guide to insects of North America\i (Ten Speed Press, 1997), which has been garnering rave reviews. The book features colour illustrations (before and after windshield contact) as well as lots of genuine natural history. Speaking after receiving his award, Hostetler said triumphantly, "At least these insects did not die in vain."
#
"Former Nobel laureates call for greenhouse cuts",301,0,0,0
(Oct '97)
During late September, more than 1500 of the world's leading senior scientists, including the majority of Nobel laureates in science, signed a landmark consensus declaration urging leaders world-wide to act immediately to prevent the potentially devastating consequences of human-induced global warming. The "World Scientists' Call for Action at \JKyoto\j" was presented to the Clinton Administration at the end of September at a Science Summit on Climate Change in Washington, DC.
Nobel laureate Henry Kendall, Chairman of the Union of Concerned Scientists, and author of the scientists' statement, said it would be ". . . a grave error to believe that we can continue to procrastinate. Scientists do not believe this and no one else should either."
In December, world leaders will gather in \JKyoto\j, \JJapan\j, to negotiate final agreement on a treaty to reduce emissions of carbon dioxide and other heat-trapping gases that are altering the climate. In the run-up to that conference, political statements and political deals will become more common: as we have reported in previous months, the Australian government has been especially active in trying to avoid taking uniform action with the rest of the world. The Australian attack has been two-pronged: casting doubt on computer models that show climate problems from increased greenhouse emissions while accepting happily even the flimsiest economic computer model showing job losses if \JAustralia\j does not continue on its present course.
The scientists' call represents opinions from 63 countries around the world, and a majority of the world's Nobel winners in science (98 out of 171) signed the statement, which cautiously makes the following points:
• Global warming is under way and our overuse of \Jfossil\j fuels is partly to blame.
• Climate change is projected to raise sea levels; increase the likelihood of more intense rainfall, floods, and droughts; and endanger human health by greater exposure to heat waves and encroachment of tropical diseases to higher latitudes.
• Climate change is likely to exacerbate food shortages and spread undernutrition by adversely affecting \Jwater\j supplies, soil conditions, \Jtemperature\j tolerances, and growing seasons.
• Climate change will accelerate the appalling pace at which species are now disappearing, especially in vulnerable ecosystems. Possibly one-third of all species may be lost before the end of the next century.
• Continued destruction of forests will undermine the environment's natural ability to store carbon, thereby enhancing global warming.
#
"Emissions trading system proposed",302,0,0,0
(Oct '97)
On October 22, President Clinton tried to look on the bright side of global warming, presenting it as a "golden opportunity" for the United States rather than a costly catastrophe, as he outlined his administration's new "flexible, market-based" plan for curbing the emission of heat-trapping gases into the \Jatmosphere\j.
He proposes an international "emissions trading system" that would allow companies in developing countries to buy the right to pollute from companies with cleaner technologies. He also said the United States should provide emissions credits and tax cuts to industries that reduce emissions early, make binding pledges to reduce its own greenhouse gas emissions early in the next century, and give a US$5 billion boost over the next 5 years to research and development aimed at using \Jenergy\j more efficiently.
This is to be the basis of the plan the US will take to \JKyoto\j in the first two weeks of December, and it may be about right, as it was immediately attacked by both the American \JPetroleum\j Institute and the Sierra Club. Unlike the Australian government, which is leading an attack on cuts because it fears crimps on economic growth, Clinton pointed out that similar objections were raised against efforts to reduce \J\Jacid rain\j\j. In fact, he says, \J\Jacid rain\j\j controls are 40% ahead of schedule and 50% below projected costs.
In anticipation of the meeting in \JKyoto\j, the President's Committee of Advisors on Science and Technology has weighed in with a proposal that the US government spend $1.1 billion more on \Jenergy\j research to foster more efficient, and renewable, technologies. Yet if the truth be known, a lot could be done, simply by avoiding appliances that lie, says one American critic of the \Jelectronics\j industry.
No, this is not a reference to artificial intelligence, just a case of "off" switches that aren't. You may think you have turned off the TV, the microwave, or the \JCD\j player, but behind the scenes, that appliance is still drawing power to operate timers, memories, or remote control sensors that will bring the appliance back to life again. More and more, the rule seems to be: if it's plugged in, it's on.
A typical first world home uses about 50 watts to power devices that supposedly aren't on, which accounts for 5 percent of the total electricity use. Such unseen consumption, often referred to as leaking current, constitutes the electric analogue of heat seeping out of poorly insulated homes. In the United States, it adds up to more than $3 billion worth of electricity annually, the output of four large generating stations. And in the European Union, leaking household electricity equals the output of two additional large generating stations.
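The dollar figure is consistent with simple arithmetic, as the sketch below shows; the household count and electricity price are our round assumptions, not figures from the original report.

  standby_watts = 50.0        # leaking load per home, from the text
  hours_per_year = 24 * 365
  households = 100e6          # rough number of US households (assumed)
  price_per_kwh = 0.08        # average electricity price, $/kWh (assumed)

  kwh_per_home = standby_watts * hours_per_year / 1000.0   # ~438 kWh per year
  national_cost = kwh_per_home * price_per_kwh * households
  print(f"per home: {kwh_per_home:.0f} kWh/yr")
  print(f"nationally: about ${national_cost / 1e9:.1f} billion per year")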
The problem? Well, for most people, the power stations are a long way away, so electricity is seen as clean and non-polluting, but somewhere, every watt that we use has a carbon cost associated with it: even \J\Jalternative \Jenergy\j\j\j sources involve mining, transporting and smelting processes, so it would be worthwhile finding ways to reduce these costs. To supply the estimated amount of electricity demanded by leaking household appliances world-wide, power plants already spew some 18 million tons of carbon into the \Jatmosphere\j annually!
#
"A million solar roofs",303,0,0,0
(Oct '97)
Meanwhile, it may only be a start, but October also saw President Clinton announce a plan to install solar panels on a million roofs across the USA by the year 2010.
As is usual with all such claims, the solar roofs have been hailed as clean and \Jenergy\j-saving. In fact the industry tends to be a fairly dirty one in terms of polluting by-products, but the US government hopes to get this under control. Equally, large-scale production should bring in economies of scale which will reduce the \Jenergy\j costs involved in creating each solar roof. Then as long as the \Jweather\j stays fine, everything should work rather well . . .
#
"El Nino effects--not an ill wind after all?",304,0,0,0
(Oct '97)
Some 70% of the world's annual \Jtuna\j harvest, around 3.2 million tonnes, comes from the \JPacific Ocean\j. Skipjack \Jtuna\j (\IKatsuwonus pelamis\i) dominate the catch. Although the skipjack are distributed in the surface mixed layer throughout the equatorial and subtropical Pacific, catches are highest in the western equatorial Pacific warm pool, a region of the ocean with low primary productivity rates that has the warmest surface waters of all the world's oceans.
Assessments of \Jtuna\j stocks indicate that western Pacific skipjack catches approaching one million tonnes annually are sustainable, but how can this be done efficiently? A report in \INature\i during October shows that the skipjack population depends on the warm pool, which is a fundamental factor in the El Niño Southern \JOscillation\j (ENSO). The authors show that apparent shifts in the skipjack population distribution are linked to large zonal displacements of the warm pool that occur during ENSO events. This relationship can be used, they suggest, to predict (several months in advance) the region of highest skipjack abundance, within a fishing ground extending over 6000 km along the Equator.
Meanwhile, another report in \IScience\i journal during October suggested that an El Niño event or any other warm period may help temporarily slow the continual rise in atmospheric carbon dioxide due to human activity. The mechanism behind this braking appears to be a delayed burst in plant growth world-wide that appears to sop up excess levels of one of the greenhouse gases, carbon dioxide.
El Niño events and volcanic eruptions may send up global temperatures within just a few months, or depress them just as fast, but the impact on the \Jbiosphere\j seems to take from one to three years to appear. A global warm spell may lead to an initial surge in plant growth in polar and temperate areas. At the same time, heat-stressed tropical and semiarid regions may show an initial drop in plant production. Later, plant growth leaps ahead in the regions closer to the equator, mopping up atmospheric carbon dioxide.
According to David Schimel, from the US National Center for Atmospheric Research (NCAR) in Boulder, \JColorado\j, these results highlight the usefulness of computer models that connect the \Jatmosphere\j and \Jbiosphere\j. "We were looking specifically for delayed \Jecosystem\j responses in this study because they had been predicted by the models," Schimel said.
The observed patterns of warming correlated globally with carbon dioxide levels and regionally with vegetation growth. Global carbon dioxide levels, which are steadily rising due to human activities, tended to rise more quickly over the first few months after a global \Jtemperature\j peak. The carbon dioxide levels rose at a slower pace during the one-to-three-year period after the \Jtemperature\j peak, followed by another gradual acceleration.
\BEven on the Great Lakes\b
As time passes, the world-wide effects of El Niño are seen to spread further afield. Scientists at the University of \JMichigan\j now believe that the phenomenon affects the centre of North America as well, with peaks in the cycle matching surges in storm strength, \Jwater\j levels and destruction on the shores of the Great Lakes.
Don't take to the hills, though: the US Geological Survey has been looking at the relationships between El Niño-enhanced rainfall and landslides, according to reports appearing during October.
#
"New climate cycle",305,0,0,0
(Oct '97)
And now we have a new climate cycle. It looks like El Niño, it feels like El Niño, and if you are watching fish stocks, reservoir levels or farm production, you would say it is El Niño. But it isn't.
That, at least, is what University of Washington researchers are saying about the Pacific decadal \Joscillation\j, or PDO. Right now, it is positive, and this helps to explain why US West Coast ocean temperatures have been warmer than average, why winters have been wetter than usual in the South, and why \JAlaska\j salmon harvests have been at historic highs, while there have been record declines in salmon catches along the West Coast.
According to one of the researchers, El Niño is a part of the PDO, but they are some years away from fully understanding the phenomenon. According to Nate \JMantua\j, a UW research associate, scientists probably will not have the ability to begin making accurate forecasts for at least another five years. A PDO prediction system, he says, would allow long-term planning in such areas as fisheries, \Jwater\j supplies, agriculture and \Jenergy\j production. "The science right now is more like our understanding of El Niño 15 to 20 years ago," says \JMantua\j.
The PDO effect was identified by combing a century of records, looking for patterns, and at first, El Niño emerged as the dominant recurring pattern of year-to-year climate variability on the \Jplanet\j for recent times. But when records were studied back to 1900, with the focus on the region north of Hawaii in the Pacific basin, the PDO revealed itself with positive and negative phases lasting from 10 to 30 years.
One frustrating aspect of attempting to forecast the PDO is that it develops over such a long period that a negative or a positive phase can have passed before researchers even discover it. "We can recognise the phenomenon, but we can't say what phase we're in at the time," says David Battisti, University of Washington atmospheric sciences associate professor, who was the first to show why El Niño recurs on an average of every four years. "But that's only because we don't yet fully understand it. After all, it has only been in recent years that we've recognised it even exists."
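To see how a slow cycle can hide in a century of noisy yearly records, consider the toy analysis below. It is purely illustrative, using synthetic numbers rather than the real North Pacific sea-surface temperatures: a simple running mean is enough to make a buried 25-year cycle stand out from the year-to-year scatter.

  import math, random

  random.seed(1)
  years = list(range(1900, 2000))
  # synthetic "SST anomaly": a slow 25-year cycle buried in yearly noise
  anomaly = [math.sin(2 * math.pi * (y - 1900) / 25) + random.gauss(0, 1.0)
             for y in years]

  def running_mean(series, window=11):
      """Centred running mean, which smooths away year-to-year variability."""
      half = window // 2
      return [sum(series[i - half:i + half + 1]) / window
              for i in range(half, len(series) - half)]

  smooth = running_mean(anomaly)
  # the positive and negative decadal phases now show up as long runs
  print("".join("+" if v > 0 else "-" for v in smooth))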
\BBut is it real?\b
With some scientists still doubting the reality of global warming, the \Jpermafrost\j of the Alaskan interior is melting, causing large bills for road repairs. The \JArctic\j generally has warmed by about one \JCelsius\j degree over the past thirty years, but this has been uneven, and \JAlaska\j is now about three degrees warmer than thirty years ago. Perhaps this can be related to the effects of the PDO?
#
"Antibiotic resistance in bacteria",306,0,0,0
(Oct '97)
This is an emerging issue which is starting to attract public attention. It has almost become a regular item in the Updates section. This month, we will look at the question in more detail, but without covering issues already raised in Antibiotic-Resistant Bug Found (April), Resistant \JAntibiotics\j (July) and Bacterial Resistance (August).
The United States has recently approved the use of fluoroquinolone \Jantibiotics\j in animals intended for food. This means there will be a need for continued surveillance for quinolone-resistant \ISalmonella\i, says a report in the October issue of \IEmerging Infectious Diseases\i.
#
"TB from a contaminated endoscope?",307,0,0,0
(Oct '97)
With resistance in tuberculosis \Jbacteria\j on the increase, a worrying new infectious pathway may just have been identified. A contaminated endoscope may have transmitted tuberculosis between two \Jhospital\j patients, according to a report in the October 1, 1997 issue of the \IJournal of the American Medical Association\i (JAMA). Researchers found identical DNA "fingerprints" in bacterial cultures isolated from two TB patients.
Since the cases were detected six months apart, no link between the two people was suspected until DNA fingerprinting revealed a perfect match. The only identifiable link between the two patients was the \Jhospital\j where they both had been bronchoscoped. While the suspicion remains no more than a suspicion, they were definitely bronchoscoped with the same instrument in the same room, and no other bronchoscopies were done between the two patients.
#
"Antibiotic resistance is permanent",308,0,0,0
(Oct '97)
Medical logic says that antibiotic-resistant \Jbacteria\j are committing resources to resistance, and that in the absence of \Jantibiotics\j, these \Jbacteria\j will lose out in competition with normal \Jbacteria\j. In this case, the logic appears to be a little too hopeful.
A random survey last year of \IEscherichia coli\i \Jbacteria\j collected from a \Jday\j-care centre in \JAtlanta\j has indicated that these microbes will remain resistant long after doctors stop prescribing the \Jantibiotics\j which drove them to develop resistance. Experiments point to a disturbing hypothesis: a second \Jmutation\j may be blocking the resistant \Jbacteria\j from reverting to sensitivity.
Meanwhile a report in \INature\i warns that the widespread use of \Jantibiotics\j in the farming industry has selected for antibiotic-resistant \Jbacteria\j which are commonly found in food, such as soft cheeses and raw meats. Now a plasmid has been identified that may be responsible for antibiotic resistance in a strain of \ILactococcus\i bacterium present in cheese. The plasmid contains a collection of antibiotic resistance genes and genetic elements transferred from many species of \Jbacteria\j and has the potential to transfer this antibiotic survival kit to other species.
Now the trickle of warnings about resistance has started to flow out into the general public consciousness. An American study indicates that by the year 2000, half of the infections caused by the bacterium responsible for 7 million cases of middle ear infection (otitis media) in children and 500,000 cases of \Jpneumonia\j in children and adults each year in the US will have some resistance to \Jpenicillin\j.
The bacterium involved in middle ear infections, \IStreptococcus pneumoniae\i, is a "community bug". While it is also transmitted among patients in hospitals, it is transmitted more commonly outside of hospitals, particularly among children in daycare centres.
Aside from the medical risks of antibiotic resistance, there are straightforward financial costs involved. Two strains of \IStaphylococcus aureus\i, methicillin-resistant \IS. aureus\i (MRSA) and methicillin-susceptible \IS. aureus\i (MSSA), have been identified in a Duke University study. On average, MRSA causes twelve days more hospitalisation than MSSA infections, boosting the cost due to MRSA infections to $27,082, compared with $9,661 for MSSA.
The best data available at the moment seem to come from the United States, where more than 2 million Americans each year acquire nosocomial, or in-\Jhospital\j, infections, and between 60,000 and 80,000 die. The most common cause is \IStaphylococcus aureus\i or "golden staph".
#
"Engineered bacteria to fight tumours",309,0,0,0
(Oct '97)
The October issue of \ICancer Research\i indicates that an engineered \ISalmonella typhimurium\i strain can be used to stunt the growth of \Jmelanoma\j tumours in mice, without causing an infection. The only problem at the moment is that unmodified \ISalmonella\i kill the mice, so scientists are breeding weakened strains, lacking the gene for a powerful toxin called lipopolysaccharide, which can trigger an often fatal systemic infection called septic shock.
As far back as the 18th century, \Jgangrene\j was known to cure some cancers, and last century, William B. Coley of Harvard injected "Coley's mixed toxins", a brew of \ISerratia marcescens\i and other \Jbacteria\j, into cancers of the colon and uterus. It is likely that these toxins triggered the production of TNF (\JTumour\j Necrosis Factor), a cytokine which is currently of interest to researchers around the world.
He successfully treated cancers of the uterus and prostate, and some of Coley's patients survived another thirty years, while the treated mice in the modern study gained only about four weeks: the difference may have been that Coley kept on injecting his patients with \Jbacteria\j so as to keep them permanently at the feverish stage. Now today's researchers are looking at fitting the bacterium with extra cancer-fighting genes, so it will pack a real punch, but that is a side-issue to the continuing problems of controlling the \Jbacteria\j rather than having them control us.
Meanwhile, Israeli researcher Haya Lorberboum-Galski has been developing fusion proteins that can introduce a bacterial toxin into adenocarcinomas, the tumours found in cancers of the breast, colon, lung, \Jovary\j and prostate. The protein consists of a short chain of amino acids linked to a toxin from \IPseudomonas\i.
#
"Pathfinder and Sojourner lose contact",310,0,0,0
(Oct '97)
The end of October may (or may not) have seen the last of NASA's Pathfinder and its Sojourner exploring machine on Mars. The batteries appear to be dead, but the \Jday\j-time solar cells may also be out of commission, or the intense (minus 100 degrees \JCelsius\j) cold of the Martian night may have cracked some key chips. Sojourner and Pathfinder may just lie there until picked up by an expedition next century and hauled back to take pride of place in a space museum, or perhaps even made key exhibits in the first Museum of Mars.
Perhaps, just perhaps, Pathfinder may come back again, and transmit yet more data, but what has been sent already will keep scientists working happily for many years. Pathfinder was last heard from on October 7, although there have been almost daily attempts to reestablish the connection. Whatever happened, the nominal lifetime had been well and truly exceeded, but whenever a machine keeps going, scientists are never backward about finding more tasks for their servants.
\BSurface rocks suggest a wet early Mars\b
Close-up images sent back by Pathfinder in early October have provided independent evidence for a benign and possibly life-supporting early climate. The images appear to show cobbles, stones several centimetres across that may have been rounded in flowing \Jwater\j, along with conglomerates, a sedimentary rock made of rounded pebbles and sand, rushed along in a torrent.
#
"Getting to Mars more cheaply",311,0,0,0
(Oct '97)
A scheme has been proposed which would see 180 kg Mars probes being carried aloft as piggyback units during major launches of Ariane, the \JESA\j launch vehicle. Once beyond the \Jearth\j, the probes would go into a holding \Jorbit\j, ready to be launched when a launch window opens (this happens when Mars and \JEarth\j are in the right relative positions).
The probe would need 65 kg of fuel, allowing a payload of 115 kg for instrumentation, propulsion and communication equipment, and a \Jballoon\j system that would lower the probe gently through the thin Martian \Jatmosphere\j.
\B. . . and more quickly\b
If humans are ever to get to Mars, the trip will have to take less than 600 days, and scientists at the University of \JFlorida\j are working on a nuclear propulsion system they say could shorten a manned trip to Mars by more than a year.
The \Jspacecraft\j would leave \Jearth\j aboard a standard chemically fuelled rocket, and then a nuclear thermal propulsion rocket would take over. According to the developers, a small reactor about the size of a 55-gallon barrel (a standard 200 litre drum) can provide enormous power for propulsion and carry a \Jspacecraft\j at much higher speeds than can an equivalent chemical system, getting there in 200 days, rather than 600, and saving astronauts from both boredom and cosmic radiation.
Nuclear power can more than double the \Jspacecraft\j's speed. It would produce \Jhydrogen\j heated to "more than 5,000 degrees Fahrenheit" (about 2800 degrees \JCelsius\j) which would then leave the rocket nozzle and provide thrust. Like the Apollo missions of the 1960s and 1970s, these missions would \Jorbit\j Mars while sending a Mars lander down to the surface. The researchers believe that the problems of sending large amounts of fissile material are smaller than people think, but it is possible they are considering practical rather than political problems.
Those who think otherwise should look at the public reactions to the small nuclear power source which is part of the Cassini mission.
#
"Cassini on its way",312,0,0,0
(Oct '97)
Amid controversy and protests about \Jplutonium\j, the Cassini mission is under way. Headed for Saturn, the mission will probe the \Jplanet\j's spectacular ring system, bizarre moons and atmospheric gases. After a number of delays, including wind problems, Cassini got away on October 15, well inside the launch window which allows a low-fuel trip with plenty of manoeuvring power at the end.
The mission's seven-year journey to Saturn began with the lift-off of a Titan IVB/\JCentaur\j carrying the Cassini orbiter and its attached Huygens probe, at the start of a 3.2 billion \Jkilometre\j haul. After several close passes of the inner planets to gain speed by \Jgravity assist\j, it is scheduled to arrive at Saturn and Titan on 4 July 2004.
The launch was marked by protests about the \Jspacecraft\j's main power source, which is \Jplutonium\j-fuelled. The protests about the \Jplutonium\j relate to a misunderstanding or misrepresentation of the power unit installed on board the \Jspacecraft\j. This contains about 33 kg of \Jplutonium\j 238, and the protests have been based on the assumption that the \Jplutonium\j would all be vaporised into fine dust, and breathed into human lungs, after a catastrophic accident. In reality, the \Jplutonium\j is present in a ceramic form and protected in a series of separate parcels, so the protesters' scenario is an impossible one: any accident would cause far less damage than they claim.
After its arrival in the Saturn system in 2004, the \Jspacecraft\j will spend four years orbiting Saturn and many of its 18 known moons, providing a flood of new data on what may be regarded as a miniature \J\Jsolar system\j\j. In all, Cassini is made up of an orbiter equipped with 12 scientific experiments and a probe carrying six instrument packages that will parachute into the thick \Jatmosphere\j of Titan, Saturn's largest and most intriguing moon. The earlier visits, Voyager 1, Voyager 2 and Pioneer 11, were all flybys. This longer, slower visit should reveal a great deal more to us.
Titan's \Jatmosphere\j, ten times thicker than \Jearth\j's, is likely to contain \Jnitrogen\j and a wealth of \Jhydrocarbons\j, the building blocks of life. The \Jtemperature\j is a chilly -290 degrees F (-180 degrees \JCelsius\j), but many scientists hope that Titan's surface may contain lakes of liquid \Jmethane\j and \Jethane\j, and that organic molecules may constantly be raining down from the moon's thick clouds onto its surface.
And then there are the rings of Saturn. The rings are of special interest to scientists generally, but especially to Larry Esposito, who found the F ring in 1979 from "Pioneer" photos. According to Esposito, "Saturn's rings have a very violent history. I think they were created by the break-up of a small moon perhaps 100 million years ago. The fact that we can view Saturn's rings today may be due purely to chance. I expect at some point all the ring material may reform itself into a moon or be ground into dust."
#
"Uranus has two new moons",313,0,0,0
(Oct '97)
At the end of October, with almost no acknowledgment in the daily press, two new moons were added to the collection circling Uranus. Moons 16 and 17 were announced on October 31 to an almost total lack of interest.
Admittedly the moons are no big deal. Described as "80 and 160 km across" (in other words, 50 and 100 miles across), they are a long way from Uranus, but amazingly, they were discovered from observations on \Jearth\j, not from the Hubble Space \Jtelescope\j or from pictures returned to us from other \Jspacecraft\j.
The discovery team used the Hale \Jtelescope\j on Mount Palomar, \JCalifornia\j, to look for slow-moving objects near Uranus that might be moons. The Voyager \Jspacecraft\j found ten moons on its 1986 flyby, but there had been no careful search for moons with modern equipment. The Hale \Jtelescope\j, a 5 metre reflector, located two moons in eccentric orbits almost immediately, making these the first moons discovered from ground-based equipment since 1948.
#
"Conserving Madagascar",314,0,0,0
(Oct '97)
We reported in August on some exciting finds in Madagascar's Anjaharanibe-Sud Special Reserve, where a living \Jfossil\j had been located: now we have more good news for the island's environment. Madagascar's newest national park, Masoala National Park, was formally inaugurated by President Didier Ratsiraka on October 18. Including Madagascar's largest remaining rain forest, the park will help to retain the island's incredible \Jbiodiversity\j, currently under threat from a variety of causes.
Lying east of \JAfrica\j in the Indian Ocean, Madagascar is 1600 km (1000 miles) long. It holds an estimated 5% of the world's species, and some 75% of its species are found nowhere else on \Jearth\j. There are some 14 million people living in poverty on the island, and 75% of the land has now been logged and cleared.
The park will preserve the habitats of two birds, previously thought to be extinct, \Jcoral\j reefs, and whale breeding grounds. It will also provide a continuing living for some 45 000 people who live in the area covered by the park.
Meanwhile, five captive black-and-white ruffed lemurs raised at the Duke University Primate Center were taken to Madagascar during October, with a mid-November release date scheduled, after a month of getting used to the climate in outdoor cages in the Betampona Natural Reserve. More lemurs are being prepared at Duke, and also at a Wildlife Conservation Society site on St. Catherine's Island off the coast of Georgia.
Even with these boosts to the local populations, lemurs will remain highly endangered in Madagascar, as their habitats are destroyed and they are hunted for food. The black-and-white ruffed lemurs, known for the ruffs of fur that frame their faces and their lush black and white coats, are among Madagascar's most \J\Jendangered species\j\j, as they can only survive in primary tropical rainforests. Significantly, meat from the black-and-white ruffed \Jlemur\j is reportedly tastier than that of any of its relatives.
Funding has come from many quarters, even from actor-producer John Cleese, who donated the proceeds from the London premiere of his comedy film \IFierce Creatures\i, which featured captive lemurs in addition to its human cast.
#
"Brazil's new park",315,0,0,0
(Oct '97)
\JBrazil\j established the world's largest \Jrainforest\j reserve during October, the Amanã Sustainable Development Reserve, which is the third of a network of protected areas in the Central Amazon Basin. Together, they cover some sixty thousand square kilometres of unbroken habitat across an area larger than Costa Rica.
The Amanã region is known for its spectacular and untouched \Jbiodiversity\j including endangered Amazonian manatees, black caiman, river dolphins, anacondas, jaguars, black uakari monkeys, harpy eagles, and a wealth of plants and aquatic life. Residence is permitted in the protected areas, but the "locals" are encouraged to participate in the areas' conservation.
The park may be just in time: the US NOAA-12 \Jsatellite\j recorded no less than 24 000 fires in the Brazilian Amazon between early August and mid-December.
#
"Breast cancer not caused by PCBs",316,0,0,0
(Oct '97)
Breast cancer rates are highest in developed countries, leading some researchers to suggest that industrial \Jpollution\j could be an important cause of the condition. The main suspects have been two "environmental oestrogens", synthetic chemicals that can act like \Jhormones\j in the body, and commonly found in people's fatty tissues.
The suspects, polychlorinated biphenyls (PCBs) and DDE, a metabolic by-product of the \Jpesticide\j DDT, had seemed to be connected to breast cancer: one study found that women with high blood levels of DDE had four times the risk of breast cancer of women with normal levels, but a later study seemed to go against this finding. Now a large-scale study, reported in the \INew England Journal of Medicine\i during October, has dealt a heavy blow to the theory.
A team led by epidemiologist David Hunter of the Harvard School of Public Health has analysed 240 breast cancer cases that occurred in a group of nearly 33,000 women after they gave blood samples in 1989 and 1990, finding no greater incidence of breast cancer in women with higher levels of the substances.
Another report published in Toronto in October says that over the past few years, the number of women dying from breast cancer declined almost five per cent, the largest short-term decrease since 1950. Professor Judy-Anne Chapman oversaw the analytical review of 153 breast cancer studies for the National Cancer Institute of Canada, and found that data from US studies revealed the incidence of breast cancer between 1940 and 1982 increasing by around 1% each year. Between 1982 and 1987, the increase was about 4% each year.
Between 1989 and 1992, the mortality rate in the USA declined 4.7%, apparently due to better detection at an earlier stage, when tumours are smaller. Another factor might be the new tests to screen women for the BRCA1 gene which leaves them predisposed to develop breast cancers. These tests catch 90% of the American women carrying mutated forms of the gene.
Dutch researchers reported several new mutations in BRCA1 during October. These discoveries may explain the standard test's failure to flag mutations in 20% to 30% of European women with a family history of breast cancer, even when their cancer could be linked to BRCA1 by damage to \Jchromosome\j 17, where the BRCA1 gene is located. Many of the faults were "deletions", where portions of the gene were missing, rather than damaged.
#
"X-Ray laser in development",317,0,0,0
(Oct '97)
Reports appeared during October in \IPhysical Review Letters\i from a \JMichigan\j group and in \IScience\i from an Austrian group, each indicating \Jlaser\j-like behaviour, but at x-ray wavelengths. The method is called high harmonic generation, and even though it isn't technically a \Jlaser\j, it does a good job of imitating one.
The American team aimed ultrashort bursts of infrared \Jlaser\j light at a small jet of \Jhelium\j gas. Like all light, the \Jlaser\j light is made up of rapidly oscillating electric and magnetic fields. Here, though, the \Jlaser\j's electric field is so intense, and the pulse so brief, that it can strip an \Jelectron\j from a \Jhelium\j atom and slam it back in a single \Joscillation\j cycle. As it returns violently to the parent atom, the \Jelectron\j emits a high-\Jenergy\j (x-ray) \Jphoton\j. Because many nearby atoms in the gas are hit by the oscillating \Jlaser\j beam at the same time, the emitted x-rays are coherent, meaning that they oscillate in step and emerge as a remarkably \Jlaser\j-like beam.
Usually this sort of result requires huge pieces of equipment, while this achievement has come from "tabletop" apparatus. The exciting thing about this development is that coherent x-rays with a wavelength of less than 4.4 nanometres, in the "\Jwater\j window" where carbon absorbs better than \Jwater\j, will allow the imaging of living cells. So while the physics is nothing fundamentally new, the applications are likely to shake a few trees over the next few years.
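The "water window" claim can be checked with the Planck relation mentioned earlier: a photon's energy in electronvolts is roughly 1240 divided by its wavelength in nanometres. The window lies between the absorption edges of carbon (near 4.4 nm) and oxygen (near 2.3 nm), so photons inside it are absorbed strongly by carbon-rich cell structures but only weakly by the surrounding water. A one-line check of ours:

  HC_EV_NM = 1239.84   # Planck constant times speed of light, in eV nanometres

  def photon_energy_ev(wavelength_nm):
      """Photon energy from the Planck relation E = hc / lambda."""
      return HC_EV_NM / wavelength_nm

  print(f"4.4 nm: {photon_energy_ev(4.4):.0f} eV")  # ~282 eV, near the carbon edge
  print(f"2.3 nm: {photon_energy_ev(2.3):.0f} eV")  # ~539 eV, near the oxygen edge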
#
"Zebra mussels get a setback",318,0,0,0
(Oct '97)
One of the greatest feral animal problems in the world right now is the zebra mussel of north America's Great Lakes. Eight years ago, zebra mussels from \JRussia\j crossed the Atlantic as stowaways in ship ballast \Jwater\j and then infested the waters of Lake Erie. Once there, they caused huge damage by "glomming" onto native clams and suffocating them. A report in \INature\i at the end of October tells us that some clams have survived by burrowing into soft mud that suffocates the zebra mussels attached to their shells.
During an ecological study before a new dyke was built, no fewer than 21 native species of clam were found in a haul of 7000 individuals. These are now in secure lodgings at Ohio State University, from which they will be returned to the wetland where they were found, once work is complete.
#
"Pig organs infected",319,0,0,0
(Oct '97)
An October report in \INature\i says that pigs carry two "porcine endogenous retroviruses" (PERVs for short) in their tissues, and that these viruses may put paid to any plans to use pig organs to relieve the chronic shortage of donor organs for transplantation into people.
Tests last February showed that the viruses could infect cultured human cells, but now we know that the PERVs are found in all pig tissues, including \Jkidneys\j and hearts, and the two versions, called PERV-A and PERV-B, can both infect human cells. It appears that any advances in the use of pig organs will depend on developing a breed of pigs purged of PERVs. This, say the authors, will not be easy, with 20 to 30 copies of the virus particles in most of the cells checked.
#
"X chromosomes mutate more",320,0,0,0
(Oct '97)
Old fathers may be a genetic liability for their offspring, according to a study, published in \INature \JGenetics\j\i at the start of October, which shows that males really do mutate more than females. A study of \Jmutation\j rates in the Y \Jchromosome\j (only found in males) and the X \Jchromosome\j (both sexes) might shed light on this, but with three X chromosomes and one Y \Jchromosome\j in a mating pair of mammals, there might be variations in natural selection rates which could hide differential \Jmutation\j rates.
Luckily, birds have a "WZ system", where it is the female bird which carries the mismatched WZ pair, just as a male \Jmammal\j is XY. The answer: even in birds, the "ZZ" males mutate more than females!
The male Y \Jchromosome\j has long been called our genetic junkyard, a clutter of meaningless DNA surrounding a handful of genes, useful only for making more men. Now MIT researchers Bruce Lahn and David Page have found five genes that are used throughout the body to help keep cells working properly, all on the Y \Jchromosome\j, along with another seven genes that are unique to the Y \Jchromosome\j and seem to be expressed just in the testes. The genes lie in regions known to be involved in \Jinfertility\j: if the region is missing, the male is infertile.
#
"Genetic transmission",321,0,0,0
(Oct '97)
Once upon a time, there was "mitochondrial Eve", the only female ancestor of us all in a direct female line. Other women of Eve's time have passed their genes on, but at some stage, those genes have passed through a male. We know this from looking at the small packets of DNA in the mitochondria, cell components that we get only from our mothers. Analysis tells us a simple story: in the direct female line, we have just one ancestor for all humans on this \Jplanet\j, and she lived in \JAfrica\j, between one and two hundred thousand years ago.
For the past ten years, people have wondered if analysis of the Y \Jchromosome\j, passed only through the male line, would reveal just one male ancestor for us all. Now two separate teams have reported just such a result. What is more, "Y-\Jchromosome\j Adam" appears to have been in \JAfrica\j at about the same time, one to two hundred thousand years ago.
The genetic trail is so clear that it allows researchers to compare the migration patterns of men and women tens of thousands of years ago - and what it reveals is that most of the human genetic spread is due to women. Men, it seems, have been the less adventurous and less travelled gender of the species. In brief, the variations in the Y \Jchromosome\j have a different geographic distribution from variations in mitochondrial DNA.
#
"Upright ancestor gets older",322,0,0,0
(Oct '97)
We have always assumed that it was only our crowd, the hominids, who walked upright. Now Meike Köhler and Salvador Moyà-Solà say that a 9-million- to 7-million-year-old apelike animal called \IOreopithecus bambolii\i was two-legged as well.
This idea originally surfaced about forty years ago, but there was a lack of good anatomical evidence. Köhler and Moyà-Solà used partial fossils and undescribed material at the Natural History Museum in \JBasel\j, \JSwitzerland\j to assemble their proof. In the \IProceedings of the National Academy of Sciences\i, they report that the lower back of the animal was arched forward like ours, and the knee joint was vertically aligned, again like ours. In a \Jhominid\j, this would be clear evidence that the owner used to walk upright.
Parts of the ancient \Jape\j's pelvis resemble corresponding areas of \IAustralopithecus afarensis\i, the \Jhominid\j species that includes the partial skeleton of the specimen known as Lucy, they add, but \IOreopithecus\i had a foot like that of no other primate. Its big toe sticks out at about 90° from the remaining toes, probably giving the animal a short, shuffling stride.
#
"Shrinking a genus",323,0,0,0
(Oct '97)
Students of human \Jevolution\j fall into one of two groups: lumpers and splitters. Recently, the splitters have been having a field \Jday\j, acclaiming \IHomo ergaster\i, \IHomo erectus\i, \IHomo rudolfensis\i, and \IHomo habilis\i and more, all from a set of specimens which seem to run across a continuum. Now a group of scientists from the other side has struck back, implying that all of the robust hominids are one species. Or that they may be . . .
A set of new \IAustralopithecus boisei\i specimens from Konso, \JEthiopia\j lies behind their comments. These show an unexpected combination of cranial and facial features, given what we have believed in the past. In particular, we now have a complete skull (that is, mandible or jaw and associated cranium), which is undoubtedly a 1.4-million-year-old specimen of \IA. boisei\i, but which has characteristics of other species: \IA. robustus\i from South \JAfrica\j, and \IA. aethiopicus\i, also from \JEthiopia\j.
As usual, the turn of the splitters will come next, probably drawing attention to the well-established similarities between \IA. afarensis\i and \IA. aethiopicus\i which are striking, but which would not cause even the most hardened lumper to group the two species as one.
The problem is unlikely to go away: when you are dealing with a population as varied as modern humans, but where two individuals may be as much as a hundred thousand years apart, opinions will always tend to follow observers' fond beliefs.
#
"More vitamin C means fewer cataracts?",324,0,0,0
(Oct '97)
Women who took vitamin C supplements for at least 10 years are only 23 percent as likely to develop cataracts as women who received the vitamin only in their diet, a new study indicates.
In the October \IAmerican Journal of Clinical \JNutrition\j\i, the scientists describe evidence that the human eye derives significant benefits from vitamin C. The study recruited women from the Nurses' Health Study, a \JHarvard University\j project which has been charting diet and disease in more than 120,000 women since 1972. Researchers identified some 56- to 71-year-olds who in the early 1980s had taken vitamin C supplements and others who had not. Of the women, 165 supplement users took eye tests, as did 136 women with no added vitamin C.
None of the women had been diagnosed with cataracts, but 188 showed at least early signs of the disease. Around 60% of these early cataracts appeared in the smaller group of women who had never taken supplements. More importantly, the risk of cataracts decreases when supplementation goes on for a longer period of time. The mean dietary intake of vitamin C for women not taking supplements was 130 milligrams per \Jday\j, about twice the recommended amount but less than one third the average of women taking supplements.
#
"How the universe will end",325,0,0,0
(Oct '97)
In short, the universe is likely to end, not with a bang, but by dissipation. This is what we learn from a study reported in \INature\i in late October, and another report, under review and likely to be published soon in \IAstrophysical Journal Letters\i. It seems that the universe's expansion rate has slowed so little so far that the universe's own gravity will never be enough to pull everything back together again into a Big Crunch, a reversal of the original Big Bang.
Two separate groups of astronomers have analysed the light from distant exploding stars to reach a preliminary verdict on the fate of the expanding universe, and both verdicts are the same. One group is led by Saul Perlmutter of Lawrence Berkeley National Laboratory and the University of \JCalifornia\j, Berkeley, and the other by Brian Schmidt of Mount Stromlo and Siding Spring Observatory in \JAustralia\j.
The two studies looked at exploding stars called type Ia supernovas, which make good standard candles, dotted through the universe. Stars further away can be examined for their red shift, an aspect of the Doppler effect, and this should flush out any subtle variations in the speed of travel, or in the force due to gravity, which \Imay\i not be constant. The variations would tell us that the universe has been changing over time, under the force of its own gravity.
We would be able to detect this because the supernovas further away actually exploded earlier, and their light has just taken longer to reach us. The supernovas have equivalent intrinsic brightness wherever they are, and since apparent brightness falls off according to the inverse square law, the only variation is a result of distance. This allows us to be certain of just how far away they are. Then all we have to do is plot the distance against the pattern of their red shifts. If everything is constant, the plot will be a \J\Jstraight line\j\j; otherwise, it will be a curve.
The result is now shown to be a \J\Jstraight line\j\j. Hence, we will end by being dispersed into thinness.
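To make the "standard candle" logic concrete, here is a minimal Python sketch of the inverse square law step. The luminosity and flux figures below are invented placeholders, not values from either study:

  import math

  L = 1.0e36     # assumed standard-candle luminosity, watts (hypothetical)
  f = 2.0e-16    # measured flux at the telescope, watts per square metre (hypothetical)

  # Inverse square law: f = L / (4 * pi * d**2), so d = sqrt(L / (4 * pi * f))
  d = math.sqrt(L / (4 * math.pi * f))
  print(round(d / 9.46e15 / 1e9, 1), "billion light years")   # metres converted to light years

Plotting distances obtained this way against the measured red shifts, for many supernovas, is what decides between the straight line and the curve.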
#
"Tiniest transistor",326,0,0,0
(Oct '97)
In simple terms, a transistor acts to control voltage or current in such a way as to achieve gain or switching action. At some point, transistors will have to stop getting smaller, because at a certain limiting gate size, electrons will start to leak through, even when the transistor is off. Standard wisdom says the limiting gate size is around 30 nanometres.
This limit on size then acts to limit the amount of circuitry on a chip of a given size, and to limit the operational power of central processing units or CPUs, the "chips" which are at the centre of every computer. Now a team at NEC has found a way of halving that limit with clever design, producing transistors which are a mere twentieth of the standard size currently found on the most crammed commercial chips.
The NEC team built their working mini-transistor by adding a second gate, shaped something like a top hat, with the crown above the first gate and brims above both sides of the channel. This second gate allowed them to leave insulating gaps between the channel and the doped source and drain regions.
A differing view, reported in \INature\i during October, is that we still have a long way to go. The authors suggest that the answer may lie with colloidal crystals, periodic arrays of suspended colloidal particles which may arise in a colloidal suspension where the particles are all identical in size, shape and interactions. The American researchers report measurements of electrical transport in a single-\Jelectron\j transistor made from a colloidal nanocrystal (a crystal of nanometre dimensions) of cadmium selenide.
#
"Neurochip development",327,0,0,0
(Oct '97)
A "neurochip," a silicon rectangle about 4 centimetres wide immersed in a Petri dish, may be the sign of things to come, the forerunner of bionic eyes or machine-mind interfaces, moulded from combinations of silicon and living neurons. The first silicon chip equipped with living nerve cells is now a reality.
The neurochip was reported to a meeting of the Society for Neuroscience in \JNew Orleans\j, late in October, but the purpose of this possible "ancestor" is just to allow us to understand better how nerve cells grow and communicate with each other.
Jerome Pine is a neurophysicist at the \JCalifornia\j Institute of Technology in Pasadena. Together with a team of electrical engineers and biologists, he has formed a finely etched silicon landscape that confines individual neurons. While confining them, the silicon surface also allows them to establish connections. The surface has sixteen tiny wells, each about 1/40 of a \Jmillimetre\j in diameter, with short tunnels leading to the surface.
The researchers placed an embryonic rat brain cell in each well. As the cells grew, they sent out long dendrite arms through the tunnels toward neighbouring wells. Wires in the underlying silicon monitored the electrical behaviour of the neurons. This is a development to watch . . .
#
"Collagen brought low",328,0,0,0
(Oct '97)
Collagen is one of those handy things to hang as a label on beauty products. As a beauty treatment, collagen is probably about as useful as diseased warthog liver, but it sells stuff. Now here is a hint for the advertising people: collagen was used around the Dead Sea, 8000 years ago!
Unfortunately, these people, who were still not able to make simple pots, used collagen as a glue. It served as a protective lining on rope baskets, containers and embroidered fabrics, as a crisscross-patterned decoration on the tops of sculptured skulls, and as an adhesive holding together tools and utensils.
Dr Arie Nissenbaum from Israel's Weizmann Institute of Science at first thought the material on various archaeological finds was \Jasphalt\j, but chemical analysis revealed that it was collagen. The chemistry of the samples, along with \Jelectron\j \Jmicroscope\j studies, showed that it was probably collagen derived from animal skin. Carbon 14 dating tells us that the collagen is about 8100 years old. So if collagen does not stop you ageing, at least you know that it can last a very long time . . .
#
"Sparrows not dinosaurs after all",329,0,0,0
(Oct '97)
One of the accepted models of \Jevolution\j for many biologists has been that the dinosaurs did not die: they just sprouted feathers. This belief saw its most memorable presentation in the line from \IJurassic Park\i which reads "I think they're flocking this way" as a stampede of small dinosaurs flees a hunting party of \IT. rex\i.
Now it looks as though the model may need to be reconsidered, after comparative "hand" studies. The theropod dinosaurs, widely believed to be the pre-birds, are generally considered to have preserved digits I, II, and III, while a new study of the hands and feet of developing embryos of birds, alligators, and turtles shows that their hands develop to preserve digits II, III, and IV.
In other words, all of these animals have lost the digits that we would call the little finger and the thumb, and retained the other three. The theropods retain their versions of the ring finger and little finger, as tiny bumps, so there can be no real doubt about this. And if the facts are as stated, there can be no plausible route for the theropods or their immediate ancestors to have become birds.
Given that the most birdlike theropods first appear in the \Jfossil\j record at least 30 (and perhaps 80) million years after the first birds, perhaps this should not really be so surprising, but it seems a shame. "The great tragedy of science - the slaying of a beautiful hypothesis by an ugly fact", said T. H. Huxley.
There is hope yet, though: a recent \INature\i report describes a v-shaped wishbone from the shoulder region of a \IVelociraptor\i (the small nasty ones in \IJurassic Park\i). The author, Mark Norell, says it is a very convincing wishbone, and in just the right place.
#
"Discovering more about ozone holes",330,0,0,0
(Oct '97)
It is very difficult to study the "ozone hole" over the Antarctic in the southern winter, and it has always been assumed that the hole was a spring phenomenon. A paper in \IScience\i during October indicates that data gathered at \JFaraday\j (65°S) throughout the winters of 1990, 1991, and 1994, shows that ozone depletion starts in June, at the height of the southern winter.
And now there is an \JArctic\j ozone hole as well. In recent years, the \JArctic\j "hole" has been getting worse, according to a letter to \INature\i during October. Worst of all was the unusually cold winter of 1995/96, when the \JArctic\j \J\Jozone layer\j\j fell from about 450 Dobson units to about 300. This change appears to be due to longer stratospheric winters, which allow nitric acid, a "preserver" of ozone, to fall out of the \Jstratosphere\j on the surfaces of ice particles.
#
"Antimatter disproved?",331,0,0,0
(Oct '97)
\JScience fiction\j authors appear to have just lost another range of sub-plots, with the news that no antimatter galaxies lurk in the far corners of the universe. The report revealing this is still to be published by Andy Cohen of \JBoston\j University, Alvaro de Rújula of CERN, and Sheldon Glashow of \JHarvard University\j, but it was released in early October. To appear in print in the February 1998 \IAstrophysical Journal\i, the report reminds us once again of the unfairness of the Big Bang in favouring matter over antimatter.
The universe that the \J\Jbig bang\j\j made should have contained equal parts of matter and antimatter, but we have long known that our cosmic neighbourhood is all matter. We were restricted to hypothesising about antimatter in a Galaxy Far Far Away, or that some of the blobs of matter collided with some of the blobs of antimatter, turning into gamma radiation. Given a few constraints, you can predict how much of this radiation there will be.
The three scientists tested their idea by computing the spectrum of diffuse photons from matter-antimatter annihilation in the early universe. They conclude that even in the most conservative analysis, matter-antimatter annihilation should produce a signal five times as large as the observable diffuse gamma ray background. From this, it looks as though the antimatter just isn't out there.
Nonetheless, a \J\Jspace shuttle\j\j study is scheduled for May 1998, led by physicist Sam Ting of the \JMassachusetts\j Institute of Technology and CERN, which will look for antimatter cosmic rays, such as nuclei of anticarbon, coming from distant antigalaxies. While that study now seems less likely to turn up a result, perhaps the \J\Jscience fiction\j\j fans can keep on hoping for a bit longer.
#
"No successor to Clementine mission",332,0,0,0
(Oct '97)
The proposed successor to the Clementine mission (see No Ice on the Moon After All, June) has been dropped. The plan, which would have fired probes into several \Jasteroids\j, will not now go ahead. Some Republicans in Congress had backed it as a high-tech attempt to learn more about intercepting objects in space while gathering useful scientific data, but the program was not supported by the Defense Department.
#
"Tsunamis prediction",333,0,0,0
(Oct '97)
Tsunamis - giant seismic sea waves, sometimes as high as a five-storey building when they reach shallow \Jwater\j - are produced by undersea earthquakes, landslides or volcanic eruptions. Hard to detect at sea, they give little warning to human populations living on the shores of the Pacific, but that may be about to change. Plans are in place for a network of instruments to be deployed on the ocean floor, giving humanity a precious tool to predict and track tsunamis in real time.
The tsunamis travel at speeds close to 600 miles an hour (about 1000 km/hr) in the open ocean and at 100 miles an hour (around 150 km/hr) closer to the shore, so if they are identified near their sources, which may be on the other side of the \JPacific Ocean\j, a huge loss of life could be prevented. The 1960 \JChile\j undersea \Jearthquake\j killed 5000 people near the quake site; the tsunami it set off killed 61 people in Hawaii and caused millions of dollars in property damage. After Hawaii, the tsunami continued for nine more hours, finally striking \JJapan\j and killing another 150 people.
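Those open-ocean speeds are not arbitrary: a tsunami's wavelength is so long that even the deep ocean behaves as "shallow water", so the wave speed follows the standard shallow-water formula v = sqrt(g x depth). A quick Python check, using typical ocean depths:

  import math

  g = 9.8                                  # gravitational acceleration, m/s^2
  for depth in (4000, 6000):               # typical open-ocean depths, metres
      v = math.sqrt(g * depth)             # shallow-water wave speed, m/s
      print(depth, round(v * 3.6), "km/h") # roughly 700 to 870 km/h

The result is the same order as the speeds quoted above; the exact figure depends on the depth along the wave's path.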
Bottom-pressure recorders (BPRs) and seismic instrument arrays for real-time monitoring of tsunami development will be deployed by 1998, and more theoretical studies will be undertaken. For example, if the sea floor shakes from side to side, then the tsunami that follows will be minimal. But, if there is an up-and-down motion, a tsunami develops. That, at least, is the theory, though more work is needed to work out the full dynamics.
At the end of October we learned of a major \Jearthquake\j on the Northwest coast of the United States around the year 1700, which caused a tsunami that hit the Japanese island of \JHonshu\j on 26 January 1700. The \Jgeology\j has always suggested that there could have been such a quake, but now this has been confirmed, and located around the Cascadia fault zone.
Gordon Jacoby and his colleagues took core samples from 33 \Jsitka\j spruce trees, each at least 300 years old, that stand along a 100-\Jkilometre\j stretch of Washington and Oregon coastline, and they report in the November issue of \IGeology\i that they found signs of waterlogging or trauma that had disrupted many of the trees. The evidence came from tree rings dated to around 1699. These signs include changes in ring width, the presence of "traumatic \Jresin\j canals" (sap-conducting tubes formed by altered cells), and "reaction wood" (dense cells formed in response to tilting).
More importantly, a report in \INature\i at the end of October describes dead \Jcedar\j trees along the Washington coast, still standing in what are now salt marshes. These trees last added cells to their timber in the northern growing season of 1699.
Based on the size of the \Jtsunami\j, as recorded in Japanese records, researchers had previously estimated the \Jearthquake\j at \Jmagnitude\j 9, even though seismologists have never recorded an \Jearthquake\j of \Jmagnitude\j 5 or above in the area identified, the Cascadia fault region. The \INature\i article points out that the tree dates mean the northwestern United States and adjacent Canada are plausibly subject to earthquakes of \Jmagnitude\j 9.
#
"New world-record prime number",334,0,0,0
(Oct '97)
Back in January 1997 (\JMathematics\j update), we reported the discovery of the largest Mersenne prime so far known, the 35th found, a number derived from n = 1398269, discovered in the Great \JInternet\j Mersenne Prime Search.
We advised you then that anybody can now join in adding to the list of Mersenne primes by contacting George Woltman's Great \JInternet\j Mersenne Prime Search at http://www.mersenne.org/prime.htm and reading what they find there.
Well, if you wanted to find Mersenne prime number 36, you are too late. There are 1700 participants in the hunt now (all you need is a moderately powerful computer and some software provided by Woltman), and Gordon Spence has just found that one of the numbers that he happened to pick to test is the latest record-breaking prime.
The largest number yet identified that is evenly divisible only by itself and 1 is 2^2,976,221 - 1. This number would contain 895,932 digits if written out in full. Spence used a very ordinary Pentium-based desktop computer to find the record-breaking prime. It took 15 days of calculation to obtain the result, which was later verified independently on a supercomputer. Woltman, who organised GIMPS more than a year ago, and who wrote the software that searchers use, announced the discovery during October.
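The test that GIMPS volunteers run is the Lucas-Lehmer test, which decides whether 2^p - 1 is prime using nothing more than repeated squaring, with no trial division at all. A minimal Python sketch of the idea (real searches use heavily optimised arithmetic, not this):

  def lucas_lehmer(p):
      # True if the Mersenne number 2**p - 1 is prime, for an odd prime p
      m = 2**p - 1
      s = 4
      for _ in range(p - 2):
          s = (s * s - 2) % m   # the whole test is this one squaring loop
      return s == 0

  print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
  # prints [3, 5, 7, 13, 17, 19] - note that 2**11 - 1 = 2047 = 23 x 89 fails

The 895,932-digit figure is also easy to confirm: the number of digits in 2^p - 1 is int(p * log10(2)) + 1.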
So will one of our readers be the proud discoverer of Mersenne prime number 37?
#
"South-East Asian smoke clouds",335,0,0,0
(Oct '97)
The fires of \JIndonesia\j continued through October, with forests still being cleared by fire during the developing El Niño \Jdrought\j. What was once 36 million square kilometres of dense virgin forest, two centuries ago, is now down to 4 million square kilometres across the SE Asian region.
\JMalaysia\j has borne the brunt of the smoke \Jpollution\j, and this has produced some strong reactions, ranging from a millennialist fervour among Islamic fundamentalists in \JKelantan\j to proposals to limit motor traffic in Kuala Lumpur. \JMalaysia\j's ironically named Air \JPollution\j Index or API (\Iapi\i is the Malay and Indonesian root word for "fire") has been placed under question. The API, which measures five chemical pollutants - ozone, \Jnitrogen\j dioxide, sulphur dioxide, carbon monoxide, and particulate matter - has given results that lay observers felt did not match their personal experiences.
Each pollutant is scored on a range from 0 to 500, where 500 denotes "very hazardous", but the overall index is merely a linear addition, and takes no account of interactions between components. As a result of the public protests, the Malaysian government has now redefined the safe limits for API values: where only 500 itself was previously "hazardous", the 300-500 range is now "hazardous", 200-300 is "very unhealthy", and 100-200 is "unhealthy".
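In software terms, the redefinition amounts to nothing more than new thresholds on the same 0-500 scale. A minimal sketch of the new bands described above (the name of the lowest band is our own placeholder, since the reports do not give one):

  def api_category(api):
      # Redefined Malaysian API bands, as described above
      if api > 300:
          return "hazardous"        # formerly only 500 itself rated this label
      if api > 200:
          return "very unhealthy"
      if api > 100:
          return "unhealthy"
      return "acceptable"           # placeholder name for the lowest band

  print(api_category(250))          # prints "very unhealthy"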
Meanwhile, the smoke clouds kept pouring into the skies.
#
"How second-hand smoke kills",336,0,0,0
(Oct '97)
A report in the \IBritish Medical Journal\i during October looks at the mystery whereby 20-\Jcigarette\j-a-\Jday\j smokers have a 78% increased risk of \Jheart disease\j over the general population, a risk elevation which is only three times that of passive smokers, even though active smokers inhale roughly 100 times as much smoke. It seems that even a relatively small amount of smoke causes clotting factors in the blood to become significantly stickier, and even passive smoking causes enough aggregation to explain the \Jheart disease\j risk.
#
"Aspirin and heart disease",337,0,0,0
(Oct '97)
If more people would take an aspirin when they experience chest pain or other symptoms of a severe heart attack, five to ten thousand lives could be saved in the United States each year, according to an American Heart Association scientific statement published in the association's journal \ICirculation\i during October.
#
"Obituary for October 97",338,0,0,0
(Oct '97)
The death was reported in October of Hans Eysenck, one of the most stimulating and controversial psychologists of this century.
#
"Latex allergy",339,0,0,0
(Nov '97)
With AIDS and hepatitis risks causing concern among health workers, \Jlatex\j \Jallergy\j has become a serious issue, and was the subject of an article and an editorial in the \INew England Journal of Medicine\i during November.
Most of the people with problems fall into clearly defined groups with high exposure: health care workers, \Jrubber\j industry workers, and children with spina bifida (meningomyelocele) and urogenital abnormalities. \JLatex\j \Jallergy\j is fairly rare in members of the general population. Avoidance of \Jlatex\j products is the only measure that can avert a serious allergic reaction to \Jlatex\j.
A survey of all active-duty dental officers in the US Army in 1990 suggested a prevalence of \Jlatex\j \Jallergy\j of about 8.8%, indicating how serious the problem may become, but in spina bifida cases, the rates appear to lie in the 20% range, although one study showed that 37% (35/93) had \Jlatex\j-specific IgE in blood tests.
Natural \Jrubber\j is a processed plant product that has found widespread use since the second half of the nineteenth century. Today, almost all natural \Jrubber\j comes from the \Jlatex\j, or milky sap, of the commercial \Jrubber\j tree \IHevea brasiliensis\i. Over 200 other species of plants produce \Jrubber\j, but only \IH. brasiliensis\i and the guayule bush, \IParthenium argentatum\i (a metre-tall desert plant that grows in the Southwest US and Northern Mexico) are used to produce \Jrubber\j in commercially significant quantities.
The allergic reaction appears to be caused by a protein fraction in the \Jrubber\j, a portion which can be removed by washing, heat treatment, chlorination, and \Jenzyme\j digestion of the \Jrubber\j. \JRubber\j derived from guayule contains very little of this protein, and seems to cause no problems in people who react to \IHevea\i \Jrubber\j.
Interestingly, cross-reactivity has shown up for certain foods, at least for health workers. In one study, 17 people (36%) had clinical reactivity to at least one food. Some 53% were positive to prick testing with avocado, and a smaller number were reactive to \Jpotato\j, banana, \Jtomato\j, chestnut and kiwi fruit.
One of the problems with \Jrubber\j gloves is that they are commonly dusted with corn starch powder, which is a potent carrier of \Jlatex\j proteins. In one extreme case, a physician's skin or clothing was apparently tainted with \Jlatex\j \Jantigen\j, as his wife began to show allergic responses which disappeared when the husband washed carefully and changed his clothes after work.
Most health care workers with \Jlatex\j \Jallergy\j can remain at work by switching to non-\Jlatex\j gloves and asking colleagues to use powder-free gloves. \JLatex\j condoms have been associated with problems in both males and females, and a documented life-threatening reaction in a female has been reported. Polyurethane condoms are now available.
In short, says the \INEJM\i article by Dr. Jay Slater, \Jlatex\j \Jallergy\j is rare, but can have devastating effects. High risk groups should be educated regarding potential risks and tested if they have reported symptoms of \Jlatex\j \Jallergy\j. This view is backed up in the editorial, which points out that the extreme allergic reaction, \Janaphylaxis\j to \Jlatex\j, can be fatal.
#
"Cochineal an allergen",340,0,0,0
(Nov '97)
\JCochineal\j, a food colouring agent, was once a valuable dye-stuff, the sort of thing that \Jbuccaneers\j like William Dampier regarded as valuable booty. To historians, \Jcochineal\j is the dye which gave us the uniforms of the "redcoats" and tinted the "thin red line" of the British Army. Now \Jcochineal\j, or carmine, as it is also called, seems to be an allergen.
The colouring is "natural", being made from female \Jcochineal\j bugs, which are harvested in Central and \JSouth America\j and the Canary Islands specifically to be made into dye. It is common in a variety of foods, and some cooks even add it to gravy. Because it is natural, the dye is commonly not listed as a separate ingredient, but after a report in the November issue of the peer-reviewed journal \IAnnals of \JAllergy\j, \JAsthma\j & \JImmunology\j\i, this may need to change. Only one case has been recorded so far, in which a patient went into "life-threatening anaphylactic shock" (an extreme allergic reaction) after she ate a popsicle containing the colour, but other cases of mild \Janaphylaxis\j and hives may also have been caused by the additive, without anybody making the connection to the dye.
The woman in question had previously suffered from allergic reactions after applying a blush which also contains carmine. Two other patients have since been found with allergic reactions to the dye.
#
"Polyunsaturated oils no guarantee",341,0,0,0
(Nov '97)
Olive oil is one of the dietary "good guys", monounsaturated and widely recommended. That is to say, it was - now a study in Denmark, reported in the journal \IArteriosclerosis, \JThrombosis\j and Vascular \JBiology\j\i, has found that one high-fat meal almost doubles the peak concentration in the blood of a known risk factor for heart attack and stroke. This is irrespective of whether the fat is monounsaturated, polyunsaturated, or saturated.
So why are people who feed on "Mediterranean" diets, high in olive oil, so much healthier? Nobody is sure, but the test subjects in the study, all Danes, are in the habit of eating high-fat diets, so further tests are to be carried out, using other Danes who have been "starved" of fats for three weeks before the new study begins.
#
"Saccharin cancer causing?",342,0,0,0
(Nov '97)
\JSaccharin\j is back in the news again. With some medical scientists arguing that rat evidence is not sufficient to say that \Jsaccharin\j will cause cancer in humans, others still say that the risk is too great, especially if the substance ends up in children's drinks.
The evidence is equivocal at best: \Jsaccharin\j \Imay\i cause more bladder cancers in non-smoking women, and it does cause cancers in male rats, but that is hardly damning evidence. But to play safe, an advisory panel to the National \JToxicology\j Program (NTP) in the USA voted on October 31 to keep \Jsaccharin\j in the federal Report on Carcinogens, which has listed the sweetener as an "anticipated human carcinogen" since 1981.
#
"Pathogens in food",343,0,0,0
(Nov '97)
As if that was not enough for most Americans, November saw a major report on food-borne diseases from the Centers for Disease Control and Prevention, \JAtlanta\j. Written by Robert V. Tauxe, the report looks at a wide range of bacterial infections, and basically seems to offer more gloom than hope as we approach the festive season.
\IVibrio vulnificus\i, \IEscherichia coli\i O157:H7, and \ICyclospora cayetanensis\i are examples of newly described pathogens that are often food-borne, while other older \Jbacteria\j are able to flourish as food preparation and storage patterns change around the world, providing desirable ecological conditions for the pathogens. Whatever you eat, there is probably something deadly lurking there, in among the succulent juices.
#
"Oral vaccine against botulism",344,0,0,0
(Nov '97)
\JBotulism\j is poisoning from the by-products of the anaerobic \IClostridium botulinum\i. The bacterium thrives in sealed tins of food, where it produces botulin, or botulinum toxin, which is usually described as nature's deadliest poison.
Now two Philadelphia scientists have reported in the November edition of the journal \IInfection and Immunity\i the development of an oral vaccine against the toxin. Even more importantly, the vaccine may show the way to developing vaccines against other diseases such as \Jdiphtheria\j, whooping cough and tetanus which also do their harm through toxin production. The researchers' long-term aim is to produce a range of oral vaccines which could be inserted into ordinary foods.
Lance Simpson, Nikita Kiyatkin, and Andrew Maksymowych used the complex tools of molecular \Jbiology\j to create a modified and non-toxic version of botulin. When somebody eats food tainted with botulin, the toxin passes through the digestive system and enters the blood circulation unaltered. When it reaches the central \J\Jnervous system\j\j, it causes general paralysis.
The new oral vaccine is similar to the active toxin, and able to pass through the intestinal tract, but it does not attack the central \J\Jnervous system\j\j, though it is sufficiently similar to cause the body to make \Jantibodies\j that will attack the real toxin, if it should appear. So far, the vaccine has only been successfully used to protect mice against \Jbotulism\j, but even that much is a healthy start. The next step may be to use the vaccine on horses and chickens, both likely to die from the effects of \Jbotulism\j if they come in contact with it.
The principle could also be useful against other diseases which produce toxins, and the researchers are already toying with the interesting notion of producing a banana which carries a gene that produces the vaccinating compound, providing cheap and effective protection in Third World countries.
Where our usual antibiotic defences provoke \Jbacteria\j into mutating into resistant forms, it is hard to see how \Jbacteria\j could work their way around this defence. Those few \IClostridium\i \Jbacteria\j which attack humans are lost to the reproductive future of their species, once the can is opened and air gets in. Of course, at the moment, the \Jbacteria\j cannot live on humans. If we became immune to the bacterium we now know and fear, and if we could suddenly carry and nurture the presently toxic \Jbacteria\j without harm, this could even favour the preservation of the toxic form, to the benefit of humans and \Jbacteria\j alike.
In other words, this could be an interesting development to watch, for a number of reasons.
#
"Test tube vaccine",345,0,0,0
(Nov '97)
A team of chemists from Birmingham University has succeeded in making a completely synthetic vaccine. Using no biologically-derived components, the vaccine may be the first of a new range of vaccines, virtually free of side-effects.
The synthetic vaccine was produced by a team led by Dr Geert-Jan Boons. While the first real synthetic vaccines are probably several years away, the present work shows that the approach is feasible. Pathogens carry specific molecules on their surfaces, called epitopes, and it is these which trigger the immune response.
\JVaccination\j involves setting off this immune response in the absence of pathogens, and this was originally done with whole cells which have been "crippled". Later vaccines concentrated on providing just that part of the cell which contained the epitope, but other materials are often included in the vaccine, and these can cause complications.
Epitopes are often oligosaccharides, molecules made up of a small number of sugar sub-units. By studying the human pathogen \INeisseria meningitidis\i, the team focused on the crucial portion of the epitope. They arranged the specific sugar sub-units correctly in relation to one another, and managed at the same time to get the whole molecule's three-dimensional shape correct. They then attached this molecule to a synthetic \Jpeptide\j and a lipid, two other essential components of a vaccine.
#
"DNA as a vaccine?",346,0,0,0
(Nov '97)
DNA can be injected as a vaccine, and this is generally seen as one of the most exciting new approaches in disease protection. Now 1996 Nobel laureate Rolf Zinkernagel suggests that animals may gain lifelong immunity to some viral infections by retaining a bit of viral DNA inside their cells like a souvenir.
In a November paper in \INature\i, Zinkernagel and his colleagues (P Klenerman and H Hengartner) describe studies they have carried out on mice which remain immune to lymphocytic choriomeningitis virus (LCMV), long after the virus has completely disappeared from their bodies. This is unusual, as the immune system normally begins to lose its "memory" of an invader as soon as the attack fades away.
Zinkernagel's team undertook an odd search, looking for traces of viral-looking DNA. Even Zinkernagel described it as "a crazy idea," because LCMV contains RNA, not DNA. Like many other viruses, it does not use DNA for its genetic information.
Until at least 225 days after infection, they were still able to find DNA "copies" of viral RNA in the mouse cells. These copies were clearly made from RNA by the \Jenzyme\j reverse transcriptase (RT), since azidothymidine (AZT), an RT inhibitor, blocked viral DNA production in infected cells. (AZT is probably better known in its role as a treatment against \JHIV\j/AIDS infection.)
Hamsters, which can be infected by LCMV, showed the same traces, but human, monkey, dog, and cow cells, none of which are susceptible to the virus, showed no signs of the "viral DNA". But does the DNA actually cause the mice to be immune to LCMV, or is it just a coincidence? That remains to be proven, but if anybody can show that the "viral DNA" is translated into a protein, which in turn can trigger an immune response, this fascinating observation may suddenly become very important as well.
#
"Acupuncture proven effective",347,0,0,0
(Nov '97)
All over the western world, people have accepted \Jacupuncture\j as an alternative treatment for some diseases. Scientists have remained rather sceptical, because "Yin and Yang" explanations do not fit well into the western scientific tradition. Now that attitude is beginning to change. During November, an expert panel assembled by the US National Institutes of Health (NIH) reported that \Jacupuncture\j is effective treatment for nausea and some forms of pain.
Scientific theories about \Jacupuncture\j's cures have tended to involve notions like the possibility that \Jacupuncture\j triggers the production of many different chemicals, including pain-killing \Jendorphins\j, calming endogenous benzodiazepines, and mood-lifting \Jserotonin\j.
The panel found most studies on \Jacupuncture\j, used to treat everything from nausea and ovulatory problems to paralysis and drug abuse, to be flawed. Nonetheless, the panel found "clear evidence" that needle \Jacupuncture\j can relieve nausea from operations and \Jchemotherapy\j, and possibly also morning sickness. They also found some support for using \Jacupuncture\j for postoperative dental pain, and signs of pain relief for other conditions.
But is it the power of the mysterious East? The sceptics still seem to have an answer: if the treatment does not release chemicals, then electrical stimulation, either with needles or conductive pads, is "a very simple technique" for stirring up a storm of \Jhormones\j that act on nerves, and which have the potential to create the cures confirmed so far. But does it cure viruses?
#
"Can computers develop immunity to viruses?",348,0,0,0
(Nov '97)
The average computer anti-virus package will often claim to defend your computer against 8000 or more viruses. This number will always be a bit rubbery, as many viruses use a standard piece of code to deliver a different message, and are really the same computer virus. In any case, these "viruses" are generally made by pimply wannabes, who lack any real programming skills, and whose "products" will never be a real problem. The real worry is with viruses that mutate, whether they are the viruses that cause disease in living things, or the viruses that attack computers. Now a unique study is combining both of those virus types.
It began with doctoral student Derek Smith, working closely with Associate Professor Stephanie Forrest and Los Alamos Laboratory Fellow Alan Perelson to build a computer model of the human immune system and do experiments \Iin machina\i (in the machine) to see how different vaccine strategies might work on mutating viruses. Traditionally, such studies have been \Iin vitro\i (in the test tube - literally "in glass") or \Iin vivo\i, through experiments on live animals.
The computer immune system model they developed simulates a virus on a computer by breaking it down into its basic components and modelling only the patterns where the components bind molecularly. By looking at these patterns, the researchers can work out how the immune system responds to viruses that mutate - and make estimates about the effectiveness of potential vaccines.
The other side of this picture comes when you model a computer virus or intrusion detection system after the human immune system. There is a strong analogy between the way natural immune systems protect animals from dangerous foreign pathogens, including \Jbacteria\j, viruses, parasites and toxins, and the way we would like intelligent software to protect our computers. Immune systems will respond to an entirely novel virus, and learn from the encounter, and it would be nice to have computer virus protection which does the same thing.
Our immune system stores its information in proteins: what is the equivalent of a protein in computers? A \JPh\j.D. student, \JAnil\j Somayaji, has come to the conclusion that a computer program's system calls are the missing link. He suggests that a "normal" profile of a computer program's pattern of system calls can be created. This profile can then be monitored, with changes to the norm readily detected. The aim is to distinguish "self" from dangerous "other" (or "non-self") and then eliminate the dangerous non-self. Non-self might be any of an unauthorised user, a foreign code in the form of a computer virus, or even corrupted data.
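A minimal Python sketch of that idea, in the spirit of the Forrest group's published work on short sequences of system calls (the window length and the toy traces here are illustrative assumptions, not taken from their papers):

  # Build a "self" profile from short windows of system calls seen in normal
  # runs, then flag windows never seen before as possible "non-self".
  WINDOW = 3   # illustrative window length

  def windows(trace, n=WINDOW):
      return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

  normal = ["open", "read", "write", "close", "open", "read", "close"]
  profile = windows(normal)

  suspect = ["open", "read", "exec", "write", "close"]
  print(windows(suspect) - profile)   # the windows containing the unexpected "exec" call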
In cyberspace, people like to talk about convergence, the way in which all of the media are coming together, absorbing \Jtelevision\j, radio, mail, newspapers and more. From this work, it looks as though there may soon be a second convergence, where cyberspace and meatspace begin to converge.
#
"Immunity to student plagiarism?",349,0,0,0
(Nov '97)
Teachers frequently ask about ways of stopping students who lift wholesale from \JCD\j-ROMs. With certain products, the simple style of the writers can make it hard for teachers to be sure whose work they are reading.
The same problem of authorship can arise in training exercises for people learning to work in the computer industry. One of the major tasks for somebody going into software production is to write a compiler, a collection of code which takes a set of instructions written in human terms and converts them into code that a computer can run. If a student has lifted this from another student, perhaps in another institution, or from another year, how can the examiners tell? There are just too many other versions around to check them all.
Alex Aiken, an associate professor of computer science at UC Berkeley, has developed a reliable and easy-to-use piece of software that lets anyone check within minutes whether a student in the class has plagiarised a programming assignment. The software also automatically eliminates matches to code that is expected to be shared, such as code libraries or instructor-supplied code, eliminating the false positives that arise from legitimate sharing of code.
The software, which Aiken calls MOSS for Measure Of Software Similarity, looks for similar or identical lines of code sprinkled throughout a program, then creates a web page where the instructor can see the top 40 matches. Aiken has now posted MOSS on the \JInternet\j, so others can use it as well. MOSS automatically determines the similarity of programs written in any of several computer languages, most commonly C, C++ and Java, but also Pascal, Ada, ML, Lisp and Scheme.
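The story does not spell out how MOSS works internally, but similarity checkers of this general kind commonly fingerprint each submission by hashing overlapping k-character chunks of the normalised text and comparing the sets. A hedged sketch of that generic technique (not a description of Aiken's actual algorithm):

  def fingerprints(code, k=5):
      # Hash every overlapping k-character chunk of the normalised text
      text = "".join(code.split()).lower()   # crude normalisation: drop whitespace
      return {hash(text[i:i + k]) for i in range(len(text) - k + 1)}

  def similarity(a, b):
      # Fraction of fingerprints shared between two submissions, 0 to 1
      fa, fb = fingerprints(a), fingerprints(b)
      return len(fa & fb) / max(1, len(fa | fb))

  print(similarity("for(i=0;i<n;i++) s += a[i];",
                   "for (j = 0; j < n; j++) s += a[j];"))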
There is still some distance to go: this story has three sentences quoted verbatim from a press release, put out by Aiken's university. So far, there is no way that MOSS or any other program can identify "lifts" of this sort, for which many working journalists (who usually lift far more than three sentences from any given press release) will give hearty thanks.
#
"Tuberculosis in the news",350,0,0,0
(Nov '97)
A curious study on tuberculosis, reported in the \IProceedings of the National Academy of Sciences\i during November, sheds some interesting light on the human immune system. More than half of the Yanomami people of \JBrazil\j who had been vaccinated against TB do not produce a regular immune response to the tuberculosis bacterium. This seems to suggest that because previous generations of Yanomami had never been exposed to TB, the population has never been under selection for mounting a strong immune response to it.
Europeans, on the other hand, lived with tuberculosis for centuries, so that there was strong selection for resistance, in some form or another, to TB. The Yanomami, isolated from the outside world until the 1920s, are in a situation similar to that of Europeans when they were first exposed to the disease. The disease was first noted among the Yanomami in 1992, having come with gold miners in the 1970s and 1980s. By then, 6% of the population had TB, and even though they had been vaccinated in 1989, 58% of the Indians had a weakened or nonexistent immune reaction in skin tests that measure cell response to the tuberculosis bacterium.
In other words, a majority had poor defences, and only 29% had \Jantibodies\j that could activate T cells - a necessary part of a strong immune response. The Brazilian government closed the Yanomami Reservation to outsiders from 1993 until last year, but now it is open again, and doctors who are helping the Yanomami are also recording the progress of the disease and the victims' responses, in the hope of knowing more about how humans are affected by new epidemics.
In more experienced populations, where drug-resistant TB is on the rise, a major problem is still the delay in diagnosing TB and in obtaining drug susceptibility results which allow physicians to provide the most suitable treatment. The normal procedure is to obtain a sputum specimen, culture it, identify the \Jbacteria\j present, and test them for drug susceptibility.
Most laboratories still use slow methods, even though the use of rapid diagnostic methods for \IMycobacterium tuberculosis\i, a combination of solid medium and radiometric broth cultures, nucleic acid probes for identification, and radiometric broth drug-susceptibility testing, substantially decreases the time to diagnosis.
If the rapid methods were used, says a November Harvard (USA) School of Public Health report, the time to diagnosis would drop from an average 38.5 days to 23 days, while the time to appropriate therapy would drop from an average of 6.6 days to 2.0 days, and mortality would be 22 to 33% lower, in addition to a decrease in average health care costs per patient of 9 to 22% in the USA.
#
"Wine good for the heart?",351,0,0,0
(Nov '97)
"Drink no longer \Jwater\j, but use a little wine for thy \Jstomach\j's sake . . .", said St Paul in his first epistle to Timothy. He goes on to add that wine will also treat Timothy's "often infirmities". Now a report in mid-November to the American Heart Association says that wine is good for the heart. Could this have been Timothy's "often infirmity"?
Whether it was or not, this hardly detracts from the proposal that drinking moderate amounts of alcohol - about one drink a \Jday\j - cuts the risk of a deadly heart attack in men who have already had one heart attack or stroke. There was a clear reduction in the risk of death when light to moderate drinkers were compared with total abstainers, according to Michael Gaziano, M.D., director of cardiovascular \Jepidemiology\j at \JBoston\j's Brigham and Women's \JHospital\j, leader of the study.
The study investigated 4797 male physicians who had suffered a previous heart attack and 953 who had experienced a stroke. Men who drank one or two alcoholic drinks reduced their risk of premature death and the risk of a fatal heart attack by 20 to 30 percent. These men were drawn from among the more than 90,000 doctors who filled out a questionnaire to enter the Physicians' Health Study, a trial designed in part to learn if low doses of aspirin reduce the risk of a first heart attack. The physicians in Gaziano's study were rejected for the aspirin trial because of their previous heart and stroke problems.
The study only examined men, so women will have to wait for further investigation. Now doctors will face a real dilemma: whether to encourage their patients to drink, knowing that uncontrolled and heavy drinking can lead to avoidable deaths, or whether to advise them to shy away from alcohol altogether. Gaziano believes all physicians should discuss alcohol intake with their patients first and identify and counsel problem drinkers. However, for patients with previous heart attack or stroke who are light to moderate drinkers, this behaviour appears to be safe and may confer modest benefits, he said.
And the real crunch: it may even be that drinking two or more drinks a \Jday\j is even more effective - so few of the doctors in the study fell into this category that nobody can really tell, one way or the other.
#
"Ethanol increases toxin levels",352,0,0,0
(Nov '97)
If alcohol is good for the heart, it seems that it is more of a problem for the lungs. Used as an \Jautomobile\j fuel additive to make gasohol, \Jethanol\j improves air quality by reducing hydrocarbon and carbon monoxide emissions, but it also causes increased levels of toxins called aldehydes and peroxyacyl nitrates (PAN), according to a report from the U.S. Department of \JEnergy\j's Argonne National Laboratory, published in \IEnvironmental Science & Technology\i during November.
Aldehydes are much more reactive in the \Jatmosphere\j than the alcohols they are made from. They react with other chemicals in urban atmospheres to set off \Jchemical reaction\js leading to PAN, and once created, they can last for days if the conditions are right.
PAN is highly toxic to plants and is a powerful eye irritant. This pollutant has been measured in many areas of the world, indicating that it can be carried by winds throughout the globe. The study, carried out in Albuquerque, was particularly appropriate, as federal regulations require cars there to use \Jethanol\j-\Jgasoline\j fuel blends, and ban wood-burning, in order to maintain air quality during the winter months. More than 99 per cent of the vehicles in the area use blended fuels containing 10 per cent alcohol in the winter, while blended fuel use declines substantially during the summer.
#
"Digital x-rays cheaper",353,0,0,0
(Nov '97)
Just as ordinary photography is being revolutionised by digital cameras, now x-ray technology looks set to be changed forever by digital methods. This method has the potential to replace the current film x-ray technology, while reducing health care costs and improving patient care.
Around 70% of all x-rays now are stored on film, but this is an indirect method, because fluorescent materials must first absorb the x-ray \Jenergy\j and convert it into light during the exposure process. Then the light must be converted to electronic signals. During this second step, the emitted light scatters and can reduce the sharpness of the image.
With digital radiography technology, x-ray \Jenergy\j is captured and converted into electronic signals that form a precise digital image on a video screen. These images can be duplicated and transmitted electronically with no loss of quality, making it easier to consult with distant experts via the \JInternet\j. There is also software available that would allow a radiologist to focus on or enhance a specific area of interest on the digital x-ray.
The system has been under development for some time, and was unveiled at the end of November at the 83rd Scientific Assembly and Annual meeting of the Radiological Society of \JNorth America\j meeting (RSNA) in \JChicago\j, where two papers on the technique were read by Gary S. Shaber, M.D., research professor of Radiology, Jefferson Medical College, Philadelphia.
The x-rays can also be stored in the computer for easy and quick access by physicians, and the digital x-ray technology would also allow \Jhospital\j radiology departments to see more patients and cut down on repeat examinations. In the longer term, this sort of system could even allow more rapid reporting to the primary care physician who referred the patient for x-rays. And if that isn't enough, digital technology also offers a significantly lower dose of radiation than conventional imaging.
#
"Anticancer drugs making liver transplants stick",354,0,0,0
(Nov '97)
Liver transplants are extremely difficult, as the organs will fail if they are not transplanted within 24 hours. Now a lucky observation has led to the discovery that a powerful anticancer drug, interferon, can double the lifetime of donor livers. This could greatly improve the supply of donor livers and relieve some of the pressure on the complicated transplant procedure.
The main problem is preserving cells called hepatocytes. This is done best by immersing fresh donor livers in a chilled solution of electrolytes and sugars. The hepatocytes \Jpump\j out something like 5000 different proteins which are involved in everything from blood clotting to detoxification. At the cold temperatures usually used, the liver's sinusoidal cells become bloated and cover the blood vessels. Then after transplantation, these bloated cells detach from the hepatocytes and die, weakening the blood vessels and starving the liver of blood.
When a Duke University team began looking at this problem in 1995, one of the team members had a relative who had just been diagnosed with breast cancer, leading the unnamed researcher to brush up on the subject. What the researcher saw was a "striking similarity" between the earliest stages of \Jtumour\j development and the injuries to refrigerated livers. A report in the November issue of \IGastroenterology\i says that the key observation was that liver sinusoidal cells were bloated just like their counterparts, endothelial cells, when blood vessels grow into a \Jtumour\j, a process called angiogenesis.
Tests on rats suggest that a single dose of interferon, given a few hours before the liver is removed, could make donor livers last perhaps 48 hours, at least in rats: human results will still need to be obtained.
#
"Growing smarter cotton",355,0,0,0
(Nov '97)
A genetically altered form of cotton, highly resistant to the \Jherbicide\j glyphosate, is now growing well in \JFlorida\j, according to Raymond Gallaher, agronomy professor at the University of \JFlorida\j Institute of Food and Agricultural Sciences.
Typical \Jherbicide\j spraying of unaltered cotton may need up to five applications. The savings from fewer sprayings - to both the environment and the farmers' bank balances - are significant. Cotton and weeds cannot co-exist, so the weeds must go, in order to save the cotton plants from competition, but the problem has always been to do this as cheaply as possible.
The resistance gene was first identified in 1984, and it was incorporated into soy beans in 1989, with the first field trials of the "gene beans" in 1992. The same gene is expected to be field tested in corn during the 1998 growing season.
#
"Frogs deformed by chemicals?",356,0,0,0
(Nov '97)
The frogs of \JFlorida\j and other parts of the United States may welcome the introduction of more resistant corn. Frogs in an Ohio pond have been turning up with too many legs, and while nobody is sure of the reasons why, one possibility is run-off from a neighbouring cornfield.
A report released in mid-November considers herbicides, a naturally occurring parasite, and other possible causes, indicating that it will be 1998 at the earliest before the cause is known with any certainty. All over the world, frogs are dying out, and it is just possible that the deformities may be related in some way.
The expected background level for deformities is about 1%, but on this site, the level has reached 5%. The deformities have been reported as far back as the 1700s, but reports have increased dramatically in number since 1995. The largest deformed frog populations have been found in \JWisconsin\j, \JMinnesota\j and parts of Canada, where anywhere between 10 and 75 percent of specific species of frogs have deformities.
The problem may have trematodes as the direct cause. Best known to humans as the cause of bilharzia, these can alter limb development by burrowing into the limb buds of tadpoles, though it is improbable that trematodes are the only cause, especially as some of the problems are facial deformities, less likely to be caused by trematodes. In the pictured example, an extra leg is growing from the frog's \Jsternum\j, or breast-bone, and this is most certainly not a trematode problem.
#
"Jumping gene",357,0,0,0
(Nov '97)
Jumping genes were discovered by Nobel Laureate Barbara McClintock in the 1940s, and recognised properly only in 1983, when she received her Nobel Prize. McClintock's genes can "jump" from one \Jchromosome\j to another. These so-called transposable elements, or transposons, squeeze themselves into host chromosomes.
In simple terms, a transposon is just a gene for an \Jenzyme\j called a transposase. The transposase has just one mission: to cut its own gene free from a \Jchromosome\j and insert it into another \Jchromosome\j or at another site on the same \Jchromosome\j. The transposase recognises its own gene by the special stretches of DNA, the "recognition sequences", on either side of the gene. The transposase cuts the gene, recognition sequences and all, and moves the whole package to a new location.
McClintock found her jumping genes in the \Jmaize\j \Jgenome\j, and the genes have close counterparts in fruit flies. In 1994, the first vertebrate transposons were discovered in that popular genetic animal, the zebra fish, but the genes were inactive and appeared to have been silenced millions of years ago, given the number of mutations that appeared to have occurred since then.
Perry Hackett, together with Zoltan Ivics, Zsuzsanna Izsvak, and Ronald Plasterk, examined the salmon \Jgenome\j, looking for signs of transposons which might have been active more recently. They found recurring base sequences in salmon which suggested that the ancestors of these fish had possessed transposons, but that both the transposase genes and their recognition sequences had mutated to the point where they were no longer functional. From the extent of the mutations, they estimated that it was some 15 million years since the salmon "jumping genes" last leapt.
After searching for inactive transposons and eliminating mutations in the genes, they patched together a sequence, based on observations and knowledge of such sequences in other organisms. The work was written up in \ICell\i during November. Their new construction, a newly-revived gene which they have called "Sleeping Beauty," can slip into chromosomes, and a small test gene spliced into the transposon was also imported into the DNA of both zebra fish and human cells. When they combined "Sleeping Beauty" with lipids to deliver genes to cells, the transposon showed a 20-fold increase in efficiency in getting genes into chromosomes.
This work could overcome a stumbling block for gene therapy by providing a way of ensuring that a gene is actually inserted into a target cell's \Jchromosome\j, where it will order the production of useful proteins. There are still problems, though. "Sleeping Beauty" inserts itself almost randomly into the host cell's chromosomes, and so could land in the middle of an important or essential gene, causing a harmful \Jmutation\j. It will also have to be adapted to carry genes large enough to code for many therapeutic proteins.
Up until now, the only way to get material into cells was to use viruses, or to inject DNA into cells directly. The viruses are often destroyed by the host's immune system, and the injection method gives patchy results at best. In any case, the viruses used most often can only get into the cell nucleus when the cell is getting ready to divide.
The fact that Sleeping Beauty can carry genes into both zebra fish and humans suggests that it will work on most vertebrates, making the gene a potentially exciting tool to work with. With further development, Sleeping Beauty might be used to transport normal genes into cells containing defective forms of the genes that cause conditions such as haemophilia or cancer.
The researchers are also considering using the transposon as a new and better way to create mutants to study embryonic development, since the gene inactivated by having "Sleeping Beauty" poked into its middle can also be "tagged" using a short DNA sequence inserted by the transposon. So experimenters can throw tagged transposons at a developing embryo, check to see which gene is not working, and then use the position of the tag to find where on the \Jchromosome\j that gene lies.
#
"How plague kills macrophages",358,0,0,0
(Nov '97)
In early November, Olaf Schneewind and colleagues at the University of \JCalifornia\j School of Medicine in Los Angeles announced to the world through the pages of \IScience\i that they had found the signal that the plague bacterium, \IYersinia pestis\i, uses to release a powerful toxin that attacks and destroys the macrophages which are the front line of our immune defences.
Under attack, the \Jbacteria\j inject a toxin that rapidly paralyses the macrophages. The toxin proteins, called \IYersinia\i outer proteins (Yops), had no apparent amino acid signal calling for secretion. A possible gene was found, but even when this was drastically mutated, the toxin proteins were still secreted. It seems that the bacterium uses a novel signal, a piece of messenger RNA (mRNA), the template for assembling the protein in question.
So far so good: we now know what switches on the reaction. Now scientists can start to look for novel ways of combatting the bacterium, simply by attacking this pathway. More interestingly, it may be possible to use these \Jbacteria\j as miniature factories for injecting proteins directly into human cells. In the long term, bubonic plague may be rather more welcome than it was when we called it the \JBlack Death\j.
#
"Phosphate problems ahead",359,0,0,0
(Nov '97)
Plants rely on several key \Jnutrients\j: \Jpotassium\j, \Jnitrogen\j and \Jphosphorus\j being the main ones. There is no shortage of \Jpotassium\j, and \Jnitrogen\j can be added to the soil from "fixed" atmospheric \Jnitrogen\j, but \Jphosphorus\j is likely to be a problem at the end of the 21st century, when the available rock phosphate deposits run out.
The farmers of next century are going to need to guard their \Jphosphorus\j reserves jealously, and that means we need to know more about how \Jphosphorus\j is taken up by plants. The first steps in working this out were announced during November in the \IProceedings of the National Academy of Sciences\i. K. G. Raghothama, Purdue assistant professor of \Jhorticulture\j, describes in that journal how he has isolated the genes that help plant roots take up phosphate, a common form of \Jphosphorus\j.
The problem of phosphate extraction is made worse by the way in which soils hold on to \Jphosphorus\j. This is especially severe in the very acid soils of the tropics, which are rich in iron and \Jaluminium\j; both elements latch onto and tie up nearly all available \Jphosphorus\j. And in some alkaline soils, \Jcalcium\j reacts with the \Jphosphorus\j and essentially fixes it, although the hold is less tight than in acid soils.
Plants have a variety of strategies to get at \Jphosphorus\j. Some plants develop more roots, while others may produce and release organic acids and enzymes that can pry the nutrient away from the attraction of the soil clay and organic matter. In others again, the plants flip a genetic switch that changes certain molecules in roots and makes plants better at acquiring phosphate.
Studies with yeast and fungi have already identified protein molecules called "phosphate transporters" which actively take up phosphate, and the responsible genes are also known. So Raghothama, in collaboration with Jose Pardo from Instituto de Recursos Naturales y Agrobiologia in \JSpain\j, set out to find the mechanism which makes plants better at \Jphosphorus\j uptake, then track down the genes that turn on that mechanism.
Their aim was to identify phosphate transporter genes in higher plants. They did this by studying \IArabidopsis\i plants: this is a member of the mustard family, and a popular experimental plant. Raghothama and postdoctoral researcher U. Muchhal "starved" their \IArabidopsis\i plants to activate any phosphate-scavenging mechanisms, and then probed the DNA libraries of the starved plants for genes that produce phosphate transporter proteins.
They found the genes and identified them, and also noted that the phosphate-starved plants sent out significantly more messages calling for production of phosphate transporter proteins. In this way, they have advanced our understanding of the way in which \Jphosphorus\j is taken up by plants. It is quite likely that their names will be revered by the farmers of the late 21st century.
#
"Moon a part of earth?",360,0,0,0
(Nov '97)
Two hundred years ago, the only sort of story we could read around rocks was that engraved on the surfaces by other humans, but even that had to wait on people learning to decipher \Jrunes\j and \Jhieroglyphics\j. Now we can read the stories written inside the rocks just as well, almost all the way back to the \Jearth\j's formation, and certainly back to the moon's formation. All we had to do was learn how to decipher the tale that was waiting there for us.
\JTungsten\j \Jisotopes\j have shown up as the key item in dating the moon and working out where it came from. Writing in \IScience\i, early in November, a group of researchers explained how they analysed \Jisotopes\j of \Jtungsten\j in rock samples from the lunar surface to unlock the secrets of the moon's origin.
"Our data indicate the moon formed within the time window of 4.52 billion to 4.50 billion years ago. The \Jtungsten\j isotopic composition of the moon is consistent with the hypothesis that the moon was derived from the \JEarth\j itself, or from a large object colliding with the \JEarth\j which had a similar chemical composition," said Alexander Halliday, one of the authors, in a press release.
As they see it, using simulations, the collision produced temperatures of 10 000 kelvin, mixing and melting the rocks of the young \JEarth\j. The researchers dated 21 lunar samples, studying very small amounts of \Jtungsten\j-182, formed from the decay of hafnium-182 in the rocks.
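For readers who like to see the arithmetic behind this kind of "extinct radionuclide" dating, a minimal sketch follows. It assumes a hafnium-182 half-life of about 9 million years, a standard figure which the report itself does not quote; the point is that the parent isotope is effectively gone within a hundred million years, so any excess tungsten-182 must have been locked into lunar rock very early, consistent with the 4.52 to 4.50 billion year window.
    # A minimal sketch of extinct-radionuclide arithmetic, assuming a
    # hafnium-182 half-life of about 9 million years (a textbook value,
    # not taken from the report itself).
    import math

    HALF_LIFE_MYR = 9.0                       # hafnium-182 half-life, millions of years
    DECAY_CONST = math.log(2) / HALF_LIFE_MYR

    def hf182_fraction_remaining(t_myr):
        """Fraction of the original hafnium-182 left after t_myr million years."""
        return math.exp(-DECAY_CONST * t_myr)

    # After ~60 million years almost no hafnium-182 remains, so excess
    # tungsten-182 points to events in the solar system's first tens of
    # millions of years.
    for t in (10, 30, 60, 100):
        print(f"after {t:3d} Myr: {hf182_fraction_remaining(t):.6f} of the hafnium-182 left")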
#
"Mars wet and liquid because of dry ice",361,0,0,0
(Nov '97)
Nothing we see in daily life seems as cold as dry ice, or as dry as dry ice. The name, once a trade mark for commercially supplied frozen carbon dioxide, is now our standard everyday name for that product, but on Mars, it seems as though the substance may have been responsible for keeping the Red \JPlanet\j both warm and wet.
University of \JChicago\j Professor of Geophysical Sciences Raymond Pierrehumbert and his colleague François Forget, from the Laboratoire de Météorologie Dynamique du CNRS in Paris, believe they can explain why there are deep channels to be seen in surface photographs from Mars, created by surface \Jwater\j on a young Mars, some four billion years ago. The answer, they say in a November article in \IScience\i, is reflective carbon-dioxide ice clouds that retain thermal radiation near the \Jplanet\j's surface, clouds made of dry ice.
People have tried atmospheric models involving CO\D2\d before, but if you have enough of the gas to warm the \Jplanet\j, there is enough carbon dioxide present for it to condense out, and this should have produced thick clouds that would reflect sunlight back to space and actually cool the \Jplanet\j. The carbon dioxide ice clouds, unlike the \Jwater\j ice clouds found on \JEarth\j, are made up of particles which are large enough to scatter infrared light more effectively than visible light coming from the \Jsun\j.
Ordinary, \JEarth\j-type clouds would absorb heat from the \Jplanet\j's surface and re-emit it both back to the surface and to outer space, losing half of the heat in the process. The carbon dioxide clouds act like a one-way mirror, so while only a small amount of sunlight gets through to the \Jplanet\j's surface, what does reach the \Jplanet\j is converted to heat, which the clouds then reflect back to the surface, according to Pierrehumbert.
Working from this, Pierrehumbert suggests that this may tell us what sorts of life forms are possible on Mars. "If we're going to be looking for analogues of terrestrial life forms on Mars," he said, "then we should be looking for the kinds of organisms that might evolve in extreme environments, like the bottoms of oceans or in caves. The conditions on early Mars were a little more like the conditions at the bottom of the ocean than like a \Jrainforest\j. It would have been dark, warm enough for liquid \Jwater\j, but without a large \Jenergy\j source for \Jphotosynthesis\j."
The researchers' model extends the habitable zone for extrasolar planets and increases the likelihood that life exists outside our \J\Jsolar system\j\j. Previously, scientists thought that only planets orbiting within 1.37 astronomical units (one AU is the distance between \JEarth\j and the \JSun\j) of a \Jsun\j-type star could have \Jwater\j above the freezing point. But if the planets have carbon-dioxide ice clouds, they could have liquid \Jwater\j as far away as 2.4 AU. Mars is 1.52 AU from the \JSun\j.
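The arithmetic behind those distances is simple inverse-square reasoning: the sunlight falling on each square metre drops with the square of the distance from the star, so a planet at 2.4 AU must make do with about a sixth of the Earth's sunshine, which is why the heat-trapping clouds matter so much. A minimal sketch, using the distances quoted above:
    # Relative sunlight at the distances quoted in the text.
    def relative_flux(d_au):
        """Sunlight intensity at d_au astronomical units, relative to Earth's."""
        return 1.0 / d_au ** 2

    for d in (1.0, 1.37, 1.52, 2.4):   # Earth, old limit, Mars, new limit
        print(f"{d:4.2f} AU: {relative_flux(d):5.1%} of Earth's sunlight")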
Despite his name, Pierrehumbert is not a strong speaker of French, so they wrote in English, and the question of how to express some of the terms they use in language acceptable to the Académie Française did not arise. It was, however, something of a problem for Forget, who recently had to translate "runaway \J\Jgreenhouse effect\j\j" into French. A check of the literature showed that "\J\Jgreenhouse effect\j\j" translates to a straightforward "effet-serre," though there was no clear translation for the "runaway" part. So, Forget had to coin a term himself, and wound up writing about "effet-serre gallopant." "Dry ice", says Pierrehumbert, is easy: it is just "glace carbonique".
#
"Einstein's frame dragging",362,0,0,0
(Nov '97)
Every science eccentric likes to target Einstein, explaining confidentially that he (it is usually a "he") has been able to show that Einstein got it wrong. Somehow, these "proofs" never seem to impress mainstream scientists, who are more interested in the evidence gained from observations.
Real scientists are especially interested in observations which bear out Einstein's views, because so much of our modern science seems to be perfectly explained by Einstein's ideas. And at least the observations serve to keep the "crackpots" at bay.
Einstein predicted an effect, called "frame dragging," 80 years ago. Like many other aspects of Einstein's famous theories of relativity, it is so subtle that no conventional method could measure it. In simple terms, frame dragging means that space and time get pulled out of shape near a rotating body. In an extreme case, a rotating Tipler machine may even interfere with the causality principle through frame dragging.
Using recent observations by x-ray \Jastronomy\j satellites, including NASA's Rossi x-ray Timing Explorer, a team of astronomers reported in November that they had found evidence of frame dragging in discs of gas swirling around a black hole.
Drs Wei Cui, Nan Zhang, and Wan Chen began with Einstein's prediction that the rotation of an object would alter space and time, dragging a nearby object out of position compared to predictions from the simpler mathematics of Sir Isaac Newton. This effect had not been observed in the eighty years since Einstein predicted it. In this, it was unlike the other, more familiar Einsteinian predictions, such as the conversion of mass into \Jenergy\j (as seen in atomic bombs and stars) and back, the Lorentz transformations that make objects near the speed of light grow thinner and heavier and stretch time, and the warping of space by gravity (as seen when light is bent by a massive object).
And no wonder: the effect is incredibly small, about one part in a few trillion, which means that you have to look at something very massive, or build an instrument that is incredibly sensitive and put it in \Jorbit\j. Cui, Zhang, and Chen took the first option, and studied radiation coming from around black holes in binaries with other visible stars. Over time, the black hole strips material from the star, producing an \Jaccretion\j disc of material which gets hotter as it approaches the event horizon of the black hole, and gives off radio waves, visible light, and, just before it disappears, x-rays.
They found that the discs precessed, wobbling like a child's toy top. By studying the radiation from superluminal jets in two black holes, called GRS 1915+105 and GRO J1655-40, they found that the rate of precession was far greater than could be explained by the sorts of effects seen in children's toys. Conclusion: Einstein's frame-dragging is real.
The sensitive instrument option will follow next: NASA is developing it as Gravity Probe B, described in the entry on \J\Jgeneral relativity\j\j. This is a \Jsatellite\j containing precision gyroscopes inside a liquid \Jhelium\j bath. Gravity Probe B will point at a selected star, and sensitive instruments will measure how much the gyros precess after conventional effects are nullified. The leftover effects should provide a precise measure of frame dragging. Because the Rossi \Jsatellite\j observations are somewhat uncontrolled, the final proof of frame dragging will come when Gravity Probe B points at a known star of known mass, and turns in consistent results.
#
"Einstein cleared of plagiarism",363,0,0,0
(Nov '97)
Meanwhile, Einstein was cleared of a charge of plagiarism during November. Many people have long believed that the mathematician David Hilbert completed the theory of General Relativity five days before Albert Einstein in November 1915, but it now seems unlikely that Einstein copied the correct field equations of General Relativity from Hilbert, even though his paper is dated five days after Hilbert's.
Proofs of Hilbert's key paper (dated November 20, 1915) have been found which themselves carry the date December 6, 1915, after Einstein had completed his paper. These proofs contain only an immature version of General Relativity, without the explicit field equations. The equations must have been inserted later, after December 6 and before the published version appeared in 1916.
#
"Cost of El Nino",364,0,0,0
(Nov '97)
The importance of understanding the \Jweather\j comes home best when you are in a small country in central or \JSouth America\j, and relying on hydro-electric power. A new computerised forecasting model designed to predict the financial impact of El Niño \Jweather\j patterns could save millions of dollars in \Jenergy\j costs, according to its developers.
More than half the electric power for \JColombia\j, Panama and Costa Rica comes from hydroelectricity. During a normal year in this region of \JLatin America\j, the heavy rainy season from June through October feeds the reservoirs that supply the nations' power sources. During an El Niño year, however, the rains are reduced. Then the only other option is to import oil.
Even the USA gets 13% of its \Jenergy\j from hydro power, which translates to 500 million barrels of oil annually. A tiny 1% increase in US hydro power efficiency would mean a saving of some 4.9 million barrels of oil, worth around US$90 million.
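A quick check of those figures is easy to run; the oil price used below, about US$18 a barrel, is inferred from the quoted numbers rather than stated in the report, and the saving comes out at a round 5 million barrels, close to the 4.9 million quoted.
    # Checking the hydro-efficiency arithmetic quoted above.
    HYDRO_OIL_EQUIV = 500e6        # barrels of oil equivalent per year (from the text)
    EFFICIENCY_GAIN = 0.01         # the "tiny 1% increase"
    PRICE_PER_BARREL = 18.40       # US$, an assumption, not from the report

    barrels_saved = HYDRO_OIL_EQUIV * EFFICIENCY_GAIN
    print(f"barrels saved: {barrels_saved / 1e6:.1f} million")
    print(f"value: US${barrels_saved * PRICE_PER_BARREL / 1e6:.0f} million")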
With rainfall in Panama down by 39 percent compared to the yearly average, the government there has already tried to reduce electricity consumption and outflow of \Jwater\j from the reservoirs by 30 percent so they don't run out by January, 1998.
\BThe health cost of El Niño\b
The costs go beyond simple economic costs. Doctors in \JPeru\j blamed climate changes caused by El Niño for an increase in diarrhoea and \Jdehydration\j in infants and young children. Unusually high temperatures in Lima and along the north-central coast have corresponded with the rise in childhood illness and doctors are preparing for even worse to come.
In a letter to the \ILancet\i, they pointed to the coming southern summer as a problem time, and suggested that the changing \Jweather\j patterns will burden financially strapped health services and lead to an outbreak of \Jcholera\j. This is because higher temperatures support blooms of plankton which provide a perfect breeding ground for the \IVibrio\i \Jbacteria\j which cause the disease.
#
"Global warming: could we lose the conveyor?",365,0,0,0
(Nov '97)
A great cost of global warming might come about if we lost a giant ocean current called the conveyor. This system, which among other things drives the Gulf Stream, is all that prevents Ireland from having a climate like Spitsbergen (Svalbard), 600 miles north of the \JArctic\j Circle.
This Doomsday scenario is no mere piece of \J\Jscience fiction\j\j. Worked out by Wallace S. Broecker, Newberry Professor of \JEarth\j and Environmental Sciences at Columbia University's Lamont-Doherty \JEarth\j Observatory, it was published in the journal \IScience\i at the end of November. In something of an understatement, Broecker describes the results as devastating, saying that Dublin could encounter a fall of 20°F (about 11°C) in just ten years, even as the rest of the world was heating up.
The Conveyor is delicately balanced and vulnerable, and it has shut down or changed direction many times in \JEarth\j's history, according to Broecker. Each change has produced massive climatic variation in a matter of decades, causing large-scale wind shifts, fluctuations in atmospheric dust levels, glacial advances or retreats, and other changes as the Conveyor jumps from one stable mode to another.
So while the current warming from the enhanced \J\Jgreenhouse effect\j\j may be a slow one, it may well be all that is required to take us "over the hump", and into a new climatic regime. Right now, the system is driven by cold salty \Jwater\j sinking to the bottom of the North \JAtlantic Ocean\j. This then pushes waters through the world's oceans, a flow 16 times greater than the flow of all the world's rivers combined.
The waters of the equatorial Indian Ocean are too warm to sink, while the north Pacific is too diluted by the snow and rains of the western United States and Canada. If the north Atlantic warms by just a few degrees, or if it gets a bit more rain, the whole flow could stop, and once stopped, who can say if it will start again?
Ice core evidence tells us that when a climatic change comes, it happens over a short period, geologically speaking, with just a few decades of transition. Broecker believes that the Conveyor is the key factor that we need to watch and worry about. And who is Broecker? Just one of the world's leading authorities on global climate change. He has won nearly every major geological award, including the Vetlesen Prize, considered by many to be the equivalent of the Nobel Prize in \Jearth\j sciences. Last year he was awarded the National Medal of Science and the Blue \JPlanet\j Prize, for achievements in global environmental research. When people like Broecker are worried, it is time to be worried too.
Coincidentally, evidence appeared in \INature\i just two weeks before that, giving us the same message. A 53-metre-long sediment core, retrieved from the \JBermuda\j Rise in the western North Atlantic, now gives us the most detailed picture yet of events during the previous interglacial. The key feature: "its termination seems to have been marked by a sudden reduction in the ocean 'conveyor' circulation which today carries ocean heat north from the tropics and warms much of Europe." In this case, the blip took less than four hundred years to throw the world back into a severe \JIce Age\j.
#
"Earth's 1500-year rhythm",366,0,0,0
(Nov '97)
Another sea core finding in November, reported in the November 14 issue of the journal \IScience\i, indicates that the \JEarth\j's climate cools significantly and abruptly every 1,500 years or so in a persistent, regular rhythm. Or so a team led by scientists at Columbia University's Lamont-Doherty \JEarth\j Observatory would have us believe.
As this is written in early December, more information is coming in to support this claim, which will be dealt with more fully next month. The main points: the cycle has continued uninterrupted for 32 thousand years, with a period of 1470 years, plus or minus 500 years. The last such cycle may have taken place 300 years ago, and the cycle appears to persist when the \Jplanet\j is in the grip of an \JIce Age\j or when it is in an interglacial as we are now.
The finding throws new light on historical events, such as the Little \JIce Age\j, a cold spell that gripped the world in the 17th and 18th centuries and might prove to be the most recent manifestation of the phenomenon. The evidence for their theory is in the form of tiny particles of rock and volcanic glass, carried by glacial icebergs and \J\Jsea ice\j\j to the North Atlantic, deposited on the sea floor and buried by subsequent sediments. In the cold snaps, the number of particles doubled or tripled in ocean sediments on both sides of the north Atlantic, indicating that the amount of floating ice had both increased and extended further south. This is borne out by the study of the abundance of cold-\Jwater\j-loving plants, which increased, and the amount of warmer-\Jwater\j plankton, which decreased in the same 1500-year cycle.
In each cycle, cold, ice-bearing waters, which today circulate around southern \JGreenland\j, pushed as far south as \JGreat Britain\j. The cold waters penetrated a warm North Atlantic current that prevails today, and may have disrupted the global ocean circulation pattern that keeps the North Atlantic region warm. The ocean circulation disruption may well have had far-flung, world-wide effects, say the researchers.
For the moment, the findings only go back 32 thousand years, as this is the effective limit of \J\Jradiocarbon dating\j\j in the sea-floor sediments. The next step will be to see if the cycles persisted even before the last \J\Jice age\j\j began, as far back as the Eemian Period, more than 115 thousand years ago, when the \JEarth\j was relatively warm, as it is today.
#
"More Arctic change",367,0,0,0
(Nov '97)
It seems that the \JArctic\j is quite capable of changing its climate, even without human effects. A study published in \IScience\i during November reveals that the \JArctic\j experienced its highest temperatures in 400 years between the mid-19th and mid-20th centuries.
A team of 18 North American researchers used a whole range of palaeoenvironmental data, including information from glaciers, tree rings and marine, lake and pond sediments. As an example, one researcher focused on diatoms, a type of \Jalgae\j known to respond in a measurable way to environmental change, in \JArctic\j ponds. In this way, she was able to reconstruct past environmental conditions using the diatom assemblages that are preserved in lake and pond sediments. The data from the various sources were all consistent in the story they told.
#
"Ancient eruptions caused global warming?",368,0,0,0
(Nov '97)
There is some evidence to suggest that multiple massive volcanic eruptions occurred roughly 55 million years ago in the Caribbean Basin. The evidence, based on drill cores from the sea floor south of \JHaiti\j, points to an abrupt inversion of ocean waters, triggering one of the most dramatic climatic changes ever. It was published in the November issue of the journal \IGeology\i. It covers cores taken by the 470-foot \IJOIDES Resolution\i, the world's largest scientific drill ship, in late 1995 and early 1996.
The inversion would have caused release of massive amounts of sea floor \Jmethane\j into the \Jatmosphere\j, presumably leading to global warming and possibly speeding the \Jevolution\j of countless new plant and animal species, including many primates and carnivores. At the same time, close to half of all deep-sea animals went extinct, asphyxiated in the suddenly warmer and stagnant deep waters.
#
"Carbon dioxide emission league",369,0,0,0
(Nov '97)
Who beats whom in the carbon dioxide emission leagues? Here is a table which will let you compare absolute production levels with per capita outputs. Notice how \JAustralia\j, singled out for special leniency at \JKyoto\j in December, scores.
On a per capita basis, the United States retains first place, but \JAustralia\j leaps up the ladder to take second place, just ahead of Canada in third place, followed by North Korea, \JKazakhstan\j, \JRussia\j, \JGermany\j and \JJapan\j, with the rest of the world trailing behind.
#
"Filtering Spam",370,0,0,0
(Nov '97)
Spam, apart from being a nutritious meat product invented by George A. Hormel, and a long-running Monty Python joke, is the nuisance mail which arrives as e-mail. As more and more losers try to make a profit by scattering their get-rich-quick schemes and unsolicited offers of unwanted services across the \JInternet\j, so e-mail users around the world become more and more annoyed.
There are a number of ways of dealing with Spam mail. You can try writing back to the sender, but they have usually lost their account privileges by the time you react, so your mail will simply bounce back to you. Or if the account has not been closed, it may never have existed: spammers are quite good at "forging" the origin address, at least on the surface. Usually a determined person can uncover the real address that the Spam came from, and then write to the service provider, but by then the damage is done, as most spammers need only to trawl their announcement, with a phone number, past as many users as possible. After that, the \JInternet\j account is of no further use, and it is abandoned.
There are also more complex approaches, involving single-minded pursuit of the spamming enemy. The information you need can be found easily on the \JInternet\j. The Sendmail Anti-spam site is at http://www.sendmail.org/antispam.html, and this provides a number of links, one of the best being http://spam.abuse.net/
Spam Hater is free Windows software which works with many mail readers: it analyses the Spam, and does most of the heavy work of preparing a letter of complaint to the relevant Postmaster. Downloads of Spam Hater can be obtained from http://www.compulink.co.uk/~net-services/spam/
Useful advice is available at http://kryten.eng.monash.edu.au/gspam.html. On the other hand, if you want a scholarly (warning: that means "heavyweight") paper on Spam, turn to http://server.berkeley.edu/BTLJ/articles/11-2/carroll.html but be warned that this single Web page is more than 170k long, and contains more than 200 footnotes. A check at the end of November revealed that the footnote numbers are still there, but the actual footnotes have gone.
As an alternative, you may prefer to turn to the new service that Lucent's Bell Laboratories are offering. People use Lucent's proxy server, the Lucent Personalized Web Assistant, to give each site a personal user alias, password, and e-mail address. Then, if that site sells your address, you will know who is doing it, and you will also be able to filter out the offending mail. This service, only announced during November, has not been widely reviewed yet, but it may be worth watching out for.
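The idea of per-site aliases is easy to prototype. The sketch below is not Lucent's actual system, whose internals have not been published; it simply derives a distinct, hard-to-guess address for each site (all the names in it are hypothetical), so that a leaked alias points straight back at the site that leaked it.
    # A minimal sketch of the per-site alias idea; NOT Lucent's system.
    import hashlib
    import hmac

    SECRET = b"my-private-key"      # hypothetical user secret
    MAILBOX = "alice"               # hypothetical mailbox name
    DOMAIN = "example.org"          # hypothetical mail domain

    def alias_for(site):
        """Return a per-site e-mail alias such as alice+f3a9c2@example.org."""
        tag = hmac.new(SECRET, site.encode(), hashlib.sha256).hexdigest()[:6]
        return f"{MAILBOX}+{tag}@{DOMAIN}"

    # Spam arriving at either alias identifies the site that sold the address.
    print(alias_for("shop.example.com"))
    print(alias_for("news.example.com"))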
#
"Future of security",371,0,0,0
(Nov '97)
Security staff in casinos around the world keep computer-stored mugshots of banned customers, with screen-savers cycling through the unwanted faces to make them more familiar to staff. "Mugspot", a computer program that can recognise a face in a crowd, may take some of the pressure off.
The system spots a person, tracks them, and then selects the nearest to a full-face shot, which is then passed on to the software. This sophisticated software is able to see through superficial changes such as moustaches, beards, changed hair-styles and spectacles.
The Mugspot system can scan eight video frames per second in real time, and takes about 13 seconds to select the best view, process it for identification, compare it to the several hundred faces in its memory and decide whether it has found a match. This will do to identify wanted suspects fleeing the law, but will it be fast enough to identify known bank robbers before they can produce their guns?
#
"Mars Express to go ahead",372,0,0,0
(Nov '97)
The European Space Agency (\JESA\j) has decided to go ahead with a fast-track plan to launch a new Mars mission, Mars Express, in mid-2003. The \Jspacecraft\j will include an orbiter and several landers, one of which will search for traces of past or present life on Mars. The \JESA\j will provide about $175 million for the orbiter and launch; participating countries will pay for the landers and instruments.
This plan is intended to regain the impetus that was lost when a launcher for a Russian-led mission carrying the Mars '96 \Jspacecraft\j exploded a year ago over the \JPacific Ocean\j. But in addition to instruments developed for Mars '96, Mars Express will carry a subsurface-penetrating radar that can look for signs of conditions that might support life, such as evidence of \Jwater\j.
#
"Laser surgery a step closer",373,0,0,0
(Nov '97)
The same computer models which predict how lasers pound fuel pellets to trigger nuclear fusion have now shown that high-power lasers can also destroy bone, and do it without endangering nearby nerves. Given this, it now seems possible that surgeons in the future could treat chronic back pain by vaporising the bone away from pinched nerves with lasers.
\JLaser\j fusion involves high-intensity \Jlaser\j pulses heating a tiny pellet of \Jhydrogen\j \Jisotopes\j to tens of millions of degrees, while squeezing it to high density, so as to spark a nuclear reaction. While thermonuclear fusion is not yet a way of generating useful \Jenergy\j, scientists have created detailed computer simulations of exactly how the \Jlaser\j light is absorbed in the fusion process.
Richard London of the Lawrence Livermore National Laboratory in \JCalifornia\j described the work at the November meeting of the American Physical Society's Division of Plasma Physics. The calculations suggested that a burst of very intense \Jlaser\j light, lasting less than one-trillionth of a second, would ionise atoms in the bone and turn it into a hot gas called a plasma. And since the plasma is an electrical conductor, the \Jlaser\j light only penetrates a short distance, removing just a micrometre of bone.
Researchers have already used a \Jtitanium\j-sapphire \Jlaser\j on pig bones from a slaughterhouse, and they have demonstrated that they can target just the bone. When the \Jlaser\j zaps bone, the \Jcalcium\j atoms in the bone tissue emit photons at a certain wavelength, so all they had to do was set the \Jlaser\j to shut itself off if those wavelengths disappear (the sort of effect that might happen if the \Jlaser\j struck a nerve, which is low in \Jcalcium\j).
#
"Chronic diseases and PTSD",374,0,0,0
(Nov '97)
US Vietnam veterans who saw heavy combat, and who were later diagnosed with post-traumatic stress disorder (PTSD), are significantly more likely than other veterans to suffer from a variety of chronic diseases 15 to 20 years later. Compared with a control group of soldiers who saw little combat in Vietnam and did not develop PTSD, these former soldiers are 50% to 150% more likely to develop \J\Jheart disease\j\j, weakened immune systems, infections, \Jarthritis\j, and respiratory and digestive problems, according to a report in \IPsychosomatic Medicine\i.
Author Joseph Boscarino, an epidemiologist and social psychologist, suggests that the problem arises from a high state of nervous arousal induced by the post-traumatic stress. Previous studies found less obvious signs of medical problems, since the medical effects only show up in analysis when the heavy-combat PTSD and light-combat groups are studied separately.
#
"Instant memory recall limit",375,0,0,0
(Nov '97)
Somewhere back in the 1950s, a psychologist called George Miller proposed "The Magical Number Seven, Plus or Minus Two" in the \IPsychological Review\i. This is now so lost in the mists of antiquity that most people take it as \Jfolklore\j, but it was Miller who originated the idea.
What he suggested was that people can keep track of approximately seven plus or minus two separate categories or situations at any one time. Wine judges who are very well-trained may be able to divide wines into nine quality groups, teachers may be able to assign pupils into seven distinct groups in terms of their skills, and so on.
Now it appears that for some tasks, we may only be able to keep track of four distinct categories at any one time. Steven Luck and Edward Vogel reported in \INature\i during November that humans can instantly recall the details of only four different objects. They say that a better understanding of this memory limit could have practical applications, such as improving the design of dashboard displays in cars or road signage.
Their work relates only to the subsystem of short-term memory that stores visual information, but it is often this that we rely on most, especially in situations like driving, where verbal memory is almost unused.
They used a simple test, flashing several squares, each one a single colour, onto a screen for one tenth of a second. After 1 second of darkness, they showed their subjects the squares again for 2 seconds, sometimes with one colour altered. As is traditional in \Jpsychology\j, the subjects were \Jpsychology\j students, ten of whom were repeatedly tested to see if they could recall if there had been a change.
With three or fewer squares, the students answered almost perfectly, but their success rate began to drop away when there were four or more squares. Next, the researchers made the task more difficult by asking the students to remember multiple features such as colour, orientation, size, or the presence or absence of a gap in rectangles. And once again, they noted a drop in accuracy when more than four rectangles were on the screen during the test.
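One standard way to turn such results into an estimate of capacity is a formula of the kind later popularised by Nelson Cowan: K = set size x (hit rate - false-alarm rate). This is not necessarily the analysis used in the paper itself, and the figures below are made up for illustration, but it shows how accuracy that falls away beyond four items translates into a capacity of about four objects.
    # Estimating visual working memory capacity K from change-detection
    # data.  The formula is Cowan's K; the data are invented.
    def capacity_k(set_size, hit_rate, false_alarm_rate):
        """Estimated number of items held in visual working memory."""
        return set_size * (hit_rate - false_alarm_rate)

    # Hypothetical results: near-perfect up to four squares, falling beyond.
    for n, hits, fas in [(2, 0.98, 0.02), (4, 0.95, 0.05), (8, 0.70, 0.20)]:
        print(f"set size {n}: K = {capacity_k(n, hits, fas):.1f}")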
Interestingly, they could remember a total of 16 features, as long as they were associated in only four objects: it looks as though some "chunking" was going on, of the sort we see when people remember words rather than collections of letters. There is almost certainly a message here for people who are designing things as diverse as road signage and \Jaircraft\j and \Jautomobile\j displays.
Or as the researchers put it, "the capacity of visual working memory must be understood in terms of integrated objects rather than individual features, which places significant constraints on cognitive and neurobiological models of the temporary storage of visual information".
#
"Herpes virus a trigger for MS",376,0,0,0
(Nov '97)
A strain of reactivated herpes virus may be associated with multiple sclerosis (MS), an autoimmune disorder in which the body attacks its own tissues. More than 70% of the American patients studied with the relapsing-remitting form of MS showed an increased immune response to human herpes virus-6 (HHV-6) and approximately 35% of all MS patients studied had detectable levels of active HHV-6 in their serum, according to a report announced in November, and appearing in December's issue of \INature Medicine\i. Only 18% of a control group showed an increased immune response to the virus.
The implication from this is that it may be possible to use currently available antiviral treatments like acyclovir to treat MS. Curiously, HHV-6 is found in about 90% of the US population, where it causes the common childhood illness, roseola. So now we need to know why such a common virus causes this major change at such a low level: the 250 million Americans include only about 350 thousand with MS.
Significantly, magnetic \Jresonance\j imaging detected numerous lesions in the \Jmyelin\j in the brain of a recently deceased MS patient, and an autopsy revealed HHV-6 in the lesions, but not in the adjoining normal tissues.
#
"Identifying habitats",377,0,0,0
(Nov '97)
We brought you last month a report on the new 840-square-mile (2000 square \Jkilometre\j) Masoala National Park, officially dedicated in October by the people of the \JMalagasy\j Republic (Madagascar). This month, a brief account of the methods used to identify the habitats which needed to be protected.
Claire Kremen, a research associate with Stanford's Center for Conservation \JBiology\j (CCB), described some of her methods early in November. She led the planning team that designed the park for an international consortium that includes the Madagascar government, the WCS, CARE International, the Peregrine Fund and the people of Masoala.
In consultation with local communities, Masoala's planners sought to preserve natural habitats while respecting the traditional boundaries of villages. A long-term management plan is being implemented for the park and its surrounding waters. To help villagers better their lives yet sustain the forest, work is under way to build markets for renewable resources, such as ecotourism, butterfly farming and the sale of individually cut trees to buyers of high-value "certified sustainable" wood.
Conservation agents, specially-trained members of the local population, visited scores of villages on the Masoala Peninsula to learn about the population and what villagers considered to be the local forest necessary for their use. A Global Positioning System receiver would then be used to determine the latitude and longitude of each village.
The two teams (biologists and village surveyors) began with topographic maps and \Jsatellite\j imagery provided by the Missouri Botanical Garden and the U.S. Geological Survey, to chart the rugged terrain and the vegetation of the 1500-square-mile (4000 square \Jkilometre\j) peninsula.
After the teams had established the "ground truth" of these maps, they were able to establish a scientific basis for choosing which parts of the peninsula to set aside as most crucial to support the peninsula's rich diversity of species, and how much forest to leave outside the preserve as a functional buffer and support zone in which to develop sustainable economic alternatives to forest destruction.
The \Jbiology\j team had to conduct \Jbiodiversity\j inventories of birds, mammals, selected insect taxa and one of the world's most diverse collections of palms. They assessed the potential influence on diversity of gradients in rainfall, elevation, soil type, distance along the peninsula and distance from the forest's edge into the forest.
In the end, butterflies showed the way. Kremen and her colleagues at the Natural History Museum in London had earlier studied a group of brown "wood nymph" butterflies that have differentiated into more than 60 different species across Madagascar. Each species occupies its own niche, so the butterflies are an indicator that different types of soil, moisture and other conditions support a different mix of plants and animals in each of those habitat types. It was the first time that this application of butterflies as an "indicator species" was used to design a nature preserve.
And then to find out how large an area should be protected, the scientists assessed the ranges needed by wide-foraging animals like red-ruffed lemurs and the Madagascar serpent eagle, a species feared extinct, but rediscovered by the Peregrine Fund during the surveys.
#
"Lemurs are doing well",378,0,0,0
(Nov '97)
The Duke University lemurs, released on Madagascar last month, are starting to move further afield, though they do not seem to have had any interactions with wild lemurs in the release area. After their arrival in Madagascar three weeks ago, the animals (Janus, Letitia, \JPraesepe\j, Sarph and Zuben'ubi) had spent time in an outdoor cage in the reserve, under the care of \JSan Francisco\j Zoo veterinarian Graham Crawford. During that time, all began eating fruits harvested from the forest, supplemented with commercial monkey food.
Before release on November 10, the lemurs' tail hair was trimmed to give each a distinctive pattern, so they could be better identified as they were tracked through the forest. After a brief ceremony, the doors were opened, and within five minutes, the lemurs were moving off.
Unfortunately, three animals immediately headed for the project's base camp and had to be recaptured and re-released to ensure they would stay away from the familiar comforts of "home", while another moved too close to the territory of a wild group of ruffed lemurs, a contact which the researchers wanted to discourage early in the animals' adjustment to the wild. So he, too, had to be recaptured and freed near the release site.
#
"Cassini, last glimpse for now",379,0,0,0
(Nov '97)
Joseph Montani and colleagues at the University of \JArizona\j's Lunar and Planetary Lab in Tucson pointed their 36-inch Spacewatch \Jtelescope\j toward Cassini, which was 1 week into its mission to Saturn, late in October, and managed to detect the craft.
Cassini was moving at about three times the speed of an asteroid in the main belt, making the study a useful test. Certainly the observers had the advantage of a detailed \Jorbit\j to search along, but it was still a victory when the CCD (charge-coupled device) on the \Jtelescope\j registered the object, then nearly 5 million kilometres from \JEarth\j. We will have one more chance to see Cassini when it flies past \JEarth\j to gain some more speed on its trip to Saturn.
#
"Neutrinos have a cycle too",380,0,0,0
(Dec '97)
Science research is largely about designing better "models", simple descriptions of how things work and behave. Advances in science come when somebody finds a flaw in an existing model, or a better and simpler model that explains all of the same things.
The \Jsun\j has a large number of cycles associated with it, like the eleven-year cycle of \Jsunspot\j activity. Any model which explains the \Jsun\j's operation has to take these cycles into account, and explain them as well.
Apart from the \Jsunspot\j cycle, there is a 157-\Jday\j periodicity that Eric Rieger of the Max Planck Institute in \JGermany\j found in the intensity of solar flares, and a 780-\Jday\j "quasi-biennial" periodicity that Kunitomo Sakurai from Kanagawa University reported finding in neutrino data from the Homestake neutrino detector in South Dakota. Now data from the same detector have yielded a new and rather more important cycle in the \Jsun\j.
A December report in the \IAstrophysical Journal\i by Peter Sturrock, Guenther Walther, and Michael Wheatland, describes how Homestake data, collected over a 24-year period, show a 28.4 \Jday\j cycle in neutrino flux. The cycle has been dissected out using a complex statistical analysis, but it has only a 3% chance of being just a random pattern.
Neutrinos are not very exciting things to the casual observer. They are small, they have no charge, no rest mass according to the "standard model", and the \Jsun\j does not seem to be producing enough of them. They are also very hard to detect, which could explain why we do not see enough of them; or maybe something happens to them somewhere between the core of the \Jsun\j, where neutrinos form, and their arrival here.
The three Stanford researchers favour a model for the \Jsun\j which involves magnetic fields that disrupt the neutrino flow, but if they are right, then our model for the neutrino is wrong. That is no great problem: the new model might help to explain the "missing mass" of the universe, as well as the shortage of solar neutrinos detected at the \Jearth\j's surface.
As we understand the \Jsun\j now, light takes as much as a hundred thousand years to ooze and bounce its way from the solar core to the surface, and then it leaps across space, travelling the last 150 million kilometres in 500 seconds.
This slow travel to the surface would smear and hide most pulsations inside the \Jsun\j, burying them under layers of random "noise". Neutrinos, however, travel straight up from the centre of the \Jsun\j, passing through ordinary matter as if it did not exist, so any internal fluctuations would be preserved in the rate of neutrino arrivals at the \Jearth\j.
This same disdain for ordinary matter is what makes it hard to detect neutrinos, which is probably just as well, seeing something like a million billion (10\U15\u) solar neutrinos pass through your body each second. Occasionally, a neutrino passing through a vat of carbon tetrachloride (tetrachloromethane to the chemists, dry cleaning fluid to the old-fashioned) will change a single \Jchlorine\j atom into an \Jargon\j atom. If you have a large enough tank (around a hundred thousand gallons, half a million litres), and if the tank is deep enough below ground, so cosmic rays are blocked off, you may just be able to detect the odd neutrino now and then.
The Homestake gold mine became home to the first such neutrino detector, a mile (1.6 km) below the ground. It only detects one "neutrino event" every two days, about a third of the predicted level, but this is in accordance with the results obtained at two other detectors, Kamiokande in \JJapan\j, and Gran Sasso in \JItaly\j.
One way of explaining the shortfall in the detection rate is to assume that the \Jsun\j is cooler than expected at its centre, but measurements of sound waves travelling inside the \Jsun\j imply a \Jtemperature\j of 15.6 million degrees \JCelsius\j, so that rules out any "cool \Jsun\j" explanation. This leads some physicists to assume that the neutrinos have a tiny but non-zero rest mass, and this is the favoured solution for the Stanford group.
Neutrinos come in three varieties, each associated with a different elementary particle (\Jelectron\j, muon and tau). According to some new theories, if neutrinos have mass, then they may cycle between the three different neutrino types (see Neutrinos--Do they have a mass?, June 1997). While the proposed neutrino mass is far too small to measure directly, if this \Jcycling\j can be demonstrated, then we would have a clear idea of why the shortfall occurs. Only one of the three types, the \Jelectron\j neutrino, is detectable, so if the neutrinos are \Jcycling\j through the three types, this would explain rather elegantly why only one third of the expected neutrinos are detected (except that there is no good reason why the three types should occur in equal amounts!).
With so few events, the Homestake data have normally been taken around four times a year, apparently making it impossible to detect a cycle as short as 28 days, but because the times are irregular, statistical analysis can be used to drag out the underlying pattern. Their first analysis, reported in 1996, claimed a 21.3-\Jday\j cycle, but when this analysis was presented to a peer-reviewed journal, one of the reviewers was unable to duplicate their results, which eventually drew their attention to a transcription error in the Homestake data.
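A standard tool for pulling a period out of irregularly sampled data of this kind is the Lomb-Scargle periodogram. The report does not say precisely which method the Stanford group used, so the sketch below is illustrative only, with synthetic data standing in for the Homestake record.
    # Hunting for a period in irregularly sampled data with the
    # Lomb-Scargle periodogram; the data here are synthetic.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 24 * 365, 300))        # 300 irregular sample days over 24 years
    y = 1.0 + 0.3 * np.sin(2 * np.pi * t / 28.4)      # a hidden 28.4-day cycle
    y += rng.normal(0, 0.2, t.size)                   # measurement noise

    periods = np.linspace(20, 40, 2000)               # candidate periods, in days
    freqs = 2 * np.pi / periods                       # lombscargle wants angular frequencies
    power = lombscargle(t, y - y.mean(), freqs)
    print(f"strongest period found: {periods[power.argmax()]:.1f} days")  # close to 28.4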
The 21.3-\Jday\j cycle disappeared, once this error was corrected, to be replaced by a 28.4-\Jday\j cycle which corresponds closely to the \Jsun\j's perceived rate of rotation, as observed from the \Jearth\j's \Jorbit\j around the \Jsun\j. As this rotation is a property of the outer "radiative zone", this suggests that whatever is causing the \Jcycling\j may be found not in the core of the \Jsun\j, but in the outer layers.
Each solution brings a problem, and perhaps a grain more truth. If the neutrinos are being affected by something in the \Jsun\j's outer layers, this must be tied up with magnetic fields in the \Jsun\j. But magnetic fields can only affect neutrinos if they have a \J\Jmagnetic moment\j\j, and standard model neutrinos do not have a \J\Jmagnetic moment\j\j. Not unless they also have mass, that is.
Neutrinos can spin in either direction, and these directions are called, by convention, left-handed and right-handed. Nuclear reactions produce left-handed neutrinos only, and only the left-handed neutrinos take part in nuclear reactions such as converting \Jchlorine\j into \Jargon\j. If different parts of the \JSun\j have different strength magnetic fields, the flux of left-handed neutrinos will vary as they travel in different directions from the \JSun\j. That would lead to a detection rate on \JEarth\j that varies with the \JSun\j's rotation period.
This one will be worth watching.
#
"Space junk a hazard",381,0,0,0
(Dec '97)
Outer space is by no means as empty as we once believed, and near-\Jearth\j space is even worse. Our \Jplanet\j has a thickening layer of drifting debris such as small metal fragments and paint chips. Taken together, these bits and pieces can almost double the accident risk faced by some shuttle crews.
Radar surveys reveal that space junk is now a serious problem, with a fifty-year collection of abandoned satellites, nuts, bolts and fragments, all whirling around the \Jearth\j. These travel at about 7.5 kilometres a second (17 thousand miles an hour) in all sorts of elliptical orbits in all sorts of directions. Yet when the \J\Jspace shuttle\j\j was designed and built, there were many fewer bits and pieces up there, so the shuttles are only lightly protected.
At that sort of speed, a piece just 2/10" (5 mm) across can vaporise metal and send hypervelocity waves through the shuttle as it punches a fist-sized hole in whatever it hits. Radar systems on the ground can track the larger pieces of debris, but 95% of what is there is too small to be radar-visible from the \Jplanet\j's surface, and that 95% adds up to literally millions of odds and ends.
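A rough, order-of-magnitude calculation shows why so small a fragment is so dangerous. The aluminium density below is an assumption, since the text gives only the size and the speed, but the answer, around five kilojoules, is the energy of a high-powered rifle bullet, delivered by something smaller than a pea.
    # Kinetic energy of a 5 mm fragment at orbital closing speed.
    import math

    radius_m = 0.0025          # 5 mm fragment
    density = 2700.0           # kg/m^3, aluminium (an assumption)
    speed = 7500.0             # m/s, from the text

    mass = density * (4 / 3) * math.pi * radius_m ** 3
    energy_j = 0.5 * mass * speed ** 2
    print(f"mass: {mass * 1000:.2f} g, kinetic energy: {energy_j / 1000:.1f} kJ")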
So now NASA is looking at ways of reducing the risk of damage from debris. During December, a report from the US National Research Council suggested that the risk of damage is as great as the risks from all other sources. Now NASA will be looking at changes in procedures, and also adding extra shielding to the shuttle's most vulnerable areas.
#
"How the planets influence us",382,0,0,0
(Dec '97)
It looks as though the planets do influence our lives after all, though not quite in the way that astrologers would have us believe, and certainly not on the time scale of a single human life. University of Toronto physicist Jerry Mitrovica and Alessandro Forte of the Institut de Physique du Globe de Paris reported in \INature\i during December on their numerical simulations which show the connection between \JEarth\j's changing shape and the gravitational effects of other bodies in the \JSolar System\j, particularly Jupiter and Saturn.
The \Jearth\j's rotational axis precesses (precession of the equinoxes), turning slowly over a 26 000 year cycle. The obliquity of the \Jecliptic\j, the tilt of the \JEarth\j's axis, varies over a period of 40 000 years. As the precession and obliquity vary, so too does the \Jearth\j's climate, mainly because the pattern of the sunshine that falls on the \JEarth\j has been altered.
The researchers' simulations show that the obliquity and precession have been affected by the gravitational attraction of Saturn and Jupiter. At some time during the last 20 million years, the \JEarth\j passed through a gravitational \Jresonance\j associated with the orbits of Jupiter and Saturn, which in turn influenced the way the \JEarth\j's axial tilt changed during the same period. This gravitational pull would have had a much greater impact on the \JEarth\j millions of years ago when the \JEarth\j was shaped differently, says Mitrovica in a press release sent out on the \JInternet\j.
"To understand climate on \JEarth\j it's clear that we need to consider the \JEarth\j as this dynamic deforming system," Mitrovica added. "But we also need to understand, more than we thought we did, the \JEarth\j's place in the \Jsolar system\j."
#
"Mars news",383,0,0,0
(Dec '97)
With more than 9500 pictures sent back by the Pathfinder mission, it took scientists into early December to complete their full analysis of the information. In the journal \IScience\i for December 5, the Mars Pathfinder imaging team reported that surface photographs provide strong geological and geochemical evidence that fluid \Jwater\j was once present on the red \Jplanet\j.
The 9669 pictures appear to confirm that a giant flood left stones, cobbles and rocks throughout \JAres\j Vallis, the Pathfinder landing site. More importantly, the researchers found evidence for a mineral known as maghemite, a highly magnetic iron oxide. This forms in \Jwater\j-rich environments on \JEarth\j and could well have formed the same way on Mars.
So where did the \Jwater\j go? Did it evaporate into space, disappear into sub-surface aquifers as a liquid or freeze into ice below the surface or at the Martian \Jpoles\j? For now, nobody knows, but answering this question will be a major challenge for the next mission in 2001.
The images also reveal a dustier and more active \Jatmosphere\j than anybody had expected. There were even wispy, blue clouds, possibly made of carbon dioxide crystals (dry ice), travelling through Mars' salmon-coloured sky. White cirrus-like clouds, made of \Jwater\j ice crystals, also circulate throughout the thin Martian \Jatmosphere\j.
While the \Jweather\j is more active than expected, it is still puny by \Jearth\j terms. Some of the named Martian rocks like Yogi, Barnacle Bill and Scooby Doo must have been on the surface for millions of years to be carved and sandblasted by the weak winds of Mars.
Other reports in the same issue of \IScience\i identified a number of major finds. The better estimates of Mars' rotation rate and the wobble of the polar axis point to the \Jplanet\j having a dense, iron-rich core. The daytime \Jtemperature\j proved to be slightly warmer than that recorded by Viking I, and there was evidence for "dust devils" occurring. The Alpha Proton X-ray Spectrometer (APXS) on Pathfinder completed about a dozen chemical analyses and determined that the sampled rocks are high in \Jsilica\j, unlike the Martian meteorites, and probably represent a differentiated crust.
The Viking soil analyses and information from the twelve meteorites assumed to be of Martian origin pointed to a rather primitive surface composition, but this new evidence suggests that the Martian crust is probably as highly differentiated as the \JEarth\j's crust.
The sampled soils have a different composition from the rocks, and these soils may have formed by the addition of \Jmagnesium\j and iron from mafic rocks, like the Martian meteorites, to the locally eroded rocks during weathering. All of the observations and analyses suggest that Mars was indeed a warmer and wetter place a long time ago. Each of the five soil analyses gave similar results, and these were consistent with results obtained by Viking I.
Taken together, the evidence from the soils and from rocks such as Barnacle Bill (a felsic rock, low in \Jmagnesium\j, high in silicon and \Jaluminium\j) tells us that Mars must have a wide variety of geological types, just as our own \Jplanet\j does. This means that future expeditions will need to be highly mobile, in order to assess as many areas as possible.
#
"Jupiter news",384,0,0,0
(Dec '97)
Meanwhile, in the Jupiter system, there is oxygen at Callisto's surface, and there are sulphur dioxide sources on Io. In either case, you would be well-advised to hold your breath. The ultraviolet spectrometer on \JGalileo\j has detected \Jhydrogen\j atoms escaping from Callisto, which implies that the Mercury-sized moon has oxygen locked up in its ice and rocks. In 1996 \JGalileo\j detected evidence of oxygen on the surface of Callisto's neighbouring moon, Ganymede.
Ganymede's oxygen probably comes from collisions between charged particles from Jupiter's plasma torus and the icy surface of the moon. On Callisto, the main cause appears to be ultraviolet sunlight striking the icy surface, but the levels of oxygen present are going to be extremely low. So hold your breath, if you ever get there.
While Io's volcanic activity appears to be rather variable, the sulphur dioxide levels are maintained when SO\D2\d ice sublimes on the moon's surface.
#
"Gravitational waves may be detectable",385,0,0,0
(Dec '97)
Dutch physicists, working in what is usually called the Grail Project, have concluded that a resonant antenna to detect gravitational waves is feasible. Gravitational waves have never yet been detected, but the workers believe a solid sphere of a copper-\Jaluminium\j alloy with a diameter of more than three metres, weighing more than a hundred tonnes, and held almost at \J\Jabsolute zero\j\j, may just be what they need to succeed. They hope to have their first successful results in 2002.
Gravitational waves should occur whenever masses are accelerated with respect to one another. If you jump from a ladder, you are accelerated with respect to the \Jearth\j, and should create a gravitational wave, which will then spread out into space. But just as a small pebble only makes a small splash, serious gravitational waves will only be found when a massive star collapses to form a neutron star, or when material is pulled into a black hole.
The massive copper sphere will vibrate when the waves from such an event reach it. The spherical shape of the antenna means that each direction will be equally favoured, a vibration-free suspension will protect the antenna from other rumblings, and sensitive microphones will detect any signal and magnify it.
The instrument will be so sensitive that cosmic particles might set it off, so it will probably need to be located deep underground, and its extreme cold, between 0.01 and 0.02 kelvin, should ensure that thermal interference is ruled out. The instrument will measure a band of frequencies 100 to 150 hertz wide, centred on 700 Hz.
The alloy was chosen to have no residual magnetism, and a thermal \Jconductivity\j which will allow rapid cooling of the entire sphere. The anticipated degree of movement will be of the order of 10\U-21\u metre, and detecting it will require five superconducting quantum interference devices (SQUIDs). For more information, look to www.nikhef.nl/pub/projects/grail/grail.html
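To get a feel for that sensitivity, it helps to convert the displacement into the dimensionless strain that gravitational-wave physicists usually quote (a back-of-envelope figure of ours, not one from the Grail team):
\[
h \;\approx\; \frac{\Delta L}{L} \;=\; \frac{10^{-21}\,\mathrm{m}}{3\,\mathrm{m}} \;\approx\; 3\times10^{-22},
\]
which is roughly equivalent to measuring the distance from the \JEarth\j to the \JSun\j to within a fraction of the diameter of a single atom.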
#
"Teleportation a tiny step closer",386,0,0,0
(Dec '97)
That long-standing dream of Star Trek fans, particularly the ones who have to commute, the teleportation device, came a tiny step closer during December. But so far, it only works on single photons, so there is a way to go before this is a successful commuting method.
Anton Zeilinger and his team at the University of Innsbruck in \JAustria\j have shown that part of the spin orientation of a \Jphoton\j of light can be transferred instantaneously to another \Jphoton\j, no matter how far away it is. This appears to be the first experimental demonstration of quantum teleportation, the transfer of a quantum state from one particle to another, which was first proposed as a thought experiment by \JIBM\j's Charles Bennett and his collaborators in 1993.
Before that, Albert Einstein knew of the basic concept, but rejected it, calling this notion of action at a distance "spooky". All the same, the idea was clearly ripe: a second research group in \JRome\j has achieved similar results, which will be published soon in another journal.
In the future, this process might help physicists build superfast quantum computers, by providing a safe way to communicate delicate quantum information. The trick relies on creating a pair of photons that are intimately related. When \Jlaser\j light is fired into certain crystals, individual photons can split into two identical twins with a special property: they are "entangled", in the language of quantum physicists. In plain terms, the sum of the two offspring photons has to equal the original quantum state.
This means that when you measure, say, the spin of one \Jphoton\j, and find that its spin is up, the entangled twin is instantly forced into the opposite state (spin down), no matter how far it has travelled. As Bennett describes it, suppose a sender called Alice makes a combined measurement of the "message \Jphoton\j" and one member of the entangled pair. This forces the two photons measured by Alice to have opposite states. At the same time, the second member of the entangled pair is forced into a specific state.
Now the recipient, Bob, makes a measurement specified in advance by Alice, and he finds that the entangled \Jphoton\j has a state which is identical to that of the original message \Jphoton\j. One of the limitations of the method is that only information flows from Alice to Bob, being transferred from one \Jphoton\j to another. "It's more like faxing than teleportation", said one physicist.
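The algebra behind the trick can be written compactly. In standard textbook notation (ours, not the Innsbruck paper's), the message photon carries an unknown state, and a Bell-basis measurement on photons 1 and 2 leaves Bob's photon 3 in that state, up to one of four simple corrections:
\[
|\psi\rangle_1 = \alpha|0\rangle + \beta|1\rangle, \qquad
|\psi\rangle_1 \otimes |\Phi^+\rangle_{23} \;=\; \frac{1}{2}\sum_{i=1}^{4} |B_i\rangle_{12} \otimes \bigl(U_i|\psi\rangle\bigr)_3,
\]
where the \(|B_i\rangle\) are the four Bell states and each \(U_i\) is the identity or a simple spin flip. Alice's measurement result tells Bob, over an ordinary slower-than-light channel, which \(U_i\) to undo, which is why the scheme cannot carry usable information faster than light.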
Charles Bennett commented that we were so far away from teleporting even a bacterium that it was not worth thinking about. Zeilinger believes we might be able to teleport atoms within a few years, and molecules within a decade or so. But if the whole concept seems difficult, don't worry too much. Zeilinger says he does not really understand how it works either.
#
"Finding landmines",387,0,0,0
(Dec '97)
The recent Nobel Peace Prize award to Jody Williams and ICBL (see October), coupled with the death of Diana, Princess of Wales, has made the public far more aware of land mines. Mines are the guerrilla fighter's dream: they can cost as little as a dollar, they are easy to make, they are often lethal and they cannot easily be detected by current technology. But that last qualification may be about to blow away.
Research to be published in the February 1998 issue of \IPhysical Review E\i indicates that weak shock waves sent into granular beds such as soil will cause acoustic signals containing critical information to be reflected off buried objects, such as land mines. Surajit Sen, who has been working and publishing on this topic for several years, believes that his work will lead to an accurate and inexpensive detection method effective for land mines in either plastic or metal casings.
It was only after his first publication that Sen watched a \Jtelevision\j news report about land mines and realised that his work might be applied in this way. The shock waves are able to detect the shape and size of the object they bounce off, and also to provide information about the density of the object they are reflecting from. Sand grains, for example, have a density of around 2700 kg/m\U3\u, while plastic is lighter, with a density of around 1100 kg/m\U3\u.
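Sen's method is not spelled out in enough detail here to reproduce it, but the basic physics that makes density readable from an echo is the acoustic impedance mismatch at a boundary. A minimal sketch, using the densities quoted above and sound speeds that are our own illustrative assumptions:

```python
def reflection_coefficient(rho1, c1, rho2, c2):
    """Normal-incidence pressure reflection coefficient between two media.

    The acoustic impedance of a medium is Z = rho * c; the strength (and
    sign) of the echo depends on the impedance contrast at the boundary.
    """
    z1, z2 = rho1 * c1, rho2 * c2
    return (z2 - z1) / (z2 + z1)

# Densities from the article; the sound speeds are assumed values.
RHO_SAND, C_SAND = 2700.0, 1700.0        # kg/m^3, m/s (assumed speed)
RHO_PLASTIC, C_PLASTIC = 1100.0, 2300.0  # kg/m^3, m/s (assumed speed)

r = reflection_coefficient(RHO_SAND, C_SAND, RHO_PLASTIC, C_PLASTIC)
print(f"sand-to-plastic echo amplitude: {r:+.2f}")
# about -0.29: a partial, phase-inverted reflection, quite different from
# the near-total reflection a metal casing would give
```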
Sen envisages a "special microelectromechanical device that would send weak acoustic shock waves deep into soil and detect the pulses that are returned after hitting an object". The bad news: the system has worked well, so far, but only in simulation. The next step will involve testing in sand and soil boxes, before going on to field trials.
#
"Free supercomputer anyone?",388,0,0,0
(Dec '97)
After the Stone Age came the Silicon Age, and after the Silicon Age, came the Age of the Stone Souper. "Stone Soup" is a \Jfable\j about a soldier who persuades villagers to help him make soup from a stone, each villager contributing something small to the pot to "make the stone soup better". In the computing world, the best-known application of the Stone Soup principle has probably been the development of Fractint, a fractal program for PCs of many flavours, although the \JInternet\j owes its existence to a similar process.
Now the term may take on an entirely new meaning as people emulate Forrest Hoffman and Bill Hargrove of Oak Ridge National Laboratory (ORNL) in \JTennessee\j. They were refused funding to build a parallel supercomputer, a machine made by wiring together a collection of individual PCs, which divides massive calculations into small chunks, farming out each one to a single PC that sends a crunched number back to a central processor to be integrated with the rest.
They managed to scavenge 48 486 PCs and strung them together on an empty floor of an ORNL computer building. Their Stone Souper, which carries out 1.5 million calculations per second, is only one-seventh as fast as a "real" parallel computer built with state-of-the-art components. All the same, it performed the cumbersome statistics that Oak Ridge environmental researchers needed to make a US map, with four soil variables plus elevation, showing which regions are best suited for growing certain plants.
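The divide-farm-gather pattern described above is the classic master-worker scheme. Here is a minimal sketch in Python (a modern stand-in; the ORNL machine would have used message-passing code across its 486s, not anything like this):

```python
from multiprocessing import Pool

def crunch(chunk):
    """Worker: reduce one chunk of a massive calculation to one number.
    A sum of squares stands in for the real environmental statistics."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 48  # one chunk per scavenged PC
    # Master: deal the data out as 48 interleaved chunks.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with Pool() as pool:                     # local processes stand in for PCs
        partials = pool.map(crunch, chunks)  # farm out, then gather
    print("integrated result:", sum(partials))
```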
And the best news? People are starting to discard Pentium computers, and just so long as people continue to discard and upgrade, there will be people there, ready and willing to use their leftovers to make a tasty pot of Stone Soup.
#
"Making the Net faster",389,0,0,0
(Dec '97)
Next to making a faster computer, there can be no greater joy than making the Net faster. On December 22, computer scientists at Washington University in St. Louis announced that they have patented two major inventions that should make \JInternet\j applications like e-mail, the World Wide Web and electronic commerce 10 times faster than they are now.
It now takes 1.2 microseconds to look up an \JInternet\j address: the new system will allow this to be done in just 100 nanoseconds, a better than tenfold increase in speed. This will increase the throughput of every packet of data passing across the \JInternet\j.
While links between computers are getting ever faster, the routers that pass messages on are not getting faster, leading to the risk that they will cause bottlenecks. The number of computers on the \JInternet\j is tripling every two years, and more complex messages and files are sent as people add multimedia, audio and video to their Web pages and transmissions.
When a router receives a message you have sent, it reads the \JInternet\j Protocol (IP) address, and determines which of many links it will pass the message along to. If it is a large message, your e-mail may be in a number of separate packets, and each needs to be examined separately.
In simple terms, it would help if the router had a lookup table that gave the correct links for every \JInternet\j address in the world, but this would make for a huge \Jdatabase\j, so the trick is to take the IP address in chunks. This keeps the \Jdatabase\j size down, but makes the lookup task more complex.
Real \JInternet\j addresses are not those odd little "me@myISP.com" addresses we all know and use, but sets of numbers, read in binary form by computers. We each have one of the approximately four billion IP addresses ranging from 0.0.0.0 to 255.255.255.255: these are the so-called 32-bit addresses, to be replaced at some point in the future by 128-bit addresses. While there are enough addresses available for all the world's literate people, if you want your toaster and washing machine to be controlled over the \JInternet\j next century, they will need their own IP address as well!
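The arithmetic is straightforward: four bytes of address give 2\U32\u, about 4.3 billion, distinct values. A quick sketch of the packing a router works with (the address used is just an example):

```python
def ip_to_int(addr: str) -> int:
    """Pack a dotted-quad IP address into the 32-bit integer a router sees."""
    a, b, c, d = (int(part) for part in addr.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

assert ip_to_int("255.255.255.255") == 2**32 - 1   # ~4.3 billion addresses
print(ip_to_int("128.252.1.1"))                    # prints 2163998977
```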
Routers today carry a \Jdatabase\j of only about 40,000 prefix entries, and use this to pass data packets along, but because there are 32 possible lengths for prefixes, processing is still too slow. Eight companies have now signed non-disclosure agreements, and are looking more closely at the methods developed by George Varghese, Venkatachary Srinivasan, Jonathan S. Turner and Marcel Waldvogel, and their university hopes soon to enter into licensing agreements.
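The patented methods themselves are not described in the announcement, but the problem they attack, finding the longest stored prefix that matches an address, looks like this in naive form (a sketch with hypothetical routing entries, not the Washington University algorithm):

```python
def ip_to_int(addr: str) -> int:
    a, b, c, d = (int(part) for part in addr.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

# One hash table per prefix length, keyed by the masked address bits.
TABLES = {
    24: {ip_to_int("128.252.1.0"): "link-A"},   # hypothetical entries
    16: {ip_to_int("128.252.0.0"): "link-B"},
    0:  {0: "default-link"},
}

def next_link(addr: str) -> str:
    """Naive longest-prefix match: try lengths from most to least specific.
    With up to 32 lengths to probe per packet, this loop is exactly what
    faster schemes try to avoid."""
    value = ip_to_int(addr)
    for length in sorted(TABLES, reverse=True):
        shift = 32 - length
        link = TABLES[length].get((value >> shift) << shift)
        if link is not None:
            return link
    raise KeyError(addr)

print(next_link("128.252.1.7"))  # -> link-A (the /24 wins)
print(next_link("128.252.9.9"))  # -> link-B (falls back to the /16)
```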
#
"Redating the first Australians and Americans",390,0,0,0
(Dec '97)
December saw a redating of the northern Australian sediments at a site where humans may have been present more than 100 000 years ago. The previous dates, obtained by thermoluminescence, had been recognised as having a possible flaw: a few old grains would raise the average age of the deposit considerably. Now individual grains have been dated, and it appears that the site is only some 40 000 years old, with contaminating older grains in the deposit.
In the same month, the date for the arrival of the first "native Americans" (who were then Asian immigrants) was pushed back to about the same figure of 40 000 years ago. This was reported in December's \IAmerican Journal of Human \JGenetics\j\i. The finding may help to reconcile the disagreement between geneticists, who believe that Native Americans descended from a single wave of immigrants, and archaeologists, who believe that the great cultural diversity of the new arrivals' descendants points squarely to a settlement that came as multiple waves.
Mitochondrial DNA is always passed from mother to child, and never from the father. Some parts of the mitochondrial DNA have no function, and so are free to mutate. In a large population, this happens almost as regularly as clockwork, explaining why mtDNA is sometimes referred to as a biological clock.
Over time, groups which began with the same ancestry will diverge. From the number of mutations separating two groups, we can estimate how far back they split from each other. All Native American groups share four typical \Jmutation\j patterns, resembling those seen in some modern Asians, suggesting that a single group of Asians gave rise to all Native Americans. Analyses using the characteristic patterns found in short sections of mtDNA suggested an arrival date of around twenty to thirty thousand years ago.
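The arithmetic behind such dates is the molecular clock: if mutations accumulate at a steady rate, the divergence time follows directly from the observed differences. With illustrative numbers (ours, not Bonatto's actual figures):
\[
t \;=\; \frac{k}{2r}, \qquad k = 0.024 \text{ differences per site},\; r = 3\times10^{-7} \text{ per site per year} \;\Rightarrow\; t = \frac{0.024}{6\times10^{-7}} = 40\,000 \text{ years},
\]
where the factor of two appears because mutations accumulate independently along both diverging lineages.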
Now Sandro Bonatto of the Universidade Federal do Rio Grande do Sul in \JBrazil\j, working with Francisco Salzano, has looked at the DNA sequences of more than 700 individuals from 20 Native American groups. The results point to modern Native Americans sharing a single group of ancestors who lived in \JNorth America\j at least 25 000 years ago, and more probably between thirty and forty thousand years ago.
Of course, this difference could have arisen before the various groups crossed the Bering land bridge which joined \JAlaska\j to \JRussia\j, but it is also possible that the first immigrants arrived in \JNorth America\j forty thousand years ago, split up, and began to develop the different cultures which seem to have been in place by ten thousand years ago.
#
"Artificial life by way of symbiosis",391,0,0,0
(Dec '97)
On December 14, 1967, biochemists Arthur Kornberg and Mehran Goulian announced the creation of an artificial copy of DNA that was biologically active and could infect cells. On December 11, 1997, D. H. Lee, K. Severin, Y. Yokobayashi and M. R. Ghadiri reported in \INature\i on a set of two self-replicating molecules with a symbiotic relationship. The two \Jpeptide\j chains, or small proteins, might be seen as competitors for resources, but each catalyses the formation of the other.
It began last year when chemist Reza Ghadiri of The Scripps Research Institute in La Jolla, \JCalifornia\j, announced the discovery of the first protein that could reproduce itself. When this was present in a system, it made the assembly of other fragments happen faster, but that did not make the molecule "alive". Ghadiri wanted to see a "molecular \Jecosystem\j," in which several molecules interact to promote each other's survival.
So far, Ghadiri's laboratory has come up with about eight replicating protein fragments. Two of these, called R-one and R-two, have a common segment, called \Jpeptide\j E, and slightly different versions of a second piece. Because each replicator needed the same resource, \Jpeptide\j E, they might be regarded as competitors; yet when they were together, the two types grew five times as fast.
This form of reproduction, termed a symbiotic hypercycle, was proposed by Nobel laureate Manfred Eigen in 1971, but now the theory has turned into reality. The big surprise for chemists, though, is that it turned up among peptides, rather than among nucleic acids.
#
"South American fossils in Madagascar and India",392,0,0,0
(Dec '97)
A group of mammals, the Gondwanatheria, previously known only from fossils found in Argentina, has now been located in both Madagascar and India. The 65-70 million year old mammals, dating from the Late Cretaceous period, are not related to any groups living today and are known commonly as gondwanatheres.
The fossils are teeth, but highly distinctive ones. Finding them so widely dispersed is going to require some rethinking of the main ideas of \J\Jplate tectonics\j\j. Their name, which literally means "Gondwana mammals", now looks a little suspect, though the find is more likely to represent independent support for a recent revision of plate tectonic theory, which has India and Madagascar attached to eastern \JAntarctica\j in the late Cretaceous.
No gondwanathere remains have been found in \JAfrica\j so far, supporting the \Jfossil\j evidence from dinosaurs, which implies that \JAfrica\j was isolated at that time.
#
"Water trapped in earth as crystals",393,0,0,0
(Dec '97)
While some people think the \Jearth\j's surface \Jwater\j comes from small comets in space, Professor Joseph Smyth told an American Geophysical Union meeting in \JSan Francisco\j during December that the \Jearth\j's interior may contain three to five times as much \Jwater\j as the \Jearth\j's surface, locked within billions of crystals. He suspects that this trapped \Jwater\j could help regulate the level of \Jwater\j on the surface of the \Jplanet\j.
Ten years ago, Smyth discovered that a mineral called wadsleyite, located 250 miles to 350 miles below the \Jearth\j's surface, could contain \Jwater\j, not as a liquid, but as the elements needed to make \Jwater\j, bound into the solid crystals, giving the crystals a 3.3% \Jwater\j content.
The wadsleyite is found in the \Jearth\j's mantle, and when convection brings some of the mineral to the surface at the volcanic vents of mid-ocean ridges, the \Jwater\j might be released, says Smyth. He is currently trying to make wadsleyite in the laboratory to study it: in its own environment, wadsleyite is stable at a pressure of about 3 million pounds per square inch (20 million kPa) and a \Jtemperature\j of about 3000°F (1650°C).
#
"Big quakes may be gentler",394,0,0,0
(Dec '97)
The \JMexico City\j \Jearthquake\j of 1985, the Newcastle \JAustralia\j \Jearthquake\j of 1989, and the Kobe \Jearthquake\j of 1995 all had one thing in common. Buildings on unconsolidated sediment were badly shaken by an effect which causes structures built on soil to shake harder than those perched on bedrock. Under some conditions, the shaking can be three times as great on loose sediments such as sand, landfill, and loose soil.
A report in \INature\i this month indicates that for larger earthquakes, the \Jmagnification\j may be less than for small earthquakes. This had been predicted from laboratory studies, and even incorporated into building codes, but many seismologists expressed concern about extending lab simulations to the real world.
Now they need worry no more. A careful analysis of the records of the 1994 Northridge \Jearthquake\j provided Edward Field of the University of Southern \JCalifornia\j in Los Angeles and his colleagues with the data they needed to resolve the question. Data from 21 seismic stations for the 6.7-\Jmagnitude\j main shock of the Northridge quake and 184 aftershocks provided the answer: while aftershock \Jmagnification\j ranged from a factor of 1.4 to 3.1, during the main event ground shaking in sediment was never more than 1.9 times greater than on rock.
#
"Genome of tuberculosis",395,0,0,0
(Dec '97)
\IMycobacterium tuberculosis\i has 4.41 million base pairs, and now we know them all. The chemical composition of the \Jgenome\j made it one of the most challenging yet to sequence. This is because the DNA is packed with stretches rich in two bases, \Jcytosine\j and \Jguanine\j, which tend to stick together, turning a DNA strand into a nasty knot.
Tuberculosis kills some 3 million people in the world each year, and is becoming increasingly dangerous (See Antibiotic-Resistant Bug Found, April, and Tuberculosis in the News, November), so this information will be an important addition to our medical armoury. Apart from anything else, this knowledge should allow researchers to develop tests which will quickly distinguish lethal TB strains from innocuous "cousins".
#
"Genome of a spirochaete",396,0,0,0
(Dec '97)
The \Jgenome\j of the bacterium \IBorrelia burgdorferi\i B31, the cause of Lyme disease, has been reported in \INature\i during December. The \Jgenome\j contains a linear \Jchromosome\j of 910,725 base pairs and at least 17 linear and circular plasmids with a combined size of more than 533 000 base pairs. The \Jchromosome\j contains 853 genes encoding a basic set of proteins for DNA replication, transcription, translation, solute transport and \Jenergy\j \Jmetabolism\j.
This is the first spirochaete \Jgenome\j to be sequenced, and the first procaryotic \Jgenome\j found to be divided among several separate genetic elements. Lyme disease is the most common vector-borne disease in Europe, the United States and parts of Asia, and it is also becoming more common in \JAustralia\j.
#
"Genome of an archaebacterium",397,0,0,0
(Dec '97)
The Archaebacteria are a group of \Jbacteria\j regarded as ancient when compared with other bacterial kingdoms. They usually exist in extreme environments, and include not only the methanogens, but also the "salt-loving" or halophilic \Jbacteria\j, and the sulphur-acid tolerant thermoacidophilic \Jbacteria\j.
\IArchaeoglobus fulgidus\i is a sulphur-metabolising archaebacterium. A group of 51 authors at three USA institutions reported the determination of the complete \Jgenome\j sequence of \IA. fulgidus\i in \INature\i in late November. The bacterial \Jgenome\j has 2,178,400 base pairs.
#
"Dog genome gets closer",398,0,0,0
(Dec '97)
At the same time, scientists have produced a rough map of the genetic blueprint of dogs, which has just been published in the journal \IGenomics\i. A map like this, while less than a complete base-pair \Jgenome\j, identifies a set of molecular signposts along the \Jgenome\j. So researchers, rather than searching for a needle in a haystack, can now search for the same needle in a cup of hay.
There are more than 300 distinct breeds of dogs with a range of genetically defined shapes, sizes, and temperaments. Many breeds carry a predisposition to certain diseases, and many of those diseases also occur in people. By comparing the genes found in different breeds which do or don't get a particular disease, the "needle", a gene causing the problem, can then be spotted. Problems ranging from hip problems to \Jepilepsy\j to \Jblindness\j may be open to attack in this way.
The map for dogs has a marker every 14 million bases; for comparison, the human map now has markers every one million base pairs. This is like having a map with states and cities on it, but no detail of the streets in the cities just yet.
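The marker counts those spacings imply are easy to work out (the genome sizes below are round figures we are assuming, not values from the \IGenomics\i paper):
\[
N \approx \frac{G}{s}: \qquad N_{\mathrm{dog}} \approx \frac{2.4\times10^{9}}{1.4\times10^{7}} \approx 170 \text{ markers}, \qquad N_{\mathrm{human}} \approx \frac{3\times10^{9}}{10^{6}} \approx 3000 \text{ markers}.
\]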
#
"Tubulin unveiled",399,0,0,0
(Dec '97)
The protein that makes up the cell's internal rail system, transporting everything from proteins to DNA, has now had its structure revealed. Announced just before Christmas, with publication due in \INature\i on January 8, the new knowledge may help researchers design such things as better anti-cancer drugs and fungicides.
The structure reveals how the two parts of the molecule interlock. It also shows the binding site of taxol, an important anti-cancer drug that works by setting up "roadblocks" on the microtubule highway. That information might allow researchers to design a family of microtubule disrupters. (An unrelated report, a few days earlier, indicates that taxol may also be useful in combatting Alzheimer's disease.)
#
"Chicken flu virus scare",400,0,0,0
(Dec '97)
"A brave family ate a large chicken as part of traditional Chinese celebrations of the winter \Jsolstice\j yesterday with more than 10 friends and relatives", reported the \ISouth China Morning Post\i. A cluster of illnesses due to infection with avian influenza A (H5N1) virus seized the imagination of the world's media as Christmas approached. With no evidence that there is any risk of a major outbreak, each death from the "new virus" was reported breathlessly, and Hong Kong's "school zoos" and pet corners faced a loss of all the children's feathered friends.
The "brave family" was not really taking any risk, not if the chicken was properly cooked. The new virus was easily identified from a number of tests in Rotterdam, \JAtlanta\j, and London. It is a known strain which causes influenza in birds, which now seems to have acquired the ability to transfer from birds into humans, but until it acquires the ability to transfer from human to human, it represents no great threat to the world.
Although evidence of person to person transmission has not been found so far, scientists are researching possible vaccines against this virus, "just in case".
The first case surfaced as far back as May, 1997, when a three-year-old boy died of Reye's Syndrome (a rare disease involving the liver and central \J\Jnervous system\j\j, which can be associated with influenza B, influenza A or \Jchickenpox\j). The boy was later found to be carrying the H5N1 virus. The second case did not arise until November. By the end of the year, four people had died, and sixteen cases were known, but in the absence of any evidence of human to human transmission, no quarantine or travel restriction recommendations appeared necessary, according to the World Health Organisation.
Before the May case, the disease was known to occur in chickens and ducks, although it was first isolated in terns in South \JAfrica\j as far back as 1961. In the northern spring of 1997, thousands of chickens died of the disease in Hong Kong. In 1983, H5 influenza outbreaks in poultry farms in the USA killed many birds, and cost $61 million to bring under control, but there were no cases of human infection at that time.
More cases were anticipated for January, if only because medical authorities are now on the lookout for the disease, and will be testing 'flu victims more carefully. In fact, it is likely that the new cases reported in December were more a result of better surveillance of Hong Kong's 6.5 million people, rather than a sign of a "flare-up".
There was one reported "family cluster", and work was still going on to identify what appeared to be a common cause as this report was being prepared. While live chickens are the prime suspects, rats, mice, dogs and cats, as well as other domestic and wild birds in Hong Kong and the vicinity, are also under investigation.
Usually, bird influenza transfers into humans only after it has become established in another \Jmammal\j. And because the affected humans have never encountered the new influenza variety before, they have no resistance to it. Flu viruses commonly change their surface proteins, the parts that our immune systems recognise, through a process called "drift". The real worry is whether we are going to see a "shift", when major changes happen in the proteins. A shift is believed to occur when two different strains of virus come together in the same host.
The WHO Collaborating Centre at CDC (Centers for Disease Control, \JAtlanta\j) has also prepared a kit of reagents which will be despatched shortly to 110 National Influenza Centres in 82 countries for diagnosis of H5N1, and at the end of December, more than a million chickens were condemned to death as the first reports of possible human to human transmission were heard.
Meanwhile, during 1997, Djibouti had 41 deaths from \Jcholera\j, \JKenya\j 55, \JSomalia\j 248, \JTanzania\j 1720 and Zanzibar 122, with another 70 deaths in \JKampala\j. In Sierra Leone, a respiratory illness similar to influenza affected 2000-3000 people and killed 36 between September and December, and some 2.5 million people died of AIDS in 1997.
These outbreaks, however, did not seem to rate the same media prominence. It would be unkind to suggest that the difference might arise from Hong Kong being a major trading point and well-connected to the western world which owns and consumes those media.
No doubt the media were remembering the 20 million killed by the 1918 flu \Jepidemic\j, caused by a virus derived from swine (pig) flu, or the "Asian flu" of 1957-58 and the "Hong Kong flu" of 1968-69, both derived from avian (bird) flu, which killed thousands of westerners. The symptoms of influenza were first described by \JHippocrates\j in 412 BC. The first well-described pandemic of influenza-like disease occurred in 1580. Since then, 31 possible influenza pandemics have been documented.
#
"Why red wine is good for the heart",401,0,0,0
(Dec '97)
Red wine is good for the heart (Wine Good For The Heart?, November), but now we know why. A December report reveals that resveratrol, described as a form of oestrogen, is highly concentrated in the skin of grapes and is abundant in red wine.
Resveratrol protects grapes and some other plants against fungal infections. It has been shown previously to have a number of potentially beneficial properties, including antioxidant, anticoagulant, anti-inflammatory and anti-cancer effects. It has a molecular structure similar to that of diethylstilbestrol, a synthetic oestrogen.
A study reported in the December 9 issue of the \IProceedings of the National Academy of Sciences\i confirms that the substance has oestrogen-like properties, and interestingly, the authors indicate that resveratrol could replace oestradiol in supporting the proliferation of certain breast cancer cells that require oestrogen for growth.
Strictly speaking, there is no compound called "oestrogen": it is a category of substances defined by their biological effect. Originally named for their ability to induce oestrus ("going into heat") in animals, oestrogens act on cells by binding to a protein called "oestrogen receptor", which then causes certain genes to be expressed, or "turned on."
#
"Immunoglobulin E a killer?",402,0,0,0
(Dec '97)
Immunoglobulin E, or IgE, the antibody responsible for hayfever, appears also to be involved in \Jmalaria\j. At an \Jimmunology\j conference in Britain during December, Dr Marita Troye-Blomberg of Stockholm University, Sweden, explained why she thinks that IgE could be a killer.
\JMalaria\j kills between 1.5 and 3 million people a year, of the estimated 300-500 million people who are infected with \Jmalaria\j. The most common deadly form is cerebral \Jmalaria\j, in which blood clots form in the brain, and cerebral \Jmalaria\j patients have higher levels of IgE type \Jantibodies\j in their blood than people with milder forms of the disease.
IgE is produced in response to the mouse parasite \IPlasmodium chabaudi\i, which is related to the malarial parasite, \IPlasmodium falciparum\i. The IgE \Jantibodies\j stimulate the immune system to produce a messenger molecule called \Jtumour\j necrosis factor, or TNF. While TNF helps the body fight \Jmalaria\j, people who produce excessive amounts of TNF have an increased risk of dying from cerebral \Jmalaria\j.
Studies of twins reveal that identical twins produce very similar amounts of IgE whereas non-identical twins do not, suggesting that the amount of IgE produced by an individual is genetically controlled, and there seems also to be a genetic link to the amount of TNF an individual produces in response to IgE. The implication: if researchers try to make immunoregulatory drugs against \Jmalaria\j, they will need to tread carefully, to make sure they do not do more harm than good.
#
"Genetic mutation responsible for allergies?",403,0,0,0
(Dec '97)
A December report in the \INew England Journal of Medicine\i says that some \Jallergy\j-prone people seem to have a genetic flaw that makes them more susceptible. A group of researchers has identified a \Jmutation\j in a gene which appears in patients with either severe skin allergies or hyper-IgE syndrome, a rare condition in which the body produces too much IgE. They found an identical genetic \Jmutation\j in seven of their 10 patients.
When they tested fifty healthy adults for both the \Jmutation\j and for above-average IgE levels, they found the \Jmutation\j in 13 of 20 people with elevated IgE levels, but only in five of 30 people with normal IgE levels.
#
"Asbestos transformed",404,0,0,0
(Dec '97)
According to scientific \Jfolklore\j, one insurance company, as early as 1917, refused to insure the lives of people working in the \Jasbestos\j industry. In 1971, your reporter was called a dangerous troublemaker for barring \Jasbestos\j mats from a school laboratory: just a few years later, the mats were entirely banned, and schools were being closed when \Jasbestos\j was found in them. Today, we all know that \Jasbestos\j is a very dangerous substance, but what do you do with it?
Removing \Jasbestos\j throws small fibres into the air, fibres which can be breathed into people's lungs to cause later damage. So in some cases, the safest thing is to immobilise the \Jasbestos\j that was once used to fireproof homes, schools, and offices, by painting it or covering it in cement, but this reduces the fireproofing properties. Now a new \Jfoam\j has been announced which also breaks down the \Jasbestos\j fibres. This \Jfoam\j transforms \Jasbestos\j to a harmless silicate compound while leaving the fireproofing intact.
The \Jasbestos\j eater, announced by the chemical company W. R. Grace and the Brookhaven National Laboratory in December, contains acids and \Jfluoride\j ions that convert the cancer-causing fibres of \Jasbestos\j into an amorphous form which seems to be just as useful as the original fibres, while losing the carcinogenic properties.
#
"Kyoto Environmental Conference report",405,0,0,0
(Dec '97)
The \JKyoto\j conference on greenhouse emissions has finally taken place, and cuts have been agreed for carbon dioxide emissions: the European Union is to reduce its emissions to 8% below 1990 levels by 2008-2012, the USA to 7% below and \JJapan\j to 6% below, \JRussia\j is to stabilise at 1990 levels, while \JAustralia\j and \JNorway\j are allowed increases of 8% and 1% respectively, with other countries falling somewhere between, according to the proposals of Raul Estrada-Oyuela of Argentina, who chaired the meeting.
The proposals cover only CO\D2\d, \Jmethane\j and nitrous oxide: HFCs, PFCs and SF\D6\d will need to be argued over at the next meeting, at Buenos Aires, in 1998. The question of countries such as the USA, Canada and \JRussia\j forming a carbon "club" to trade emissions will also need to be discussed at that meeting. In summary, the conference achieved, at best, a modest gain for the world.
#
"Global warming confirmed",406,0,0,0
(Dec '97)
A new 300-site survey of borehole temperatures spanning four continents and five centuries has confirmed what most scientists already believe: the \JEarth\j is getting warmer, and the rate of warming has been accelerating rapidly since 1900. Subsurface rock temperatures confirm that the average global surface \Jtemperature\j has increased about 1°C (1.8°F) over the last five centuries, with half of that warming taking place in the last 100 years. And 80% of the rise occurred after 1750, when people began making serious use of \Jcoal\j as a fuel.
The boreholes were in Europe, \JNorth America\j, \JAustralia\j and South \JAfrica\j, and the data were presented to the American Geophysical Union at a \JSan Francisco\j meeting during December. Sensitive thermometers were lowered into boreholes drilled from the surface to obtain the data. Because subsurface rocks preserve a record of actual surface \Jtemperature\j changes over time, boreholes are an important data source for scientists studying global climate change. Short-term changes, such as seasonal variations, penetrate only a few metres underground. Long-term changes on scales of hundreds of years are preserved at greater depths.
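The physics that makes this archive work is heat diffusion: a surface temperature change creeps downward, reaching a depth of roughly \(L \approx \sqrt{\kappa t}\) after a time \(t\). Taking a typical rock diffusivity (our assumed figure, not one from the survey):
\[
\kappa \approx 10^{-6}\,\mathrm{m^2\,s^{-1}} \;\Rightarrow\; L(1\ \text{year}) \approx 6\,\mathrm{m}, \qquad L(500\ \text{years}) \approx 130\,\mathrm{m},
\]
which is why seasonal wiggles die out within metres, while a five-century warming trend is still legible a hundred metres or more down the hole.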
#
"African weather",407,0,0,0
(Dec '97)
Over the past 25 thousand years, \JAfrica\j's climate has varied wildly, with the continent's rainfall and average temperatures suddenly plunging or rising, dozens of times. The evidence for this claim is based on core samples taken by the Ocean Drilling Program's drill ship, the \IJOIDES Resolution\i, at 20 degrees north latitude and 18 degrees west longitude, about 10 miles (15 kilometres) off the coast of \JSenegal\j. Rapid \Jsedimentation\j means that a 2.5 cm (1 inch) layer of sediment represents about eighty years of history, and in that core, dust from desert dust storms and plankton remains, reflecting past ocean temperatures, told a clear story of rapid change.
There were literally dozens of periods when the climate shifted drastically within the space of a century. It has been normal to assume that climate shifts are slow and gradual, but future climate changes in \JAfrica\j could be just as rapid as those in the record, the researchers warn.
Interestingly, the study offers further confirmation of the 1500-year cycle in the \Jearth\j's climate, reported last month (The \JEarth\j's 1500-Year Rhythm, November). According to Peter deMenocal, a palaeoclimatologist at Columbia University's Lamont-Doherty \JEarth\j Observatory, \JAfrica\j becomes dramatically colder and wetter every 1,500 years, and stays that way for centuries. And this cycle of cold-wet, then warm-dry periods exactly matches a pattern of dramatic, abrupt changes in the North Atlantic region reported last month, which was the work of another Lamont-Doherty researcher.
#
"Cold snap 8200 y.a.?",408,0,0,0
(Dec '97)
A team of palaeoclimatologists in America told the American Geophysical Union in \JSan Francisco\j how, 8200 years ago, the world climate suddenly got colder and stayed that way for a few hundred years before temperatures returned to normal. They have dubbed this the "8k event", and they say that it was short, compared to other, more distant events, lasting only about 200 years.
In the cold snap, temperatures dropped 11°F (6°C). The event shows clearly in the \Jtemperature\j record from the \JGreenland\j ice cores, but also in ice accumulation, in indicators of forest fires, and in the amounts of \Jmethane\j found in the \Jatmosphere\j. The researchers believe that the event may be linked to the shutdown of the ocean conveyor system (see Global Warming: Could We Lose The Conveyor?, November) which drives, among other things, the Gulf Stream.
#
"New toxin to combat insect pests",409,0,0,0
(Dec '97)
\IBacillus thuringiensis\i, or Bt to its farmer (and other) friends, has been a pest-control mainstay for the past thirty years. It has had close to a monopoly of the role during that time. Now a new bacterium, \IPhotorhabdus luminescens\i, has been found to contain a toxin which has proven effective against a broad array of insect pests, from household cockroaches to the boll weevil pest of American cotton farmers.
\IPhotorhabdus\i is a widely dispersed bacterium with multiple strains, and it lives inside, and in symbiosis with, soil-dwelling \Jnematode\j roundworms. The nematodes invade the insects and release the \Jbacteria\j, which then turn the insects into a protein-rich "soup", suitable food for large numbers of nematodes.
The genes of Bt have already been transferred to a number of plants, and 1998 will see an estimated 3 million to 5 million acres of Bt transgenic corn planted in the Midwest of the USA alone. The genes responsible for the \IPhotorhabdus\i toxin have already been sequenced, and may well be seen in plants within the next three to five years.
#
"Pill to combat mosquitoes",410,0,0,0
(Dec '97)
An American researcher, Dov Borovsky, wants to put mosquitoes on a diet, to turn them into whining anorexics which starve to death. He has developed a mosquito "diet pill" which alters mosquito digestion, making it impossible for them to feed, lay eggs or survive.
There are more than 3000 species of mosquitoes. Worldwide, mosquito-borne diseases infect about 700 million people each year and kill at least 3 million. Borovsky's pill will work on all of them, and he is even willing to share his "recipe", which has a certain resemblance to lines written by W. Shakespeare when he was developing what actors call "the Scottish play".
See if you think the three witches would like this recipe: first, take a hundred thousand mosquito ovaries, dry and crush them into a powder that contains their digestive control hormone. From the nearest pool or pond, scrape off the green scum, also known as \IChlorella\i, an alga. Insert the hormone into the \IChlorella\i, make it into a pill, then place the pill into any \Jwater\j body where mosquitoes are known to breed. Then watch the larvae feast on the \IChlorella\i. \JFamine\j follows.
"Fortunately, now we can synthesise the hormone, so we don't have to use 100,000 ovaries for each batch any more," Borovsky says. The synthesised hormone is inexpensive, as is \IChlorella\i, which is found and produced worldwide. \IChlorella\i, in fact, turns out to be the perfect ride for the mosquito hormone, because it can be freeze-dried and stored for long periods and then brought back to life as the deadly diet pill.
Better still, the \IChlorella\i stops producing the hormone within three weeks, making it safe to use in the environment. This is a deliberate design feature, since mosquitoes continually exposed to the poisoned \IChlorella\i might develop resistance to it. The hormone gene sits outside \IChlorella\i's \Jgenome\j, and after the third cell division it is no longer detectable.
Of course, as Borovsky points out, if mosquitoes become resistant to their own reproductive hormone, that could have unknown adverse consequences for them as well. It seems to be a lose-lose situation for the mosquitoes.
#
"Gorilla census",411,0,0,0
(Dec '97)
A new count of mountain gorillas in \JUganda\j's Bwindi Impenetrable Forest National Park has found almost 300 of the giant apes, bringing the total to around 600 for this most endangered \Jgorilla\j sub-species. There were 292 gorillas in 28 groups, as well as seven lone silverbacks (adult males).
The researchers followed trails and counted nests. Each night, gorillas build a new nest, and researchers can tell the age of the animal that slept there from the size of the dung piles left behind, and whether it was a female from the presence of infant dung. In addition, silvery hairs found in the nest can reveal the presence of adult males. Researchers collected hairs from every nest for DNA fingerprinting, to confirm that no groups were counted twice.
#
"Japan Prize results",412,0,0,0
(Dec '97)
A Japanese physicist and two Belgian geneticists have won the 1998 \JJapan\j Prize, one of the world's richest science awards, announced on December 19. Physicist Leo Esaki will receive 50 million yen, about US$391,000, and geneticists Jozef Schell and Marc Van Montagu will each get 25 million yen, about US$196,000.
Esaki, 72, was awarded the prize for the category "Generation and Design of New Materials Creating Novel Functions." His work was on superlattice crystals, which are composed of layered thin films. These crystals exhibit a number of novel electronic properties, such as the ability to carry current at discrete voltages. The technology is at the heart of semiconductor lasers used in optical telecommunications systems, sensors in wireless communications devices, and in devices that read stored data in the coming generation of computer hard disks.
Schell, 62, of the Max Planck Institute in Cologne, \JGermany\j, and Van Montagu, 64, of the Flanders Interuniversity Institute for Biotechnology in \JGhent\j, \JBelgium\j, are to share the prize in the category "Biotechnology in Agricultural Sciences." They were honoured for developing a method of inserting foreign genes into a plant, leading to transgenic plants which can resist insects or diseases.
The prizes will be conferred in April 1998. The 1999 categories for the \JJapan\j Prize have just been announced: "Information Technologies" and "Molecular Recognition and Dynamics in Bioscience."
#
"Science's top ten of 1997",413,0,0,0
(Dec '97)
In the 19 December 1997 issue, the editors of \IScience\i offered what they saw as the year's top ten scientific breakthroughs. We thought it would be interesting to see how we fared: we scored about eight out of ten, not bad, considering we were picking the stories up as they broke, rather than having the benefit of hindsight. Here are the selections made by the editors of \IScience\i:
\B1. The \Jcloning\j of Dolly, the world's first cloned adult \Jmammal\j.\b
We brought you that story in February, with follow-ups later in the year, and another follow-up this month (see the next story).
\B2. The Mars Pathfinder mission.\b
We brought you reports on Pathfinder and its mission, every month since July, with a summary report this month.
\B3. \JSynchrotron\j light\b
While we did not cover these giant light sources during the year, we described one of the key discoveries made with \Jsynchrotron\j light, the structure of the nucleosome core particle, in September.
\B4. Clock genes\b
We left that one out. A report in the October 2, 1997 issue of \INature\i from Dr Hajime Tei of the University of \JTokyo\j, \JJapan\j and colleagues, identified the mouse and human versions of the \IDrosophila\i gene \Iperiod\i. These genes share several structural features with the fly gene, suggesting that they work in broadly similar ways. In other words, the idea that key regulatory genes are conserved across species gets further support.
\B5. Single-walled nanotubes\b
We looked at fullerenes last December and again in June, but we have not featured nanotubes in detail. Nanotubes are small tubes made of carbon atoms joined in a lattice arrangement, rather like \Jgraphite\j. The reports of new findings have been constant all year, but never quite big enough to rate a special article. There are still a lot of "maybes" around nanotubes, but we will keep you posted in 1998.
\B6. Microbial genomes\b
Over the past twelve months, we have brought you news of the identification of the genomes of a yeast, \ISaccharomyces cerevisiae\i (May), \IBacillus subtilis\i in July, \IHelicobacter pylori\i in August, and \IEscherichia coli\i in September, with no fewer than three new reports this month!
\B7. Gamma ray bursts\b
See Small Galaxy Disappears (March). Gamma-ray bursts (GRBs) drew the attention of astronomers and physicists this year, as it became clear that these immensely violent astronomical events are occurring at cosmological distances, and not, as might have been the case, in our Galaxy.
\B8. Neandertal DNA\b
See Neandertal Man Partly Cloned (July) for full details. According to the evidence of mitochondrial DNA, isolated from Neandertal remains, the Neandertal people were not "us". (Incidentally, a January report indicates that one Neandertal man, whose skeleton was found at La Ferrassie in the Dordogne in 1909, has scars on the bones which tell us that he died of lung cancer, so even the bones have some stories still to tell.)
\B9. Neurological disease developments\b
The main developments relate to Parkinson's disease and Alzheimer's disease, both of which we have covered fully in a number of articles over the year.
\B10. Europa's ocean\b
We did well there: we brought you the first news in December 1996, with a follow-up in January and a detailed account (Europa's Ocean) in April.
#
"First transgenic cloned sheep",414,0,0,0
(Dec '97)
Dolly may have been the world's first cloned sheep, but now she is followed by Polly, the world's first transgenic cloned sheep. Polly was produced from a fetal cell, which is easier to clone than an adult cell. Before the nucleus was inserted into an empty \Jovum\j, the Roslin researchers injected the human gene that controls the production of Factor IX into the fetal cell. This factor is used as a treatment for human haemophilia B. Factor IX is now produced by extracting it from human blood, or in some cases, by genetically engineered organisms.
The new sheep produces Factor IX in its milk, which means haemophiliacs can now obtain Factor IX in large amounts from a source free of the disease risks associated with human blood. Because Polly has been grown from a fetal cell, she is a clone, and because she carries a human gene, she is transgenic. (In fact, "Polly" is one of six sheep, three of which express the human gene, while the other three are misses.)
The cell used to produce Polly is now being cloned repeatedly to produce a flock of identical sheep, in the hope that when Polly is old enough, her milk will contain Factor IX in commercially and medically useful amounts. Next target for the researchers: transgenic pigs, to act as a source of organs which can be transplanted into humans.
(In early January, a physicist named Richard Seed announced his intention to go ahead on \Jcloning\j human babies for childless couples. There will be more on this next month.)
#
"Prion chaperone identified",415,0,0,0
(Dec '97)
As we indicated in our report on the 1997 Nobel Prize in \JPhysiology\j or Medicine (October), Stanley Prusiner was of the opinion that a "chaperone" molecule might be needed to help a prion convert proteins from one shape to another, causing prion disease.
Now, even before Prusiner can collect his award, a chaperone protein has been isolated in yeast, and this protein has been shown also to affect mammalian prion proteins. These findings suggest prions are far more widespread than suspected.
Meanwhile, a December \INature\i report implicates B cells, a type of immune cell carried in the blood, in prion diseases like scrapie. According to the report, mice lacking B cells are resistant to infection with scrapie, a sheep condition similar to mad cow disease, when they are inoculated with infectious material in areas outside the brain. Defects affecting only T lymphocytes had no apparent effect, but all mutations which disrupted the differentiation and response of B lymphocytes prevented the development of clinical scrapie. The conclusion is that white blood cells may be involved in the transmission of diseases such as "mad cow disease" (BSE) and CJD.
#
"Small comets in doubt again",416,0,0,0
(Dec '97)
Louis Frank believes that the \Jearth\j is being bombarded with small watery comets (Controversy in Space, May) while others disagree vehemently (No Watery Comets After All?, July). The debate continues, with a paper in \IGeophysical Research Letters\i in which George Parks and colleagues claim to demonstrate that dark spots in instrument records of the ultraviolet glow in the \JEarth\j's upper \Jatmosphere\j can be produced as artefacts of the ultraviolet camera instrumentation. Louis Frank, on the other hand, claims that their methods are flawed, and that 20- to 40-ton cosmic snowballs, the size of houses, are still pelting the \JEarth\j at the rate of 30,000 a \Jday\j.
Parks says that at first he was "agnostic" towards Frank's data, but became suspicious when he examined it closely. It was simply unlikely, he says, that the clusters of spots on the images could have been caused by snowballs in space. Parks began an analysis of his own images taken with the Ultraviolet Imager (UVI) on the NASA Polar \Jsatellite\j. There he found the same dark spots that Frank had found on his images. The "comets", says Parks, are just random blobs of "noise", and he claims that this is borne out by a statistical analysis of the "blips".
Against this, Frank notes that the number of atmospheric holes in the images drops by about 80% when the \Jsatellite\j is farther from \JEarth\j. This is just what you would expect from real impacts, he says, adding that instrument noise would show the same pattern at low or high altitudes. We will keep you posted, but please recall what this random noise on screens is called: snow!
#
"Mathilde pictures released",417,0,0,0
(Dec '97)
Pictures of the C-class asteroid, 253 Mathilde (see Mathilde--not your average asteroid?, July) were released in \IScience\i during December.
The main surprise is in finding so many large craters packed so tightly on the relatively small surface of Mathilde. This means that large objects have been able to strike the asteroid's surface without destroying it, leading scientists to suggest that hitting the asteroid is a bit like throwing things at Styrofoam. Whatever the asteroid is made of, it does not seem to be too rigid. But why this should be is anybody's guess. Another mystery, still to be explained: Mathilde rotates once every 17.4 days. Only two other known \Jasteroids\j rotate more slowly: 288 Glauke and 1220 Crocus.
#
"Martian life gets more distant",418,0,0,0
(Dec '97)
One of the top stories of 1996 is fading in interest. Three scientists, Ralph Harvey, John Bradley, and Hap McSween, have dismissed the claim that Martian \Jmeteorite\j ALH84001 contains small fossils. They say that most of the "microfossils" are nothing more than narrow ledges of mineral protruding from the underlying rock that under certain viewing conditions can masquerade as \Jfossil\j \Jbacteria\j.
Unusually, \INature\i published not only their views in its December 4 issue, but also a rebuttal of their claims, by some of those who still believe that the \Jmeteorite\j offers evidence of Martian life. Web reference: http://www.cwru.edu/artsci/geol/ansmet/index.html
#
"Royal Greenwich Observatory closed",419,0,0,0
(Dec '97)
Britain's Particle Physics and \JAstronomy\j Research Council rejected a proposal to privatise the Royal Greenwich Observatory in early December. The business plan drawn up by the staff of the RGO appeared too risky and costly. It also threatened to turn the RGO into an unwelcome competitor for the new \JAstronomy\j Technology Centre, which PPARC is setting up at the Royal Observatory, Edinburgh (ROE).
The RGO's duties, which are now mainly in \Jtelescope\j design, will go to Edinburgh, but RGO staff, outraged at the loss of their historic institution, set about preparing a rescue package that would involve setting up a company to provide astronomical services not being transferred to Edinburgh, such as data archiving. They also planned to establish a \Jtelescope\j-building business with John Moores University in Liverpool, and to carry out some PPARC-funded astronomical research.
Now the institution which created Greenwich Mean Time, and which has been home of the Astronomer Royal since 1675 when John Flamsteed filled the post, will be no more.
#
"High x-ray bursts detected over Sweden",420,0,0,0
(Dec '97)
A 1996 \Jballoon\j flight over Sweden, designed to study the aurora borealis, the "northern lights", detected x-ray bursts with energies as high as a million \Jelectron\j volts-enough to penetrate an inch (2.5 cm) of \Jaluminium\j. These bursts are ten times more energetic than those usually recorded at 35 km, where the observations were made, and they are a complete mystery.
The x-rays probably came from particles spiralling in through the \Jearth\j's magnetic field, but their exact origin remains unexplained, especially as the \Jearth\j's magnetic field was quiet at the time. NASA has made funds available for a fortnight-long \Jballoon\j flight in June 1998 to explore the phenomenon further.
#
"End of a star",421,0,0,0
(Dec '97)
A week before Christmas, NASA released pictures of a dying star, a "planetary nebula" of the sort our \Jsun\j will form in some 5 billion years. Planetary nebulae were named in the 18th century by astronomers who thought these glowing shells looked \Jplanet\j-like through their telescopes.
This sort of nebula forms when a midsized star runs out of fuel and blasts out its outer layers, leaving a superheated white dwarf behind. The surface of the white dwarf radiates ultraviolet light, which ionises the expanding shells of gas, creating colourful "sculptures in the sky".
#
"Death of a planet--or many planets?",422,0,0,0
(Dec '97)
A new analysis of solar data has led Max-Planck scientist K. Scherer and his colleagues, H. Fichtner of the University of \JBonn\j, and John Anderson and E. Lau of JPL, to conclude that an apparent \Jplanet\j around the pulsar PSR B1257+12 may be an artefact of solar rotation. The report was published in \IScience\i.
Pioneer 10, one of four deep space probes in the heliosphere, the circumsolar region dominated by the \J\Jsolar wind\j\j plasma, recorded data while moving between 40 and 60 AU (6 to 9 billion km) from the \JSun\j. Measuring Doppler shifts in two-way radio signals to accuracies of 1 mHz (millihertz), the researchers found an \Jelectron\j density fluctuation of a particle stream (\J\Jsolar wind\j\j) from the \JSun\j, with a main period of 25.3 days. If the \Jsun\j shows this sort of variation, say the researchers, then Doppler shifts in radio signals from distant stars may be caused in the same way, rather than being generated by planets.
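To put that instrumental precision in perspective, here is a minimal sketch, in Python, of the two-way Doppler arithmetic; the S-band carrier frequency of roughly 2.29 GHz is our assumption for illustration, not a figure from the report.
# Back-of-envelope sketch: what a 1 mHz Doppler accuracy means in velocity
# terms for a two-way radio link. The ~2.29 GHz S-band carrier frequency
# is an assumption for illustration, not a figure from the report.
C = 2.998e8          # speed of light, m/s
F_CARRIER = 2.29e9   # assumed S-band carrier frequency, Hz

def doppler_velocity(delta_f_hz, two_way=True):
    """Line-of-sight velocity corresponding to a given Doppler shift."""
    factor = 2.0 if two_way else 1.0   # a two-way link doubles the shift
    return C * delta_f_hz / (factor * F_CARRIER)

# 1 mHz resolves line-of-sight motion of roughly 0.07 mm per second:
print(doppler_velocity(1e-3))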
#
"Switzerland: The Gene Protection Initiative",423,0,0,0
(Dec '97)
A referendum under this name will be put to the vote in \JSwitzerland\j during 1998. If the proposal is approved, it will result in constitutional prohibitions on gene manipulation, on the use and patenting of gene-modified animals (including those standard genetic animals, worms and flies), and on the cultivation of gene-modified plants.
While the initiative is opposed by both houses of the Swiss parliament and by scientists, its supporters are busily warning the public that genetically modified organisms cause allergies, shine \Jlaser\j light from their eyes, spit venom, read your thoughts, and are one step from super-monsters, or worse. If the initiative is passed, it will mean the end of Swiss biotechnology and molecular \Jbiology\j. Swiss scientists now find themselves challenged to explain complex science to a scientifically untrained public, and to demonstrate that they are ethically and morally responsible. This won't be easy, given that this area of study is so close to the bleeding edge of science.
#
"Green tea kills cancer",424,0,0,0
(Dec '97)
In September, we reported (A soothing cup of tea) on the cancer-fighting properties of green tea. The substance in green tea which kills cancer cells, while leaving healthy ones alive, has now been isolated. It is epigallocatechin-3-gallate, which has now been tested on cancerous human and mouse cells of the skin, \Jlymph\j system, and prostate, and on normal human skin cells. In the test tube, it led to apoptosis, or programmed cell death, in the cancer cells, but left the healthy cells unharmed.
Epigallocatechin-3-gallate is a major constituent of the polyphenols found in green tea. A typical cup of green tea contains 200 mg of the compound. Tea consumption in the world ranks second only to \Jwater\j consumption, and around 20 percent of the tea consumed is green, with the rest being black.
#
"Pine cone intelligence",425,0,0,0
(Dec '97)
The scales on pine cones open when the \Jweather\j is dry, favouring seed dispersal, and stay shut when it is damp. Strangely, the mechanism by which the female pine cone responds to changes in relative \Jhumidity\j has never been explained-until now. A December report explains that the scale contains a "bilayer system", rather like a bimetallic strip, but reacting to \Jhumidity\j, not heat.
In botanical terms, the inner surface of the ovuliferous (\Jovule\j-bearing) scale is made up of sclerenchyma fibres grouped in cable-like bundles, with microfibrils aligned along the scale, resisting any stretching. The outer surface consists of sclereids, with microfibrils wound around each cell, allowing \Jelongation\j when it is damp and so closing the scale down.
#
"Obituary for December 97",426,0,0,0
David Schramm, 52-year-old astrophysicist and the research vice president at the University of \JChicago\j, died on 19 December, after the private \Jaircraft\j he was piloting crashed outside \JDenver\j.
Schramm was a leading authority on the birth of the universe, who helped explain the process by which the three lightest elements--\Jhydrogen\j, \Jhelium\j, and \Jlithium\j--were created immediately after the \J\Jbig bang\j\j. He and his collaborators also calculated the amount of ordinary matter in the universe, which helped demonstrate that the universe is dominated by invisible "\J\Jdark matter\j\j".
#
"1998 Science in Review",427,0,0,0
\JJanuary, 1998 Science Review\j
\JFebruary, 1998 Science Review\j
\JMarch, 1998 Science Review\j
\JApril, 1998 Science Review\j
\JMay, 1998 Science Review\j
\JJune, 1998 Science Review\j
\JJuly, 1998 Science Review\j
\JAugust, 1998 Science Review\j
\JSeptember, 1998 Science Review\j
\JOctober, 1998 Science Review\j
\JNovember, 1998 Science Review\j
\JDecember, 1998 Science Review\j
#
"January, 1998 Science Review",428,0,0,0
\JStop it, or you'll go blind\j
\JSafer oil\j
\JSeeing the light (1)\j
\JSeeing the light (2)\j
\JHigh fliers or high flies?\j
\JAIDS myth laid to rest\j
\JVaccine plans\j
\JRing out the old\j
\JEating ginkgo to learn and live\j
\JGarlic good for the arteries\j
\JCloning\j
\JCellulose genes\j
\JThe fountain of youth?\j
\JThe Bruno Rossi Prize\j
\JHow a termite finds a home\j
\JThe oldest fossil ants\j
\JGobi dinosaurs died in a sand slide\j
\JFewer earthquakes kill more\j
\JNeutrinos to reveal the earth's interior?\j
\JWatching the Andes grow\j
\JIt's a wobbly old world\j
\JEl Niño a worry for astronomers as well\j
\JOfficial - 1997 the hottest year ever\j
\JHong Kong Flu\j
\JBack to the moon again\j
\JInternational Space Station news\j
\JUniverse to keep on going\j
\JJupiter's aurora\j
\JIo's glowing poles\j
\JBlack hole news\j
\JHuge comet shower, not many hurt\j
\JComet swarms less likely\j
\JHow civilizations die\j
\JDeath of a scientist\j
#
"Stop it, or you'll go blind",429,0,0,0
(Jan '98)
Tiger beetles are natural predators, eating just about anything they can catch by chasing it down. But when tiger beetles chase prey at high speeds, they lose their power of vision briefly. This discovery explains why these beetles chase their food in fits and starts.\p
If they move too quickly, their eyes do not gather enough photons to form an image of their prey, so while the beetle's eyes may still be working, there is no information being gathered. Luckily for the beetle, it can fly very fast, so after stopping to orient itself, it is still able to catch up with its prey once more.\p
So how fast is fast for a tiger beetle? Olympic superstar Michael Johnson, the world-record holder, can run 200 meters in 19.32 seconds, which is an average speed of 10.35 meters per second (23.1 mph). Yet the top speed for tiger beetles in Cole Gilbert's study at Cornell University, \ICicindela repanda\i, was just 0.5387 meters per second (1.2 mph). But while Johnson can cover 5.6 body lengths per second, a tiger beetle has a body length of only 10 millimeters, so it completes 53.87 body lengths per second. Relatively, it runs ten times faster than our best human sprinter.\p
One Australian species, \ICicindela hudsoni\i, is 20 millimeters long and can run 2.5 meters per second, or at a relative speed of 125 body lengths per second. Cole Gilbert suggests that knowledge of this biological tracking system could be important for people designing remote space vehicles such as the Mars Rover.\p
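For readers who want to check the arithmetic, here is a minimal sketch in Python; Johnson's height of about 1.85 meters is our assumption, back-calculated from the figures quoted above.\p
# A quick check of the relative-speed arithmetic in the story above.
def body_lengths_per_second(speed_m_s, body_length_m):
    return speed_m_s / body_length_m

johnson = body_lengths_per_second(200 / 19.32, 1.85)  # height of 1.85 m assumed
repanda = body_lengths_per_second(0.5387, 0.010)      # 10 mm beetle
hudsoni = body_lengths_per_second(2.5, 0.020)         # 20 mm beetle

print(f"Johnson: {johnson:.1f} lengths/s")   # ~5.6
print(f"repanda: {repanda:.1f} lengths/s")   # ~53.9
print(f"hudsoni: {hudsoni:.1f} lengths/s")   # ~125.0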
\p
#
"Safer oil",430,0,0,0
(Jan '98)
Would you like a motor oil that cuts automotive \Jpollution\j by 40%, and which can be safely disposed of? Now there is one available, created by Duane Johnson, a new and alternative crops specialist at \JColorado\j State University, from canola oil, a seed oil commonly used in cooking, especially in Asian foods.\p
This oil has about the same weight as 10W-30 oil, and is used as a \Jlubricant\j. It produces no waste, as the leftover ground seeds can be used as \Jcattle\j food. Production causes no air \Jpollution\j, accidental spills are rated as non-hazardous, and the bio-degradable oil is produced from a renewable resource.\p
Used canola oil from \Jautomobile\j engines can be recycled into greases and chain oils. These products are called "total loss lubricants" because they leave no residue or waste. But one major hurdle remains: in America, the American \JPetroleum\j Institute will not certify it, and \Jautomobile\j manufacturers require that only API-certified oil be used in their engines or manufacturer warranties are void. The oil's time is certainly coming, with patents obtained now in Europe, Canada, Mexico, \JAustralia\j, New Zealand, Argentina, and \JJapan\j.\p
\p
#
"Seeing the light (1)",431,0,0,0
(Jan '98)
Deep inside a frog's brain, there is a protein that catalyzes bio\Jchemical reaction\js in response to light, according to a report in the \IProceedings of the National Academy of Sciences\i in January. This finding may shed light on how biological clocks work in all our bodies, whether we are frog or human. The protein is called melanopsin. It first came to light during studies of melatonin, a hormone associated with human sleep cycles.\p
First, researchers found messenger RNA in the pigmented skin cells of an African \Bclawed \Jtoad\j\b, \IXenopus laevis\i. This mRNA helps make a new opsin-like protein which is yet to be isolated. Opsins are molecules which change shape in response to light and set off chains of \Jchemical reaction\js, which are eventually converted into nerve impulses. Next, the researchers found more evidence of the protein in the nonoptical cells of the \Jretina\j, in the iris, and deep in the brain.\p
This suggests a connection to circadian rhythms, an organism's brain-controlled response to cycles of light and darkness. To be certain, researchers will need to prove that the protein is light-sensitive, but this remains a molecule worthy of further notice. One curious side-issue: the protein shares only 39% of the genetic code of its closest known relative, an \Joctopus\j opsin.\p
\p
#
"Seeing the light (2)",432,0,0,0
(Jan '98)
Our human \Bcircadian rhythm\b runs on an approximately 24-hour period, and the "clock" is kept synchronized with the solar \Jday\j by daily entrainment to our natural light-dark cycle. In the past, scientists have assumed that the light-dark cycle was detected by our retinas, but now it appears that the backs of our knees are able to do the job, all by themselves, and entrain endogenous circadian rhythms.\p
Human circadian rhythms govern sleep, body \Jtemperature\j, and other regular cycles, including melatonin levels. As the seasons change, our bodies adjust their 24-hour cycles of sleep and waking to the \Jday\j lengths. The master timekeeper of this circadian clock is thought to be a bundle of nerves called the suprachiasmatic nuclei. This bundle sits just above the point where the optic nerves cross and receives impulses directly from the retinas of the eyes, so researchers have argued that the eyes help set the clock.\p
Recent research suggests that the body may have other tricks for keeping synchronized with the seasons: light-sensitive compounds carried by the blood, such as haemoglobin and the liver's bilirubin. These compounds seem to influence production of melatonin, a hormone that helps control sleep cycles as its levels rise and fall through the \Jday\j. While this is unimportant for most people, if you are jet-lagged, or if you suffer from sleep disorders, the new theory becomes very important indeed.\p
Scott Campbell and his colleagues at Cornell University Medical College wanted to see if circadian rhythms could be influenced by light that does not reach the eyes. They decided to use the backs of their subjects' knees as this part of the skin is rich in blood vessels, close to the surface of the skin.\p
The research used healthy volunteers over four nights, and on the second night of each stint, the researchers shone blue-green light (which quickly influences the sleep cycle) onto the backs of the subjects' knees for 3 hours. Body core temperatures and melatonin outputs of the test subjects-but not controls-shifted consistently in response to the light exposure, in some cases by 3 hours. So how long will it be before airlines offer seat cushions that glow in the dark for their passengers to put behind their knees?\p
\p
#
"High fliers or high flies?",433,0,0,0
(Jan '98)
"Crack" \Bcocaine\b is one of the most powerfully addictive street drugs, but it is also a drug about which we know very little. Traditional studies of the effects on rats and monkeys don't reveal much, but now two geneticists have discovered that fruit flies, \IDrosophila melanogaster\i, respond to "crack" \Jcocaine\j in much the same way as humans.\p
More importantly, because humans and fruit flies use many similar biochemical pathways, this discovery suggests that the flies may help scientists unravel the molecular basis of \Jcocaine\j addiction in people. And since so much of the fruit fly \Jgenome\j is known and understood after 80 years of genetic study, the chances of a real breakthrough look good.\p
The journal \ICurrent \JBiology\j\i certainly thinks so, giving the story its cover during January. Perhaps the editor had in mind the prospect that the discovery of biochemical pathways and \Jreceptors\j could also lay the foundation for highly specific drugs to treat \Jcocaine\j addiction.\p
The flies showed differing reactions to different levels of \Jcocaine\j, suggesting that the changes in the \IDrosophila\i brain and \Jnervous system\j in response to \Jcocaine\j are probably very similar to those that occur in the human brain. If this turns out to be the case, it could clear up many of the remaining mysteries about brain \Jreceptors\j and neurotransmitters, the parts of our \Jnervous system\j which have to be involved in any addiction.\p
So how do you get a fruit fly "hooked"? You dissolve a droplet of crack in alcohol, coat a wire filament with the solution, and then put the wire into a glass tube with the fruit flies. Then you run a current through the wire, heating it enough to produce a \Jcloud\j of smoke that is absorbed by the flies.\p
At low doses, the flies groomed themselves continuously, while higher levels made them walk backwards, sideways, and in circles. At the highest doses, the flies developed tremors, paralysis, or even died.\p
The flies seemed to become more sensitive to crack with repeated doses, an effect also seen in humans and rodents, which possibly ties in with the paranoia and psychosis seen in long-time \Jcocaine\j addicts. This sensitization is the reverse of the effect seen with other drugs, such as opiates and alcohol, for which increasingly larger doses are required to induce the same effects.\p
\p
#
"AIDS myth laid to rest",434,0,0,0
(Jan '98)
One of the nastiest rumors about AIDS/HIV can now be safely ignored. Claims that "doctors spread HIV when they injected Africans with polio vaccine made from monkeys in the 1960s" have now been shown to be completely false.\p
In the first days of February, the 5th Conference on Retroviruses and Opportunistic Infections in \JChicago\j was told about a blood sample, taken in what is now \JKinshasa\j, Democratic Republic of \JCongo\j, in 1959, which shows fragments of HIV-1. The sample was taken from a man, to be used in a study of the \Jgenetics\j of immunity.\p
In all, researchers have recovered just 15% of HIV-1's complete \Jgenome\j, which they then sequenced. This virus, dubbed ZR59, appears to be closely related to the common ancestor of three strains found in Europe, North America, and \JAfrica\j. The research team believes that this common ancestor must have been introduced into humans from animals sometime in the 1940s or 1950s. The work was described in detail in \INature\i in early February.\p
The main importance of the find lies not in destroying a cruel rumor, but in the help it gives vaccine makers. By capturing the \Jgenome\j of an early version of the virus, we now have a better idea about those parts of the viral \Jgenome\j which are "conserved" as the rest of the \Jgenome\j mutates, the parts which seem to be essential to the survival of the virus. It can twist and turn, disguising itself by changing other parts of the \Jgenome\j, but the conserved parts will always be there as a target.\p
\p
#
"Vaccine plans",435,0,0,0
(Jan '98)
AIDS researchers remain divided about the best vaccines to use as a biotechnology company gets ready to carry out phase III trials of an HIV vaccine in the United States and \JThailand\j. Neither the Thai nor the US authorities have approved phase III trials, although the US FDA has approved phase I and II trials, which test for safety and early signs of efficacy of a modified version of a vaccine that has already gone through toxicity testing. Thai authorities are expected to give their approval in the coming weeks.\p
Two competing tensions arise here: the urge to get a possibly life-saving vaccine into use, and the fear that the vaccine could turn out to be worthless, or worse, a killer. Meanwhile, plans are proceeding to get the vaccine, based on a genetically engineered version of a protein called gp120 that makes up much of the outer coat of HIV, into use.\p
But while a report in the February \IJournal of Virology\i cast doubts on whether vaccines like the one to be used in the trials protect against HIV, other approaches are emerging as well. Some HIV-infected patients who have a mutant gene for a chemokine called SDF-1 progress much more slowly to full-blown AIDS or death than do people with a normal version of the gene, according to a report in \IScience\i during January. In the past, mutated forms of chemokine receptors have been implicated in a slowing of the process. As the central problem with HIV is why it takes so long to destroy the immune system, the mutated SDF-1 gene may well prove to be a key finding.\p
Meanwhile, January saw plans announced in \JPennsylvania\j for a new clinical trial, combining a proven antiretroviral drug therapy with an experimental DNA vaccine in an effort to eradicate HIV in infected patients. Three different agents which block HIV replication will be given at the same time: this has already been shown to take virus levels in many patients down to the very limits of detectability. This is not a true cure, since the virus lurks quietly in some types of T cells, threatening to return.\p
The vaccine uses elements of four HIV genes, known in the literature as env, rev, gag, and pol. An earlier trial, using a vaccine with just env and rev components, increased antibody production without apparently affecting the infection. With luck, the patients treated with drugs will have enough of an immune system restored that they will be able to drive out the remaining HIV particles.\p
Three groups of seven patients will be given the treatment, each successive group getting three times the vaccine level of the previous group: this allows safety issues to be addressed, while maximizing the chance that any positive immunological results will be clear and obvious.\p
\p
#
"Ring out the old",436,0,0,0
(Jan '98)
\BTinnitus\b, a constant and debilitating "ringing in the ears", is no joke to those who suffer from it. In the United States, where the recent research was carried out, 10% of elderly Americans have the condition, and they often suffer depression, anxiety, sleep disruption, and other symptoms that have a major impact on their quality of life.\p
The study, reported in mid-January, describes how \Jpositron\j emission \Jtomography\j (PET) can pinpoint the specific brain regions responsible for tinnitus. This is the first major breakthrough in finding a cure for the problem.\p
The researchers worked with unusual tinnitus patients who can control the loudness of the ringing by clenching their jaws. The team was able to track changes in the brain's blood flow through PET scans taken while these patients manipulated their symptoms. In this way, they were able to build a map of the brain site responsible for tinnitus activity.\p
An odd finding stands out: the patients had a link between the auditory system and the hippocampus, part of the limbic system, the area of the brain which controls emotions, perhaps explaining why tinnitus can be emotionally crippling.\p
\p
#
"Eating ginkgo to learn and live",437,0,0,0
(Jan '98)
\IGinkgo biloba\i is prescribed widely in Europe to improve brain function, and now it appears to improve learning and memory in rats, and to prolong their lives as well. This was a spin-off from a study of rats' cognitive losses as they grew old, when researchers noted that the \Jginkgo\j rats were living longer.\p
While \Jginkgo\j can be obtained as a dietary supplement in the United States, it is not approved as a drug. A complex mixture of perhaps 200 different chemicals obtained from \Jginkgo\j leaves is prescribed in \JGermany\j and \JFrance\j under the name EGb 761, and this was the product used in the study.\p
The effect on the 20-month-old rats was dose-related. While the standard dose during most of the study was 50 mg/kg, one sub-group of animals was given EGb 761 at 100 mg/kg and then at 200 mg/kg, interspersed with periods when they performed maze-running tasks while receiving no extract. The results showed that at the highest dose rate, the rats' errors declined by 50%. The next problem: working out which chemical (or chemicals) from the extract to use in the future.\p
\p
#
"Garlic good for the arteries",438,0,0,0
(Jan '98)
In myth, wearing \Bgarlic\b is supposed to keep vampires from attacking your blood vessels, but taking \Jgarlic\j internally seems to do far more real good. In particular, \Jgarlic\j protects the \Baorta\b, keeping it elastic, according to a recent report in the journal \ICirculation\i.\p
This is important because tremendous pressure surges rush out of the top of the heart. If these pressure waves flowed on to the ends of the arterial system, our capillaries would all burst. The capillaries are saved by an elastic \Jaorta\j that swells like a \Jballoon\j, absorbing the pressure shock, and then squeezing in again to push the rest of the surge through the body. In effect, the \Jaorta\j acts as a shock absorber for the circulation.\p
A German study, reported in January, observed more than 200 German men and women, half of whom took 300 mg or more of standardized and odorless \Jgarlic\j powder in tablet form every \Jday\j for two years. Those who took the \Jgarlic\j supplement had a 15% reduction in aortic stiffness compared with the control group. The aortas of 70-year-old subjects who took \Jgarlic\j were as elastic as the aortas of 55-year-old subjects who didn't take \Jgarlic\j, according to one of the researchers. Interestingly, the effects increased with age, perhaps because younger people have less need of the supplement, as their aortas are performing normally.\p
\BKey names:\b Gustav Belz, Harisios Boudoulas\p
\p
#
"Cloning",439,0,0,0
(Jan '98)
The \Jcloning\j topic took an interesting turn or two in January. A 69-year-old physicist, Richard Seed, unaffiliated with any university or research institution, announced his plans to start a human \Jcloning\j clinic to help infertile couples to have children. Seed became an overnight sensation and prompted some heated discussions about ethics, but by the end of January, he seemed to have dropped from view.\p
Rather more importantly, the International Embryo Transfer Society was told about a discovery which showed that unfertilized cow's eggs can incorporate and, seemingly, reprogram at least some of the genes from adult cells from an array of different animal species, including sheep, pigs, rats, \Jcattle\j and primates. The significance of this finding is that it suggests the molecular machinery responsible for programming genes within the \Jcytoplasm\j of the egg may be similar or identical in all mammals.\p
Ear cells from five different fully-grown mammals were taken, and their genes were added to the unfertilized cow's eggs (ova). In each case, the cells then developed into "viable preimplantation-stage embryos," or embryos which gave every appearance of being alive and able to implant into the uterus wall, ready to develop and grow.\p
Whereas Dolly the sheep developed in a sheep \Jovum\j, it now seems possible that rare and endangered mammals might be reproduced by \Jcloning\j, using the ova of some other, more common, \Jmammal\j.\p
\p
#
"Cellulose genes",440,0,0,0
(Jan '98)
It began with a small plant, a relative of the mustard plant, called \IArabidopsis\i, growing in \JAustralia\j's capital city. Richard Williamson of the Australian National University in \JCanberra\j noticed that a mutant \IArabidopsis\i variety produced much less \Jcellulose\j when grown in soil hotter than the normal 18°C.\p
\JCellulose\j is the main structural material of plants: every plant cell is wrapped in a \Jcellulose\j wall, strings of sugar molecules that hold the cell together. \JCellulose\j is the part of plants that we keep when we turn plants into paper, but while the molecule is common, we have little idea of how plants make it.\p
In \Jgenetics\j, we get our first hints when something goes wrong, and here the problem was a simple one: at 31°C, the mustard plant produced less \Jcellulose\j, suggesting that one or more genes were defective. The next step was to cross the mutant plant with normal plants a few times, and get an indication of roughly where the important gene was on one of the plant's five chromosomes. After identifying that short strand of DNA, the researchers snipped it out and inserted it into yeast DNA. Then they were able to grow copies of the few dozen genes it contained. Finally, they used a gene-sequencing machine to decode each gene.\p
Work like this always involves collaboration, and Williamson's group turned to a team from the University of \JCalifornia\j, Davis, headed by plant biologist Deborah Delmer. In 1996, Delmer's group had found strong but not conclusive evidence that an almost identical gene controlled \Jcellulose\j synthesis in cotton.\p
Finally, to prove that the related \IArabidopsis\i gene was the mutant behind \Jcellulose\j production, Williamson's team isolated a normal gene and cloned it into a mutant plant, which then produced normal amounts of \Jcellulose\j, even at high temperatures. According to Delmer, there may be as many as ten genes involved in regulating \Jcellulose\j production, so the current score is one down, nine to go.\p
\p
#
"The fountain of youth?",441,0,0,0
(Jan '98)
All cells seem to come equipped with an expiry date, a limited lifetime, a built-in limit to their maximum age. For about the last twelve years, researchers have believed that human cell division is regulated by structures called telomeres, specialized stretches of DNA located at the ends of the chromosomes. But how do you prove a hunch like that?\p
The successful answer, reported in \IScience\i in mid-January, was "leaked" a few days before the journal was published. Usually, scientists who break news early like this are criticized by their colleagues: breakthroughs should only be announced in a peer-reviewed scientific journal, or in a symposium attended by other scientists.\p
The news of a major breakthrough is usually released "under embargo" to the media several days early, so that people like your reporter can write up new research before it is officially announced, but occasionally, an embargo is broken. Then again, scientists, like other humans, gossip, so occasionally scientists need to "go public" in a press conference rather than "publishing" in the time-honored way. However it happened, this piece of news "broke," and so was announced publicly before it was released in print.\p
The telomeres protect the genetic information carried on the chromosomes. Because of the way DNA is replicated, the ends of a sequence are not completely copied. If the telomeres were not there, that information would gradually be lost. But that same imperfect copying also causes the telomeres themselves to be eroded away each time a cell divides. Finally, when the telomeres reach what is called their "threshold length," cells stop dividing, then they become senescent, and eventually they die.\p
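The arithmetic of this erosion is easy to simulate. Here is a toy sketch in Python; the numbers (a 10 kb starting length, 50-100 bases lost per division, a 4 kb threshold) are illustrative assumptions only, not figures from the report.\p
# A toy simulation of telomere erosion: each division trims the telomere,
# and the cell line stops dividing at a threshold length. All numbers here
# are illustrative assumptions, not data from the study.
import random

def divisions_until_senescence(start_bp=10_000,
                               loss_range=(50, 100),
                               threshold_bp=4_000):
    length, divisions = start_bp, 0
    while length > threshold_bp:
        length -= random.randint(*loss_range)  # imperfect copying each division
        divisions += 1
    return divisions

print(divisions_until_senescence())  # typically 60-120 with these numbers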
The new research suggests that an \Jenzyme\j called telomerase can not only extend the lifetime of several types of cells, but may be used in new ways to treat aging-related diseases and suppress tumors. Telomerase works to rebuild telomeres, but is normally absent from the body's cells, except for those that produce eggs and sperm. The researchers injected a cloned telomerase gene into cultured cells from \Jretina\j, skin, and blood vessels. All of these tissue types are associated with degenerative, aging-related diseases.\p
The cells began to divide vigorously, and completed at least twenty more cycles than normal cells. More importantly, there were no signs of karyotypic abnormalities, errors in the numbers and forms of the chromosomes. This was a significantly longer life-span, and this finding is likely to have important applications in medicine (obviously) but also in basic biological research, wherever cells are maintained in culture.\p
\p
#
"The Bruno Rossi Prize",442,0,0,0
(Jan '98)
The Bruno Rossi Prize, awarded annually by the American Astronomical Society "for a significant contribution to High \JEnergy\j Astrophysics, with particular emphasis on recent, original work," has been awarded jointly to the team that operates the Dutch-Italian BeppoSAX X-ray \Jastronomy\j \Jsatellite\j and Dr. Jan van Paradijs, who works in both \JAlabama\j and \JAmsterdam\j. Van Paradijs led a team that identified the first known optical counterpart for a gamma-ray burster in February 1997, while the Dutch-Italian team actually discovered the burst.\p
Van Paradijs is the Pei-Ling Chan eminent scholar in astrophysics at the University of \JAlabama\j, \JHuntsville\j (UAH), and splits his time between \JAmsterdam\j and \JHuntsville\j, where he collaborates with the BATSE (Burst and Transient Source Experiment) team, working with the Compton Gamma-Ray Observatory.\p
Since the 1970s astrophysicists have known about bursts of gamma radiation which appear at random times and locations in the sky. The BATSE instrument on the Compton Observatory was expected to show the bursts coming from within the Milky Way galaxy. Instead, BATSE observations showed that the bursts most likely originate near the "edge" of the observable universe.\p
On February 28, 1997, BeppoSAX observed an x-ray glow immediately following a gamma ray burst (GRB), identified by its date as GRB970228. The x-ray observations gave a very precise location in the sky for the GRB, and just 21 hours later, the location box for the source had been calculated. Soon after, van Paradijs found that it contained a brilliant object, one which was not in previous images of the area. Three days later, the gamma ray source had faded away again, although the object itself is still visible.\p
Gamma ray bursts were discovered by accident 30 years ago from data taken by the Vela \Jsatellite\j series. The technology was developed when American science policy advisers became concerned about the possibility of secret Russian nuclear tests in space, and proposed building satellites carrying detectors like those used to analyze nuclear blasts on \JEarth\j.\p
A secret project called Vela was started, launching its first \Jsatellite\j in 1963, carrying six gamma ray detectors and other instruments. The third \Jsatellite\j carried gamma ray detectors made of cesium iodide, which scintillates--flashes with visible light--when gamma rays pass through it, and the \Jelectronics\j improved rapidly with each launch.\p
Starting with Vela 4, tests were carried out to see if natural causes could trigger the detectors. This involved poring over books of computer print-out, line by line, looking for effects which might have come from cosmic radiation passing through the \Jsatellite\j. By mid-1969, this "hand analysis" of data collected on July 2 1967 showed the first recorded gamma ray burst, a pattern quite unlike what was known from nuclear explosions.\p
Using the Vela 5 and Vela 6 satellites, with timing synchronized to less than 1/64 of a second, scientists could use triangulation to show that the events were coming from beyond the \J\Jsolar system\j\j. This was reported in the \IAstrophysical Journal\i in 1973, when sixteen GRBs were identified. The launch of the Compton Gamma Ray Observatory in 1991 opened up the prospects of some serious searching, and led eventually to the 1997 discoveries. BATSE has detected over 2,000 cosmic bursts, more than all other experiments combined, and two additional optical counterparts have since been found for gamma ray bursts, the most recent discovery coming only in the third week of December 1997.\p
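The triangulation rests on simple geometry: a burst's wavefront reaches two separated satellites at slightly different times, and the arrival-time difference confines the source direction to a cone around the baseline between them. A minimal sketch in Python, using an illustrative baseline rather than the historical orbital figures:\p
# Timing triangulation in outline: a plane wavefront from a distant burst
# reaches two satellites at times differing by dt. The source then lies on
# a cone around the baseline with half-angle theta, cos(theta) = c*dt/d.
# The 250,000 km baseline below is illustrative, not a historical figure.
import math

C = 2.998e8  # speed of light, m/s

def cone_angle_deg(dt_s, baseline_m):
    cos_theta = C * dt_s / baseline_m
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("time difference too large for this baseline")
    return math.degrees(math.acos(cos_theta))

# With 1/64 s timing, each timing step moves the cone by about a degree:
print(cone_angle_deg(1 / 64, 2.5e8))  # ~88.9 degrees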
\BKey names:\b Klebesadel, Fishman\p
\p
#
"How a termite finds a home",443,0,0,0
(Jan '98)
Mosquitoes home in on food animals like us by detecting carbon dioxide from our breath, and then switching to homing-in on our warmth when they get close. Now it has been discovered that termites look out for carbon dioxide as well.\p
Researchers did this by putting termites in a simple T-maze, where one arm of the T was supplied with normal air, and the other arm got air enriched with carbon dioxide, up to levels higher than those normally found in soil. At the junction, the termites moved their antennae, and most chose the higher level of CO\D2\d. Researchers believe that the termites are attracted because the higher levels of carbon dioxide usually indicate rotting wood.\p
The next step? Finding a way to use this knowledge to develop an inexpensive, non-toxic alternative to current methods of pest control. Two solutions are being considered: luring termites to monitoring traps or to sources of insecticides, or using slow releases of CO\D2\d to confuse termite behavior to the point where a colony cannot sustain itself.\p
This discovery was prompted by earlier work by one of the same team, who showed that the western corn rootworm, a pest that causes $1 billion in crop damage each year, relies only on CO\D2\d to find young corn roots. The larvae must locate roots within three days of hatching or die of starvation, and so they can be controlled at corn planting time by burying pellets which slowly release the gas, steering rootworm larvae off-course and to their deaths.\p
\BKey names:\b Louis Bjostad, Elisa Bernklau, and Erich Fromm\p
\p
#
"The oldest fossil ants",444,0,0,0
(Jan '98)
Late in January in the journal \INature\i, a team of researchers from the American Museum of Natural History announced the discovery of the oldest \Jfossil\j ants ever found. The extremely rare 92-million-year-old ants are preserved in amber from a location in New Jersey that has produced some of the world's most important amber-encased fossils, and they are 50 million years older than the previous "most ancient fossils" that were clearly recognized as ants.\p
Ants are distinguished by an anatomical structure called the metapleural \Jgland\j, which is clearly visible in the specimens. The \Jgland\j, found above the hind legs, is a key to the ants' ability to live in colonies underground or in rotting trees: it secretes a substance that functions as an antibiotic, preventing \Jbacteria\j and fungi from invading the ants' nests and infecting the members of the colony, and it is probably central to the ants' development of their complex social system.\p
The find includes three worker and four male ants, and represents both primitive and more advanced types of ant, showing that the group was well-established by the time these specimens were trapped in the plant sap that would one \Jday\j become amber. A reasonable estimate would place the origin of ants in the Lower Cretaceous at about 130 million years ago.\p
\BKey names:\b Donat Agosti, David A. Grimaldi, and James Carpenter.\p
\p
#
"Gobi dinosaurs died in a sand slide",445,0,0,0
(Jan '98)
The January issue of \IGeology\i presents new evidence that the dinosaurs and other ancient creatures from the \BGobi Desert\b were killed by sudden avalanches of \Jwater\j-soaked sand flowing down the sides of dunes. The site, in the area known as Ukhaa Tolgod (Brown Hills), is one of the world's richest Late Cretaceous \Jfossil\j sites.\p
Ukhaa Tolgod is virtually unparalleled in the extraordinary preservation of the specimens it yields. Minuscule skeletal structures are perfectly preserved. This remarkable quality of preservation shows that the animals at Ukhaa Tolgod were killed swiftly by sudden events that buried their bodies before they could be scavenged or destroyed by the \Jweather\j. It has often been presumed that immense sandstorms killed them, with wind-blown clouds of grit burying the dinosaurs alive.\p
Instead, the cause is now seen to have been a debris flow, or "sand slide," in which a massive quantity of wet sand rushes down the side of a dune, burying everything in its path in an avalanche of debris.\p
There are three distinct types of sandstones at the site, each revealing a different part of the puzzle. One type shows a well-defined bedding structure that is tilted at an angle of twenty-five degrees and is arranged by particle size; such structure is typical of wind-blown deposits. This \Jsandstone\j was likely formed during violent storms like those long thought to be the \Jdinosaur\j's killers, but it contains no skeletal remains.\p
A second type of \Jsandstone\j did not show the fine-scale structure of the first type, but similarities in its texture, and its large tilted and cemented sheets of sand showed that it too was created by the action of the wind. Burrow marks made by insects and other tiny creatures were present in the \Jsandstone\j, but only below a certain depth.\p
The third type of \Jsandstone\j is the one in which all of Ukhaa Tolgod's hundreds of \Jfossil\js have been found, and it drew particularly close attention from the team. Unlike the other two types of \Jsandstone\j, this showed no structured layering at all. Large pebbles and cobbles, much too big to have been carried by the wind, are sometimes present in these sandstones, indicating that they were not formed by wind action, and thus ruling out the possibility that windy sandstorms delivered the fatal blow to the dinosaurs of Ukhaa Tolgod.\p
To test this, the team reviewed the travel literature of Central Asia and \JArabia\j to see if there were any modern-\Jday\j accounts of animals buried alive in sandstorms. The search turned up no such mass smotherings, but the team heard stories of vehicles that were half-buried by sand flows generated by a heavy rainstorm in \JNebraska\j, where there are dunes which are probably similar to those of Ukhaa Tolgod.\p
The sliding seems to be related to wind-blown clays coating the sand grains: the clays inhibit the dune's ability to absorb \Jwater\j, so that an unusually heavy rain can send a slurry of wet sand rushing down the dune's face. If the dunes had been actively migrating sand dunes, they would probably have lost their clay, so the find also tells us that the sandhills of Ukhaa Tolgod were not a howling, sterile desert, but rather a stabilized dune field where plant life and rainfall were relatively abundant.\p
\BKey names:\b David Loope, Lowell Dingus, Carl Swisher, and Chuluun Minjin\p
\p
#
"Fewer earthquakes kill more",446,0,0,0
(Jan '98)
Seventeen major earthquakes, rated at \Jmagnitude\j 7.0 or higher, were recorded around the world in 1997, according to an early-January report from the US Geological Survey's National \JEarthquake\j Information Center (NEIC). This compares with 21 such quakes in 1996, but the 1997 death toll, set down as "at least 2913," is much greater than that for 1996, when 449 people were killed.\p
This number will increase in 1998: a single \Jearthquake\j in early February, as this report was being prepared, is believed to have killed more than 4000 people in \JAfghanistan\j. The worst 1997 \Jearthquake\j was in northern \JIran\j on May 10. With an estimated \Jmagnitude\j of 7.1, it caused at least 1567 deaths, 2300 injuries, and left 50 000 people homeless.\p
The most newsworthy \Jearthquake\j was undoubtedly the Italian quake which damaged the Basilica of St. Francis of Assisi, and devastated tourist income for an entire region after people who saw the spectacular footage of the damage changed their vacation plans.\p
The US Geological Survey says that the number of earthquakes of \Jmagnitude\j 7.0 or higher has remained fairly constant throughout this century, even though many people believe that they are becoming more common. This belief is partly a matter of perception, given better media coverage (and more people carrying video cameras able to take broadcast-quality footage), and partly a result of rising human populations, which mean more lives lost and more structures destroyed when a quake strikes.\p
The USGS estimates that several million earthquakes occur in the world each year. Many of these go undetected because they occur in remote areas or have very small magnitudes. The Survey locates 12 000 to 14 000 earthquakes each year (about 35 per \Jday\j).\p
\p
#
"Neutrinos to reveal the earth's interior?",447,0,0,0
(Jan '98)
There is a huge amount of heat inside the \JEarth\j: the surface emits about 40 trillion watts, and perhaps 40% of this comes from the slow decay of the unstable \Jisotopes\j uranium-238 and thorium-232. This heat fuels volcanoes and drives the slow movement of the \Jplanet\j's plates, but exactly how and where the heat is produced remains a secret.\p
Now neutrino detection experiments being built in \JItaly\j and \JJapan\j to study the \Jsun\j's neutrinos might be able to deliver a bonus: a sort of PET scan of the whole \Jplanet\j. Neutrinos are also emitted when uranium and thorium decay, and the new detectors should be able to distinguish these from solar neutrinos, because they will have different energies. At the very least, we should be able to get the first global analysis of the \JEarth\j's uranium and thorium, perhaps revealing the ratio of these elements in the continental and oceanic crusts, and shedding light on the various models of how \JEarth\j's surface segregated from the rest of the molten \Jplanet\j.\p
\p
#
"Watching the Andes grow",448,0,0,0
(Jan '98)
The Global Positioning System, a network of two dozen orbiting satellites, was developed in the 1970s by the US Defense Department to locate ships and \Jaircraft\j to within a range of about 100 meters. The same degree of accuracy proved useful to airlines, trucking companies, and shipping fleets, and is now in widespread civilian use by drivers, boaters, and even hikers.\p
But geologists have spent the past decade refining the technology to be able to pinpoint a location with far greater precision, down to 3 millimeters (about an eighth of an inch). This is well beyond military or ordinary navigational needs, but if you keep recording for a couple of days, and use a $20,000 receiver instead of a $300 receiver, you can get sub-centimeter accuracy.\p
In a recent study, reported in \IScience\i during January, researchers wanted to study the relative movements of the Nazca plate and the South American plate on the west coast of South America. To do this, they drove large pins into firm ground at 43 separate locations on the South American continent. They traveled to each site, used a tripod-mounted precise optical plumb system to position an antenna directly over the landmark, and then tracked the GPS satellites for a couple of days while the receiver logged its position.\p
The results reported show that about three inches (8 cm) of motion per year occurs between the Nazca and South American plates, and that this motion is divided three ways. About 44% (1.3 inches, or 3.5 cm, per year) of the motion represents the Nazca plate sliding smoothly under South America, giving rise to volcanoes. Another 44% is almost certainly locked up at the plate boundary, squeezing South America, and is released every hundred years or so in great earthquakes. The remaining 12% of the motion crumples South America, building the \JAndes\j.\p
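As a quick check of that three-way split, here is a minimal sketch in Python using the figures quoted above (the centimeter and inch conversions are ours):\p
# The three-way division of Nazca-South America convergence described above.
TOTAL_CM_PER_YR = 8.0   # about three inches per year, from the GPS study

split = {
    "smooth subduction (feeding volcanoes)":   0.44,
    "locked at the boundary (earthquakes)":    0.44,
    "crustal shortening (building the Andes)": 0.12,
}

for process, fraction in split.items():
    cm = fraction * TOTAL_CM_PER_YR
    print(f"{process}: {cm:.1f} cm/yr ({cm / 2.54:.2f} in/yr)")
# smooth subduction works out at ~3.5 cm/yr, matching the ~1.3 in/yr quoted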
\p
#
"It's a wobbly old world",449,0,0,0
(Jan '98)
What are we to assume when sea levels go up? Most of us think it is something to do with global warming and melting ice caps, but it can also be due to local land movements.\p
Or it might be caused by the world wobbling on its axis. In a report in \IScience\i in January, Jerry Mitrovica and Jon Mound used numerical simulations to show how long-term changes in the direction that the \JEarth\j's rotation axis points, in other words, wobbling of the \Jplanet\j, can cause sea-level variations which exceed 100 meters.\p
Over the past 140 million years, there have been rises and falls of ocean levels of between 100 and 300 meters, an effect which has been blamed on changes in the elevation of the ocean floor caused by changes in the rate of sea-floor spreading. It is quite possible that all of this change was caused by wobbling, also known as polar wander. And strangely enough, the effects in different parts of the world, at the same time, can be in opposite directions: some areas may see a rise in sea level even as others see a fall.\p
\p
#
"El Ni±o a worry for astronomers as well",450,0,0,0
(Jan '98)
When astronomers use a ground-based \Jtelescope\j to look at stars overhead, they are always looking through the same thickness of \Jatmosphere\j, but when the star is near the horizon, \Jrefraction\j bends the light, and makes the star appear to be in a slightly different place. The extra \Jatmosphere\j also makes the star slightly dimmer, an effect called atmospheric \Jextinction\j.\p
There are two easy ways to allow for this atmospheric \Jextinction\j. The astronomers can either take measurements on certain standard stars during their observations, or they may rely on standard tables of \Jextinction\j values instead. That might be fine if you are working on visible light, but it can be bad news for astronomers looking at the infrared spectrum. Infrared light \Jextinction\j is mainly caused by \Jwater\j vapor in the \Jatmosphere\j, and during an El Niño event, the \Jwater\j content of the air changes enough to make a difference of up to 2%, which can be very important in some lines of astronomical research, says Jay Frogel.\p
Not surprisingly, Frogel's work is in just such an area, explaining why he has sounded a note of warning to his colleagues in the latest issue of \IPublications of the Astronomical Society of the Pacific\i. Interestingly, there is another side to Frogel's observations: he points out that his data, collected over some fifteen years, might also reveal changes in \Jwater\j vapor levels over a period which stretches back to the last really bad El Niño event, the one of 1982-83.\p
Working at Cerro Tololo in \JChile\j, his records reflect the southern seasons, with greater \Jextinction\j in the southern summer months of January, February, and March, and decreased \Jextinction\j during the winter when the air is colder and can hold less moisture. On top of that, he found a second pattern, almost exactly matching the ENSO (El Niño-Southern \JOscillation\j) Index, showing that there really is a great deal more \Jwater\j around in the \Jatmosphere\j during an El Niño year.\p
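For the record, the correction astronomers apply is straightforward; Frogel's point is that for infrared work the extinction coefficient itself drifts with atmospheric water vapor in an El Niño year. A minimal sketch in Python, with an illustrative coefficient value:\p
# First-order extinction correction: m0 = m_obs - k*X, where X ~ sec(z)
# is the airmass and k the extinction coefficient in magnitudes per
# airmass. The k value below is illustrative, not an observatory figure.
import math

def airmass(zenith_angle_deg):
    """Plane-parallel approximation; adequate well away from the horizon."""
    return 1.0 / math.cos(math.radians(zenith_angle_deg))

def corrected_magnitude(m_obs, k_mag_per_airmass, zenith_angle_deg):
    return m_obs - k_mag_per_airmass * airmass(zenith_angle_deg)

# The same star seen 60 degrees from the zenith (airmass 2.0):
print(corrected_magnitude(10.50, 0.10, 60.0))  # 10.30 above the atmosphere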
\p
#
"Official - 1997 the hottest year ever",451,0,0,0
(Jan '98)
By a whisker, 1997 has been shown to be the hottest year on record. This has been confirmed by the US National Oceanic and Atmospheric Administration's National Climatic Data Center in Asheville, North Carolina. Taking "normal" as the average at 61.7°F (16.5°C), the mean \Jtemperature\j of the world was three-quarters of a degree Fahrenheit above normal. It exceeded the previous highest year, 1990, by 0.15°F (0.08°C).\p
This continues an undeniable trend, where nine of the past eleven years have been the warmest on record. There is some small hope: land temperatures in 1997 were slightly cooler than those of 1990, but that remains the faintest of hopeful signs, given that ocean temperatures were well up on past years, exceeding the previous record warm years of 1987 and 1995 by 0.3 of a degree Fahrenheit.\p
This means, according to the NOAA, that global \Jtemperature\j warming trends now exceed 1.0 degree Fahrenheit per 100 years, with land temperatures warming at a somewhat faster rate. With typical scientific reserve, they comment that it is likely that the sustained trend toward increasingly warmer global temperatures is related to anthropogenic increases in greenhouse gases-in simple terms, they think humans are the cause of the problem.\p
\p
#
"Hong Kong Flu",452,0,0,0
(Jan '98)
A sixth person had died of this disease by mid-January, but after that, the story lost the world's attention. More than 1.5 million birds were slaughtered, probably to no good effect, and by early February, imports of chickens from "mainland China" to Hong Kong had begun again.\p
By mid-January, the \Jgenome\j of the virus had been isolated and analyzed, with medical researchers having some ideas as to why the virus was such a killer, although they were no closer to understanding how a bird flu virus was managing to infect humans. They have a complete sequence of the genes that code for its surface proteins and partial sequences of the remaining \Jgenome\j.\p
Currently, different bird flu viruses are being analyzed to see if the strain that kills humans can be found in birds. So far, there has been only one suspected case of transfer from one person to another.\p
\p
#
"Back to the moon again",453,0,0,0
(Jan '98)
In early January, NASA launched a probe to the moon to look for \Jwater\j. Lunar Prospector is another "cheap" mission, coming in at US$65 million. This is the first time in 25 years that NASA has been to the moon, and the dreamers behind the mission have been full of talk about \Jwater\j and lunar colonies, but given the foreseeable political climate and lack of a profitable incentive, that's not likely to happen for a while yet.\p
\p
#
"International Space Station news",454,0,0,0
(Jan '98)
The first launch towards the completion of the 16-nation \BInternational Space Station\b is due in less than five months if completion is to happen in 2002, and at the end of January 1998, the agreement was finally signed. A number of the signatories indicated that they still have misgivings, but the project now appears to be on schedule.\p
But what operating system will the space station's network use? \JScience fiction\j buffs have noted with some glee that Windows NT contains a Hardware Abstraction Layer, or HAL for short. Will any of the space station's crew be called Dave?\p
\p
#
"Universe to keep on going",455,0,0,0
(Jan '98)
Further evidence against the likelihood of a Big Crunch has been found in a study of distant exploding stars. An analysis of 40 of the roughly 65 supernovas so far discovered by the \JSupernova\j \JCosmology\j Project indicates that we live in a universe that will expand forever. Apparently there isn't enough mass in the universe for its gravity to bring to a halt the expansion that started with the \BBig Bang\b.\p
Some of the most distant supernovas are 7 billion light years from \JEarth\j, and the light from these stars has been considerably red-shifted. They can only be seen because supernovas are so intrinsically bright that their light is visible half-way across the observable universe, but even so, after such a journey the starlight is feeble, compared with nearby supernovas.\p
Supernovas of the type called "Type Ia" are triggered when a dying white dwarf star pulls too much gas away from a neighboring red giant, starting a thermonuclear explosion that rips the white dwarf apart. These Type Ia supernovas are all very much the same intrinsic brightness, so the brightness that survives the journey gives us a good clue to each star's distance, while the \Jredshift\j gives us its speed. Together, these two measurements tell us how the universe is expanding, and whether the rate of expansion is slowing down. The short answer: it is not slowing down, so the universe will expand forever.\p
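In outline, the method reduces to two textbook relations: the distance modulus for a standard candle, and the redshift-velocity relation. A minimal sketch in Python; the peak absolute magnitude of about -19.3 is a commonly assumed calibration, not a figure from this report, and v = cz is only a low-redshift approximation.\p
# Standard-candle logic: comparing apparent with intrinsic brightness gives
# distance; redshift gives recession speed. M = -19.3 is an assumed peak
# absolute magnitude for Type Ia events, not a figure from the report.
C_KM_S = 299_792.458

def distance_parsecs(apparent_mag, absolute_mag=-19.3):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

def recession_speed_km_s(redshift_z):
    """v = c*z, a crude approximation that breaks down at high redshift."""
    return C_KM_S * redshift_z

print(f"{distance_parsecs(23.0):.3g} pc")       # a faint, very distant event
print(f"{recession_speed_km_s(0.1):.0f} km/s")  # ~30,000 km/s at z = 0.1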
\p
#
"Jupiter's aurora",456,0,0,0
(Jan '98)
Like \JEarth\j, Jupiter has auroras at its \Jpoles\j, and astronomers have known this for decades. Now, using a special filter with Hubble's Space \JTelescope\j Imaging Spectrograph (STIS) and Wide Field Planetary Camera 2 (WFPC2), University of \JMichigan\j astrophysicist John Clarke has produced the best-ever images of this phenomenon.\p
While it was known from earlier studies that Jupiter's moon Io left a "footprint" in the auroral display, we now know that the small moon Ganymede also appears to leave a slight but detectable electromagnetic footprint on Jupiter's \Jatmosphere\j. Selected images are available for viewing at: \Bhttp://oposite.stsci.edu/pubinfo/pr/98/04.html\b\p
\p
#
"Io's glowing poles",457,0,0,0
(Jan '98)
The Space \JTelescope\j Imaging Spectrograph has been busy lately. Besides active volcanoes, lakes of molten sulfur, and vast fields of sulfur dioxide snow, we now know that Jupiter's moon Io has caps of glowing \Jhydrogen\j gas at its \Jpoles\j. Hubble's STIS revealed these glowing caps for the first time when Frederick Roesler described his work to a meeting of the American Astronomical Society.\p
Roesler says he has no idea where the \Jhydrogen\j came from or why it is glowing. He believes the observation may indicate that Io's \Jpoles\j are swathed in a frost of molecular \Jhydrogen\j sulfide, a toxic gas that requires temperatures of around -130°F (-90°C) to freeze. Or maybe there are other \Jhydrogen\j-bearing frosts concentrated at Io's \Jpoles\j.\p
Another possibility is that the glow is caused by a large electrical current flowing between Jupiter and Io, which also propels \Jhydrogen\j atoms to Io from the \Jhydrogen\j-rich Jovian \Jatmosphere\j.\p
\p
#
"Black hole news",458,0,0,0
(Jan '98)
Using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) on the Hubble Space \JTelescope\j (HST) combined with archival images from the HST's Wide Field Planetary Camera (WFPC), astronomers have been able to trace where the dense dust and gas reside in the central regions of several Seyfert galaxies.\p
From this, they have been able to identify spiral dust lanes that appear to provide fuel for black holes at the centers of active galaxies. This result is important because it suggests a new mechanism to fuel activity in galaxies. A typical Seyfert galaxy emits the energy equivalent of 1 trillion suns from a region the size of the \Jsolar system\j or smaller, and the black holes in Seyfert galaxies are believed to be very massive, perhaps the equivalent of 10 million suns.\p
Two false-color images on the world wide web (\Bwww.ciw.edu/regan/n1667e.gif\b and \Bwww.ciw.edu/regan/n3982e.gif\b) show active galaxies NGC 1667 and NGC 3982, respectively.\p
\BKey words/names:\b Michael W. Regan, John S. Mulchaey, the Carnegie Institution of Washington\p
\p
#
"Huge comet shower, not many hurt",459,0,0,0
(Jan '98)
Most scientists now accept the idea that showers of comets in the past have caused extinctions, and some have even proposed that regular swarms of comets have battered \JEarth\j, devastating life. The cycle usually suggested is 30 million years.\p
But where is the evidence? The logical place to look is somewhere that has not been disturbed for a very long time, to see if we can find any \Jcomet\j dust. Now the American Geophysical Union has been given the results of just such a search, when geochemists reported finding a high level of \Jcomet\j dust in 35-million-year-old ocean sediments. From the pattern, it looks as though the \Jcomet\j shower lasted for some two million years, but there is no sign in the \Jfossil\j record of any major \Jextinction\j events during that time.\p
\p
#
"Comet swarms less likely",460,0,0,0
(Jan '98)
Far beyond Pluto, astronomers believe there are millions of comets in the \BOort \JCloud\j\b. Sooner or later, they say, something passes close enough to give some of the comets a gravitational nudge, and send them diving into the inner \Jsolar system\j. To do this, the visitor needs to come within one or two light years of our \Jsun\j.\p
Four million years ago, a triple-star system called Algol passed relatively close to the \Jsun\j, but new, very accurate measurements tell Lawrence Molnar of the University of \JIowa\j that Algol's approach was by no means the close encounter we thought it was: it never came closer than 13 light years, much too far off to have pushed any comets our way. And at the moment, there appear to be no other stars moving in on the Oort \JCloud\j either. For the moment, civilization is safe-until we find a way of destroying it ourselves.\p
\p
#
"How civilizations die",461,0,0,0
(Jan '98)
A sediment core retrieved from the bottom of the Gulf of Oman, 1800 kilometers from the heart of the Akkadian empire, reveals that a 300-year-long dry spell began just as the Akkadians' northern stronghold of Tell Leilan was being abandoned.\p
The Akkadian empire, centered on Akkad or Agade in southern \BMesopotamia\b around 2300 BC, was ruled by Sargon of Akkad (not to be confused with the much later Assyrian king of the same name). Under Sargon, Akkad conquered all the land between the Mediterranean Sea and the Arabian Gulf, becoming the first civilization to combine independent societies into a single state, but it splintered a century later. Now we know why.\p
\p
#
"Death of a scientist",462,0,0,0
(Jan '98)
\BKenichi Fukui\b died of cancerous \Jperitonitis\j during January. A theoretical and physical chemist, he shared the 1981 Nobel Prize for chemistry with Roald Hoffmann.\p
\p
#
"February, 1998 Science Review",463,0,0,0
\JA river ran through it\j
\JReviving an old idea\j
\JSex differences add up\j
\JChomsky's theories get a boost\j
\JBeware of the watchers\j
\JScientific literacy\j
\JNeedle-free immunization\j
\JWhere did you go for your holidays?\j
\JIs your sunscreen safe?\j
\JUsing viper venom to stop cancer\j
\JTough out lactose intolerance!\j
\JGerms cause high blood pressure\j
\JOral bacteria go for the heart as well\j
\JEbola unraveled\j
\JI knew that . . .\j
\JAntibiotic stupidity abounds\j
\JMedicine or witchcraft?\j
\JOne way to determine sex?\j
\JA very thin wire . . .\j
\JSenate ban plan fails\j
\JOverfishing still goes on\j
\JAre frogs dying of ultraviolet overload?\j
\JAlbatrosses on the Net, not netted\j
\JSome like it hot\j
\JTin worms\j
\JTougher than diamonds\j
\JPalaeontology news\j
\JThe walk of the pterosaur\j
\JNo sex, please, we're Northerners\j
\JSex at the right time\j
\JRecovering from your next mass extinction\j
\JThe dawn of the animals\j
\JHot spots in the news\j
\JCalifornia's next volcano\j
\JCold War end makes US Navy come in from the cold\j
\JWhy is Antarctica so cold?\j
\JAn even nastier greenhouse gas\j
\JBack to the moon again\j
\JTerrorists from space?\j
\JSeriously hot water\j
\JLights, window, action!\j
\JIdentifying x-ray sources\j
\JNow the wine is bad for you\j
\JDelivering anti-cancer drugs no pigment of the imagination\j
\JReginald Victor Jones 1911-1997, physicist, inventor and hoaxer\j
#
"A river ran through it",464,0,0,0
(Feb '98)
NASA's Mars Global Surveyor has sent back images of a river \Jcanyon\j, 4 kilometers wide. The \Jcanyon\j winds and twists just like a proper \Jearth\j \Jcanyon\j, but while this tells us \Jwater\j must have flowed through it, there are no signs of any tributaries, so it is uncertain whether the \Jcanyon\j was ever fed by rain. The answer to that may come in a year or so when the orbiter descends closer to the surface of Mars.\p
#
"Reviving an old idea",465,0,0,0
(Feb '98)
Once upon a time, most of the best statisticians had an interest in \Jastronomy\j, and most of the best astronomers had an interest in \1statistics\c. Early this century, quantum mechanics, \Jthermodynamics\j and \Jelectromagnetism\j came to the fore, and statistics took a back seat among the disciplines needed by a really good stargazer. Now the former enthusiasm for serious number-crunching is almost unknown.\p
At the AAAS meeting in February, Eric Feigelson and Jogesh Babu traced this fall from grace. Statistics were important in the days when Newton's work made it possible for astronomers to make repetitive, accurate measurements of planetary characteristics. Unfortunately, there were more data available than the astronomers could deal with, and they needed ways to reduce the data, to summarise them.\p
One attempt that worked was by a French astronomer, Adrien Legendre, who in 1805 published a new method for determining the orbits of comets by minimizing the sum of the squares of the errors. Today, with deep space exploration churning out gigabytes of information, these huge amounts of data pose problems for astronomers not only because of their size, but also because the number of individual properties recorded is large, creating multivariate databases. \p
These types of databases are best handled with such statistical methods as time series analysis, sampling theory, multivariate analysis and nonlinear regressions. Applying such methods to \Jastronomy\j forms the basis of the newly named field of astrostatistics. But where Legendre and his contemporaries worked at the cutting edge of statistics, today's astrostatisticians can borrow methods already developed in other fields of application.\p
Take survival analysis, the standard method used to estimate the lifetime of light bulbs and the survival rate of cancer patients. Nobody wants to wait around for the last light bulb to sputter out or the last laboratory animal to die to determine their average life spans, so statisticians long ago developed methods to let them compute the averages before the last subjects expired. This same method works for astronomical objects that are too faint to be detected. \p
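As an illustration of the borrowed method, here is a minimal sketch (ours, in Python, not the astronomers' own code) of the Kaplan-Meier product-limit estimator, the best-known survival-analysis calculation. A "censored" observation is a light bulb still burning, or a source still undetected, when the study stops; the data below are invented:\p

def kaplan_meier(observations):
    """Each observation is (time, event_seen); event_seen=False marks a
    censored case. Returns the estimated survival curve."""
    # At tied times, events are conventionally processed before censorings.
    data = sorted(observations, key=lambda obs: (obs[0], not obs[1]))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for time, event_seen in data:
        if event_seen:
            survival *= (n_at_risk - 1) / n_at_risk
            curve.append((time, survival))
        n_at_risk -= 1  # censored subjects also leave the risk set
    return curve

# Five light bulbs: burned out at t=3, 5 and 8; still burning when
# watching stopped at t=7 and t=10.
for t, s in kaplan_meier([(3, True), (5, True), (7, False), (8, True), (10, False)]):
    print(f"after t={t}: estimated survival {s:.2f}")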
\p
#
"Sex differences add up",466,0,0,0
(Feb '98)
What makes a mathematically precocious young child? Following a study reported in early February, and carried out at the University of Washington's Halbert Robinson Center for the Study of Capable Youth, it may have something to do with the color of their bootees.\p
A long-term study has found significantly more boys than girls with very high levels of mathematical talent. And even when children are put into an enrichment program to foster their abilities, mathematically-talented girls don't catch up with their male counterparts in the first two years of school. A further finding: the girls and boys who show an early aptitude and interest in mathematical studies are no "flash in the pan" whose interest wanes once they are in a formal classroom setting where other children are learning the basics. The researchers were equally interested in the control group, and reported that children with advanced math skills in both groups stayed ahead of their classmates in school, or even increased their lead. \p
The gender imbalance first showed up when researchers were setting up their study. In an effort to balance the sexes, special efforts were made to enlist girls. To be eligible for the study, children were required to score at or above the 98th percentile on at least one of three screening tests. Of the children screened, 348 qualified and 60 percent, or 210, were boys, and in the end, a number of the boys were excluded randomly, to balance the genders.\p
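How unlikely is a 210-to-138 split if boys and girls had equal chances of qualifying? A back-of-the-envelope check (ours, not the researchers') using the normal approximation to the binomial, sketched in Python, puts the odds at well under one in ten thousand:\p

from math import erf, sqrt

# Under a 50:50 null hypothesis, the number of boys among 348 qualifiers
# is roughly normal with mean n/2 and standard deviation sqrt(n)/2.
n, boys = 348, 210
mean, sd = n / 2, sqrt(n) / 2
z = (boys - 0.5 - mean) / sd  # the 0.5 is a continuity correction
p_one_sided = 0.5 * (1 - erf(z / sqrt(2)))  # 1 - Phi(z) for the normal CDF
print(f"z = {z:.2f}, P(210 or more boys by chance) = {p_one_sided:.1e}")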
This sort of result can sometimes be explained by tests which favor boys over girls, but at this age, that sort of bias seems unlikely to arise. Moreover, the very brightest of the bright mathematicians tended to be boys. The top 5% of those selected for the study, using any of eight different measures, were virtually all boys. While some measures might be biased, it is hardly likely that all eight would show the same bias.\p
Perhaps we will have to consider that there is some bias in our society, or in our genetic make-up, that produces this result, but the researchers seem to have missed the obvious question: if some girls, even a handful, managed to make it into the top group, what is it that makes them different? What is there about their \Jgenetics\j, their experiences, their environments, that might account for their better performance?\p
#
"Chomsky's theories get a boost",467,0,0,0
(Feb '98)
\1Noam Chomsky\c is an American linguist whose theories have caused many arguments among other linguists. Chomsky believes that there is some sort of "universal grammar" hard-wired into human minds, a structure of language that young babies bring to the task of learning to crack the code and join the throngs of speaking humans. Children may have to learn the words, in Chomsky's view, but they come with a ready-made framework into which to slot the language's systems of word meaning, sentence structure, and sounds (semantics, \Jsyntax\j, and phonology). \p
Now compelling evidence, prepared by a Cornell University psycholinguist, Barbara Lust, shows that both American and Taiwanese children as young as 3 years of age already have a remarkable knowledge of language structure and \Jsyntax\j. This knowledge is so complex and precise, says Lust, that it must challenge any known learning theory to account for its acquisition.\p
Lust's research was reported to the AAAS in mid-February, and described how 86 American children between 3 and 7 acted out sentences such as, "Ernie touches the ground and Big Bird does, too," "Oscar bites his banana and Bert does, too," and "Big Bird scratches his arm and Ernie does, too."\p
These sentences, as acted out, include information that is not given in the verbal form, such as what Ernie does in the last sentence. Does he scratch Big Bird's arm or his own arm? Each sentence appears simple, but actually has four correct grammatical interpretations and five incorrect interpretations. In collaboration with Chinese researchers, Lust also conducted matched studies with children whose parents spoke Mandarin Chinese in \JTaiwan\j. She found that these children also understood the complex grammar of their language in ways similar to the American children learning English.\p
While this study covers just two languages, Lust works with graduate students, including native speakers of Chinese, Japanese, Spanish and Tulu in the Cornell Language Acquisition Laboratory. She and her students also collaborate with native speakers in more than a dozen other languages, including German, Dutch, Swedish, Korean, Arabic, Indonesian, \JSinhalese\j, Inuktitut, and the South Asian languages \JHindi\j, Tamil, and Malayalam.\p
The evidence, said Lust, is that language acquisition comes easily to young children. Where professional linguists may take years trying to figure out the rules and principles and parameters of language, children seem able to create the right theory for whatever language is around them, whether it is English, French, Japanese, or Tulu, within just those first three years.\p
#
"Beware of the watchers",468,0,0,0
(Feb '98)
In social research, it is accepted that the interaction of experimenter and "experimental animal" will always affect the outcome, especially when the animal is human. In physics, there should be no such problem, barring a blatant attempt at hoaxing or \Jfraud\j. There is no such thing as "mind over matter".\p
Except, that is, in \1quantum physics\c, where the influence of the observer has always been considered likely to change the outcome of an experiment. It is probably this conclusion, more than any other, which stops quantum physics from being taught in any serious way in schools. It is just too hard to comprehend, and sounds too much like pseudo-science, so teachers tend to duck out on it.\p
At the end of February, researchers at the Weizmann Institute of Science reported in \INature\i that they had conducted a highly controlled experiment demonstrating how a beam of electrons is affected by the act of being observed. According to their report, the greater the amount of watching, the greater the influence. \p
According to quantum mechanics, particles such as electrons can also behave like waves, but for electrons, this is only true over small, submicron distances-less than one thousandth of a millimeter. Typical of this behavior is the way an \Jelectron\j can pass through several gaps in a barrier at the same time, and meet up again at the other side of the barrier in a process called interference.\p
Interference can only happen if nobody is watching. As soon as somebody begins to watch the process, interference has to stop, since no \Jelectron\j, having been seen to pass through one gap, can also pass through a second gap. So the very act of observation forces the electrons to "behave" like particles, rather than like waves.\p
The Weizmann Institute researchers built a tiny device measuring less than one \Jmicron\j in size, containing a barrier with two openings. Then they sent a current of electrons towards the barrier, where a tiny but sophisticated electronic detector could spot passing electrons. The sensitivity of the detector can be adjusted either by changing its electrical \Jconductivity\j, or the strength of the current passing through it.\p
The detector had no effect other than to detect (or "observe") the electrons. But even so, the presence of the detector near one of the openings caused changes in the interference pattern of the \Jelectron\j waves passing through the openings of the barrier. More importantly, as the detector was made more sensitive - that is, as there was more "observation" - the interference grew weaker. And when the detector was made less sensitive, the interference increased.\p
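The relationship can be captured in a toy calculation (an illustration of the principle, not the Weizmann group's own model). The assumption here is that fringe visibility falls off exponentially with the detector's coupling strength: the stronger the coupling, the more "which path" information leaks out, and the fainter the fringes:\p

import math

# Toy two-path interference: visibility V = exp(-g) for detector coupling g.
def fringe_intensity(phase, coupling):
    visibility = math.exp(-coupling)  # 1 = nobody watching, 0 = full watching
    return 1.0 + visibility * math.cos(phase)

for g in (0.0, 0.5, 2.0, 10.0):
    peak = fringe_intensity(0.0, g)
    trough = fringe_intensity(math.pi, g)
    contrast = (peak - trough) / (peak + trough)
    print(f"detector coupling {g}: fringe contrast {contrast:.3f}")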
This might seem like a particularly obscure, confusing, terrifying piece of physics: in fact, the effect may well be of use in providing secure information transfer. If information is so encoded that it needs the interference of multiple \Jelectron\j paths to decipher it, the presence of any eavesdropper-an "observer"-would reduce the interference, revealing the presence of the unwanted listener.\p
There is just one problem. With an inanimate "observer" any notion of "mind over matter" has to be dismissed.\p
\BKey names:\b Mordehai Heiblum, Eyal Buks, Ralph Schuster, Diana Mahalu, Vladimir Umansky and the Condensed Matter Physics Department. Useful Web address: \Bhttp://www.weizmann.ac.il\b\p
#
"Scientific literacy",469,0,0,0
(Feb '98)
The definition of "\Jliteracy\j" is fairly simple: somebody who can read and write independently. Everybody can agree on that much, but go any further and the arguments start. Does "literate" mean "able to write good clear prose," "able to read difficult text," "able to understand difficult text," or something else?\p
That is nothing to the arguments that rage around "\1scientific \Jliteracy\j\c." Does it mean "able to perform as a scientist," "knowing a lot about science," "able to analyze scientific statements," "able to think in a scientific way," or something else again? Perhaps it is the ability to think about quantum physics without getting the shakes. Perhaps it means the ability to see through the attempts and blandishments of charlatans and pseudo-science confidence tricksters.\p
A group of \JArizona\j State University undergraduates reported to AAAS on their views of scientific \Jliteracy\j during February. They are, said the students, uniquely qualified to add insight to the debate because they are scientists at the very beginning of their careers. The only undergraduates presenting at AAAS, the students delivered a paper called "Advocating Scientific Change: A History of the Future."\p
In their paper, they argued that "science is a vital endeavor of which 'all Americans' must be aware. Scientific \Jliteracy\j then entails that humans not just apply science but that they interact with and access science in a whole new way." \p
They say that though the issue of scientific \Jliteracy\j is filled with confusion and consternation in Washington, the (US) National Academy of Sciences, the AAAS and the (US) National Science Foundation have all developed independent positions that are likely to be critical in defining new initiatives for science education. All agree that scientific \Jliteracy\j is (1) a measurable educational goal that is (2) publicly important, (3) necessary for all Americans (rather than a select few), (4) of real value in everyday life, and (5) tied "inextricably" to social issues. So everybody agrees that the topic is important, but what is scientific \Jliteracy\j?\p
That remains the unresolved question. Is it knowledge, they ask, of science facts or should it be instead critical thinking skills, a "habit of mind"? If it is another name for critical thinking, how do we measure teaching effectiveness? Is true scientific \Jliteracy\j for all Americans possible? How will we afford it? Is the end goal to allow people to understand key science issues, to appreciate science, or to be technologically capable? Is it to make the public into better thinkers? Or is the goal a combination of all of these things? \p
Their main conclusion is that there is not enough scientific \Jliteracy\j around, but they also note that there is no sense in taking an American perspective: scientific \Jliteracy\j is a world problem, and the different definitions and arguments offered for it can have widely differing consequences.\p
While accepting that there will be different consequences, your reporter suggests that it probably matters little if one interest group concentrates on some smaller subset of the range of skills that go to make up scientific \Jliteracy\j, just so long as they do not lose sight of the other aspects. A coherent consensus, say the student authors, is highly desirable. This is undoubtedly true, but perhaps some coherent action would be more useful.\p
\p
#
"Needle-free immunization",470,0,0,0
(Feb '98)
Needles are a problem when it comes to \Jvaccination\j, especially in developing countries where the cost of fresh needles adds to the cost of \Jimmunization\j, and even more so with children, who object to "jabs" (as indeed do some adults). A curious solution to the problem has been developed: applying vaccine to the unbroken skin. But how do you make the skin absorb the vaccine?\p
The answer appears to lie in applying bacterial \Jcholera\j toxin, often used in oral and nasal vaccines to enhance the immune response. Research shows, in the careful words of the researchers, that vaccine components such as \Jdiphtheria\j and tetanus toxoids have no effect when applied alone to the skin of mice, but evoke an immune response in the presence of \Jcholera\j toxin.\p
Translated, this means that \Jcholera\j vaccine, mixed with another vaccine, seems to carry at least some of the vaccine through a mouse's skin. It will be some time, though, before trials are carried out on human beings.\p
#
"Where did you go for your holidays?",471,0,0,0
(Feb '98)
In some parts of the world, a "healthy" tan is no longer desirable, but there are still those who believe that the \1melanins\c of a tanned skin indicate health and wealth, since only expensive holidays in the \Jsun\j can give a good tan. Now that we know the problems which can arise from exposure to the \Jsun\j, people are a little less enthusiastic about a "healthy" tan.\p
The enthusiasm for a tanned skin probably dates to the days when white people in northerly latitudes could suffer from \Jrickets\j if their diet was low in vitamin D. Now we get all the vitamin D we need from our diets, so there is no longer any excuse for seeking a tan, except the faddish wants of the uninformed. Unfortunately, their wants may lead to ill health which costs us all, either directly in taxes to support the dying, or in a variety of indirect ways.\p
Luckily, a solution is close: "liquid melanin". This tan in a bottle is the work of John M. Pawelek, \JPh\j.D., a Yale University \Jdermatology\j researcher and cancer biologist who studies \1melanoma\c. Pawelek calls his product Melasyn, signaling that it is a synthetic form of melanin. Melanin itself is no use: it is insoluble and difficult to work with, making it impractical for inclusion in creams and lotions, but Pawelek's melanin substitutes dissolve readily in \Jwater\j and, incorporated into cosmetic creams, can be spread evenly on the skin to produce a tan instantly.\p
Health faddists will no doubt appreciate the fact that the synthetic melanin is derived from the aloe vera plant, so popular in "health" products. Pawelek believes that the substance may help to protect the skin against cancer-causing ultraviolet light, especially if it is added to a sunscreen, since the very people who refuse a sunscreen, wanting a "healthy tan", can now have their cake and eat it too.\p
#
"Is your sunscreen safe?",472,0,0,0
(Feb '98)
Sunscreens and \Jsun\j blocks are now commonly used in the developed world to protect pale skins from sunburn. One of the main reasons for the acceptance of sunscreen has been a growing public awareness of the thinning \Jozone layer\j, and a greater awareness of melanomas and skin cancers.\p
A skin cancer is usually a small and rather harmless cancer, which can be "burnt off", usually with liquid \Jnitrogen\j, but melanomas are much more dangerous, as they can metastasize, shedding cancerous cells which travel through the body, establishing secondary cancers which end up killing the victim. Melanomas can be removed safely, if they are detected before \1metastasis\c sets in.\p
Dr. Marianne Berwick, an epidemiologist at Memorial Sloan-Kettering Cancer Center, has now warned that sunscreens may offer no protection against this greater threat. Addressing the AAAS, she stated something which has been known in the world's \Jmelanoma\j capital, \JAustralia\j, for some time: that sunburn does not, by itself, cause melanomas. (\JAustralia\j has large numbers of fair or red-haired people in tropical areas, many of them with numerous moles on their bodies-something like a third of the population having Irish ancestry.)\p
While sunscreens may prevent sunburn, those Americans most at risk from \Jmelanoma\j tend to avoid exposure to the \Jsun\j as much as others, and people's reports on their sunburn histories were unreliable. "When asked the same question at different times, people often gave inconsistent answers about their sunburn history," said Dr. Berwick. \p
On the other hand, those people who are normally \Jsun\j sensitive may use sunscreens to stay out in the \Jsun\j longer-thereby eliminating sunburn, which would have otherwise signaled them to get out of the \Jsun\j. As a result, these people expose themselves to more \Jsun\j than they should. Sunburn may not be a cause of \Jmelanoma\j, but may be an indicator of exposure that causes the cancer instead.\p
Australian researchers are pursuing the idea that sunscreens block the "wrong" forms of ultraviolet radiation - the ones which do not cause melanomas - while allowing others to pass through. Berwick offers a fascinating hypothesis which may well stand validly alongside the Australian stance: that people who know themselves to be at risk may be protecting themselves from the long exposures to the \Jsun\j which would partially "toughen" the skin.\p
Instead, they allow themselves shorter, regular exposures to sunlight, gathering the exposure which will trigger a \Jmelanoma\j, without acquiring the toughening. The issue is a complex one, and there are probably other factors involved as well, but Berwick's epidemiological contribution seems too important to ignore.\p
#
"Using viper venom to stop cancer",473,0,0,0
(Feb '98)
The venom of a viper snake may kill you, but it is even better at stopping metastasis, the technique used by aggressive tumors to spread themselves through the body. To find out why, Mary Ann McLane, an assistant professor in University of \JDelaware\j's Department of Medical Technology has been trying to identify the structure of the venom molecules. While her work is rather specialized, it makes an elegant example of the ways in which genetic \Jengineering\j is used in research.\p
The venom from Macmahon's Viper \IEristocophis macmahoni\i, found in \JAfghanistan\j and \JPakistan\j, contains a protein, eristostatin, which blocks metastasis. Eristostatin is one of many viper-venom "disintegrins," proteins which interact with a family of cellular \Jreceptors\j called integrins. The disintegrins prevent platelet aggregation (an early step in blood clotting) and cell adhesion, because they stop the sticky protein fibrinogen from binding with \Jplatelets\j. \p
In one study, \Jmelanoma\j cells were injected into cancer-susceptible mice, some of which also received eristostatin. Eleven days later, eristostatin had clearly reduced the average number of liver tumors, from 14.4 among unprotected mice to 0.6 within the treated population, according to a research paper published in 1995.\p
McLane is concentrating on the RGD loop, a group of amino acids which stick out from the molecule. Composed of arginine, \Jglycine\j and aspartic acid, the RGD loop is known to play a key role in binding with integrins. To learn more about the RGD loop, McLane compares eristostatin with echistatin, a disintegrin from the venom of \IEchis carinatus\i, another viper-type snake from the Middle East. The amino-acid sequences of the two proteins are 68 percent identical, but echistatin exhibits markedly different binding behaviors. Compared with eristostatin, for instance, echistatin is far less effective at preventing fibrinogen from interacting with integrins, and cannot prevent \Jmelanoma\j cell metastasis. Echistatin also interacts with \Jreceptors\j on blood vessel walls, a trick that eristostatin has not mastered. \p
The present and future work involves manipulating the genes involved, inserting them into \Jbacteria\j, and then having the \Jbacteria\j produce the altered snake venoms. Ideally, McLane will be able to give echistatin the binding region of eristostatin, one amino acid at a time. Finally, she will test each mutant's ability to interact with \Jreceptors\j on \Jplatelets\j to identify the exact sequences most critical for binding. In the meantime, she is also assessing the usefulness of disintegrins in treating \Jthrombosis\j, or blood clots, especially those which happen in the arteries.\p
Useful Web site for more detail: \Bhttp://www.udel.edu/PR/NewsReleases/Viper/viper.html\b\p
#
"Tough out lactose intolerance!",474,0,0,0
(Feb '98)
According to the dean of Purdue University's School of Consumer and Family Sciences, Dennis Savaiano, many people who claim to be \1lactose\c intolerant simply are not anything of the sort. Their "condition" may be no more than their systems being unused to digesting the dairy foods which contain lactose.\p
The condition known as \1lactose intolerance\c causes gas, bloating, or nausea, none of which is directly life-threatening, but the condition may have a long-term effect on victims, who may choose to avoid dairy foods, and so get too little \Jcalcium\j in their diets. In the even longer term, this could lead to \Josteoporosis\j.\p
Lactose is a form of sugar, or \Jcarbohydrate\j, found in milk and dairy products, a \1disaccharide\c made of two simple sugar molecules. This sugar is too large to be absorbed by the \Jintestine\j, and is broken down by an \Jenzyme\j, lactase, produced by the body. Most adults do not produce enough lactase to completely break down the lactose. In fact, up to three-quarters of the world's population produces too little lactase.\p
According to Savaiano, if you only consume dairy products once in a while, you are more likely to have symptoms from them, and if you consume them by themselves, as opposed to as part of a meal, they tend to be transported through the \Jintestine\j more rapidly and are more likely to cause symptoms.\p
So dairy foods, taken regularly but in moderation as part of a meal, are the way to go. Or choose to eat \1yogurt\c, he says, since this contains a lactase which helps digest lactose in the \Jintestine\j. Savaiano adds that the human digestive system is remarkably adaptable. The large intestines contain \Jbacteria\j which help digest lactose, so by altering your diet over time, you can encourage \Jbacteria\j which digest lactose more effectively, making milk better tolerated by your system. \p
#
"Germs cause high blood pressure",475,0,0,0
(Feb '98)
\I\1Chlamydia\c pneumoniae\i is a bacterium, usually responsible for \Jpneumonia\j, bronchitis, and sinus infections, which must now also be linked with severe high blood pressure (a reading of 160/90 or greater), according to a study in this month's \IHypertension\i, the journal of the American Heart Association.\p
A report in July 1997 (see Updates, July, 1997, \B\IBut will people be more sensible about \Jantibiotics\j?\i\b) revealed that those heart attack survivors who had the most \IChlamydia pneumoniae\i \Jantibodies\j had a four-times-higher risk for suffering another heart attack or needing treatment to restore blood flow to the heart than did survivors with no detectable \Jantibodies\j in their blood.\p
The new work is based on a sample of patients with high blood pressure in the inner city of Birmingham, described in the report as a multiracial city. A study of 123 patients and 123 controls revealed that 35% of individuals with high blood pressure had \IChlamydia\i infection, compared with 18% of matched healthy individuals.\p
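A gap of 35% against 18% is large for samples of this size. As a rough illustration of the arithmetic (ours, not the journal's own analysis), a standard two-proportion z-test in Python puts the chance of such a split arising randomly at well under 1%:\p

from math import erf, sqrt

# Could 35% infection among 123 patients versus 18% among 123 controls
# be a chance fluctuation? Pooled two-proportion z-test.
n1 = n2 = 123
p1, p2 = 0.35, 0.18
pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
z = (p1 - p2) / sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
p_two_sided = 1 - erf(z / sqrt(2))  # equals 2 * (1 - Phi(z))
print(f"z = {z:.2f}, two-sided p = {p_two_sided:.4f}")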
#
"Oral bacteria go for the heart as well",476,0,0,0
(Feb '98)
Meanwhile, in the USA, periodontal \Jbacteria\j have been shown to be related to \Jheart disease\j. The link has been suspected for a number of years, but it was only at the end of February that the link was finally confirmed, when dental plaque \Jbacteria\j were shown to cause clumping in human blood \Jplatelets\j.\p
Two related reports appeared over a short period of time, the first being advance notice by dental researchers of a paper to be delivered in early March. Dr. Eugene J. Whitaker and Dr. Thomas E. Rams of Temple University report that the only bacterium to do this was \IPorphyromonas gingivalis\i which is the most important bacterial cause of destructive gum diseases in adults. (Note that a number of the Web reports wrongly name the bacterium as \IPorphyromanas gingivalis\i, so your Web searches will need to use that name as well, or a wild card). In other words, the organism which causes plaque on teeth is also responsible for the plaque that builds up in our arteries!\p
These research findings further support and expand a possible link between periodontal disease and development of atherosclerotic \Jheart disease\j, a condition resulting from plaque build-up and constriction of coronary heart arteries, and strokes affecting the brain. \p
The earlier report, given to the AAAS in mid-February, came from University of \JMinnesota\j researchers, Dr. Mark Herzberg and Dr. Maurice Meyer, who worked on rabbits, where the Temple researchers had used human blood.\p
According to Herzberg and Meyer, certain strains of the \Jbacteria\j \IStreptococcus sanguis\i, the most numerous organism found in dental plaque, and \IPorphyromonas gingivalis\i, both caused \Jrabbit\j blood to clot. After \IS. sanguis\i was infused into rabbits, blood \Jplatelets\j clumped together, the effect being found in about 60% of the strains of \IS. sanguis\i. (Curiously, their publicity on the Web gives no further indication of the activity of \IP. gingivalis.\i)\p
So whether you are a \Jrabbit\j or a human, it looks as though oral hygiene just became that little bit more important than before.\p
#
"Ebola unraveled",477,0,0,0
(Feb '98)
Ebola virus causes a rapidly progressing, often fatal, infection that can lead to vomiting, diarrhoea, rash, high fever, hemorrhaging and shock. Often, liver and kidney functions are impaired. Given certain funeral rituals, Ebola can spread from a dead victim to most of the mourners, and also to carers who tend the sick and dying.\p
Now its methods are being worked out, and a report in \IScience\i during February reveals that Ebola works on two fronts, using two forms of the same glycoprotein to fight off the body's immune response. One form disables the cell's inflammatory process, where \Jinflammation\j is used to fight off attacks, and the second form is used to attach to the endothelial cells which line blood vessels, probably causing the hemorrhaging which is characteristic of Ebola, helping the virus in some way to infect and damage those cells. \p
The glycoprotein has been known for almost twenty years, but its role has only now been determined. Originally, researchers thought that the secreted form of glycoprotein (sGP) might act as a decoy. Its role, they suspected, was to steer inflammatory cells away from the transmembrane form of glycoprotein (GP). It now appears that the sGP actually binds to neutrophils and hinders an early inflammatory response which might stop the virus from replicating. \p
#
"I knew that . . .",478,0,0,0
(Feb '98)
Scientists have a legitimate role in testing and, where necessary, confirming "accepted views." A recent study, reported in the February 1998 issue of the \IJournal of Clinical Microbiology\i reveals that most cases of the common \1cold\c are caused by viruses, but only half are a result of infection with the rhinovirus, the virus most often implicated in colds. These findings support the recommendations that \Jantibiotics\j, which do not work on viral infections, not be used to treat cold symptoms, as only seven of 200 patients studied had bacterial infections, and six of those had viral infections as well. That recommendation alone must make the report worthwhile!\p
Despite many studies showing that \Jantibiotics\j are of little use in treating the common cold, it is estimated that up to 60% of patients with common colds receive some type of antibiotic. This results in an estimated cost of $37.5 million per year in the United States for unnecessary prescriptions on top of the risk of developing antibiotic resistance.\p
\p
#
"Antibiotic stupidity abounds",479,0,0,0
(Feb '98)
The misuse of \Jantibiotics\j (see reports in April, July, August and September, 1997) continues around the world, and the increase in the number of strains of resistant \Jbacteria\j continues. During February, it was reported that methicillin-resistant \IStaphylococcus aureus\i (MRSA), a predominantly \Jhospital\j-acquired infection, has been identified in children outside of the \Jhospital\j setting with no identified risk factors.\p
A new Website was revealed at the AAAS conference in February, a site devoted to tracking trends in resistance. Called The Resistance Web\U(TM)\u, the new website is located at \Bhttp://resistanceweb.mfhs.edu\b\p
The site offers researchers immediate access to the results of 10 years of drug resistance tracking and all of the associated drug utilization data and demographic information (details about how and on whom the drugs have been used). The site will help to alert medical professionals to drug resistance patterns, the dangers of over-prescribing \Jantibiotics\j, and to encourage increased focused surveillance activities. The site houses a powerful \Jdatabase\j enabling the user to construct customized queries on susceptibility patterns of different \Jbacteria\j to specific \Jantibiotics\j, regionally and nationally across the United States. \p
#
"Medicine or witchcraft?",480,0,0,0
(Feb '98)
Treatments such as eye of newt and tongue of frog are less common now in the standard medical kit than they once were, and medicine is probably better off for that, overall. These days, exotic substances are not commonly tested on humans unless there is some good theoretical reason to do so.\p
At first glance, this common principle would make it amazing that anybody should even think of using a pheromone, a trail-marking chemical, laid down by an ant called \IAphaenogaster rudis\i, to treat patients with Alzheimer's disease. The ant, common in the north-eastern United States, secretes anabaseine (3,4,5,6-tetrahydro-2,3'-bipyridine) as one of four substances in its trail marking solution.\p
A chemical analog of this substance, called GTS-21, stimulates the nicotine receptor sites in the brains of Alzheimer's patients and helps reduce their memory loss. A research report in February indicates that Alzheimer's patients may be diagnosed as much as two years before the condition becomes serious, so fastidious pre-Alzheimer readers may need to be reassured that the treatment uses no ants: synthetic anabaseine can be made far more easily than the real stuff can be extracted from ants.\p
The same alkaloid is also found in marine worms and \Jtobacco\j leaves, but the researchers are silent on the thought processes which led to testing this substance for its effectiveness in treating Alzheimer's disease. \p
#
"One way to determine sex?",481,0,0,0
(Feb '98)
Most metazoans, animals which are made of groups of cells, have two sexes. All the way from tiny worms to elephants, the same pattern holds. In the past, scientists believed that different branches of the animal kingdom had different ways of genetically controlling sex. In \INature\i during February, a report was published about the male sexual regulatory gene mab-3 from the \Jnematode\j \ICaenorhabditis elegans\i, which the researchers found to be related to the \IDrosophila melanogaster\i sexual regulatory gene doublesex (dsx). \p
In other words, the same gene is found in two very different branches of the "tree." Both genes control "sex-specific neuroblast differentiation and yolk protein gene transcription," while dsx controls other sexually dimorphic features (that is, features which differ between males and females) in the fruit fly as well. The form of dsx that is found in male flies can direct male-specific neuroblast differentiation in \IC. elegans,\i indicating that it is effectively identical. This structural and functional similarity between phyla suggests a common evolutionary origin of at least some aspects of sexual regulation. \p
Even more interestingly, they have identified a human gene, DMT1, which is active only in the testis of human males, and which produces a similar protein.\p
#
"A very thin wire . . .",482,0,0,0
(Feb '98)
Israeli researchers at the Technion-Israel Institute of Technology have used strands of DNA to assemble tiny particles of silver into a conductive wire a thousand times thinner than a human hair. Reported in \INature\i in mid-February, this may well be one of the key steps in working our way to the serious miniaturization of nanoelectronics.\p
The nanocircuits of the future will use wires, transistors, and other components with dimensions measured in nanometers, or billionths of a meter. By packing many more components closer together, scientists could produce computer chips that are much faster than today's, and far more sophisticated.\p
DNA has already been used to assemble minute nanoparticles of semiconductors and other electronic material into crystal-like lattices and other orderly structures, but this is the first time that a true component of a nanocircuit has been made in this way. The Technion team constructed their nanowire between two gold electrodes separated by a narrow gap of 12 micrometers (also called microns), about one-tenth the width of a human hair. \p
They synthesized strands of DNA that linked themselves together to form a kind of construction scaffolding between the electrodes. Since DNA by itself does not pass current, they finished the wire off by attaching grains of silver along the scaffold by a process similar to photographic developing, forming a silver wire which connected the electrodes. \p
\p
\BTechnical details\b\p
The first step was to deposit two gold electrodes on a small glass plate and coat each with one of two types of 12-base oligonucleotides, short sequences of DNA. Each oligonucleotide type has a different chemical code, a unique chemical identity. The problem is that DNA by itself won't bind to gold, so the scientists used a chemical subunit called a disulfide group to "glue" the oligonucleotides to the gold electrodes. They then added long DNA ladders to create a self-assembling scaffold to guide the construction of the silver nanowire. \p
Lengths of double-stranded DNA are then added in a \Jwater\j drop, each DNA double strand having a single-stranded segment of 12 bases, designed to match the oligonucleotides and bond to them. This causes the DNA to span the gap, with the two "tails" bonding to the two oligonucleotides.\p
When the glass plate is washed with a solution of silver ions, the ions line themselves up along the DNA strand, ready to be treated by a chemical reducing agent, which converts the ions into neutral atoms. These first grains of silver are then used to catalyze the growth of metallic silver to form a conducting pathway for electricity-a wire.\p
One curious discovery: the wire "remembers" which way you ran current through it the first time and becomes a diode, a one-way street for future electrical currents. The polarity of a diode, the direction in which it passes a current, represents a piece of stored information. The team members speculate that there may be the opportunity here to create a tiny storage module.\p
#
"Senate ban plan fails",483,0,0,0
(Feb '98)
The US Senate rejected a bill to ban human \Jcloning\j during February. Reacting to the announced plans of Richard Seed to start human \Jcloning\j, the knee-jerk bill was considered far too drastic, and even President Clinton, who favors a ban, considered that this bill went too far.\p
The bill was defeated after opponents of the bill, including major scientific societies, industry organizations, patient advocacy groups, and 27 Nobel laureates argued that the proposed bill would block basic biomedical research as well as human \Jcloning\j. Their arguments had sufficient force to win over enough senators to put off a vote, at least for the moment.\p
#
"Overfishing still goes on",484,0,0,0
(Feb '98)
In recent years, many fisheries have been placed under drastic restrictions, aimed at conserving the surviving stocks. A new analysis, published in \IScience\i in early February, suggests that there is still not enough being done. The investigators looked at fisheries catch data collected between 1950 and 1994 by the UN Food and Agriculture Organisation (FAO). They found that there has been a gradual depletion of long-lived, high-trophic-level fish, such as cod and haddock, and a corresponding rise of low-trophic-level invertebrates and plankton-feeding fish, such as anchovy.\p
In other words, fish further up the food chain are slowly being eliminated, and their prey are now being eaten by us. On a simple level, this might seem fine, but the implication is that entire ecosystems are being disrupted by overfishing.\p
Even if this were not a problem, there are effects on humans as well, as the quality of the fish catch declines. In the Black Sea, for instance, there has been a huge increase in the numbers of \Jjellyfish\j as their economically valuable competitors have been removed. In the longer term we might end up with oceans no better than a marine junkyard dominated by plankton.\p
Proposed solutions relate mainly to the creation of more zones which are off-limits to fishing boats, but such areas are tempting to the operators of today's "fishing factories", ships which can go to sea for months, and store huge harvests until they return to their homes, half-way around the world. During February, two such pirate ships were caught in Australian territorial waters off a distant island territory, close to \JAntarctica\j, and many others are reported to have escaped.\p
And in another sign of the times, the recent count of redds, the bowl-shaped gravel nests in which Atlantic salmon lay eggs, is providing grave new evidence of another dismal breeding season for wild salmon, which once thrived half a million strong in rivers north of the Hudson. \p
#
"Are frogs dying of ultraviolet overload?",485,0,0,0
(Feb '98)
Frogs are dying out, all over the world. Simple explanations like \Jpollution\j do no good at all, as some of the lost species have died out in pristine wilderness areas. Now a simple but sensible theory has emerged, presented first to the AAAS during February.\p
Joseph Kiesecker and Andrew Blaustein, who have been carrying out field experiments in the Oregon Cascade Mountains, have confirmed what many scientists had suspected-ambient levels of ultraviolet-B (UVB) radiation from the \Jsun\j can cause high rates of mortality and deformity in some species of frogs and other amphibians.\p
This makes sense of some recently released Australian data, where many of the apparent causes of death could well be triggered by increased UVB levels breaking down amphibians' defences to diseases and disorders.\p
The researchers compared the progress of the embryos of long-toed salamanders \1(\Jsalamander\j, newt\c and \1axolotl\c are all names for lizard-shaped amphibians with similar breeding habits to the frogs) shielded from UVB radiation by Mylar filters with the progress of unshielded embryos. They found that 95% of the shielded embryos hatched, compared with only 14.5% of the unshielded embryos. Even more striking, only 0.5% of the surviving shielded salamanders had deformities while 91.9% of the unshielded salamanders had deformities. Malformed tails, blisters and oedema were the most frequent deformities. \p
The Australian results are borne out by Kiesecker's report that he had found increased mortality associated with a pathogenic \Jfungus\j \I(Saprolegnia ferax)\i infecting some embryos exposed to UV-B, while embryos under Mylar filters were not infected.\p
Most interestingly, Kiesecker was able to relate mortality levels in different species to the amounts of an \Jenzyme\j, photolyase, found in the different species. Photolyase is used to repair UV damage to DNA, attacking a major UV photoproduct in DNA, cyclobutane pyrimidine dimers, which can cause mutations and cell death if left unchecked. The work was also reported in the December issue of the \IProceedings of the National Academy of Sciences.\i\p
#
"Albatrosses on the Net, not netted",486,0,0,0
(Feb '98)
All around the world, the \1albatross\c species have a problem with fishing boats, many dying each year when they try to take bait from long-line fishing boats which have not managed to get their hooks below the \Jwater\j surface soon enough. Now albatrosses are being studied, and the results are available on the \JInternet\j.\p
Wake Forest University biologist David Anderson usually studies seabirds in the wild without much company, but thousands of schoolkids are tagging along this time via a Web site and e-mail. Anderson's \JAlbatross\j Project is tracking Hawaiian albatrosses by \Jsatellite\j to find ways to reverse losses to longline fishing and answer evolutionary questions raised by their flights. \p
Anyone can participate in the study by typing "subscribe \Jalbatross\j" in the body of an e-mail message to \Blistserv@wfu.edu\b. Or people can click on "Join the Project" at The \JAlbatross\j Project's Web site, at \Bhttp://www.wfu.edu/\Jalbatross\j\b\p
The site has recently been upgraded, with maps showing some of the monster flights undertaken by some of the birds in the study.\p
#
"Some like it hot",487,0,0,0
(Feb '98)
The world's hottest worms - it may not be the greatest record in the world to hold, but down in a mess of hydrothermal vents and geysers off the coast of Costa Rica, there are worms which thrive at 80°C (176°F). A report in \INature\i in early February described these worms, known as \JPompeii\j worms \I(Alvinella pompejana),\i and the bacterial hitch-hikers which ride on their backs.\p
These \Jbacteria\j produce enzymes that may hold the key to new protein-based catalysts for making drugs, paper, food and a host of other goods, according to the report. These enzymes can withstand the high temperatures that their makers live in, can apparently be stored for long periods of time, and work in organic solvents, all of which makes them extremely interesting.\p
Before the worms, the "hottest" animal on \Jearth\j was the Sahara Desert ant, Cataglyphis, which is capable of foraging for brief periods under a blazing Saharan midday \Jsun\j, when temperatures soar to 131°F or 55°C. A French report of \JPompeii\j worms in 221°F/105°C may well have been correct, but it was probably only a temporary exposure. So for the moment, the record stands conservatively at 80°C.\p
So will there be something left to survive global warming? Possibly not: while the "tail" end of the worm may be at 80°C (176°F), its "gills" end is most commonly in \Jwater\j at 22°C (72°F), so perhaps the worm needs to keep one end cool. One thing is certain: the discovery has sparked a rush, not only to learn more about these strange creatures, but also to study other "extremophiles", animals living under extreme circumstances, around the world.\p
To see the worms, go to \Bhttp://www.udel.edu/PR/NewsReleases/Worms/worms.html\b\p
\p
#
"Tin worms",488,0,0,0
(Feb '98)
The comfortable rules of ordinary physics break down when you get to a small enough scale, and as we move to smaller and smaller circuits, bizarre things must be expected to happen. Mohan Krishnamurthy and his fellow researchers at \JMichigan\j Tech University were uncertain what would happen when they carefully applied a film of a tin-\Jgermanium\j alloy several atomic layers thick to a two-inch disc of \Jgermanium\j.\p
\1Tin\c and \1germanium\c don't get along very well, metallurgically speaking, so the experiment's prospects were unusually hard to predict, but even so, the results were surprising. They found that little tin "worms" dug neat little ditches through the alloy down to the level of the pure \Jgermanium\j. Like a metallic earthworm, each glob of tin eats up the alloy, spits out the \Jgermanium\j, and keeps the tin.\p
Even more surprisingly, there was no random tangle of trenches. Instead, the worms had dug out a series of wobbly \Jstraight line\js finished off with right-angle turns. And, when they finally halted their excavations, the worms might not have created the world's smallest circuit board, exactly. But at least it looked roughly like an artist's fantasy of how such a circuit board might look. \p
The trenches were 8 nanometers deep, each flanked by tiny mounds of \Jgermanium\j only 4 nanometers high. More importantly, they were amazingly long by nano-standards, up to about 10 micrometers (that's 10 000 nanometers!) in length. This perhaps explains why the researchers' work appeared in \IScience\i during February, under the title "Quantum Etch-A-Sketch." Krishnamurthy has now reported the result, and speculated about a number of ways in which the effect might be put to use, but for now, it remains one of those interesting but unexplored alleys of science.\p
#
"Tougher than diamonds",489,0,0,0
(Feb '98)
In the study of the \1hardness of minerals,\c the \1diamond\c always comes first. Now a group of Korean and US theoretical physicists have come up with an idea for making diamond tougher than diamond. Their solution: take a diamond, and give it a coating. While there are a number of substances which are theoretically tougher than diamond, the problem has been making enough of the materials to test them. In \IPhysical Review Letters,\i they suggest that adding a thin layer of \Jboron\j atoms will create a surface some 18% tougher than diamond alone, as measured by its bulk modulus. If their idea works, it may lead to ways of making protective coatings for everything from machine tools to computer disk drives.\p
The secret lies in understanding what the surface of a diamond is like, at the scale of the single atom. Diamond is crystalline carbon, where each carbon atom is bonded to four others in a sturdy tetrahedron, with a single atom in the centre and others surrounding it. A diamond's surface looks like a bunch of pointed triangles sitting on a tabletop, with each point resting on a base of three of its atoms. Directly above the fourth atom is a final surface atom which is usually a \Jhydrogen\j atom.\p
A \Jboron\j atom will normally bond to just three other atoms. So if the \Jhydrogen\j atom and the carbon it is attached to are both removed and replaced with a \Jboron\j atom, the surface will be sealed off, producing a material with a much greater \1bulk modulus.\c Now all they need to do is find a way of bringing it off.\p
#
"Palaeontology news",490,0,0,0
(Feb '98)
The carbon-14 clock has now been calibrated back to 45 thousand years, thanks to cores taken from the bed of Lake Suigetsu, near \JKyoto\j in \JJapan\j. This has added a huge 35 thousand years to the range of ages in which we can truly rely on \1carbon dating.\c\p
While we can make certain assumptions about past levels of carbon-14 in the \Jearth\j's \Jatmosphere\j, these are not always as safe as we assume, because the \Jearth\j's magnetic field and the cosmic ray background may both have varied in the past, causing variations in the amount of carbon-14 being produced. Other variations can be caused by ocean ventilation effects, since the world's oceans are a major carbon reservoir, storing and releasing carbon back into the main biological and biochemical systems.\p
Since we assume at the same time that freshly formed plant material will always have the same levels as the surrounding \Jatmosphere\j, any variations in the atmospheric levels of radiocarbon will also be seen in the plant material created at that time. In the case of animals, the picture can be even further confused by the animals eating food derived from old wood, or drinking \Jwater\j containing ancient \Jcarbonates\j dissolved in it, but for now, let us stay with the plant material.\p
Scientists have been able to collect old samples of ancient trees, such as bristlecone pine, and by careful tree-ring matching, or \1dendrochronology,\c they have been able to accurately date specimens of wood, which can then be tested, providing a \Jcalibration\j check in the form of a \Jcalibration\j curve, a slight but important distortion of the theoretical curve we would get if carbon-14 was always produced at exactly the same rate.\p
Even if there are tree samples older than 10 thousand years, there are no "matches" that will allow us to accurately link them into the tree-ring sequences that we have available to us, so the \Jcalibration\j curve could only reach back 10 thousand years or so. But that is no longer a problem, not since Hiroyuki Kitagawa revealed the banding patterns of seasonal deposits to be found in the lake's bed. The varved deposits of the lake allow careful counting, all the way back to 45 thousand years ago.\p
The results that Kitagawa and van der Plicht report are consistent with some earlier attempts at calibrating the record in less detail. Their results indicate several episodes of ocean ventilation and two large spikes, perhaps indicating a high flux of cosmic rays, a nearby \Jsupernova\j, or a collapse of the geomagnetic field. \p
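The raw arithmetic that all this calibrating corrects is simple exponential decay. By convention, an uncalibrated "radiocarbon age" is computed from the Libby mean life of 8,033 years; records like the Suigetsu varves then turn that raw age into calendar years. A sketch in Python, with an illustrative (invented) sample:\p

import math

LIBBY_MEAN_LIFE = 8033.0  # years, from the conventional 5,568-year half-life

def radiocarbon_age(fraction_c14_remaining):
    # Exponential decay: fraction = exp(-age / mean_life)
    return -LIBBY_MEAN_LIFE * math.log(fraction_c14_remaining)

# A sample retaining 0.4% of its original carbon-14 has a raw age of
# about 44,000 radiocarbon years, close to the new 45,000-year limit:
print(f"{radiocarbon_age(0.004):,.0f} radiocarbon years")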
#
"The walk of the pterosaur",491,0,0,0
(Feb '98)
There are two main ways of finding out how a \1fossil\c walked. If you can find footprints, and then be lucky enough to link the prints to a particular species, you are well on the way, but that sort of find and link is rare. So most palaeontologists rely on reconstructions and modeling.\p
The problem is that most animals curl up, or are scattered, in death, and when they are fossilised, they are flattened and pressed out of shape. Most \1pterosaur\c skeletons have been flattened in thin-bedded rocks, obscuring three-dimensional anatomy.\p
Scientists disagree about the extent to which pterosaurs moved on the ground. According to some, they were sprawling, quadrupedal walkers, while other palaeontologists want to regard them as erect, bird-like bipedal cursors.\p
We can deduce the hip movements of these animals from fossils of related groups, but the lack of any accurate three-dimensional, articulated \Jpterosaur\j feet has prevented examination of all of the movements that are possible within the foot, and that limits our understanding of how the pterosaurs walked.\p
A "Letter to \INature"\i in February reports finding a large, uncrushed, partial skeleton of a new species of the basal \Jpterosaur\j \IDimorphodon\i in thick-bedded deposits of Tamaulipas, Mexico; this material includes such a three-dimensional foot, and it contradicts one basic assumption that has been made previously, that the animals ran with only the tips of their toes touching the ground.\p
The flattened metatarsal-phalangeal joint at the base of the first four toes of this specimen would not allow such a posture without separating most of the joints, suggesting that the specimen was flat-footed, a view that is consistent with presumed footprints of pterosaurs which show impressions of the entire sole of the foot. \p
Note: the term "Letter to \INature"\i is a 19th-century term for a scientific communication, and not to be confused with a "letter to the editor" in other journals.\p
#
"No sex, please, we're Northerners",492,0,0,0
(Feb '98)
Most plants and animals reproduce sexually, and only a few use asexual reproduction. Comparisons of closely related sexual and asexual species in the northern hemisphere show that the asexual species is the more northerly in 76% of cases, based on a study of 43 cases covering ten genera of plants.\p
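A quick back-of-the-envelope check - ours, not the study's - shows why such a split is worth taking seriously. If the asexual form were equally likely to lie north or south of its sexual relative, a 76% split across 43 cases would almost never arise by chance:\p
from math import comb

n = 43               # pairs of related sexual and asexual species
k = round(0.76 * n)  # about 33 pairs with the asexual form further north

# Chance of at least k "asexual is northern" outcomes out of n
# if north and south were equally likely (a fair-coin null).
p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(f"k = {k}, one-sided p = {p:.5f}")  # far below 0.05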
A mathematical study reported in \INature\i this month indicates that it is possible for this variation to be accounted for by a single underlying process, even allowing for the fact that asexual species are generally found at higher altitudes and in marginal, resource-poor environments, where conditions are generally worse.\p
It will be interesting to see what happens when this sort of study is carried out in the southern hemisphere.\p
#
"Sex at the right time",493,0,0,0
(Feb '98)
Getting the timing right is important for all living things. Susan Brawley, a University of Maine marine biologist, leads a research team which has overturned a widely-held principle of reproduction in aquatic organisms, using \1algae\c as their research material. \p
Organisms which reproduce in the \Jwater\j by external fertilisation - some species of fish, corals, and plants such as seaweeds - have always seemed to have poor timing, with less than a 1% success rate in fertilisation, according to experiments and modeling studies. The only problem: these studies all assumed that the \Jwater\j in question was turbulent, carrying many of the gametes away. In spite of that, the organisms seemed to manage to survive quite well.\p
Brawley looked at a \Jgenus\j of brown \Jalgae\j, the seaweeds known as \IFucus,\i the only \Jseaweed\j known to thrive equally well in fully saline waters and in low \Jsalinity\j environments such as the Baltic Sea. Her main interest was the way \IFucus\i avoided polyspermy, a lethal condition in which an egg is fertilised by more than one sperm. Seaweeds normally depend on high concentrations of sodium to prevent polyspermy, but the Baltic Sea populations don't have much sodium to work with, since the Baltic is often more like a brackish lake than an oceanic sea. She noticed that \IFucus vesiculosus\i did not release its gametes until slack high tide, and suspected the plants were responding to \Jwater\j motion and, perhaps, to accompanying changes in \Jsalinity\j.\p
Studying seaweeds from Maine and \JCalifornia\j as well, the group found that high \Jsalinity\j is one of the cues for the release of gametes, and that release takes place only under calm conditions. The group now believes that carbon dioxide levels are also involved.\p
While the \Jsun\j is shining, plants take up carbon as they carry out \Jphotosynthesis\j. Brawley and her team reasoned that when \Jwater\j is being churned by waves or tidal currents, the plants are constantly receiving new supplies of carbon dioxide. Gametes are not released under such circumstances. Without turbulence, the carbon dioxide supply begins to run out. The experiments have shown that the carbon deficit is the chemical signal for plants to release their gametes. It is the green light which tells the plants that the \Jwater\j is calm and the time is right for reproduction. \p
No wonder the \Jalgae\j are managing to survive!\p
#
"Recovering from your next mass extinction",494,0,0,0
(Feb '98)
Mass \Jextinction\j events usually involve wiping out whole groups of plants and animals. A whole class, order or \Jphylum\j may disappear at such a time, and even in the groups that remain, as many as 90% of all species are wiped out, either from the imbalances caused to whole ecosystems, or from the lack of food, or because of large numbers of hungry predators left without their preferred food.\p
For those species which survive the mass \Jextinction\j event and the shakedown period which follows, there are suddenly many opportunities for movement into newly vacated niches. In theory, recoveries could either be much the same wherever they occur, because life forms seem to be remarkably good at springing back, or they might be hugely different, because of the random nature of the shake-ups involved.\p
The only real test is to examine the \Jfossil\j records left behind after a number of \Jextinction\j events, and that is exactly what David Jablonski did, reporting his results in \IScience\i at the end of February. His main finding: the recovery patterns differ widely from one geographical region to another, even though the \Jextinction\j intensities and patterns are more or less the same everywhere.\p
Focusing on the mass \Jextinction\j event that killed the dinosaurs, at the Cretaceous/Tertiary (K/T) boundary 65 million years ago, the University of \JChicago\j palaeontologist found that recovery rates in North America and Europe were very different even though they are at roughly the same latitude. This was the event, first proposed by \1Luis W Alvarez,\c which seems to have been caused by a smallish asteroid striking somewhere close to Chicxulub on Mexico's \JYucatan\j Peninsula.\p
In North America, the result was a pulse of "bloom taxa", a great diversification of a few groups, rather like the explosion of algal blooms or weeds after an ecological disturbance. In Europe, on the other hand, this pattern is entirely missing. Jablonski then burrowed into museum collections of molluscan fossils from northern \JAfrica\j and India. Among other things, he found that the Gulf Coast of North America differed from all three other regions, not only in respect of bloom taxa but in other aspects as well.\p
In any mass \Jextinction\j, the plants and animals that appear afterwards comprise three major groups: 1) newly evolved species, 2) local survivors from the \Jextinction\j region, and 3) invading survivors which were living elsewhere before the \Jextinction\j. Previous studies have found that the number of invaders coming into a region is usually correlated with the intensity of the \Jextinction\j, but Jablonski's results don't show that pattern: in North America, there was a significantly higher proportion of invaders than in the other regions he studied.\p
Perhaps this is because the Gulf Coast site was much closer to the Chicxulub impact site, where a massive \Jmeteorite\j slammed into \JEarth\j 65 million years ago - and yet the \Jextinction\j levels were much the same at all four sites. Further studies, looking at equally close South American sites, may answer some of these questions.\p
In any case, if the on-again-off-again deadly asteroid (see \B"Death of a Death Threat",\b March 1998) does turn out to be a real problem, it looks as though it will be hard to predict a really good place to shelter, and it may be even harder to predict what will survive.\p
#
"The dawn of the animals",495,0,0,0
(Feb '98)
The Doushantuo geological formation of south-central China is the site of a mine, from which phosphate-rich rock is taken to be used as fertiliser. Now the mine has been recognised as the site of some amazing treasures: the world's oldest \Jfossil\j plants and animals, dated to around 570 million years ago. So exciting was this news that reports appeared in both \IScience\i and \INature\i at the start of the month, outlining the finds.\p
The \1Cambrian\c explosion at about 550 million years ago, which gave us the huge diversity of life seen in the \1Burgess shale,\c could only happen when there was already a large range of animal and plant forms available. But what that earliest life looked like, nobody knew - until now. These discoveries of exquisitely preserved fossils of \Jalgae\j and animal embryos, in rocks dating from about 570 to 580 million years ago, will open a new era in the study of early animal \Jevolution\j.\p
The finds include sponges, some of them less than a millimeter across, which are now rated as the first multicellular animals to exist on \JEarth\j. These sponges, originally thought to be \Jalgae\j, were identified by a group led by cell biologist Chia-Wei Li of the National Tsing Hua University in \JTaiwan\j, and reporting in \IScience.\i\p
The second group, led by palaeontologist Shuhai Xiao of Harvard University, reported in \INature\i that some of the other fossils in the mine are actually early-stage embryos of more complex animals that have bilateral symmetry, the next evolutionary step beyond sponges, which have no symmetry at all.\p
Xiao's group describe the finds as coming from phosphorites "of the late Neoproterozoic (570 ± 20 Myr BP) Doushantuo Formation", identifying these as dating from just before the \1Ediacaran\c radiation. They report "abundant thalli with cellular structures preserved in three-dimensional detail show that latest-Proterozoic \Jalgae\j already possessed many of the anatomical and reproductive features seen in the modern marine flora," and state that embryos "preserved in early cleavage stages indicate that the divergence of lineages leading to bilaterians may have occurred well before their macroscopic traces or body fossils appear in the geological record."\p
The embryos are typically described as "squishy little \Jlarva\j-like things" with no chance of becoming fossils: now we see that phosphatisation can indeed produce fossils of soft-bodied animals, prompting palaeontologist Simon Conway Morris, one of the discoverers of the Burgess shale fauna, to comment on the Web: "The big question is how much farther back this [record] will go."\p
Originally, metazoan (multi-celled) animals were thought to date from the Vendian, about 565 million years ago, immediately before the Cambrian explosion, although the relation of these early forms to later Cambrian fossils and extant fauna has not been clear. These fossils must extend the origin of early metazoans to at least about 40 to 50 million years before the Cambrian explosion.\p
#
"Hot spots in the news",496,0,0,0
(Feb '98)
Hot spots are points on the \Jearth\j's surface where mantle plumes, narrow conduits of hot rock that well up from close to the core-mantle boundary, come closest to the surface. Volcanoes which are not located on a boundary between two tectonic plates are usually attributed to a hot spot. Volcanic island groups like Hawaii are put down to the crust moving slowly over the top of a stationary hot spot. Like a flame playing on the underside of a sheet of paper, the hot spot sometimes bursts through, erupting at the surface.\p
New models, reported during February in \INature,\i tracing events of the past 68 million years, show how the underlying plumes of hot spots have not stayed in the same place but have swayed, drifted and twisted in response to large-scale mantle flow.\p
Plumes can also fuel the activity at a plate boundary, as happens in \JIceland\j, but there has been some doubt about the depths from which the plumes originate. Do they rise from the bottom of \JEarth\j's lower mantle, 2900 kilometers down, or are they rooted only a few hundred kilometers down in the upper mantle? The answer now seems to be available, based on studies of \Jearthquake\j waves that probed the mantle deep below \JIceland\j.\p
According to a report in \IScience\i during February, seismologists have detected signs of a narrow, hot plume at the traditional boundary between upper and lower mantle, about 660 kilometers down. \p
#
"California's next volcano",497,0,0,0
(Feb '98)
Some volcanoes come from shallow depths, even though they are fed by hot spots that go further down. A report in the February issue of \IGeology\i titled, "Fluids in the lower crust following Mendocino triple junction migration: Active basaltic intrusion?" describes a \Jvolcano\j which is due to erupt soon in \JCalifornia\j. "Soon", that is, in geological terms, within about the next four hundred thousand years. And just to reassure you, it will take at least a thousand years for it to happen.\p
It seems that \Jmagma\j from a chamber 20 km (12 miles) below the surface could rise to the surface in the Lake Pillsbury area, north of San Francisco. The location is related to an earlier active site at Clear Lake, some 50 miles (80 km) south of the present hot spot, and still earlier extinct sites stretch away to the south.\p
The entire area appears to be underlain with a large \Jmagma\j chamber, according to data gathered with the same seismic techniques that oil companies use to locate oil fields, although operating at much deeper levels.\p
Using 150 portable seismographs, the researchers created a seismic array at 400 recording sites. With the seismographs placed about 200 feet (60 meters) apart over 18 miles (29 km) in an east-west direction, they detonated a thousand pounds of chemical explosives at the bottom of a 100-foot (30 meter) well. They repeated the experiment on a northwest-southeast line, parallel to the geologic structures.\p
Whether the \Jvolcano\j erupts or not will depend on the plate movements in the area. If the plates move in one direction, they could shut the system off. If they go another way, they could accelerate it, and if they continue on their current course, they might produce a Clear Lake-type volcanic field. Then again, the \Jvolcano\j may hold off, and erupt a further 31 miles (50 km) to the north.\p
#
"Cold War end makes US Navy come in from the cold",498,0,0,0
(Feb '98)
At a ceremony in \JChristchurch\j, New Zealand on February 20, 1998, the US Navy marked a significant milestone in its withdrawal from the US Antarctic Program (USAP) after 42 years. The withdrawal was planned in 1993, when the Navy developed new global priorities related to the ending of the Cold War.\p
This marks the end of the Naval Antarctic Support Unit, the Navy unit stationed in New Zealand, although the US Navy will continue to provide limited flight support to the USAP through the end of the next austral research season (1998-99). Support for the US Antarctic Program will now be provided by the 109th Air Wing of the New York Air National Guard.\p
Flying ski-equipped Hercules LC-130 \Jaircraft\j, able to shuttle to and from the South Pole, the Air National Guard will continue to support a program which is being handed over, more and more, to civilians, bringing to a close a long period of military research around the South Pole.\p
#
"Why is Antarctica so cold?",499,0,0,0
(Feb '98)
In mid-February, 26 scientists representing 10 countries sailed on the ocean drilling ship \IJOIDES Resolution\i to collect core samples from the continental rise and shelf of the Antarctic Peninsula. The international Ocean Drilling Program is conducting a two-month expedition near the edge of the Antarctic continent, the first of a series to probe the historical development of the Antarctic ice sheet and its consequences for \Jearth\j's climate. \p
The Antarctic ice sheet is the world's largest, but scientists have many questions about how it grew, when, and why. The researchers hope this expedition will help to answer those questions, when they bring up samples of the ocean bottom. In some places around \JAntarctica\j, it may have taken 1000 years to deposit 10 centimeters of sediment. \p
So drilling one kilometer of those sediments would take researchers back 10 million years. A teaspoonful of mud one centimeter thick would go back 100 years. The southern ice sheets are probably at least 35 million years old, while northern sheets like the \JGreenland\j ice sheet are probably only 7 million years old.\p
The expedition will conclude April 11 in Cape Town, South \JAfrica\j. We will keep you posted . . .\p
#
"An even nastier greenhouse gas",500,0,0,0
(Feb '98)
A recent report in \IGeophysical Research Letters\i reveals that fluoroform (HFC-23) is a \1\Jgreenhouse effect\j\c gas with ten thousand times the warming power of carbon dioxide. Fluoroform is a waste by-product from the manufacture of the refrigerant, HCFC-22, also a greenhouse gas, but not in the same league.\p
Estimates of fluoroform in the \Jatmosphere\j set it at about 135,000 tonnes, increasing by 5% each year. This gives the same \1global warming\c potential as 1.6 billion tonnes of carbon dioxide, which is more than the total output of the worst CO\D2\d emitter, the USA, for one year. Fluoroform is an extremely long-lasting molecule, estimated to survive for 260 years in the \Jatmosphere\j.\p
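The arithmetic is easy to check with the rounded figures quoted here; the small shortfall against the 1.6 billion tonne figure simply suggests that the warming factor used in the report was a little above the rounded "ten thousand times":\p
# Back-of-the-envelope CO2-equivalent check, using only the
# rounded figures quoted in this report.
stock_tonnes = 135_000   # fluoroform already in the atmosphere
warming_factor = 10_000  # "ten thousand times" carbon dioxide (rounded)

co2_equivalent = stock_tonnes * warming_factor
print(f"{co2_equivalent / 1e9:.2f} billion tonnes CO2-equivalent")
# prints 1.35 - in the same ballpark as the 1.6 billion tonnes quoted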
While the \JKyoto\j climate conference set controls on the emissions of hydrofluorocarbons (HFCs), fluoroform was not taken into account when limits were set. With manufacturers of refrigerants switching from the ozone-damaging CFCs to the "safer" HCFC-22, it looks as though we may be out of the frying-pan, and snugly into the fire.\p
#
"Back to the moon again",501,0,0,0
(Feb '98)
In early March, NASA's Lunar Prospector provided clear evidence of polar ice on the moon, first detected by the Clementine \Jspacecraft\j in 1994. This find will be detailed more fully next month. (See earlier reports in December 1996, June 1997 and January 1998)\p
#
"Terrorists from space?",502,0,0,0
(Feb '98)
An explosion in Northern Ireland last December woke the people of Belleek at about 5 in the morning. Investigation revealed a crater, about four feet (1.2 meters) across, containing the remains of an \Jaluminium\j \Jwater\j trough and a milk churn.\p
Given local expectations about the sources of explosions, it is hardly surprising that the people of Belleek assumed the explosion had been caused by a bomb planted by the IRA. Later investigation uncovered a glassy rocky fragment in the churn, revealing that the explosion was caused by a small \Jmeteorite\j, perhaps part of the \Jcomet\j Phaeton. Have the IRA formed an alliance with the Little Green Men?\p
#
"Seriously hot water",503,0,0,0
(Feb '98)
Very hot \Jwater\j has been detected recently inside sunspots. Scientists achieved this by looking at the infrared spectra of the sunspots, and now a study of absorption features shows that two rather cooler stars, \1Betelgeuse\c and \1Antares,\c also contain \Jwater\j.\p
The \Jsun\j is so hot that the presence of \Jwater\j molecules came as a surprise, while the other two, cooler stars would have been thought more likely candidates to contain \Jwater\j. But what will they find when they look at the stars in Aquarius? It's enough to make your eyes \Jwater\j . . .\p
#
"Lights, window, action!",504,0,0,0
(Feb '98)
Liquid crystals work because they can be aligned, side by side, and made to influence polarised light traveling through the material. In the past, liquid crystals have been used in ink and paint technologies, and in flat-panel displays and thermal imaging. Now there is a new way of using them, in a substance called a cholesteric glass, which is able to produce a rewritable, full-color method of recording an image. Announced in \INature\i during February, the cholesteric glass looks set to have a big future in information display and storage. But will we be able to use them to improve on the view out of our windows?\p
#
"Identifying x-ray sources",505,0,0,0
(Feb '98)
When the x-ray part of the \1cosmic background radiation\c was first discovered in the 1960s, scientists were unable to work out where the x-rays came from, and they have continued to wonder, ever since. The problem comes from the poor-resolution images that could be obtained from the small, faint sources in the hard x-ray band (above 2 keV) that dominates the background.\p
Unless an instrument can narrow down the source to a very small part of the sky, astronomers are left to guess which of dozens of objects in that patch of sky is producing the x-rays, and until recently, no more than 3% of the flux in the 2 - 10 keV band could be attributed to individual sources. Now a new survey, a hundred times more sensitive than previous studies in this x-ray band, has lifted that figure to around 30% of the flux tied to identified sources.\p
#
"Now the wine is bad for you",506,0,0,0
(Feb '98)
A University of \JCalifornia\j, San Francisco epidemiologist, using sampling methods which are said to be suspect, now claims that there is no such thing as a little alcohol being good for you (as opposed to the reports quoted in \B". . . for thy heart's sake",\b November 1997 and \B"Red wine heart factor pinned down",\b December 1997). Her data were drawn from a number of other studies, but overall they made up a biased population with far too many participants in the under-25 range, and we only note her work on account of her apt name, which is Kaye Fillmore.\p
#
"Delivering anti-cancer drugs no pigment of the imagination",507,0,0,0
(Feb '98)
Porphyrins, the ring-shaped \Jpigments\j at the heart of \1chlorophyll\c and \1hemoglobin\c (haemoglobin), are able to hold and carry metal ions like \Jmagnesium\j and iron, but now a newly created, larger version of porphyrin is showing promise as a means of ferrying cancer drugs into tumor cells. In pilot trials, the new molecule, called texaphyrin, has shown promise in enhancing the effects of radiation on inoperable brain tumors, and the (US) National Cancer Institute has now added it to its list of the most promising therapies, according to a February report in \IScience.\i\p
#
"Reginald Victor Jones 1911-1997, physicist, inventor and hoaxer",508,0,0,0
(Feb '98)
R. V. Jones died late in 1997, but news of his death has only now spread across the \JInternet\j. Jones is perhaps best remembered for his 1978 book (also made into a TV series) called "Most Secret War", in which he describes the science behind many of the "secret weapons" of World War II. He was well placed to do this, having been head of the British Air Ministry's Scientific Intelligence section. After the war, he returned to civilian life as professor of natural philosophy (an old name for physics) at the University of \JAberdeen\j.\p
#
"March, 1998 Science Review",509,0,0,0
\JDo viruses cause heart disease?\j
\JProbing teeth with a microscope meant for jet engines\j
\JWhat goes on inside the cell?\j
\JCan Lyme Disease be biologically controlled?\j
\JShould measles be eradicated?\j
\JInfants flying high\j
\JCan we use unethically gained information?\j
\JMuscle from bone\j
\JBone in muscle\j
\JRitalin re-evaluated\j
\JFat couch potatoes\j
\JA new memory system?\j
\JCost of 'going on the Net' falls\j
\JMirror, mirror, on the wall, who is the smoothest...\j
\JMusical robot\j
\JMaking a superacid\j
\JPotato blight returns\j
\JRapid data access urged\j
\JDutch to prevent cow cloning\j
\JLawsuit targets Yellowstone bioprospectors\j
\JThe wolves of Isle Royale strike hard times\j
\JWas Malthus just a little bit early?\j
\JDinosaurs in the news\j
\JHow the eucaryotic organisms arose\j
\JLizard evolution\j
\JThe first seafarers?\j
\JSinking the idea of forests as carbon sinks\j
\JDengue fever on the move?\j
\JA wet Big Apple?\j
\JNew web source of weather images\j
\JSeeing inside the earth\j
\JStudying earthquakes by satellite\j
\JUndoing the cryptographers' work\j
\JDeath of a death threat\j
\JWater on the moon - once again!\j
\JAstronomical record set - from earth\j
\JStrange pulsar found\j
\JNational Science Board awards\j
\JThe Top 10 scientific papers for 1997\j
\JBRCA1 screening not a general need\j
\JAnd a control gene for BRCA1\j
\JHuman genome\j
\JHow lithium works?\j
\JCalm clams?\j
\JIs alcohol good for you?\j
\JSeeing the shores\j
\JWalking, climbing wheelchair\j
\JSeeing a new energy source on the shores\j
\JMetallic glass\j
\JWriters watch out!\j
\JWhen there is no 'r' in the month\j
\JR Leonis, the incredible shrinking swelling star\j
\JTwo new tests for Carpal Tunnel Syndrome\j
\JAlzheimer's disease can be diagnosed in the very early stages\j
\JThe secret of the Stradivarius - it's all in the chemistry\j
#
"Do viruses cause heart disease?",510,0,0,0
(Mar '98)
More and more "environmental" diseases seem to be associated with living things. The latest in the list: \Jatherosclerosis\j, or hardening of the arteries. Cigarettes and high fat food have long been blamed for the condition, but now a group of researchers is arguing that the condition starts with an attack by a virus which prepares the way for the damage that follows when plaque builds up on the surface of blood vessels. So some people may be able to indulge in smoking and high-fat diets, and never have to pay the price.\p
The idea is not new, but there has never been any direct evidence that viruses can cause this sort of damage - not until now, that is, according to a research group led by Herbert W. Virgin, an assistant professor of \Jpathology\j, molecular microbiology and medicine at Washington University's School of Medicine in St. Louis. In a recent report in \INature Medicine\i, they describe a virus related to the Epstein-Barr virus, which can cause mononucleosis, and to a relative thought to cause Kaposi's sarcoma. This new virus, found only in mice, can injure a mouse's arteries.\p
The mice do not develop full-blown \Jatherosclerosis\j, but they have damage which resembles the early stages of the disease. The damage was also similar to a group of human vascular diseases - Takayasu's arteritis, temporal arteritis and Kawasaki's disease - whose origins have always been a mystery.\p
Other recent studies have linked bacterial infections to \Jheart disease\j (see \BOral \Jbacteria\j go for the heart as well\b and \BGerms cause high blood pressure\b, February 1998), so if assorted pathogens do in fact set off these heart conditions, the future may see people who are at risk getting \Jvaccination\j against \Jatherosclerosis\j. The real problem with this work is the delay between the time when any virus causes the damage, and the time when the \Jheart disease\j shows up, years later, after the person has been exposed to perhaps hundreds of viruses, any of which might be the culprit. So it seems likely that such people would be wise to avoid the high-risk lifestyles, just in case some other unknown germ turns out to be able to perform the same trick.\p
#
"Probing teeth with a microscope meant for jet engines",511,0,0,0
(Mar '98)
The x-ray tomographic \Jmicroscope\j (XTM) was originally invented to analyse ceramic components used in jet engines. Now it is being used by dental researchers to observe structures in the dentine (also called dentin) as small as two micrometres, about the size of a human cell.\p
The idea is to study the structure and properties of dentine in order to find methods and materials that will create a tighter, more permanent bond between the tooth and the plastic-based fillings now used to repair most cavities.\p
Metal alloys make longer-lasting fillings, but the newer plastic-based and ceramic materials are better able to match the colour of real teeth. About 97% of the surface of the tooth is mineral, and this surface enamel is easy to bond to, but cavities eat into the dentin underneath, so filling materials have to be able to bond with this as well.\p
Dentine contains a lot of moisture and organic material, so that it is only about 50% mineral, making bonding more difficult. The new XTM reveals finer detail of the join, and so gives researchers a better chance of improving their methods.\p
Dentists usually treat a cavity with acid, which removes the minerals, leaving a framework mostly made of collagen. If a plastic-based liquid is added to the cavity, it can get into this framework before it hardens, improving the grip, so the XTM is being used to reveal the effectiveness of different acid treatments.\p
#
"What goes on inside the cell?",512,0,0,0
(Mar '98)
Just 150 years ago, the interior of the cell was a complete mystery, mainly because the contents of the cell are transparent to light, so there was nothing to see under the \Jmicroscope\j. Then came a major breakthrough as scientists discovered that organic dyes could be used to stain parts of a cell, making them show up in thin sections under the \Jmicroscope\j.\p
Now a similar breakthrough may be on the way, in the form of a sensor so small that it can detect what is going on inside a single cell. The development was announced during a \Jspectroscopy\j conference in New Orleans during March. At the conference, Raoul Kopelman suggested that one use for the sensor might be in assessing people's exposure to biological or chemical weapons.\p
The levels of certain ions in the cell indicate its general health, and previously, this has been measured by injecting dyes into the cell, or by poking the cells with electrodes. A molecule called an ionophore reacts with positively charged ions of \Jmagnesium\j, \Jcalcium\j and \Jpotassium\j to activate a fluorescent dye. The ionophore and dye mix is packaged in small polymer particles to make "PEBBLEs" (Probes Encapsulated By BioListic Embedding), which can be injected into cells such as mouse eggs or brain cells. Then all the researchers have to do is measure the brightness of the glow to measure the concentrations of the target ions.\p
Each PEBBLE is about one millionth of the size of the whole cell, and they are injected with a "gene gun", normally used to "shoot" DNA into a cell with a burst of \Jhelium\j. Best of all, because the sensor can be located in the cell with a \Jmicroscope\j, its position can be linked up to the information revealed by the glow of the dye.\p
#
"Can Lyme Disease be biologically controlled?",513,0,0,0
(Mar '98)
Lyme disease and Human Granulocytic Ehrlichiosis (HGE) are both spread in North America by the tick \IIxodes scapularis\i. Now it appears possible that these diseases may be brought under control by attacking the tick with a common \Jfungus\j, \IMetarhizium anisopliae\i. Lyme disease can result in chronic \Jarthritis\j or permanent neurological damage if untreated, while HGE is a recently described tick-borne disease which, in rare cases, can turn fatal.\p
Lyme disease is caused by a spirochaete, \IBorrelia burgdorferi\i, which gets into the ticks when they feed on the blood of infected white-footed mice. There is no vaccine against this bacterium, as yet. The cause of HGE has not been identified since the disease was first described in 1994, but it can be treated easily with \Jantibiotics\j, indicating that it is bacterial. With Lyme disease also found in other parts of the world, and with the ticks now invading New York city parks, an effective answer is needed, and soon.\p
Dr. Rosalind Lowen, Honorary Research Associate at the New York Botanical Garden, is investigating the potential use of the \Jfungus\j, strains of which are already used in a number of countries. (As an example, in the United States, the strain ESF1 is used commercially against cockroaches.)\p
Chemical repellents can be used against ticks, but these are less than fully effective. Using natural predators (including fungi) is much safer for everybody - except perhaps the targeted pest.\p
The \Jfungus\j invades the tick's body, multiplies and kills the host through a combination of tissue destruction and fungal toxins, and then emerges from the body to release spores into the air to infect more ticks. Because this particular \Jfungus\j is unable to grow at or above 95F (35C), it offers no threat to warm-blooded animals, but any strain planned for release will still need careful checking for safety.\p
For more background information, see \1http://www.nybg.org/bsci/tick.html\c\p
#
"Should measles be eradicated?",514,0,0,0
(Mar '98)
Before \Jmeasles\j vaccine was first used, some 5.7 million people died each year from this disease. By 1995, the death rate was just an eighth of what it had been. A discussion paper in the British Medical Journal during March asked whether we should have an all-out campaign to wipe out the \Jmeasles\j virus. The authors' conclusion was that there could be problems in achieving effective coverage in poor countries, and that there are some safety issues. It would be possible, if these questions were answered, to wipe out the disease, somewhere between 2005 and 2010.\p
#
"Infants flying high",515,0,0,0
(Mar '98)
According to another report in the \IBritish Medical Journal\i during March, jetsetting lifestyles may be harmful - at least to the very young. When 34 infants were exposed to air containing only 15% oxygen, well below the normal 21%, four of the youngsters showed unpredictably severe falls in the oxygen saturation of their blood, a condition known as hypoxaemia.\p
The significance of this finding is uncertain: it may explain the huge drops in oxygen saturation which sometimes tie in with respiratory diseases, or it may explain effects such as "cot death", also known as SIDS, or Sudden Infant Death Syndrome.\p
It might also sound a note of warning about flying in pressurised planes, although an editorial in the same issue of the \IBMJ\i noted that some 750 000 infants have flown safely in \Jaircraft\j in the past ten years, so that if a child is healthy there is no epidemiological evidence to indicate that it is unsafe for it to fly during its first year.\p
#
"Can we use unethically gained information?",516,0,0,0
(Mar '98)
In a guest editorial in the March issue of \IMolecular \JPsychiatry\j\i, Dr. Miron Baron discussed the case of the psychiatric research funded by Adolf Hitler's Nazi party in the 1930s and 1940s. This research had just one aim: to provide a genetic basis for mental illness, leading to the German Law for the Prevention of Genetically Diseased Offspring, which led to sterilisation or \Jcastration\j of the mentally ill. This led, by a sort of natural progression, to the T-4 "\Jeuthanasia\j" program to kill the mentally ill, and in the end, to the \JHolocaust\j.\p
These research activities are among the unavoidable "ancestors" of modern research in psychiatric \Jgenetics\j: what use should we make of such data today? One viewpoint has it that the data should be expunged, deleted from the scientific record, and never used again. Another view is that the knowledge gained can be used, so long as the scientist referring to it puts the data into a clear context. Others argue that the data, no matter how obtained, ought to stand.\p
The main issue lies with the work of German psychiatrist and geneticist, Ernst Rüdin, whose data, suggests Baron, may very well have been obtained under extremely dubious conditions. In all, some 400 000 people were sterilised, and around 100 000 were murdered for psychiatric reasons under programs inspired by Rüdin's belief that \Jschizophrenia\j was a result of genetic impairment.\p
There is never an easy line to draw. Baron points out that even as people draw back from plans to clone humans, others are planning to clone monkeys for AIDS research, a practice that might make it easier later to clone humans.\p
Although Baron does not raise this particular issue, there is one special case, where German researchers exposed \JHolocaust\j captives to \Jwater\j close to freezing, to find out how long humans could survive under such conditions. These mortality results have since been used to benefit other humans: is it better that the victims should have died in vain, or that their deaths be used to save other human lives? As with so many ethical questions, there is no easy answer.\p
Seeking the easy answer, suggests Baron, may be undesirable. If the moral laxness of the past is treated meekly, he warns, the future of the field of modern psychiatric \Jgenetics\j may be in peril.\p
#
"Muscle from bone",517,0,0,0
(Mar '98)
A somewhat distant new hope for sufferers from \B\1muscular dystrophy\b\c was reported in \IScience\i during March. The report describes how bone marrow cells in mice can travel through the blood and change into new muscle fibres in damaged muscle. Before this, it was believed that only nearby cells can repair muscle tissue.\p
Stromal cells, the support cells of the marrow, are known to convert into muscle cells in culture, but there was no evidence before this work that the change could happen in living animals. The evidence that it is indeed possible comes from a strain of transgenic mice carrying a marker gene that causes cell nuclei to turn blue when it is activated, but only in muscle cells. The bone marrow of test animals was first destroyed by radiation, and then transgenic bone marrow was transplanted into the mice.\p
After several weeks, the researchers injected a toxin into the front legs of the mice to damage the muscle there. Two weeks later, the mice not only showed signs of recovery, but there were numerous blue nuclei visible. One interesting speculation has already come out of this: if researchers can use standard genetic \Jengineering\j methods to fit bone marrow cells with a "good copy" of the muscular dystrophy gene, this could provide the decaying muscles with new and normal muscle cells. Even if this can be developed, the treatment is still some time away.\p
#
"Bone in muscle",518,0,0,0
(Mar '98)
Late in March came a report that bone, just like that found in the human skeleton, can be found in quite a few diseased heart valves. Aside from triggering a rash of medical jokes about being hard-hearted, this discovery may lead to better ways of treating heart-valve disease.\p
Valve calcification is the leading reason for heart-valve-replacement surgery. To put the matter in context, more than 71 000 Americans required the life-saving procedure in 1995. The problem of valve calcification has been known for over 100 years, but this study was the first to look at a large series of diseased heart valves and find bone.\p
Researchers at the University of \JPennsylvania\j Medical Center, led by Emile Mohler III, studied 228 valves removed from patients who underwent valve-replacement surgery from 1994 to 1997 at the \JHospital\j of the University of \JPennsylvania\j. Organised hard bone tissue was found in thirty of the valves.\p
Mohler and his colleagues had previously found osteopontin, a protein that makes up the molecular scaffolding to which \Jcalcium\j sticks in the formation of bone, in calcified valves, and taking this together with the new findings of bone, Mohler commented on the \JInternet\j that "Finding this protein and actual bone is evidence that valve calcification is an active process of laying down organised bone tissue, not a passive one, as was once thought."\p
But how did the bone cells get to the heart? Did valve cells, or inflammatory cells at the area of heart damage, undergo a genetic change under the right conditions and start making bone-cell proteins? And if that is the case, what is the trigger for the reaction?\p
#
"Ritalin re-evaluated",519,0,0,0
(Mar '98)
Ritalin is a drug commonly used to treat children with the symptoms of attention deficit \Jhyperactivity\j disorder, with sales of more than $350 million each year. Now chemists at Brookhaven National Laboratory have asked whether it is being administered in its most effective form.\p
Methylphenidate (marketed as Ritalin) is a drug which has \B\1chirality\b\c, meaning that it exists in two mirror-image forms, and is distributed in what chemists call the "racemic" form, a mixture of the two chiral molecules, called d-threo and l-threo. The two forms, enantiomers to the chemists, may not be equally effective, according to Yu-Shin Ding, who suggested that the d-threo enantiomer is about 10 times more potent than its chiral counterpart.\p
She argued that this could mean that half the administered drug has no therapeutic effects. Ding and two colleagues used tagged Ritalin carrying a short-lived radioactive isotope, carbon-11, with a half-life of 20 minutes, to study the effects of the drug on human and \Jbaboon\j brains. Radio tracers such as carbon-11 can be detected by \Jpositron\j emission \Jtomography\j or PET scans to develop images, and in this case, the images revealed that the d-threo enantiomer bound precisely to the \Jdopamine\j targets in the brain, while the binding of l-threo was mostly non-specific.\p
Ding also asks whether the l-threo enantiomer may have some unwanted influence on the active enantiomer or may contribute unwanted side-effects, noting with some caution that long-term human studies would be needed to confirm this.\p
Interestingly, clinical studies of other drugs, such as the heroin substitute methadone, have shown that single enantiomer forms work best. And for good measure, Ding and her colleagues suggest that PET scanning can be used to study aging. Because of its high-specificity binding of \Jdopamine\j \Jreceptors\j, d-threo tagged with carbon-11 might prove useful as a PET radio tracer to probe the neuronal loss that occurs in normal aging and in neurodegenerative conditions such as Parkinson's disease.\p
#
"Fat couch potatoes",520,0,0,0
(Mar '98)
Even the bleeding obvious needs to be tested in science. A study in the \IJournal of the American Medical Association\i (\IJAMA\i) in March reveals that as the hours of \Jtelevision\j watched by American children increase, so does their weight. The research, carried out at the Johns Hopkins Bayview Medical Center, suggests that even though they live in a society that is increasingly weight- and appearance-conscious, many American children may be headed toward sedentary, overweight adulthood.\p
A quarter of all US children watch four or more hours of \Jtelevision\j a \Jday\j, and in the survey these children weighed more and were fatter than children who watched fewer than two hours per \Jday\j. TV-watching was also greater among African-American (black) children and Mexican-American children, perhaps because of parental concerns about safety on the streets. The survey, based on more than 4000 children, involved an in-home interview and a detailed clinical examination.\p
#
"A new memory system?",521,0,0,0
(Mar '98)
How would you feel about a memory device that consisted of single oxygen molecules, able to be rotated to one of three orientations?\p
Graduate research assistants Barry C. Stipe and Mohammad Rezaei and physics Professor Wilson Ho reported the experiment at the annual meeting of the American Physical Society in Los Angeles in March, and they published a paper, "Inducing and Viewing the Rotational Motion of a Single Molecule", in a March 1998 issue of \IScience\i, describing their work in using brief voltage pulses applied to the molecule to cause it to rotate between three orientations spaced 120 degrees apart, something like a radio knob that clicks into one of three stops.\p
If the voltage pulse is not stopped, the molecule continues to rotate between the three orientations, turning like a tiny motor. The oxygen molecules, adsorbed onto a \Jplatinum\j surface, reveal interesting information about the nature of chemical bonds and how electrons can cause the motion of the molecule, but the practical prospects appear to have the group most excited.\p
Aside from the memory storage idea, the experiment could lead to interesting developments in the area of tiny motors, of the sort needed for nanotechnology, though the work required a "homemade" scanning tunnelling \Jmicroscope\j (STM) of exceptional precision, and it had to be carried out at a \Jtemperature\j of 8 degrees above \Jabsolute zero\j to prevent random molecular motion.\p
The STM involves a sharp, needle-like tip suspended less than a billionth of a meter over a surface. When a voltage is applied, a very small electric current flows between the needle and the surface. As the needle is moved to scan a surface, its height is adjusted in such a way that the current flow remains constant. A computer can use the ups and downs to construct an image of the surface with such detail that individual atoms and molecules appear as bumps or depressions. For this experiment the researchers also used the tip to apply brief voltage pulses to rotate single molecules.\p
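That constant-current trick is simple enough to sketch in a few lines of Python. All the numbers below are hypothetical round figures, not the researchers' values, but the sketch shows the essential idea: hold the current steady with a feedback loop, and the record of tip heights becomes the image.\p
import math

I0 = 1e-6        # current at zero gap, amps (hypothetical)
KAPPA = 1.0e10   # exponential decay constant, per metre (hypothetical)
SETPOINT = 1e-9  # target tunnelling current, amps (hypothetical)

def tunnel_current(tip_z, surface_z):
    # Tunnelling current falls off exponentially with the tip-surface gap.
    return I0 * math.exp(-2 * KAPPA * (tip_z - surface_z))

def scan(surface, tip_z=1e-9, gain=0.5, steps=50):
    # Constant-current mode: at each point, nudge the tip up or down
    # until the current returns to the setpoint, then record the height.
    image = []
    for surface_z in surface:
        for _ in range(steps):
            i = tunnel_current(tip_z, surface_z)
            tip_z += gain * math.log(i / SETPOINT) / (2 * KAPPA)
        image.append(tip_z)
    return image

# A single-atom bump 0.2 nanometres high shows up directly in the trace.
print(["%.2e" % z for z in scan([0.0, 0.0, 0.2e-9, 0.0, 0.0])])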
An oxygen molecule is made up of two oxygen atoms. When an oxygen molecule is bonded to \Jplatinum\j, one atom rests on the surface while the other is slightly raised, so that the axis between the two atoms is tilted up from the surface at a shallow angle. More of the electrons in the molecule gather around the upper nucleus, and the image of the molecule formed by the STM is pear-shaped.\p
The researchers applied a 0.15-volt pulse lasting about 20 microseconds, with the STM tip right above the axis between the two atoms of an oxygen molecule. This caused a slight change in the "tunneling current" that normally flows between the tip and the surface, signalling that the molecule had rotated. The STM images confirmed that the larger end of the molecule rotated to a new orientation after each change in current.\p
During a voltage pulse, the current flowing between the tip and the molecule is slightly different for each of the three stable orientations of the molecule, the researchers said in their paper. "The computer could be instructed to end the voltage pulse at a particular value of the tunneling current, thus leaving the molecule in any desired location," they said. In other words, a molecule might be set in a particular position to store information.\p
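Each molecule in such a device would hold one of three states - a base-3 digit, or "trit", worth about 1.58 bits. Purely as our own illustration (certainly not the researchers' scheme), here is how ordinary data could be packed into trits and unpacked again:\p
def to_trits(data: bytes):
    # One base-3 digit per molecule: 0, 120 or 240 degrees.
    n = int.from_bytes(data, "big")
    trits = []
    while n:
        n, r = divmod(n, 3)
        trits.append(r)
    return trits or [0]

def from_trits(trits, length):
    n = 0
    for t in reversed(trits):
        n = n * 3 + t
    return n.to_bytes(length, "big")

message = b"O2"
trits = to_trits(message)
print(len(trits), "molecules needed;", from_trits(trits, len(message)) == message)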
#
"Cost of 'going on the Net' falls",522,0,0,0
(Mar '98)
Since 1995, it has cost $US50 to register a domain name (such as \1www.websterpublishing.com\c) on the \JInternet\j. In mid-March, this dropped to $US35, as the National Science Foundation (NSF) and Network Solutions, Inc. (NSI) announced the end of the \JInternet\j Intellectual \JInfrastructure\j Fund portion of domain name registration charges. The change began (seriously) on April 1.\p
The \JInternet\j Intellectual \JInfrastructure\j Fund was created to offset government funding for the preservation and enhancement of the intellectual \Jinfrastructure\j of the \JInternet\j. More than $45.5 million has been deposited into the fund to date, and the NSF no longer believes it necessary to continue charging a percentage of the registration fee for the preservation and enhancement of the \JInternet\j's intellectual \Jinfrastructure\j. No money from the fund has yet been spent.\p
The NSF has indicated that registration services for the \JInternet\j are now a self-sustaining activity, and are beyond the mission of the agency, which is to support science and \Jengineering\j research and education. The NSF does not intend to renew or recompete any agreement for registration services.\p
Domain name registration is the method used to convert the alpha-character string most of us know as a URL or Web address, into a 12-digit \JInternet\j Protocol (IP) number which computers on the network actually use to locate and communicate with each other.\p
Strictly, the IP consists of four parts, each a number in the range 000 to 255, yielding a total of 2\U32\u (around 4 billion) possible IP numbers. While this may seem like a huge range, almost enough to give every person on the \Jplanet\j their own number, what happens when you decide to connect your toaster, your dishwasher and your clothes drier to the \JInternet\j? This is not as bizarre as it sounds: if you wish to control these devices from afar, a larger range of numbers will soon become essential to our society. The pressure will be found first in the United States, where the \JInternet\j originated, and where these changes have effect, but other countries will follow close behind.\p
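The conversion from the four-part, dotted form to the single 32-bit number that machines actually route on is simple arithmetic, as this short Python illustration shows:\p
def ip_to_int(dotted: str) -> int:
    # Pack the four 0-255 fields into one 32-bit number.
    fields = [int(f) for f in dotted.split(".")]
    assert len(fields) == 4 and all(0 <= f <= 255 for f in fields)
    n = 0
    for f in fields:
        n = (n << 8) | f
    return n

print(ip_to_int("192.168.0.1"))  # 3232235521
print(2 ** 32)                   # 4294967296 possible addresses in all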
#
"Mirror, mirror, on the wall, who is the smoothest...",523,0,0,0
(Mar '98)
A good mirror is a smooth mirror. A perfect mirror is one in which not even a single atom pokes up above the surface. Melissa Hines, an assistant professor of chemistry at Cornell University, is seeking such a perfect mirror - one which is "essentially totally flat" - and expects somebody to make one within the next five years.\p
During March and early April, Hines described her work to both the American Physical Society in Los Angeles and also to the American Chemical Society in \JDallas\j. Her work, started at Bell Labs, is important in the semiconductor industry, where surface roughness, even on the atomic scale, can greatly decrease the performance of a transistor. As sizes get smaller, so roughness becomes more of a problem.\p
The work started when Bell Labs researchers were looking for a new method of removing dust from the silicon wafers used to produce integrated circuits. The previous method, developed in the 1960s, involved washing the silicon wafers in basic peroxide baths. Today's much smaller circuitry suffers badly from the roughness caused by the "bath".\p
The Bell Labs researchers found that variations in the acidity and composition of the chemical solution could produce small areas on the silicon surface that were totally flat, even at the atomic level. The surface roughness amounted to only one protruding atom out of every 30 000 surface atoms.\p
Only one form of silicon surface, silicon (111), can produce this level of perfection, and silicon (111) is a different plane from the silicon (100) used for integrated circuits. But Hines is not fazed by this. "At this point we know what is going on," she says. "Next we have to change the chemistry to control the reactions. I'm completely convinced this is possible."\p
#
"Musical robot",524,0,0,0
(Mar '98)
The theremin is the world's oldest \B\1electronic music\b\c instrument, and now it can be played by one of the world's most advanced robots, a dual-arm humanoid robot at Vanderbilt University's Intelligent Robotics Laboratory.\p
The theremin is one of the few musical instruments which can be "played" without being touched, since it responds to the closeness of the player's hands to two aerials to control both pitch and volume. The humanoid robot is the creation of the Center for Intelligent Systems Director Kazuhiko Kawamura and fellow electrical and computer \Jengineering\j professors D. Mitchell Wilkes and Richard Alan Peters II, and having two arms, it has all it needs to play the instrument.\p
A theremin was used for the high tremolo notes played in the Beach Boys' big 1966 hit, "Good Vibrations". Led Zeppelin used the theremin in such songs as "Whole Lotta Love." The group also used it in their movie, "The Song Remains the Same." It has been popular over the years to produce "ghostly" music in a variety of movies.\p
The theremin works on a capacitance effect between the antenna and the player's hand, so the robot has been equipped with pitch detection sensors, allowing it to play "by ear". The arms are driven by pneumatically controlled actuators called rubbertuators that simulate the movement of human muscles, allowing the robot to produce human-like effects, such as vibrato and tremolo.\p
On top of that, the robot has been fitted with a MIDI interface, so a tune played on a \Jguitar\j or synthesiser can be fed through to the robot which can then repeat the tune on the theremin.\p
The theremin offers a non-linear response to the player: the higher notes are closer together, and there is no simple positional relationship between notes and positions as there is with a \Jpiano\j or a \Jguitar\j. For this reason, the theremin is probably one of the few instruments which could be better played by a robot than by a human being.\p
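One plausible way for a robot to cope with that non-linearity - our sketch, not necessarily the Vanderbilt group's method - is to measure a calibration table of pitch against hand position, and interpolate within it:\p
# Hypothetical calibration pairs: (pitch in Hz, hand-antenna distance in cm).
# Note how the spacing shrinks as the pitch rises.
CAL = [(220, 60.0), (440, 40.0), (660, 30.0), (880, 24.0), (1100, 20.0)]

def distance_for(hz):
    # Linear interpolation between the two nearest calibration points.
    for (p0, d0), (p1, d1) in zip(CAL, CAL[1:]):
        if p0 <= hz <= p1:
            return d0 + (hz - p0) / (p1 - p0) * (d1 - d0)
    raise ValueError("pitch outside the calibrated range")

def midi_to_distance(note):
    # A MIDI note number (69 = A440) converted to a pitch, then a position.
    hz = 440.0 * 2 ** ((note - 69) / 12)
    return distance_for(hz)

print(midi_to_distance(69))  # 40.0 cm in this made-up table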
And where is Vanderbilt University, where the robot is located? Nashville, of course, which is why one of the participants in this project has created a theremin web page at \1http://www.nashville.net/~theremin/\c\p
#
"Making a superacid",525,0,0,0
(Mar '98)
School chemistry teaches us that acids are excellent for dissolving things, especially metals, while crime fiction seems to dwell more on acids as a way of disposing of unwanted bodies. In real life, acids are the most important catalysts in the chemical industry, and the stronger they are, the more important they are.\p
Superacids have all the strength you might ask for, and when Christopher Reed spoke about them at the March meeting of the American Chemical Society in \JDallas\j, he described them as chemicals a trillion times stronger than swimming pool acid.\p
The superacids have potential applications in fuel cell technology and the chemical and \Jpetroleum\j industries. Most acids do not react with \Jhydrocarbons\j such as \Jpetroleum\j oil, but superacids do. They are usually thick, viscous and highly corrosive fluids that are difficult to use. In contact with \Jhydrocarbons\j, the acids break \Jhydrocarbons\j into positively charged hydrocarbon cations, which usually exist for only an instant before the continuing \Jchemical reaction\js destroy them.\p
These cations (catt-EYE-ons) are technically referred to as carbocations (carbo-catt-EYE-ons), and scientists have known for many years that superacids can stabilise carbocations so that they could be studied.\p
All acids dissociate in the presence of \Jwater\j, breaking down into a positively charged cation and a negatively charged anion (ann-EYE-on). Because the free ions are so reactive, acids break down numerous other compounds, which explains why acids are so important in the chemical industry.\p
A superacid is defined by Reed as anything as strong as, or stronger than, 100% sulfuric acid, although pure sulfuric acid in fact shows no acidic properties, needing the addition of a small amount of \Jwater\j to start the acid dissociating into ions which actually do all of the work.\p
Usually the acid's cation, the positively charged \Jhydrogen\j ion, is regarded as playing a key role in triggering reactions, but Reed says the role of the negatively charged anion can be crucial and has been underappreciated. There will always be an anion present, and by reacting with the carbocations, they reduce the effectiveness of the currently known superacids.\p
Reed, with colleague Nathanael Fackler, both at the University of Southern \JCalifornia\j, have found an extremely inert anion in the carborane family. This anion makes possible a new generation of superacids that will produce hydrocarbon cations that are not broken down.\p
Carboranes are \Jboron\j compounds which were originally synthesised by chemists at E.I. du Pont de Nemours & Co. in the 1960s, and while they are now too expensive to be widely used in the commercial chemical industry today, one American company is presently marketing them.\p
A USC graduate student who is working with Reed, Robert Bolskar, has already used the new anion to stabilise two new carbocations derived from \B\1buckminsterfullerene\b\c. "Usually, the buckyball compound is rapidly chewed up by a superacid," Reed says. "We've made two different cations that people did not think could be put in a bottle."\p
#
"Potato blight returns",526,0,0,0
(Mar '98)
More than 150 years after the \Jfamine\j that took an estimated 1 million lives in Ireland, the \B\1potato blight\b\c is on the attack again, this time in America. A newer and more virulent strain of the \Jfungus\j, \IPhytophthora infestans\i, better known as late blight, is devastating North American crops.\p
The new strain, known as US-8, presents a bigger threat to \Jpotato\j and \Jtomato\j crops, mainly because it is more aggressive in the way it attacks plants, but also because it is proving resistant to the most effective fungicide, metalaxyl. The strain US-8 first appeared in 1992 and 1993 in New York state and Maine, but has now spread along the whole eastern seaboard of North America, skipping only Newfoundland, Virginia and South Carolina, and has also surfaced in \JCalifornia\j, Kansas, \JIdaho\j, \JTexas\j, \JColorado\j, \JNebraska\j and South Dakota.\p
The late blight disease runs through a whole cycle in just five days, all the way from penetration, colonisation, sporulation to renewed dispersal, with each lesion able to produce as many as 300,000 sporangia a \Jday\j.\p
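To see why those numbers matter, here is a deliberately crude back-of-the-envelope model in Python. The 300,000 sporangia per lesion per day and the five-day cycle come from the paragraph above; the assumption that only 1 in 100,000 sporangia founds a new lesion is ours, chosen purely for illustration:\p
SPORANGIA_PER_DAY = 300_000   # per lesion, from the article
CYCLE_DAYS = 5                # penetration to renewed dispersal
SUCCESS_RATE = 1e-5           # assumed fraction founding new lesions

lesions = 1.0
for cycle in range(1, 5):     # four cycles, about three weeks
    new = lesions * SPORANGIA_PER_DAY * CYCLE_DAYS * SUCCESS_RATE
    lesions += new
    print(f"after {cycle * CYCLE_DAYS:2d} days: ~{lesions:,.0f} lesions")
Even at that tiny assumed success rate, one lesion becomes tens of thousands in under a month, which is why a whole field can collapse in weeks.\p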
Some infected tubers may be destroyed before harvest, but with so many virulent spores, harvested potatoes can easily become diseased in storage. \JBacteria\j that cause soft-rot diseases often invade \Jpotato\j tubers infected with late blight, literally resulting in a "meltdown" of stored potatoes. Under severe infection, whole storages have to be discarded.\p
A temporary website has been set up at http://www.scisoc.org to provide information on the \Jbiology\j of the blight. This interactive, educational website looks at the historical impacts of late blight, highlights key research articles, presents an online curriculum and offers links to other key late blight information.\p
#
"Rapid data access urged",527,0,0,0
(Mar '98)
While it is a standard part of the \B\1scientific method\b\c for scientists to report their results first in a peer-reviewed journal, the human \Jgenome\j researchers broke with tradition in 1996, agreeing to release data as soon as they became available.\p
Now it is the turn of the researchers working on microbial sequences and the DNA of other organisms to take the same stance, agreeing at a meeting in \JBermuda\j in February to release data immediately. Workers from \JJapan\j, \JFrance\j, Britain, \JGermany\j, and the United States unanimously proposed that all large-scale sequencing centres follow suit.\p
The participants first looked at mouse \Jgenome\j data, but quickly moved to broaden the policy to other organisms, including microbes and plants. They noted that in the human example, the steady flow of human \Jgenome\j data has been a boon to research and has not cluttered up databases with incomplete information, as some had predicted.\p
The governments of the five countries, the only ones which currently support large-scale gene sequencing, will need to endorse the policy proposal, once it is formally presented to them.\p
#
"Dutch to prevent cow cloning",528,0,0,0
(Mar '98)
In early March, the Dutch minister of agriculture, Jozias van Aartsen, put an end to \Jcloning\j experiments carried out by Pharming, a company based in \JLeiden\j, the Netherlands. Pharming specialises in producing cows which secrete drugs in their milk. The ban was a rapid reaction to Pharming's announcement on February 26 of the birth of two calves, Holly and Belle, both cloned from embryonic cells. The ban applies only to work at Pharming, and only certain types of work involving transfer of nuclei. Pharming has already indicated that it will move its work to the United States.\p
This is likely to be a common future reaction when governments seek to limit genetic \Jengineering\j work within their jurisdictions. In this particular case, the ban is to remain until the company proves that drugs from such animals are better than those made by other methods. The term "better" seems not to have been defined, but if it means "more effective", the company will face a stiff battle.\p
Pharming uses a procedure in which embryos are injected with the gene for a desired drug, a pharmaceutical protein, and each embryo is then placed in another cow to come to term. This method is a bit "hit or miss", and leads to many nontransgenic calves which do not produce the target drug. Now Pharming is investigating what may be a more efficient method, nuclear transfer, to clone cows, because the technique could speed the process of growing herds that reliably produce drugs in their milk.\p
The Dutch government only allows genetic \Jengineering\j and animal \Jcloning\j when there are no feasible alternatives in lower organisms, and when the benefits to society outweigh animal suffering. So far, Pharming has been unable to meet this requirement, and so its work has been placed on hold. Pharming complains that Scottish and US laboratories are forging ahead in this area, leaving it commercially crippled.\p
#
"Who owns Yellowstone's microbes?",529,0,0,0
(Mar '98)
Who owns the organisms that grow freely in the environment, and who owns their genes? In the USA, the National Park Service announced in August 1997 plans to allow Diversa Corp. to sample soil, \Jwater\j, and detritus in \JYellowstone\j National Park over the next 5 years, in return for a one-time fee of $175,000 and up to 10% in royalties on any commercial products derived from microbes found in the park.\p
Areas such as \JYellowstone\j are of particular interest, because the life forms found in hot springs must have some fairly fancy biochemistry if they are going to survive. All over the world, the haunts of extremophiles, as these life forms are called, are being searched for interesting biochemicals. (See \BSome Like it Hot\b, February 1998.)\p
In March, three environmental organisations announced that they were undertaking legal action to stop the park authority from entering into a formal agreement with a San Diego-based biotechnology company.\p
The National Park Service, they say, is supposed to protect - not exploit - organisms in its protected habitats. When the groups' call for an environmental assessment was denied, along with their request for disclosure of the financial arrangements of the agreement, The Edmonds Institute in Edmonds, Washington, teamed up with a regional organisation called the Alliance for the Wild Rockies to file a lawsuit against the Department of the Interior and its National Park Service.\p
Already, a company is making millions of dollars from an \Jenzyme\j called Taq polymerase, derived from a \JYellowstone\j microbe, and the park has seen none of it. The park management are keen to avoid this happening again, while the conservationists argue that park management is about protecting life forms, not exploiting them.\p
#
"The wolves of Isle Royale strike hard times",530,0,0,0
(Mar '98)
The balance between wolves and moose in the Isle Royale National Park in \JMichigan\j has been a standard example used in \Jecology\j textbooks for years. The wolves first reached the island back in the 1940s, perhaps by walking across the frozen ice of Lake Superior. However they arrived, the wolves have been thriving ever since, living mainly on old or sickly moose.\p
While this isolated \Jecosystem\j appears simple, it has not been easy, over the years, to predict the variations in the number of Isle Royale wolves, and the most recent change was even harder to predict. The winter of 1996-97 saw the wolf population increase to 24, even though moose populations had plummeted by 80% in the previous year. Then after the northern winter (1997-98) which has just passed, there appear to be only 13 wolves, even though moose numbers climbed back from 500 to 700.\p
The only likely explanation seems to be some kind of disease, so the northern summer of 1998 will see blood tests on captured wolves. One problem for the wolves: after half a century of inbreeding, they have become very similar genetically, so that any disease that attacks one of them is highly likely to attack all of them.\p
#
"Was Malthus just a little bit early?",531,0,0,0
(Mar '98)
The Reverend Thomas Malthus warned us in 1798 of the consequences of overpopulation, giving us the word "Malthusian" in the process. In fact, Malthus was pre-dated by some 208 years by an Italian writer named Botero, who made much the same point in a book published in 1590, but that is not the name we recall - if Malthus was ahead of his time, Botero was seriously ahead of his time.\p
Yet over the past two centuries, there has been little sign of the gloom and doom that Malthus forecast. As we approach the bicentenary (on June 7, 1998) of the publication of Malthus' essay, it may be time, scholars are saying, to consider whether we are not just about ready for a "disastrous Malthusian correction", to use the words of David Price, an anthropologist who is also a research associate with the Cornell University Population and Development Program.\p
Nobody wants to accept the "Malthusian correction" as inevitable, but since Malthus' time, the world population has risen from 1 billion to 6 billion, in spite of all the checks that Malthus thought would keep us within our means of subsistence. Malthus thought such growth would be countered by "preventive checks" such as \Jinfanticide\j, \Jabortion\j and contraception, and by "positive checks" such as famines, plagues and wars.\p
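Malthus' essential arithmetic - population multiplying geometrically while subsistence grows only arithmetically - can be replayed in a few lines of Python. The doubling is Malthus' own illustrative ratio; the fixed food increment is our stand-in for "arithmetical" growth:\p
population, food = 1.0, 1.0
for generation in range(1, 9):
    population *= 2.0    # geometric: doubles each generation
    food += 1.0          # arithmetic: fixed increment each generation
    print(generation, population, food)
After eight generations the population stands at 256 against 9 units of food, which is the whole of Malthus' point, however you choose the starting numbers.\p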
No doubt we will hear more of the Reverend Mr Malthus during 1998.\p
#
"Dinosaurs in the news",532,0,0,0
(Mar '98)
\JDinosaur\j finds in several parts of the world were announced during March, including a new and very bird-like \Jdinosaur\j from Madagascar, others from the Gobi Desert, beautifully preserved dinosaurs, complete with their dinners, in \JItaly\j, and excellently preserved \Jdinosaur\j tracks in Wyoming, dating back to 165 million years ago.\p
There are many who happily accept that the dinosaurs never died out, that they just turned into birds, once they grew feathers. The members of this "heretical" group point to a great deal of evidence that the dinosaurs were already warm-blooded.\p
This point of view received a nice boost in early April, with clear evidence of Australian dinosaurs living under polar conditions, something which cold-blooded animals simply could not have done. It is also worth noting that your reporter, having sifted the evidence and talked to the experts, is firmly a member of the bird-\Jdinosaur\j group. So with that declaration of bias, the bird-\Jdinosaur\j claim is one that desperately needs more "missing links" like the well-known \IArchaeopteryx\i. Some of that evidence has just appeared . . .\p
A new raven-sized \Jfossil\j bird, showing clear evidence of the close relationship between theropod dinosaurs and birds, has been discovered on the island of Madagascar. It was announced in \IScience\i during March by a team of researchers led by palaeontologist/anatomist Catherine Forster of the State University of New York (SUNY) at Stony Brook. The \Jfossil\j bird dates from the late Cretaceous, at around 65 - 70 million years ago, and it was found in 1995. It has now been given a name: \IRahona ostromi\i, meaning "Ostrom's menace from the clouds."\p
While \IRahona\i has a long forearm bone, a clear sign that it was a good flier, a view supported by evidence of well-developed feathers, it also had a long, bony tail and sported a large, sickle-like killing claw at the end of a thick second toe on the hind foot. This unique toe and claw is identical to the one carried by a group of fast, predatory theropod dinosaurs called "maniraptorans."\p
The maniraptorans will be better known to movie-goers and to children (who always know their dinosaurs!) through two of their members, \IVelociraptor\i and \IDeinonychus\i, and it is this group that the "bird-\Jdinosaur\j" camp generally regards as the closest relatives of birds. In other words, the bird-\Jdinosaur\j group's implicit prediction that early birds (feathered dinosaurs?) would show maniraptoran characters has just been borne out.\p
Forster added, "\IRahona\i was at the base of the bird family tree, right next to \IArchaeopteryx\i. It had a feathered wing and many bird features in its hips and legs, including a perching foot. But it also kept the big killing claw of its theropod ancestors." Palaeontologists have long suspected that theropods gave rise to birds, and the presence of this "maniraptoran" toe and claw on the \JMalagasy\j bird "clinches it for us. This discovery lends a lot of weight to the idea that birds are a side-branch of the theropod family tree."\p
Two days later, \INature\i gave us a report of work done in the Gobi Desert, where the first skull has just been found of a transitional bird-\Jdinosaur\j beast from the group known as the Alvarezsauridae. This group is of special interest because it provides further evidence in support of the theory that birds evolved from dinosaurs and reveals an advanced stage in this transition.\p
The new fossils date back to about 70 million years (late Cretaceous), and were discovered during one of a series of joint American Museum of Natural History/Mongolian Academy of Sciences expeditions to search for dinosaurs and other fossils. The most exciting thing about these finds is that they include, for the first time, skull material of an alvarezsaurid.\p
The new find has been named \IShuvuuia deserti\i, from the Mongolian word shuvuu, meaning "bird," and the Latin for desert, in reference to the ancient climate in which the animals lived. \IShuvuuia deserti\i, which was about the size of a turkey, walked on two legs, had a long tail and neck, and quite unlike most primitive birds, had stubby forearms that ended in a single, blunt claw. How the animals used these strange appendages is a mystery, but they were clearly unable to fly.\p
Related specimens, including a creature called \IMononykus\i, have previously been collected in \JMongolia\j, Argentina, and North America, but none of them had a skull. Numerous physical characteristics in the \Jfossil\j skulls show that these strange creatures were actually early birds, challenging the traditional view that all primitive birds looked similar to their modern-\Jday\j cousins. \JFossil\j skulls are extremely important because they contain key physical characteristics that enable researchers to trace the evolutionary history of different life forms.\p
The \IShuvuuia deserti\i skulls reveal an important physical characteristic that is found only in birds: the animal was capable of "prokinesis", the movement of the snout up and down independently of the rest of the skull. This allowed the animal to open its mouth quite wide in order to eat large food items. The diet of \IShuvuuia deserti\i is not known, but may have included insects, as well as lizards and even small mammals. Prokinesis is considered a very advanced characteristic of birds. To find evidence of this characteristic in such a primitive bird is surprising and indicates that this ability actually arose early in bird \Jevolution\j.\p
Curiously, \IArchaeopteryx\i is more primitive as a bird than \IShuvuuia deserti\i, yet \IArchaeopteryx\i fits the stereotypical conception of a bird much better than the more advanced \IShuvuuia deserti\i. As one of the team commented, "The new discovery illustrates the complexity of the \Jevolution\j of birds and hints at the number of surprises yet to be uncovered in tracing the development of their lineage."\p
One week later, another theropod, \IScipionyx samniticus\i, was in the news when Italian scientists announced that they had found one of the best preserved dinosaurs known anywhere in the world, complete with internal organs and muscles. The find is amazing, not only for the soft tissues, but because this is the first time a \Jdinosaur\j has been found in \JItaly\j. Marco Signore and Cristiano dal Sasso, of the Museo Civico di Storia Naturale in Milan, found the hatchling specimen in the Matese mountains north of Naples. Although it is an area rich in \Jinvertebrate\j fossils and fish, no one imagined it contained \Jdinosaur\j remains.\p
The lizard-like hatchling, about 13 inches (35 cm) long, was buried on its left side with the head slightly upturned. The remains show the creature's \Jwindpipe\j, large \Jintestine\j and bits of liver.\p
The United States also played its part, with the discovery of the Red Gulch \JDinosaur\j Tracksite, featuring 165 million year-old \Jdinosaur\j footprints near Shell, Wyoming. \JDinosaur\j tracksites like this from the Middle Jurassic Period are rare in the world, according to palaeontologist Brent Breithaupt, University of Wyoming Geological Museum director who is leading the \Jfossil\j track investigation of the site.\p
The site is located on Bureau of Land Management (BLM) administered public land. Erik Kvale, an \JIndiana\j Geological Survey geologist, found the tracks in spring 1997 during a family outing. Kvale was surprised to find the footprints in the Sundance Formation, which is better known for its plentiful marine fossils.\p
Many of the footprints appear to have been made by meat-eating theropods that travelled on a tidal flat along a shoreline. In-depth site research will begin almost immediately to uncover the tracks and map the 40-acre tracksite.\p
The area has been in the \Jdinosaur\j news before: "Big Al", one of the most complete \IAllosaurus\i fossils ever found, was discovered in this general area in 1991. "Big Al" is ruled out as one of the footprint makers on the track site, as he lived 15 million years after the tracks were laid down.\p
For more information on the track site, Breithaupt suggests that people visit the University of Wyoming Geological Museum web site at \1http://www.uwyo.edu/legal/geomuseum/geolpage.htm\c\p
#
"How the eucaryotic organisms arose",533,0,0,0
(Mar '98)
According to \B\1serial endosymbiosis theory\b\c or SET, the eucaryotic organisms arose when simple protists joined together. These eucaryotic (or eukaryotic) cells can be recognised because they have organelles, small specialised parts, sealed off from the rest of the cell by membranes.\p
Now a new theory has appeared in \INature\i, suggesting that the eucaryotic organisms arose not from the haphazard accident of the standard SET interpretation, but from a beneficial relationship that was a matter of survival.\p
In the SET scenario, the first complex cell was a predator, which evolved the ability to eat other \Jbacteria\j. It gained its organelles, like the \Jenergy\j-producing mitochondria, when some of its prey happened to escape digestion and took up permanent residence. Recently, though, an examination of the genes of certain single-celled eucaryotes called protists hints that eucaryotic cells might have acquired their mitochondria before they had evolved the ability to engulf other simple cells.\p
William Martin is a biochemist from the Technische Universität in Braunschweig, \JGermany\j. He says he was looking one evening at a picture of a protist called \IPlagiopyla\i, whose cells play host to \Jhydrogen\j-eating \Jbacteria\j called methanogens, which group together near \Jhydrogen\j-producing organelles called hydrogenosomes. These organelles are thought to be related to mitochondria, and Martin realised that what he saw inside the protist, the partnership of the organelle and the methanogens, might be similar to the friendly association which led to the first eucaryote.\p
In discussion with Miklós Müller of Rockefeller University in New York City, he concluded that a partnership between an ancestral methanogen and a \Jhydrogen\j-producing eubacterium could have led to the first complex cell. They propose that the relationship started casually, in an oxygen-free, \Jhydrogen\j-rich environment.\p
The two organisms must later have found themselves somewhere where the methanogen could not survive on its own. At that point, Martin and Müller suggest, an exchange of genes made the partnership permanent, allowing the host bacterium to enclose its guest completely. The new genes enabled the \Jhydrogen\j-dependent methanogen to import small molecules, make sugars, and break them down into food for the enclosed \Jhydrogen\j-producer.\p
#
"Lizard evolution",534,0,0,0
(Mar '98)
In April 1997, we brought you an account of a study on the anole lizards of Staniel Cay in the \JBahamas\j (\BGreat \JEvolution\j Leaping Lizards\b). The anole lizards were back in the news again in March, with a detailed study of the DNA of 56 species found throughout the large Caribbean islands of the Greater \JAntilles\j, including Puerto Rico, \JCuba\j and \JJamaica\j. (There are about 150 \IAnolis\i species in the area.) Now the same research group, a team led by Jonathan B. Losos, associate professor of \Jbiology\j in Arts and Sciences at Washington University, has added further to our knowledge of these lizards, and also shown us how \Jevolution\j works.\p
A report in \IScience\i at the end of March describes a family "tree" which reflects the notion of convergent \Jevolution\j almost perfectly. The tree is based on the fine detail of a number of common genes found across the species, and what it shows us is that species evolve similar adaptations to the environment despite living geographically apart.\p
While random events may send \Jevolution\j off along strange and unexpected pathways, given the same or similar material in similar situations, much of \Jevolution\j seems to be fairly predictable.\p
Different species of anole lizard have adapted to use different parts of the environment by evolving differences in limb length, toe pad size and other characteristics. Within a given island, species use different parts of the environment and have evolved different features to suit their particular habitats. A species living near the trunks of rain forest trees will have long legs to let it run and jump, while another species which lives on twigs will have short legs, allowing it to creep along the smaller diameter living surface. Species living high in trees tend to have big toe pads, which would be important for clinging, while those that are more terrestrial have small toe pads.\p
Other habitat specialists may live in grass, and each island seems to show the same general pattern of types, yet the species are different on each island. This could be explained in three entirely different ways.\p
First, the same specialist species may have been distributed across the islands, long ago, and have evolved into entirely different species over time, coupled with genetic isolation. Alternatively, each specialist may have evolved on one island, and then have spread, later to evolve into distinct species.\p
The third possibility is that some member of the group reached a new island, and then slowly adapted to the different habitats that were there for the taking, with the same types evolving repeatedly on each island. The family tree settles the matter in favour of this last mechanism: the habitat specialists from the different islands are not closely related, despite exact similarities in their physical traits.\p
#
"The first seafarers?",535,0,0,0
(Mar '98)
As reflected in our entry on \B\1Cro-Magnon Man\b\c, a number of Australian researchers believe that the first evidence of fully modern humans, equipped with language, comes from the first human settlements in \JAustralia\j. Long before the Cro-Magnon people of Europe appeared and painted their first caves, early Australians had planned and built boats or rafts capable of making long sea journeys.\p
A report in \INature\i during March suggested that our ancestors, \IHomo erectus\i, may have had the same sort of seafaring ability, some 800 thousand years ago. The Indonesian island of Flores is isolated from Java, where the remains of \IHomo erectus\i were first found, with several deep sea straits blocking access along the 600 km route to Flores.\p
A joint Dutch-Indonesian group found stone tools on Flores in 1994, and these were dated to 750 thousand years ago, a time when only \IH. erectus\i was in the area, but no human remains were found, and the dating method used, palaeomagnetic dating, is a little uncertain.\p
The new work relies on stone tools found between layers of volcanic rock at Mata Menge, dated by fission-track dating. This looks at tiny tracks left in volcanic crystals such as zircon by the spontaneous fission of uranium-238 atoms. After dating 50 zircon grains from ash layers just above and below the tool-bearing \Jsandstone\j layer, Paul O'Sullivan and Asaf Raza at La Trobe University in Victoria, \JAustralia\j, got ages of 800 000 to 880 000 years for almost all of the grains.\p
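The arithmetic behind fission-track dating can be sketched in a few lines of Python. This is an idealised textbook version, not the La Trobe procedure (real work uses induced tracks and a calibration step); the decay constants are standard published values, and the sample track count is invented to land near the Mata Menge ages:\p
import math

LAMBDA_ALPHA = 1.55125e-10    # total decay constant of uranium-238, per year
LAMBDA_FISSION = 8.46e-17     # spontaneous-fission decay constant, per year

def age_years(tracks_per_u238_atom):
    # Tracks accumulate as N_t/N_U = (lf/la) * (exp(la*t) - 1); solve for t.
    ratio = tracks_per_u238_atom * LAMBDA_ALPHA / LAMBDA_FISSION
    return math.log(1.0 + ratio) / LAMBDA_ALPHA

# An invented zircon with 6.8e-11 spontaneous tracks per U-238 atom:
print(f"{age_years(6.8e-11):,.0f} years")    # roughly 800,000 years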
Even when sea levels were at their lowest, say the excited researchers, there would have been 19 km of open \Jwater\j to be crossed, much too far for a "pregnant woman on a log" to drift across. Others have expressed more caution: the tectonics of the area are unstable, and there may have been a land bridge, or smaller islands, long since disappeared, to make island-hopping possible.\p
#
"Sinking the idea of forests as carbon sinks",536,0,0,0
(Mar '98)
Canada's \B\1boreal forests\b\c make up roughly 10% of the world's forested areas, and they have long been regarded as a major absorber of carbon dioxide. Even though this was always bad science, some scientists have joined politicians in assuming the best for these forests.\p
Why is it bad science? Simply put, as a forest matures, it begins to take up less carbon dioxide, while decaying plant matter, fires and leaves eaten by animals all begin to return carbon to the \Jatmosphere\j once more. Eventually, the forest will establish an equilibrium, taking up as much carbon dioxide as it releases.\p
Now it appears that even this limited "cure" for the enhanced \Jgreenhouse effect\j may be excessively promoted, though the true answer can only be found by a strict accounting of the carbon budget of entire forest areas. The potential equilibrium can be knocked off balance by all sorts of disturbances.\p
Scientists from the Canadian Forest Services are studying the variations over time of the carbon storage capabilities of Canada's forests, taking inventories of age-class distributions of forests, which conveniently reflect past disturbances. Their conclusion is that the forests of Canada have moved from a sink of atmospheric carbon at the start of this century to a source in the last decades, due to a change in disturbance regimes. The change in disturbance regime is possibly related to climate change, while human impacts are still negligible in this region.\p
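The bookkeeping behind "sink or source" is simple to sketch. The categories of loss come from the paragraphs above; all the numbers below are invented placeholders, since the real answer rests on detailed forest inventories:\p
def net_carbon_flux(uptake, decay, fire, herbivory):
    # Positive result = sink (the stand absorbs carbon); negative = source.
    return uptake - (decay + fire + herbivory)

# Invented figures, in tonnes of carbon per hectare per year:
young_stand = net_carbon_flux(uptake=5.0, decay=2.0, fire=0.5, herbivory=0.5)
old_disturbed_stand = net_carbon_flux(uptake=2.5, decay=2.0, fire=1.5, herbivory=0.5)
print(young_stand, old_disturbed_stand)    # 2.0 (sink) and -1.5 (source)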
Under the \JKyoto\j Protocol to the UN Framework Convention on Climate Change, agreed in \JKyoto\j in December 1997, signatories are to take into account emissions from \Jfossil\j fuels, while the guidelines require the signatories only to account for those fluxes of carbon between land and \Jatmosphere\j directly associated with human activities (e.g. harvesting and land-use changes). As this newly identified source of CO\D2\d is not of human origin, it falls beyond the guidelines, and so makes yet another problem for our environment.\p
#
"Dengue fever on the move?",537,0,0,0
(Mar '98)
Global warming is expected to increase the range of \IAedes aegypti\i, a mosquito which transmits the \B\1dengue fever\b\c virus. \JDengue\j fever is now considered the most serious viral infection transmitted to man by insects, whether measured in terms of the number of human infections or the number of deaths.\p
This is the conclusion of researchers who have been using computer models to explore what the future holds. Their report appeared in the March issue of \IEnvironmental Health Perspectives\i, the monthly journal of the National Institute of Environmental Health Sciences.\p
Most of the areas where the mosquito is likely to advance are temperate regions bordering areas where \Jdengue\j fever is already found. The only barrier has been lower temperatures which kept the mosquito out. Unlike the yellow fever virus, carried by the same mosquito, the \Jdengue\j virus is not vulnerable to any vaccine or drug.\p
The geographic range of \IAedes aegypti\i is limited by freezing temperatures that kill overwintering larvae and eggs, so \Jdengue\j virus transmission is limited to tropical and subtropical regions. Global warming would not only increase the range of the mosquito but would also reduce its larval size and, ultimately, its adult size. Since smaller adults must feed more frequently to develop their eggs, warmer temperatures would boost the frequency of double feeding and so increase the chance of transmission, which happens when the first person bitten is carrying the virus.\p
Warmer temperatures also reduce the incubation time for the virus. The incubation period of the \Jdengue\j type-2 virus lasts 12 days at 30°C, but only seven days at 32-35°C. Half the world's population is currently at risk from the disease, and it has recently become a serious problem in Latin America. \JBrazil\j alone had a quarter of a million cases in 1997.\p
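The two incubation figures quoted above are enough for a crude sensitivity sketch in Python. The linear interpolation between them is our assumption, not the researchers' model:\p
def incubation_days(temp_c):
    # Published anchor points: 12 days at 30 C, 7 days at 32 C and above.
    if temp_c <= 30.0:
        return 12.0
    if temp_c >= 32.0:
        return 7.0
    return 12.0 + (temp_c - 30.0) * (7.0 - 12.0) / 2.0

for t in (30, 31, 32, 35):
    print(t, incubation_days(t))
A shorter incubation means an infected mosquito becomes capable of passing the virus on sooner, and so has more infectious days in its short life.\p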
#
"A wet Big Apple?",538,0,0,0
(Mar '98)
A new study from Columbia University suggests that New York City may face serious flooding over the next century. Presenting her results at a conference in March, Dr Vivien Gornitz, associate research scientist at Columbia's Center for Climate Systems Research, offered some gloomy predictions. Local temperatures could rise by as much as four degrees Fahrenheit (2°C), and sea levels could increase by up to eight inches (20 cm) by 2030 and by as much as four feet (1.2 metres) by 2100, under the most extreme scenarios, she said.\p
This would tend to flood subways, airports and low-lying coastal areas, she predicted. The meeting, one of a series to assess problems of climate change, will contribute to a report to be presented to the US Congress and the President by 2000.\p
The northeastern United States is dropping by about one \Jmillimetre\j a year, offsetting a rise in southern Canada, which was previously compressed by glaciers, and this will contribute about an inch (2.5 cm) to the estimated sea level "rise" by 2030; three different scenarios all indicate a total rise of 4 to 8 inches (10-20 cm).\p
#
"New Web source of weather images",539,0,0,0
(Mar '98)
A new Web site, available to all, provides Web surfers with real-time \Jweather\j images, coming from a \B\1weather \Jsatellite\j\b\c near you, and the site even lets you animate the \Jweather\j in your area (or any other area) over the past few hours.\p
Images of North America are broadcast by the GOES-8 \Jsatellite\j every thirty minutes, and posted to the Web within a further fifteen minutes. You can choose the area to look at, and you can zoom in for finer detail. To cover the rest of the globe, the GOES-9, China's FY-2 \Jsatellite\j and \JJapan\j's GMS-5 satellites are also used.\p
The URL for the new GOES interactive site is:\p
\I\1http://www.ghcc.msfc.nasa.gov/GOES/\c\p
The offering comes from a joint endeavour between NASA's Marshall Space Flight Center, the Universities Research Association, and the Space Science and Technology Alliance of the State of \JAlabama\j, working as the Global Hydrology and Climate Center, or GHCC.\p
The GHCC needs access to geostationary \Jsatellite\j data to monitor short-term components of the \JEarth\j's \Jwater\j cycle and to develop a long-term data base for regional applications and climate research, and to meet this need, the GHCC recently set up a ground station to gather data from GOES-8, which is positioned about 35,800 km (22,300 miles) above the equator at 74.7 deg. W longitude.\p
Aside from the pictures of general interest, the site also offers downloads of digital data by anonymous FTP for more serious workers.\p
#
"Seeing inside the earth",540,0,0,0
(Mar '98)
Researchers at the U.S. Department of \JEnergy\j's Lawrence Berkeley National Laboratory (\1http://www.lbl.gov\c) have used seismic wave data gathered from tens of thousands of earthquakes to produce the first three-dimensional image of the \JEarth\j's entire structure, all the way from the crust to the inner core.\p
The researchers, Don Vasco and Lane Johnson, wrote up their results in the February 1998 issue of the \IJournal of Geophysical Research\i. The key point from their analysis is that the outer core is not homogeneous, as has long been believed.\p
Seismic data from some 40 000 earthquakes in the 1960s, 1970s and 1980s was analysed by computer to get the travel times. Using this large data set, Vasco and Johnson were able to characterise the seismic velocity of materials which make up our \Jplanet\j. Vasco describes the work as something like a CAT scan of the \Jearth\j, only instead of using thousands of rays, they have used thousands of \Jenergy\j bursts.\p
The outer core, which starts about 3000 kilometres (1850 miles) below the \JEarth\j's surface and is 2300 km thick, is thought to be a liquid, with a \Jviscosity\j not much different from \Jwater\j. This led some to conclude that the outer core has no real structure. But the researchers found indications of heterogeneity at the bottom of the outer core, which Vasco thinks is an iron-nickel-sulfur compound. High pressures and temperatures could be causing nickel-rich iron to solidify, depleting the nickel at the base of the outer core, he thinks, which could help explain the \JEarth\j's magnetic field. The depleted iron is less dense than the surrounding core, which would cause it to rise, leading to convection and a magnetic field.\p
A high resolution 533K TIF image can be downloaded from \1http://www.lbl.gov/images/PID/seismic-earth-plot.tif\c\p
#
"Studying earthquakes by satellite",541,0,0,0
(Mar '98)
The Global Positioning System (GPS), designed for navigation, is already being used to study \Jplate tectonics\j (January, 1997) and the uplift of the \JAndes\j (January 1998), and now it is being used to measure the positions of markers thousands of miles apart to a precision of less than an inch (2.5 cm), becoming an even more powerful tool for \Jearthquake\j studies around the world.\p
While plate movements could previously be estimated only by averaging over several million years of geological data, the presumed rate of travel - about as fast as a fingernail grows - can now be studied in real time. Seth Stein, professor of geological sciences at Northwestern University, spoke during March to a Seismological Society of America meeting in Boulder, \JColorado\j, reporting on new advances in this field, and summarising the state of the art.\p
Most importantly, we now know that the previously calculated average rates of travel are the same as the actual rates of travel, allowing greater confidence in \Jearthquake\j hazard studies based on models of plate motion which are used to infer how often large earthquakes on average occur in \JJapan\j or \JCalifornia\j.\p
GPS also reveals the slow squeezing of the interior of plates which give rise to earthquakes like the New Madrid earthquakes which shook the USA in 1811 and 1812. These movements are particularly slow, but GPS can still measure them and provide the needed data, using a set of markers which now span the New Madrid seismic zone. "Although most of the motions are probably the after-effects of the earlier quakes, some could be building up for the next quake," Stein said. He also referred to work on the subduction of the Nazca plate ("\B\1Watching the \JAndes\j Grow\b\c", January 1998).\p
#
"Undoing the cryptographers' work",542,0,0,0
(Mar '98)
Modern methods of encrypting computer data rely on the problems we encounter in trying to find all of the factors of a large number which is the product of two primes. To take a simple (but non-trivial) example, the number 1729 has three prime factors which are in \B\1arithmetic progression\b\c, making it one of an interesting subset of the Carmichael numbers, but even finding the small factors of such a simple number can take a human quite a while.\p
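The factor hunt is easy to express, and for small numbers like 1729, quick to run - a minimal trial-division sketch in Python:\p
def factorise(n):
    # Trial division: slow in general, fine for small n.
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factorise(1729))    # [7, 13, 19] - primes in arithmetic progression, step 6
The catch is scale: the same loop on a number with hundreds of digits would, in effect, never finish, and that asymmetry is what the encryption relies on.\p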
Computers can do it faster, but if you ask a computer to factorise a 1000-digit number, that would take the fastest existing supercomputer about 10 billion times the age of the universe. But hope for the code-breakers may be on the way, sooner or later.\p
The trick is to develop a quantum computer, which nobody has done yet. But when the first quantum computer is unveiled, the algorithms will be there to help program it, including an \Jalgorithm\j which would factorise the 1000-digit number in about half an hour!\p
Described in the 16 March issue of \IPhysical Review Letters\i, this \Jalgorithm\j could dramatically speed up chores like searching through reams of data, especially in scheduling problems where teachers and students have to be fitted into a timetable with no clashes. The limitation in doing this on an ordinary computer is that each item must be in binary form, coded as a single bit which is either on or off, 1 or 0.\p
A "bit" in a quantum computer can have more than one state at a time, represented by the quantum characteristics of a particle like an \Jelectron\j or \Jphoton\j. This means that a string of quantum bits can hold all possible schedules at the same time. When the quantum states of the particles are measured, this forces the system to choose one of the possible configurations, one of the schedules. Each schedule has a probability of being chosen, and this can be set to depend on having the smallest number of conflicts. The \Jalgorithm\j is the work of Tad Hogg of the Xerox Palo Alto Research Center in \JCalifornia\j.\p
Even if quantum computers are years away from being built, studying quantum algorithms clarifies the nature of problems by probing their structures ever more deeply - and it may just turn out to be useful to those who design the new forms of computer, some time in the 21st century.\p
#
"Death of a death threat",543,0,0,0
(Mar '98)
No sooner was the news out that we were facing the end of civilisation as we know it, than the threats were called off again. There was barely time for cynics to assert that the barbarians are not only at the gate, but they are living in the hall before we were told that civilisation as we know it was safe from falling \Jasteroids\j.\p
The threat of \JArmageddon\j, if not now, then in the foreseeable future, came and went within 24 hours during March. If you missed it, don't worry, it won't happen, but for a short period there, an asteroid at least 1 \Jkilometre\j wide was predicted to pass between the \JEarth\j and the Moon, around 1:30 p.m. Eastern (USA) time on 26 October 2028. The \Jorbit\j of the massive asteroid, known as 1997 XF11, was posted on the \JInternet\j by the International Astronomical Union on March 11.\p
Even though astronomers estimated the chance of a hit as less than 1%, excitable media all over the world had just got into their "end of the world is nigh" stride when a new analysis appeared. Archival images of the asteroid, identified the very next day, and a more sophisticated analysis of the \Jorbit\j suggest the object is likely to whiz by at a safe 900 000 km from \JEarth\j, more than twice as far away as the moon.\p
The problem arose because the "risk" had been assessed on a fraction of the object's \Jorbit\j. The announcement urged other astronomers to train their sights on the object over the following few weeks, before it fades away as it gets further from the \Jearth\j. It will return again in early 2000, and yet again at the end of October 2002, when the asteroid passes to within 10 million km of \JEarth\j, and the intention was to ensure that the \Jorbit\j was firmly nailed down by alerting other astronomers to note and record its path.\p
XF11 is bright enough that a good \Jtelescope\j can track it over most of its \Jorbit\j, and this is how the problem could be dismissed so quickly. Given partial information, astronomers can calculate an "error ellipsoid", a region of space where the asteroid is likely to be during its nearest pass on 26 October 2028. Given further information, the error ellipsoid can be refined and shrunk.\p
The new data came from images of the asteroid taken at the Mt. Palomar Observatory in 1990 and pulled from \Jarchives\j by Eleanor Helin at JPL when the excitement levels were getting extreme. These allowed Brian Marsden of the Harvard-Smithsonian Center for Astrophysics in Cambridge, \JMassachusetts\j, to refine his prediction, and while his original ellipsoid showed a possibility of the \Jorbit\j overlapping the \Jearth\j, the chances of this are now assessed as nil.\p
Some astronomers have criticised the whole affair, suggesting that there is a risk of being like the boy who cried "wolf".\p
#
"Water on the moon - once again!",544,0,0,0
(Mar '98)
The on-again-off-again story of moon \Jwater\j featured in these files several times last year. In early March, NASA revealed that they have found convincing evidence of \Jwater\j at the lunar \Jpoles\j, after analysing data from the first mission to explore the moon in 25 years.\p
The find, made by the "dirt-cheap" NASA mission, called Lunar Prospector, promises a practically limitless supply of \Jwater\j that could drastically reduce the costs of lunar colonisation.\p
The theory is that the moon has been pelted by icy comets for billions of years. Much of this ice would melt and vaporise as its kinetic \Jenergy\j (motion \Jenergy\j) is converted to heat, and most of the remainder would be boiled away by the moon's intense midday surface \Jtemperature\j of 120°C.\p
Except at the \Jpoles\j, that is, where many of the craters provide areas which are permanently shaded from the \Jsun\j, where any \Jwater\j vapour, drifting in the moon's low gravity, would settle down as ice. Like the \Jearth\j's \Jpoles\j, the moon's \Jpoles\j get very little sunlight, so they are very cold.\p
Space-based radar probes on the Clementine space craft appeared to reveal tell-tale signs of ice, but later \Jearth\j-based radar studies failed to confirm that there was ice at the moon's \Jpoles\j.\p
Lunar Prospector used a neutron spectrometer which recorded neutrons produced by high-\Jenergy\j cosmic rays striking the lunar surface. When these neutrons bounce off a \Jhydrogen\j atom, they slow, and this slowing down is taken to be a sign of the presence of \Jwater\j, since there are unlikely to be any other \Jhydrogen\j-bearing molecules close to the lunar surface.\p
The estimates are still a little rough, but something between 0.5% and 1% of the lunar soil around the \Jpoles\j seems to be made up of fine ice crystals. In total, depending on the assumptions you make, there could be 10 million tonnes of \Jwater\j, or perhaps 300 million tonnes.\p
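The spread from 10 million to 300 million tonnes is mostly about what you assume for the icy area, the depth of icy soil, and the ice fraction. A rough Python sketch (every input below is an assumption, including the soil density):\p
def ice_tonnes(area_km2, depth_m, ice_fraction, soil_density=1800.0):
    # soil_density in kg per cubic metre (assumed); result in tonnes of ice.
    soil_kg = area_km2 * 1e6 * depth_m * soil_density
    return soil_kg * ice_fraction / 1000.0

print(f"{ice_tonnes(2_000, 0.5, 0.005):.1e}")    # lean case: ~9e6 tonnes
print(f"{ice_tonnes(15_000, 1.0, 0.01):.1e}")    # generous case: ~2.7e8 tonnes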
And what would the \Jwater\j be used for? The moon's gravity is weaker than the \Jearth\j's, so if a settlement could be established, \Jelectrolysis\j could provide extra oxygen to breathe, while the melted ice would provide \Jwater\j for hydroponics. Most importantly, solar-powered \Jelectrolysis\j would provide \Jhydrogen\j and oxygen, excellent rocket fuels, allowing low-cost space exploration from the low gravity of the moon.\p
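The \Jelectrolysis\j arithmetic is fixed by the chemistry: 2H\D2\dO gives 2H\D2\d + O\D2\d, so every tonne of \Jwater\j splits into about 111 kg of \Jhydrogen\j and 889 kg of oxygen. In Python:\p
def split_water(tonnes_water):
    # Molar masses: H2O = 18, of which H2 contributes 2 and O 16.
    hydrogen = tonnes_water * 2.0 / 18.0
    oxygen = tonnes_water * 16.0 / 18.0
    return hydrogen, oxygen

print(split_water(1.0))    # (0.111..., 0.888...) tonnes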
#
"Astronomical record set - from earth",545,0,0,0
(Mar '98)
These days, when we expect all the best discoveries to come from the Hubble Space \JTelescope\j, it is something of a surprise when we find discoveries being made from the ground. Now comes a universe-expanding discovery, due to appear in the \IAstrophysical Journal Letters\i soon.\p
A galaxy called 0140+326RD1 (or RD1 for short), discovered at the world's largest \Jtelescope\j, the 10-metre Keck on Mauna Kea, lies so far out in the expanding universe that the wavelengths of its light have been stretched more than sixfold. It has a \Jredshift\j of 5.34, compared to 4.92 for the old record-holder.\p
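The "sixfold" figure follows directly from the definition of \Jredshift\j: an observed wavelength is (1 + z) times the emitted one. For example, the ultraviolet Lyman-alpha line of \Jhydrogen\j, emitted at 121.6 nanometres, arrives stretched to around 771 nanometres:\p
z = 5.34

def observed_nm(emitted_nm):
    # Cosmological redshift: wavelengths stretch by a factor of (1 + z).
    return (1.0 + z) * emitted_nm

print(observed_nm(121.6))    # ~771 nm, at the far red edge of the visible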
The light we see from the galaxy right now left the galaxy when it was just 6% of its present age, about 820 million years after the Big Bang. It is 90 million light years further away than the previous most distant object, at an estimated 12.22 billion light years distant, based on the reasonable assumption, questioned by some astronomers, that the universe is about 13 billion years old.\p
Johns Hopkins University astronomer Arjun Dey and colleagues from Hopkins, the University of \JCalifornia\j at Berkeley, and the Keck \Jtelescope\j found the new record-holder last December following a systematic search for distant galaxies. They found RD1 while looking at another nearby galaxy with a \Jredshift\j of 4. RD1 was obviously more redshifted than this, but it took a 10-hour observation to gather enough light to confirm its age.\p
The galaxy, which is more than a hundred million times fainter than the faintest star visible to the naked eye, remains too faint for astronomers to gather a full spectrum, so the nature of its stars and gas will remain a mystery for now. It is bright in the ultraviolet range, indicating newly-formed stars.\p
The \Jredshift\j record may not last long. At the Keck \Jtelescope\j, a group from the University of Hawaii and the Institute of \JAstronomy\j in Cambridge, England, is now reporting that they have picked up a spectral line, but so far, no image, of a galaxy at a \Jredshift\j of 5.64.\p
\BKey names\b: Arjun Dey, Hyron Spinrad, Daniel Stern, James R. Graham and Frederic H. Chaffee. An image showing the area of the sky containing the galaxy can be downloaded at the following Web address: http://www.jhu.edu/news_info/news/home98/mar98/images/stellar.gif\p
#
"Strange pulsar found",546,0,0,0
(Mar '98)
Colleen Wilson, a NASA scientist, will be reporting in \IThe Astrophysical Journal\i in June on her recent discovery of a pulsar that pulses twice in each of its "years". The accreting x-ray pulsar, GRO J2058+42, appears to burst in x-rays twice each time it circles its primary star. The \Jaccretion\j-powered pulsars are pulsing stars that burst in x-rays and gamma rays as they gobble gas in a disk from a larger parent star.\p
Pulsars were first found in 1967, when radio astronomers discovered several objects that emitted radio waves with clock-like precision. They were soon identified as rapidly rotating neutron stars with intense magnetic fields. But where the radio pulsars have the regularity of a precision watch, \Jaccretion\j pulsars are like cheap alarm clocks that easily gain and lose time, and go off when you least expect it, says Wilson.\p
Since the launch of BATSE, the Burst and Transient Source Experiment aboard the Compton Gamma Ray Observatory, in April 1991, a number of new x-ray pulsars have been found. Wilson was reviewing BATSE data in September 1995, when she found a burst that registered 140 milliCrabs. That is, it was 140/1,000ths of the brightness of the Crab Nebula, which astrophysicists use as a standard \Jcandle\j.\p
Using a computer to fold the data on itself, Wilson found that the source repeated every 198 seconds, an indication of a massive, compact object spinning at high speed. The fact that it repeated, \Jday\j after \Jday\j, indicated that it was a real object.\p
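That "folding" step is a standard pulsar-hunting technique called epoch folding: bin the photon counts by phase at a trial period, and a genuine periodic source piles up into a pulse profile instead of averaging out. A minimal Python sketch with a synthetic 198-second signal (the sampling and burst shape are invented):\p
def fold(times, counts, period, nbins=20):
    # Accumulate counts into phase bins at the trial period.
    profile = [0.0] * nbins
    for t, c in zip(times, counts):
        phase = (t % period) / period
        profile[int(phase * nbins) % nbins] += c
    return profile

times = range(0, 20_000, 10)                              # one sample every 10 s
counts = [1.0 + (0.5 if t % 198 < 20 else 0.0) for t in times]
profile = fold(times, counts, 198.0)
print(max(profile) / min(profile) > 1.2)                  # True: the pulse stands out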
The name GRO J2058+42 tells us that it was discovered on the Gamma Ray Observatory, and that it lies about 20 hrs, 58 minutes along the celestial equator, and 42 degrees north. The object shows bursts every 54 days, but every second burst is brighter, a most unusual occurrence.\p
Wilson speculates that GRO J2058+42 is a binary star system composed of a type Be star (a type B star with emission lines), about 8 to 15 times the mass of our \Jsun\j, with a neutron star in a lopsided \Jorbit\j. On this picture, variable amounts of material are "eaten up" by the pulsar as it continues its \Jorbit\j, which would explain why alternate bursts differ in brightness.\p
The answer will probably come when the object is visually sighted. For now, the "error box", the smallest part of the sky that can be drawn with a certainty of including the object, is too large to search. It is now down to about 4 arc minutes across, about an eighth of the diameter of the moon, as we see it from \Jearth\j, but that is still quite a large area for an optical \Jtelescope\j to search. It is, believes Wilson, around 23 to 50 thousand light years away.\p
#
"National Science Board awards",547,0,0,0
(Mar '98)
The US National Science Board's first awards for contributions to public understanding of science and \Jengineering\j, to be presented in May, were announced during March. The winners: Jane Goodall, for her "inspirational and dignified" primate studies, and the "renowned and standard-setting" Public Broadcasting Service NOVA \Jtelevision\j series.\p
David Perlman, chairman of the selection committee for this year's award, cited Goodall's "lifetime of work communicating the results of her research to the broadest possible publics," and Goodall's "international network of institutions to encourage the participation of youngsters and adults in the scientific enterprise."\p
PBS' NOVA series, now nearing 500 programs, was described as "a bright beacon lighting our way to understanding science and technology." He said, "NOVA set the standard for giving us insights into how science is done and what drives those who do it."\p
#
"The Top 10 scientific papers for 1997",548,0,0,0
(Mar '98)
One measure of scientific importance much favoured by social scientists is \B\1citation analysis\b\c, a simplistic count of how many times other scientists have referred to a particular paper. In selecting the "top ten" for 1997, the question is "how often was this paper cited in other scientific papers published this year?"\p
According to the Institute for Scientific Information in Philadelphia, apoptosis, or programmed cell death, was the most cited topic in scientific papers during 1997, even beating Dolly the sheep. All ten of the "top ten" gained an advantage from being published in either January or February, as did all the remainder of the "top forty".\p
In a fairer test, the most cited researcher (gauged by the number of highly cited papers over a 2-year period) was geneticist Ronald M. Evans, a Howard Hughes investigator at the Salk Institute, with six papers.\p
Apoptosis featured in three papers and cell death in one, while ataxia, presenilins, neuron survival, Dolly, oncogenes and the BRCA1 (breast cancer) gene accounted for the rest (see the "shorts" below).\p
#
"BRCA1 screening not a general need",549,0,0,0
(Mar '98)
One of the most exciting discoveries of the decade, the BRCA1 gene which leaves women at risk of breast cancer, now appears to be less common than people had thought. There is still a good case, say researchers, for screening women whose families have a history of cases of both breast and ovarian cancer, or where at least four cases of breast cancer have been identified. Widespread screening of the general population would not, they suggest, be a good use of medical resources.\p
#
"And a control gene for BRCA1",550,0,0,0
(Mar '98)
Researchers at The Wistar Institute in Philadelphia have identified a new gene, called BAP1, associated with breast and lung cancer development, according to a report in \IOncogene\i at the start of the month. This gene encodes an \Jenzyme\j that helps to regulate levels of BRCA1. When a faulty BRCA1 gene is inherited, a woman's chances of developing breast and/or ovarian cancer are greater than 80% during her lifetime.\p
Laboratory studies show that BRCA1 and BAP1 form a complex in the cell that controls BRCA1's activities, including its deterioration. The Wistar investigators have also learned that, like BRCA1, the BAP1 gene is a cancer gene. Mutations of BAP1 have been found in non-small-cell lung cancers. It appears that both genes are involved in \Jtumour\j suppression, so that the genes "cause" a cancer by failing to stop it.\p
#
"Human genome",551,0,0,0
(Mar '98)
After comparing notes at a conference of gene sequencers in February, the researchers have concluded that \Jchromosome\j 22 will become the first human \Jchromosome\j to be completely sequenced, perhaps as early as the third quarter of 1998.\p
#
"How lithium works?",552,0,0,0
(Mar '98)
\JLithium\j has been used to treat manic depression for almost fifty years, but nobody has ever known why. According to a report in the \IProceedings of the National Academy of Sciences \iin early March, it may be that the drug protects nerve cells from being stimulated to death. Overstimulation is also a feature of other diseases, such as Huntington's and Parkinson's, so the new results offer hope that \Jlithium\j might be useful in preventing the early stages of these maladies as well.\p
\JLithium\j seems to prevent or reduce the number of manic attacks, in which sufferers become hyperactive and delusional. The researchers began with the suspicion that the \Jlithium\j somehow protects nerve cell triggers called NMDA \Jreceptors\j from damage by the \Jneurotransmitter\j glutamate. To test this, they applied toxic amounts of glutamate to three cultures of nerve cells, and found that without \Jlithium\j, about half the nerve cells died in 24 hours, as against just 10% when \Jlithium\j was added.\p
#
"Calm clams?",553,0,0,0
(Mar '98)
The logic of feeding the popular anti-depressant, Prozac, to shellfish may seem a little elusive at first glance. A researcher in America has been doing exactly this, and with good reason, giving Prozac to fingernail clams and zebra mussels.\p
One of the problems in culturing shellfish is to get all of the animals to spawn at the same time, so as to get a crop of animals all of the same size at the end of the growth period. One solution is to use high concentrations of a natural substance, \Jserotonin\j, which causes the shellfish to produce eggs and sperm at the same time. The only problem: \Jserotonin\j is rather expensive in the concentrations that are needed.\p
\JSerotonin\j is used by nerve cells when they communicate with each other. Antidepressants such as Prozac, Luvox, and Paxil block the molecular mopping up of \Jserotonin\j, leaving more of the compound available for transmitting neural messages.\p
In humans, \Jserotonin\j regulates behaviours such as appetite, sleep, arousal, and depression. In shellfish, it seems to trigger spawning - and the antidepressants do the job at concentrations between a hundred thousand and a million times lower than the \Jserotonin\j levels needed to obtain the same effect.\p
#
"Is alcohol good for you?",554,0,0,0
(Mar '98)
The argument goes on. In a Los Angeles study on \Jatherosclerosis\j, data from 577 symptom-free utility workers aged 40 to 60 show a "significant" relation between total alcohol intake and fractionally less thickness of the wall of the large carotid artery in the neck - but only in women. So in one group, at least, alcohol seems to have a role in reducing the risk of \Jatherosclerosis\j, though curiously, wine seemed to give the greatest benefit.\p
We say curiously, because the Society of \JToxicology\j's annual meeting in Seattle was told last month that beer seems to have some useful side-effects. It seems that several compounds in hops, the dried flowers which give beer its bitter taste, slow the growth of cancer cells in test tubes and boost the action of a cancer-fighting \Jenzyme\j called quinone reductase.\p
Nine flavonoids were isolated from hops, and tested on various cultures, where a number of them proved to have a significant effect on the cancerous cells while leaving the normal cells alone. This is a long way from a cure, of course, but the researchers note that the flavonoids are similar in chemical structure to many other suspected cancer-preventing chemicals in plants, such as genistein, a substance in soy products that may protect women in Asia against breast cancer.\p
As the researchers from Oregon State University move on to animal tests, they have already applied for patents on the compounds. You may need to drink quite a few beers to get a significant effect, say the researchers, but capsules of the flavonoids are a distinct possibility in the future.\p
#
"Seeing the shores",555,0,0,0
(Mar '98)
In many parts of the world, midden heaps, piles of seashells left by hungry humans, dot the shores. Along \JDelaware\j's Cape Henlopen, there are many such middens, dating back a thousand years or more, most of them buried in salt marshes by the shore.\p
These middens are a treasure trove for archaeologists, providing pottery fragments and stone tools as well as the shells, but how do you find out where to dig? The answer, according to one American researcher, is to use Ground Penetrating Radar (GPR) to locate the middens.\p
Archaeologists have previously used GPR to examine hidden underground building foundations, and geologists use the same equipment to identify subterranean features such as \Jaquifer\j sands, but GPR signals penetrate salt \Jwater\j poorly, so researchers had avoided the technique in coastal marsh areas because nobody thought it would work. Now they know better.\p
The GPR system includes a transmitter, a receiver and a power source, all carried into the field in backpacks. A signal is sent into the ground and reflects back to the receiver. The best find so far: a shell midden, 8 metres (25 feet) below the surface. The researcher: William J. Chadwick, doctoral candidate at the University of \JDelaware\j, presenting to the American Geological Society.\p
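Turning a radar echo into a depth is the simple part: depth is wave speed times two-way travel time, divided by two. The 0.06 metres per nanosecond below is a typical figure for wet sediment, assumed for illustration, since the real speed varies strongly with \Jwater\j content:\p
def depth_m(two_way_time_ns, v_m_per_ns=0.06):
    # The pulse travels down and back, hence the division by two.
    return v_m_per_ns * two_way_time_ns / 2.0

print(depth_m(267.0))    # ~8 m, about the depth of the deepest midden found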
#
"Walking, climbing wheelchair",556,0,0,0
(Mar '98)
A standard wheelchair, passing along a smooth path in a suburban park, can be blocked by something as minor as a \Jwater\j hose, because the force required to cross the hose is greater than the wheelchair rider can manage.\p
Goats and spiders have no such problems, and Vijay Kumar, a biomedical engineer at the University of \JPennsylvania\j, has drawn on studies of both animals to design and patent an all-terrain wheelchair that can climb steps up to 12 inches (30 cm) high and amble over obstacles. Descriptions of the new chair were released onto the World Wide Web during March.\p
The prototype vehicle has powered rear wheels and two robotic arms which can anchor the chair like crutches or ski \Jpoles\j, pull it from the front, or push it from behind. To climb a stair, which must be wider than the vehicle, the arms pull the wheelchair up and over the raised ledge then rotate behind the device to push the rear end up and onto the elevated step.\p
#
"Seeing a new energy source on the shores",557,0,0,0
(Mar '98)
Gas hydrates, deposits of frozen \Jmethane\j gas under the world's oceans, are now being seriously looked at as a source of cheap clean \Jenergy\j. \JMethane\j hydrate has an \Jenergy\j density (volume of \Jmethane\j at standard conditions per volume of rock) that can be as much as five times greater than conventional sources of \Jnatural gas\j. The potential of this \Jenergy\j source is exciting to researchers and especially to countries that are not rich in oil and gas resources. There is probably 3000 times as much \Jmethane\j trapped in these deposits as exists in our world's \Jatmosphere\j.\p
According to one researcher, these unusual mixtures of frozen gas occur either in the permafrost zones of the polar regions or along continental margins in \Jwater\j depths below about 500 metres. They have been identified in all the world's oceans, like "a ring around a bathtub". Estimates suggest that the deposits represent more gas than has ever been extracted from, or identified in, conventional reservoirs, and people are starting to look at them seriously, not only as an \Jenergy\j source but as a clean one.\p
#
"Metallic glass",558,0,0,0
(Mar '98)
To scientists, a glass is any material that can be cooled from a liquid to a solid without crystallising. Most metals crystallise as they cool, arranging their atoms into a highly regular spatial pattern called a lattice. But if crystallisation does not happen, and the atoms settle into a nearly random arrangement, the final form is a metallic glass.\p
In a strict sense, a glass is a liquid, because of the random arrangement of its particles, but like window glass, most other glasses give us a feeling of satisfying solidity. Yet a metallic glass has some wonderful properties that can make it ideal for electric transformers, golf clubs and other products.\p
Todd Hufnagel, a Johns Hopkins University researcher, is trying to produce new metallic glasses in bulk form with superior strength, elasticity and magnetic properties. Unlike window glass, metallic glasses are not transparent, but their unusual atomic structure gives them distinctive mechanical and magnetic properties. The metallic glass is not brittle like window glass, and it does not bend out of shape like ordinary crystalline metal.\p
Even ordinary window glass makes a good spring, but according to Hufnagel, "If you rank materials for how springy they are, metallic glasses are off the chart, they're far and away better than anything else out there."\p
Hufnagel's aim is to create a new metallic glass that will remain solid and not crystallise at higher temperatures, making it useful for engine parts. And (hush!) the new metallic glass may also have military applications in armour-piercing projectiles. Most crystalline metal projectiles flatten into a mushroom shape upon impact, but Hufnagel believes the sides of a metallic glass head will shear away on impact, essentially sharpening the point and providing more effective penetration.\p
Perhaps the next century will see us agreeing sagely that those who live in stone houses should not throw bottles - not if they are made of metallic glass.\p
For more information on this topic, see Hufnagel's Home Page, located at \1http://www.jhu.edu/~matsci/people/faculty/hufnagel/hufnagel.html\c\p
#
"Writers watch out!",559,0,0,0
(Mar '98)
Brutus.1, the world's most advanced story generator, can generate short stories based on the notions of deception, evil and, to some extent, voyeurism. One such example, a story called \IBetrayal\i, is already on the Web. But if Brutus.1 is to generate stories outside the concept of betrayal, researchers would need to define mathematically other literary themes such as unrequited love, revenge, jealousy, and patricide.\p
The proud parent of Brutus.1, Selmer Bringsjord, associate professor of philosophy, \Jpsychology\j, and cognitive science and director of the Minds and Machines program at Rensselaer \JPolytechnic\j Institute, says that in the future, the entertainment industry will rely on such artificially intelligent systems.\p
Cynics might say that happened years ago, but so far, science writers seem to be safe.\p
For more information about Brutus.1, see Bringsjord's web site \1http://www.rpi.edu/~brings\c and for \IBetrayal\i and other stories by Brutus.1, visit \1http://www.rpi.edu/dept/ppcs/BRUTUS/brutus.html\c.\p
#
"When there is no 'r' in the month",560,0,0,0
(Mar '98)
The northern hemisphere rule of thumb for foods which go "off" easily is to avoid them in the hotter months - May, June, July and August, the months without an 'r' - but this rule is of no use at all in the southern hemisphere, and not a great deal of use in the north either. Often the best answer is to rely on people's noses, though these are notoriously unreliable as testing instruments.\p
Three University of \JFlorida\j scientists, Murat Balaban, Diego Luzuriaga, and Maurice Marshall, are now testing highly accurate electronic noses that sniff out fishy seafood before it gets to the consumer. The "nose" detects minute amounts of odour, telling quickly which food is good, and which is bad. The alternative is to do \Jbacteria\j counts which can take days.\p
The odour detectors, now widely used in Europe, are computerised tabletop units with sensors that detect odour molecules. They are also being used to find \Jbacteria\j in wounds, inspect toxic waste sites and check the quality of wine and \Jcoffee\j.\p
But how do you validate such a system? The researchers "trained" a nose to mimic judgments that food inspectors make. In 43 tests on good and bad shrimp, the electronic nose was in perfect agreement with Food and Drug Administration inspectors who visited their campus.\p
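In software terms, a validation of this kind is just a comparison of two sets of verdicts. A minimal Python sketch of the book-keeping (the labels below are invented for illustration, not the Florida data):\p

  # Compare an electronic nose's verdicts with inspectors' verdicts
  # and report the percentage agreement. Labels are invented.

  nose      = ["good", "bad", "good", "good", "bad"]
  inspector = ["good", "bad", "good", "good", "bad"]

  matches = sum(n == i for n, i in zip(nose, inspector))
  print(f"Agreement: {matches}/{len(nose)} = {matches/len(nose):.0%}")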
The main problem will be validating the "nose" in the law courts, where a machine is unable to give evidence in any satisfactory way, so that a disreputable company could tie the machines up in legal argument. A more likely immediate use is with reputable companies, who could use the "nose" to make decisions about which seafood to process, and which to sell fresh.\p
#
"R Leonis, the incredible shrinking swelling star",561,0,0,0
(Mar '98)
A star called R Leonis has been accused of "outrageous behaviour" because it has changed its diameter by up to 35% over a period of a year. One of a group of stars called Mira variables, R Leonis will eventually lose most of its mass and turn into a white dwarf. It brightens and dims on a year-long schedule, and has now been under close scrutiny for two years.\p
R Leonis is 300 light years away, too far for normal telescopes to resolve its disc, but a clever new Cambridge instrument, called COAST, for Cambridge Optical Aperture Synthesis \JTelescope\j, captures light with four small mirrors, each just 16 centimetres across, spaced as much as 6 metres apart. This simulates a six-metre \Jtelescope\j which sees right through atmospheric distortion.\p
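For readers who like numbers, the gain from aperture synthesis is easy to estimate: an interferometer's angular resolution is roughly the observing wave-length divided by the mirror separation (the baseline). A minimal Python sketch, assuming a typical red-light wave-length of 700 nanometres (our assumption, not a COAST specification):\p

  import math

  # Angular resolution ~ wavelength / baseline, in radians.
  # The 700 nm wavelength is an assumed, typical visible value.
  wavelength = 700e-9      # metres
  baseline = 6.0           # metres between the outermost mirrors
  single_mirror = 0.16     # metres, one 16 cm mirror alone

  rad_to_arcsec = 180 / math.pi * 3600
  print(f"Synthesised: {wavelength / baseline * rad_to_arcsec:.3f} arcsec")
  print(f"One mirror:  {wavelength / single_mirror * rad_to_arcsec:.3f} arcsec")
  # The 6 m baseline resolves detail nearly 40 times finer
  # than any one of its 16 cm mirrors could on its own.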
The diameter of the star swells from 450 times that of the \Jsun\j up to 600 times the \Jsun\j's diameter. One explanation is that the star is unstable, with the compact form trapping radiation inside the star, and the expanded form allowing \Jenergy\j to flow out faster than it is formed.\p
#
"Two new tests for Carpal Tunnel Syndrome",562,0,0,0
(Mar '98)
Carpal tunnel syndrome (CTS) is a subject to raise strong emotions. To some, CTS is an excuse on the part of employees to obtain compensation. To others, especially self-employed science writers, CTS is a factor to be kept in mind daily, and to be kept at bay by sensible exercises, sensible posture, and an understanding of basic \Jergonomics\j.\p
But how do you test somebody who claims to have CTS, sometimes also called RSI (repetitive strain injury)? According to a recent issue of the \IAmerican Journal of Physical Medicine and Rehabilitation\i, the best preliminary tests physicians can use to determine the presence of CTS are the square-shaped wrist and abductor pollicis brevis (or thenar) weakness tests.\p
The thenar weakness test detected the presence of CTS in 66% of 142 hands with CTS. The square-shaped wrist test was correct in 69% of the cases. The alternative method, the nerve \Jconduction\j study, or NCS, is accurate in about 90% of CTS cases. The NCS test uses electrical signals to gauge how fast nerve impulses move through the median nerve of the hand, but it costs several hundred dollars to run on a patient.\p
The thenar test assesses any weakness of the thenar muscles, which are located in the palm of the hand. Patients place their thumb and small finger together while the physician pushes on the thumb. If the patient shows weakness, the sign is considered positive for CTS.\p
In the square-shaped wrist test, a physician uses a caliper to determine if a wrist is more square or rectangular. It seems that having a square-shaped wrist is a major predisposing factor for CTS.\p
#
"Alzheimer's disease can be diagnosed in the very early stages",563,0,0,0
(Mar '98)
A large-scale study of aging people has shown that the signs of developing Alzheimer's Disease can be detected two years earlier than was previously thought. Even small changes associated with the disorder can be distinguished from the memory changes that occur with normal aging.\p
Early diagnosis will become more important as new drugs are found to treat the condition. Just as importantly, other problems which produce similar symptoms, such as \Jhypothyroidism\j and depression, are treatable, and need to be ruled out. Then again, if people know that they are developing Alzheimer's disease, they will be able to make decisions and plans about their future.\p
The study, reported in the March issue of \IArchives of \JNeurology\j\i, involved 224 patients who were at some later stage diagnosed with Alzheimer's. Some of them had been followed for as long as 16 years before they died, somewhere between 46 and 106 years of age. Each patient was examined yearly and classified as having Alzheimer's or being cognitively healthy. The diagnoses were based on videotaped interviews with patients, relatives and friends. The patients were rated on memory, orientation, judgment and problem solving, cognitive functioning at home and in the community, and their ability to undertake personal care.\p
Autopsy results confirmed 93% of the 207 positive diagnoses, including those of 17 people in the very mild stage of the disease. Most of the remaining 7% turned out to have rare degenerative diseases of the brain that can also produce \Jdementia\j; one person showed no signs of Alzheimer's at autopsy.\p
#
"The secret of the Stradivarius - it's all in the chemistry",564,0,0,0
(Mar '98)
According to a \JTexas\j professor of chemistry who is also an amateur violin maker, the secret of the magnificent tones of violins made by the 17th and 18th century northern Italian violin artisans lies in brine-soaked wood.\p
At the end of March, Dr. Joseph Nagyvary showed the national meeting of the American Chemical Society in \JDallas\j what can be done with the right treatment of the wood. He believes that a key factor in the vibrant tonal quality of the instruments made by Stradivari and others is wood soaked in brine, and he has made chemically-treated violins which have drawn favourable comment from expert violinists, among them international concert violinists Elizabeth Matesky and Zina Schiff.\p
#
"April, 1998 Science Review",565,0,0,0
\JGetting more wheat\j
\JFreeze-concentrated milk\j
\JThe end of the CCD?\j
\JA new use for bagasse\j
\JHydrogen gets cheaper\j
\JPortable power gets cheaper\j
\JA waste of energy...\j
\JA saving of time...\j
\JFroth and bubble\j
\JCD players to get larger?\j
\JStable collagen\j
\JHow bacteria work together\j
\JRAP and Staph\j
\JTaking another tack\j
\JA possible vaccine against tuberculosis\j
\JEat up your greens...\j
\JThe chicken that laid the golden eggs?\j
\JBreast cancer screening\j
\JAspirin and heart attacks\j
\JA place in the sun...\j
\JSponges in the news\j
\JTea with milk, anybody?\j
\JGene therapy in the news\j
\JThe revenge of the non-mutant ulcer bacteria\j
\JMarking by computer\j
\JOff their rockers?\j
\JThe spread of the mammals\j
\JAstronomy in the Sahara?\j
\JHow Lucy walked\j
\JEarlier human speech?\j
\JExplaining grandmothers\j
\JMeasuring old earthquakes\j
\JExtending the seismic net\j
\JA weather index for ordinary people\j
\JA few hot years\j
\JGrauer's gorillas still there\j
\JComing soon to a clean environment near you...\j
\JNew sensor for bacteria\j
\JStop mowing!\j
\JEinstein proved right, even when space-time is seriously curved\j
\JGamma ray burst interpreted\j
\JBlack hole?\j
\JOrion's clouds\j
\JBirth of a planet?\j
\JInternational Space Station problems\j
\JLarsen B Ice Shelf loses 200 square kilometers\j
#
"Getting more wheat",566,0,0,0
(Apr '98)
For some years now, the maximum yield of wheat from even the best strains under the best conditions has been no more than 12 tonnes per hectare (almost 5 tons per acre). During April, a new strain of wheat was announced which is able to deliver 18 tonnes per hectare (7.5 tons per acre). Unfortunately, the new wheat breed requires a great deal of \Jfertilizer\j, and this could mean higher levels of \Jphosphates\j and nitrates in rivers and ground \Jwater\j.\p
The strain is the result of twenty years' intensive work on a wide range of wheat varieties, aimed at improving every part of the plant, but so far, the strain lacks disease resistance. Adding resistance genes may take as much as five years, but after that, the new wheat should be available for use on the world's farms - or at least those farms where the farmers can afford to buy the necessary chemical fertilizers.\p
The developers say that a new method of cultivation called "bed planting" may be the answer. This method requires 30% less chemical \Jfertilizer\j for the same yields.\p
#
"Freeze-concentrated milk",567,0,0,0
(Apr '98)
Dried milk has a distinctive taste, often described as a burnt or "heated" taste. While dried milk is much easier to transport and store, the unpleasant taste transfers across into foods made from the dried milk. According to connoisseurs, ice cream made from dried milk and mixed with chlorinated \Jwater\j is one of the least memorable tastes in the culinary world.\p
So any method of milk preservation which avoids the burnt taste has a lot going for it. David Barbano, a food science professor at Cornell University in the USA, has been removing the \Jwater\j from milk using a process similar to that used in making "ice-filtered" beer.\p
Milk is about 90% \Jwater\j, and as it flows through the system, the \Jwater\j is converted to ice crystals which can then be scraped away and removed, until only the milk solids remain. The process uses more electricity than heat drying, but it runs continuously, where heat systems need to be stopped regularly for cleaning, taking them out of operation for up to six hours a \Jday\j. The freeze-concentration method can run for as much as thirty days, and causes fewer problems with bacterial contamination.\p
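The underlying mass balance is simple: the solids stay behind while the \Jwater\j leaves as ice. A small Python sketch of the arithmetic (the 40% target concentration is an arbitrary example, not Barbano's figure):\p

  # Mass balance for freeze-concentrating milk.
  # Milk is ~90% water; the target solids fraction below is an
  # arbitrary illustrative choice.

  def ice_to_remove(milk_kg, solids_frac=0.10, target_frac=0.40):
      """Kilograms of water to freeze out of `milk_kg` of milk so
      the remaining concentrate has `target_frac` solids."""
      solids = milk_kg * solids_frac
      concentrate = solids / target_frac
      return milk_kg - concentrate

  # Concentrating 100 kg of milk to 40% solids means
  # freezing out 75 kg of water as ice.
  print(ice_to_remove(100))  # -> 75.0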
Not surprisingly, the development of this new method has been funded by the machine's manufacturers and an electricity supply company.\p
#
"The end of the CCD?",568,0,0,0
(Apr '98)
Conventional digital cameras use a charge-coupled device or CCD to capture an analog image which is then converted, pixel by pixel, into a digital image, a string of binary codes that can be used to reproduce an image. The CCD is basically a set of detectors, one after another, each catching the light of a small part of an image, turning it into a number value which can be used to build up a digital image.\p
In 1997, the digital image industry sold more than US$1.5 billion of camcorders, cameras, scanners and printers, all including CCDs more powerful than those which, as recently as 1990, were regarded as munitions, objects which could only be exported from the US after specific government approval had been given. By 2001, the industry is expected to sell more than US$2.1 billion worth of equipment using CCDs.\p
The CCD has a number of drawbacks. It is comparatively slow, because the pixels are read off one after another, and it lacks the dynamic range to handle both bright daylight and dim indoors in the same shot. Now hope is on the way, in the form of a CMOS chip which performs both digital capture and image processing, all in one operation.\p
The work, referred to as the Stanford programmable digital camera program, has funding from a number of industry leaders. While the Stanford researchers have already taken out four patents on different aspects of pixel-level processing, information on the process remains a little sketchy.\p
They have revealed that the new imaging chip is made from CMOS, the same technology used to make low-power computer chips. This allows engineers to combine the imaging sensors with computer circuitry, reducing the chip count and cutting production costs. Perhaps more importantly, the pixels from the CMOS chips can be read out in parallel, row by row, rather than pixel by pixel, giving a much faster reaction time.\p
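The speed advantage of row-parallel readout can be seen with a back-of-envelope model: serial readout pays a per-pixel cost for every pixel, while row-parallel readout pays a similar cost only once per row. A toy Python comparison (the timing figures are illustrative assumptions, not Stanford's numbers):\p

  # Toy model: serial (CCD-style) vs row-parallel (CMOS-style) readout.
  # The per-read times below are illustrative assumptions.

  rows, cols = 480, 640      # a modest late-1990s sensor
  t_pixel = 100e-9           # assumed 100 ns per serial pixel read
  t_row = 100e-9             # assumed 100 ns per parallel row read

  serial_time = rows * cols * t_pixel
  parallel_time = rows * t_row

  print(f"Serial:   {serial_time * 1e3:.2f} ms per frame")
  print(f"Parallel: {parallel_time * 1e3:.3f} ms per frame")
  # Reading whole rows at once is faster here by a factor of `cols`.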
The team also sees an application for the chip in specialized situations such as face recognition systems in automatic teller machines. The chip may be programmed, for example, to capture aspects of a customer's face, so as to provide positive identification.\p
The idea of a camera-on-a-chip has excited some observers to speculate on a future where cameras are as common as a notebook and pencil are now, where a school child might have a shirt button which was also a camera, able to capture images as necessary during the \Jday\j. But first, the new chips have to make it into commercial production, and bring an end to the charge-coupled device.\p
#
"A new use for bagasse",569,0,0,0
(Apr '98)
The stem of a sugar cane is mainly made of sugar, \Jwater\j, and fiber. When the cane is crushed in a sugar mill, the \Jwater\j and sugar are taken out, and the cane fiber is left over. This coarse fiber, known as bagasse, has always been a disposal problem for sugar mills. Solutions have varied from drying the bagasse to provide a fuel which can partly power the mills, to turning the fiber into a board with limited use as a building material.\p
Two American researchers announced a new use in April: turning the bagasse into a coarse mat that can be used in erosion control. Louisiana produces a great deal of sugar - and hence a great deal of bagasse - each year, and the wife-and-husband research team of Billie and John Collier felt it was time to do something with it.\p
Their solution is to treat the fibers with a strongly alkaline solution, which partly breaks down the lignin in the fibers, to make a sort of sticky glue that binds the rest of the fibers together. The result is a coarse cheap fabric that requires no other materials in its manufacture, which can be laid on embankments and earthworks, complete with plant seeds to give the ground a permanent cover.\p
Best of all, the fibers can be baled, shipped to the site, mixed with \Jwater\j, and applied with a high pressure hose, where they will form a secure mat within 45 minutes. Right now, sugar mill owners around the world usually have to pay to dispose of their bagasse. If this scheme takes off, they would be able to take a small profit from the waste material instead.\p
#
"Hydrogen gets cheaper",570,0,0,0
(Apr '98)
Solar cells provide clean \Jenergy\j, but fairly obviously, they are of little use at night. The theoretical answer to that problem has always been to store the solar \Jenergy\j generated during the daylight hours. Old-fashioned lead-acid accumulators, like car batteries, might be used, but these are expensive to buy and maintain, so other forms of storage are being actively pursued.\p
Two main storage solutions are usually put forward: pumping \Jwater\j uphill to a storage dam and later using it to generate hydro-electricity, or using the electricity to produce \Jhydrogen\j which can later be burnt to produce \Jenergy\j to generate electricity.\p
The problem with the \Jwater\j solution has always been finding a suitable high place to store the \Jwater\j, while the \Jhydrogen\j path has had a serious problem: getting a sufficient amount of \Jhydrogen\j for the money spent on setting up the system.\p
A report in \IScience\i this month describes a new solar \Jwater\j splitter with nearly twice the \Jhydrogen\j output previously recorded. The standard \Jwater\j-splitting method that we all met at school, \Jelectrolysis\j, depends on electricity generated mainly by burning \Jfossil\j fuels; the new method is far more direct.\p
Solar \Jwater\j splitters work with the help of semiconductor-based solar cells which absorb photons, creating mobile electrical charges which are then channeled to \Jwater\j-splitting electrodes. Then the only problem is to get splitters which can create electrical charges with just the right amount of \Jenergy\j for the process, but which are also good at absorbing sunlight - usually, the splitters which fill the bill on one count miss out on the other.\p
Two chemists, John Turner and Oscar Khaselev, have combined two separate semiconductor layers to get around this difficulty. The first layer is made from \Jgallium\j indium phosphide, which absorbs ultraviolet and visible light and produces electrons with just the right amount of \Jenergy\j needed to produce \Jhydrogen\j at one electrode. The other layer is made from \Jgallium\j arsenide which absorbs infrared light and creates mobile positive charges with the right amount of \Jenergy\j to produce oxygen at the other electrode. \p
The overall result is a sunlight-to-\Jhydrogen\j efficiency of nearly 12.5%, which is quite impressive, but only a start, according to the researchers, who are now looking for cheaper semiconductors to carry out the same task.\p
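To put that 12.5% figure in context, here is a rough estimate of what one square metre of such a splitter might produce in a day. The solar input and hours of sunshine are round-number assumptions; the energy content of hydrogen (about 142 megajoules per kilogram, higher heating value) is a standard figure:\p

  # Rough daily hydrogen yield of a 12.5%-efficient solar splitter.
  # Solar flux and sunshine hours are assumed round numbers.

  efficiency = 0.125
  solar_flux = 1000.0    # W per square metre, assumed full sun
  hours = 6.0            # assumed hours of good sun per day
  hydrogen_hhv = 142e6   # J per kg, standard higher heating value

  energy_in = solar_flux * hours * 3600        # J per m^2 per day
  hydrogen_kg = efficiency * energy_in / hydrogen_hhv
  print(f"{hydrogen_kg * 1000:.0f} g of hydrogen per m^2 per day")
  # -> roughly 19 g; modest, which is why cheaper cells matter.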
#
"Portable power gets cheaper",571,0,0,0
(Apr '98)
\JLithium\j batteries, as used in most laptop computers and cellular phones, may soon be cheaper. \JCobalt\j, the most expensive component in the batteries, can be replaced by \Jaluminium\j, one of the cheapest and most abundant elements on \JEarth\j.\p
An April report in \INature\i indicates that the batteries were designed from the ground up, using theoretical considerations - it is more common to begin with existing models, and work by trial and error. Electrical cells (strictly speaking, what we call a "battery" is usually a single cell - a battery is a string of linked cells) all work the same way. Electrons are generated at a negative electrode, flow through a circuit of some sort, and are taken up at a positive electrode.\p
The positive electrodes of "\Jlithium\j batteries" are made of an oxide of \Jlithium\j and \Jcobalt\j. \JCobalt\j is expensive, but scientists had previously assumed that only \Jcobalt\j or metals like it would fill the bill. Gerbrand Ceder and colleagues at the \JMassachusetts\j Institute of Technology used the principles of quantum mechanics to deduce that an \Jaluminium\j-\Jlithium\j oxide would be just as useful. They then made some of this material, tested it, and found that their predictions were fulfilled.\p
The team now believes that its breakthrough may even be important in developing electric cars, because the new batteries will be cheap enough to be used in large banks in suburban runabouts, rather than in expensive consumer items as they are now. One interesting aspect of the study is that the operation of the \Jcathode\j now appears not to depend so much on the \Jcobalt\j, as had been assumed. Rather, it is a function of the oxygen in the oxide, so there could be further developments in this area, since this theoretical work has revealed a whole new class of materials to explore.\p
The \Jcathode\j still contains a small amount of \Jcobalt\j: the pure \Jaluminium\j version, while providing a higher voltage as predicted, failed to conduct electricity. It was, say the researchers, all voltage and no current, but the addition of a small amount of \Jcobalt\j restored the \Jconductivity\j.\p
\JLithium\j cells are favored because they have the highest charge density of any rechargeable battery: that is, they can deliver more power between charges than any other type of battery of the same size. Now, if the price can be reduced, a set of cells big enough to power a car could cost about US$20,000, although one charge would only take the car about 120 miles (200 kilometers). While this would usually satisfy most people's needs, most observers believe that a 200 mile/300 kilometer barrier will need to be beaten to gain public acceptance of electric cars.\p
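The 300 kilometer barrier is a matter of simple energy book-keeping. A small Python sketch, with an assumed consumption figure (our round number, not from the MIT work):\p

  # How much battery energy does a 300 km range demand?
  # The consumption figure is an assumed round number.

  consumption = 0.15     # kWh per km, assumed for a small electric car
  target_range = 300     # km, the barrier mentioned above

  pack_kwh = consumption * target_range
  print(f"Pack size needed: {pack_kwh:.0f} kWh")
  # -> 45 kWh; cutting the cost per kWh, not just the size,
  # is the real fight.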
The team involved in the development had to call in a wide range of other experts in areas such as \Jceramics\j (the \Jcathode\j is a ceramic material, and needs to have its properties "just right"). In the end, they were so successful in getting higher voltages that the \Jelectrolyte\j, the conducting material inside the cell, became unstable, requiring yet more experts.\p
Although it was not covered in the \INature\i report, Professor Anne Mayes at MIT has since produced a new \Jelectrolyte\j in the form of a polymer which is a solid. This is unusual, as most electrolytes are liquid. This property means batteries can now be made in almost any shape, rather than being made as containers designed to hold a liquid. The research group has even speculated that batteries could be made in the form of body panels, cutting down on the weight of an electric car, and improving its efficiency.\p
The MIT group has submitted a number of patent applications on its work to date. The next step: pushing the solid-state \Jlithium\j battery further, and searching for better \Janode\j materials for the battery.\p
#
"A waste of energy...",572,0,0,0
(Apr '98)
\JNatural gas\j is normally found, along with crude oil, in oil wells. Under ideal conditions, the gas can be trapped, treated, transported and sold, but at remote drilling sites, especially on drilling platforms in the sea, collecting and transporting the gas is just not worthwhile, so it is burnt off. Gas is expensive to handle because the \Jenergy\j content in a given volume is low; liquids, which pack much more \Jenergy\j into the same volume, are far cheaper to move.\p
So why not convert the \Jmethane\j in the gas to methanol, a liquid, and ship it away in that form? This is easy enough to do: at about 625° \JCelsius\j (1,150° Fahrenheit), the \Jmethane\j will convert to methanol in the presence of oxygen, but at that \Jtemperature\j, it will also burn, producing \Jwater\j and carbon dioxide.\p
The answer is to use a catalyst, a chemical which remains unchanged while helping the reaction to take place at lower temperatures. A report in \IScience\i at the end of April indicates that just such a catalyst has been found. The first attempts produced suitable catalysts, organic compounds laced with metal, which were able to convert about 2% of the \Jmethane\j. In 1993, a mercury-based catalyst reached a level of 40% conversion, but mercury is toxic, and should be avoided wherever possible, so the search continued.\p
The latest catalyst, described as "based on \Jplatinum\j", converts 70% of the \Jmethane\j to methyl bisulfate, which can then be easily converted to methanol in the laboratory. The only question is whether or not the second conversion can be managed in an industrial situation. If it can, the catalyst will provide a new \Jenergy\j source, save on greenhouse emissions, and also remove a threat to seabirds, which are sometimes attracted to, and burnt by, the flares on marine oil rigs.\p
#
"A saving of time...",573,0,0,0
(Apr '98)
Testing catalysts is a slow and difficult business. Two weeks before the methanol catalyst story broke, \IScience\i carried a report on the use of combinatorial chemistry to take the pain out of catalyst development and testing.\p
The method will allow technicians to screen thousands of catalysts simultaneously, and remarkably quickly. The purpose of a catalyst is to take an existing \Jchemical reaction\j, and make it happen faster, or produce a greater yield, or in a few cases, to make the reaction happen at all. \p
Combinatorial chemistry was developed by Mario Geysen of Glaxo Wellcome as a way of discovering new drugs, and the method is now used by all of the major drug companies. As it is used here, the method relies on polymer beads to which chemicals can be attached, and a thermographic video camera, a camera which detects heat by measuring the infrared radiation coming off the beads.\p
The idea is to attach different catalysts or combinations of catalysts to different beads, and then tip all of these into one container, along with the reactants which are to be combined. Where the best catalysts are found, there will be more \Jchemical reaction\js, and that means more heat. More heat means brighter beads when the container is examined with a thermographic camera. So a technician watching the video screen can use a fine needle to draw out the brightest beads, which can then be analyzed to see which catalyst combinations they carried. The researchers are then able to confirm their findings with standard chemical methods.\p
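In computational terms, the thermographic step is a "pick the hottest" selection over an image. A minimal Python sketch (the grid of bead temperatures is invented for illustration):\p

  # Pick the warmest beads from a thermographic frame.
  # The temperature grid is invented for illustration.

  frame = [
      [21.0, 21.2, 24.8],
      [21.1, 27.3, 21.0],
      [21.4, 21.1, 22.9],
  ]

  # Flatten to (temperature, row, column) and keep the top three.
  readings = [(t, r, c) for r, row in enumerate(frame)
                        for c, t in enumerate(row)]
  hottest = sorted(readings, reverse=True)[:3]
  for t, r, c in hottest:
      print(f"Bead at ({r}, {c}): {t:.1f} degrees - worth analysing")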
#
"Froth and bubble",574,0,0,0
(Apr '98)
A novel method of separating and purifying chemicals, using air bubbles, has just been announced by Dr Dan Armstrong, inventor of the Chirobiotic column, used to purify drugs.\p
Many organic chemicals such as drugs come in two mirror-image forms, and the two forms are often quite different in their effects, so there is a lot to be said for separating the two forms in many cases. In a press release issued on the \JInternet\j during April, Armstrong's university, the University of Missouri-Rolla, described a recent experiment which led eventually to a patent on the separation of chiral (mirror-image) forms by bubbling air through a solution containing both forms.\p
Oil-like organic molecules are attracted to air bubbles, so they tend to be carried to the surface along with the bubbles. The Chirobiotic column, he says, relies on a special class of \Jantibiotics\j to drag out impurities that might make drugs more dangerous or less effective. That makes sound chemical sense, but the same cannot be said for the new line of inquiry, at least not as it was first described.\p
"We wondered if bubbles could be used to separate these mirror image -- or left- and right-handed - compounds," Armstrong says. "No one had ever tried it before. We didn't have a proposal or funding. We were just curious to see if it would work."\p
Well work it did, though not perfectly. The apparatus was a piece of "frit" (porous pot) at the bottom of a glass tube packed with glass beads and filled with a solution which contained a chiral foaming agent in a 50-50 mix, through which air was then bubbled. When the top \Jfoam\j was analyzed, one enantiomer (chiral form) was present in a concentrated form.\p
This is a remarkably cheap process, and Armstrong points out that it does not need to deliver a 100% result on the first pass - it can be repeated time and time again, producing a slight improvement with each pass.\p
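The power of repetition is easy to quantify: if each pass enriches the favored form by some modest factor, the ratio compounds geometrically. A small Python sketch, assuming an illustrative per-pass enrichment factor of 1.2 (not a measured value):\p

  # Enrichment of one enantiomer over repeated foam passes.
  # The per-pass enrichment factor is an illustrative assumption.

  def ratio_after(passes, per_pass=1.2, start_ratio=1.0):
      """Ratio of favored to unfavored enantiomer after `passes`
      passes, starting from a 50-50 (ratio 1.0) mixture."""
      return start_ratio * per_pass ** passes

  for n in (1, 5, 10, 20):
      r = ratio_after(n)
      purity = r / (1 + r)
      print(f"{n:2d} passes: ratio {r:5.1f}, purity {purity:.0%}")
  # Even a slight edge per pass builds to high purity eventually.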
The report raises a curious problem: there is absolutely nothing about bubbles that would make them lock onto molecules of just one chirality, so we contacted Armstrong's research group to find out what else was going on. They confirmed that there is indeed no way for a bubble (alone) to discriminate between enantiomeric molecules. Bubbles don't have to do so if one dissolves a chiral surface active "collector molecule" in solution. The chiral surfactant both attaches to the bubble and interacts differentially with chiral molecules, they said.\p
Some of the chiral collectors that also have surface activity are: octyl-beta-cyclodextrin, 2,6-dimethylcyclodextrin heptylproline, dodecylproline, digitonin and vancomycin. The researchers say they are now working on small solid particle collectors that also stick to bubbles.\p
#
"CD players to get larger?",575,0,0,0
(Apr '98)
The amount of information which can be crammed onto a CD or a DVD depends on one crucial feature: the wave-length of the \Jlaser\j used to read the disc. Typically operating in the red or near-infrared region of the spectrum, these domestic lasers represent one of the main uses for lasers in the world today.\p
As the wave-length gets shorter, the bits of information can be packed closer together, so more information can fit on a disc. Until late in 1996, the world's shortest wave-length from a tunable \Jlaser\j came from the OK-4 klystron, an ultraviolet free-\Jelectron\j \Jlaser\j in \JRussia\j which delivered bursts of light at 240 nanometers. The record was then narrowly taken by a similar Japanese device, but the Russian \Jlaser\j, now situated in the United States, has achieved a new record of 236 nanometers.\p
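The scaling is quadratic: a shorter wave-length shrinks the readable spot in both directions across the disc surface, so the achievable data density rises roughly as one over the wave-length squared. A short Python comparison (780 nanometres is the standard CD laser wave-length):\p

  # Areal data density scales roughly as 1 / wavelength^2.

  cd_laser = 780.0     # nm, standard CD infrared semiconductor laser
  fel_record = 236.0   # nm, the OK-4 free-electron laser record

  gain = (cd_laser / fel_record) ** 2
  print(f"Potential density gain: about {gain:.0f}x")  # -> about 11x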
Electrons provide the \Jenergy\j for all lasers, storing \Jenergy\j and then releasing it suddenly, but in normal lasers, the electrons are held around atomic nuclei. As a result, the laws of quantum physics limit the electrons to emitting \Jenergy\j only at a few specified wave-lengths which depend on the atom being used.\p
Free-\Jelectron\j lasers generate their light by perturbing beams of free electrons, not associated with atoms at all, and because of this freedom, the \Jlaser\j can be tuned to different wave-lengths. The OK-4 works by passing electrons through magnetic fields to make them emit light, which is then bounced up and down the length of an "optical cavity".\p
But don't count on seeing one of these in your home just yet: the central component of this \Jlaser\j is a cavity 173 feet (52 meters) in length. Maybe the CD players will need to be a bit bigger next year? For now, we may have to be satisfied with seeing the \Jlaser\j used in a range of scientific experiments and medical applications. Your CD player will need to make do with a small infrared semiconductor \Jlaser\j with lower information densities.\p
#
"Stable collagen",576,0,0,0
(Apr '98)
Collagen is a protein which is found in bone, cartilage, skin, and tendons. It is collagen fibers which give our bodies their structure and shape, but scientists have long puzzled over how the material can be made more stable.\p
In the ordinary world, we know collagen as the material used in cosmetic surgery, where collagen from \Jcattle\j can be used to give fuller lips and wrinkle-free skin, but these improvements are lost as the collagen breaks down. More importantly, many serious illnesses are caused by collagen breaking down, conditions like \Jarthritis\j, \Jrheumatism\j, brittle bones, lupus, \Jcirrhosis\j, and cataracts.\p
A report in \INature\i during April describes a way of looking at collagen which may expand its potential in treating serious disease, healing wounds, and repairing damaged organs. The author, Ron Raines, says: "We have essentially shown the way to create a stronger collagen that would not be as susceptible to breakdown in the body. This research marks a fundamental change in how we understand the structure and stability of collagen."\p
There are about ten types of collagen in the human body, each formed from chains of amino acids arranged into helixes. The links formed between hydroxyl groups by "bridging \Jwater\j molecules" appear to hold the triple helix together. The material appears to unravel at the molecular level when these links are broken.\p
Raines replaced the hydroxyl groups in collagen with \Jfluorine\j, producing a much more stable form. He believes that the molecules in his new form of collagen are holding themselves together with electrostatic forces.\p
The standard measure of collagen stability is its "melting point", the \Jtemperature\j at which the strands begin to unravel. Normal collagen forms are stable at up to 58° \JCelsius\j (136°F). The \Jfluorine\j-laced collagen in Raines' lab remained stable at up to 91°C (196°F), making it much more stable at body temperatures.\p
Raines plans now to see whether natural collagen can be stabilized in a similar way. The \Jfluorine\j treatment is not feasible, but the experiments are at least pointing in the right direction. One interesting aspect may be in treating \Jarthritis\j, where the body's own immune system attacks collagen and destroys it: Raines suggests that the process might be triggered when collagen begins to unfold.\p
#
"How bacteria work together",577,0,0,0
(Apr '98)
\JBacteria\j existing together in a structure called biofilm are often more resistant to attack by \Jantibiotics\j and the immune system than they are as individual cells. A report in \IScience\i during April describes the way in which one bacterium, \IPseudomonas aeruginosa\i, sends a chemical message that is required to create the biofilm.\p
\JBacteria\j may be individuals, but they are also very successful at forming communities. A bacterium finds a surface, settles on it, and begins to reproduce and spread. At a certain density, the cells build a complex biofilm structure with built-in \Jwater\j tunnels to carry \Jnutrition\j into the cells and carry the waste out. This process, referred to as differentiation, is under the control of the newly-discovered chemical signal.\p
The signal is made of two different molecules, both of which turn on the genetic machinery that produces bacterial toxins, but only one molecule is responsible for the cell-to-cell instruction to the \Jbacteria\j to build the biofilm. When the signal gene is removed from \Jbacteria\j, they fail to form biofilm, and when the signal is added, they once again form biofilm.\p
#
"RAP and Staph",578,0,0,0
(Apr '98)
Around the world, the dangerous pathogen \IStaphylococcus aureus\i is now acquiring resistance to the antibiotic of last resort, vancomycin, and as more cases arise, researchers are looking at other ways of controlling the bacterium which causes infections ranging from skin abscesses to toxic shock syndrome.\p
A report in \IScience\i in April indicates that a protein in the bacterium called RAP is responsible for controlling the production of toxins and other proteins that make the bacterium pathogenic. More importantly, the researchers find they can reduce the effects of \IS. aureus\i infections in mice by controlling the activity of RAP. The future, they suggest, may be made secure by developing drugs which stop RAP from doing its job, or by developing vaccines to make our immune systems attack RAP. In either case, the defense would not attack the \Jbacteria\j, and so should not trigger another round of evolutionary counter-action from the \Jbacteria\j as the "weaker" forms are selected out of existence.\p
RAP was discovered some years ago by Naomi Balaban, an infectious-disease researcher. It seems that when RAP levels reach a critical concentration, it sends a signal to trigger the production of toxins. In an extension of that work, Balaban and her colleagues have been inoculating mice with purified RAP. The immune response takes some weeks to develop, but people at risk could be treated before they go into \IS. aureus\i "hot zones," mostly found inside today's hospitals.\p
#
"Taking another tack",579,0,0,0
(Apr '98)
An alternative approach to antibiotic resistance may be to go after some other mechanism in the bacterium. In particular, researchers are looking at compounds which inhibit "two-component signal transduction systems," systems which help \Jbacteria\j to get established in the hostile environment which is a new host.\p
\JBacteria\j produce chemicals known as "virulence factors," which help the bacterium to survive and grow at the site of the infection. If any one of these can be attacked, it may offer the chance to tip the odds in favor of the host (us) overcoming an infection which could otherwise kill us. The factors may be attacked by \Jmutation\j, antibody neutralization (an immune reaction which might be set up by \Jvaccination\j), or chemical inhibition, which is how \Jantibiotics\j inhibit \Jbacteria\j today.\p
The two-component signal transduction systems are the only common regulatory elements shared by a wide range of virulence systems. This means there is a possibility that a single broad spectrum inhibitor to such elements may suppress virulence in a variety of microorganisms. A compound known only as RWJ-49815, and its derivatives, were reported in April as the first series of inhibitors of two-component systems with demonstrated bactericidal activity against a broad range of microorganisms. This may be a topic to keep an eye on.\p
#
"A possible vaccine against tuberculosis",580,0,0,0
(Apr '98)
TB kills 3 million people each year, more than any other infectious disease, it shows a high level of antibiotic resistance, and it looks set to infect half a billion people by the year 2050. While the BCG vaccine is of some use in protecting children, it seems to have little effect in protecting adults.\p
BCG (bacille Calmette-Guérin) was first prepared in 1927 by attenuating \IMycobacterium bovis\i by more than 230 passages on \Jbile\j \Jglycerol\j \Jagar\j. Resistance to this species of bacterium gives children protection against the related TB germ. Now scientists at the Max Planck Institute for Infection \JBiology\j, Berlin and at the University Clinics of Ulm say they have developed a likely vaccine candidate to work directly against \IMycobacterium tuberculosis\i, which actually causes TB.\p
A report in the \IProceedings of the National Academy of Sciences\i at the end of April explains that protection against tuberculosis depends on T cell-mediated immunity. Two types of T cell react to the TB bug: major histocompatibility complex (MHC) class II-restricted CD4 T helper cells and MHC class I-restricted cytotoxic CD8 T cells. These two types are not affected by the BCG vaccine.\p
In simple terms, they have used recombinant BCG strains which carry genes from \IListeria monocytogenes\i. These genes cause the BCG to stimulate the CD8 T cells, leading to the r-BCG strains becoming potential tuberculosis vaccines with greater power than anything tried previously.\p
#
"Eat up your greens...",581,0,0,0
(Apr '98)
Most vaccines are at least partly made of proteins and related compounds, making it difficult to take the treatment by swallowing it, as our unintelligent digestive systems take the vaccines for food, and break them down into small bits which are probably nutritious, but of little use in fighting disease.\p
Late in April, in \INature Medicine\i, researchers supported by the National Institute of \JAllergy\j and Infectious Diseases (NIAID) described for the first time an edible vaccine which can safely trigger significant immune responses in people. Such drugs have the potential to be very important in parts of the world where clean needles are too costly, and where storage of injected vaccines can cause major problems (most vaccines need refrigeration).\p
Trials involved volunteers eating bite-sized pieces (50 grams/2 ounces or 100 grams/4 ounces) of \Jpotato\j which had been genetically engineered to produce part of the toxin secreted by the \IEscherichia coli\i bacterium, which causes diarrhoea. The transgenic potatoes had previously been shown to stimulate strong immune responses in animals. The trial enrolled 14 healthy adults, and 11 were chosen at random to receive the genetically engineered potatoes while three received pieces of ordinary \Jpotato\j. \p
The investigators periodically collected blood and stool samples from the volunteers to evaluate the vaccine's ability to stimulate both systemic (blood) and intestinal immune responses. Ten of the 11 volunteers (91 percent) who ate the transgenic potatoes had fourfold rises in serum \Jantibodies\j at some point after \Jimmunization\j, and six of the 11 (55 percent) developed fourfold rises in intestinal \Jantibodies\j. The potatoes were well tolerated and no one experienced serious adverse side effects during the three-week trial.\p
Other edible vaccines are on the way, with bananas and potatoes being conscripted in the fight against Norwalk virus, a common cause of diarrhoea. Potatoes (once again) and tomatoes may in the future help protect us against hepatitis B. The other plants may be regarded as good news, since raw \Jpotato\j is not all that much fun to eat. In the trial, it had to be peeled to avoid any reaction from the volunteers to chemicals which may be present in the \Jpotato\j skin.\p
The gene which was inserted into the \Jpotato\j is similar to a troublesome gene in the \Jcholera\j germ, pointing to a potentially important future for this technology. Bananas take longer to start producing fruit, but are a favorite with children. As one of the research team commented, "Which would you rather be vaccinated with - a needle or a banana?"\p
#
"The chicken that laid the golden eggs?",582,0,0,0
(Apr '98)
Another interesting idea which gained mention in the press during April involves getting drugs from transgenic chickens - or rather, from the early embryo cells in eggs laid by transgenic fowls. Special genes are inserted into embryonic cells which are then placed in another embryo in another egg. Later, the proteins made by the descendants of the altered cell can be harvested, either from the chicken, from eggs, or from their blood.\p
The secret lies in the way the target cells are cultured, so as to slow the rate at which they mature and develop into more complex forms, a process called differentiation.\p
#
"Breast cancer screening",583,0,0,0
(Apr '98)
A review of mammogram techniques shows that over a decade of annual screenings, half of all American women will get at least one false-positive response, with almost 20% undergoing a \Jbiopsy\j. The results were published in the \INew England Journal of Medicine\i, early in April. \p
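The cumulative figure is a straightforward consequence of modest per-test error rates. If each screening carries a false-positive probability p, and the years are treated as independent (a simplifying assumption), the chance of at least one false alarm in ten screenings is 1 - (1 - p) raised to the tenth power. A small Python sketch:\p

  # Cumulative false-positive risk over repeated annual screenings,
  # assuming each year's result is independent (a simplification).

  def cumulative_risk(per_test, screenings=10):
      return 1 - (1 - per_test) ** screenings

  # A per-test false-positive rate near 7% is enough to give half
  # of all women at least one false alarm within a decade.
  for p in (0.03, 0.05, 0.07):
      print(f"p = {p:.0%}: ten-year risk {cumulative_risk(p):.0%}")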
There will always be a trade-off between false positives and false negatives, where somebody with cancer is told they are in the clear. The overall level of false responses can be reduced by making the test more reliable (and that usually means more expensive) but in the end, a cut-off has to be set, and that needs to take into account the likely effects of false responses of either sort. A false positive means some unnecessary worrying, a false negative could mean somebody dying.\p
Mammograms were not the only source of false positives: clinical examinations produced about half as many as mammograms did. Overall, for every $100 spent for screening, an additional $33 was spent to evaluate the false-positive results.\p
False-positive results are far less common in two comparable countries, Sweden and \JAustralia\j, but no clear reasons for the differences have been put forward: it could relate to better training of the people reviewing the mammograms, or it may relate to a greater fear of being sued for a fatal error in the USA. In either case, it would seem important for American women undergoing screening to be told in advance about the high level of false positives, and the need to test further when an abnormality is found.\p
\BKey names\b: Joann G. Elmore, Mary Barton, Suzanne Fletcher, Philip Arena, Victoria Moceri and Sarah Polk.\p
#
"Aspirin and heart attacks",584,0,0,0
(Apr '98)
A report in \IThe Lancet\i during April offers an explanation for the protection aspirin seems to give against heart attack. The effect appears to relate to a specific genetic factor present on the surface of clotting cells called \Jplatelets\j.\p
Somewhere between 25% and 50% of the population can expect to reduce their risk of a heart attack by taking daily doses of aspirin, but the limitation of the effect to just this group has puzzled researchers. Now it seems that aspirin may work specifically in patients who carry an altered gene, the PlA2 polymorphism, which affects the protective action of aspirin.\p
Genes change in two main ways: by mutations and by polymorphisms. A \Jmutation\j is a gross change which alters the order of amino acids in a protein, which then alters the shape of the final molecule, changing what it can or cannot do. In most cases, the mutated form is unable to operate. A polymorphism produces a change which does not seriously alter the final shape of the folded protein molecule, so that it can still carry out its normal function. Occasionally, a polymorphism may even turn out to be good for the people carrying it.\p
Heart attacks arise when blood begins to clot, and this happens when special \Jreceptors\j on the surface of blood \Jplatelets\j bind to fibrinogen, making the \Jplatelets\j "sticky." If the \Jplatelets\j are not sticky, clots do not develop, and heart attacks do not happen. The effect of aspirin seems to be tied in with stopping the process of binding to fibrinogen.\p
The frequency of the PlA2 polymorphism in the general population is similar to that of the aspirin effect, and the blood \Jplatelets\j of people with the PlA2 polymorphism are 10 times more sensitive to the effect of aspirin than are the \Jplatelets\j of individuals who do not have the polymorphism.\p
This might mean "gene-typing" patients in the future: those who lack the PlA2 polymorphism but still have problems with heart attacks might be given one of several other anti-clotting drugs, such as warfarin.\p
#
"A place in the sun...",585,0,0,0
(Apr '98)
In the USA, old people flock to \JFlorida\j. In Britain, many retired people head for \JSpain\j. \JAustralia\j's elderly head for the warmth of \JQueensland\j. Now it seems that rats have a similar strategy. American researchers claim that geriatric rats instinctively ward off sickness by huddling in hot spots. They suggest that ongoing studies of rats' behavior may suggest drug-free strategies to help older people fight infections.\p
Like many older people, elderly rats develop limited or no fevers when they have an infection. High fevers may be dangerous to very young children, but raising the body \Jtemperature\j appears to be a powerful weapon in the immune system's arsenal. In most cases, fever helps the body combat dangerous pathogens.\p
\BKey names:\b Evelyn Satinoff, Maria Florez-Duquet, Elizabeth D. Peloso, University of \JDelaware\j.\p
#
"Sponges in the news",586,0,0,0
(Apr '98)
Sponges are among the simplest of all multi-celled animals. While humans have hundreds of different tissue types and even more cell types, sponges have just a few different cell types, and these do not separate out into specialized tissues, or groups of similar cells. In fact, you can strain the separated cells of a sponge through a cloth, and the cells will recombine into a sponge again on the other side.\p
Dividing cells use "motors" called kinesins to move the separated chromosomes to opposite ends of the cell as the nucleus divides. The kinesin "motor" can be disabled by a chemical produced by a sponge from the western Pacific, according to a report in \IScience\i during April, based on work carried out in \JCalifornia\j.\p
Nobody knows very much about the kinesins, or how they act, but all cancerous cells have to divide, so researchers think it possible that the chemical, adociasulfate-2, or AS-2 for short, may help to combat cancers.\p
Meanwhile, \JFlorida\j researchers have signed an agreement with a Swiss pharmaceutical company to market discodermolide, which comes from the sponge \IDiscodermia dissoluta\i, first identified in 1987. The action of the chemical appears to be similar in its effects to those of Taxol.\p
#
"Tea with milk, anybody?",587,0,0,0
(Apr '98)
\JAustralia\j has the highest rate of skin cancer in the world. Two out of three Australians develop some form of skin cancer during their lifetime. This is probably due to a mixture of causes: \JAustralia\j has a large population of pale-skinned people living in or close to the tropics, and the Australian enthusiasm for all sorts of sport leads to high levels of ultraviolet exposure.\p
There may, however, have been one saving factor - at least in the past, when most Australians drank tea every \Jday\j. Tea seems to provide protection against skin cancer, at least if you are a mouse, and so long as you drink it with milk. Mice under this treatment experienced a reduction in the development of skin cancer of 50 per cent and a reduction in the development of papillomas of 70 per cent when exposed to UV A+B.\p
Tea is a rich source of special \Jantioxidants\j called flavonoids, considered to be some of the most potent \Jantioxidants\j in nature, and these are assumed to lie behind the protective effect. The experimental mice drank only tea with 10% milk, while control groups were given 10% milk in \Jwater\j and plain \Jwater\j.\p
Skin cancers were less common in the past, when Australians drank far more black tea, usually with milk, so these results are important: they show that milk does not bind the active ingredient in the tea, whatever it is. The results were presented at an International Symposium on Tea and Health, hosted by CSIRO and supported by the Lipton Tea Centre in Sydney.\p
#
"Gene therapy in the news",588,0,0,0
(Apr '98)
A report in the \IProceedings of the National Academy of Sciences\i indicates that \Jarthritis\j in rabbits can be treated by gene therapy. \p
Arthritic joints become inflamed when immune cells cause nearby tissues to release two molecular signals, IL-1 and TNF-\fa\n, which set off a round of swelling, tissue degradation, and pain. The search has been on to find a way of stopping just these signals, without causing broad immune suppression.\p
The trick was to alter adenoviruses to carry genes for soluble \Jreceptors\j for IL-1 and TNF-\fa\n, and then to inject these into the knees of arthritic rabbits. Four treatments were compared: a control group received a gene for an equivalent but inactive protein, two experimental groups received one or the other of the receptor genes, and a final group received both.\p
The rabbits which received IL-1 alone or with TNF-\fa\n showed a marked improvement, even in their uninjected knee. A further test with an adenovirus carrying the luciferase gene, which makes a protein that glows, showed that the genes were being carried across in the rabbits' white blood cells.\p
Mice with a disease like multiple sclerosis also have a better outlook from gene therapy. Researchers were able to insert a gene coding for anti-inflammatory proteins, known as suppressor cytokines, into immune system cells that naturally home in on inflamed tissues, using specially inactivated retroviruses as the delivery vehicle. The viruses are able to survive and carry genes, but they cannot reproduce.\p
The immune cells then began churning out the \Jinflammation\j-fighting proteins where they were most needed. Mice treated with the tailored viruses showed less severe symptoms of the MS-like disease than untreated mice, raising hopes that the technique might also be useful in treating rheumatoid \Jarthritis\j and diabetes.\p
The study focused on mice with a disease known as experimental autoimmune encephalomyelitis, which serves as an animal model for MS. In both diseases, the immune system turns against the very tissues it was designed to protect, attacking a protein called \Jmyelin\j that insulates nerves. The nerve impulses then go awry, causing impaired vision and motor control and leading to a gradual decline in function that ends in death.\p
The most exciting aspect is that retroviruses permanently integrate themselves into the DNA, so they make ideal vehicles for carrying genes into the body. An additional plus: the researchers were able to monitor the amount of suppressor cytokine being produced by tagging the inserted gene with a green fluorescent "marker." If the T-helper cells were bright green, that meant they were producing lots of the helpful protein. Knowing this enabled the researchers to calibrate the dosage levels and minimize toxicity.\p
\BKey name in the \Jrabbit\j research\b: Paul Robbins.\p
\BKey names in the mouse research\b: Garrison Fathman, Michael K. Shaw, Lawrence Steinman.\p
#
"The revenge of the non-mutant ulcer bacteria",589,0,0,0
(Apr '98)
In a report in the journal \IMolecular \JBiology\j\i during April, researchers Avery Goodman and Paul Hoffman reveal why the \Julcer\j-causing bacterium, \IHelicobacter pylori\i, is sensitive to metronidazole, and how it becomes resistant to this drug. The bacterium infects more than half the world's stomachs, and is a major early risk factor for \Jstomach\j cancer.\p
The researchers say that there is a danger in using the drug alone as a treatment to get rid of \IH. pylori\i, as metronidazole on its own can be converted by a bacterial \Jenzyme\j to hydroxylamine, a mutagen and cancer-causing chemical. Metronidazole is a generic drug sold as Flagyl, MetroGel and Protostat.\p
In the United States and Western Europe, between 10% and 30% of \IH. pylori\i strains are resistant to the drug, rising to 80% in some developing countries. The resistance arises from \Jmutation\j in a gene called rdxA. This gene codes for one of the nitroreductase enzymes that allow \IH. pylori\i to break down organic \Jnitrogen\j compounds. This \Jenzyme\j also happens to convert metronidazole to hydroxylamine, which damages DNA, proteins and other macromolecules and almost as an aside, kills the bacterium. When the rdxA gene is inactivated by \Jmutation\j, however, \IH. pylori\i cannot break down metronidazole and therefore becomes resistant.\p
So there is a small consolation, with a sting in it: if the bacterium can stand up to the drug you use in treatment, it will not be producing the cancer-causing hydroxylamine, but if the drug is able to take the bacterium out, the dying bacterium will have been dosing you with hydroxylamine along the way.\p
The research involved \Jcloning\j and sequencing rdxA, inserting the gene into \IEscherichia coli\i, which is normally metronidazole-resistant, making it sensitive to the drug, and also inserting extra copies of rdxA into \IH. pylori\i, to make it sensitive to the drug once more.\p
In many countries, the drug can be purchased very cheaply without a prescription, and it is usually used at doses that are insufficient to kill all the \IH. pylori\i cells a person might carry. Such a person would therefore accumulate resistant strains, selected by the drug, alongside sensitive strains which would go on making hydroxylamine in the \Jstomach\j.\p
It might be a good time to check your medicine cabinet.\p
#
"Marking by computer",590,0,0,0
(Apr '98)
Essay examinations are generally preferred by educators over objective tests which allow the answers to be "machine-marked." The essay question allows well-prepared people to show very clearly just how well they can put their knowledge together, but the reliability of a single marker on different occasions is not always good. With more and more students composing their answers on computers, it is now possible to consider marking essays by computer as well. \p
So would you be happy with a computer marking your essays? A group of American educational researchers say that you should. After ten years of development, they have announced a piece of software, the "Intelligent Essay Assessor" which has returned reliable results across a wide range of ages. The software is, they claim, the only method for scoring the knowledge content of essays which has been extensively tested and published in peer-reviewed journals. They describe the software as using mathematical analysis to arrive at an estimate of the quality of the knowledge expressed in the essays.\p
The software is also able to advise essay writers on ways to improve their work. In one trial, students were able to submit their work over the Web, get advice about its shortcomings, revise the essays, and resubmit them. According to the system's developers, the essays improved with each revision, and they claim that students preferred the computer over human markers.\p
The system is based on an artificial intelligence method called Latent Semantic Analysis, which needs around "twenty times the power of a normal PC" to do the complex analysis required. The principles are rather similar to those of a neural network. The software program is "fed" somewhere between fifty thousand and ten million words from online textbooks and other machine-readable sources, allowing the system to assign mathematical measures of similarity between words, so that synonyms can be identified and given equal credit.\p
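For readers who want a feel for how this works, here is a minimal sketch of the idea behind Latent Semantic Analysis, written in Python. It is emphatically not the Intelligent Essay Assessor's own code: the tiny training corpus, the variable names, and the bare-bones use of a truncated singular value decomposition are all illustrative assumptions.\p

  # Illustrative Latent Semantic Analysis sketch (not the Intelligent
  # Essay Assessor itself). Documents are mapped into a low-dimensional
  # "semantic space" via a truncated singular value decomposition, and
  # essays are compared there, so that words used in similar contexts
  # count as similar even when the exact vocabulary differs.
  import numpy as np

  training_texts = [  # toy stand-in for a 50,000+ word training corpus
      "the heart pumps blood through arteries and veins",
      "blood is carried from the heart in arteries",
      "neurons carry electrical signals in the brain",
      "the brain processes signals from nerve cells",
  ]
  vocab = sorted({w for t in training_texts for w in t.split()})
  index = {w: i for i, w in enumerate(vocab)}

  def term_vector(text):
      """Raw term-count vector over the training vocabulary."""
      v = np.zeros(len(vocab))
      for w in text.split():
          if w in index:
              v[index[w]] += 1.0
      return v

  # Term-document matrix, then truncated SVD keeping k semantic dimensions.
  A = np.column_stack([term_vector(t) for t in training_texts])
  U, s, Vt = np.linalg.svd(A, full_matrices=False)
  k = 2
  Uk = U[:, :k]  # maps term space into the reduced semantic space

  def semantic(text):
      return Uk.T @ term_vector(text)

  def similarity(a, b):
      """Cosine similarity in the reduced semantic space."""
      va, vb = semantic(a), semantic(b)
      return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-12))

  model_answer = "arteries carry blood away from the heart"
  essay = "blood leaves the heart through the arteries"
  print(f"similarity to model answer: {similarity(model_answer, essay):.2f}")

Grading against a specimen answer then amounts to ranking essays by a similarity score of this kind.\p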
The program can evaluate essays by taking a good statistical sample of essays already graded by humans, and then grading the remainder to fit the established pattern. Alternatively, the system can assess essays against a specially written specimen answer - a standard of perfection set by a faculty member - or it may be used only to provide advice to help the essay writers improve their work.\p
The system shows perfect mark-remark reliability, and correlates as closely with the performance of a human marker as two human markers do with each other. And can the "system" be beaten? Yes, say the developers: you can cheat the system by knowing the work really well, and writing a good essay.\p
#
"Off their rockers?",591,0,0,0
(Apr '98)
Elderly nursing home patients showing symptoms of \Jdementia\j can ease their anxiety and depression by rocking in a rocking chair for an hour or two a \Jday\j. So the stereotype of old people, happily rocking on a porch, seems to have some truth behind it after all.\p
The experimental study involved disabling the rocking mechanisms on some rocking chairs and monitoring patient behavior for six weeks, and then comparing this with a six-week period when the rocking mechanism was functioning again.\p
A number of behaviors reflecting distress lessened during the rocking period, and several patients asked for pain medication less often when the rockers were rocking. The most enthusiastic rockers also seemed to have a better sense of balance, an important consideration for old people whose bones have grown brittle.\p
So long as they don't smoke in their rockers, perhaps. Late in April, a paper to the American Academy of \JNeurology\j's 50th Anniversary Annual Meeting described how smokers may lose their cognitive abilities, such as remembering, thinking or perceiving, more rapidly than elderly non-smokers.\p
The study was based on four European population-based studies, and included 9223 non-demented people aged 65 and older. Twenty-two percent were current smokers, 36 percent were former smokers, and 42 percent had never smoked. They were tested twice, initially and approximately two years later, on functions important in daily life, including short-term memory, time and place orientation, attention, and calculation.\p
The results pointed to one conclusion: current smokers had a significantly larger decline than people who had stopped smoking and people who had never smoked, even when the data were adjusted to allow for other factors such as age, education, and history of stroke. Pointing in the same direction, cognitive decline for former smokers was slightly more rapid than for those who had never smoked, although this difference was not statistically significant, suggesting a weaker effect.\p
But is it the fault of the old people that they smoke? A study at the University of \JMichigan\j Medical School suggests that only about one-third of the teenagers who experiment with \Jtobacco\j go on to smoke regularly. There is now a strong suspicion that some people are "destined" to become smokers because they are inherently more sensitive to the effects of nicotine, particularly the pleasurable effects, than people who are not tempted to smoke again.\p
If the suspicion is well-founded, say the authors, then \Iany\i advertising that encourages teenagers to try that "first" \Jcigarette\j is incredibly dangerous. The researchers point also to the possibility that the tendency to find nicotine pleasant may be inherited - it is already known that smoking tends to run in families, but there is some argument about whether this is a genetic effect or an environmental effect. If the smoking tendency is linked to enjoying nicotine, this would support a genetic explanation. The research is detailed in a paper to be published in the April issue of the journal \IAddiction\i. The senior author was Ovide Pomerleau. \p
#
"The spread of the mammals",592,0,0,0
(Apr '98)
Most people know these days that there were mammals around when the dinosaurs ruled \JEarth\j, but these early mammals are mostly thought of as small timid things, keeping out of sight of the terrible lizards, and only coming into their own, spreading out to fill all the new niches, once the dinosaurs died out.\p
According to a report in \INature\i in late April, Sudhir Kumar and S. Blair Hedges have shown otherwise. Using a huge array of genetic sequences that have been determined for different \Jmammal\j groups, they believe they can show that the major groups of mammals emerged well before the \Jextinction\j of the dinosaurs. Instead of a sudden explosion of \Jevolution\j, powered by opportunism, 65 million years ago, they paint a picture of the modern orders of mammals first evolving as the continents were separating during the Cretaceous period, about 100 million years ago.\p
Fossils can never tell a story like this, because the \Jfossil\j record will always have gaps, and \Jfossil\j species, in any case, do not necessarily have descendants alive today. From the other point of view, every animal alive today had ancestors back then.\p
Kumar and Hedges worked with GenBank, the growing collection of gene sequences which is maintained by the US National Institutes of Health. Working through thousands of vertebrate gene sequences from hundreds of species, they selected a set of 658 genes from 207 vertebrate species which develop mutations at a constant rate over time, and then used this set to trace each group back to its origin. Unlike fossils, which often yield an underestimate of separation dates, the gene sequences "start the clock" just as soon as two populations separate.\p
The genetic clock was calibrated by using a widely accepted date, 310 million years ago, as the point at which birds and mammals split into separate lines of development. Some of their findings matched the \Jfossil\j record, but others were quite different. According to the fossils, mice and rats separated about ten million years ago, but the gene sequences say the two lines split some 41 million years ago, based on the evidence of 343 separate genes.\p
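The arithmetic of a molecular clock is simple, even if the underlying biology is not. The Python sketch below repeats the calibration step described above using invented sequence-difference figures; Kumar and Hedges' real analysis pooled 658 genes and applied statistical corrections well beyond this.\p

  # Toy molecular-clock calculation: calibrate a substitution rate from a
  # known divergence (birds and mammals split about 310 million years ago),
  # then date another split from its observed genetic distance. The 30% and
  # 8% difference figures below are invented for illustration only.
  CALIBRATION_AGE_MYR = 310.0   # widely accepted bird-mammal split
  calibration_distance = 0.30   # hypothetical fraction of sites differing

  # Under a constant clock, genetic distance accumulates linearly with time.
  rate_per_myr = calibration_distance / CALIBRATION_AGE_MYR

  observed_distance = 0.08      # hypothetical distance between two lineages
  split_age = observed_distance / rate_per_myr
  print(f"estimated split: {split_age:.0f} million years ago")  # ~83 Myr here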
#
"Astronomy in the Sahara?",593,0,0,0
(Apr '98)
A report in \INature\i, early in April, described an interesting set of stone constructions found in the Sahara Desert, at the Nabta Playa in southern \JEgypt\j. The alignment of the stones seems to indicate either a calendar or an astronomical observatory, together with what are being described as tombs for \Jcattle\j.\p
The \Jcattle\j were presumably sacrificed, and this leads archaeologists to believe that the stone structures were set up by a complex society with a hierarchical structure. The dead \Jcattle\j were buried in chambers lined with clay and covered with stones, which suggests sacrifice as the likely cause of the animals' deaths. This view is supported by finding articulated (that is, unbutchered) animal bodies at the site. If the animals had been killed for other reasons such as a feast, the parts would be unlikely to be found together.\p
There were a number of alignments of unshaped stones, and what may be \JEgypt\j's earliest astronomical measuring device, a "calendar circle," which appears to have been used to mark the summer \Jsolstice\j. The circle included four pairs of stones, each pair separated by a narrow gap. Two of the pairs point north-south, while the other two align with sunrise at the summer \Jsolstice\j, the time when the monsoon rains started.\p
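Checking such an alignment needs only a little spherical astronomy, since the direction of sunrise depends just on the sun's declination and the observer's latitude. The sketch below assumes a latitude of about 22.5° N for Nabta (a round figure adopted for this illustration) and ignores refraction and the height of the local horizon.\p

  # Rough check of a solstice alignment: azimuth of sunrise (degrees east
  # of true north) for a given latitude, ignoring atmospheric refraction
  # and the height of the local horizon.
  import math

  def sunrise_azimuth_deg(latitude_deg, declination_deg):
      lat = math.radians(latitude_deg)
      dec = math.radians(declination_deg)
      return math.degrees(math.acos(math.sin(dec) / math.cos(lat)))

  # Nabta Playa lies near the Tropic of Cancer, taken here as 22.5 degrees
  # north. The sun's declination at the June solstice is about +23.4 degrees.
  print(f"{sunrise_azimuth_deg(22.5, 23.4):.1f} degrees east of north")
  # About 64.5 degrees: a pair of stones framing this east-northeast
  # direction would catch the midsummer sunrise.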
The Nabta Playa is an ancient lake bed near the Tropic of Cancer, about 960 km (600 miles) south of \JCairo\j. The region was extremely arid until around 11,000 years ago, when the summer monsoons of Central \JAfrica\j moved northward. The increased rain allowed temporary lakes such as Nabta to form each year in geological depressions. The monsoons shifted southward again about 4800 years ago, again leaving the area with virtually no rainfall.\p
During the "wet period", the area probably had between 10 and 20 cm (4-8 inches) of rain each year. This would have been enough to support Nabta as a centre where widely separated groups could gather for various ceremonies. The estimated dates for the find are in the vicinity of 8100 to 7600 years before the present, well before the rise of the dynasties which would later build the pyramids.\p
#
"How Lucy walked",594,0,0,0
(Apr '98)
The average \Jfossil\j is just a few scraps of bone, mainly because dead bodies are usually eaten, crunched, and scattered by scavenging animals. This means the people who study fossils have to bring the full range of science to bear in order to understand what the living animals were like.\p
Early in April, the American Association for Physical \JAnthropology\j was told how \Jmathematics\j and \Jengineering\j can also be used to reconstruct the way that "Lucy" walked. Patricia Kramer says that \IAustralopithecus afarensis\i, the \Jhominid\j species known best from "Lucy," may have had short legs, but on her analysis, these hominids walked with greater ease and efficiency than was previously believed.\p
Conventional wisdom has it that our longer legs evolved because they were more efficient for walking, but Kramer believes that Lucy's legs were excellent walking equipment - at the time. Kramer, a registered civil engineer, works for Boeing, and used \Jengineering\j concepts that are more usually applied to the structural problems of \Jaircraft\j to explain the early evolutionary record of humans.\p
Her study looked at the \Jenergy\j required to walk at a slow stroll, a normal pace as we would understand it, and walking in a rush. From this, Kramer deduces that hominids like Lucy walked slowly, probably only covering small distances each \Jday\j, while the hominids who came later would have been faster walkers, covering much larger distances. \p
In short, they were slow-speed, strolling foragers, who must have lived in an environment where the food was fairly easy to find. Scientific evidence generally suggests that the climate in \JAfrica\j got a great deal drier at the time when the earliest members of the long-legged \IHomo\i \Jgenus\j were evolving, which would make sense of Kramer's thesis.\p
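Kramer's actual structural model has not been published as code, but the flavor of this kind of biomechanical reasoning can be captured with a standard back-of-envelope rule: animals walking in a "dynamically similar" way share the same Froude number, so comfortable walking speed scales with the square root of leg length. The leg lengths below are rough textbook figures, and the whole calculation is our illustration, not Kramer's analysis.\p

  # Dynamic-similarity estimate of comfortable walking speed:
  # speed = sqrt(Froude number x g x leg length). Walkers tend to prefer
  # Froude numbers near 0.25; the leg lengths here are rough estimates.
  import math

  G = 9.81       # gravitational acceleration, m/s^2
  FROUDE = 0.25  # typical value for comfortable walking

  def walking_speed(leg_length_m):
      return math.sqrt(FROUDE * G * leg_length_m)

  for who, leg in [("Lucy (A. afarensis)", 0.55), ("modern human", 0.90)]:
      v = walking_speed(leg)
      print(f"{who}: ~{v:.1f} m/s ({v * 3.6:.1f} km/h)")
  # Lucy: ~1.2 m/s; modern human: ~1.5 m/s. Shorter legs mean slower
  # comfortable walking, in line with the slow-strolling forager picture.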
#
"Earlier human speech?",595,0,0,0
(Apr '98)
When did humans first begin to speak, and when did they first begin to communicate? While the speech area of the brain, called Broca's area, appears to be present in brain casts of fossils almost 2 million years old, anatomical evidence from the throat region in fossils suggests that only modern humans could speak. Before that time, hominids and humans may have communicated by signing, but speech appears to be ruled out, based on the anatomy. A tentative date for speech is set at 40,000 years ago, because that is when symbolic representations begin to appear in the archaeological record in Europe.\p
This belief that speech is recent is an inference, based on certain assumptions. True speech requires a large number of separate sounds, so there can be a significant variability and a large vocabulary. Neandertal humans do not seem to have had the necessary vocal equipment to produce many different sounds. The big problem is that the key section of the base of the skull which would tell us more is usually missing.\p
A report in the \IProceedings of the National Academy of Sciences\i in late April takes a different approach. The researchers made \Jrubber\j casts of the hypoglossal canals in the skulls of chimpanzees, gorillas and humans, as well as those of three specimens of the early "man-\Jape\j" \IAustralopithecus\i, two archaic members of the \Jgenus\j \IHomo\i, two Neandertals and one early \IHomo sapiens\i. The hypoglossal canal carries the motor nerve controlling the tongue, and they believe that the size of this canal reflects the fineness of the motor control over the tongue.\p
Their conclusion: the canal in Neandertals and early humans more closely matched that of modern humans than did the smaller canals of apes and proto-humans such as \IAustralopithecus\i. In other words, on this piece of anatomical evidence, humans may have spoken much earlier than we presently believe.\p
The researchers tried to make allowance for the different sizes of the tongues in different specimens by correcting for the size of the oral cavity (mouth), but the hypoglossal canal also carries two small arteries and a vein, and they have made the assumption that the relative sizes of these blood vessels in apes and early humans were constant.\p
#
"Explaining grandmothers",596,0,0,0
(Apr '98)
Why do women lose the ability to reproduce many years before they reach old age? One explanation of menopause has been that women need child-rearing help from their mothers, and that menopause frees older women to help out. Or maybe it is just a natural consequence of aging.\p
If the "grandmother hypothesis" is correct, if there is an evolutionary advantage in having older females helping to rear the young, this ought to show up in other species as well, but University of \JMinnesota\j \Jecology\j professor Craig Packer has found just the opposite in his studies of lions and baboons, according to a report in \INature\i during April. \p
Rather than being an evolutionary advantage, it looks as though menopause just fails to be an evolutionary cost, and so it continues. There is certainly no advantage in females breeding right up until the time they die, and when fertility drops off a few years earlier, there is still no "cost". In pre-technological societies, where women could expect to live 50 years, and where a child, in order to survive, needed its mother until the age of 10, reproductive decline could begin at age 40 at no evolutionary cost, since any child born later would risk losing its mother before reaching independence. It appears that a model like this still operates among humans, and that we simply have not adjusted to our extended life expectancies.\p
Female baboons usually die at about 26 or 27, and stop reproducing at 21 years; a \Jbaboon\j infant orphaned at less than two years of age usually dies. But the best evidence came from lions, where only those older females who were still raising cubs seemed to be able to nurse grand-cubs. Lion cubs are independent at one year, and so lionesses are able to breed almost to the end of their lives.\p
Although grandmother lions and baboons both engage in what's called kin-directed behavior, they had no measurable impact on the survival or reproduction of their grandchildren or adult daughters. Supporters of the grandmother hypothesis are unconvinced, arguing that the human menopause goes on for a much longer time, with more than 80% of female hunter-gatherers surviving past menopause, compared with much smaller numbers of female lions and baboons. For now, the argument remains open.\p
#
"Measuring old earthquakes",597,0,0,0
(Apr '98)
Once an \Jearthquake\j is over, you might assume that there is no way of measuring it after the event, but researchers have now found a way of identifying and sizing ancient seismic activity. A report in \IGeology\i during April describes the damage caused to the walls of a castle destroyed by an \Jearthquake\j with a \Jmagnitude\j of about 7.6 which shook the Middle East during the \JCrusades\j, on May 20, 1202.\p
A 2-meter kink in the walls of the ruined fortress, Vadum Jacob, overlooking the Jordan Valley, provides clear evidence of a major \Jearthquake\j. The quake happened along the Dead Sea Transform, a dangerous fault that lies between the \JSinai\j and \JArabia\j plates, and one which could "go again at any time," according to experts.\p
The 1202 \Jearthquake\j affected a large part of modern Israel, \JSyria\j, Jordan, and \JLebanon\j, but the fault has been quiet since that time, perhaps warning us of a looming problem.\p
The castle was built from precisely aligned \Jlimestone\j blocks by the Crusaders in 1178, and sacked by Muslim forces a year later. There is a total shift of 2.1 meters in the walls, and arrow-heads and other traces of warfare suggest that there was an original movement of 1.6 meters in the 1202 quake, with the rest coming from major tremors in 1759 and 1837.\p
#
"Extending the seismic net",598,0,0,0
(Apr '98)
Accurately locating an \Jearthquake\j depends on having seismic detectors in the right places. While most large land masses now have enough stations, the 70% of the \JEarth\j covered by the seas has been poorly observed in the past. Now that is all about to change. During April, the US National Science Foundation's Ocean Drilling Program installed the first of many planned Geophysical Ocean Bottom Observatories (GOBO), in which a permanent \Jseismograph\j station is established on the sea floor, where it will monitor \Jearthquake\j activity. Up until now, the oceans have been covered only by a few island stations: the planned GOBO network will fill in the picture for \JEarth\j scientists far more effectively.\p
#
"A weather index for ordinary people",599,0,0,0
(Apr '98)
How do you explain global warming when the effects may be measured in terms of hundredths of a degree? These changes may mean something to professionals, but they hardly register with lay observers, who respond rather more to apparent "signs" such as unusual \Jweather\j which may have nothing to do with long-term change.\p
A report in the \IProceedings of the National Academy of Sciences\i in mid April proposes a commonsense index, combining a number of different measures of climate change. In some parts of the world, especially \JAlaska\j and parts of Asia, signs of permanent climate change are fairly obvious to long-term residents, but the changes in other parts of the world are more subtle. This means that normal random variations can be misinterpreted as global warming, but with the new index, such misinterpretation should be avoided.\p
The data used in the index, developed by a team led by climatologist James Hansen, include total rainfall, numbers of heavy storms, average seasonal \Jtemperature\j, and "degree days" - a measure of the heating or cooling required to maintain an inside \Jtemperature\j of 18°C (65°F). If Hansen's index stays above +1 for many years in a row, this would indicate a persistent trend.\p
Since 1951, only \JAlaska\j and \JSiberia\j have an index which has exceeded +1 each year, but many other parts of the world have come close to it, say the researchers. All that is needed now, they say, is for the media to pick up on the index and report it, just as they report the stock indexes, short-term \Jweather\j and \Jpollution\j reports, and currency fluctuations.\p
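As a taste of the bookkeeping involved, the sketch below computes "degree days" from daily mean temperatures and standardizes a seasonal total against a reference period. Hansen's published index combines several such measures with its own weightings; the simple standard score used here is our assumption, not his formula.\p

  # Degree days and a simple standardized climate index. A day's heating
  # degree days measure how far its mean temperature falls below 18 C;
  # cooling degree days, how far it rises above. The standard-score step
  # mimics (but greatly simplifies) an index in which persistent values
  # above +1 rise out of the noise of natural variability.
  import statistics

  BASE_C = 18.0  # the 18 C (65 F) comfort baseline

  def heating_degree_days(daily_means_c):
      return sum(max(0.0, BASE_C - t) for t in daily_means_c)

  def cooling_degree_days(daily_means_c):
      return sum(max(0.0, t - BASE_C) for t in daily_means_c)

  def climate_index(value, reference_values):
      """Standard deviations above the reference-period mean."""
      mu = statistics.mean(reference_values)
      sigma = statistics.stdev(reference_values)
      return (value - mu) / sigma

  # Hypothetical summer cooling-degree-day totals for a 1951-1980 style
  # reference period, and one recent summer:
  reference = [310, 295, 320, 305, 290, 315, 300, 308, 297, 312]
  recent = 340
  print(f"climate index: {climate_index(recent, reference):+.1f}")  # about +3.6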
#
"A few hot years",600,0,0,0
(Apr '98)
An April report in \INature\i, by Michael Mann, Raymond Bradley and Malcolm Hughes, offers a reconstruction of temperatures over the past 600 years, and identifies 1997, 1995 and 1990 as the warmest years since at least AD 1400.\p
The study covers more than half \JEarth\j's surface, and estimates northern hemisphere yearly temperatures to a fraction of a degree back to AD 1400. Based on these estimates, it looks as though natural changes in the brightness of the \Jsun\j and volcanic emissions both played an important role in governing climate variations over the period studied. More recently, the greenhouse gases produced by human activities appear to have had an increasing influence on temperatures.\p
While we can rely on careful written records for this century and much of the last, the problem has always been one of extending the time scale to get a better perspective on global warming, and whether it is happening. What the researchers have done is to calibrate a number of "natural \Jarchives\j" against known \Jtemperature\j variations in recent times, and then to work backwards through those \Jarchives\j, reading the data off.\p
Interestingly, historical sources suggest that there may have been a strong El Niño event in 1791, and the reconstructed \Jtemperature\j pattern supports this view. The 1816 cooling which resulted from the eruption of the Indonesian \Jvolcano\j Tambora was also present in the data, gathered from tree rings, ice cores, corals, and other sources. These two independent "matches" allow us to place more credibility and reliance on the results.\p
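At its core, the method is calibration by regression: where a proxy record overlaps the thermometer record, fit a relationship, then run that relationship backwards over the centuries where only the proxy survives. The one-proxy sketch below is a drastic simplification of the real multi-proxy statistics, and every number in it is invented.\p

  # Calibrate a single proxy (say, tree-ring width) against overlapping
  # instrumental temperatures, then "read off" a pre-instrumental value.
  # One proxy and a straight line are gross simplifications of the real
  # reconstruction; all the data below are invented.
  import numpy as np

  # Overlap period: proxy values and measured temperature anomalies (C).
  proxy = np.array([0.81, 0.95, 1.10, 1.02, 0.88, 1.20, 0.93, 1.15])
  temp = np.array([-0.30, -0.10, 0.15, 0.05, -0.20, 0.30, -0.12, 0.22])

  slope, intercept = np.polyfit(proxy, temp, 1)  # least-squares line

  def reconstruct(proxy_value):
      return slope * proxy_value + intercept

  # A tree-ring value from, say, AD 1791 can now be turned into an
  # estimated temperature anomaly:
  print(f"reconstructed anomaly: {reconstruct(0.85):+.2f} C")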
#
"Grauer's gorillas still there",601,0,0,0
(Apr '98)
According to the current classification, there are three sub-species of \Jgorilla\j in \JAfrica\j: the western \Jgorilla\j, known to scientists as \IGorilla \Jgorilla\j \Jgorilla\j\i, the mountain \Jgorilla\j, \IGorilla \Jgorilla\j beringei\i, and Grauer's \Jgorilla\j in the eastern lowlands, \IGorilla \Jgorilla\j graueri\i. A report in the April issue of \IOryx\i, the journal of the Wildlife Conservation Society (based at the Bronx Zoo), says that there are still something like 17,000 of these gorillas in the Democratic Republic of \JCongo\j (formerly Zaire).\p
This is many more than expected: the last survey, carried out in 1960, put the total at somewhere between 5,000 and 15,000, and human pressures on the species have grown since then. In particular, preliminary surveys indicated severe poaching, though other threats include political instability, deforestation, and habitat loss.\p
\JGorilla\j populations near large population centers have plummeted, but further away, the situation is not nearly as bad within parks and remote areas. Even within protected areas, the team found evidence of poaching. In Kahuzi-Biega National Park, for example, at least one individual in each of the groups which are used to tourists had lost a hand to snares. According to the study, recent reconnaissance indicates that many gorillas were killed immediately following the recent \Jcivil war\j.\p
Overall, allowing for the problems of assessing a population from partial sightings, the writers believe that the population, made up of eleven separated groups, is somewhere between 8,700 and 25,500, with some groups almost wiped out. The article summarizes a number of recent studies, while giving new information on the largest population in the Kahuzi-Biega National Park lowland sector, Kasese.\p
The assessment methods included complete nest counts through some areas, line transect and reconnaissance, and in some cases, forest reconnaissance only. A line transect means traveling through the forest on a pre-determined bearing, checking for \Jgorilla\j "signs," such as nests. The total population sizes were estimated by assuming that there were 0.33 infants per adult female, as infants are hard to detect separately.\p
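The final step from field counts to a population figure is then simple arithmetic, as the sketch below shows. The counts fed into it are invented, and a real survey would also carry forward the uncertainty from nest decay rates and detection distances, which we ignore here.\p

  # Turning a census of detectable gorillas into a total population
  # estimate. Infants sleep in their mothers' nests and are hard to count,
  # so the survey assumed 0.33 infants per adult female. All counts below
  # are invented for illustration.
  def estimate_population(adult_females, other_weaned_animals,
                          infants_per_female=0.33):
      inferred_infants = adult_females * infants_per_female
      return adult_females + other_weaned_animals + inferred_infants

  counted_females = 120  # hypothetical nest-count result
  counted_others = 180   # silverbacks, blackbacks, juveniles...
  total = estimate_population(counted_females, counted_others)
  print(f"estimated total: {total:.0f}")
  # 120 females + 180 others + ~40 inferred infants = about 340 animals.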
While the gorillas have been estimated as eleven separate populations, the article indicates that there is some hope that the groups are still interbreeding. This will become more of a problem as more forest is converted to pasture, and as new roads are built, cutting small groups off from each other, and threatening genetic diversity.\p
Some \Jgorilla\j habitats have been further degraded by the arrival of a million Rwandan refugees, with their needs for agricultural land and firewood. Hunting, both for meat and the killing of adults to allow the capture of young gorillas, will continue to take their toll as well. The Wildlife Conservation Society says it is the only body studying all three \Jgorilla\j sub-species at the moment.\p
#
"Coming soon to a clean environment near you...",602,0,0,0
(Apr '98)
Futurology is a risky business. Asking experts to predict the future does not necessarily give you a clear picture of what the future will be, but it does give you a picture of what the experts are worrying about right now.\p
A team of researchers at the US Department of \JEnergy\j's Pacific Northwest National Laboratory has identified the 10 most important technological breakthroughs that will lead to a cleaner environment while providing major benefits to consumers over the next decade. Technologies that help prevent problems before they arise surfaced as a major theme.\p
In summary, they see the main developments to be expected by 2008 as:\p
• genetically engineered crops which place less demand on the environment, needing less \Jfertilizer\j and pesticide treatment;\p
• smart \Jwater\j treatment using membrane technology and self-unclogging filters to process waste \Jwater\j;\p
• better storage for renewable \Jenergy\j sources, such as flywheels;\p
• microtechnology, where small is beautiful, with microchemical plants producing industrial chemicals as they are needed, avoiding transport and storage;\p
• a paperless society, where material such as newsprint is replaced by an electronic form of presentation, maybe even direct projection onto the \Jretina\j;\p
• molecular design, where just the right molecules are made for a particular job;\p
• bioprocessing, with microorganisms and plants producing many necessary products, maybe using extremozymes, enzymes found in organisms living in extreme environments;\p
• enviromanufacturing and recycling, where products have to be environmentally friendly from cradle to grave, and where less hazardous chemicals are used to do cleaning jobs;\p
• lightweight cars, with a family sedan delivering 80 miles to the gallon (around 34 km from a liter of fuel);\p
• real-time environmental sensors, able to detect dangerous pathogens and sound the alarm.\p
#
"New sensor for bacteria",603,0,0,0
(Apr '98)
The predictors may have got one right already. In early April, researchers Pamela St. John, Harold Craighead, Carl Batt, Robert Davis, John Czajka, and Nathan Cady reported a new biosensing method which merges the fields of nanofabrication and \Jbiology\j to produce a simple but effective means of detecting harmful \Jbacteria\j on meat. The authors are all presently or formerly associated with Cornell University.\p
In simple terms, the new biosensors can detect minute quantities of \Jbacteria\j, from the slaughterhouse to the restaurant, and send up a red flag when there is a problem. The idea is to stamp or mark food with a set of harmless \Jantibodies\j which will bind to \Jbacteria\j, forming a pattern, which then can be read just like a bar code, using a \Jlaser\j beam.\p
The test process used \Jrubber\j stamps to place patterns which could detect \IE. coli\i O157:H7, a deadly pathogen that has been linked to deaths from tainted hamburger in the US. In this case, the patterns were placed on silicon chips, which could be used anywhere such \Jbacteria\j might be a threat, or placed at strategic points on a food production line or other sensitive area and tied to a central computer to monitor bacterial contamination.\p
The main advantage of this early warning system is that it would allow early detection of the \Jbacteria\j, before colonies can multiply and spread across large areas.\p
#
"Stop mowing!",604,0,0,0
(Apr '98)
A Connecticut botanist says that lawns are bad for the environment: they deny space to natural plants which provide a living space for other species, and they absorb some 67 million pounds of pesticides and about 3 million tons of \Jfertilizer\j annually in the United States alone. As a result, William Niering has proposed SALT, or \BSmaller American Lawns Today\b, to do something about what he calls "lawn mania."\p
Some scientists in \JAustralia\j would agree, as they have just found that lawns and grasslands release vast quantities of pollutants into the air. The mower is bad, too, they say: emissions of chemicals increase around 100-fold after grass is cut, taking hours to return to their original level. The researchers, from Melbourne's Monash University and the government research organization CSIRO, undertook a two-year study of grasslands to see what effects light, \Jtemperature\j, and \Jdrought\j had on emissions.\p
The results were startling, with some predictable aspects: emissions reached a maximum around noon on warm days, and fell to zero at night. But after they cut the grass in a special test chamber, gas release from clover rose by a factor of 80, and emissions from grass increased by 180 times. \JCattle\j grazing or trampling will have a similar effect to mowing, increasing emission rates from grass.\p
The researchers believe that the gases, which include the volatile organic compounds methanol, \Jethanol\j, \Jpropanone\j, and butanone, act as natural \Jantibiotics\j to disinfect wounds in the plants.\p
#
"Einstein proved right, even when space-time is seriously curved",605,0,0,0
(Apr '98)
Since the early 1940s, physicists have accepted that a sufficiently massive, compact object would warp space-time, bending it so steeply that any matter getting too close must fall in, generating x-rays as it goes. Now a careful observation of the x-rays from a neutron star called 4U 1820-30 has revealed evidence of just such an effect. The observation was carried out from NASA's Rossi X-Ray Timing Explorer \Jsatellite\j over the course of a year.\p
The Rossi Explorer had already measured the x-rays coming from such sources, and shown that the brightness of neutron stars varied as much as a thousand times a second, making them the most rapidly variable objects in the universe. Neutron stars pack a mass about one and a half to two times as great as the \Jsun\j's into a body only about 15 km (10 miles) in diameter. This density produces highly curved space-time, according to Einstein, and that is an interesting place to look at, if you are a physicist. In Newtonian physics, gases can \Jorbit\j in circular orbits at any distance, but Einsteinian relativity says that in strongly curved space-time, there are no stable circular orbits inside a certain critical distance.\p
So where does the gas come from? If the neutron star is in a binary with an ordinary star, it tears matter away from the other star, and this matter spirals down into an \Jorbit\j around the neutron star, traveling at speeds close to the speed of light. As a black hole or a neutron star spins, it sets up huge forces that shape this gas into a disc, called the \Jaccretion\j disc. The disc may wobble around its outer edges, but close in to the massive object, the disc must line up with the equator of the object. Somewhere on that \Jaccretion\j disc, there is a region where matter passes inside the last stable \Jorbit\j, and then it tumbles catastrophically inwards.\p
The point of instability (where matter pours in) rotates around the \Jaccretion\j disc as it spins, pulled by the gravity of the neutron star. X-rays pour out from this point, like the beam of a lighthouse, at a frequency which depends on the distance of the disc's inner edge from the neutron star. The disc is in balance, pulled inwards by gravity, and pushed outwards by the radiation coming from the star, but if a large block of material falls in, it may block the radiation hitting the disc (but not the gravity), allowing the disc edge to tumble inwards, producing the characteristic \Joscillation\j.\p
If Einstein is right, say the experts, then there should be a limit to the frequency of the x-rays, and this should define the point of no return. So they watched the radiation coming from the star. They watched the brightness, which told them how much material was falling in, and they watched the frequency, but again and again, it hit a ceiling, an upper limit. Four times, they say, is no coincidence - Einstein wins again!\p
The oscillations stabilized at 1050 hertz: even when the x-ray power increased, the frequency stayed at this simple limit. The main point, they say, is that the x-ray brightness oscillations can be used to determine the masses and dimensions of neutron stars.\p
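The link between a frequency ceiling and a mass can be made concrete. For a non-rotating star, general relativity puts the innermost stable circular orbit at a radius of 6GM/c², and the orbital frequency there depends only on the mass, so a measured ceiling translates directly into a mass. The Python sketch below is the textbook non-rotating case; the published analysis also allows for the star's spin.\p

  # Orbital frequency at the innermost stable circular orbit (ISCO) of a
  # non-rotating compact star, and the mass implied by a 1050 Hz ceiling.
  # Real neutron stars spin, which shifts these numbers somewhat.
  import math

  G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
  C = 2.998e8       # speed of light, m/s
  M_SUN = 1.989e30  # solar mass, kg

  def isco_frequency_hz(mass_kg):
      r = 6 * G * mass_kg / C**2       # ISCO radius in meters
      return math.sqrt(G * mass_kg / r**3) / (2 * math.pi)

  # The ISCO frequency scales as 1/M, so invert it for the observed ceiling:
  ceiling_hz = 1050.0
  mass = M_SUN * isco_frequency_hz(M_SUN) / ceiling_hz
  print(f"ISCO frequency for 1 solar mass: {isco_frequency_hz(M_SUN):.0f} Hz")
  print(f"mass implied by {ceiling_hz:.0f} Hz: {mass / M_SUN:.1f} solar masses")
  # ~2200 Hz for one solar mass, falling as 1/M, so a 1050 Hz ceiling
  # points to a star of roughly two solar masses.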
Of course, everybody knew that Einstein was right about ordinary space, where the curvature of space is minimal, but now we have good direct evidence that it also works around seriously dense objects, in strongly curved space-time. Two papers describing the theory and the results have been accepted for publication in the \IAstrophysical Journal\i.\p
One final comment from Professor Frederick Lamb, one of those who has been at the centre of this exciting work: "Studying how matter moves in the strongly curved space-time near neutron stars also has allowed us to extract interesting new bounds on the masses and dimensions of these stars and on the stiffness of the superdense matter inside them. The new evidence reported today suggests that the \Jstrong nuclear force\j is more repulsive than many nuclear physicists had expected and that the superdense matter in neutron stars is rather stiff."\p
\BOther key names\b: Coleman Miller, Dimitrios Psaltis, William Zhang.\p
#
"Gamma ray burst interpreted",606,0,0,0
(Apr '98)
April saw the first interpretations in public of a massive gamma-ray burst, named GRB971214, after its date of occurrence, December 14, 1997. The blast of high \Jenergy\j radiation lasted about a second, releasing as much \Jenergy\j as all of the universe's billion trillion stars combined. Once it was noticed, all over the world, astronomers got to work, capturing what information they could, and since then, they have been working on their data.\p
In late April, Shrinivas Kulkarni told a meeting of the American Physical Society about how they located the source of the burst. Using the 10-meter Keck \JTelescope\j in Hawaii and NASA's Hubble Space \JTelescope\j, Kulkarni and his Caltech colleague George Djorgovski took the spectrum of a fuzzy patch in the right area, assumed to be the source of the burst, and found a red-shift of 3.42, equivalent to a distance of some twelve billion light years. The event happened some eight billion years before our \Jplanet\j formed.\p
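Turning a red-shift of 3.42 into "some twelve billion light years" takes a cosmological model. The sketch below integrates the lookback time numerically for a flat universe; the Hubble constant and density parameter are round textbook choices, not the values Kulkarni's team used.\p

  # Lookback time to red-shift z in a flat universe with matter density
  # OM and cosmological-constant density 1 - OM. The parameter values are
  # round "standard" choices, not those of the original analysis.
  import math

  H0_KM_S_MPC = 70.0         # Hubble constant, km/s per megaparsec
  OM = 0.3                   # matter density parameter

  MPC_KM = 3.0857e19         # kilometers per megaparsec
  GYR_S = 3.156e16           # seconds per gigayear
  H0 = H0_KM_S_MPC / MPC_KM  # Hubble constant in 1/s

  def lookback_time_gyr(z, steps=100000):
      """Integrate dt = dz / ((1+z) H(z)) from 0 to z (trapezoid rule)."""
      def integrand(zp):
          hz = H0 * math.sqrt(OM * (1 + zp) ** 3 + (1 - OM))
          return 1.0 / ((1 + zp) * hz)
      dz = z / steps
      total = 0.5 * (integrand(0.0) + integrand(z))
      total += sum(integrand(i * dz) for i in range(1, steps))
      return total * dz / GYR_S

  print(f"lookback time to z = 3.42: {lookback_time_gyr(3.42):.1f} billion years")
  # Roughly 11.7 billion years: the light set out when the universe was
  # only about 1.8 billion years old.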
Gamma-ray detectors are poor at locating a gamma source, but the cameras aboard the Italian-Dutch BeppoSAX \Jsatellite\j spotted the x-ray afterglow of the 14 December event and determined a relatively accurate position for it. Then astronomers were able to look for a fading optical display in that area, and once it stopped fading, they could take a spectrum and determine its distance. This was only the third time that astronomers had achieved the optical detection of a gamma-ray burst.\p
The actual cause of the burst was the subject of several papers published in \INature\i in early May, and while there is still some disagreement, one popular theory is that the burst was triggered when two neutron stars collided. Alternatively, it may have been the result of a hypernova, the name given to an event in which a supermassive star (80 to 100 times the mass of our \Jsun\j, with a "life" of about 3 million years) collapses to form a rotating black hole. The name is justified: an ordinary \Jsupernova\j only puts out about a hundredth of the \Jenergy\j recorded in this burst.\p
The case history of the burst's location shows how \Jastronomy\j works today. GRB971214 was first detected by the x-ray observatory aboard an Italian-Dutch \Jsatellite\j, BeppoSAX, and then by the Compton Gamma-Ray Observatory. Enrico Costa of the Italian Instituto di Astrofisica Spaziale in Frascati, who is part of the BeppoSAX research team, telephoned David J. Helfand, professor of \Jastronomy\j at Columbia University in the USA, where it was 11:15 on a Sunday evening, and notified him of the approximate location of the event.\p
Professor Helfand relayed the message to a Dartmouth astronomer, John Thorstensen, who was viewing the sky using the 2.4-meter \Jtelescope\j at the MDM Observatory on Kitt Peak, near Tucson, \JArizona\j, a facility jointly owned and operated by Columbia, Dartmouth, the University of \JMichigan\j, and Ohio State University.\p
Professor Thorstensen was able to photograph the region of the gamma-ray burst within 12 hours of the \Jsatellite\j detection, and the next night identified the object, which was now noticeably fainter, establishing that it was the optical afterglow of the gamma-ray burst. Jules Halpern, professor of \Jastronomy\j at Columbia, interpreted the results.\p
The Columbia report was confirmed by the 3.5-meter \Jtelescope\j at the Apache Point Observatory nearby and was reported to the astronomical community through the Central Bureau for Astronomical Telegrams. Within two weeks, Kulkarni's team had located an extremely faint galaxy at the location indicated.\p
#
"Black hole?",607,0,0,0
(Apr '98)
Around midday, March 31, US time, the orbiting Rossi X-ray Timing Explorer, or RXTE, detected a new x-ray object. Labeled XTE J0421+560 to identify how it was detected and where, it was almost immediately suspected of being a black hole. X-ray sources like this are generally regarded as black hole transients, which decay fast. This source had halved its output in just 12 hours, and four days later, was down to just 2% of its former glory.\p
Astronomers believe there could be as many as several hundred thousand black holes in our galaxy, but that these can only be detected when they are involved in a binary system, with two massive bodies swinging around each other. If one of these bodies is a black hole, and it drags matter from the other body, this will produce strong x-ray emissions of the sort detected as April Fool's \JDay\j broke across the world.\p
The standard explanation of the basic effect is that two stars revolve around each other. One of them, a normal star like our own \Jsun\j, loses material to a compact companion, either a neutron star or a black hole. In a process called \Jaccretion\j, matter in the form of \Jhydrogen\j and \Jhelium\j plasma moves towards the compact companion where it forms a flat disc, an \Jaccretion\j disc. Then the plasma spirals down onto the compact body, pouring out huge amounts of \Jenergy\j as the matter is accelerated up to speeds close to the speed of light.\p
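It is worth pausing on why \Jaccretion\j is such a prodigious \Jenergy\j source: matter spiralling onto a compact object can release around a tenth of its rest-mass \Jenergy\j, dwarfing nuclear fusion. The efficiency and \Jaccretion\j rate in the sketch below are conventional illustrative values, not measurements of XTE J0421+560.\p

  # Order-of-magnitude luminosity of an accreting compact object:
  # L = efficiency x mdot x c^2. A ~10% efficiency is the conventional
  # figure for accretion onto a neutron star or black hole (fusion, by
  # contrast, releases well under 1% of rest-mass energy). The accretion
  # rate here is illustrative, not a measurement of XTE J0421+560.
  C = 2.998e8       # speed of light, m/s
  M_SUN = 1.989e30  # solar mass, kg
  YEAR_S = 3.156e7  # seconds per year
  L_SUN = 3.828e26  # solar luminosity, W

  efficiency = 0.1
  mdot = 1e-9 * M_SUN / YEAR_S  # a billionth of a solar mass per year, in kg/s

  luminosity = efficiency * mdot * C**2
  print(f"accretion luminosity: {luminosity:.1e} W "
        f"(about {luminosity / L_SUN:.0f} times the sun's output)")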
Some of the x-ray sources in the Milky Way are permanent, always there when we turn our detectors on them, but about once a year, a transient like XTE J0421+560 turns up. The first was discovered in 1967, when x-ray \Jastronomy\j was done on five-minute rocket flights, or from balloons. The operation of the transients is little-known, and this object has contributed very little extra information, as the source died away far more quickly than usual.\p
The best estimate for the moment is that the mystery object is either a black hole or a neutron star in a binary with a star in the \Jconstellation\j Camelopardalis, the Giraffe. This star, identified as CI Camelopardalis (or CI Cam for short) has been known since 1933, when it was noted as an unusual light source. It is probably some 3000 light years away, putting it inside our own galaxy.\p
The evidence pointing at this object as the source has come from a number of directions. First, CI Cam is in the right place, within one arc-minute of the location of the source of the x-rays, as pinpointed by several instruments on the RXTE, by the Compton Gamma Ray Observatory, the Italian x-ray observatory BeppoSAX, and by a Japanese observatory called ASCA. Radio telescopes noted strong emissions from this source at about the same time, and optical telescopes also indicated a brightening of the light from CI Cam.\p
Most importantly, two radio jets were detected emerging from CI Cam. The velocities of these jets were at least 15% of the speed of light. Similar jets had been observed in two previous black hole transients.\p
The fast decay makes the object unusual, and this is what makes its identity as a black hole a little uncertain - past black hole transients have run for weeks or even months. Further observation should lead to an estimate of the object's mass and offer us some clues about its nature, but that could be years away.\p
One clue that we do have: the emissions contain an "iron line" in their spectrum, indicating that the source is a hot gas. This was one of the key observations made for CI Cam when its strangeness was first noted, but the spectrum also indicated ionized \Jhelium\j, a sure sign that something rather cataclysmic was going on.\p
Most probably, say theorists, the \Jaccretion\j disc became unstable, and large amounts of material poured in. This, they said, would produce twin jets from the north and south \Jpoles\j, exactly the sorts of jets which were detected soon after.\p
Photos are available via FTP from \Bftp.lowell.edu/pub/rmw/cicam\b\p
Additional photos from the VLA available from \Bftp.aoc.nrao.edu/pub/press\b\p
#
"Orion's clouds",608,0,0,0
(Apr '98)
A huge mass of \Jwater\j vapor in a \Jcloud\j of interstellar gas near the Orion nebula has just increased the known amount of \Jwater\j in our galaxy twenty times over. The \Jwater\j is inside the Orion molecular \Jcloud\j, a giant interstellar gas \Jcloud\j, a trillion miles across, composed primarily of \Jhydrogen\j molecules. The \Jcloud\j is 1500 light years away from us.\p
The observations were made in October 1997 and were reported in the \IAstrophysical Journal Letters\i during April. The detection was achieved by the long-wavelength spectrometer aboard the Infrared Space Observatory (ISO), looking in the far-infrared region of the electromagnetic spectrum. The gas \Jcloud\j is being hit by shock waves which compress and heat the gas. The shock waves are the result of the violent early stages of star birth in which a young star spews out gas that slams into its surroundings at high speed.\p
The evidence suggests that all of the oxygen present in the clouds is being mopped up by the excess \Jhydrogen\j at a tremendous rate - \Jwater\j is being formed fast enough to fill all of our oceans sixty times a \Jday\j! Astronomers are reported to be delighted, as this is what theory predicted, and they have even gone so far as to suggest that this may be a step in the formation of new stars as the gases and other material in the \Jcloud\j condense out. They say that the \Jwater\j in our \Jsolar system\j may have formed in a similar way, long ago.\p
#
"Birth of a planet?",609,0,0,0
(Apr '98)
NASA announced on April 21 the discovery of a disc of dust around a binary star which may be forming planets. The star is 220 light years away, and the important thing is that there appears to be a hole in the \Jcloud\j, suggesting that there is some kind of celestial body - astronomer-talk for what we would probably call a \Jplanet\j - in the \Jcloud\j.\p
While a similar disc was found around the single star Beta Pictoris about 14 years ago, this is the first time a system with two stars has been seen with a dust disc. The binary star, HR 4796, is made up of two stars, HR 4796A and HR 4796B, 75 billion km (47 billion miles) apart in the \Jconstellation\j \JCentaurus\j, visible primarily from the Southern hemisphere, which explains why the find was made at Cerro Tololo, \JChile\j. The \Jcloud\j is actually around HR 4796A.\p
Our \Jsolar system\j spans about 80 astronomical units across the orbit of Pluto (1 AU, the distance from \JEarth\j to the \Jsun\j, is 93 million miles, or 150 million km), while the HR 4796 disc is more than three times that size, which might produce a much larger planetary system than ours, although it is possible that planets are only forming in the inner parts of the disc.\p
And how do astronomers know where to look, to find these discs? They look for stars which emit more infrared radiation than expected, because this usually turns out to be from a disc of warm dust. This has been known for some time, and just a couple of days later, \INature\i published new and detailed pictures of Vega, Fomalhaut, and Beta Pictoris, showing excellent detail of their discs, with mysterious bright blobs around Vega and Beta Pictoris and a gap in the Fomalhaut disc.\p
These stars are all several hundred million years old, well beyond the best \Jplanet\j-forming age, but young HR 4796A is probably giving birth to planets right now, say the astronomers. All this evidence suggests that once a star has a dust disc, planets are almost sure to follow, making it all the more likely that, somewhere in the universe, there is intelligent life.\p
#
"International Space Station problems",610,0,0,0
(Apr '98)
NASA's space station plans appear to be headed for disarray. A 1993 overhaul came up with a budget of US$17.4 billion for a station to be ready by 2002, but an April report indicates that the project will now cost some US$7 billion more, and be delayed by one to three years. In fact, completion could slip as late as 2006 for the project, which was first proposed by President Reagan in 1984.\p
The station involves cooperative efforts from \JRussia\j, Europe, \JJapan\j, and Canada, and one major problem stems from Russian delays in completing the service module, a key component containing command and control functions. Delay in this unit will necessarily delay NASA's whole launching schedule.\p
#
"Larsen B Ice Shelf loses 200 square kilometers",611,0,0,0
(Apr '98)
Early April brought the news that a chunk of the Larsen B Ice Shelf had broken away from \JAntarctica\j. Measuring 40 km by 5 km (about 25 miles by 3 miles), the break was spotted from \Jsatellite\j photographs taken during February and March.\p
The break-up of the ice shelves has been anticipated for some time, and was covered in the Webster's Science updates in March 1997 (see \BTalking About the \JWeather\j\b - the first time we used that heading). Looking back, the break was already there in late February 1998, but the news was not released until extra photos taken in late March had been analyzed.\p
Earlier studies by the British Antarctic Survey predicted that the 12,000 square kilometer ice shelf was nearing its stability limit, and if this model is correct, there should be further major breakaways in early 1999, in the southern summer.\p
The area around the Antarctic Peninsula ice shelves has been subjected to a regional climate warming of 2.5°C (4.5°F) since the 1940s, several times the global average rate of warming, but the reason for this warming remains unknown. It seems to be related to a reduction in \Jsea ice\j, ice which forms on the ocean surface, and a lot of effort is now directed to trying to find out what has caused this reduction. The warming has caused surface melting and cracking on the ice shelves.\p
Overall, about two thirds of the ice shelf is likely to break away, while the remainder will stay, locked in bays which protect it. The Larsen B is larger than all the other ice shelves which have broken up in the last two decades, including the Larsen A shelf, which broke up in a storm in 1995, and the Wordie, which disappeared in the late 1980s.\p
Ice shelves are found around \JGreenland\j and \JAntarctica\j. They are fed by snowfall and glaciers, but the ice, up to 800 meters thick, is floating on the ocean. The largest shelf, the Ross Ice Shelf, is roughly the size of \JFrance\j, somewhat smaller than \JTexas\j or the Australian state of New South Wales. The Larsen B shelf, on the other hand, is only about the size of Connecticut, or a small Australian farm, and the piece it lost in February was roughly three times the size of Manhattan.\p
#
"May, 1998 Science Review",612,0,0,0
\JReconstructing vancomycin\j
\JNew diagnostic test for antibiotic resistance\j
\JA new class of antibiotics?\j
\JMonitoring antibiotic resistance\j
\JIdentifying alcoholics early\j
\JTreating HIV-positive pregnant women in Africa\j
\JChoking off cancers\j
\JWhy do plants not develop cancer spontaneously?\j
\JNew carcinogens identified\j
\JWill the heart ever be safe again?\j
\JWorried about mad cow disease?\j
\JAutoimmune diseases or the results of pregnancy?\j
\JCircadian clockworks uncovered\j
\JCuring leprosy\j
\JTB kills women\j
\JTransgenic calves\j
\JThe deaf mouse that isn't\j
\JArchaeans fix DNA better\j
\JWhy are there so many genes for cystic fibrosis?\j
\JTreating the sick with viruses?\j
\JJellyfish sugar?\j
\JLeft side, right side\j
\JThe crab that came in from the wet\j
\JWhen did humans first walk upright?\j
\JA new Madagascan dinosaur sheds new light\j
\JWorld's glaciers still melting\j
\JGlobal warming and disease\j
\JEl Niño and health\j
\JGalapagos penguins dying off\j
\JX-ray vision?\j
\JDefeating dust mites\j
\JIdentifying refrozen food\j
\JHigh-redshift gamma-ray burst\j
\JSunquake\j
\JMagnetic quakes shake neutron stars\j
\JJinmium gets younger\j
\JLife on Mars evidence under doubt\j
\JStill no metallic hydrogen\j
#
"Reconstructing vancomycin",613,0,0,0
(May '98)
Vancomycin was unable to help when a New York man died at the end of March 1998. He was only the fourth known case in the world of infection by vancomycin-resistant \IStaphylococcus aureus\i, and the first death from it in the United States. The first case occurred in \JJapan\j in 1997 (\BBad news bugs\b, April 1997), and while authorities have said there is no cause for alarm, the New York death will not be the last from this cause. In medical terminology, these are thought to be "sentinel cases." This knowledge gives extra impetus to efforts to modify vancomycin so that it bypasses the defences of the resistant \Jbacteria\j.\p
The structure of vancomycin was revealed by Patrick Loll and Paul Axelsen in 1997 in the \IJournal of the American Chemical Society\i, (\BUnravelling vancomycin\b, August 1997) triggering a range of attempts to use powerful computational chemistry techniques to design vancomycin variants that might be able to circumvent bacterial resistance.\p
A report from University of \JPennsylvania\j Medical Center researchers in the May issue of \IChemistry & \JBiology\j\i may show the way. This report gives details of x-ray \Jcrystallography\j results which may explain why certain variant molecules synthesized by chemists at Eli Lilly and Co. show marked activity against vancomycin-resistant \Jbacteria\j.\p
Vancomycin is a dimer, a molecule made up of two units, and it seems that the powerful variants have been assembled in a "face-to-face" orientation of the sub-units rather than the "back-to-back" arrangement thought to be the norm for vancomycin.\p
Loll, co-author with Axelsen of the latest report, says that the whole problem of making a drug that works in this context boils down to one molecule of the drug recognizing and binding specifically to another molecule in the cell wall of the \Jbacteria\j. The resistant \Jbacteria\j have altered the relevant docking molecule so that the normal vancomycin dimer can no longer bind to it. In some unspecified (for the moment) way, the new arrangement gets around this defence.\p
\JBacteria\j can swap drug-resistance genes among species - the vancomycin-resistant \IStaphylococcus aureus\i may have acquired its capability from another, less virulent (though still unpleasant) microbe called \IEnterococcus\i (see next story). Given the rate at which resistant staph \Jbacteria\j have spread around the world, this is not a problem that will go away, but will the drug companies be ready for this development?\p
"It is difficult to generate enthusiasm in funding agencies or industry for proactive drug research," Axelsen observed in a press statement. "Even though we can be quite certain that the vancomycin resistance problem will get much worse rather than go away, adequate funding to solve the problem will probably not materialize until it is more generally perceived as an immediate and broad threat to public health."\p
#
"New diagnostic test for antibiotic resistance",614,0,0,0
(May '98)
One group of potentially lethal \Jbacteria\j, the vancomycin-resistant enterococci, or VRE, have been a threat to patients partly because current tests take 48 hours to establish that a patient is infected with a VRE strain. A new test announced in mid-May provides an answer in just 8 hours. Because VRE are immune to all currently available \Jantibiotics\j, it is vitally important to isolate patients with VRE as quickly as possible.\p
The standard method of identifying \Jbacteria\j requires patient samples to be grown in culture and tested. In addition to the lengthy period needed to grow the \Jbacteria\j, small sample size, low organism density and overgrowth by other contaminants make reading the culture difficult. Any method which bypasses or accelerates this culturing process will offer the chance to act much sooner, and save lives.\p
The new technique was revealed at the American Society for Microbiology's annual meeting in \JAtlanta\j. It relies on detecting and amplifying minute quantities of the two genes which confer the resistance, vanA and vanB. Using a new method called hot-start polymerase \Jchain reaction\j (PCR), the researchers say they detected vanA and vanB in 13 of 15 specimens, versus 12 of 15 specimens detected using the standard culture method. Combining the two methods resulted in detection in 14 of 15 cases.\p
The standard form of PCR often replicates other nonspecific strands of DNA, but the newer hot-start method not only cut down on nonspecific binding, but also increased the yield of the PCR process.\p
#
"A new class of antibiotics?",615,0,0,0
(May '98)
So long as \Jantibiotics\j are used unwisely, the problem of resistance will increase. One of the most worrying aspects of antibiotic resistance has been the lack of any new antibiotic "families" on the horizon, new classes of drug with new modes of action, novel cures that might give us a fresh start in the battle against \Jbacteria\j.\p
While some new \Jantibiotics\j have been detected in nature, many of these are only available in minute amounts from living microbes, making their production uneconomical. This is the reason why vancomycin, in spite of serious side-effects, needs to be used to deal with \Jbacteria\j resistant to other \Jantibiotics\j. And as vancomycin fails with more and more \Jbacteria\j, the need becomes ever greater.\p
A recent report in the \IJournal of Organic Chemistry\i indicates that American researchers may have uncovered a way of making just such a family of compounds available in large amounts. The workers have chemically synthesized myxopyronin A and B, two natural compounds known to block replication of drug-resistant strains of \Jbacteria\j.\p
The best aspect of these chemicals is their selective nature. Harmful \Jbacteria\j contain DNA-dependent RNA polymerase enzymes, and so do human cells. Myxopyronin A and B attack the bacterial enzymes while leaving the human host's enzymes alone.\p
\BKey words\b: Scriptgen Pharmaceuticals, James Panek, \JBoston\j University\p
#
"Monitoring antibiotic resistance",616,0,0,0
(May '98)
The development of antibiotic resistance was a matter of public knowledge more than thirty years ago, but the first \Jpenicillin\j-resistant pathogen was actually seen as early as 1945. So what went wrong? Why has antibiotic resistance been allowed to rise so dramatically, posing a threat to public health, increasing medical costs, and fuelling a resurgence in pathogens that were once considered beaten, or at least brought under control?\p
A May report from the Institute of Medicine's (IOM) Forum on Emerging Infections suggests that the answer for the future lies in better surveillance and greater awareness of the problem, and hints at the reasons the problem was allowed to develop in the first place.\p
No country has developed a reliable, comprehensive system for tracking drug resistance, says the report, which suggests that there may be a need for greater international effort. Perhaps the answer lies in giving more authority to the United Nations' World Health Organization or to the US Centers for Disease Control and Prevention so they can lead a global surveillance effort.\p
So long as people continue the inappropriate use of \Jantibiotics\j, we will see continued premature emergence of resistance. No adequate enforcement mechanisms exist to ensure proper antibiotic use, so current efforts consist primarily of trying to educate people about the hazards of antimicrobial overuse or misuse. Research on the impact of antibiotic overuse and misuse on humans, on finding new ways to define a drug's effectiveness, and on the benefits and risks of reducing antimicrobial dose and duration of therapy would all help. So, too, would research on the human health effects of widespread agricultural use of \Jantibiotics\j.\p
The IOM is a private, non-profit organisation which provides health policy advice under a congressional charter granted to the US National Academy of Sciences. Its Forum on Emerging Infections was created in response to a request from the Centers for Disease Control and Prevention and the National Institute of \JAllergy\j and Infectious Diseases.\p
#
"Identifying alcoholics early",617,0,0,0
(May '98)
Experts agree that the best way to deal with \Jalcoholism\j is to identify those at risk before they run into problems. A report published in May in the journal \IAlcoholism: Clinical and Experimental Research\i describes a novel method of identifying alcoholics, a method which can apparently be used as a reliable predictor before the condition arises.\p
The study shows that alcoholics tend to combine a strong preference for intensely sweet tastes with a particular personality profile. Used together, these two markers can apparently predict the likelihood of developing an alcohol dependency with considerable accuracy.\p
The personality survey, dubbed the Tridimensional Personality Questionnaire, evaluates the levels of novelty seeking, harm avoidance, and reward dependence in the people responding to the questionnaire. The word alcohol does not appear anywhere in the survey, which takes between 15 and 20 minutes, but researchers claim that it is 85% accurate in diagnosing \Jalcoholism\j when combined with a test for having a "sweet tooth".\p
Previous research showed that rats with a genetic predisposition to high alcohol intake consume large amounts of sweet solutions (three times their normal fluid intake), but this trait is not shown by rats that do not drink alcohol. The most recent research shows that in a simple taste test, 65 percent of alcoholics said they preferred the most concentrated of five sugar solutions offered, which was three times sweeter than regular cola. Only 16 percent of the non-alcoholics showed a similar preference for the strongest solution while the others preferred much weaker sweet solutions.\p
By itself, having a sweet tooth is not an accurate predictor of potential problems with alcohol, but combined with a particular personality profile, the reliability jumps remarkably. In a study on 52 men who had never been diagnosed with \Jalcoholism\j and 26 recovering alcoholics, the sweet-liking alcoholics scored high on harm-avoidance and novelty-seeking, while sweet-liking non-alcoholics tended to score low on these traits. Neither group could be differentiated by their scores on reward dependence.\p
Part of the "novelty-seeking" characteristic is a tendency to be impulsive, while depression and anxiety seem to link to the "harm avoidance" characteristic. The researchers suggest that \Jalcoholism\j results from a preference for the strong pleasurable stimuli, the sweets, with impaired control of impulses, puts a person at risk.\p
#
"Treating HIV-positive pregnant women in Africa",618,0,0,0
(May '98)
Very high levels of \B\1HIV\b\c infection are common in women of child-bearing age in Sub-Saharan \JAfrica\j, the part of the continent which includes \JUganda\j, \JTanzania\j and South \JAfrica\j. About 6 million women in Sub-Saharan \JAfrica\j are HIV positive. The level of infection among child-bearing-age women exceeds 30 percent in many urban areas and 14 percent in rural regions.\p
One obvious health target must be to reduce the high rate of "vertical transmission", or mother-to-child HIV transmission, in Sub-Saharan \JAfrica\j through treatment with antiviral drugs. The only problem is that the cost of drugs such as the antiviral drug \B\1AZT\b\c needs to be reduced by about 75% from first-world prices if such a strategy is to have any real effect. The AZT manufacturer, Glaxo Wellcome, is reportedly planning to reduce the drug's price for vertical transmission prevention in low-income countries to about 25 percent of industrial world prices.\p
Studies in \JThailand\j have shown that a short-course of AZT alone can cut vertical transmission by about half. In previous studies in the US, the AIDS Clinical Trial Group of the National Institutes of Health has found that treatment with AZT beginning at the 28th week of \Jpregnancy\j reduced vertical transmission by about two thirds.\p
In a report in the May issue of the journal \IAIDS\i, investigators examine the economics of combination therapy using the AIDS antiviral drugs AZT and 3TC. They explored three treatments within this framework:\p
• Treatment for the mother beginning at week 36 of \Jpregnancy\j and continuing through childbirth, followed by treatment for both mother and infant during the first week after delivery.\p
• Treatment for the mother during childbirth and for mother and baby for one week after delivery.\p
• Treatment during childbirth only.\p
The cost of preventing one case of vertical transmission to one infant, in an area where 15% of the women are HIV-positive, was estimated at between US$817 and US$1445, depending on the treatment used. This is comparable to the costs of other programs which aim to prevent transmission of HIV.\p
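A minimal sketch of how a cost-per-case figure of this kind can be derived from a simple screening-and-treatment model follows; the input numbers are illustrative assumptions, not figures from the \IAIDS\i paper:\p
    # Cost of averting one infant infection when every pregnant woman
    # in a population is treated. All inputs are illustrative.
    def cost_per_case_prevented(cost_per_woman, hiv_prevalence,
                                base_transmission, reduction):
        # Expected infections averted per woman treated:
        averted = hiv_prevalence * base_transmission * reduction
        return cost_per_woman / averted

    # e.g. US$15 of drugs and delivery per woman, 15% seroprevalence,
    # 25% baseline vertical transmission, therapy halving transmission:
    print(round(cost_per_case_prevented(15.0, 0.15, 0.25, 0.5)))  # 800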
\BKey names\b: Elliot \JMarseille\j, James G. Kahn, and Joseph Saba, UCSF ARI.\p
#
"Choking off cancers",619,0,0,0
(May '98)
The world press sat up and took notice during May when a new approach to cancer treatment was announced. Two new drugs, angiostatin and endostatin, appear to be remarkably effective in blocking and destroying tumors, and plans are already under way to rush the drugs into human clinical trials.\p
Tumors are centers of growth, and growing cells need plenty of food, which they obtain from the blood. The drugs function by cutting off the blood supply to tumors, making even extremely large tumors disappear. In mice, the drugs appear to stop malignant tumors growing and spreading, but they have not yet been tested in humans, and the medical community remains cautious.\p
Not so the world's medical reporters, who can see the logic of the drugs' operation, and who naturally wish to be at the front of any declaration of the end of the war on cancer. It may not be the end of the war, but the enemy appears to be in a nasty pickle.\p
Both angiostatin and endostatin evidently interfere with the tumors' ability to synthesize new capillaries from pre-existing blood vessels, a process called angiogenesis. Essentially, the tumors were starved when the drugs were administered to cancer-bearing mice. Angiogenesis is usually limited in organisms once they are past the fetal stage: it is needed in the development of an embryo, in tumor formation, in wound repair, and in the establishment of skin grafts.\p
The caution shown by the medical profession about the use of these drugs comes from the simple fact that cancer patients are ill: for all we know, cancer patients may need to develop new blood vessels, the process called neovascularisation. If angiogenesis is blocked, it may kill the tumor, but also kill the patient.\p
But even those objectors express the hope that it may be possible to deal with the tumors with short bursts of the drug, enough to kill the tumor but not the patient. This approach, after all, is what lies behind conventional \Jradiotherapy\j and \Jchemotherapy\j - and if it comes to that, in the use of \Jantibiotics\j, which are selected because they are more lethal to microbes than they are to us.\p
\BKey name\b: Frances J. Castellino\p
#
"Why do plants not develop cancer spontaneously?",620,0,0,0
(May '98)
Plants appear to have strong preventive mechanisms to avoid the kind of uncontrolled cell division that we call cancer in animals. Unlike animal cells, plant cells which have differentiated to fulfil specific functions are still able to reactivate cell division, which is why it is possible to regenerate new plants from individual cells in tissue culture.\p
In animal cells, differentiation usually locks the cell into a particular type: the excitement over the \Jcloning\j of Dolly the sheep came partly from the cleverness of the technique that was used to unlock the udder cells of the "mother" sheep which provided Dolly's genes.\p
A report in the \IProceedings of the National Academy of Sciences\i in late April looked at this question, and suggested that if we knew how the plant cell cycle is controlled and managed, we might learn a lot about how evolutionary changes have led to less rigorous control of cell division and development in mammals.\p
All eucaryotes have their cell cycle managed by enzymes called cyclin-dependent protein kinases (CDKs). Plants use several different CDKs that are all related to their yeast and animal counterparts. Getting the cell cycle going requires the action of CDK-activating kinases (CAKs) that represent important targets within cellular signalling pathways, and two major types of CAKs have so far been identified.\p
The \IPNAS\i report reveals that CAK genes from \IArabidopsis\i, a common weed, are able to function in yeast mutants which have no CAK genes of their own. This was the case with both fission yeasts and budding yeasts, and it allowed the researchers to identify a plant CAK gene. And the point of all this? In time, once the rest of this chain of biochemical controls is unravelled, we may be a little closer to understanding why plants do not get cancer, and why we do.\p
#
"New carcinogens identified",621,0,0,0
(May '98)
The US National \JToxicology\j Program, headquartered at the National Institute of Environmental Health Sciences, announced in May the addition of 14 substances to the 184 already included in the US Government's official list of known or "anticipated" human \B\1carcinogens\b\c.\p
The substances include several diesel fuel combustion products, and even the transplant drug cyclosporin, which, when properly used, has health benefits that can far exceed its potential risk. The drug is used to help prevent rejection of a transplanted kidney or other organ by its new host, and the labelling on the drug previously indicated that it was a potential cancer risk. A second drug, thiotepa, is also newly listed as a known human carcinogen. It was previously listed as an anticipated human carcinogen.\p
\BNewly listed as known human carcinogens:\b\p
CYCLOSPORIN, an immunosuppressive drug.\p
THIOTEPA, a drug used to treat lymphomas and tumors of the breast and \Jovary\j. It has also been used at high doses in combination \Jchemotherapy\j with cyclophosphamide in patients with refractory malignancies treated with autologous bone marrow transplantation.\p
\BNewly listed as reasonably anticipated to be human carcinogens:\b\p
AZACITIDINE, a drug used to treat acute leukemia.\p
p-CHLORO-o-TOLUIDINE and its HCl salt, used to produce azo dyes for cotton, silk, acetate and nylon, and as an intermediate in production of Pigment Red 7 and Pigment Yellow 49. Also an impurity in, and metabolite of, the pesticide chlordimeform.\p
CHLOROZOTOCIN, a drug used to treat cancers of the \Jstomach\j, large \Jintestine\j, \Jpancreas\j and lung; \Jmelanoma\j; and multiple myeloma.\p
DANTHRON, or dantron, a laxative removed from the market several years ago when tests were published indicating it caused cancer in laboratory animals. It is also an intermediate in the manufacture of dyes.\p
1,6-DINITROPYRENE. No commercial uses, but detected in ambient atmospheric samples and found in diesel exhaust.\p
1,8-DINITROPYRENE. Also in diesel exhaust and air samples.\p
DISPERSE BLUE 1, used as an anthraquinone-based dyestuff in several semi-permanent hair dyes and also in coloring fabrics and plastics.\p
FURAN, used as a chemical intermediate in the synthesis and production of other organic compounds.\p
O-NITROANISOLE, used as a precursor in the synthesis of o-anisidine which is used in the manufacture of more than 100 azo dyes.\p
6-NITROCHRYSENE. Not used commercially, but detected in ambient atmospheric samples.\p
1-NITROPYRENE. Not used commercially, detected in ambient atmospheric samples and found in diesel exhaust.\p
4-NITROPYRENE. Not used commercially, detected in ambient atmospheric samples.\p
1,2,3-TRICHLOROPROPANE, used as a polymer cross-linking agent, paint and varnish remover, solvent and degreasing agent. It has also been detected in drinking and ground \Jwater\j in various parts of the United States, according to the report.\p
#
"Will the heart ever be safe again?",622,0,0,0
(May '98)
A report in \ICirculation: Journal of the American Heart Association\i during May suggests that infection by a particularly strong strain of \Jbacteria\j normally associated with \Jstomach\j ulcers could be a contributing factor to \Jheart disease\j. The bacterium, \IHelicobacter pylori\i, was found in 62% of people with \Jheart disease\j and only 40% of those without the disease.\p
The study looked at 88 patients who had ischaemic \Jheart disease\j, which causes heart attacks and is the result of poor blood flow to the heart. The control group consisted of 88 patients who did not have \Jheart disease\j. Every effort was made to match the groups on body mass index - a measure of fatness - and socioeconomic class. Fatness can be related to \Jheart disease\j as well, while poorer people tend to have more infections than those who are well-off.\p
The key seems to be a gene, found in some of the \Jbacteria\j, called CagA. Around 43% of people carrying \IH. pylori\i strains with the CagA gene had \Jheart disease\j, compared with 17% of those infected with \IH. pylori\i strains lacking the gene.\p
The study looked at four possible infectious causes of \Jheart disease\j, all of them capable of causing "a smoldering \Jinflammation\j": \IChlamydia pneumoniae\i, cytomegalovirus and herpes, as well as \IH. pylori\i.\p
\BKey name\b: Vincenzo Pasceri\p
#
"Worried about mad cow disease?",623,0,0,0
(May '98)
The history of the disease, the latest research and government regulations surrounding \B\1bovine spongiform encephalopathy\b\c (BSE) and Creutzfeldt-Jakob disease can now all be found on one web site: \Bhttp://w3.aces.uiuc.edu/AnSci/BSE\b\p
\BKey names\b: Jan Novakofski, Susan Brewer, Richard Wallace\p
#
"Autoimmune diseases or the results of pregnancy?",624,0,0,0
(May '98)
One of the amazing things about our bodies is the way a female body can withstand a tumor, a group of foreign cells called a baby, without rejecting it. Half of the genetic complement of a baby, the father's contribution, is foreign to the mother, yet her body usually carries the baby to full term. Now it appears that this tolerance may be a mixed blessing, that it may even backfire on the mother.\p
Diseases which have until now been listed as autoimmune diseases may in fact be triggered by fetal cells which remain within the mother's body after birth - for as long as 27 years! Systemic sclerosis is a disease which affects mainly women between 30 and 60, and a late April study in the \INew England Journal of Medicine\i revealed that researchers had found fetal immune cells in skin lesions taken from women with systemic sclerosis.\p
The search began in 1996 when fetal cells were identified in women who had given birth to sons. As women do not have a Y \Jchromosome\j but their sons do, it was reasonable to infer that any cells in the mother which carry DNA sequences from a Y \Jchromosome\j are cells derived from the fetal son. A February report in the \ILancet\i linked a related condition, scleroderma, to fetal cells, reporting that women with scleroderma had more than ten times as many fetal cells in their blood as women without the disease, but other researchers had expressed doubt about the report's conclusions.\p
Now the new result strengthens the original conclusion, which was that scleroderma is analogous to graft-versus-host disease (GVHD), a possible complication of bone marrow transplantation. In this condition, immune cells called \IT\i-cells from the donor bone marrow attack the tissues of the recipient, causing some of the same symptoms as those seen in systemic sclerosis.\p
If future studies confirm that fetal \IT\i-helper cells are to blame, researchers might one \Jday\j use cytokine-blocking drugs to treat scleroderma and systemic sclerosis. Questions remain, though: why do only some women develop the disease? What triggers the disease condition? And what causes systemic sclerosis in the few men who get the condition?\p
#
"Circadian clockworks uncovered",625,0,0,0
(May '98)
A late May report in the \IProceedings of the National Academy of Sciences\i describes the discovery of two proteins which help set the body's daily clock, the \B\1circadian rhythm\b\c. The proteins, called cryptochrome (CRY) 1 and 2, were found in almost all the body cells studied. The study began when researchers working on the human \Jgenome\j project found an unexpected gene which was similar to a gene that encodes photolyase, an \Jenzyme\j involved in repairing DNA damaged by ultraviolet light.\p
Since humans do not have photolyase, the researchers began to seek out the genes and the proteins they encode. CRY1 and CRY2 turned up in layers of the \Jretina\j which are not involved in forming visual images. When the image-forming parts of the \Jretina\j are destroyed, the victims are blinded, but they still respond to daylight, sleeping and waking in a normal pattern. If the optic nerve is severed, this circadian rhythm is lost, showing that most of the detection happens in the eyes.\p
The researchers, Yasuhide Miyamoto and Aziz Sancar, believe that CRY1 and CRY2 in the \Jretina\j, and also in the skin, could be involved here. It would explain a recent finding that light on the back of the knees can reset the body's "clock" (see \BSeeing the light (2),\b January 1998).\p
Previous researchers had concluded that this light detection was carried out by the normal visual \Jpigments\j, the opsins, which are related to vitamin A. Now CRY1 and CRY2, both related to vitamin B2, can be assigned this role. More importantly, the finding may matter for people suffering from seasonal affective disorder, or SAD, during high-latitude winters, and for drowsy industrial workers on the midnight shift. American industry has collected data showing most accidents happen during this shift, probably because people's circadian clocks have told them that it is time to slow down, and mistakes are more likely.\p
Then again, cancer treatment may improve, as both beneficial and side effects of anti-cancer drugs can depend on what time of \Jday\j they are administered.\p
#
"Curing leprosy",626,0,0,0
(May '98)
The WHO reported during May that more than ten million lepers (sufferers of \B\1Hansen's disease\b\c) have now been cured by the combination of drugs known as multidrug therapy (MDT). Usually involving three drugs, rifampicin, dapsone and clofazimine, MDT continues to be highly effective in curing \Jleprosy\j completely, with very low relapse rates of around 0.3 per 1000 cases per year.\p
Back in the 1970s, the bacillus that causes the disease, \IMycobacterium leprae\i, was fast developing resistance to the single drug used then, dapsone. So far, no case has been reported where the bacterium has been resistant to all three drugs.\p
Elimination of the disease will come, provided its prevalence can be reduced to less than one case per ten thousand people, right across the world. In 1985, there were 122 countries with levels higher than this; now there are just 32, in which there are probably 1.5 million cases still waiting to be detected and treated.\p
Once that is achieved, perhaps we will be able to stop calling \Jleprosy\j "Hansen's Disease" to spare the embarrassed victims.\p
#
"TB kills women",627,0,0,0
(May '98)
The WHO had worse news about \B\1tuberculosis\b\c, reporting that TB is now the single biggest infectious killer of women in the world. Over 900 million women and girls are infected with TB worldwide, one million will die and 2.5 million will get sick this year from the disease. This makes TB the single leading cause of deaths among women of reproductive age.\p
In wealthy countries, the disease is more typically found in older men. In industrialized countries, one quarter of all TB cases occurs in the over-65s, compared with only 10% in developing countries of \JAfrica\j, Asia and Latin America, where 60% of all cases are in men and women of reproductive age.\p
Worldwide, TB accounts for 9% of deaths among women aged between 15 and 44, compared with war which kills 4%, HIV 3% and \Jheart disease\j which kills a further 3%.\p
#
"Transgenic calves",628,0,0,0
(May '98)
A report in \IScience\i during May revealed that we could soon have herds of transgenic cows producing copious amounts of milk rich in therapeutic proteins. The signal for this prediction is the news that scientists have cloned three calves that carry a foreign gene.\p
You can modify an animal by injecting new DNA into a fertilized egg, but more often than not, the gene is not expressed, so the adult animal does not produce the desired protein. An alternative is to tuck the desired gene into the nucleus of a developed and differentiated cell, and then move that nucleus into an egg which has had its own nucleus removed. This was done late last year, when a modified sheep was produced carrying the gene for Human Factor IX, which some victims of \B\1haemophilia\b\c take to aid blood clotting (see \BAfter Dolly, it's Polly\b, December 1997).\p
Cows give more milk than sheep, so now the same \Jcloning\j technique has been applied to cows. The trick was to add a marker gene, tied to a gene for resistance to the chemical neomycin, to a culture of cells called fibroblasts. Then, when the genes had been given time to settle in, the culture was treated with neomycin, killing the cells which lacked the resistance gene (and, by implication, the linked marker gene).\p
The modified nuclei were then added to egg cells, with 28 eggs going into 11 cows to produce three genetically identical calves, all carrying the marker gene. Next move: producing a cow which secretes albumin into the milk, with likely yields of 80 kg a year.\p
#
"The deaf mouse that isn't",629,0,0,0
(May '98)
Sebastian, born June 23, 1997, ought to be a deaf mouse. He is, after all, a shaker-2 mouse, a descendant of a strain of mice started with a mouse exposed to x-rays as far back as 1928. This radiation produced a \Jmutation\j on mouse \Jchromosome\j 11 which gives the inheritors balance problems and renders them deaf.\p
But Sebastian is not deaf, because he was given gene therapy, an application of genetic \Jengineering\j technology. A report in \IScience\i, late in May, describes how short sections of normal cloned DNA were injected into fertilized mouse eggs. The researchers then waited to see which DNA clone produced a mouse with normal hearing.\p
The point of the exercise was to identify which short section of DNA carried the normal form of the gene (known to geneticists as the "wild-type", because animals with defective genes are rare in wild animal populations). Once the segment was identified, they were able to track down the actual gene involved.\p
This is the first time that genetic \Jdeafness\j has been permanently corrected, and it is the fifth \Jdeafness\j-related gene to be found which matches a human gene. Another paper in the same issue of \IScience\i indicates that the human equivalent gene (DFNB3) is on \Jchromosome\j 17. Mutations in this gene may well be responsible for human congenital \Jdeafness\j.\p
There are probably at least another twelve genes for \Jdeafness\j still to be identified. In Sebastian's case, the wild-type gene and the mutant form differ by just one single base-pair in more than 30 thousand base-pairs. That one change is enough to disrupt the formation of a myosin \Jenzyme\j called Myo15, which seems to be involved in inner ear development, probably by hauling around a chemical called actin, though this statement is no better than well-informed conjecture.\p
\BKey names\b: Frank J. Probst, Sally A. Camper, Yehoash Raphael, Thomas B. Friedman (DFNB3)\p
#
"Archaeans fix DNA better",630,0,0,0
(May '98)
Extremophiles (see \BSome like it hot\b, February 1998) are microbes which survive in extreme environments. These organisms are grouped into the Archaea, the so-called third domain of life. A recently described piece of work provides an interesting insight, both into the nature of the extremophiles, and also into the art of genetic investigation.\p
One of them, \ISulfolobus acidocaldarius\i, grows optimally at around 80°C in hot sulfur springs with a pH of about 3, which is distinctly acidic. Beyond that, researchers know very little, other than that the biochemistry of such living things needs to be strange, for at that \Jtemperature\j, proteins like egg-white coagulate, and the DNA of less macho life forms unzips into separate strands.\p
In previous work, Dennis Grogan at the University of \JCincinnati\j demonstrated that the microbe can use visible light to repair DNA damaged by ultraviolet light. Now, he is searching for tools to study genetic recombination in \ISulfolobus\i.\p
He began by seeking auxotrophs, mutant forms which are unable to synthesize some chemical or other, and so need to obtain it from the environment. From about 15 thousand mutagenized colonies, he got around three dozen useful auxotrophic mutants. While these were hard to get, mutant forms of a gene called pyrF were much easier: he and his student, Michelle Reilly, have gathered around 200 of them.\p
Once he had cells with more than one \Jmutation\j, Grogan and his students set up classical genetic recombination experiments. \ISulfolobus\i is able to exchange genetic material with other cells through a process known as conjugation, where a bridge forms between two cells, and genetic material passes across, to be incorporated into the receiving cell.\p
The recombination results do not make sense if you try to apply the standard rules that geneticists have learned from experiments with \IEscherichia coli\i, and it seems likely that \ISulfolobus\i is undertaking continual repairs to itself. This is consistent with the fact that the background \Jmutation\j rate in \ISulfolobus\i is nearly identical to that seen in \IE. coli\i, even though \ISulfolobus\i lives in an environment much more likely to damage DNA. So it looks as though the extremophiles may have a number of interesting secrets still to reveal to us.\p
#
"Why are there so many genes for cystic fibrosis?",631,0,0,0
(May '98)
Older geneticists (and most young geneticists) know that there has to be a reason for fatal genes to be common. In scientific terms, they all realize that standard evolutionary pressures should soon sweep away dangerous genes, unless there is some advantage in having those genes around, an advantage that outweighs the disadvantage that comes with the gene.\p
Diseases like \B\1thalassemia\b\c and \B\1sickle cell disease\b\c owe their survival to the fact that having one copy of the faulty gene gives protection against \Jmalaria\j, so even though having two "doses" is unhealthy at best and fatal at worst, having no "doses" is just as bad, so that people with just one faulty gene, the heterozygotes, are favoured for survival, keeping the faulty gene in the population.\p
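A minimal sketch of the arithmetic behind this balance, using the standard population-genetics result for heterozygote advantage; the fitness costs below are illustrative assumptions, not measured values:\p
    # With genotype fitnesses AA: 1-s, Aa: 1, aa: 1-t, selection holds
    # the harmful allele a at the equilibrium frequency q = s / (s + t).
    # The fitness costs s and t here are illustrative assumptions.
    def equilibrium_freq(s, t):
        return s / (s + t)

    # A 3% disadvantage for non-carriers against a lethal homozygote:
    q = equilibrium_freq(s=0.03, t=1.0)
    carriers = 2 * q * (1 - q)  # Hardy-Weinberg heterozygote frequency
    print(f"allele: {q:.3f}, carriers: {carriers:.3f}")  # 0.029, 0.057
With inputs like these, roughly one person in eighteen is a carrier, which is just the kind of steady-state figure discussed below.\p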
Now American researchers think they may have found the reason why one person in twenty carries the gene for cystic fibrosis. The gene which causes the disease seems also to give the carrier protection against \B\1typhoid fever\b\c \Jbacteria\j. Each year, populations like those of the United States or \JAustralia\j produce one child with cystic fibrosis for every hundred thousand people, or about 2500 children in the US, and 200 in \JAustralia\j.\p
In the middle of the twentieth century, most of these children would have died at the age of 1 or 2; now most of them will live to about the age of thirty. All of these people have two faulty copies of the gene that encodes a protein called the cystic fibrosis transmembrane \Jconductance\j regulator (CFTR). The abnormal CFTR blocks the movement of chloride ions and \Jwater\j in the lungs, gastrointestinal tract, and other tissues and causes them to secrete large amounts of mucus. As mucus accumulates in their lungs, these children become increasingly susceptible to life-threatening respiratory infections.\p
Normal CFTR binds to \IPseudomonas aeruginosa\i and clears it from the lungs, and children with cystic fibrosis, deprived of this protection, tend to have their lungs infected with \IP. aeruginosa\i. The researchers were looking for other \Jbacteria\j that might be bound by CFTR in the same way, and discovered that in people with normal CFTR, the protein also acts as a receptor for \ISalmonella typhi\i, the gastrointestinal pathogen that causes typhoid fever.\p
In tissue culture experiments, they found that human cells expressing normal CFTR took up significantly more \IS. typhi\i than did cells producing mutant CFTR. In other words, in people with mutant CFTR, \IS. typhi\i has trouble finding its way into the body. Normally, this uptake does not matter, as the \Jbacteria\j which are taken up are consumed and destroyed, but at high levels, some of the \Jbacteria\j make it through into the body's tissues, and an infection can start.\p
The researchers offer an interesting speculation: now we know how \IS. typhi\i is ingested, it might be possible to produce a non-pathogenic strain, a version which did not cause disease, which could be used to carry vaccines into the body. We have probably not seen the last of this line of enquiry.\p
#
"Treating the sick with viruses?",632,0,0,0
(May '98)
If the thought of being injected with \ISalmonella\i to cure your disease leaves you cold, how about being injected with a virus? This is one implication arising from a paper appearing in \INature\i in mid-May. Trevor Douglas and Mark Young described how they used a "gating mechanism" in the protein coats of some simple viruses which allows them to admit and expel organic and inorganic material.\p
Using these "reversible structural transitions" technicians will in the near future be able to remove the genetic component of a virus (the DNA or RNA that allows the virus to reproduce) and use the remaining protein coat as a container and delivery system for other substances.\p
Viruses are natural containers for the introduction of organic chemicals into cells. Douglas and Young say they have simply subverted this natural function. But learning to think of viruses as substances that can be taken apart, purged of genetic material, reassembled and used as couriers of selected substances is a significant challenge to conventional thinking.\p
In their study, the researchers loaded a substance similar to \Jheparin\j, a drug used to treat thromboses (blood clots), into a virus which attacks cowpeas. This virus is known to be host-specific, and not to attack humans. More importantly, people ingest large quantities of this virus when they eat cowpeas, and no harm comes to them. In any case, the emptied protein coat contains no genetic material, making it completely harmless.\p
Not content with this breakthrough, Douglas and Young report that they have used the protein coats of viruses as "size and shape constrained" chambers in which minerals crystallize in very specific and precise dimensions: they believe that it may be possible in the future to use this method to make an unlimited number of homogeneously sized crystals. This would be extremely useful in producing miniaturized semi-conductors and other nanotechnology devices.\p
#
"Jellyfish sugar?",633,0,0,0
(May '98)
Some \Jjellyfish\j are able to glow in the dark, and Australian scientists at the CSIRO have been using a gene taken from these \Jjellyfish\j to develop better sugar for the Australian sugar industry. The glow-in-the-dark substance is called green fluorescent protein (GFP) and is made by a naturally-occurring gene in \Jjellyfish\j called gfp.\p
The gfp gene is non-toxic; even so, it will not be used in crops that humans consume, but it will be an invaluable tool in improving the existing strains of sugar cane. Basically, it will be used to see whether new genes, inserted into the sugar cane, are working properly or not. It would matter little if the gene were used in sugar production: sugar chemists have long claimed that white sugar is far and away the purest substance found on supermarket shelves.\p
All the researchers have to do is shine a blue or ultraviolet light on plant cells containing the protein, and watch through a \Jmicroscope\j equipped with a special filter to see if the cells glow. This takes about 10 seconds, compared with other methods that can take a couple of days and involve very expensive equipment.\p
The glow-in-the-dark gene acts as a marker gene. It has the advantage that it is easily switched on, and can be seen working right away. This means the \Jjellyfish\j gene is extremely quick to use, compared with other gene markers, and enables scientists to check more samples for the best results at a far lower cost. In the future, CSIRO Tropical Agriculture may use the gfp gene to improve other crops such as sunflowers, \Jbarley\j and mung beans.\p
#
"Left side, right side",634,0,0,0
(May '98)
A Californian woman, known only as V. J., asked surgeons to cut the communication links between the two halves of her brain, in the hope that this would give her some relief from uncontrollable epileptic seizures.\p
The good news is that she gained some relief. The interesting news relates to the way this left-handed woman's brain manages speaking and writing. Kathleen Baynes first described the case in 1996; in a paper published in \IScience\i during May, Baynes and her colleagues have added to our information on it, as well as on two right-handed cases.\p
The right-handed subjects can write words displayed to the left side of the brain, but not words displayed to the right, which is only to be expected, since the left side of the brain controls speaking, reading and writing in most people. But V. J. is different: she can write words displayed to the right side of her brain, but not words displayed to the left.\p
It seems that the left side of V. J.'s brain controls reading aloud and speaking, while writing is controlled by the right side of her brain. This, says Baynes, is the first direct evidence that the right hemisphere might control writing in left-handed people, but more importantly, she suggests that it implies that spoken and written language evolved independently in humans, since they are not inherently linked.\p
Another report, published in \INeuron\i during May, shows that when you are told to remember a word, you activate a region on the left side of your brain. When you are told to remember an unfamiliar face, something to which you can't attach a name, you activate a region on the right. If, on the other hand, you are asked to remember an object which has a name, both regions of the brain become active.\p
William M. Kelley, working under Steven E. Petersen, reminds us that there are three stages to memorizing: \Iencoding\i, when you take the information in and make it available for memory; \Istorage\i, where you make a longer-lasting change in the brain; and \Iretrieval\i, when you bring the information back again. They liken these stages to editing a document on screen, saving it to the hard drive, and opening the file at a later date.\p
Here, they were looking at encoding. Studies on patients with brain damage suggest that people use the left side of the brain for language tasks and the right side for handling spatial and pictorial information. Yet if you make images of the brain as it works, it looks as though the left side is used for memorization and the right for retrieval.\p
Petersen noticed, however, that most of these images were made while subjects were memorizing words or sentences. He designed a study to test the effects of nonverbal information. Then, using functional magnetic \Jresonance\j imaging (fMRI), Kelley and colleagues performed two experiments. In the first, they imaged five subjects who were asked to memorize written words, line drawings of objects such as frogs and ladders, and pictures of faces they would be unable to name. Subsequent testing showed they remembered these words, nameable objects and unfamiliar faces very well, though they did best with nameable objects.\p
Put a finger on top of your head, then move it to the front between your left ear and your hair's midline, and you will be near the upper part of the left frontal lobe. This is the region where words are remembered. Performance was much better, however, with nameable objects, which triggered activity on both sides of the brain.\p
#
"The crab that came in from the wet",635,0,0,0
(May '98)
\JJamaica\j is home to the world's most land-loving crab, a thin and delicate species called the bromeliad crab. These tiny crabs, less than an inch (2.5 cm) long, are thin enough to squeeze in between the leaves at the base of the \B\1bromeliad\b\c plant, where rainwater collects. They live in these little pockets of rainwater inside bromeliad plants, which grow on the branches of tropical trees.\p
According to a report in \INature\i in late May, the bromeliad crabs have evolved rapidly to fit their strange lifestyle. These crabs are the only ones known that actively feed and care for larvae and juveniles during the several months they spend in their rainwater nursery. The mother crab manages the \Jwater\j quality by removing debris, by circulating the \Jwater\j to add oxygen to it, and by carrying empty snail shells into the \Jwater\j to buffer the pH levels and add \Jcalcium\j.\p
With so many behavioral and anatomical differences, these crabs would seem to be far away, in evolutionary terms, from other crabs - as much as 50 million years away, according to some estimates. Some workers even believed that the crab had migrated somehow from South-east Asia or \JIndonesia\j, where there are some freshwater species that also care for their young, although not to the unusual degree of the bromeliad crab.\p
The researchers collected 22 crab species from \JJamaica\j and surrounding areas, including Venezuela and Panama, and sequenced two genes from each of these species, about a thousand base pairs in each case, enough to provide statistical evidence that the Jamaican land crabs (there are nine species, including the bromeliad crabs) are a single group, unrelated to any other crabs. The group, say the researchers, has evolved from one common Jamaican marine ancestor very recently, about 4 million years ago.\p
The genes used in the study have been steadily mutating over the ages, giving a molecular clock that records the ages of the species in relative terms. If such a clock can be calibrated, it can be used to provide absolute dates. In any genetic sequence, there are a number of base pairs which can be varied with no effect on the activity of the gene under study.\p
Over time, changes come and go, so that two gene sequences which were once identical slowly drift apart. All we need is a starting point that can be identified from reliable geological evidence, and we have our clock. In this case, the rise of the Panama land bridge between North and South America, just 3.1 million years ago, separated the various species of marine crabs into two breeding groups: the crabs living on the Caribbean side of Panama and those living on the Pacific side. All that was required then was to look at the \Jmutation\j rates for related species, and the whole question was resolved.\p
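A minimal sketch of that calibration arithmetic, in Python; the divergence percentages are invented for illustration, and only the 3.1-million-year calibration date comes from the story:\p
    # Calibrate: species pairs separated by the Panama land bridge
    # 3.1 million years ago show some measured sequence divergence.
    def clock_rate(divergence, split_age_my):
        return divergence / split_age_my  # divergence per million years

    def divergence_time(divergence, rate):
        return divergence / rate          # age of a split, in My

    # Suppose trans-Panama pairs differ at 6.2% of their sites:
    rate = clock_rate(0.062, 3.1)                 # 2% per million years
    # A land crab differing from its marine relative at 8% of sites:
    print(round(divergence_time(0.08, rate), 1))  # 4.0 million years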
From the clock, it appears that the Jamaican land crabs began evolving only 4 million years ago, at a time in \JJamaica\j's geologic history when the land had risen far enough out of the sea to provide new ecological niches for the ancestral marine crab that began evolving strategies for living entirely on the land. The clock also indicates that the closest relative of the Jamaican land crabs is a Jamaican marine crab.\p
Such rapid adaptation to a new ecological niche and rapid radiation of new species is not thought to be common in nature, so this should trigger some interesting new research.\p
\BKey names\b: S. Blair Hedges, Christoph D. Schubart\p
#
"When did humans first walk upright?",636,0,0,0
(May '98)
The evidence for humans walking upright comes in many forms. Early in May, a group of researchers provided excellent evidence for the ages of perhaps the earliest upright human ancestors, \B\I\1Australopithecus\b\c anamensis\i, a species they named in a 1995 paper in \INature\i on the basis of 22 fossils. Now, writing again in \INature\i, they have provided further information, based on another 38 fossils which they have since found.\p
The original fossils were found between layers of rock and ancient soils, but the date of the youngest layer was uncertain, suggesting that perhaps \IA. anamensis\i might not be as ancient as its discoverers believed. In particular, the fossils found in some of the younger deposits were the most human-looking, so some critics said there might be two species instead of just one, an older species and a much younger one.\p
Given the age of the fossils, the best available dating technique is the 40 \JArgon\j/39 \JArgon\j method. The only problem: this method needs crystals, and the youngest layer is mostly powdery ash. But by sifting through the ash, the team managed to get enough good crystals to establish that the fossils are between 4.1 and 4.2 million years old.\p
The new fossils also indicate just how primitive the species is in other ways. The ancient species had primitive jaws shaped more like a \Jchimpanzee\j's than a modern human's rounded "dental arcade", and they showed a high degree of sexual dimorphism, also a primitive characteristic.\p
When a species shows sexual dimorphism, the two sexes are different in appearance, and in primates, that means large males and smaller females, the sort of difference that we can see in species such as the \Jgorilla\j. The standard explanation is that the males have to fight for mates, and that this naturally favours the larger, heavier males in each generation.\p
Part of the competition may also come down to threats. Like gorillas, the male specimens have much larger canine teeth than the females. Human males still have slightly bigger canines than females, on average, with thinner enamel, a characteristic of male apes, which sharpen their teeth to use as weapons; but the difference is much less than in more primitive members of the primates.\p
Most of the specimens are fragments, but even those can tell a story. A \Jfossil\j wrist bone was found, which has the primitive features of a \Jchimpanzee\j's, while the \Jfemur\j (thigh bone) of \IA. anamensis\i is more like a modern human's, with a structure that allowed the species to walk upright on two legs instead of on all fours like a \Jchimpanzee\j.\p
All in all, the evidence now seems to point even more strongly to \IAustralopithecus anamensis\i as our direct ancestor, and as our first ancestor to walk upright.\p
\BKey names\b: Meave G. Leakey, National Museums of \JKenya\j; Craig S. Feibel, Rutgers University; Ian McDougall, Australian National University; Carol Ward, University of Missouri; and Alan C. Walker, Penn State University.\p
#
"A new Madagascan dinosaur sheds new light",637,0,0,0
(May '98)
A new \Jfossil\j find of an extremely ugly (according to its discoverers) \B\1dinosaur\b\c called \IMajungatholus atopus\i is offering us a new view about the movement of the \JEarth\j's continental plates millions of years ago. The \Jdinosaur\j belongs to a family previously found only in South America and India, and it was described in \IScience\i in mid-May. The material comes from a \Jdinosaur\j which flourished in the late Cretaceous, 65-70 million years ago, and it was found in an area about 40 kilometers (25 miles) from Mahajanga, one of Madagascar's largest cities.\p
The find includes tail bones and most of the skull, though it was in pieces spread out over an area 2 meters across. The animal was apparently buried during a flood soon after its death, protecting the remains from scavenging and \Jdecomposition\j, and the fragmentation has allowed scientists working on the find to see far more of internal structures than they would have seen from an intact specimen. There is still enough to say that the animal had a total body length of 7 to 9 meters and that it was the top predator of the time on Madagascar, feeding on the massive long-necked sauropod dinosaurs also found there.\p
\IMajungatholus\i is remarkably similar to the Argentinian \Jdinosaur\j \ICarnotaurus\i. Both belong to the \Jdinosaur\j family Abelisauridae, which lived during the late Cretaceous period, with other members found in India. The family arose at a time when all of the continents were still connected, so we can use its members to test hypotheses about the timing of the break-up of the \JEarth\j's continents. The fact that this family is so widespread is causing scientists to wonder whether their previous views on the break-up of the continents are entirely reliable.\p
Until now, we have assumed that the continents split in a particular pattern, which included South America and \JAfrica\j breaking away as one unit. But that would mean that the animals in South America and \JAfrica\j would be more closely related. Instead, this animal found in Madagascar, an island off the south-east coast of \JAfrica\j, is more closely related to animals found half way around the world.\p
This would make sense if a land bridge connected South America and Madagascar during much of the Cretaceous period, and it is possible that modern \JAntarctica\j may have been the main part of that bridge. If \JAntarctica\j ever turns up abelisaurid fossils, that would support this new theory.\p
\IMajungatholus\i was originally named from an isolated skull fragment thought to belong to a pachycephalosaur, or "dome-headed" \Jdinosaur\j. Now, with a complete skull available, it is clear that it was a serious \Jcarnivore\j and a distant cousin of \ITyrannosaurus rex\i. The finding is also important because \IMajungatholus\i was previously the only pachycephalosaur reported from the southern hemisphere. Now that it has lost that status, the pachycephalosaurs are again restricted to the north.\p
The ugly beast had a very unusual bone structure and facial features that researchers think were used to send visual signals to attract potential mates or threaten potential enemies. Its facial bones were rough and wrinkled and it had a bony bump above its eye sockets. The bump most probably was covered with some sort of epidermal horn, perhaps similar to the horn on a \Jrhinoceros\j.\p
\BKey names\b: Scott Sampson, New York Institute of Technology, Lawrence Witmer, Ohio University, David Krause, State University of New York.\p
#
"World's glaciers still melting",638,0,0,0
(May '98)
Global warming from the \B\1\Jgreenhouse effect\j\b\c continues, and the volume of the world's glaciers outside of \JAntarctica\j and the \JGreenland\j Ice Sheet continues to decline, according to a May report. The smaller, low-latitude glaciers seem to be taking the biggest hit. The glaciers of \JAfrica\j's Mt \JKenya\j have lost 92% of their mass this century, and Mount \JKilimanjaro\j's glaciers have shrunk by 73% in the same time. There were 27 glaciers in \JSpain\j in 1980, but that number has since dropped to just 13.\p
The European Alps have undergone an ice loss of about 50% in the past century, and New Zealand glaciers have shrunk about 26% since 1890. A late May American Geophysical Union meeting held in \JBoston\j was told that only a few hundred of the world's estimated 200,000 glaciers have been studied in detail, but that the pattern is a uniform one.\p
The world's glaciers outside of \JAntarctica\j and \JGreenland\j make up only about 6 percent of the world's total ice mass, but their \Jwater\j is recycled more quickly and contributes more to sea level rise than do the polar ice sheets, so this needs to be a matter for serious concern, the meeting was told.\p
In 1996, the Intergovernmental Panel on Climate Change estimated that the world's oceans will rise by more than 50 cm (18 inches) by the year 2100, with a third of that contributed by melting glaciers and ice caps and more than half by the thermal expansion of the warming oceans. Global warming is definitely with us, it seems.\p
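The breakdown of that projection is simple arithmetic, sketched here using only the fractions quoted above:\p
    # IPCC 1996 projection: >50 cm of sea level rise by 2100, a third
    # from glaciers and ice caps, more than half from thermal expansion.
    rise_cm = 50
    print(f"glaciers and ice caps: ~{rise_cm / 3:.0f} cm")  # ~17 cm
    print(f"thermal expansion:     >{rise_cm / 2:.0f} cm")  # >25 cm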
#
"Global warming and disease",639,0,0,0
(May '98)
Outbreaks of \ICryptosporidium\i, one of the causes of \B\1Traveller's Diarrhoea\b\c, may increase as global warming increases rainfall levels in some parts of the world. Cryptosporidiosis is just one of many waterborne diseases which could increase in prevalence with higher levels of rainfall and flooding, but it is a useful one to study.\p
\ICryptosporidium parvum\i, a protozoan parasite, was first recognized in 1976 as producing illness in humans. Causing a diarrheal disease that lasts for one to two weeks in healthy individuals, \ICryptosporidium\i can be fatal among immunocompromised persons (that is, in people whose immune systems are less than fully effective, such as people with AIDS).\p
There are no effective medications for the disease, which normally runs its course and leaves the patient to recover, and in most parts of the world, tests are not routinely done for the protozoan. A group of economists addressed the American Geophysical Union meeting late in May, explaining that the cost to US society of such an outbreak is about US$211 per person affected. This includes both medical costs and the cost of time lost from work and leisure activities. This becomes serious when you consider that in 1993, during an outbreak in \JMilwaukee\j, \JWisconsin\j, 400,000 people became ill.\p
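Those two figures already put a rough price on the \JMilwaukee\j outbreak, as this one-line calculation shows:\p
    # US$211 per person, 400,000 people ill in Milwaukee in 1993:
    print(f"US${211 * 400_000 / 1e6:.1f} million")  # US$84.4 million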
The US Environmental Protection Agency is currently proposing expanding \Jwater\j treatment requirements to safeguard \Jwater\j supplies from \ICryptosporidium\i. This would have to be a continuing process, as reservoirs for \ICryptosporidium\i exist in livestock, such as \Jcattle\j and dairy cows, and also in wildlife. Run-off from agricultural areas can release the \ICryptosporidium\i oocysts into the drinking \Jwater\j supply.\p
#
"El Ni±o and health",640,0,0,0
(May '98)
A WHO fact sheet, issued during May, draws attention to the health problems caused by El Niño, when the nutrient-rich cold waters of the coastal Humboldt Current are replaced by eastward-flowing, nutrient-poor warm ocean waters from the equatorial Pacific along the west coast of South America.\p
El Niño is linked to a range of natural disasters, including fires in tropical rain forests, floods in the Americas, and \Jdrought\j in \JAustralia\j, Papua New Guinea, the \JPhilippines\j and \JIndonesia\j, all of which cause health problems. The El Niño Southern \JOscillation\j (ENSO) causes a wide range of ecological imbalances across much of the world, and may even be involved in the \Jdrought\j-related \Jfamine\j which is threatening countries such as \JSudan\j.\p
It is already obvious that many disease outbreaks are triggered at the same time. This applies especially to insect vector-borne diseases (e.g. \Jmalaria\j, Rift Valley fever) and epidemic diarrheal diseases (e.g. \Jcholera\j and shigellosis). Less well documented, but of increasing interest, are the effects of ENSO on \Jdengue\j. This largely urban disease, present in tropical regions around the world, is spread by mosquitoes that breed in artificial containers. Thus, in addition to climatic factors, changes in domestic \Jwater\j storage practices, brought about by disruption of regular supplies, will also influence patterns of transmission.\p
#
"Galapagos penguins dying off",641,0,0,0
(May '98)
The world's most northerly \B\1penguin\b\c lives in the waters off the \B\1Galapagos Islands\b\c, just south of the equator in the Pacific Ocean off South America. Like all penguins, they rely on fish for food, and now they appear to be dying off. Since 1970, the population has halved, and this is being blamed on the increasing number and strength of warm-\Jwater\j El Niño events, along with a decline in colder-water La Niña events.\p
The May issue of \ICondor\i describes some of the problems the penguins are facing, as seen by Dee Boersma, a University of Washington \Jzoology\j professor and a leading authority on temperate- and equatorial-zone penguins. On her most recent visit, Boersma saw dead marine iguanas and sea lions, undernourished flightless cormorants, and a generally emaciated penguin population in which no juveniles were seen.\p
The marine life forms in the area rely on an upwelling of the cold, mineral-rich waters of the Cromwell Current, which is forced to the surface by underwater volcanoes in the area of Fernandina and Isabela islands. This mineral input is needed to maintain the \Jalgae\j and seaweeds which the iguanas eat, and to support the plant plankton on which the rest of the food chain depends. The marine iguanas have been able to turn to foraging on land, where the extra rains have turned the islands green and lush, but the penguins have no such choice.\p
More importantly, the penguins molt before they breed, and if food is not available during this time, the adults can die much more easily, but survival through the molt does not guarantee that the penguins will be able to raise young. If food is short, the adults may not even attempt to lay eggs.\p
In 1971, during a La Niña event, Boersma found that 80 penguin chicks had fledged from 62 nests. The following year, during an El Niño, she found that all but one of the 92 nests failed in the winter breeding period, and all 108 nests failed in the fall breeding period.\p
Just recently, Boersma saw 100 penguins starting to molt, which is a hopeful sign, but the penguins will need a number of good years to get back to where they were twenty years ago. At the moment, there is no severe risk of \Jextinction\j, but small populations are always more at risk than large populations.\p
#
"X-ray vision?",642,0,0,0
(May '98)
A new table-top \Jlaser\j, capable of generating a coherent beam of x-rays, has been revealed in a late May issue of the journal \IScience\i. The breakthrough is important, because it will allow University of \JMichigan\j researchers to generate a focused beam of x-rays that could be incorporated into a device for atomic-scale imaging.\p
Forget the "Superman" variety of x-ray vision: what we need is a system of generating controlled radiation at very short wavelengths, so that very small objects can be examined with it. That is exactly what Andy Rundquist, Charles Durfee, and electrical \Jengineering\j professors Henry Kapteyn and Margaret Murnane and their colleagues have achieved. The \B\1laser\b\c emits x-rays with a wavelength of just 20 nm, but there is every prospect of bringing this down as low as 2 nm.\p
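To put those wavelengths in perspective, here is a quick back-of-envelope check in Python (our illustration, not part of the report), using the standard conversion between photon energy and wavelength, E [eV] = 1239.84 / wavelength [nm]:\p
# Photon energies for the wavelengths quoted above (illustrative only).
HC_EV_NM = 1239.84  # h*c expressed in electronvolt-nanometers

for wavelength_nm in (20.0, 2.0):
    energy_ev = HC_EV_NM / wavelength_nm
    print(f"{wavelength_nm:4.0f} nm -> {energy_ev:5.0f} eV")
# 20 nm works out near 62 eV (extreme ultraviolet); 2 nm is near 620 eV,
# well into the soft x-ray band needed for atomic-scale imaging.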
This work has the potential to give chemists a close-up view of the dynamics of atoms during reactions with other atoms, and it could open a real-time window for biologists onto microscopic events at the cellular level.\p
#
"Defeating dust mites",643,0,0,0
(May '98)
Dust mites, a problem for asthmatics and \Jallergy\j sufferers, need moisture if they are to survive. Now the trick used by male dust mites to survive dry spells has been uncovered: they cluster together to conserve \Jwater\j.\p
A report in a recent issue of the journal \IExperimental & Applied Acarology\i suggests that if researchers can figure out what makes dust mites cluster, they might be able to find new ways to kill these microscopic arachnids, which excrete proteins that can trigger \Jasthma\j and \B\1allergy\b\c attacks in people.\p
Researchers have known for about 30 years that dust mites snack on naturally-shed human skin cells, and about 25 years ago, it was discovered that dust mites don't drink \Jwater\j; they suck molecules of it from the air. They also lose \Jwater\j from the surface of their hard outer shells. That is why dry environments are particularly inhospitable to dust mites.\p
Female dust mites are larger and hardier, but the smaller males seem to need to group together, and it may be that they do this with pheromones, chemicals which attract them together. If this is so, pheromone traps are possible, and pheromone sprays could confuse them to the extent that they dry out and die.\p
#
"Identifying refrozen food",644,0,0,0
(May '98)
How can you tell whether frozen food was allowed to thaw while it was being transported, if it was later refrozen? In the future, one answer may be to check for a color change in a very inexpensive thaw indicator placed in the food package.\p
The invention, a byproduct of a solar research project, has just been patented at Sandia National Laboratories in the US. It depends upon an inexpensive "smart material": a thin wire that "remembers" multiple shapes and acts as a sensor. The indicator changes color if temperatures rise above 0°C (32°F), but it doesn't change back if the \Jtemperature\j then drops below freezing.\p
Using no power source except warming or cooling, the wire changes shape markedly and powerfully at appropriate temperatures. When warmed, movement of the wire tears a green paper covering to reveal a red layer beneath. When cooled, the wire returns to its original position, but because the paper is torn, the warning color remains visible.\p
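The essential behavior is a one-way latch: once tripped, the indicator stays tripped. A minimal sketch in Python (our illustration of the logic, not Sandia's design) shows why refreezing cannot hide a thaw:\p
# A thaw indicator modeled as a one-way latch (illustrative sketch only).
class ThawIndicator:
    def __init__(self, trip_temp_c=0.0):
        self.trip_temp_c = trip_temp_c
        self.tripped = False              # torn paper = permanent record

    def expose(self, temp_c):
        if temp_c > self.trip_temp_c:
            self.tripped = True           # wire moves and tears the green paper

    def color(self):
        return "red" if self.tripped else "green"

indicator = ThawIndicator()
for temp_c in (-18.0, -5.0, 2.0, -20.0):  # thaws at 2 C, then is refrozen
    indicator.expose(temp_c)
print(indicator.color())                  # "red": the thaw is still recorded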
#
"High-redshift gamma-ray burst",645,0,0,0
(May '98)
Gamma ray bursts continue to be in the news, and now most scientists are convinced that these bursts come from the most remote parts of the universe. About one burst is detected per \Jday\j, and each releases perhaps as much \Jenergy\j in 10 seconds as the \JSun\j emits in its entire 10-billion-year lifetime. The emphasis is now shifting from asking what the bursts are to asking what causes them.\p
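That comparison is easy to check roughly (our arithmetic in Python, using the Sun's present luminosity of about 3.8 x 10^26 watts, a figure not given in the report):\p
# Total solar output over a 10-billion-year lifetime, rough estimate.
SOLAR_LUMINOSITY_W = 3.8e26            # watts
YEAR_S = 3.156e7                       # seconds in a year

lifetime_output_j = SOLAR_LUMINOSITY_W * 10e9 * YEAR_S
print(f"{lifetime_output_j:.1e} J")    # about 1.2e44 joules
# Radiated over 10 seconds, that is roughly 1.2e43 watts, some 3e16 times
# the Sun's present output, packed into a single burst.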
GRBs, as they are called, are brief flashes of high-\Jenergy\j radiation that appear on average about once a \Jday\j at an unpredictable time from unpredictable directions in the sky. Since their discovery (by accident) in the late 1960s, several thousand bursts have been detected, most of them with BATSE (the Burst and Transient Source Experiment), on board the Compton Gamma Ray Observatory.\p
Until 1991, when BATSE went up, most astronomers thought the GRBs came from local sources in our own galaxy, and that they were all associated with neutron stars, objects with a mass similar to that of the \Jsun\j, and a diameter of about 10 km. These have huge gravitational and magnetic fields that would make them ideal sources for the gamma-ray bursts.\p
But if the GRBs came from our galaxy, they would be concentrated in the plane we call the Milky Way. Instead, the BATSE observations show us that GRBs are distributed uniformly in space, with no concentration toward the plane of the Milky Way, toward its center, or in other clumps. In other words, the GRBs have to be coming from sources uniformly scattered through the universe, or maybe from some kind of very large spherical halo (or corona) around the galaxy.\p
The halo idea is easy to test, since the \JEarth\j is on one side of the galaxy, and this should produce an asymmetry, though a huge halo could make this asymmetry disappear. It would need to be about a million light years across, compared with the Milky Way itself, which is about 100,000 light years across.\p
There were hints that weaker bursts came from further away, much further away than any halo, but really nailing it all down required linking a GRB to an optical object, and measuring its redshift. This is not easy, because x-ray telescopes can only pinpoint an area 6 arc-minutes (1/5th of the moon's diameter) by 6 arc-minutes. In astronomical terms, this is still a very large "box" to search for a possible distant object.\p
The Wide Field Camera (WFC) on board the Italian-Dutch X-ray \Jsatellite\j BeppoSAX can perform to this standard but, more importantly, can provide a position within a matter of hours, much faster than was possible before. The WFC can search a significant piece of sky in a 40-by-40-degree field of view, and then home in on a gamma source.\p
The first result was in February 1997 (see \BSmall galaxy disappears\b, March 1997). This was matched to a visible source, and so we had our first candidate. It was also matched to a decaying x-ray source in the same region.\p
A pair of Hubble Space \JTelescope\j (HST) sightings made it appear that the source was distant, but still the answer was not nailed down. Now, with a number of GRBs having been linked to optical sources with significant redshifts, the case for the bursters being very distant objects seems fairly secure.\p
#
"Sunquake",646,0,0,0
(May '98)
A moderate-sized solar flare on July 9, 1996, has now been analyzed, and it appears to have created seismic waves in the \JSun\j's interior that closely resembled those created by terrestrial earthquakes. The \Jenergy\j involved has been estimated at \Jmagnitude\j 11.3, or something like 40 thousand times the force of the 1906 San Francisco \Jearthquake\j.\p
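The "40 thousand times" figure can be reproduced with the standard Gutenberg-Richter energy relation, log10(E) = 1.5M + 4.8 (E in joules). This is our sketch, not the researchers' own calculation, and it assumes the older magnitude estimate of about 8.25 for the 1906 quake:\p
# Energy ratio between two magnitudes via the Gutenberg-Richter relation.
def quake_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

ratio = quake_energy_joules(11.3) / quake_energy_joules(8.25)
print(f"{ratio:,.0f}")   # roughly 38,000: "something like 40 thousand times"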
The waves look like ripples in a pond, but the solar waves were 3 km (2 miles) high and, in one hour, traveled a distance equal to 10 \JEarth\j diameters before fading into the fiery background of the \JSun\j's \Jphotosphere\j.\p
While sunquakes have been known for some time, this is the first evidence that solar flares can cause such an effect. A solar flare is caused by a disturbance in the \Jsun\j's magnetic field. It accelerates electrons from the solar corona down into the \Jsun\j's lower \Jatmosphere\j, where they give off x-rays. In 1995, astronomers Valentina Zharkova and Alexander Kosovichev of Stanford University calculated that the impact of the electrons would also cause the gas to expand suddenly as a shock wave. When the shock wave hits the solar surface, they proposed, it could create seismic waves.\p
Now, using data from the orbiting Solar and Heliospheric Observatory (SOHO), they have demonstrated strong upward and downward motions of gas over an area 3000 to 5000 kilometers wide around the flare site.\p
#
"Magnetic quakes shake neutron stars",647,0,0,0
(May '98)
May also saw news of bizarre "crustquakes" on \B\1neutron stars\b\c which appear to fling waves of gamma rays across the Milky Way. A May report in \INature\i reminds us that when the cores of supernovae collapse, some form black holes, but most become pulsars, neutron stars that send out regular pulses of radio waves.\p
Since 1979, astronomers have found several neutron stars which blast out x-rays instead of radio waves, and which unleash irregular bursts of low-\Jenergy\j (soft) gamma rays. These objects may be magnetars, neutron stars girdled by an intense magnetic field, which crushes the stars' crusts once they begin to cool.\p
This notion is supported by new information from the Rossi X-ray Timing Explorer. A neutron star 40,000 light-years away sent out oscillating bursts of x-rays in late 1996, indicating that the star spins just once every 7.5 seconds, much slower than other young pulsars, which spin at least 10 times per second. (The magnetar being studied is believed to be about 10 thousand years old.)\p
So why has the star slowed down? The answer, it seems, is magnetic drag from a field of 800 trillion gauss, 100 times stronger than those around typical neutron stars, or a thousand trillion times as strong as the \JEarth\j's own magnetic field, according to astrophysicist Chryssa Kouveliotou of NASA's Marshall Space Flight Center in \JHuntsville\j, \JAlabama\j. (For comparison, a "\Jrefrigerator\j magnet" typically has a field of around 100 gauss.)\p
Such a powerful magnetic field can wrinkle and crack the brittle surface of the cooling neutron star and send shock waves racing into space, releasing a "trapped fireball" of x-rays and gamma rays. The magnetic field will also heat the surface of the star to around 10,000,000°C (18,000,000°F), and as the field drifts through the crust of the star, it will generate starquakes. But the display won't last long: when the star grinds to a near halt after a few thousand years, its outbursts should fade, because a slow-moving magnetic field cannot accelerate charged particles to high energies.\p
The objects doing this are generally called Soft Gamma Repeaters (SGR) because the bulk of their \Jenergy\j is in low-\Jenergy\j gamma rays (note: "soft" does not mean gentle, as these sources are very powerful indeed! The SGRs typically give off as much \Jenergy\j in a second as the \JSun\j does in a whole year). The magnetar in this study, called SGR 1806-20 by astronomers, was first discovered when it emitted soft gamma ray bursts.\p
Only four "live" SGRs have been located so far, but perhaps 10 percent of all neutron stars have magnetic fields that are strong enough for them to be considered magnetars. As they only last for about 10,000 years, we would not expect there to be too many around at any given time. Kouveliotou says there may be millions of these in the galaxy, dead and invisible to us. A new candidate for the location of the missing \Jdark matter\j? Perhaps not, but this could account for the large number of observed supernovae remnants without detectable neutron stars at their centers.\p
\BKey names\b: Chryssa Kouveliotou, Jan van Paradijs, Stefan Dieters, Tod Strohmayer, Robert C. Duncan.\p
#
"Jinmium gets younger",648,0,0,0
(May '98)
In December 1996, we brought you the breaking story of the art finds in a rock shelter at Jinmium in \JAustralia\j's Northern Territory (see \BModern humans suddenly get older\b). Dates ranging from 116,000 to 176,000 years had been found for a site containing human art. The dates were obtained using a method called thermoluminescence dating (TL), which relies on a clock driven by natural radiation in common minerals like \Jquartz\j.\p
At the time, following the lead of the researchers, we indicated that the dates would need to be confirmed, as the method relies on having no contamination of the sample. Essentially, the TL dating system looks at the glow released by grains of \Jquartz\j taken from the site, but if there are many grains, only an average value can be obtained.\p
While the mineral remains in the dark, radiation bumps electrons into crystal defects, or "traps," at a regular rate. We can read the clock either by heating the sample (TL) or by tickling it with light, called optically stimulated \Jluminescence\j, or OSL. The material glows as the electrons drop back into the lattice; the more intense the glow, the more time has passed since the sediments last saw daylight.\p
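In its simplest form, the age estimate behind both TL and OSL is one division: the total radiation dose recorded by the glow, divided by the annual dose rate at the site. A Python sketch with made-up numbers (not the Jinmium data):\p
# Simplified luminescence age equation (real dating needs many corrections).
def luminescence_age_years(equivalent_dose_gy, dose_rate_gy_per_kyr):
    """Dose in grays; dose rate in grays per thousand years."""
    return 1000.0 * equivalent_dose_gy / dose_rate_gy_per_kyr

# Illustrative only: a 20 Gy equivalent dose at 2 Gy per thousand years
print(luminescence_age_years(20.0, 2.0))    # 10000.0 years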
The assumption at the Jinmium site was that all of the grains on the floor below the rock art had fallen and been covered at about the same time. Any burrowing by animals or other disturbance would upset this, and the head of the team, Richard Fullagar, had indicated that the dating would need to be assessed very carefully. The solution, as we indicated then: "The problem is that the samples may have been contaminated by very old sand grains, so plans are now under way for a new study, in which individual grains will be dated. In this way, any anomalous results will show up rather than being absorbed into an average."\p
Now the results have arrived and, regrettably, the team reports that Jinmium's age is a completely unremarkable 10,000 years.\p
#
"Life on Mars evidence under doubt",649,0,0,0
(May '98)
The original claim that \Jmeteorite\j ALH84001 showed evidence of life on Mars was based on minerals in the \Jmeteorite\j which resembled minerals formed by some \JEarth\j \Jbacteria\j. Now a closer study of iron sulfides in the \Jmeteorite\j appears to contradict this claim. A May report in \IScience\i describes studies on the iron sulfides produced by earthly \Jbacteria\j. The conclusion: if there were \Jbacteria\j involved in making the \Jmeteorite\j's iron sulfides, they were not quite as similar to earthly \Jbacteria\j as people had assumed.\p
The iron sulfide minerals in question are cubic iron sulfide, mackinawite, greigite, and pyrrhotite; ALH84001 contains pyrrhotite. Using transmission \Jelectron\j \Jmicroscope\j observations of two different \Jbacteria\j known for generating magnetic compounds, researchers found clear evidence of mackinawite and greigite, and possibly of cubic iron sulfide, but not of any other sulfides.\p
The pyrrhotite found in the \Jmeteorite\j has been offered as evidence that there must have been life on Mars, but now it seems that the \JEarth\j \Jbacteria\j may actually prevent the mineral from forming. The sequence seems to be that \Jbacteria\j first produce the non-magnetic sulfide, mackinawite, although this may in fact be formed from cubic iron sulfide. In any case, the mackinawite converts to magnetic greigite. Under normal circumstances, the greigite would break down to \Jpyrite\j and pyrrhotite, but it seems that earthly \Jbacteria\j somehow stop this happening.\p
#
"Still no metallic hydrogen",650,0,0,0
(May '98)
In August 1997 (\BAnd where is that \Jhydrogen\j?\b), we reported on the search for a strange form of \Jhydrogen\j which may exist at high pressures. Calculations have shown that the pressure required is about the same as you would find at the center of the \JEarth\j, but a May report from Cornell University says that \Jhydrogen\j, even under such huge pressures, remains in its normal state. At best, the pressure required for this transition is higher than previously thought; at worst, the stuff simply does not exist.\p
The search for metallic \Jhydrogen\j is of enormous importance because of the material's potential as a superconductor. The metal could also be relevant to the study of the interiors of larger planets, such as Jupiter, where metallic \Jhydrogen\j is thought to be in abundant supply.\p
\JHydrogen\j sits above the \Jalkali\j metals in the \B\1\Jperiodic table\j\b\c, and it has been predicted for about sixty years that if sufficient pressure is applied to solid \Jhydrogen\j, it, too, will become an \Jalkali\j metal. A small amount of liquid metallic \Jhydrogen\j was created in 1997 at the Lawrence Livermore National Laboratory in \JCalifornia\j, but it lasted in that state for less than a microsecond. This was done by subjecting \Jhydrogen\j to both high pressure and high \Jtemperature\j.\p
Solid \Jhydrogen\j forms at low temperatures or under compression, but it is an insulator. Theory says that as the pressure increases, the solid decomposes and forms a lattice of protons surrounded by electrons which are then able to flow freely, turning the material into a conductor. According to a May report in \INature\i, "Calculations suggest that depairing (destruction of the molecular bond) should occur around 340 GPa, accompanied by the formation of an \Jalkali\j metal at this pressure or at substantially higher pressures."\p
The report goes on to indicate that there were no signs of metallic \Jhydrogen\j at 342 GPa. (For comparison, the \Jatmospheric pressure\j at the \JEarth\j's surface is about 100 kPa, or one ten-thousandth of a gigapascal, so the pressure attained was about 3.5 million times that at the \JEarth\j's surface.)\p
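A quick sanity check of those conversions (our arithmetic):\p
# Unit check: Earth's atmospheric pressure against the 342 GPa reached.
ATMOSPHERE_PA = 1.0e5       # about 100 kPa at sea level
GIGAPASCAL_PA = 1.0e9

print(ATMOSPHERE_PA / GIGAPASCAL_PA)        # 0.0001: one ten-thousandth of a GPa
print(342 * GIGAPASCAL_PA / ATMOSPHERE_PA)  # 3420000.0: about 3.5 million atmospheres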
Huge pressures like these can only be obtained by compressing the gas in a diamond anvil cell, a small device consisting of pairs of the highest quality diamonds with tips beveled to one-fourth the diameter of a human hair. The diamonds, however, despite their seeming perfection, ultimately cracked. In all, the team cracked 15 pairs of diamonds.\p
The team has previously used a diamond anvil cell to make metallic oxygen, sulfur and \Jxenon\j. Recently, another group of researchers formed metallic sulfur at low temperatures, and found that the material becomes a superconductor at 10 Kelvin (-442°F, -263°C) when the metallization pressure is reached.\p
#
"June, 1998 Science Review",651,0,0,0
\JDeath of Alexander the Great\j
\JLightning a sneaky killer?\j
\JNeutrinos oscillate, so they have mass!\j
\JA new explanation for sonoluminescence\j
\JNanotubes in the news\j
\J18 million chemicals\j
\JMaking better fuel cells\j
\JSimulating the universe\j
\JSupercomputer on the cheap\j
\JBreast cancer risk factors identified\j
\JTaxol\j
\JHow a tuberculosis drug works\j
\JNew TB vaccines\j
\JTB resistance on the rise\j
\JDoes infection cause Tourette's syndrome?\j
\JUsing St John's Wort to reduce alcohol intake\j
\JAthlete's foot: see a specialist\j
\JAnemia treatment in AIDS/HIV\j
\JCapturing the structure of gp120\j
\JHereditary colon cancer\j
\JGene therapy for sickle cell anemia?\j
\JNon-smokers protected by their genes\j
\JNo mad cows today, thanks\j
\JUsing viruses in gene therapy\j
\JCloning cow cells\j
\JAlgae attacking the floor of the Mediterranean\j
\JTurtles screwed\j
\JSumatran tigers a distinct species\j
\JSo why aren't more Australians called Bob?\j
\JGiant SHRIMP dates rocks\j
\JAncient human skull has a modern look\j
\JMr Ples loses weight - in the brain\j
\JWhich came first, the chicken or the dinosaur egg?\j
\JTalking about the weather\j
\JTriton undergoing global warming\j
\JThe best and brightest\j
\JUFOs worth studying say scientists\j
\JNew planet\j
\JLife is everywhere\j
\JNEAR but far\j
\JHubble 'sees' expanding nova shells\j
#
"Death of Alexander the Great",652,0,0,0
(Jun '98)
It may be a little late for the autopsy, but it appears that the ruler of much of the ancient world may not have died of poison, as some believe, or even of \Jmalaria\j, as most medical texts suggest. The culprit, according to a paper in a June issue of the \INew England Journal of Medicine\i, was probably typhoid fever, caused by \ISalmonella typhi\i.\p
Using the available historical records, physicians at the University of Maryland Medical Center say that this is the most plausible explanation. In the week before he died, historical accounts say Alexander had chills, sweats, exhaustion and high fever, all of which are typical symptoms of a number of infectious diseases besides typhoid fever, but he is also described as having severe abdominal pain, causing him to cry out in agony.\p
This is a key symptom: untreated typhoid fever can lead to perforation of the bowel, and this may have been the reason for his abdominal pain. Even more importantly, the ancient accounts say that Alexander's body did not begin to decay for at least several days after his death. This sounds like a piece of legend, but another complication of typhoid fever, called ascending paralysis, could account for it. The paralysis starts with the feet and moves up the body, paralyzing muscles and slowing breathing. It can make a person look dead even when he is not, and Alexander may have been in that state for a few days before he actually died.\p
The earliest surviving accounts of Alexander's death were written three centuries after he died, so there was not a lot of information to go on. More importantly, today's doctors have little idea of what happens when typhoid fever goes untreated, but historical medical accounts, from the days before \Jantibiotics\j, were available to provide the necessary information.\p
#
"Lightning a sneaky killer?",653,0,0,0
(Jun '98)
A June article in \IThe Lancet\i suggests that \Jlightning\j can kill without a visible sign of electrical current entering or leaving a person's body. Usually, \Jlightning\j victims show signs of external damage such as skin burns where electrical currents have entered or exited the body, but some known \Jlightning\j victims show no sign of external burns, indicating they were not touched by the \Jlightning\j itself.\p
The researchers believe that intense magnetic fields resulting from the \Jlightning\j may induce fatal electrical currents entirely within the body, leaving it unmarked, but just as dead. Their work, they stress, is speculative, but they say their calculations indicate that magnetically induced currents arriving within the body during a vulnerable cardiac period could be strong enough to disrupt and fibrillate the heart, possibly causing death.\p
The researchers behind this study come from \JColorado\j, and they suggest that \Jlightning\j could be the culprit in a number of unexplained fatal heart malfunctions in the outdoors in recent years, including some in the state's high mountains. In a number of "hiker heart attacks" in \JColorado\j's high country, \Jlightning\j strikes near the victims could have provided enough of a magnetic field to induce a brief, severe current causing death. Since most of the cases involved solitary hikers, witnesses were generally lacking.\p
As \Jcorroboration\j, they cite a case in which a foursome of golfers was standing under a tree that was struck by \Jlightning\j. One golfer sustained burns, two were knocked unconscious, and the fourth, who suffered cardiac arrest, died about three weeks later, although he exhibited no evidence of external burns or other physical damage from \Jlightning\j.\p
\JLightning\j is known to affect victims in one of three ways: a \Jlightning\j bolt hitting them directly; indirect hits by a "side flash" of \Jlightning\j that causes electrical current to enter their bodies; or injuries or \Jelectrocution\j caused by \Jlightning\j hitting and traveling through the ground. If their speculation has merit, then we need to add a fourth way in which \Jlightning\j can kill.\p
#
"Neutrinos oscillate, so they have mass!",654,0,0,0
(Jun '98)
In early June, the Neutrino 98 Conference in \JJapan\j was presented with evidence collected over 20 months from the neutrino "observatory" Super-Kamiokande, a $100 million experiment built in a 50 megaliter (11 million gallon), stainless steel-lined cavity carved out beneath the Japanese alps. The cavity is filled with ultra-pure \Jwater\j and watched by 11,146 large-area light detectors, each 50 centimeters (20 inches) across.\p
The results, if they are correct, will change the way we think, because one "flavor" of neutrino, the muon, has been found to disappear and reappear as it travels hundreds of kilometers through the \Jearth\j. In the terms used by physicists, it oscillates between one form and another, and according to standard doctrine, this can only happen if the neutrinos have mass.\p
The estimated mass is very tiny, but even that much could be enough to account for much of the mass of the universe, given the huge number of neutrinos thought to be left over from the time of the Big Bang.\p
Super-Kamiokande detects energetic charged elementary particles traveling at close to the vacuum speed of light (300,000 km per second), which means they exceed the speed of light in \Jwater\j, and produce a flash of Cherenkov radiation in a 42-degree half-angle cone trailing the particle. This nanosecond directional burst of blue light is detected with photomultipliers. Its pattern, timing and intensity allow physicists to determine the particle's direction, \Jenergy\j and identity.\p
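The quoted 42-degree cone follows from the standard Cherenkov relation cos(theta) = 1/(n x beta). A quick check (ours, not the collaboration's) for water, with a refractive index n of about 1.33 and a particle moving at essentially the vacuum speed of light (beta close to 1):\p
# Cherenkov half-angle for a relativistic particle in water.
import math

def cherenkov_half_angle_deg(n, beta=1.0):
    return math.degrees(math.acos(1.0 / (n * beta)))

print(cherenkov_half_angle_deg(1.33))   # about 41 degrees, close to the quoted 42-degree cone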
The Super-Kamiokande Collaboration claims "the discovery of neutrino mass and oscillations. The claim is based upon atmospheric neutrino data, which resolves an anomaly uncovered in 1985 and confirmed and elaborated by subsequent experiments. In its analysis of the present data base, the team observed a deficit of muon neutrinos coming from great distances and at lower energies; the functional behaviour of this deficit indicates that muon neutrinos oscillate, thus they have mass."\p
Neutrinos have no charge, are very light, and are grouped in the family of neutral leptons. They do not feel the strong force that binds quarks into protons and neutrons, and protons and neutrons into nuclei. The three kinds, or flavors of neutrino are \Ielectron\i, \Imuon\i and \Itau\i, and there are three anti-neutrinos of the same three flavors. Until now, the neutrinos have been thought to have no mass.\p
The neutrinos being studied here are formed when primary cosmic rays hit the \Jatmosphere\j, forming "secondaries." The secondary cosmic rays fly on in largely the same direction as the primaries. Some of the secondary cosmic rays, mostly \Jpi\j and K mesons, and tertiary muons, decay, resulting in neutrinos, while photons and charged particles are absorbed into the ground.\p
Vast numbers of neutrinos pass through your body every second, yet the chance of any one of them striking a proton or a neutron in your body is minuscule - almost all just pass through, which is why neutrinos are so hard to detect. If there is a large enough body of \Jwater\j and enough neutrinos, however, a neutrino will occasionally react with a quark in the oxygen atom of a \Jwater\j molecule, snatching a charge away and becoming an \Jelectron\j or a muon.\p
This is where the Cherenkov radiation is generated, for the photomultipliers to detect. Muons travel relatively straight and produce a rather clean ring image on the wall. Electrons are distinguished as they scatter and make fuzzier images, which can be recognized with about 98 per cent accuracy. On average, an atmospheric neutrino is detected about once every 90 minutes.\p
Now we need to understand the atmospheric neutrino anomaly. In simple terms, the \Jatmosphere\j should yield twice as many muon neutrinos as \Jelectron\j neutrinos, but many studies have revealed something more like 1:1, and many possible reasons have been suggested to explain this, with "oscillations" being an interesting but unproven theory.\p
Oscillations are not easy to explain. The things we call particles can also be described as waves in quantum mechanics, where this is called "the particle-wave duality of fundamental matter." We identify particles mainly by what they produce: a pion decays to a muon and a muon (anti)neutrino, while a neutron decays to produce a proton, an \Jelectron\j and an \Jelectron\j (anti)neutrino. We can also identify particles by weight, but this is where the going gets nasty.\p
The muon neutrino, it seems, may be composed of equal parts of two states of slightly different mass that oscillate in and out of phase with each other as they travel along, so that the particle interacts alternately as a muon neutrino and then as a tau neutrino. If this is happening, what we detect will depend on where we detect it; the distance traveled during one \Joscillation\j is probably hundreds of kilometers, so we have yet to see the effect in a particle accelerator.\p
The distance traveled by some neutrinos formed in the \Jatmosphere\j is probably about the right length for an \Joscillation\j to take place in some proportion of them. In fact, more muon neutrinos come from overhead than come up through the \JEarth\j. The neutrinos pass through the \Jplanet\j as though it were not there and, over a path of around 13,000 km, have probably cycled through many oscillations, while the ones coming from overhead are formed only about 20 km away, allowing no time for an \Joscillation\j before they are detected.\p
Neutrinos of higher \Jenergy\j oscillate more slowly, so the net result is that muon neutrinos disappear in proportion to their flight path and in inverse proportion to their \Jenergy\j. As yet, we do not know whether the muon neutrinos oscillate into tau neutrinos or into a new sterile neutrino, so there is still important work to be done.\p
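The geometry can be made concrete with the standard two-flavor oscillation formula, in which the survival probability of a muon neutrino depends on the ratio of flight path L to energy E. The parameter values below are illustrative, not the experiment's fitted numbers:\p
# Two-flavor muon neutrino survival probability:
#   P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km and E in GeV.
import math

def muon_survival(l_km, e_gev, dm2_ev2=2.5e-3, sin2_2theta=1.0):
    phase = 1.27 * dm2_ev2 * l_km / e_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

print(muon_survival(20.0, 1.0))      # ~1.0: overhead, no time to oscillate
print(muon_survival(13000.0, 1.0))   # well below 1: many cycles completed
# Averaged over a real energy spectrum, the upward flux drops toward half.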
But, and here is the key point, if the neutrinos are oscillating between two masses, then they must have mass. And that, say observers, is the most exciting discovery in elementary \Jparticle physics\j in ten years. In time, they believe, the new results could prove to be the key to finding the holy grail of physics, the unified theory, also called the "theory of everything."\p
The Standard Model, which assigns zero mass to neutrinos, now appears to be asymmetrical, and to physicists, that means just one thing: somewhere beneath the data, there lies a nice neat symmetrical solution, and that means lots of joyous experimenting and searching, secure in the knowledge that there is more to find.\p
#
"A new explanation for sonoluminescence",655,0,0,0
(Jun '98)
Sonoluminescence is a fascinating effect which can be seen when ultrasonic waves collapse a bubble in \Jwater\j and heat the atoms inside until they glow. A new explanation of the atomic process behind sonoluminescence was offered by an American physicist, Sanjay Khare, in a recent issue of the journal \IPhysical Review Letters\i.\p
Up until now, even though scientists know a great deal about the motions of bubbles and ultrasonic waves, and even though sonoluminescence was discovered in 1934, nobody has been sure exactly how sonoluminescence works on an atomic level. Now scientists are using ultrasound to accelerate and enhance \Jchemical reaction\js in a new branch of science called sonochemistry, mainly in the creation of new materials, so a good explanation is urgently needed.\p
The clue came from the observation that the ultrasound-stimulated bubbles emit light in very short pulses, as short as 10 trillionths of a second. Any single "excited" atom of the gas inside a bubble would take much longer to decay and emit light, but when many atoms decay together they sometimes decay faster.\p
In short, if the many atoms inside the bubble decayed at the same time, then the light waves would emerge in step with each other and at the same frequency. That would account for the short pulses. Khare and Mohanty, the two authors of the paper, suggest that it may be possible to vary the effect by varying the gas composition.\p
The \Jtemperature\j inside some bubbles can reach 10,000°C, twice the \Jtemperature\j at the \Jsun\j's surface, and potentially enough to fuse atoms and form new materials. Your reporter would never, however, stoop to commenting that research in this area seems to be hotting up.\p
#
"Nanotubes in the news",656,0,0,0
(Jun '98)
A June report in \IScience\i described ballistic \Jconductance\j in nanotubes. This means electrons pass through a conductor without heating it, and they do so at room \Jtemperature\j. The conductors were multi-walled carbon nanotubes up to five microns long. (A \Jmicron\j is a millionth of a meter, and is more correctly called a \Jmicrometer\j.)\p
The future for this sort of effect lies in ever-smaller electronic devices. The amazing thing about ballistic \Jconductance\j is that it breaks all the "rules" we learn at school. In classical physics, the resistance of a metal bar is proportional to its length, and if you make it twice as long, you will have twice as much resistance. But for these nanotubes, it makes no difference whether they are long or short because the resistance is independent of the length or the diameter.\p
The reason behind this effect: inside the nanotube, the electrons act more like waves than particles in structures whose size approaches that of the wavelength of electrons, making the whole operation more like \Joptics\j than \Jelectronics\j. But to go much beyond that, we would need to burst into a feature article on quantum mechanics.\p
The main thing to keep in mind is that the nanotubes can carry huge currents, with current densities greater than ten million amperes per square centimeter, far greater than could be handled by any other conductor.\p
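Some background may help here (our addition; the report gives only the observations). In the Landauer picture of ballistic transport, each open channel contributes a fixed conductance quantum G0 = 2e^2/h, with no dependence on the conductor's length, which is exactly the behavior described above:\p
# The conductance quantum from fundamental constants.
E_CHARGE = 1.602176634e-19   # electron charge, coulombs
PLANCK_H = 6.62607015e-34    # Planck's constant, joule-seconds

g0 = 2 * E_CHARGE**2 / PLANCK_H
print(f"G0 = {g0:.3e} S")                # about 7.75e-05 siemens per channel
print(f"R0 = {1 / g0 / 1000:.1f} kOhm")  # about 12.9 kOhm per channel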
#
"18 million chemicals",657,0,0,0
(Jun '98)
In June, the Chemical Abstracts Service added the 18 millionth chemical substance to its \Jdatabase\j. The substance's name is (1S-\Jcis\j)-2-\JPhenyl\j-3-cyclohexene-1-carboxylic acid and it is identified by the CAS Registry Number 207110-49-4, just in case somebody wants to go and visit it.\p
The substance, identified in a patent application from Merck and Co., is "an intermediate compound in the preparation of tachykinin receptor antagonists which may be useful in the treatment of inflammatory diseases, pain or migraine and other ailments."\p
The Chemical Abstracts Service reviews and summarizes chemistry research in patents, conference proceedings and more than 8000 scientific journals, transferring new compounds to the CAS Registry file, where new chemical substances are added at the rate of approximately one new record every 9 seconds.\p
#
"Making better fuel cells",658,0,0,0
(Jun '98)
The best way to generate power from fuel is to use a fuel cell, a device that side-steps a basic thermodynamic limit on the efficiency of engines that burn power-packed chemicals such as \Jhydrogen\j, alcohols, and \Jnatural gas\j to make heat, and then use the heat to generate electricity. The only problem is that fuel cells need catalysts to make the fuel give up its \Jenergy\j.\p
The \Jhydrogen\j fuel cells that have flown on space missions since Gemini are not practical for most applications on \JEarth\j because they use catalysts and electrolytes that work only with very pure \Jhydrogen\j, which is expensive to make and is hard to store and transport. The thought of carrying a compressed \Jhydrogen\j cylinder around with a laptop does not really bear thinking about . . .\p
Now Thomas E. Mallouk and his colleagues have announced what they see as a breakthrough, in \IScience\i during June. They are working on methanol, a liquid fuel that can be made cheaply from biomass or from \Jfossil\j reserves such as \Jcoal\j, oil, or \Jnatural gas\j, and which is used, for example, in the cars which race in the \JIndianapolis\j 500.\p
Methanol presents a special problem for fuel cells because it "poisons" the surface of \Jplatinum\j catalysts. \JPlatinum\j on its own does not absorb the \Jwater\j which is needed to oxidize carbon monoxide on the catalyst surface, so other elements have to be added which will take a grip on the oxygen part of a \Jwater\j molecule. The previous best catalyst has been a \Jplatinum\j-ruthenium alloy.\p
The new catalyst, a quaternary alloy containing \Jplatinum\j, ruthenium, osmium, and iridium, is between 40% and 100% better, depending on the power demand on the cell and is particularly good under high current/high power conditions, the researchers say.\p
One of the researchers, Eugene Smotkin, had collaborated in some work which revealed the need to test a variety of more complex alloys, but aside from the four metals in the final product, there was also rhodium to consider, and a wide range of different compositions to be tested. It sounded like a lot of drudgery for somebody, because there would be hundreds of combinations needing testing.\p
The solution was to use an ink-jet printer to produce hundreds of dots on a large carbon electrode, each dot about the size of a lower-case letter "o" and each containing a slightly different mixture of five elements: \Jplatinum\j, ruthenium, osmium, iridium, and rhodium. The dots were then converted to metal, and the whole electrode was used to catalyze the \Joxidation\j of methanol.\p
In a case like this, the good catalysts cause more reaction, and so produce more of the acid which is a by-product of the reaction. All the researchers had to do was add a fluorescent acid-base indicator to the solution above the array. Spotting the acid concentration by just glancing over the array pinpoints the most active dots, and narrows down the field very rapidly.\p
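To get a feel for the scale of the search, here is one way such a composition grid might be enumerated (a Python illustration of the combinatorics; the researchers' actual grid is not described in the report):\p
# Enumerate all five-metal blends in 10% steps: each dot gets one blend.
from itertools import combinations_with_replacement
from collections import Counter

METALS = ("Pt", "Ru", "Os", "Ir", "Rh")

def compositions(step=10):
    """Yield dicts mapping metal -> percentage, summing to 100."""
    parts = 100 // step
    for combo in combinations_with_replacement(METALS, parts):
        counts = Counter(combo)
        yield {metal: counts[metal] * step for metal in METALS}

grid = list(compositions())
print(len(grid))   # 1001 candidate blends at a 10% step: "hundreds of combinations"
print(grid[0])     # {'Pt': 100, 'Ru': 0, 'Os': 0, 'Ir': 0, 'Rh': 0}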
#
"Simulating the universe",659,0,0,0
(Jun '98)
A June report indicates that scientists have, for the first time, modeled the entire observable universe in a computer simulation. This was achieved by a multinational team called the Virgo Consortium, using a 512-processor Cray supercomputer at the Max Planck Society's computing centre in Garching, \JGermany\j.\p
\IKey name\i: Virgo Consortium\p
#
"Supercomputer on the cheap",660,0,0,0
(Jun '98)
Can't afford a Cray? Never mind, here's another way to do it. A supercomputer built from ordinary personal computer components is among the 500 fastest computers in the world, according to the 11th TOP500 list released at the Supercomputer '98 conference in \JMannheim\j, \JGermany\j. The Avalon computer cost just US$150,000 to build, can compute almost 20 billion mathematical operations in a second, and rates 315th in the world rankings.\p
Avalon is located at Los Alamos National Laboratory, and it was built out of 68 high-end personal computers that use the Digital Equipment Corporation Alpha \Jmicroprocessor\j, connected by 3Com network switches similar to those found in a university department or small business. Each processor in the Los Alamos supercomputer is an ordinary PC, using the same type of memory and disk drives found in a computer on an office desktop.\p
For software, the Los Alamos team used an open source Linux operating system and other software available on the \JInternet\j. In its short life, Avalon already has performed some significant scientific computations. One of the first simulations followed the \Jevolution\j of a shock wave through 60 million atoms. The simulation ran for more than 300 hours on Avalon, calculating about 10 billion floating point operations per second.\p
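Our back-of-envelope arithmetic on those figures (not the lab's own numbers):\p
# Price-performance and the size of the 300-hour shock-wave run.
cost_usd = 150_000
peak_ops_per_s = 20e9     # "almost 20 billion operations in a second"
sustained_flops = 10e9    # ~10 billion floating point ops per second
run_hours = 300

print(cost_usd / (peak_ops_per_s / 1e9))            # 7500.0: about $7,500 per gigaflop
print(f"{sustained_flops * run_hours * 3600:.1e}")  # ~1.1e16 operations in all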
Computers using off-the-shelf technology like Loki (Avalon's predecessor) and Avalon are called "Beowulf" computers, after the project begun by Thomas Sterling at the NASA Goddard Space Flight Center. They typically replace computers costing US$1 million or more.\p
More information about Avalon is available on the Web at \Bhttp://cnls.lanl.gov/avalon\b\p
\BThe Traveling Salesman revisited\b\p
The Traveling Salesman problem is one of those nasty problems that hackers like to throw at their computers. Given a range of towns to be visited, how do you order the visits in order to minimize the distance traveled? One solution is to enumerate all of the possible routes, and then test each one for a total distance traveled.\p
A TSP solution was found for 7,397 US cities in 1994, but now a new solution has been found, covering 13,509 US cities and towns with populations of more than 500 people. But why would anybody bother?\p
The answer is that the TSP is typical of a whole class of hard optimization problems with applications in science and \Jengineering\j, and what works on this problem will work anywhere: the key will always be an efficient \Jalgorithm\j. Counting reverse travel as the same route, with two cities you have just one route, with three cities there are three routes, with four there are twelve, with five there are 60, and the number is beginning to escalate: in general, for n cities, the number of distinct routes is n!/2 (where n! is "n factorial"), as the sketch below illustrates.\p
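A sketch of both the route count and a brute-force search over open routes (ours, for illustration; serious TSP codes like the one described here use far cleverer branch-and-cut methods, and the city coordinates below are invented):\p
# Route counting plus a brute-force search over all open routes.
from itertools import permutations
from math import factorial, dist

def route_count(n):
    """Distinct routes through n cities, counting a route and its reverse once."""
    return factorial(n) // 2

print([route_count(n) for n in (2, 3, 4, 5)])    # [1, 3, 12, 60]

cities = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]   # hypothetical towns

def route_length(order):
    return sum(dist(cities[a], cities[b]) for a, b in zip(order, order[1:]))

best = min(permutations(range(len(cities))), key=route_length)
print(best, round(route_length(best), 2))   # shortest of the 60 routes (each seen twice)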
The breakthrough was achieved on a cluster of three Digital AlphaServer 4100s (a total of 12 processors) and a cluster of 32 Pentium-II PCs located in Rice University's Duncan Hall. The calculation took approximately three months from start to finish.\p
The researchers are feeling rather pleased, naturally enough: the work involves ideas from polyhedral combinatorics and combinatorial optimization, integer and linear programming, computer science data structures and algorithms, parallel computing, software \Jengineering\j, numerical analysis, graph theory and more, according to one of them.\p
Direct applications of the TSP range from drilling holes in printed circuit boards to designing fiber-optic communications networks, and from routing helicopters around oil rigs to picking up coins from \Jtelephone\j booths. But traveling salesmen? Sadly, they are a dying breed, and probably will never benefit from the solution found to the problem which bears their name.\p
For a map of the United States showing the TSP solution for 13,509 cities, see: \Bhttp://www.crpc.rice.edu/CRPC/Images/TSP/tsp.gif\b\p
#
"Breast cancer risk factors identified",661,0,0,0
(Jun '98)
Women who display certain physical characteristics, such as abundant body hair, excessively oily skin and an apple-shaped physique, may have a greater risk of developing breast cancer than other women, according to a paper read to the Society for Epidemiologic Research in June.\p
The blood levels of \Jandrogens\j, male sexual \Jhormones\j, produced in women in small amounts, have been positively associated with the risk of developing breast cancer in several prospective studies. The most active androgen is testosterone. The reported study was searching for evidence that external signs of high androgen levels in women could be associated with breast-cancer risk.\p
The study rated a large sample of women in northern \JItaly\j for body hair, sebum production from the skin's oil-and-wax-producing sebaceous glands, and body-fat distribution, based on the ratio of waist-to-hip measurements. They were then observed for cases of breast cancer.\p
The conclusions: excess body hair increases the risk by 33% in post-menopausal women; premenopausal women with the highest levels of sebum production were at 2.5 times the risk of developing breast cancer compared to those at the lowest level; and premenopausal women with the stereotypical "apple" shape, in which the waist is bigger than the hips, had a 2.5 times greater risk of developing breast cancer than their counterparts with "pear" shapes, in which the hips are bigger than the waist.\p
The implications of this study are that women in higher risk groups may need to take medication to reduce androgen levels, or go on a diet with that effect, or they may need closer screening.\p
\IKey names\i: Paula Muti, Martin Stanulla\p
#
"Taxol",662,0,0,0
(Jun '98)
The anti-cancer drug Taxol (paclitaxel) is extracted from the bark and needles of yew trees. It is federally approved in the USA for treating breast and ovarian cancers, and is now finding a range of new uses, including oesophageal cancer and the AIDS-related Kaposi's sarcoma. Luckily for the limited world supply of yew trees, alternative sources have now been found. The cost of taxol ranges between $1,000 and $2,000 a treatment, and \JBristol\j-Myers Squibb earned almost $941 million in worldwide sales of this one drug last year, making it big business.\p
In 1993, researchers found a \Jfungus\j, now named \ITaxomyces andreanae\i, which is an efficient producer of taxol, growing on a Pacific yew tree in northwestern Montana. Since then, Gary Strobel, one of the researchers, has found a \Jfungus\j from an Asian yew plant which makes 1,000 times more taxol than the original Montana \Jfungus\j.\p
Strobel has sampled yew trees from \JNepal\j, the \JHimalayas\j and across North America and found taxol-making fungi in every one of them. Strobel thinks taxol-making microbes evolved to protect the trees from a group of root-infecting fungi that thrive in moist environments. Taxol attacks these fungi the same way it attacks human cancer cells, he suggests.\p
#
"How a tuberculosis drug works",663,0,0,0
(Jun '98)
For almost fifty years, isoniazid has been used to kill \IMycobacterium tuberculosis\i, the bacterium that causes \B\1TB\c\b. Now, with an understanding of how the drug works, scientists find themselves well-placed to begin developing new drugs against this scourge, which is fast developing resistance to all five of the widely available drugs used against it.\p
TB infects 1.9 billion people, one-third of the world's population, and if it is left unchecked, could infect 300 million, cause TB in 90 million, and kill more than 30 million people in the next decade, according to the World Health Organization. As the number of new cases increases, multi-drug-resistant strains of TB have made treatment more and more difficult. At the moment, there are ten million new cases each year, and there were 3.1 million deaths worldwide in 1995.\p
Isoniazid kills the tuberculosis bacterium by stopping it from making mycolic acid, a waxy substance that strengthens bacterial cell walls. Without a strong wall, the bacterium bursts when it tries to reproduce. But how does the drug target the bacterium in this way? To find out, researchers used a quirk of bacterial \Jphysiology\j.\p
\JAntibiotics\j kill by binding to one of the proteins that the bacterium needs to survive, in this case one which is needed for making mycolic acid. But, in its death throes, the bacterium often makes more and more of the protein in a futile effort to save itself.\p
So to identify the mystery proteins that isoniazid binds to, the researchers exposed the \Jbacteria\j to isoniazid. In response, the \Jbacteria\j churned out two proteins, both of which resembled proteins that produce fatty acids like mycolic acid. The proteins also stuck to isoniazid, a further indication that they were the ones crippled by the drug. The team found that some isoniazid-resistant strains have mutated forms of one of these proteins, called KasA.\p
This is all that is needed: if the \Jbacteria\j need KasA for growth, scientists can now design drugs that block it, and they can also target the mutated form of KasA. Rather than trying everything to see if anything works, they can customize their drugs to be more efficient killers.\p
#
"New TB vaccines",664,0,0,0
(Jun '98)
Two new vaccines for animals are being explored in the USA. The present BCG vaccine is believed to lose its protective power over time, and causes vaccinated individuals to "test positive" for TB when they do not in fact have TB at all. The new vaccines are expected to avoid both these problems, according to a June report in \IInfection and Immunity\i.\p
One vaccine uses specific proteins isolated from the tuberculosis bacterium. When mixed with interleukin-2, a protein known to boost the immune system, and added to a carrier called an adjuvant that also helps boost immunity, the vaccine simulates a TB infection and generates a specific type of \Jlymphocyte\j needed to combat TB infections.\p
A second vaccine uses a segment of DNA from \IMycobacterium tuberculosis \ithat encodes and produces a specific protein that also is involved in immunity to TB. This same approach, using a DNA sequence from a bacterium or virus to stimulate immunity, has produced other promising vaccines.\p
#
"TB resistance on the rise",665,0,0,0
(Jun '98)
A report in the \INew England Journal of Medicine \iduring June made it clear that resistant strains of TB \Jbacteria\j are a growing worry. Because the \Jbacteria\j hide inside cells and rapidly evolve resistance to \Jantibiotics\j, they are particularly hard to kill. Infected patients must simultaneously take three and sometimes four \Jantibiotics\j for up to 8 months to beat back an infection. When patients don't finish their treatment, resistance to even two of these drugs - isoniazid and rifampin - can crop up and render a strain almost unstoppable.\p
The problem is that TB is better able to get a hold on people with dysfunctional lifestyles, the poor and the homeless in particular, people who are less likely to persevere with drug treatments.\p
In every one of the 35 countries studied by the World Health Organization and the International Union against Tuberculosis and Lung Disease, strains that resisted both isoniazid and rifampin had appeared. Although 2.2% of all tuberculosis strains worldwide resisted both drugs, an alarming 13% of strains from patients treated for less than a month also resisted both drugs. \JBacteria\j resistant to multiple drugs flourished in areas where \Jantibiotics\j are freely available and tuberculosis control programs are weak, such as the former Soviet Union.\p
#
"Does infection cause Tourette's syndrome?",666,0,0,0
(Jun '98)
Tourette's syndrome is a condition which causes involuntary muscle contractions and bursts of words and noise. For a long while, it has been thought that the syndrome was caused by a gene defect, as it seems to run in families and is commonly shared by both members of a pair of identical twins, though there have always been indications that some other factor was required as well. Now a story in \INeurology\i in June indicates that at least some cases of Tourette's syndrome are caused by some sort of infection.\p
At least, that is the inference of Johns Hopkins researchers led by Harvey Singer, after they found a \Jcorrelation\j between certain \Jantibodies\j and having the condition. They believe that \Jantibodies\j made by the immune system in response to a bacterial infection may then go on to attack brain nerve cells in a subset of the children who develop Tourette's. Currently Streptococcus is high on their list of suspects, mainly because that bacterium is already linked to a similar condition, Sydenham's \Jchorea\j. As well, some Tourette's cases have been reported as beginning or getting worse after a Strep infection.\p
The research group took blood samples from 41 Tourette's patients and a group of 39 control subjects, and tested them for \Jantibodies\j to proteins in ground-up human brain tissue. The Tourette's patients had significantly higher levels of \Jantibodies\j against proteins from the putamen, an area at the base of the brain involved in movement. In the past, brain imaging studies have shown changes in the shape and size of the putamen in Tourette's patients, reinforcing the idea that these \Jantibodies\j may contribute to the disorder, and this view was further supported when it was found that there were no significant differences between patients and controls for \Jantibodies\j to proteins in two other parts of the brain, the caudate and the globus pallidus.\p
At the moment, researchers are entertaining the notion that people who have two copies of the Tourette's gene always develop the syndrome, while those who receive one copy of the gene, estimated at about two per cent of the general population, develop Tourette's only after being exposed to another factor in the environment, such as an infection. Until the Tourette's gene is isolated, that is probably as far as anybody can go.\p
#
"Using St John's Wort to reduce alcohol intake",667,0,0,0
(Jun '98)
St John's Wort, \IHypericum perforatum\i, a serious weed problem in \JAustralia\j (see \BWeeds in the act\b, September 1997), is a popular herbal treatment for depression in other parts of the world. Now its active ingredient, hypericin, might prove effective in the fight against \Jalcoholism\j. In laboratory rats bred to prefer alcohol, the drug reduced alcohol intake by about 50%. When given the choice, these rats normally consume a small amount of \Jwater\j and a great deal of alcohol.\p
Unlike some medicinal plants, this one is easy to come by. Apart from its Australian weed status, the plant is also found growing wild in Europe, western Asia, North \JAfrica\j and in North America, particularly the Pacific Northwest. It is a common wildflower with vivid yellow flowers edged with tiny black beads. When rubbed, the plant releases a red pigment containing hypericin.\p
Amir H. Rezvani points out that there is logic in his work: depression and \Jalcoholism\j have a strong biological link, so if hypericin works for depression, it might also have a beneficial effect on \Jalcoholism\j. Now that the treatment has worked on two strains of rats, it will be extended to humans.\p
\JAlcoholism\j is a complex disease, so no one single cure is likely to be found, but this could offer a way to alter the brain chemistry of people wanting to change.\p
#
"Athlete's foot: see a specialist",668,0,0,0
(Jun '98)
A study published in the \IJournal of the American Academy of \JDermatology\j\i at the end of June shows that family doctors and other non-dermatologists were far more likely than dermatologists to prescribe less-effective yet more costly medications for athlete's foot and other fungal skin diseases.\p
Researchers surveyed 4.1 million visits to doctors for fungal skin infections over a period of five years; 82 per cent of those visits were to doctors other than dermatologists. The survey showed that 34.1 per cent of the non-dermatologists prescribed drugs that combined cortisone with anti-fungal agents, compared with just 4.8 per cent of the dermatologists.\p
Other studies have shown failure rates of 45 per cent for combination drugs, compared with just 8 per cent for single agents, apparently due to interference from the cortisone, which limits the effectiveness of the treatment. While the study was funded by the manufacturer of one of those single-agent treatments, there would appear to be some reason for concern in this case.\p
#
"Anemia treatment in AIDS/HIV",669,0,0,0
(Jun '98)
There is now evidence to suggest that untreated anemia is a significant cause of death, even when patients' hemoglobin levels are within "normal" ranges, according to new data presented at the 12th World AIDS Conference in Geneva during late June and early July.\p
Up to 95% of people living with HIV will encounter anemia over the course of infection. Anemia has traditionally been treated with blood transfusions only when the red blood cell protein hemoglobin drops to severely low levels (below 8.5 grams per deciliter), but this threshold is probably too low for HIV patients. In fact, people with hemoglobin levels as high as 12 g/dl, typically considered to be in the "normal" range, can benefit from the red-cell-boosting drug erythropoietin, given when anemia is related to therapy with antiretroviral drugs like AZT.\p
#
"Capturing the structure of gp120",670,0,0,0
(Jun '98)
HIV is deadly because it attacks cells of the immune system. As virologists describe it, the virus attaches to a CD4 T-cell receptor, using a viral surface glycoprotein called gp120. The future of AIDS vaccines probably lies in developing a full understanding of the structure of gp120, so it is hardly surprising that the reports in both \INature\i and \IScience\i during June revealing the structure of gp120 were featured on the cover of each journal.\p
The structure was unraveled using complicated x-ray \Jcrystallography\j, a technique that passes x-rays through a crystal from many angles, determines their pattern of \Jdiffraction\j, and then assembles the data to reveal the crystal's 3-dimensional structure. This is a standard method, which first achieved prominence after it was used to identify the structure of DNA in 1953.\p
The main problem is that HIV is excellent at disguising itself, which meant that the gp120 molecule was hard to settle down into crystal form. No crystals, no x-ray \Jcrystallography\j, so researchers had to work their way around the problem, deleting bits by informed guesswork until they got clues which could be built up into a full picture.\p
Now scientists can "see" specific targets for anti-HIV vaccines and drugs, and they can identify the surprising array of defences that the virus uses to evade attack. They have discovered, for example, that a large part of the gp120 surface is protected against attack by a dense array of carbohydrates and by an amazing capacity to change shape.\p
The gp120 molecule has a shape-shifting device: a set of loop-shaped projections that stick out above the molecule's surface and hide the critical locking regions from the immune system until an attack is launched. When the virus reaches its target, the loops collapse and move out of the way, unmasking the locking regions, rather like a warship showing its true colours just before opening fire.\p
Then there is an "icing" of \Jcarbohydrate\j molecules that also shields the receptor-binding regions of the gp120 surface from antibody attacks: finding a method of stripping these away should be a profitable line of attack.\p
#
"Hereditary colon cancer",671,0,0,0
(Jun '98)
Hereditary nonpolyposis colorectal cancer (HNPCC) is linked to two genes called MLH1 and MSH2. People who carry one or more of the mutations for HNPCC have a greater than 85 per cent chance of developing colorectal cancer by age 75. The mutations also increase the risk of endometrial and other cancers, with tumors usually developing before age 50. Screening of colorectal cancer patients is important, as it is likely to identify relatives who are also at risk, who can then undergo regular colonoscopies to identify cancers before they really take hold.\p
Testing all colorectal cancer patients for mutations in these genes would cost millions of dollars, making it too expensive to be practical, but researchers have found a way to screen people who might have the mutations by testing tumor cells for the presence of oddities called "instability of microsatellite repeats," according to a May report in the \INew England Journal of Medicine\i.\p
The microsatellite repeats are tiny regions of DNA which are repeated dozens of times, and they probably develop during errors in DNA replication inside cancer cells. The errors often occur because of mutations in genes causing HNPCC. Everyone at risk for HNPCC has unstable microsatellite repeats, but not everyone with unstable microsatellite repeats is at risk for HNPCC.\p
The unstable microsatellite repeats can be detected easily using the polymerase \Jchain reaction\j (PCR), a relatively inexpensive test, replacing the much more expensive genetic testing which would otherwise be needed. In a study of 509 colorectal cancer patients in \JFinland\j, 12% had unstable microsatellite repeats, a total of 63 cases, and 10 of these had mutations in one of the HNPCC genes in normal cells, and therefore represent newly detected families with HNPCC. The fact that the mutations were found in normal cells indicates that the individuals were born carrying the \Jmutation\j.\p
A further study is under way in \JItaly\j, and a further series of tests are planned on an Ohio population. In the meantime, the senior author of the study recommends screening for unstable microsatellite repeats in all colorectal-cancer patients under 50 years of age who have a personal or family history of colorectal or endometrial cancer. Those who test positive for the repeats should be offered genetic testing for the HNPCC mutations.\p
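The appeal of the two-step approach is easy to sketch in Python with round figures - the prices below are purely hypothetical, as the report quotes no costs - comparing full genetic testing for everyone against a cheap PCR pre-screen, with the genetic test reserved for the positives:\p
  def testing_costs(patients, pcr_cost, gene_test_cost, fraction_positive):
      # Testing everyone outright, versus a two-step scheme where the
      # expensive genetic test is reserved for PCR-positive patients.
      test_everyone = patients * gene_test_cost
      two_step = patients * pcr_cost + patients * fraction_positive * gene_test_cost
      return test_everyone, two_step

  # Hypothetical prices: $50 per PCR screen, $1500 per full gene test,
  # with 12% positive, as in the Finnish study of 509 patients.
  print(testing_costs(509, 50.0, 1500.0, 0.12))  # (763500.0, 117070.0)
Scaled up to every colorectal cancer patient in a country, that difference quickly runs into the millions of dollars mentioned above.\p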
\IKey names\i: Albert de la Chapelle, National Cancer Institute, and Academy of \JFinland\j.\p
Around a third of all people who have undergone surgery for colorectal cancer face development of additional tumors. But tests to detect cancer recurrence can give contradictory results, forcing doctors to perform exploratory surgeries that may be too late to be useful. An early July report suggests that \Jpositron\j emission \Jtomography\j (PET) can detect additional tumors early and reveal the extent to which the colorectal cancer has spread.\p
#
"Gene therapy for sickle cell anemia?",672,0,0,0
(Jun '98)
Sickle cell anemia is a blood condition common in people of African descent, which is also known as \B\1sickle cell disease\c\b. A new method for dealing with this condition was proposed in \IScience\i during June. The solution may lie in correcting not the faulty DNA - the storehouse of genetic information which triggers the disease - but the RNA, which carries that genetic information to the protein synthesis machinery of the cell.\p
The paper's senior author indicates that it is possible to correct a genetic defect in blood extracted from patients for experimentation, not just in laboratory-grown cells, adding that all they are seeking is a 10-20% correction, as even this much would make a huge difference to patients.\p
There is no effective treatment for the underlying defect, which is found also in people of Middle Eastern origin - the sickle cell trait in a single dose gives useful protection against \B\1malaria\c\b, but a double dose, the homozygous condition, is not good news. Complications of sickle cell disease can include stroke, bone pain, kidney damage and breathing problems.\p
The trick is to use a kind of "molecular scissors" called a ribozyme, a type of RNA \Jenzyme\j that can find a specific sequence of RNA code, chemically cut out a section, and splice in another section which, obviously enough, represents the code for a normal globin gene.\p
Standard gene therapy schemes try to replace defective genes in the \Jgenome\j, but here, the ribozymes correct defective RNA, the message copied from DNA. If this method works, it will increase the amount of "good" globin at the same time as reducing the "bad" globin levels.\p
\IKey names\i: Bruce Sullenger, Ning Lan, Richard Howrey, Seong-Wook Lee and Clayton Smith\p
#
"Non-smokers protected by their genes",673,0,0,0
(Jun '98)
Some people are able to break down nicotine in their systems, but others have a defective gene which limits their ability to metabolize nicotine. These people are less likely to become smokers and, if they do smoke, will smoke fewer cigarettes, according to a \INature\i report during June.\p
In fact, they are twice as likely to avoid smoking altogether if they have a faulty form of gene CYP2A6. Overall, 20% of non-smokers carry a defective version of the gene, compared to only 10% of smokers. As well, those with a defective gene smoked an average of 129 cigarettes a week compared to the 159 cigarettes a week smoked by people without the defect.\p
This opens the prospect of finding a way to disable the \Jenzyme\j produced by the active CYP2A6 gene, helping smokers to break their habit.\p
\IKey names\i: CYP2A6, Rachel F. Tyndale, Alan I. Leshner\p
#
"No mad cows today, thanks",674,0,0,0
(Jun '98)
Bovine spongiform \Jencephalopathy\j (BSE), or "mad cow" disease, is a worry to Americans, even though it has never been detected there, and even though the known causes of BSE, such as feeding \Jcattle\j on feed contaminated with BSE proteins, are not allowed to happen there.\p
These BSE proteins bind to proteins already in the cow's system, known as PrPs (prion proteins), and rapidly reproduce, attacking the brain tissue and eventually killing the animal. Humans may be able to contract the human form of BSE, Creutzfeldt-Jakob disease, from ingesting infected meat, milk or other products from cows. (More detail can be found in the article on the \BNobel Prizes\b, October 1997.)\p
Because of the fear of mad cow disease, scientists must constantly test \Jcattle\j for BSE - an expensive and time-consuming process. Now comes a solution: \Jcloning\j \Jcattle\j with no PrPs. Working with mice, two \JTexas\j researchers have produced a strain of animals with no PrPs, which seem to be healthy and happy. Any BSE proteins put into their bodies slowly broke down and disappeared.\p
\IKey names\i: Jorge Piedrahita, Mark Westhusin\p
#
"Using viruses in gene therapy",675,0,0,0
(Jun '98)
Viruses are usually expert at entering our cells, but when scientists alter them to carry desirable genes for genetic therapy, they often have trouble getting into the cells where their cargo is needed. The answer, according to a June report in the \IProceedings of the National Academy of Sciences\i, is to build a bridge for the viruses.\p
The bridge is made from two proteins: the growth factor EGF and the receptor for a protein found on the avian leukosis virus (ALV). This links the virus and target cells, and allows the virus not only to bind to but also to enter the designated cell targets. Only those cells with the EGF receptor bound the molecular bridge, meaning that the virus is now carefully tuned to approach only cells of that type.\p
The ALV system is simple, using a single receptor to enter cells, but while the ALV receptor thus appears to be uniquely suited to the protein bridge approach, EGF is only one of many possible protein partners. This technique could be worth watching.\p
\IKey names\i: John Young, Sophie Snitkovsky, Adrienne Boerger\p
#
"Cloning cow cells",676,0,0,0
(Jun '98)
This story is worth reporting on, if only to record the warning that appeared at the head of the news release which first revealed it to your reporter. After a stern warning on the embargo date, it advises:\p
Warning: This document, and \INature Biotechnology\i papers to which it refers, may contain information that is price sensitive (as legally defined, for example, in the UK Criminal Justice Act 1993 Part V) with respect to publicly quoted companies. Anyone dealing in securities using information contained in this document or in advance copies of \INature Biotechnology\i's content may be guilty of insider trading.\p
Hmmm. Science reporting gets more dangerous every \Jday\j. The story itself is about a breakthrough: scientists have, for the first time, coaxed undifferentiated cells from a cow to reproduce indefinitely in culture, showing that they retain their ability to generate live transgenic animals. This procedure, previously possible only in mice, has the potential to provide a plentiful source of cells for transplant therapies for the treatment of human diseases.\p
For years, the mouse embryonic stem (ES) cell has been a remarkably successful tool for producing genetically engineered animals for use in medical and biological research, but efforts to find an alternative mammalian cell have failed, time and time again. Now James Robl and his colleagues have found a way of producing bovine cells that demonstrate many of the properties that make mouse ES cells so useful. And more importantly, it has the potential to be a highly profitable activity, which explains the stern warnings about insider trading.\p
\IKey names\i: James M Robl, Neal L First\p
#
"Algae attacking the floor of the Mediterranean",677,0,0,0
(Jun '98)
A flurry of concern ran briefly through an \JInternet\j \Jalgae\j list to which your reporter subscribes when, in April this year, a Brazilian list member indicated a willingness to export all sorts of marine \Jalgae\j to other parts of the world.\p
After a brief discussion, the matter was allowed to drop. Now, just a few months later, the news is out about a major invasion of the Mediterranean Sea with a new variant of \ICaulerpa taxifolia\i. The new strain is bigger than other types of the species, and only produces male gametes, but floating fragments are able to wash to new areas and take up residence there, over-running all other marine life.\p
It began in 1984 with a square meter or so of the invader in waters off Monaco, but by 1990, it had reached \JFrance\j, by 1992, \JMajorca\j, and by 1993, it was on the coast of \JSicily\j. By 1994, it was in the Croatian Adriatic, according to \IScience News\i.\p
Three times the size of other strains of the species, the new Caulerpa crowds everything else out, from the beach out to 50 meters (150 feet) down. The plant is tropical and sub-tropical, usually only found where the \Jwater\j remains warmer than 20°C (68°F), while the Mediterranean drops to 13°C (55°F), and it may be an aquarium escape, as similar plants are found in an aquarium close to where the first specimens were found.\p
Mechanical techniques do not work on an organism which reproduces from small fragments, and it is not feasible to use chemicals underwater. In the laboratory, specimens have survived for three months at 10°C (50°F), so it appears the only solution will be a biological one.\p
Many sea slugs eat nothing but \ICaulerpa\i, but not the Mediterranean slugs, which seem to be repelled by a poison which goes by the name of caulerpenyne. Luckily, there are Caribbean slugs which will eat the offensive plant; in fact, they appear to relish it, apparently because they use the ingested toxins to make themselves unappealing to would-be predators.\p
Better still, these tropical slugs will not survive the Mediterranean winter, so they can be used for a "test run" in the Mediterranean. If they prove to be a problem, attacking other species, winter will take care of them. If they are a success, more hardy strains from \JFlorida\j can be brought in. The only drawback: another, less aggressive \ICaulerpa\i, \IC. racemosa,\i has also invaded the Mediterranean, and there is a risk that the slugs may tackle those instead.\p
There are no real controls on the movement of aquarium \Jalgae\j, and the new form of \ICaulerpa\i has been seen in an aquarium in \JHonolulu\j, so it might be a good idea to start brushing up on your \Jseaweed\j recipes.\p
#
"Turtles screwed",678,0,0,0
(Jun '98)
Endangered leatherback sea turtles are now being tracked by \Jsatellite\j, using tags which have been attached to the animals using a mini-bone anchor screw, a tool used regularly in human surgery.\p
Sea turtles only come ashore to lay eggs, so researchers, early in June, attached a \Jsatellite\j tag to a nesting female leatherback sea turtle. The new attachment method, using the bone anchor screw, implants a tiny bio-degradable screw, threaded with suture material, through the leatherback's shell into the tissue beneath it. The sutures fasten a \Jsatellite\j transmitter, about the size of a cell-phone, to the top of the turtle's shell. The screw disappears after a few weeks, and the sutures continue to hold the tag in place.\p
At last report, the turtle had been sending signals for over a week, and the team behind this work hopes to learn where her migration will take her. The tag not only gives a position and a time, but also provides data on the depth and time of dives. The turtles grow to 3 meters (10 feet) and weigh 700 kg (1500 pounds).\p
As you might expect from their name, leatherback turtle shells are rubbery and oily, so adhesive materials do not work on them. Harness backpacks are a possibility, but these would need to be custom-fitted, and that takes up valuable time when the turtles are only ashore for a short while. If this method works, trapped males and juveniles could be tagged in the same way.\p
\IKey names\i: Molly Lutcavage, Sam Sadove, Anders Rhodin, Greg A. Early\p
#
"Sumatran tigers a distinct species",679,0,0,0
(Jun '98)
A study of mitochondrial DNA sequences of tigers, published in June in \IAnimal Conservation\i, demonstrates that Sumatran tigers are a species distinct from all other living tigers. The finding has significant implications for tiger conservation efforts, as there are between 400 and 500 in the wild, and just 235 in captive breeding programs.\p
All the world's tigers are under threat in the wild, but Sumatran tigers are currently under-represented in the captive breeding programs of the world's zoos. The traditional view is that there is a single species of tiger with five surviving subspecies: the South China tiger, the Siberian tiger, the Bengal tiger, the Indochinese tiger, and the Sumatran tiger, and three extinct subspecies, the Bali tiger (late 1930s), the Caspian tiger (1950s), and the Javan tiger (1970s).\p
The Sumatran tiger is the smallest of the living tigers, with males around 2.2 to 2.4 meters (seven to eight feet) in length, weighing between 100 and 140 kg (220 and 310 pounds). In contrast, male Siberians, the largest living tigers, are about 3 meters (10 feet) in length and weigh between 190 and 300 kg (420 to 675 pounds).\p
#
"So why aren't more Australians called Bob?",680,0,0,0
(Jun '98)
Things are bobbing up, Down Under. The Australian continent has long puzzled geologists, because its history shows that it has been flooded at times when sea levels were low, and yet it has been high and dry during times of high sea levels around the world.\p
The answer: \JAustralia\j is flexing, moving hundreds of meters up and down in response to the vast churning of the \Jearth\j's internal heat engine. Not only is the continent moving north at the rate a fingernail grows, but for the past 100 million years, the continent has been churned up and down by convection forces under the tectonic plate that carries the island continent.\p
The huge island, almost as large as the United States, is a block only 20-50 km (12-30 miles) deep, floating on a layer of relatively cool mantle a couple of hundred kilometers thick. Some parts, in Western \JAustralia\j, have remained continually afloat for some 2-3 billion years, because they are lighter still, being formed largely from the original cold mantle material from that early time.\p
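The "floating" here is ordinary buoyancy. As a textbook illustration in Python (an Airy isostasy sketch, not a calculation from the study itself), a light crustal block rides above the denser mantle for the same reason an iceberg rides above the sea:\p
  def freeboard_km(thickness_km, crust_density, mantle_density):
      # Airy isostasy: the height of a floating block above the mantle
      # "water line" is set by the density contrast, iceberg-fashion.
      return thickness_km * (1.0 - crust_density / mantle_density)

  # Hypothetical round figures: 35 km of crust at 2800 kg/m3
  # floating on mantle at 3300 kg/m3.
  print(freeboard_km(35.0, 2800.0, 3300.0))  # about 5.3 km of freeboard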
All of this has been revealed in a computer model which shows that the eastern side of the continent first rose as \JAustralia\j split away from \JAntarctica\j, draining away the Jurassic seas. Then it was dragged down, flooded by the sea, and covered in sediments as it passed over a sinking slab of old ocean floor, only to bob up again later, once the continent had cleared the slab it had been sliding over. With the extra sediments, it was able to stay above sea level, even as the seas rose again.\p
The researchers who carried out the computer simulation believe it will have particular application in the hunt for oil and minerals and in reconstructing vast events in \JEarth\j's history.\p
\IKey names\i: Louis Moresi, Michael Gurnis, Dietmar Muller\p
#
"Giant SHRIMP dates rocks",681,0,0,0
(Jun '98)
\JAustralia\j, the land where every country town has a Giant Something as a tourist attraction, and where their advertising invites you to throw another shrimp (which they call a "prawn") on the barbie, is now exporting Giant SHRIMPs.\p
Designed and built at the Australian National University in \JCanberra\j, a giant SHRIMP has just arrived at Stanford: a brand new $2.5 million, 12-ton Sensitive High Resolution Ion MicroProbe, arguably the most coveted instrument of its type in the world, which produces a tiny ion beam that can help answer big questions.\p
In particular, such machines can help work out the ages of very old minerals. It was a machine like this which determined the ages of the oldest minerals on \JEarth\j in 1983, the oldest rocks on \JEarth\j (3.96 billion years) in 1989, and the oldest minerals in the \Jsolar system\j (4.56 billion years) in 1992. The new machine is fast, doing measurements on a new sample every 15 minutes, and its keepers plan to feed it a varied diet of bits of stardust from very old meteorites, minerals from far-traveled sedimentary basins in western Canada, and samples from deep crustal rocks coughed up by volcanoes in the Bering Strait region, near the border between \JAlaska\j and \JRussia\j.\p
For each grain of material, the SHRIMP can measure the exact chemical constituents of the sample down to tiny differences in atomic mass, allowing isotope ratios to be determined to a combined sensitivity and mass resolution that far surpasses that of any previous ion probe. The probe can distinguish mass differences in atoms as low as one part in 40,000, which makes this a superior instrument indeed.\p
SHRIMP shoots the sample, usually an individual mineral grain from a rock or \Jmeteorite\j, with high-\Jenergy\j oxygen ions fired at speeds of 350 kilometers per second or nearly 800,000 miles per hour. The oxygen ions are focused into a very fine beam about the width of a single strand of human hair. The ions have a negative electrical charge, and when they hit the sample, they knock off positively charged ions and leave impact craters like tiny potholes on its surface.\p
This "sputtering" liberates ions from the sample and sends them traveling down a tube into a curved magnet about 1 meter long, where the magnet separates the ions according to their mass and \Jenergy\j, so lighter and slower ions hug the inside lane, while heavier and faster ones are accelerated to the outer lanes.\p
Next they are sorted in an electrostatic compensator, which re-organizes them according to mass only, removing the effects of \Jenergy\j differences between ions of the same mass, yielding a spectrum of ions perfectly organized in order of increasing mass, from \Jhydrogen\j, with an atomic mass of one, up to uranium, with an atomic mass of 238.\p
Then after that, it is down to good old-fashioned \Jradiometric dating\j, or in the case of meteorites, isotopic fingerprinting. The presence of particular \Jisotopes\j can be used to link samples of unknown origins to a probable source inside or outside the \Jsolar system\j. (For example, scientists believe certain meteorites came from Mars because they have that unusual mix of \Jhydrogen\j \Jisotopes\j that is peculiar to Mars.)\p
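The magnetic sorting described above can be sketched with a little textbook physics. The beam energy and field strength in this Python fragment are hypothetical, not the SHRIMP's actual specifications, but they show how two isotopes of the same element, given the same energy, follow measurably different paths:\p
  import math

  E_CHARGE = 1.602e-19   # charge of a singly ionized atom, in coulombs
  AMU = 1.661e-27        # one atomic mass unit, in kilograms

  def bend_radius_m(mass_amu, energy_keV, field_tesla):
      # A charged particle in a magnetic field follows a circle of radius
      # r = m * v / (q * B); here v is fixed by the ion's kinetic energy.
      m = mass_amu * AMU
      v = math.sqrt(2.0 * energy_keV * 1e3 * E_CHARGE / m)
      return m * v / (E_CHARGE * field_tesla)

  # Hypothetical beam: 10 keV singly charged ions in a 0.5 tesla field.
  for mass in (235.0, 238.0):   # two isotopes of uranium
      print(mass, bend_radius_m(mass, 10.0, 0.5))
The two radii come out only a few millimeters apart - which is why resolving one part in 40,000 calls for such precise ion optics.\p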
The SHRIMP has a website: \Bhttp://shrimprg.stanford.edu\b\p
#
"Ancient human skull has a modern look",682,0,0,0
(Jun '98)
A report in \INature\i during June describes a well-preserved skull of an early human found in the northeast African country of Eritrea which will help plug a major gap in the \Jfossil\j record of human \Jevolution\j, as it shows a mix of old and modern features.\p
The nearly complete brain case resembles that of \IHomo erectus\i, a human ancestor that appeared in \JAfrica\j 1.7 million years ago and survived in Asia until as recently as 30,000 years ago, according to some researchers. Like \IH. erectus\i, the skull has a pronounced brow ridge and an elongated brain case.\p
The rock stratum where the \Jfossil\j was found has been dated using signs of reversals in the \Jearth\j's magnetic field, and the resulting age of 1 million years puts the \Jfossil\j right between the youngest \IH. erectus\i found in \JAfrica\j, a 1.4-million-year-old \Jfossil\j from Olduvai, \JTanzania\j, and the oldest archaic form of \IH. sapiens\i, a 600,000-year-old specimen from Bodo, \JEthiopia\j.\p
In the long run, this specimen could help anthropologists make sense of other fossils found outside \JAfrica\j that fall in the critical time gap, such as 800,000-year-old remains at Atapuerca in \JSpain\j, and \IH. erectus\i in Asia.\p
\IKey names\i: Ernesto Abbate, Lorenzo Rook\p
#
"Mr Ples loses weight - in the brain",683,0,0,0
(Jun '98)
One standard method for estimating the intelligence of animals is to measure their brain size. This is by no means perfect once you get over a certain threshold - a fact which would no doubt have been a comfort to a 19th century scientist called Bischoff, whose careful measurements over many years revealed to him that women were inferior, as the average weight of a man's brain was 1350 grams, that of a woman only 1250 grams.\p
It is one of the ironies of scientific history that Bischoff, being a true scientist, specified in his will that his own brain be added to his impressive collection. The post-mortem examination revealed to the world that his own brain weighed only 1245 grams. But around the 400-500 gram mark, variation in mass (or volume) is still thought to be important. So it made the cover of \IScience\i during June when "Mr Ples" was shown to have a smaller brain size than was previously thought.\p
This specimen is an australopithecine, and is dated at about 2.6 to 2.8 million years ago. These human-like creatures walked on two legs, ate tough vegetation, made primitive chopping tools, according to some scientists, and lived in a well-wooded, wetter \JAfrica\j from about 3.5 to 2.5 million years ago. Because it was found by Alun Hughes and Philip Tobias, it is unlikely to get a new species name for a while, as Tobias is a member of the "lumping" faction, as opposed to the "splitters" who coin a new species name for every find.\p
The original estimates for brain size came in at about 600 cubic centimeters, when most other \IAustralopithecus africanus\i brains are in the mid-400 cc range, but computerized \Jtomography\j has now reduced that estimate to the low 500s. It is still the largest \IA. africanus\i brain known, and the same technique is now reducing the brain size estimates for other australopithecines, so that some of them now seem to have brains no larger than a \Jchimpanzee\j's.\p
\IKey names\i: Philip Tobias, Glenn C. Conroy, Horst Seidler, Gerhard Weber\p
#
"Which came first, the chicken or the dinosaur egg?",684,0,0,0
(Jun '98)
In the past, a number of people who are emotionally opposed to the idea of \Jevolution\j have attempted to establish that \IArchaeopteryx\i, a missing link between the reptiles and the birds, was a clever hoax. Perhaps we will hear less of this in the future: a report in \INature\i during June describes two newly-discovered 120-million-year-old \Jdinosaur\j species from northeastern China both of which show clear evidence of true feathers.\p
One of the new creatures has been named \ICaudipteryx zoui\i, or "tail feather," for the fan of plumes that is visible at the end of the animal's tail. Down-like feathers and "semi-plumes" are visible on the \Jfossil\j, suggesting that most of its body was feather-covered. Unlike \IArchaeopteryx\i, and modern birds, whose wing-feathers are asymmetrical, with more plume on one side than on the other, \ICaudipteryx's\i "wing"-feathers are symmetrical. A symmetrical feather lacks the aerodynamic qualities thought to be necessary for flight, so it is unlikely that \ICaudipteryx\i could fly.\p
The other new species, \IProtarchaeopteryx robusta\i, earned its name because it resembles \IArchaeopteryx\i, but it is more primitive. \IProtarchaeopteryx\i is about the size of a modern-\Jday\j turkey, and was close to maturity when it died. Like \ICaudipteryx\i, most of its body was probably covered with feathers, although no evidence of wing-feathers was preserved. The relatively long legs of both animals show that they were swift, ground-dwelling runners.\p
\IProtarchaeopteryx\i was found to be more primitive, and may be a close relative to the Velociraptorinae, a group of dinosaurs that takes its name from its most famous member, \IVelociraptor\i, although that relationship is not yet fully resolved. Working out where to locate these new finds involves looking at ninety separate physical characteristics seen in the fossils. The analysis showed that neither \ICaudipteryx\i nor \IProtarchaeopteryx\i was a true bird, but that both were dinosaurs that were very closely related to birds - indeed, \ICaudipteryx\i was determined to be one of the dinosaurs most closely related to true birds.\p
Now the grim in-fighting begins: these are undoubtedly reptiles, yet they have feathers, and the presence of feathers in animals that are not birds raises fundamental questions about how "birds" should be defined, but that is an argument that was neatly avoided when the paradoxical \B\1monotremes\c\b were slipped into the mammalian category, even though they lay eggs.\p
More interestingly, if the feathers were needed for \Jinsulation\j, what does that say about the question of warm-blooded dinosaurs? Now the troglodytes who continue to insist that the dinosaurs were cold-blooded will need to give ground: in all probability, they will argue that the dinosaurs were intermediate, being neither warm-blooded nor cold-blooded, and evidence to support such a compromise view has already been found. So far, there has been no strong reaction to it from the "warm-blooded" faction.\p
Previous finds such as \IUnenlagia\i in \JPatagonia\j (see \BBig bird problem\b, May 1997) and \ISinosauropteryx\i (\BChina in the wings\b, May 1997) mean that the pressure is now off \IArchaeopteryx\i as the sole missing link. Other Gobi desert finds, like \IMononykus\i and \IShuvuuia\i (\BDinosaurs in the news\b, March 1998) and an oviraptor, found nesting on its eggs just as modern birds do, are all part of the larger picture that is now beginning to emerge.\p
#
"Talking about the weather",685,0,0,0
(Jun '98)
Curiously, there were no research reports on global warming during June, other than a brief reference in the story on life under the Antarctic ice, where the \Jbacteria\j may be in trouble if the \Jtemperature\j rises. They are suspended two meters down in the ice, with just three more meters between them and the deadly dark waters below: with enough warming, they could fall right through and be wiped out!\p
#
"Triton undergoing global warming",686,0,0,0
(Jun '98)
Neptune's largest moon, Triton, seems to have heated up significantly since the Voyager space probe visited it in 1989, according to data gathered by NASA's Hubble Space \JTelescope\j and ground-based instruments. The warming trend is causing part of Triton's surface of frozen \Jnitrogen\j to turn into gas, thus making its thin \Jatmosphere\j denser.\p
The report on this issue in \INature\i during June explains that Triton is moving towards a warm summer, something that happens about once every few hundred years. The researchers think it may be due to seasonal changes in the absorption of solar \Jenergy\j by its polar ice caps, but it is driven by the moon's \Jorbit\j, which is carrying it to an extreme southern summer, where the moon's southern hemisphere receives more direct sunlight.\p
The \Jtemperature\j has gone from about -235.5°C (-392°F) to -234°C (-389°F), a change which, while small, would be equivalent to a jump of 11°C (20°F) for us, if we look at the relative changes in the absolute \Jtemperature\j.\p
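That comparison is easy to check. The Python sketch below (with an assumed round figure of 15°C for \JEarth\j's mean surface \Jtemperature\j) simply applies Triton's fractional change in absolute \Jtemperature\j to our own:\p
  def equivalent_earth_jump_c(t1_c, t2_c, earth_mean_c=15.0):
      # Convert both temperatures to kelvins, take the fractional change,
      # and scale Earth's mean absolute temperature by the same fraction.
      k1 = t1_c + 273.15
      k2 = t2_c + 273.15
      return (k2 - k1) / k1 * (earth_mean_c + 273.15)

  print(equivalent_earth_jump_c(-235.5, -234.0))  # about 11.5 C, roughly 20 F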
Triton has been undergoing a period of global warming since 1989, and makes an excellent subject for study, as it is a simple example with just a few factors. At the temperatures involved, a small change in the \Jtemperature\j of the ice surface makes a big change in the vapor pressure of the thin \Jatmosphere\j. But even if it is thin, the gas mix around Triton is enough to affect light passing from distant stars, through the moon's \Jatmosphere\j, and on to the telescopes and instruments awaiting the light on this side.\p
Hubble has three Fine Guidance Sensors, and while two of the guidance sensors are normally used to keep the \Jtelescope\j pointed at a celestial target by monitoring the brightness of guide stars, the third can serve as a scientific instrument. In November 1997, this instrument was turned on Triton, and it revealed a "thickened" \Jatmosphere\j which has doubled in bulk since the Voyager encounter.\p
It is also possible, say the researchers, that some other factor, like a change in albedo, has produced the result, but they are quite confident that the warming is happening. As one of them commented to the press: "When you're so cold, global warming is a welcome trend."\p
#
"The best and brightest",687,0,0,0
(Jun '98)
A rare event, the discovery of the universe's brightest object, has just been announced. It is a \Jquasar\j which is between 4 million-billion and 5 million-billion times brighter than the \JSun\j. It is estimated to be more than 10 times brighter than any other \Jquasar\j, and outshines the brightest galaxy by more than 100 times.\p
The discovery was announced in \INature\i during June, with a more detailed account now accepted for publication in \IAstrophysical Journal\i.\p
Geraint Lewis and collaborators made the discovery in observations taken with the 2.5-meter Isaac Newton \JTelescope\j and the 1-meter Jacobus Kapteyn \JTelescope\j, both at La Palma in the Canary Islands. The object is scheduled for observation by the Hubble Space \JTelescope\j in the near future.\p
Light in the ultraviolet and optical range comes from what is known as an \Jaccretion\j disk surrounding a supermassive black hole, millions of times the mass of the \JSun\j, while infrared radiation comes from thick dust heated by radiation from the center of the \Jquasar\j, with about half the radiation coming from each source.\p
Don't rush out to look for it in the night sky. The \Jquasar\j is estimated to be 11 billion light years from \JEarth\j, and even the brightest object in the universe dims over that sort of distance. Of course, there is always the possibility that the \Jquasar\j is being magnified through a gravitational \Jlens\j, which might exaggerate the real \Jenergy\j level by a factor of 30 or 40, but even then, it would outshine our own galaxy by more than 1000 times.\p
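Those figures hang together, as a quick sanity check in Python shows. The galaxy luminosity below is an assumed round value of a hundred billion suns, not a number from the report:\p
  quasar_suns = 4.5e15    # mid-range of the 4-5 million-billion estimate
  magnification = 40.0    # the largest lensing factor quoted above
  galaxy_suns = 1.0e11    # assumed luminosity of a large galaxy, in suns

  intrinsic_suns = quasar_suns / magnification
  print(intrinsic_suns / galaxy_suns)  # about 1100: still over 1000 times a galaxy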
#
"UFOs worth studying say scientists",688,0,0,0
(Jun '98)
No, it isn't a set of tired old worn-out scientists who have gone barking mad. In the first independent review of UFO phenomena since 1970, a panel of scientists has concluded that some sightings are accompanied by physical evidence that deserves scientific study. Not because they think there might be little green people, but because there may be a cause "out there" which could be interesting to science.\p
In 1807, US President Thomas Jefferson said of the claim that meteorites exist: "I could more easily believe that two Yankee professors would lie than that stones would fall from heaven." Rather than fall into the Jefferson error of rejecting as impossible that which we do not understand, we need to collect all of the facts.\p
In 1968, Dr. Edward U. Condon, director of the \JColorado\j Project, commented in his UFO report that "further extensive study of UFOs probably cannot be justified in the expectation that science will be advanced thereby." The new panel, by contrast, has concluded that "it may be valuable to carefully evaluate UFO reports to extract information about unusual phenomena currently unknown to science."\p
The scientists found that some of the reported incidents may have been caused by rare natural phenomena, such as electrical activity high above thunderstorms or radar ducting (the trapping and conducting of radar waves by atmospheric channels). However, the panel found that some of the phenomena related to UFOs are not easy to explain in this fashion.\p
\IKey names\i: Peter Sturrock, Von R. Eshleman\p
#
"New planet",689,0,0,0
(Jun '98)
If the UFOs aren't out there, the planets certainly are. Gliese 876, one of our \Jsun\j's nearest neighbors, just 15 light-years away, possesses a \Jplanet\j 1.6 times as massive as Jupiter, according to a report in late June at a symposium of the International Astronomical Union in Victoria, British Columbia. It was revealed by its gravitational pull on the star, detected by Geoffrey W. Marcy and his colleagues using telescopes at Lick Observatory and the Keck I \Jtelescope\j atop Hawaii's Mauna Kea.\p
The finding has since been confirmed through telescopes at the Haute-\JProvence\j Observatory in \JFrance\j and the European Southern Observatory in La Serena, \JChile\j.\p
This brings the current count of planets detected by star wobble to 12. The star itself is just one-third the mass of our \Jsun\j, and this is one of the first low-mass stars to be checked for a \Jplanet\j, raising the question: how many more unanticipated planets are there out there? An announcement of two more planets, found by a Geneva team, was supposed to come out later in June, but still had not surfaced when this report was prepared.\p
The newly detected \Jplanet\j takes 61 days to go around the star, and its average distance from its parent is one-fifth the separation between the \JSun\j and \JEarth\j, making it closer to its star than Mercury is to our \Jsun\j. Even so, the surface \Jtemperature\j appears to be around -60°C (-75°F), though there could be warmer layers further down, where \Jwater\j might exist.\p
#
"Life is everywhere",690,0,0,0
(Jun '98)
Bacterial colonies are thriving underneath ice in one of the coldest, driest deserts on \JEarth\j, according to a \IScience\i report in late June. The \Jbacteria\j are in ice-covered lakes in the McMurdo Dry Valleys of \JAntarctica\j, where the average annual \Jtemperature\j is about 68 degrees below zero and less than four inches of precipitation falls each year. (That figure is on the Fahrenheit scale, though at that \Jtemperature\j range, it matters little, as it converts to -56°C - the scales coincide at -40°.)\p
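The conversion, and the -40° coincidence, is simple enough to verify:\p
  def f_to_c(f):
      # Standard Fahrenheit-to-Celsius conversion.
      return (f - 32.0) / 1.8

  print(f_to_c(-68.0))   # about -55.6, the -56 C quoted above
  print(f_to_c(-40.0))   # exactly -40.0: the point where the two scales meet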
Even in that frigid, arid environment, scientists found liquid \Jwater\j pockets embedded about two meters (six feet) deep in solid ice, where a combination of sediments, \Jwater\j and solar radiation during long summer days supports a viable population of \Jbacteria\j which use sunlight filtering through the ice to activate and sustain life when the South Pole tilts toward the \JSun\j each year.\p
Solar heating allows the \Jwater\j to melt around soil particles that have blown over the ice and have been buried in it. The sediment sinks to a depth of about two meters, and then seems to form a layer there. Microbes covering the soil particles can then spring to life within an hour under certain conditions, even though they are still embedded deep in the ice.\p
Among the microbes the scientists identified were blue-green \Jalgae\j or cyanobacteria, which are the most ancient photosynthetic, oxygen-producing organisms known on \JEarth\j, and various \Jbacteria\j. The community seems to make a substance which works like antifreeze, keeping the \Jwater\j pockets liquid for an extra two weeks before the onset of winter and giving the life forms more time to reproduce.\p
If life can be found there, say the researchers, they can be found almost anywhere. In particular, they are looking at Mars and Europa, a large moon of Jupiter, as possible places where life may exist. So long as conditions have been a little less harsh in the past, life may have evolved, and now be clinging grimly on, somewhere below the surface of Europa or Mars.\p
We still have a great deal to learn about the tiny life forms on our own \Jplanet\j: microbiologists now realise that there are many more organisms out there than we thought, because until now, only those which could grow in culture, or which caused disease, could be detected. Now, with advances in molecular \Jbiology\j, we can identify these unknown organisms, but studying them and their peculiar abilities will take longer.\p
In one sense, the new discoveries make expeditions to the \Jsolar system\j neighbors urgent - that is what scientists will be urging - but on the other hand, we probably need to know a great deal more before we rush off in search of the unknown.\p
#
"NEAR but far",691,0,0,0
(Jun '98)
NEAR, the Near \JEarth\j Asteroid Rendezvous \Jspacecraft\j, became the most distant man-made object ever detected by optical means when Gordon Garradd of Loomberah, New South Wales, \JAustralia\j, spotted it at 20,909,000 miles (33,650,000 kilometers) from \JEarth\j.\p
The craft was visible because NEAR's 100 square feet (about 9 square meters) of solar panels fortuitously reflected sunlight directly at \JEarth\j for a few minutes following the \Jspacecraft\j's successful 12th trajectory correction maneuver.\p
NEAR has now completed more than 1.6 billion km (1 billion miles) of its journey since its launch on Feb. 17, 1996, and is more than halfway to a Jan. 10, 1999, rendezvous with asteroid 433 Eros. NEAR will be the first \Jspacecraft\j to \Jorbit\j an asteroid and to study its composition and characteristics at close range, getting as close as 15 km (9 miles) from the surface.\p
Images are available at \Bhttp://usrwww.mpx.com.au/~gjg/near2.htm\b\p
#
"Hubble 'sees' expanding nova shells",692,0,0,0
(Jun '98)
During June, NASA announced the discovery of expanding clouds of gas thrown off by nuclear eruptions in stars, visible as rings around those stars. The result is of interest because it provides reliable distances to the stars.\p
Pictures of the gas clouds, which were taken with the Hubble Space \JTelescope\j, are available for viewing on the World Wide Web at \Bhttp://www.astro.psu.edu/users/ringwald/\b and a grayscale PostScript image is available for downloading at \Bhttp://www.astro.psu.edu/users/ringwald/novashells.ps\b\p
The two novae had nuclear eruptions in 1984 and 1991, and the stars are called QU Vulpeculae, in the \Jconstellation\j Vulpecula (the Fox with the Goose), and V351 Puppis, in the \Jconstellation\j Puppis (the Afterdeck of Argo, the Ship). The stars are still visible in the centers of the expanding clouds, called \Jnova\j shells.\p
These two \Jnova\j shells are so far away that they appear to be tiny, barely one second of an arc in apparent diameter, while a full moon or the \Jsun\j covers a breadth of 30 arc minutes, or 1800 times that width. The expansion speeds of the shells have previously been measured at more than 3200 km/sec (2000 miles per second) for QU Vul, and around 5200 km/sec (3200 miles per second) for V351 Pup.\p
Since we know how fast the shells are expanding, and when the novae erupted, this means we know the absolute size of the shells, as well as their apparent size, so that calculating their distance becomes a simple piece of \Jgeometry\j. QU Vul is 18,300 light-years away, and V351 Pup is 14,800 light-years away.\p
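In outline, the calculation looks like the Python sketch below. The numbers fed in are deliberately hypothetical round figures: the published distances rest on careful modeling of each shell, not on this bare formula.\p
  KM_PER_LIGHT_YEAR = 9.461e12
  ARCSEC_PER_RADIAN = 206265.0
  SECONDS_PER_YEAR = 3.156e7

  def expansion_distance_ly(speed_km_s, years_since_eruption, radius_arcsec):
      # The shell's true radius is expansion speed times elapsed time;
      # dividing by the apparent (angular) radius gives the distance.
      radius_km = speed_km_s * years_since_eruption * SECONDS_PER_YEAR
      angle_rad = radius_arcsec / ARCSEC_PER_RADIAN
      return radius_km / angle_rad / KM_PER_LIGHT_YEAR

  # Hypothetical shell: 1000 km/sec for 10 years, appearing 0.5 arcsec in radius.
  print(expansion_distance_ly(1000.0, 10.0, 0.5))  # about 14,000 light-years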
\IKey names\i: Fred Ringwald, Jerome A. Orosz, Richard A. Wade, and Robin B. Ciardullo\p
#
"July, 1998 Science Review",693,0,0,0
\JFrench science to get a boost\j
\JChainsaw-equipped robot goes after smokers\j
\JTsunamis in the news\j
\JClean water\j
\JHerbicides the sneaky way\j
\JExotics on the rise in Hawaii\j
\JVirtual dissection\j
\JCeramics which keep their shape and size\j
\JThese shells aren't seashells, I'm sure\j
\JNegative resistance?\j
\JCarbon-36 fullerenes could be higher-temperature superconductors\j
\JAn artificial liver\j
\JTaking the automatic pilot a step further\j
\JTexas heat wave only the beginning\j
\JEarly 1998 the warmest on record in northeastern USA\j
\JSIDS rates dropping\j
\JHerpes in the news\j
\JPreventing eye herpes\j
\JMenopause less of problem in Japan\j
\JOsteoporosis and steroids\j
\JPreventing steroid osteoporosis\j
\JOlive oil not such a good defence after all\j
\JImplants safe in Britain\j
\JDark honey is better than light honey\j
\JSpices and health\j
\JMarijuana as damaging as tobacco\j
\JNow it's Dolly the mouse?\j
\JSyphilis genome sequenced\j
\JPhytophthora defence mechanisms clearer\j
\JThe rise and rise of MRSA\j
\JPoliovirus as a friend, not as a foe\j
\JThe pollinators of the cycads\j
\JThe first modern humans\j
\JClues in dung\j
\JLittle satellite lost\j
\JAlan Shepard dies at 74\j
\JA new family of galaxies\j
\JMartian life in doubt once again\j
\JA new class of asteroid\j
\JHeat wave on Io\j
\JGetting ready to collect comet dust\j
\JDon't bother to duck\j
#
"French science to get a boost",694,0,0,0
(Jul '98)
\JFrance\j's research and education minister is Claude Allègre, a geochemist, and he is planning an ambitious boost for French science. Over the next four years, if Allègre gets his way, \JFrance\j will double the impact of its scientific publications, triple the number of its international patents, and create 400 new high-technology companies.\p
This plan was announced on July 15, in the wake of \JFrance\j's World Cup victory in \Jsoccer\j. The changes will include a move toward a system of peer-reviewed grants to finance publicly funded research - at the moment, strange and impenetrable formulae are used to determine where most of the funding goes.\p
The government also plans to gradually shift the evaluation of research in large public research institutions, such as the basic research agency CNRS and the biomedical research agency INSERM, away from internal committees and instead rely on external review panels made up of French and foreign scientists.\p
Responses have been mixed, with some scientists saying that a peer-review system will funnel more research money to "active young people, good people," rather than funds ending up in the hands of senior lab directors, whose productive years may be behind them. Scientists from less fashionable disciplines are also concerned that they might be left out in the cold when the funds are being handed out. Some of them are even suggesting that certain areas need to be given special protection.\p
It is difficult to think of any government-funded drive to boost science which has actually delivered the promised advances, other than perhaps the Manhattan Project, which had rather different aims. Without funding, progress will never happen, but funding alone does not guarantee that a boost to science will occur. Still, with a committed scientist in the driving seat, French science has its best chance for quite a long while.\p
#
"Chainsaw-equipped robot goes after smokers",695,0,0,0
(Jul '98)
No, it's not another plot from the anti-\Jtobacco\j lobby. The smokers in this case are "black smokers", mineral chimneys formed by undersea \B\1hydrothermal vents\b\c, or hot springs, and they have been cut from the ocean floor by a submersible robot, equipped with a chainsaw.\p
The smokers were found 2250 metres under the sea at hydrothermal vents along the Juan de Fuca ridge, a line of undersea volcanoes some 300 kilometres off Washington state and Vancouver Island. They have been removed for study and eventual display at New York's American Museum of Natural History. The smokers were brought ashore during July, complete with the microbial communities and other life-forms they harbour.\p
Smokers form when seawater seeps down through cracks and crevices into the crust, picking up dissolved chemicals as it goes, until it is drawn into an upward current which carries it back to the sea floor again. By now the \Jwater\j is very hot, much hotter than the boiling point of \Jwater\j as we know it, but as soon as it bursts out into the ocean, the turbulent current is cooled, precipitating out many of the dissolved minerals. These slowly build up a tube, rather like a chimney, while the flow, still rich in sulfides, continues to pour out of the top; as the sulfides change into fine black particles, they rise out of the chimney like smoke.\p
One of the active chimneys taken was venting \Jwater\j at 350°C. In spite of (in fact, because of) the heat, the chimney hosted a rich collection of creatures known as extremophiles - animals such as sea spiders (not real spiders but arachnid-like pycnogonids), snails, limpets, and worms that flourish in harsh environments.\p
There are also \Jhydrogen\j sulfide-metabolising microbes living inside the rock, and these will be of particular interest to the scientists who are now picking the smokers apart.\p
The process was filmed for \Jtelevision\j, and several of the smokers will end up on display at the New York museum when a new Hall of \JPlanet\j \JEarth\j opens, in the northern spring of 1999.\p
#
"Tsunamis in the news",696,0,0,0
(Jul '98)
A \B\1tsunami\b\c slammed into the northern coast of Papua-New Guinea on the evening of July 17, drowning an estimated two thousand people and prompting many to ask what could be done in the way of early warning systems or barriers to prevent such disasters.\p
The tsunami in question probably came from a nearby undersea \Jearthquake\j, so the time required to sound an alarm was very limited. The \Jearthquake\j did not show any signs in the seismic trace that it would cause this problem, and detectors at sea probably would not have experienced anything out of the ordinary run of events.\p
The answer probably lies more in building seawalls and controlling building in at-risk areas. The cost of sea walls, however, is probably prohibitive, and the nature of the coast in the area probably makes it close to impossible for the Papua-New Guinea people to find safe areas which are close enough to the coast.\p
A recent conference in \JJapan\j looked at the effects of the tsunami which struck Okushiri Island and south-west \JHokkaido\j in July, 1993, killing 120 people and causing widespread destruction, estimated at $600 million.\p
In the past five years, the Japanese government has completed $60 million worth of coastal improvements, including a sea wall of reinforced concrete, nearly 50 feet (15 metres) high, around areas of the island that received most of the damage from the 1993 tsunami. The southern tip of the island, where the entire village of Aonae was washed away, is now a memorial park, which is never to be built on.\p
Both tsunamis were caused by earthquakes close to the coast, although the New Guinea wave was generated by an \Jearthquake\j with its \Jepicentre\j on dry land. The quake caused the sea floor to subside, and when it bounced back, a huge hump of \Jwater\j formed and hurtled towards the shore as the tsunami.\p
#
"Clean water",697,0,0,0
(Jul '98)
Aside from the fast-traveling tsunamis, the quality of slower-moving \Jwater\j seems also to have been a major talking point around the world during July. Late in the month, the four million residents of Sydney, \JAustralia\j, were told to boil all their drinking \Jwater\j after \ICryptosporidium \iand \IGiardia \iwere detected in pipes.\p
In the United States, a new way of removing deadly contaminants from \B\1groundwater\b\c was revealed during July. The developers say In-Situ Redox Manipulation, or ISRM, can be used to remediate (fix) contaminated \Jgroundwater\j at up to 60% savings over 10 years, when compared with present methods.\p
ISRM involves adding a chemical to the \Jgroundwater\j which can deal with problem chemicals, and it can be used on a variety of chemicals. In a recent test, the target chemical was chromate, which is often found in the \Jgroundwater\j near metal-plating operations.\p
ISRM has been used to control chromate left over from operations at the Hanford Site in Washington state, where \Jplutonium\j was produced from 1943 to 1989 for use in building nuclear weapons. The chromate was used to inhibit erosion in \Jaluminium\j fuel elements in nuclear reactors. In field tests, the chromate was removed to levels below those set for safe drinking \Jwater\j, and below the safe levels for aquatic life.\p
The idea is that ISRM uses a standard six-inch (150 mm) \Jgroundwater\j well to get to the \Jwater\j, and all treatment takes place well below the surface, making it safer. Five wells were placed about 500 feet (150 metres) from the Columbia River in the path of a known chromate plume. The chemical sodium dithionite was then added to the wells, diluted in \Jwater\j and buffered with \Jpotassium\j carbonate and \Jpotassium\j \Jbicarbonate\j.\p
The sodium dithionite spread out in a circle and created a barrier which is expected to remain in place for up to thirty years. Depending on the reagents used, such barriers will either immobilise or destroy the targeted contaminants. The developers also expect ISRM to be effective in removing technetium and uranium, as well as chlorinated solvents such as trichloroethylene. Other tests are planned which will target chlorinated \Jhydrocarbons\j.\p
Riverside soil is also being brought into play as a natural purifying agent. In early July, researchers at The Johns Hopkins University described a new method, riverbank filtration, for ridding \Jwater\j of unwanted wildlife such as viruses, \Jprotozoa\j and \Jbacteria\j. As \Jwater\j is required, it is taken from wells drilled some distance from the river, so that the \Jwater\j taken from the wells has first to pass through a considerable amount of soil, which acts as a natural filter.\p
Hopkins researchers have now started a study to find out how effective river filtration is at removing pathogens from the \Jwater\j. They will also be looking to see how well the method removes other organic matter which can form potentially dangerous by-products when disinfection treatment reacts with the organics. Tiny pieces of decayed plant material, for example, can be found in all \Jwater\j. By themselves, these pieces of plant may affect how the \Jwater\j tastes or smells, but they pose no health risks. Disinfection agents such as \B\1chlorine\b\c can react with the plant material to form \Jchloroform\j, a suspected carcinogen.\p
One of the researchers' concerns is with antibiotic-resistant \Jbacteria\j in the \Jwater\j, which might otherwise be spread more widely. Another is with \Jbacteria\j which are becoming more resistant to disinfection. Interestingly, the technique was developed in the 1970s to deal with taste and odour problems in European \Jwater\j supplies, and also to get rid of dangerous chemicals such as pesticides and \Jhydrocarbons\j, but little work has been done on pathogen removal.\p
Standard \Jwater\j treatment leaves \Jwater\j utilities in a tight spot: if they treat the pathogens with \Jchlorine\j to deal with a known health risk, there is every chance that they will trigger a new health risk by adding new toxic chemicals to the \Jwater\j when the organic fragments react with the \Jchlorine\j. So the well scheme offers a real alternative, since soil \Jbacteria\j should, all going well, consume most or all of the organic fragments before they reach the wells.\p
The first results of the project will be reported in the second half of 1999. In the mean time, related web sites are said to be available at \Bhttp://www.jhu.edu/~dogee/\b and at \Bhttp://www.jhu.edu/~dogee/bouwer.html\b\p
Nitrate levels in runoff \Jwater\j are a problem in many parts of the world. The \Jnitrogen\j compounds are bad for \Jruminant\j animals such as \Jcattle\j, and also for infant humans, because they interfere with the blood's ability to carry oxygen. As well, high \Jnitrate\j levels contribute to \B\1eutrophication\b\c, which can cause algal blooms in rivers and lakes, and the wasted \Jnitrogen\j represents an extra cost to farmers, which must be passed on to consumers.\p
Most of this excess \Jnitrogen\j comes either from waste \Jwater\j, or from the leaching of excess fertiliser into rivers and \Jgroundwater\j. A study detailed in March in \IAgricultural Ecosystems and Environment\i, but discussed in more detail on the \JInternet\j in July, looks at what happened to \Jnitrogen\j used on fields in Illinois.\p
After a poor growing season, 100 pounds per acre (110 kg/hectare) of \Jnitrate\j remained in the soil, but about 40% of this washed out during the next winter and spring. It is no real surprise to learn that the two main factors in predicting \Jnitrogen\j run-off were the concentrations left in the soil, and the amount of rainfall which caused the leaching, with 95% of the leaching occurring in winter and spring. The lesson for farmers: apply your \Jnitrogen\j fertilisers in spring in climates like that of Illinois.\p
One of the lessons to be drawn from the Sydney, \JAustralia\j \Jwater\j scare is the need for ways of testing \Jwater\j and getting rapid results. If there are dangerous pathogens in the \Jwater\j, then early warning is essential.\p
An Ohio University scientist, Anthony Andrews, announced on the \JInternet\j a new technique (which he described recently in the journal \IAnalyst\i) which detects toxins in \Jwater\j in less than 10 minutes. It is based on the methods used in drug testing to screen for the presence of suspicious substances, with only those which prove positive on the first test being sent on for more detailed testing.\p
Accurate testing of a river may mean analysing as many as a hundred different samples, all taken from different locations. These analyses take several hours to complete, even if the sample later turns out to be completely safe. Andrews' method allows samples which need extra analysis to be identified much faster, drawing attention to those areas where extra sampling may be needed.\p
The method, sensitive down to just one-billionth of a gram per litre of \Jwater\j, uses a fibre coated with a special chemical layer. When this is dipped into the \Jwater\j sample, toxic molecules are preferentially drawn to the chemical layer, which takes a stronger hold than the \Jwater\j on the target chemical.\p
The fibre is then placed in an injection port of a gas chromatograph, where it is heated to 250 degrees \JCelsius\j, which drives the molecules off the fibre, and through the chromatograph for separation. Since the fibre has specifically selected the target molecules, only these are available to be driven off, making counting of the molecules a simple task.\p
So far, the method has only been tested on a DDT sampling exercise in the Hocking River, in south-eastern Ohio. Andrews says he can see no reason why the technique could not be applied to other pollutants as well.\p
#
"Herbicides the sneaky way",698,0,0,0
(Jul '98)
A July paper in \IGenes and Development\i indicates that there may be better ways to control weeds. Researchers at the Whitehead Institute for Biomedical Research say they have cloned and characterised a plant gene called EIR1 (\JEthylene\j Insensitive Root 1) which plays a critical role in the ability of roots to grow toward the \Jearth\j in response to gravity. They did this working on the genetic workhorse, \IArabidopsis thaliana\i, in which the roots of mutant weeds lacking EIR1 lose their ability to respond to gravity and are unable to grow downward into the soil.\p
This opens up the prospect of a \Jherbicide\j which is targeted at the EIR1 gene, which should be far safer for humans and other animals. Since \IArabidopsis\i is genetically similar to food crops like rice and corn, this could lead to useful new insights into the way these crops grow, say the researchers.\p
The findings also give us a better insight into a fascinating aspect of plant growth: tropisms in general, and gravitropism (also called geotropism) in particular. Plants have the ability to determine which way is up and which way is down, and radish seedlings germinating in \Jagar\j jelly will automatically turn the growing root tip if the seedling is rotated. This is tied in with a plant hormone called indole acetic acid (IAA). The redistribution of IAA around the root tip is responsible for gravitropism.\p
When the root tip is cut off, the plant is no longer able to grow downward. When roots are oriented horizontally, IAA accumulates along the lower side of the elongating zone. Cells on the top part of the root elongate, causing the downward curving of the root. It seems likely that the transport of IAA is assisted by a gene that acts as a \Jpump\j to redistribute the hormone up and down root cells as needed, and the EIR1 gene may represent this \Jpump\j.\p
\IKey names\i: Christian Luschnig, Paula Grisafi, Roberto Gaxiola, Gerald R. Fink\p
#
"Exotics on the rise in Hawaii",699,0,0,0
(Jul '98)
Islands are often home to many endemic species, species found nowhere else, which have evolved from one or two new arrivals in the distant past. Island ecosystems can withstand occasional arrivals, but they are less able to deal with large-scale invasions. Bird and other species brought into Hawaii from the mainland United States tend to invade new habitats and displace native species. A July paper in \IBioScience\i shows that nearly half of the species on the brink of \Jextinction\j were put there in part by exotic invaders, often introduced into new habitats by people.\p
A team led by David Wilcove at the Environmental Defense Fund combined the federal \Jendangered species\j list with a catalogue of species at risk which is maintained by the Nature Conservancy. Then they determined the percentage of species affected by five types of threat: habitat destruction, the spread of alien species, \Jpollution\j, overharvesting, and disease. The main problem turned out to be habitat destruction, which threatened 85% of species, but alien species came next, appearing to harm 49% of the plants and animals on the list.\p
The problem is much wider than just the Hawaiian species: most islands of the world have similar problems with introduced species which destroy habitat and prey on established species which, once extinct, can never be fully replaced.\p
#
"Virtual dissection",700,0,0,0
(Jul '98)
Researchers at the Stanford University Medical Center say they are developing a \B\1virtual reality\b\c model of a frog to be used as a computer-based teaching tool for middle school and high school \Jbiology\j students. At a VR in education conference in London in early July, they explained how it was possible to dissect a frog while missing out on the smell, the mess, and the occasional slippages which destroyed that nerve or blood vessel you were seeking.\p
The "frog" is a 3D rotatable and zoomable beast developed by the SUMMIT (Stanford University Medical Media and Information Technologies) Group. It is part of the Virtual Creatures project, where the SUMMIT group are exploring ways to use computers and interactive software to teach vertebrate \Jbiology\j.\p
While computer games like "Myst" and "Doom" only generate simple virtual environments, the Virtual Creatures team exploited more powerful technology to create a richer environment - called Frog Island - with many opportunities for interactive learning. Besides viewing and manipulating the three-dimensional frog, students can call up photos of frogs in their natural habitats and consult virtual texts for thorough explanations.\p
Conservationists will welcome a move that secures the future of amphibians, although a report in \INature\i during July suggests that the drop in numbers of amphibians in two parts of the world, Panama and \JAustralia\j, may be caused by a chytrid \Jfungus\j.\p
Readers can visit SUMMIT's Virtual Creatures at \Bhttp://summit.stanford.edu/creatures\b\p
#
"Ceramics which keep their shape and size",701,0,0,0
(Jul '98)
\B\1Ceramics\b\c are used to make high-tech devices like fuel cells, medical implants, cellular phones, gas or \Jtemperature\j sensors, and even \Jautomobile\j engines. The major problem with \Jceramics\j is that the freshly-moulded parts have to be fired at high temperatures in order to obtain a ceramic body that is free of pores. This pore-removing process, called sintering, shrinks the parts. Sometimes the ceramic part shrinks non-uniformly, and the shrinking sometimes causes the product to deform and develop cracks.\p
This is not a problem with traditional \Jceramics\j such as pots and crockery, and even a teapot can have a loose lid without any problems, but more advanced devices just won't perform if the dimensions are not perfect.\p
According to a recent report in the \IJournal of Materials Research\i, the answer seems to lie in starting with a mixture of ceramic and metal powders, rather than the traditional plain ceramic powder. When firing takes place, these metals are oxidised, and form \Jceramics\j as well. Most metals form a ceramic with a greater volume, but a few metals such as the alkaline \Jearth\j metals, \Jmagnesium\j, \Jcalcium\j, \Jstrontium\j, and \Jbarium\j, form \Jceramics\j with a smaller volume than the original metal: with the right combination, you can achieve a ceramic with exactly the same volume as the precursor materials.\p
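As a rough illustration of the balancing act, and not the researchers' own recipe, the metal fraction needed for zero net volume change can be estimated from approximate textbook volume-expansion ratios, assuming the volumes simply add:\p
# Illustrative sketch: blend two metal powders so that the oxide formed on
# firing occupies the same volume as the starting metal. The Pilling-Bedworth
# ratios (oxide volume / metal volume) below are approximate textbook values,
# and real powders need not mix this simply.

PB_MG = 0.81   # MgO takes up ~81% of the volume of the Mg it came from
PB_AL = 1.28   # Al2O3 takes up ~128% of the volume of the Al it came from

# Solve f*PB_MG + (1 - f)*PB_AL = 1 for the magnesium volume fraction f.
f_mg = (1 - PB_AL) / (PB_MG - PB_AL)
print(f"Mg volume fraction for zero net change: {f_mg:.2f}")  # about 0.60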
The researchers find that they can produce a malleable powder that is easy to form into complex shapes. In fact, they are able to shape it with metallurgical techniques like rolling, forging, extrusion, and machining. They say they have even rolled a sheet just 20 micrometres thick - a human hair is about 100 micrometres across.\p
\IKey name\i: Ken Sandhage, Ohio State University\p
#
"Coatings that mimic seashells",702,0,0,0
(Jul '98)
The children's tongue-twister about seashells sold by the seashore actually celebrates the work of a 19th century \Jfossil\j collector and preparator, Mary Anning, who made a living from finding and cleaning fossils for sale to rich patrons. All over the world, shells have been admired as objects of beauty, and shells have even been used as currency. In one famous incident, German authorities used cheap pottery cowrie shells in the German New Guinea territory, but the forgeries were quickly detected.\p
Now comes the news that we may soon be able to grow shells which are impossible to distinguish from the real thing. Or at least that we can now create coatings that mimic seashell structures, according to a report in \INature\i in mid-July. Researchers at the US Department of \JEnergy\j's (DOE) Sandia National Laboratories and the University of New Mexico say the process permits rapid formation of tough, strong, optically transparent coatings suitable for applications such as automotive finishes, as well as coatings for implements and optical lenses.\p
Shells are made of \Jaragonite\j, a crystal form of \Jcalcium\j carbonate, and a small amount of organic material: an \B\1abalone\b\c shell, for example, is composed of approximately 1% polymer and 99% \Jaragonite\j (CaCO\D3\d) by volume, yet it is two times harder and a thousand times stronger than its constituent materials. The secret of the shells' strength seems to lie in their structure, which features alternating layers of flexible, cushioning biopolymers and hard layers of \Jaragonite\j.\p
Hard substances break when cracks move through them: the effect of the biopolymer layers is to stop the cracks spreading.\p
\IKey names\i: Alan Sellinger, Jeff Brinker, Pilar Weiss and Yungfeng Lu\p
#
"Negative resistance?",703,0,0,0
(Jul '98)
This story is so peculiar, there might just be something in it, or it may turn out to be an important discovery with an entirely different explanation. Deborah D. L. Chung, a professor of mechanical and aerospace \Jengineering\j at the University of Buffalo, has just described a strange observation which seems to imply the existence of negative resistance in a conductor.\p
Describing her work on July 9 in a keynote address at the fifth International Conference on Composites \JEngineering\j in Las Vegas, Chung was careful in her wording when she said that she had observed apparent negative electrical resistance at the interfaces between layers of carbon fibres in a composite material. She also stressed that the mechanism behind the observation of negative resistance, at the geometrically complex interface between fibre layers, is still unclear.\p
She believes that the observed negative resistance indicates that the electrons in the system are flowing in a direction opposite to that in which they normally flow. The results were obtained when she and a colleague were evaluating how different curing pressures and matrix materials affected the junction between carbon-fibre layers.\p
Even if Chung's explanation is wrong, there appears to be an interesting situation here, one that deserves further exploration. A paper has now been submitted to an unspecified peer-reviewed journal, and a patent application has been filed, so no doubt we will hear more of this issue in the future, one way or another.\p
#
"Carbon-36 fullerenes could be higher-temperature superconductors",704,0,0,0
(Jul '98)
On firmer ground, theorists at the US Department of \JEnergy\j's Lawrence Berkeley National Laboratory believe that the carbon-36 \B\1fullerenes\b\c may lose all electrical resistance at temperatures far higher than any other carbon structure, perhaps even in the ranges that superconducting copper-oxide \Jceramics\j have achieved. The details appeared in \IPhysical Review Letters\i in July, and also in \IChemical Physics Letters\i last March.\p
\JPotassium\j atoms sandwiched between the planes of \Jgraphite\j can produce a \B\1superconductor\b\c, but only at half a degree Kelvin. The standard C-60 buckyballs, if they are doped with \Jalkali\j metals, can be superconductors at up to 40 K.\p
The difference seems to be in the curvature of the ball, say the theorists. If this is so, then C-36 balls, with tighter curvature, should reach an even higher \Jtemperature\j. Three researchers at UC Berkeley were the first to extract an appreciable amount of C-36, having been encouraged by the theoretical calculations. They announced their success in producing the C-36 fullerenes in \INature\i on 25 June, 1998.\p
But why would it be a superconductor? The likeliest structure incorporating 36 carbon atoms consists of two "bowls", each a hexagon surrounded by six pentagons; each bowl has 18 vertices where atoms sit. The two pentagon-sided bowls face each other, forming an equatorial belt of six more hexagons. This has something called a D6h symmetry, meaning that if it is rotated around its long axis, it looks the same after each sixth of a turn, and if it is sliced through the equator, its top and bottom are identical.\p
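That geometry is easy to sanity-check. A minimal sketch, assuming the D6h cage described above and the usual three bonds per carbon, confirms the atom count with Euler's polyhedron formula, V - E + F = 2:\p
# Sanity check of the proposed D6h cage for C-36 using Euler's formula.
# Face counts follow the description above: one hexagon plus six pentagons
# per bowl, and an equatorial belt of six more hexagons.

pentagons = 2 * 6            # 12 pentagonal faces
hexagons = 2 * 1 + 6         # 8 hexagonal faces
faces = pentagons + hexagons

edges = (5 * pentagons + 6 * hexagons) // 2   # each edge is shared by 2 faces
vertices = 2 * edges // 3                     # each carbon bonds to 3 others

print(vertices, edges, faces)         # 36 54 20
assert vertices - edges + faces == 2  # Euler's formula for a closed cage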
A carbon atom in the D6h structure may have one of three different bonding configurations, depending on its relation to its neighbours. Thus C-36 has three unique atoms, but more importantly, the bonds are mostly strained. This strain influences \Jelectron\j-phonon coupling, the mechanism that makes \Jsuperconductivity\j possible according to the BCS theory, put forward in 1957 by \B\1John Bardeen\b\c, Leon Cooper, and Robert Schrieffer to explain \Jsuperconductivity\j in terms of the motion of \Jelectron\j pairs.\p
Now here it gets technical. Phonons are a way of representing atomic vibrations in a solid; oscillations in interatomic charge can make it possible for electrons to move as pairs. The bent or strained bonds may expose \Jelectron\j orbitals normally unaffected by the vibrational modes in a sheet of flat \Jgraphite\j. The more \Jelectron\j orbitals the atomic vibrations can affect, say the theorists, the greater the potential for \Jelectron\j-phonon coupling and the greater the prospects for \Jsuperconductivity\j.\p
In the suspected structure, the carbons at the vertices of a pentagon are under greater strain than those at the vertices of a hexagon, and clustered pentagons create even more strain: every atom in the D6h structure is at the vertex of one or two pentagons.\p
Preliminary results suggest that the C-36 fullerenes may be superconducting above 77 K, the boiling point of \Jnitrogen\j, and one of the key targets to reach. Whether it will get near, or even pass the present record-holder at 133 K, remains to be seen.\p
(In fact, one superconductor, HgBa\D2\dCa\D2\dCu\D3\dO\D8+δ\d, can reach a superconducting \Jtemperature\j of around 164 K when hydrostatic pressure is applied, but this is an unusual situation, although epitaxial strain effects were explored in a Letter to \INature\i during July.)\p
\IKey names\i: Steven G. Louie, Marvin Cohen, Michel Côté, Jeffrey C. Grossman, Charles Piskoti, Alex Zettl, and Jeff Yarger\p
#
"An artificial liver",705,0,0,0
(Jul '98)
Each year, more than forty thousand people die from liver disease in the USA. In 1997, approximately 4100 American patients underwent liver transplantation while an estimated 1000 died awaiting a donor organ. Currently, there are more than ten thousand patients on the US liver transplant waiting list. So medical workers have welcomed the news that trials have just commenced in the USA of a "bioartificial liver support system" which is marketed under the trade mark of HepatAssist. The third trial, the first to come to your reporter's attention, was announced on the \JInternet\j during July, with perhaps thirty more trials to be undertaken during the next year.\p
The liver support system is designed to provide temporary essential liver function for patients with acute liver failure. Researchers hope that the system may provide a "bridge" to liver transplant or to regeneration and recovery. It should increase the survival rate of patients waiting for a liver transplant by extending the "bridge" time until an organ match can be found, and it may also support non-transplant candidates until their own liver can regenerate and heal.\p
The support system works on the blood outside the body, much like a \Jdialysis\j machine. The patient's plasma is circulated through a cartridge containing thousands of hollow fibre membranes surrounded by living, sterile pig liver cells, called hepatocytes. The plasma is detoxified by the oxygen-rich hepatocytes, which are capable of performing many of the metabolic functions of a healthy liver. One treatment session lasts seven hours; treatments can be administered at eight-hour intervals for up to 14 days.\p
#
"Taking the automatic pilot a step further",706,0,0,0
(Jul '98)
Some time in August, the first \Jaircraft\j without a human pilot will cross the Atlantic, according to plans revealed in July. There are now three robotic \Jaircraft\j, also called autonomous \Jaircraft\j, ready to take to the air. The first plane, due to fly from Bell Island Airport in St. John's, Newfoundland, to the Benbecula Military Range on South Uist in the Outer \JHebrides\j of Scotland, will take about 24 hours to cross the 3200 km (2000 miles) of ocean in between. The landing site had to be changed after Irish authorities declined to allow the plane to land at Belmullet.\p
The planes will fly below conventional air traffic patterns, and special arrangements have been made with international \Javiation\j authorities to ensure the safety of other \Jaircraft\j. Field trials off Canada and Western \JAustralia\j have established that the planes can last for the duration of the crossing, though there are some lingering problems with the engine. The planes have been under development since the early 1990s, and production models are planned to cost around US$10,000 each, making the risk of losing one now and then less of a worry than losing a conventional \Jaircraft\j.\p
The three planes all weigh 13 kg (29 pounds) and have a wing-span of 3 metres (10 feet). In the long term, we should not expect to see pilotless passenger craft on the Atlantic run: the planes are intended to be used mainly in \Jweather\j reconnaissance. The Aerosonde has been developed by Insitu and Environmental Systems and Services of Melbourne, \JAustralia\j, with support from the Australian Bureau of \JMeteorology\j, for reconnaissance to improve \Jweather\j forecasting. Insitu is now working with the University of Washington Department of Aeronautics and Astronautics, under sponsorship from the US Office of Naval Research, to develop a trial Aerosonde \Jweather\j reconnaissance program off the west coast of the United States. Future production models will need to have a greater range than at present, but the first Pacific crossings may be as early as mid-1999.\p
#
"Texas heat wave only the beginning",707,0,0,0
(Jul '98)
July saw a summer heat wave in \JTexas\j and the south-west of the USA, with temperatures soaring, \Jday\j after \Jday\j, apparently as a result of the action of El Niño. Now, as \B\1El Niño\b\c dies away, La Niña will be taking over, and this should guarantee that the same region has a dry northern winter.\p
The \Jdrought\j was probably kicked off by El Niño last January, when the tropical \Jatmosphere\j cranked up high-altitude winds that travelled northward and dried out as they descended over Central America and Mexico, parching those regions. At the time, this enhanced airflow also sent winter storms straight across the southern tier states and soaked them.\p
As the northern spring progressed, the dry, descending air persisted and shifted northward, bringing Mexico's dryness to \JTexas\j, and pushing the winter's wet storminess to the north. At the moment, the outlook is for a serious \Jdrought\j in the southern USA.\p
#
"Early 1998 the warmest on record in northeastern USA",708,0,0,0
(Jul '98)
The effects did not stop with the southern USA. The first half of 1998 was the warmest first six months for any year since records began in the northeastern USA. Based on a thirty-year average, the twelve northeastern states have an average \Jtemperature\j of 41.2°F (5.1°C), but the average this year was 45.3°F (7.4°C). A major factor in this came from the El Niño effect, so the likely pattern for the rest of the year is uncertain. The previous record was set in 1921.\p
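The \JCelsius\j equivalents can be checked with the standard conversion, as in this quick sketch:\p
# Quick check of the Fahrenheit-to-Celsius conversions quoted above.
def f_to_c(f):
    return (f - 32) * 5 / 9

print(f"{f_to_c(41.2):.1f} C")  # 5.1 C, the thirty-year average
print(f"{f_to_c(45.3):.1f} C")  # 7.4 C, the 1998 figure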
#
"SIDS rates dropping",709,0,0,0
(Jul '98)
The incidence of \B\1Sudden Infant Death Syndrome\b\c (SIDS) appears to be dropping in the United States, according to a report in a July issue of \IThe Journal of the American Medical Association\i (\IJAMA\i). The cause of the drop is presumed to be a campaign to have healthy infants placed on their backs or sides to sleep.\p
Between 1992 and 1996, the prevalence of US infants being placed to sleep on their stomachs dropped by 66%, and while a strict cause-and-effect relationship cannot be conclusively proven, the rate of SIDS dropped by 38% during the same period. The report also identified a number of characteristics which seem to make it more likely that mothers will place their babies on their fronts to sleep.\p
According to the report, women were less likely to place infants to sleep on their backs if they had one or more of the following characteristics: they were reported as Black, had fewer than 16 years of education, had more than one child, or lived in southern or mid-Atlantic states. This is evidence once again that medicine is sometimes more a matter of public education than of drugs, hygiene or surgery.\p
\IKey names\i: Marian Willinger, Samuel Lesko and Ruth Brenner, and there is a campaign website on this issue available through the NICHD home page \Bwww.nih.gov/nichd/\b\p
#
"Herpes in the news",710,0,0,0
(Jul '98)
In late July, a report in the \IProceedings of the National Academy of Sciences\i revealed that, in mice at least, changes in social interactions can stimulate a dormant herpes virus to resurface. In a series of experiments, 40 percent of mice with latent herpes had their virus reactivated when their social structure was reorganised, leading to conflicts among the mice. The outbreaks were most common in the dominant mice, who were involved in the most aggressive social interactions.\p
The herpes virus hibernates inside the body's cells and remains dormant there until it gets a signal from the body. In this case, social stress caused the virus to reactivate. One stress hormone, corticosterone, almost doubled its concentrations in the socially reorganised mice as compared to controls.\p
This finding gives other researchers an excellent animal model to study the relationship between stress and immunity. It may also mean that you can blame your next herpes outbreak on the appointment of a new boss, or any other change which makes you worry about your status.\p
\IKey names\i: Ronald Glaser, David Padgett, John Sheridan\p
#
"Preventing eye herpes",711,0,0,0
(Jul '98)
A report in the same week in the \INew England Journal of Medicine\i shows that the anti-viral agent acyclovir is particularly effective in preventing recurrences of a serious form of the eye disease, called stromal keratitis, which can lead to scarring of the \Jcornea\j with vision loss and \Jblindness\j. The rate of recurrence of this blinding form of the disease dropped 50 percent when patients took 400 milligrams of acyclovir by mouth twice a \Jday\j.\p
Acyclovir is commonly used to treat and prevent genital herpes, but this is the first time a treatment has been proven effective in preventing recurrence of the eye disease. This is good news, as acyclovir has few side effects.\p
Herpes of the eye is usually caused by herpes simplex I virus, which also causes fever blisters or cold sores around the face and mouth. The problem arises because the herpes simplex virus persists between disease episodes in the nerve cells involved in pain and touch to the lids and eyes. The virus may remain dormant for long periods, but once an eye has been infected, a repeat infection is very likely. The herpes virus also causes blisters on the eyelids, \Jconjunctivitis\j or superficial ulcers of the \Jcornea\j. These forms of the disease usually heal with locally applied anti-viral preparations and without complications.\p
The study used 703 patients, half taking acyclovir, and half taking a \Jplacebo\j, for a year. About 19% of the acyclovir group had a herpes flare up, while the infection flared up in about 35% of the \Jplacebo\j group.\p
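Those figures square with the headline claim, as a minimal sketch of the relative risk reduction, using the rounded percentages quoted above, shows:\p
# Rough check of the trial arithmetic, using the rounded figures above.
placebo_rate = 0.35    # about 35% of the placebo group had a flare-up
acyclovir_rate = 0.19  # about 19% of the acyclovir group had a flare-up

relative_reduction = (placebo_rate - acyclovir_rate) / placebo_rate
print(f"relative reduction: {relative_reduction:.0%}")  # 46%, roughly half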
#
"Menopause less of problem in Japan",712,0,0,0
(Jul '98)
A report in the July-August issue of \IPsychosomatic Medicine\i indicates that \B\1menopause\b\c affects Japanese women less than it does western women. It seems that Japanese women report fewer incidents of symptoms such as hot flashes and night sweats.\p
The report, authored by medical anthropologist Margaret Lock, of McGill University, Montreal, Canada, is based on a decade-long study on menopause and aging in \JJapan\j. Dr Lock believes that biological and cultural variables act in concert to produce these marked differences between women in \JJapan\j and north America.\p
The study compared 1200 Japanese women aged from 45 to 55 with a sample of 8000 women in \JMassachusetts\j, and another 1300 in \JManitoba\j. Together, the surveys suggest that any explanation of menopause as a purely biological process is missing a large part of the picture. Japanese women also show lower levels of \B\1osteoporosis\b\c, breast cancer and \Jheart disease\j, pointing to some interesting areas for further study.\p
#
"Osteoporosis and steroids",713,0,0,0
(Jul '98)
While we hear about \B\1steroids\b\c mainly as drugs abused by sports stars, steroids are also used as legitimate agents in medical treatments. They may protect those who receive transplants from rejecting their new organs, and they help people to live a normal life when they have chronic conditions such as \Jasthma\j, rheumatoid \Jarthritis\j, ulcerative colitis, multiple sclerosis and some blood and kidney diseases. Typically, the therapy is needed for many years, and this means that medical practitioners need to weigh up the risks of steroid treatment very carefully, as it carries a nasty side effect: \B\1osteoporosis\b\c, or bone loss.\p
Medical researchers have known for sixty years that steroids cause bone loss, but they have been unable to explain why. A paper in the \IJournal of Clinical Investigation\i in July seems to answer that puzzle. In simple terms, people on steroids produce fewer bone-forming cells, and those which are formed die prematurely. Bones are living parts of the body, continually rebuilt under tight control and management. But what happens when the builders go on strike, while the demolishers keep going?\p
The problem is a serious one: more than one-third of patients taking steroids for more than five years have fractures. No bone is spared from the steroid-induced bone loss, but the effects are more dramatic in the spine and in the hip. Unlike the common age- and gender-related types of \Josteoporosis\j, this form of the disease occurs at any age, even in children. Some patients, after years on steroid treatments such as high-dose \B\1glucocorticoid\b\c therapy, may even end up in a wheelchair. In the United States, where this work has been carried out, \Josteoporosis\j accounts for 1.5 million fractures annually.\p
\IKey names\i: Robert Weinstein, Stavros C. Manolagas, Robert Jilka, Michael Parfitt\p
#
"Preventing steroid osteoporosis",714,0,0,0
(Jul '98)
Just two weeks later, other American researchers reported in the \INew England Journal of Medicine\i that the drug alendronate (marketed as Fosamax) may help prevent and treat steroid-induced \Josteoporosis\j. The study looked at a large population of patients taking the steroid prednisone.\p
Patients in the studies received either an oral dose of alendronate (5 mg to 10 mg) or an inactive \Jplacebo\j. All the patients also were given \Jcalcium\j (800 mg to 1000 mg) and vitamin D supplements (250 to 500 IU), which are currently recommended for preventing and treating steroid-induced \Josteoporosis\j. The researchers found that either dose of alendronate, added to \Jcalcium\j and vitamin D, significantly increased bone mineral density (BMD) at the spine and hip, which is the best predictor of the risk of a later fracture.\p
The results were consistent, regardless of the patient's age, gender, underlying disease, dosage or length of time on steroid therapy, although the greatest increase in spine BMD was noted among post-menopausal women not taking \B\1oestrogen\b\c who received 10 mg of alendronate, already identified as a treatment effective in preventing and treating postmenopausal \Josteoporosis\j and in preventing fractures.\p
#
"Olive oil not such a good defence after all.",715,0,0,0
(Jul '98)
In spite of earlier reports that oleic acid, a component in olive oil, offers protection against breast cancer, this now appears not to be the case, according to a report in the July issue of the \IAmerican Journal of Clinical \JNutrition\j\i.\p
A new study took fat samples from 642 European women, analysed the samples for oleic acid, and then compared these levels with a later determination of whether or not the women developed breast cancer.\p
The difference between this study and earlier studies is that in the past, researchers have relied on women's reports of what they ate, while the most recent study has measured the fat levels directly.\p
The study covered five centres, in \JGermany\j, Ireland, the Netherlands, \JSpain\j and \JSwitzerland\j. In only one of them, Malaga, was there a slight association between oleic acid and reduced incidence of breast cancer, while the Dutch sample actually showed higher levels of breast cancer. So if there is an effect of olive oil on breast cancer, it seems likely that the effect is brought about by some component other than oleic acid.\p
\IKey names\i: Lenore Kohlmeier, Neal R. Simonsen\p
#
"Implants safe in Britain",716,0,0,0
(Jul '98)
For the third time in seven years, a British government scientific panel has concluded that silicone gel breast implants do not cause disease. The panel has suggested that women with implants should register in a \Jdatabase\j to allow further statistical studies, but that aside, the implants have been given a clean bill of health.\p
The panel was set up in June 1997, in response to women's concerns about possible health risks. It interviewed patients, industry workers, researchers, and lawyers and reviewed the scientific literature before arriving at the conclusion that the implants represent no greater threat than any other implants.\p
The panel looked particularly at the suspicion that silicone escapes from implants and breaks down into a form, \Jsilica\j, which could trigger autoimmune diseases such as rheumatoid \Jarthritis\j.\p
Radford \JShanklin\j at the University of \JTennessee\j has been looking at this hypothesis, but an analysis of his photomicrographs produced evidence of silicone, not of \Jsilica\j.\p
Advocacy groups have expressed anger at the finding, arguing that there is still no good explanation for the fact that a small number of implant recipients have clearly suffered adverse reactions. Two other reports are in preparation in the United States, where there has been a moratorium on silicone implants since 1992.\p
#
"Dark honey is better than light honey",717,0,0,0
(Jul '98)
The plant source of the nectar is important in producing honey loaded with \Jantioxidants\j to help fight disease, according to a July report in the \IJournal of Apicultural Research\i. Dark honey, say the researchers, is better than light honey, and the difference is determined by what the bees eat.\p
Honey based on nectar from Illinois buckwheat flowers contains 20 times as much antioxidant as honey produced by bees that use clover. Clover, perhaps the most common plant source tapped by honey bees, scored in the middle of the rankings. \JAntioxidants\j are useful in our bodies because they slow down \Joxidation\j, and stop the toxic effects of free radicals, which can cause DNA damage that can lead in turn to age-related problems such as \Jarthritis\j, strokes and even some cancers. The darker honey has less \Jwater\j and more \Jantioxidants\j, meaning that darker honey is generally better for you.\p
While the best honeys are as good for you as tomatoes, many more people would eat a full \Jtomato\j than would consume a \Jtomato\j-weight of honey, so the most likely application of this report would be in people changing from table sugar to honey as a sweetener -- or so the honey producers hope.\p
#
"Spices and health",718,0,0,0
(Jul '98)
It is often asserted that people in hot climates add spices to their foods to disguise the taste of slightly "off" meat, but now it appears that there may be more to spices than we realised. Your spice rack, it seems, is a potent weapon against \Jbacteria\j, even killer \Jbacteria\j like \IEscherichia coli \iO157:H7.\p
In late June, the Institute of Food Technologists' 1998 Annual Meeting & FOOD EXPO in \JAtlanta\j heard a preliminary report from Erdogan Ceylan, Donghyun Kang, and Daniel Y.C. Fung, who gave a poster presentation on their tests of the antimicrobial effects of 24 spices against the common food pathogen \IE. coli \iO157:H7 in a laboratory medium, uncooked hamburger, and uncooked salami.\p
Clove worked best on the hamburger meat, followed by \Jcinnamon\j, \Jgarlic\j, oregano, and sage, but in the laboratory cultures, \Jgarlic\j came out on top. The addition of 1.0 percent spice (\Jgarlic\j, clove, and \Jcinnamon\j) to salami mixed with starter culture and \IE. coli \iO157:H7 resulted in successful salami fermentation and a slight reduction of the pathogen. However, the addition of 7.5 percent \Jgarlic\j and clove killed 99 percent of the pathogen and still resulted in successful salami fermentation.\p
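Food microbiologists usually express kill rates like these as log reductions. A minimal sketch of the conversion, using the 99 percent figure above:\p
import math

# Convert a percentage kill into the "log reduction" used in food safety.
# Killing 99% means 1% survive, which is a 2-log (hundredfold) reduction.
def log_reduction(percent_killed):
    surviving = 1 - percent_killed / 100
    return -math.log10(surviving)

print(f"{log_reduction(99):.1f} log")  # 2.0 log for the garlic-and-clove mix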
While they are still to find the best overall combination, the Kansas State University team say that clove, \Jcinnamon\j, and \Jgarlic\j may have the potential to be used in meat products, adding that the method could apply to other pathogens such as \ISalmonella\i.\p
But while the method shows promise, the researchers remain cautious, reminding people that only thorough cooking can guarantee to kill all the \Jbacteria\j. They also warn that the spice levels needed to kill \Jbacteria\j in large numbers may be a bit much for the average palate. Using spices may add a margin of safety, as the infectious dose for \IE. coli \iO157:H7 is remarkably low.\p
The spice method appears to be flavour of the month this year, with another group publishing on a similar topic in \IThe Quarterly Review of \JBiology\j \iin March 1998.\p
#
"Marijuana as damaging as tobacco",719,0,0,0
(Jul '98)
A report in the July issue of the journal \IMutation Research \iwarns that smoking \B\1cannabis\b\c is as damaging to genes as smoking \Jtobacco\j. The study concentrated on pregnant women and their newborn babies, and the findings imply that marijuana smokers are probably as much at risk of getting cancer from their smoking as conventional nicotine addicts.\p
The researchers looked at genetic damage in the white blood cells of smokers and non-smokers. They also used urine tests to eliminate any chance of \Jtobacco\j smokers, \Jcocaine\j and heroin users being included in the tests, and to check the identification of the marijuana users.\p
Marijuana smokers had nearly three times more genetic damage than non-smokers - almost as much damage as observed in heavy \Jtobacco\j smokers during earlier studies by the same research group.\p
The marijuana smokers' babies also demonstrated significantly higher levels of genetic damage when compared to non-smokers' babies. While the study sample was small, the indications would seem to be clear enough for a general warning to be given, pending a more detailed study, especially as most marijuana users are convinced that their weed is safer than the nicotine weed.\p
#
"Now it's Dolly the mouse?",720,0,0,0
(Jul '98)
The world's first reproducible \Jcloning\j of a \Jmammal\j from adult cells was reported in \INature\i during July. An international team of scientists, led by Ryuzo Yanagimachi, reported on their method, which has successfully yielded three generations and more than 50 identical cloned mice.\p
Yanagimachi is at the John A. Burns School of Medicine of the University of Hawaii, which explains why the method is already being referred to as the \IHonolulu technique\i. The new technique is easily reproducible, meaning that it is likely to prove important in the production of drugs using transgenic animals, once such animals have been created. The method, pioneered by Teruhiko Wakayama, a postdoctoral researcher working in Yanagimachi's laboratory, has already been licensed to the biotechnology company ProBio America, Inc., for commercialisation and for testing of expanded uses. The technique may also be useful for \Jcloning\j wild or \Jendangered species\j in a controlled environment.\p
Earlier procedures generated clones either by injection or fusion of embryonic or fetal cells, or by the fusion of adult cells, which is how the sheep Dolly was created. This technique uses an injection method and adult cells. The scientists used adult mouse cells to create new mice which are genetically identical to the parent mouse. Using a very tiny pipette, the donor nucleus is micro-injected into an egg whose nucleus was previously removed. The researchers cultured the resulting cell, placed it in a surrogate mouse and allowed the clone to develop.\p
Repeating the procedure, they created further generations of cloned mice from the first clones, showing that the procedure can be done time and time again. The donor nuclei came each time from cumulus cells, which surround developing eggs within the ovaries of female mice. Each nucleus contains all of the genetic instructions needed to create an adult. The trick was to turn back the clock of an adult cell, so that it would behave like a newly fertilised embryo, in order that it would develop into a normal adult.\p
Oocytes, egg cells, go through a two-step maturation process, with the first step happening before fertilisation, and the second step typically occurring with the stimulation of a fertilising sperm. The oocytes used in the \JHonolulu\j technique had completed the first step of maturation. After the donor nucleus was inserted, the second maturation step was delayed for six hours or so, improving the chances that when the oocyte continued its maturation, a process called activation, it would divide and develop normally.\p
#
"Syphilis genome sequenced",721,0,0,0
(Jul '98)
The entire \Jgenome\j of the \B\1spirochaete\b\c which causes \B\1syphilis\b\c was reported in \IScience \iduring July. This opens up a whole new range of weapons against this debilitating disease. Clues are already being extracted about the properties that make the spirochaete so tenacious, and now there are hopes that the new knowledge might allow the development of a vaccine against it.\p
While ordinary \Jpenicillin\j will knock out the spiral bacterium, \ITreponema pallidum, \ithere is no vaccine. In the past, when \Jantibiotics\j were less common, the end result was tertiary \Jsyphilis\j, which could cause insanity, \Jblindness\j, heart problems, and eventually death. In recent years, the casual use of \Jantibiotics\j has almost removed tertiary \Jsyphilis\j from the experience of doctors, but if resistant forms arise, there will be a serious need for a vaccine.\p
The spirochaete has resisted vaccine attempts so far, partly because it seems to have built-in defences against the immune system, but also because it cannot be cultured in the laboratory. This issue might be side-stepped if researchers have access to the entire \Jgenome\j.\p
The work of sequencing \IT. pallidum\i's genes began with Steven Norris and George Weinstock around 1990, and finished in a burst when researchers at The Institute for Genomic Research (TIGR) in Rockville, Maryland, under the direction of TIGR's Claire Fraser, used the so-called "shotgun" sequencing method to extract the remainder of the information in just 18 months.\p
Norris describes the organism as "metabolically crippled", having very few of the enzymes for building complex molecules, such as the building blocks of DNA. Instead, it steals essential molecules from its host. This strong reliance on the host is what makes the organism so hard to culture, Norris believes.\p
There are some curious repetitive sequences which may explain how the bacterium gets around the immune system, as the stretches represent a family of very similar genes. It is possible the genes code for a range of similar surface proteins, just different enough to stop the immune system from recognising them. These proteins, referred to as the TPR proteins, bring out a strong immune response, so a vaccine based on all of them should produce a broad defence that would knock out all of the attacking spirochaetes in the first wave of infection.\p
There are also plans in place to use the TPR genes to track various strains of \Jsyphilis\j, providing useful information on the transmission of infection.\p
#
"How plants sense a fungal attack",722,0,0,0
(Jul '98)
Plants are the ultimate food source for most living things, and a direct food source for a large number. Aside from snail, insect and vertebrate herbivores, a plant is also under attack below the surface of the soil, where \Jbacteria\j, \Jnematode\j worms and fungi are continually probing to get food from plant roots.\p
To avoid this sort of attack, plants need to sense the invader, and respond by producing chemicals to repel the trespasser. So how does a plant know when it is being attacked by an invading \Jfungus\j?\p
A report in the \IProceedings of the National Academy of Sciences \iduring July provides a fascinating part of the answer. Researchers at the Max Planck Institute for Plant Breeding Research have been looking at the reactions of \Jparsley\j plants, \IPetroselinum crispum\i, to attacks by two species of the important plant pathogen \IPhytophthora\i. In particular, they looked at the responses to \IP. infestans\i, which causes \B\1potato blight\b\c, and to \IP. sojae\i.\p
These pathogens, like many others, produce molecules which the plant reacts to as a signal, molecules called elicitors. When the plant detects an elicitor, it starts a general "non-host" reaction. Any advance in our understanding of this mechanism will help scientists to improve crop plants.\p
The elicitor from \IP. sojae \iis a glycoprotein from the mycelium (fungal thread) wall, and this has recently been purified. If \Jparsley\j cells are incubated with this elicitor, there are changes in both biochemical activity and gene activity, and the same changes are seen when the \Jparsley\j cells are attacked by \IP. infestans\i. There are, however, some extra reactions as well, when the attack comes from a real \Jfungus\j.\p
At very early stages of the infection process, before the fungal hypha (a thin tube formed by the \Jfungus\j to grow and to invade plant material) has completely penetrated the cell wall and formed structures inside the attacked cell, some of the cell contents are already moving towards the site of penetration, including the nucleus.\p
As well, the cell deposits cell-wall material beneath the penetration site as a physical barrier against penetration.\p
The attacked cell still has another weapon available to it, hypersensitive cell death. This is rather like the cell committing suicide, and it involves a rapid and sudden collapse of the cell contents around the intracellular fungal structure. The plant cell also releases toxic compounds that possibly kill both the plant cell and the \Jfungus\j.\p
This sort of reaction cannot be achieved with the elicitor molecule alone, which raises the question: are there other elicitor molecules needed to bring on these other reactions? Perhaps the actual penetration produces signals at a mechanical level, rather than at a biochemical level, and maybe both of these signals are needed for a full response.\p
In a delicate experiment, Dr. Sabine Gus-Mayer replaced the penetrating \Jfungus\j with gentle local mechanical stimulation of the cells, using a \Jtungsten\j needle of the same diameter as a fungal hypha (2-5 µm). This local mechanical stimulus triggered the movement of \Jcytoplasm\j and nucleus to the site of stimulation.\p
Curiously, some of the biochemical defence responses were also observed, suggesting that we still do not have the entire story, but at least the mechanism is now clearer.\p
\IKey names\i: Klaus Hahlbrock, Sabine Gus-Mayer\p
#
"The rise and rise of MRSA",723,0,0,0
(Jul '98)
Methicillin-resistant \IStaphylococcus aureus\i (MRSA) is a major public health threat around the world. As well as being resistant to methicillin, the bacterium is also able to resist a large number of other common \Jantibiotics\j. It is commonly found in hospitals around the world, and it has been the subject of a recent New York study, where molecular fingerprinting techniques were used to track the spread of MRSA in 12 hospitals in the New York City metropolitan area. The work was described in the \IJournal of Infectious Diseases\i in July.\p
The study confirms the prevalence of antibiotic-resistant strains of staph in New York City. It also reveals a fascinating set of patterns, with one single clone present in 11 of the 12 hospitals, and accounting for 40% of all the resistant samples. Three additional clones of MRSA appear to be highly localised: one apparently primarily associated with AIDS patients, a second associated with a Manhattan-based burn centre, and a third found in a Brooklyn-based Veterans Administration \Jhospital\j.\p
#
"Poliovirus as a friend, not as a foe",724,0,0,0
(Jul '98)
A recent report in the \IProceedings of the National Academy of Sciences\i describes an important breakthrough in re-\Jengineering\j the poliovirus, the virus which causes \B\1poliomyelitis\b\c. The end result is that it could some \Jday\j be used as a transport mechanism for inducing immunity against disease pathogens in humans.\p
The technique might be used to produce re-engineered poliovirus vaccines for HIV, hepatitis B, other viral diseases and cancer. The idea is to produce a recombinant virus, engineered with foreign DNA fragments, able to replicate in a variety of host cell types and induce responses from the immune system.\p
Two other types of viruses, vaccinia and adenovirus, have already been used to develop recombinant vaccines for a variety of pathogens, but poliovirus could provide an alternative for situations in which these viruses cannot be used. The poliovirus has the added advantage that it can be administered orally, making it an excellent choice for developing countries, where the cost of needles is high; more importantly, the virus itself can be produced cheaply.\p
In the study, researchers produced a recombinant poliovirus vector that was able to trigger a response from cytotoxic (cell-killing) CD8 T-cells, which seek out and destroy pathogen-infested cells. Getting a reaction from this cell-mediated arm of the immune system is important, because previous vectors have only been shown to incite \Jantibodies\j.\p
In the study, mice were shown to be immune to a malignant \Jmelanoma\j cell line that was injected into them after treatment with the altered poliovirus. The CD8+ T-cells actually kill virus-infected cells, thus eliminating any reservoir of virus and preventing spread of infection.\p
On technical grounds (outlined below), the cancer-killing example was contrived, admit the researchers, but it nonetheless represents a useful step in the right direction. They had re-engineered the poliovirus vector to express a chicken \Jantigen\j called Ova, and they then inoculated the transgenic mice with this viral vector, which prompted the desired cytotoxic T-\Jlymphocyte\j response. The malignant \Jmelanoma\j cell line they used in the test also expresses the Ova \Jantigen\j.\p
\IKey names\i: Raul Andino, University of \JCalifornia\j, San Francisco\p
#
"The pollinators of the cycads",725,0,0,0
(Jul '98)
\B\1Cycads\b\c, primitive palm-like plants with no flowers, have been on the \Jearth\j for around 300 million years. In the days of the dinosaurs, forests of cycads were the main features of the landscape, but today, there are few cycads left.\p
Just 11 of the 30 known genera survive today, but these are enough to reveal a fascinating story of \Jevolution\j to us. While cycads do not have flowers as we understand the term, they have reproductive cones, and like the flowering plants, they use insects to carry the \Jpollen\j from the male plants to the female plants, and the relationships between these insects and their plants reveal a great deal.\p
Dr. Dennis Stevenson, Director of the Harding & Lieberman Laboratories at The New York Botanical Garden, has confirmed that cycads are pollinated by specific weevils and other beetles. More importantly, each \Jgenus\j of cycad has its own \Jgenus\j of pollinator, and each species has its own species, suggesting that the relationship has extended over a very long period of time, according to an account posted on the \JInternet\j during July.\p
The cycads, with separate male and female plants, use rather different techniques in their struggle for representation in the next generation of cycads.\p
Taking one example, \IZamia furfuracea\i: the male plant provides shelter and food, as well as a breeding site and protection for the larvae, with the male cone turning down its usual defences by sequestering (locking away) most of the protective toxins. The female plant, in contrast, spreads its toxins through the seed to make it unpalatable to insects, so that the seeds are protected, and not only against insects. In some parts of the world, especially \JGuam\j and \JAustralia\j, indigenous human populations have learned how to treat cycad seeds so as to remove the toxins, but without special treatment, most cycad seeds do not suffer a second visit from herbivores.\p
The male cone attracts pollinators by emitting heat and an attractive aroma (to insects), and rewards the visitors with food, while the female cone, with nothing to offer, produces the same smell, and effectively passes itself off to the insects as another male plant, in order to draw in the \Jpollen\j carriers.\p
While the insects on the male cone are able to eat the parts which are low in toxins, they use high-toxin materials to make their larval cocoons, gaining greater protection from their host.\p
The symbiosis even extends to the pollinators adjusting to the 2- to 3-year reproduction cycle of the cycads as the end-of-the-season larvae conveniently go into a dormancy stage, eventually to be drawn from their Sleeping Beauty status when newly matured cones release their aroma.\p
#
"The first modern humans",726,0,0,0
(Jul '98)
Modern humans are generally defined as people capable of speech and abstract thinking, but thoughts and words do not show up as fossils. The only way to assess whether fossils were "modern" is to look for signs of culture that could only be found among "modern" humans.\p
Some of the signs are obvious enough: inscriptions, cave paintings and carvings, for example, but other signs are less obvious. These less obvious signs are also the ones that occurred earliest. A report in the April issue of the \IJournal of Archaeological Science\i, described on the \JInternet\j in July, claims that the earliest known signs of modern behaviour have been found at the Enkapune Ya Muto (EYM) rock shelter in the central Rift Valley of \JKenya\j.\p
By forty thousand years ago, humans seem to have already learned to seal social alliances and prevail over others by giving token gifts. That, at least, is what Stanley Ambrose has inferred from his finds at the site. Ambrose says that the EYM site contains very early, perhaps the earliest known, examples of Upper Palaeolithic stone-tool technology, and then later in the sequence, \Jostrich\j eggshell-bead technology. The blade tools found at EYM are between 46,000 and 50,000 years old, and if they are closer to the older date, they will be older even than the current record holder, a site in Israel.\p
The beads have been carbon-dated to an age of about 40,000 years, which makes them the oldest directly dated ornaments in the world. In modern hunter-gatherer societies, beads like these are not only used as ornaments, but are "the most common kind of gift in a formal system of delayed reciprocity, which has further implications for the \Jevolution\j of a social safety-net system" -- in other words, the beads act as a sort of social glue.\p
If humans are going to adapt to risky environments, they need social relationships with other people, so they have somebody to fall back on when times are bad. So the beads tend to suggest that the makers were already at this social stage, and that means that they were modern, as we use the term.\p
The owners may have used the beads for status, or as a form of currency, but the end effect is that the bead people would be better able to deal with difficult conditions than more primitive human groups like the Neandertals, who probably lacked such a social security system.\p
#
"Clues in dung",727,0,0,0
(Jul '98)
Forget the DNA-in-amber line that was offered in \IJurassic Park\i, and look instead for clues in \Jfossil\j dung, says a report in \IScience \iduring July. \JFossil\j faeces, known as coprolites, may be less attractive as a notion, but they are able to provide a wealth of clues about the \Jecology\j and relationships of extinct animals. The coprolites may even be able to tell us more about early humans.\p
In spite of all the exciting action in the movie, the scientific reality is that amber has nothing to offer the scientists. Neandertal bones have provided useful human DNA for Svante Pääbo and his colleagues, but they had no success at all with coprolites provided, some 20,000 years ago, by an extinct ground sloth in \JGypsum\j Cave near Las Vegas, Nevada. The elephant-sized sloths survived in America until about 10,000 years ago, and the cave is richly supplied with their dung.\p
The problem, it seems, has been Maillard products which form extra cross-links in the DNA. These are sugar-rich tangles of proteins and nucleic acids, and the cross-links interfere with the DNA amplification which is necessary if the DNA is to be identified, although Pääbo believes that these same Maillard products also protect the DNA from degradation.\p
The answer came in the form of PTB, more formally \IN\i-phenacylthiazolium bromide, a chemical which chops through the bonds which tangle the DNA up in the Maillard products.\p
PTB-treated sloth coprolites have since yielded mitochondrial DNA, most probably from intestinal cells sloughed off into the faeces. They have also found a wide variety of plant DNA, as might be expected from vegetarian sloths, with recognised sequences from eight plant families, including grasses, yucca, mustard, grapes, mallow and mint. There is potential here for this sort of DNA analysis to assist in identifying items of diet which have been chewed beyond microscopic recognition.\p
There are limitations, of course: the coprolites may be seasonal in their composition, and the technique will probably only be good for around a hundred thousand years, the currently accepted "lifetime" for DNA's identifiable molecules, but even that is better than nothing. The next move for Pääbo: he hopes to throw light on Neandertal life in this same way, with a view to finding out, from 45,000-year-old Neandertal faeces from a cave near \JGibraltar\j, how the Neandertals were related to us, what they ate, and maybe even which parasites plagued them.\p
#
"Little satellite lost",728,0,0,0
(Jul '98)
In late June, a combination of unfortunate errors led to the loss of contact with the SOHO \Jspacecraft\j, as it sat at the L-1 \B\1Lagrangian point\b\c between the \Jearth\j and the \Jsun\j, about 1.5 million km from the \Jearth\j. This is one of a number of special points near the \Jearth\j where the combined gravitational pulls of the \Jsun\j and the \Jearth\j allow a \Jspacecraft\j to circle the \Jsun\j in step with the \Jearth\j.\p
SOHO, the Solar and Heliospheric Observatory, operated jointly by NASA and the European Space Agency (ESA), went radio silent on 25 June, and a post-mortem in July revealed that a number of factors had come together to cause the problem. First, a pre-programmed software sequence lacked a command to turn on a \Jgyroscope\j necessary to reorient SOHO toward the \Jsun\j in case of an emergency, so that when problems arose, this failsafe was unavailable.\p
Then one of the \Jspacecraft\j's three gyroscopes sent back a faulty reading, which sent SOHO into a planned Emergency \JSun\j Reacquisition (ESR) mode, but with the software error in place, the ESR failed.\p
The next step was a human error when the ground staff sent a command to turn off the malfunctioning \Jgyroscope\j, but instead turned off a \Jgyroscope\j that was working perfectly. As a result, the craft went into a spin which misaligned its communications antenna and turned its solar panels away from the \Jsun\j. With the wisdom of hindsight, critics have blamed the operators for over-reacting, but more level heads consider that blame needs to be more evenly apportioned.\p
All is not lost, however. In late July, ground-based radio telescopes located SOHO. It was rotating slowly near its original position in space, making it more likely that contact and control will be regained. On July 23, the 305-metre dish at Arecibo in Puerto Rico was used to direct a radar signal at the general area, with the 70-metre dish of NASA's Deep Space Network in Goldstone, \JCalifornia\j, acting as a receiver. The Goldstone dish located the \Jspacecraft\j's echo and tracked it using radar techniques for more than an hour.\p
This sort of system is called bistatic radar, and it was made necessary because the Arecibo dish cannot be steered very far, due to its huge size. So by the time the radar signals had reached SOHO and bounced back, the Arecibo dish was no longer able to receive the return signals. Bistatic radar has been used in the past to study the rings of Saturn and \Jearth\j-crossing \Jasteroids\j.\p
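The timing involved is easy to estimate. A minimal sketch, assuming only the rounded 1.5 million km distance quoted above, shows the round-trip delay, and how far the earth rotates while the echo is in flight, a drift comparable to a big dish's very narrow beam:\p
# Rough estimate of the radar round-trip time to SOHO, and of how far the
# earth rotates meanwhile. Rounded illustrative values, not mission data.

C_KM_S = 299_792       # speed of light, km/s
distance_km = 1.5e6    # earth to the L-1 region, roughly

round_trip_s = 2 * distance_km / C_KM_S
rotation_deg = 360 / 86_164 * round_trip_s   # sidereal day is ~86,164 s

print(f"round trip: {round_trip_s:.1f} s")   # about 10 s
print(f"drift: {rotation_deg:.3f} degrees")  # about 0.04 degrees of rotation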
Arecibo can \Jpump\j out 500 megawatt radar pulses, more than enough to detect a slowly spinning \Jsatellite\j; in the past, its radar has been used to map the surfaces of Venus and Mars, and even to study the properties of near-\JEarth\j \Jasteroids\j.\p
The slow spin rate suggests minimal structural damage, and raises hopes that within the next two months, SOHO's solar panels will once again rotate into the correct position to power the \Jspacecraft\j, and when that happens ground controllers will be able to re-establish communications.\p
#
"Alan Shepard dies at 74",729,0,0,0
(Jul '98)
America's first man in space, the fifth to walk on the moon and the first man to play golf on the moon, has died in the USA of leukemia. Alan Shepard was a World War II veteran who served on the destroyer \ICogswell\i in the Pacific. After the war, he became an aviator, and then flew as a test pilot, before being selected as one of the first seven Mercury astronauts. It was in this role that he flew a 15-minute, 302-mile (500 km) flight on May 5, 1961.\p
Shepard will long be remembered for the problem that arose when his flight was delayed, and his bladder filled. With only a short flight planned, no plumbing arrangements had been made, and Shepard was told "do it in your suit". From his position, lying down in the capsule, he did this, but the liquid, trapped in the suit, began to pool beneath his back. His comment: "Weh-ayl . . . I'm a wetback now."\p
#
"A new family of galaxies",730,0,0,0
(Jul '98)
A group of American and Japanese astronomers has found a population of distant galaxies in an area of the sky known as SSA13. These galaxies are radiating as much \Jenergy\j as the rest of the optical universe. They have remained undiscovered until now because their stars are obscured by the large amount of dust they contain, according to a report by the eight authors in \INature \iin mid-July.\p
Another report in the same issue, from a British group covering a smaller area around the well-known Hubble Deep Field, reaches generally similar conclusions.\p
These results tend to suggest that much of the star formation happening in distant parts of the universe may be hidden from our observations, whether those observations come from ground-based observatories or from the Hubble Space \JTelescope\j.\p
The results were obtained at sub-\Jmillimetre\j wavelengths, because the dust clouds absorb the optical-frequency light emitted by young stars, and re-emit the \Jenergy\j at much longer wavelengths, well into the far infra-red. When the galaxies are far away, red-shifting extends this light until it is at wavelengths only slightly less than a \Jmillimetre\j.\p
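The arithmetic of that stretching is simple: the observed wavelength is the emitted wavelength multiplied by (1 + z), where z is the \Jredshift\j. A minimal sketch, assuming dust emission peaking at 100 micrometres and a redshift of z = 3 - both illustrative values, not figures from the report:\p

# Redshift stretches wavelengths: lambda_observed = lambda_emitted * (1 + z)
lambda_emitted_um = 100.0    # far infra-red dust emission, in micrometres (assumed)
z = 3.0                      # illustrative redshift for a very distant galaxy
lambda_observed_um = lambda_emitted_um * (1 + z)
print(lambda_observed_um / 1000.0)   # 0.4 mm - only slightly less than a millimetre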
At this wavelength, the radiation can punch through the dust clouds and be detected by our instruments. This new ability opens up a prospect of new discoveries on a par with those which arose when image quality on the Hubble Space \JTelescope\j was restored, as the dust-enshrouded galaxies that may be obscured or even invisible in the optical can be detected in the submillimetre range, in a portion of the spectrum that lies between infrared and radio waves.\p
The survey looked at two blank regions of the sky, using a revolutionary new instrument on the 15-metre diameter James Clerk Maxwell \JTelescope\j (JCMT) on Mauna Kea (Hawaii). The JCMT is the world's largest \Jtelescope\j able to observe in the sub-\Jmillimetre\j range, and it has been fitted with a camera called SCUBA (Submillimetre Common-User Bolometer Array). Supercooled detectors in SCUBA measure heat emission from small dust particles, letting astronomers map a selected region of sky at submillimetre wavelengths.\p
The rate of star formation in the dusty galaxies is between 10 and 100 times greater than the rates found in most optical sources. So while there are fewer sub-\Jmillimetre\j sources than there are normal optical sources, the sub-\Jmillimetre\j sources actually radiate more \Jenergy\j.\p
In our own area, the only similar objects are the "ultraluminous infrared galaxies" which were one of the major discoveries of the IRAS \Jsatellite\j. These ultraluminous infrared galaxies are often formed by a strong merger between two gas-rich galaxies. It is possible that the submillimetre sources at high \Jredshift\j may be galaxies in the process of formation through the merger of smaller pieces.\p
The new evidence is making astronomers think again about when the universe's star formation reached a peak. Optical surveys have previously indicated that the peak was reached when the universe was about three quarters of its present age. The galaxies that we are now seeing for the first time are so distant that their light set out early in the life of the universe. In other words, we now have evidence of a great deal more star formation, early in the life of the universe, pushing the peak backwards in time.\p
#
"Martian life in doubt once again",731,0,0,0
(Jul '98)
\JMeteorite\j ALH84001 has been queried again. The July issue of \IMeteoritics and Planetary Science\i featured an analysis of the crystals found in the \Jmeteorite\j. According to the report, the crystals were formed by epitaxial processes at temperatures too high for living things to have existed in the rock.\p
Using transmission \Jelectron\j microscopy, researchers found that the magnetite crystals in the rock were intergrown at the atomic level with the surrounding \Jcarbonates\j by what they call a rigorous form of epitaxy, which is an ordered growth of one mineral on top of another.\p
The way the magnetites and \Jcarbonates\j are oriented suggests that they grew together at temperatures greater than 120°C, rather too high for life forms as we know them. This is the third paper to call into question the original NASA-funded suggestion that the \Jmeteorite\j contains nanofossils, with all three of the negative papers also resulting from NASA-sponsored research. While the authors of the original claim still stick to their guns, most other scientists now seem to be convinced that, even if there is life on Mars, there are no traces of life to be found in \Jmeteorite\j ALH84001.\p
#
"A new class of asteroid",732,0,0,0
(Jul '98)
\JAsteroids\j are found in the asteroid belt, between Mars and Jupiter, right? Wrong. Not only do we have \Jasteroids\j which cross over from outside the \Jearth\j's \Jorbit\j to inside our \Jorbit\j, now David Tholen and his colleagues at the University of Hawaii's Institute for \JAstronomy\j have found an asteroid which remains entirely inside the \Jearth\j's \Jorbit\j.\p
The object was first detected in February 1998, using a specialised camera, fitted on the University of Hawaii's 2.24-metre \Jtelescope\j on top of Mauna Kea, and now carries the name 1998 DK36.\p
So far, the asteroid's shape and size remain uncertain, but its furthest distance from the \Jsun\j appears to be very close to, but slightly inside, the \Jorbit\j of the \JEarth\j. It is probably around 40 metres in diameter, about the size of the asteroid which hit the Tunguska region of \JSiberia\j in 1908, but it appears that DK36's closest approach to the \Jearth\j's \Jorbit\j is about 1.2 million km (750 thousand miles).\p
#
"Heat wave on Io",733,0,0,0
(Jul '98)
Jupiter's volcanic moon, Io, continues to sizzle, and now it takes the record for the highest recorded surface temperatures of any planetary body in the \Jsolar system\j. A report in \IScience \iin early July indicates that Io has twenty active volcanic vents, with at least twelve of them spewing \Jlava\j ranging from 2200° to 3100°F (1200° to 1700°C) in \Jtemperature\j. The top of this range is significantly higher than any \Jtemperature\j known from the \Jsun\j-warmed surface of Mercury, and a great deal hotter than the rest of Io's surface, estimated at -243°F, or about -150°C.\p
The \Jtemperature\j estimates have been made from two instruments on board the \JGalileo\j \Jspacecraft\j, which record the infrared signatures of the vents. The temperatures, combined with the colours shown in the visible light range, suggest that the lavas are rich in \Jmagnesium\j.\p
The source of all this heat seems to be gravitational attraction. The nearby moons, Europa and Ganymede, pull Io into an elliptical \Jorbit\j, so that it passes close to Jupiter and then swings farther away. Jupiter exerts a very strong gravitational pull, which makes the moon flex and change shape. Just as a metal wire heats up when it is repeatedly bent, so does the repeatedly flexed moon, and it is this heat which fuels the eruptions.\p
The composition of the \Jlava\j raises some interesting questions, as it appears that the \Jlava\j is made up of dense material which ought to sink towards the moon's centre. Putting it another way, in a volcanically active body, lighter materials should melt first and rise to the surface, where they cool and form a crust. This process is called differentiation, but the evidence we see here seems to suggest that differentiation has not happened.\p
While it is harder for dense material to rise through a crust of lower density, this is what has happened on the \Jearth\j's moon, so it is possible that some mixing process draws the crust down into the interior, much as one of two colliding tectonic plates can be forced deep below the \Jearth\j's surface. For the moment, this must remain a matter for scientific speculation.\p
#
"Getting ready to collect comet dust",734,0,0,0
(Jul '98)
A project called Stardust is due to take off in February 1999, to collect samples of \Jcomet\j dust. Two decades in the planning, Donald Brownlee's dream child will be the fourth mission in NASA's Discovery series, which captured public imagination a year ago with Mars Pathfinder. More importantly, it will be the first mission since Apollo to return samples of space material to \JEarth\j for analysis.\p
Stardust will fly for seven years on a 5 billion km (3.1 billion mile) journey to Wild 2, a \Jcomet\j which altered its course in 1974 after a close encounter with Jupiter. As a result of the change, the \Jcomet\j now orbits among the inner planets, making it easier to reach. Wild 2 (pronounced vihlt 2) was discovered in 1978, after its first close approach to the \Jearth\j.\p
Stardust will cost US$200 million, and will pass some 120 km (75 miles) from the main body of the \Jcomet\j, close enough to trap small particles from the \Jcomet\j's coma, the gas-and-dust envelope surrounding the nucleus. A camera built for NASA's Voyager program will transmit the first-ever close-up \Jcomet\j pictures back to \JEarth\j. The encounter will last for about twelve hours overall, but the really intense part will just take a few minutes.\p
Trapping particles is rather challenging, as the \Jcomet\j's speed, relative to the \Jspacecraft\j, will be something like nine times the speed of a rifle bullet, and at that speed, even a particle the size of a grain of sand could be lethal. So the collector will use Aerogel, material sometimes called "frozen smoke", to absorb the impact. Aerogel is a transparent blue \Jsilica\j-based solid that is as much as 99.9% air. It is as smooth as glass, and has been described as something like plastic \Jfoam\j without the lumps.\p
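Some quick arithmetic shows why the Aerogel is needed. Kinetic \Jenergy\j grows with the square of speed, so even a tiny grain carries a respectable punch at fly-by velocity. A minimal sketch, assuming a rifle bullet at about 700 metres per second and a one-milligram grain - both assumed figures, not mission specifications:\p

# Kinetic energy E = 0.5 * m * v**2 of a grain at comet fly-by speed
bullet_speed = 700.0       # m/s, typical rifle bullet (assumed)
v = 9 * bullet_speed       # ~6300 m/s, nine times a rifle bullet as quoted above
m = 1.0e-6                 # kg, roughly a one-milligram sand grain (assumed)
E = 0.5 * m * v ** 2
print(round(E, 1))         # ~19.8 joules, delivered to a pinhead-sized spot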
Along the way, the Aerogel will be used to trap interstellar particles on one side of the collection panel, and as it approaches the \Jcomet\j, some 390 million km (240 million miles) from \Jearth\j, the panel will be tilted over to expose the clean face. The trapped particles will leave a tell-tale trail in the Aerogel, allowing scientists to track them down and gather them up, knowing the source of each particle. Once it leaves the \Jcomet\j, the collection panel will retract into the \Jspacecraft\j.\p
The Stardust craft, all going well, will parachute down into the Great Salt Desert in \JUtah\j in 2006. Brownlee believes that the particles will include cryogenically preserved interstellar dust left from the birth of the \Jsolar system\j some 4.6 billion years ago. These particles, he says, would not be found anywhere except in the outer \Jsolar system\j, because closer in, the \Jsun\j's heat will have destroyed them. Wild 2 has only arrived in the inner \Jsolar system\j recently, and should still preserve most of the record he is seeking.\p
Brownlee has previously given his name to a class of cosmic dust particle, which are known as Brownlee particles.\p
#
"Don't bother to duck",735,0,0,0
(Jul '98)
Despite the enthusiasm for movie disasters based on a \Jcomet\j or asteroid striking the \Jearth\j, Jay Frogel and Andrew Gould have asserted, in \IAstrophysical Journal Letters\i, that it is highly unlikely that a \Jcomet\j will rain death and destruction on the \Jearth\j during the next half-million years. Their review of the motions of thousands of nearby stars failed to show any rogue stars capable of pulling comets out of their orbits and into the \Jearth\j's path.\p
There is little support for the so-called "death star" scenario where a passing star might alter the current orbits of comets near our \Jsolar system\j and send them our way. While people tend to assume that the stars are fixed in their positions relative to our \Jsun\j, some of them are not. \B\1Barnard's Star\b\c, for example, is moving closer to us for the next ten thousand years, but will never approach close enough to cause any harm.\p
A successful "death star" would need to come close enough to the \B\1Oort \Jcloud\j\b\c to pull a number of bodies out of their orbits, sending them plunging into the inner \Jsolar system\j. This is possibly the cause behind previous impact events like the asteroid which probably killed the dinosaurs at the boundary between the Cretaceous and the tertiary, as proposed by \B\1Luis Alvarez\b\c.\p
A "death star" candidate would have to be moving directly towards us: if it shows any sideways motion, it will never get close enough to cause harm. So the researchers looked at the Hipparcos Catalogue, created by the Hipparcos \Jsatellite\j, launched in 1989 by the European Space Agency (ESA) with a mission to accurately measure the location and motion of more than 120,000 stars.\p
The most likely threat would come from a bright star, of 8th \Jmagnitude\j or brighter, as these stars are likely to be close enough and/or large enough to fit the bill, but so far there are no really good risks out there. Of course, a single \Jcomet\j could strike the \Jearth\j this year, but a major \Jcomet\j shower in the next half million years looks unlikely. In fact, the chance that a star big enough to cause significant damage would pass through our region even in the next 10 million years is extremely small.\p
Another \Jsatellite\j, \BGAIA,\b has been proposed by ESA which would measure the motions of 50 million objects, including stars as faint as 15th \Jmagnitude\j. If it is approved, GAIA would be launched no sooner than the year 2009.\p
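Stellar magnitudes run backwards, with each step of 5 magnitudes corresponding to a brightness factor of 100. A minimal sketch comparing an 8th-magnitude star with the 15th-magnitude limit proposed for GAIA:\p

# Brightness ratio between two magnitudes: 100 ** ((m_faint - m_bright) / 5)
m_bright = 8.0    # the magnitude threshold for likely "death star" candidates
m_faint = 15.0    # the faintest stars GAIA is intended to measure
ratio = 100 ** ((m_faint - m_bright) / 5.0)
print(round(ratio))   # ~631: GAIA would reach stars over 600 times fainter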
#
"August, 1998 Science Review",736,0,0,0
\JUnderground microbes and Mars life\j
\JAnd tomorrow's temperature in the Crab Nebula . . .\j
\JA better transistor from gallium nitride\j
\JCommunications meteors\j
\JA new threat to networks\j
\JUltrasound imaging of prostate cancer\j
\JErlichiosis can be detected with x-rays\j
\JSalt hunger\j
\JA quick fix for H. pylori\j
\JInfluenza in the news\j
\JHow bacteria protect themselves against the immune system\j
\JDouble danger from Fen-phen\j
\JDiabetes and oral contraception\j
\JNew HIV strain identified\j
\JMeasles outbreak in South America\j
\JA new weapon against cancer\j
\JNew breast cancer gene\j
\JJumping DNA and evolution\j
\JThe structure of cytochrome bc1\j
\JSpotting the bullies early\j
\JLiverworts got there first\j
\JFruit flies should take their time\j
\JThe effects of high CO2 levels\j
\JAfrica cold . . .\j
\JAfrica hot\j
\JMost scientists but not all . . .\j
\JTurtles win standing\j
\JLoggers may not be so harmful after all\j
\JDioxins still a problem\j
\JCanola oil hits the market place \j
\JAerosonde: good news and bad news\j
\JSOHO found again\j
\JThe Wife of Bath's chloroplasts\j
\JThe earth's magnetic field\j
\JRich couple pays to have pet cloned \j
\JThe moon's atmosphere\j
\JNeutrino discoverer dies\j
#
"Underground microbes and Mars life",737,0,0,0
(Aug '98)
\B\1Mars\b\c, and the possibility of life on Mars, remains a hot topic in the scientific community. The discussion has been largely fuelled by the debate over \B\1meteorite\b\c ALH84001, which we have reported regularly since our first update in December 1996. In simple terms, if we have \Jfossil\j evidence of something like a life form on Mars, then we can reasonably hope to find evidence of life even today, in some other parts of the \Jsolar system\j. And since we continue to find life in ever-more unexpected and extreme environments, the search, and the speculation, continue.\p
Recently, with discoveries of life in rocks deep beneath the surface of our \Jplanet\j, scientists have speculated that this deep life may be fuelled by \Jhydrogen\j gas, produced when \Jbasalt\j reacts with \Jwater\j. But while this reaction can be made to take place under laboratory conditions, an August report in \IScience\i indicated that the reaction is extremely unlikely in nature. As the same reaction was also thought to be a possible fuel source for Martian microbes, this is disappointing news.\p
All life forms need \Jenergy\j inputs to survive, and they usually get this, directly or indirectly, from the \Jsun\j, although some life forms around submarine vents are able to extract \Jenergy\j from chemicals released from the vents. Obviously, we can rule out any direct conversion of sunlight below the surface of the ground. Equally obviously, none of these chemicals is very likely to be found in deep rock, but the \Jhydrogen\j looked like a distinct possibility as a source of life-supporting \Jenergy\j, at least to microbiologists.\p
Unfortunately, the chemists and geologists say otherwise. According to the report, \Jhydrogen\j could only be produced from the \Jbasalt\j when the rock was exposed to acidic conditions, but environments containing \Jbasalt\j are never acidic. All the same, the \Jbacteria\j are there, and something is sustaining them. For the moment, organic material, carried down in \Jgroundwater\j seems the most likely \Jenergy\j source.\p
There is an alternative: another report in the same issue of \IScience\i indicates that there may be rock-eating microbes living nearly a mile beneath the ocean floor in conditions which suggest that similar life could exist on Mars or other planets. Where the \Jbasalt\j in core samples was glassy, having quickly been cooled by seawater, the scientists found a series of tracks and trails, and the tracks showed traces of DNA, indicating a biological origin.\p
The \Jbasalt\j has all of the basic elements for life, including carbon, phosphorus and \Jnitrogen\j, and would need only \Jwater\j to complete the life support system; \Jgroundwater\j seeping in through the rocks could easily supply that last need.\p
The conditions involved could as easily be found on Mars or a moon of Jupiter, though whether the life originated there or seeped in with the \Jgroundwater\j is a question still to be answered. But if life can flourish under those conditions, hope will continue to flourish in the minds of scientists. In fact, the leading author, Martin Fisk, goes so far as to suggest that even a \Jcomet\j containing ice crystals, warmed as it passes by the \Jsun\j, could be a home to life. Fisk believes, however, that the \Jbasalt\j microbes would have seeped in with seawater, something which might be more difficult in other parts of the \Jsolar system\j.\p
We already know a lot about the interior of Mars from meteorites that have been blasted off the \Jplanet\j: the rocks of Mars have all of the requirements for life, including carbon, phosphorus, small amounts of \Jnitrogen\j, and minerals that contain \Jwater\j, or at least show some evidence of \Jwater\j. A new Martian \Jmeteorite\j was identified in August (see below). For comparison, the possible \Jtemperature\j range for \Jearth\j \Jbacteria\j runs from 113°C in deep ocean vents down to -15°C in some freezing brines.\p
For the moment, Fisk will not be drawn on the nature of the microbes involved, which may be either \Jbacteria\j or members of the group Archaea, and he concedes that the tracks might still turn out to be caused by a new, undocumented chemical process. The trails are only found in the "glassy outer inch" (2 cm) of the basalts, and Fisk believes the looser chemical structure of the quickly-cooled rock makes it easier for the microbes to break the \Jbasalt\j down than the more tightly-bound inner rock which cooled more slowly.\p
The core samples were collected some years ago, so while the DNA traces have remained, there were no living microbes found, so the next step will be to take new fresh cores, and examine them right away. There will, however, be severe problems in getting any living microbes to the surface where pressures are much lower.\p
Our knowledge of Martian \Jgeology\j received a small boost in August, when a 13\Uth\u Martian \Jmeteorite\j, weighing just over 2 kg (4.5 pounds), was described at the 61st Meteoritical Society meeting in Dublin, after being found in May this year. This is the first Martian \Jmeteorite\j to have been found in the \JSahara desert\j, and the first found since the recent round of speculation on whether or not there may be life on Mars.\p
The \B\1inert gases\b\c found in the Martian \Jatmosphere\j - the "inert gas inventory" - form a very characteristic mix, which has been known since the Viking mission measurements on the surface of Mars in 1976. If a \Jmeteorite\j has this "fingerprint", then it is classified as Martian. In this case, further evidence came from mineral chemistry and petrographic (geological) observations, such as the presence of feldspathic glass in the sample. A British study of a 150 mg sample looked at the composition of the oxygen \B\1isotopes\b\c found in the \Jmeteorite\j, and a week after the Dublin conference, this research gave further \Jcorroboration\j for a Martian origin.\p
The inert gas analyses show that this \Jmeteorite\j was ejected from Mars about 1 million years ago, marking its source as an ejection event not represented among the other Martian meteorites. After that, the \Jmeteorite\j took its time travelling through space before it was captured by the gravity of the \JEarth\j and landed in Northern \JAfrica\j. The \Jmeteorite\j has weathered over time, and is unlikely to show any signs of \Jfossil\j life, even if such signs existed there originally.\p
Interestingly, an analysis of the other twelve Martian meteorites, reported in the journal \IMeteoritics and Planetary Science\i in July, concluded that the meteorites came from six distinct regions on Mars, based on the trace elements found in the meteorites. This was, of course, too early for the Saharan \Jmeteorite\j to have been included. Scientists are hopeful that the different geological make-ups of the groups may allow us, some time in the future, to identify exactly where the meteorites came from on Mars.\p
But while there still seem to be some hopes for life existing on Mars, and for our finding proof of it, the cold hard facts say that the potential amount of life that could have existed on Mars is tiny compared with the biomass early in \JEarth\j's history. That, at least, is the view expressed in the \IJournal of Geophysical Research\i late in August. Jakosky and Shock modelled volcanic activity on Mars, and concluded that Mars would have been, at best, a much poorer place for life to start than our own \Jplanet\j. They estimate that the amount of chemical \Jenergy\j available to organisms must have been much less on Mars.\p
In simple terms, the two researchers estimate that a square \Jcentimetre\j of the \Jearth\j's land surface can produce 20 grams of organisms every 1000 years, using \Jphotosynthesis\j, but that Mars would take a billion years to achieve the same production, using chemosynthesis alone.\p
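The scale of that difference is easier to grasp as a ratio. A minimal sketch of the arithmetic, using only the figures quoted above:\p

# Compare the quoted biomass production rates for Earth and Mars
grams_per_cm2 = 20.0     # grams of organisms per square centimetre
earth_years = 1.0e3      # Earth: produced in 1000 years by photosynthesis
mars_years = 1.0e9       # Mars: a billion years by chemosynthesis alone
earth_rate = grams_per_cm2 / earth_years
mars_rate = grams_per_cm2 / mars_years
print(round(earth_rate / mars_rate))   # 1000000 - a million-fold difference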
The problem with estimates such as these is that they are intended as much to test our assumptions as anything else, so rather than being predictions, the scientists here are asserting that, given what we believe, this is the situation. If future space exploration shows that life is there, it does not make \Ithem\i wrong, so much as show that our \Iassumptions\i were wrong. NASA plans to bring Mars samples back to \JEarth\j to look for evidence of life, so we may be able to test their reasoning sooner, rather than later. But if life is found on Mars, this will not make Jakosky and Shock wrong.\p
For example, we have no evidence that \Jphotosynthesis\j is now taking place on Mars, so we assume that it never has: maybe Martian \Jphotosynthesis\j is different, or maybe it no longer happens, but it did once. Then again, maybe Martian life can extract \Jenergy\j in some radically different way, though that is less likely.\p
Jakosky and Shock have even offered a way of proving them wrong: they say that the chances of picking up rocks containing fossils is small, and that life is more likely to be detected if NASA targets and explores \Jfossil\j or active hydrothermal systems, aqueous systems that could be exposed in walls of Mars' deep canyons, or active springs discharging at the surface.\p
And the chances for life on Europa? Even lower than on Mars, they say.\p
\BKey names\b: Martin R. Fisk (\Jbasalt\j tracks), Jutta Zipfel (\Jmeteorite\j), Michael Lipschutz (\Jmeteorite\j groups), Bruce Jakosky and Everett Shock (chances of life)\p
#
"And tomorrow's temperature in the Crab Nebula . . .",738,0,0,0
(Aug '98)
In the middle of 1054, astronomers in \JJapan\j and China recorded an amazing display in the sky when a \Jsupernova\j exploded. Suddenly, above the southern horn of the \Jconstellation\j \JTaurus\j was a new star which the Chinese described as six times brighter than Venus and about as brilliant as the full Moon. Once a star between eight and twelve times the size of our own \Jsun\j, the \Jsupernova\j ran out of fuel in about 5000 BC, but the \B\1supernova\b\c flash remained hidden until the light crossed the 6000 light years between there and \JEarth\j.\p
Even at that distance, it was bright enough to be seen even during the \Jday\j for a month after its first appearance, but after a year of visibility in the night sky, the "guest star", as the Chinese called it, faded to levels where it could only be seen with a powerful \Jtelescope\j. All that was left was a \Jcloud\j of gas, now seven light years across, and in the centre, a neutron star 20 kilometres (12.5 miles) across. Rediscovered in the 18\Uth\u century, the remnants of the \Jsupernova\j are now called the \B\1Crab Nebula\b\c, on account of its shape.\p
Even today, the \Jsupernova\j remnants remain a matter of interest, and NASA has plans to use the High Resolution Camera aboard the Advanced X-Ray Astrophysics Facility (AXAF), scheduled for launch in December, 1998, to examine the surface \Jtemperature\j of the neutron star at the centre of the Crab Nebula.\p
This \B\1neutron star\b\c is classified as a pulsar because it sends out bursts of \Jenergy\j 33 times a second, with reliability as high as that of our most dependable clocks and watches. The neutron star has a magnetic field about a trillion times that of the \Jearth\j, and this field focuses most of the radiation from the star into a cone which sometimes points at the \Jearth\j, and sometimes does not. So each spin of the neutron star produces a pulse of radiation, making it a pulsar.\p
The Crab Nebula's neutron star is of interest because it seems to be the youngest of about 700 known pulsars. Neutron stars cool as they age, so the \Jtemperature\j offers direct evidence of the physical activity occurring inside the star. As the youngest, it will also be the hottest, and that makes its \Jtemperature\j of interest to astronomers. \p
The standard model of a neutron star has a superfluid interior, under a crystalline neutron crust, something that we could never hope to make in a laboratory, so advanced physical theories need to be based on the available natural sources, and that means the Crab Nebula's pulsar. We will keep you posted on the results as they come in, during 1999.\p
#
"A better transistor from gallium nitride",739,0,0,0
(Aug '98)
A new generation of high-frequency, high-power transistors was announced during August by Cornell University researchers. Based on \Jgallium\j nitride, the new transistors promise to deliver up to 100 times as much power at microwave frequencies as the semiconductors now used in cellular telephones, military radar and \Jsatellite\j transmitters. \JGallium\j nitride is a semiconducting material with previously recognised potential for use in optical devices such as green and blue light-emitting diodes.\p
With the emphasis on miniaturisation, power is measured in terms of the output power per \Jmillimetre\j length occupied on a chip, and results as high as 2.2 watts per \Jmillimetre\j have been claimed at a frequency of 4 gigahertz (GHz), and the researchers believe they can do four or five times better in the near future. While others have achieved higher power outputs, they say, nobody has produced performances like theirs at such a high frequency, and as demand for a share of the electromagnetic spectrum increases, this will become more and more important.\p
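Because output is quoted per \Jmillimetre\j of chip length, total power scales with the combined length of the devices on the chip. A minimal sketch of the arithmetic behind Eastman's 100-watt target, mentioned below - the implied watts-per-millimetre figure is our inference, not a quoted measurement:\p

# Power scales with gate length: P = (watts per mm) * total mm on the chip
demonstrated_w_per_mm = 2.2      # claimed at 4 GHz
devices = 4                      # planned devices on one integrated circuit
length_mm = 2.0                  # each device 2 mm long
target_w = 100.0                 # the stated goal at 10 GHz
needed_w_per_mm = target_w / (devices * length_mm)
print(needed_w_per_mm)           # 12.5 W/mm, just beyond a five-fold gain on 2.2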
One major application will be in the next generation of portable telephones. Where present systems use a cellular array of transmitters, the new system will use large numbers of low-\Jorbit\j satellites to provide total coverage, rather than coverage limited to areas where transmission towers are located. The main advantage of the new transistors is that higher power outputs will allow the use of fewer satellites in higher orbits, producing remarkable cost savings.\p
Typically one transistor on a chip will be about 0.3 microns wide by 250 microns long, and chips with arrays adding to a length of half a \Jmillimetre\j are being tested, the idea being to use smaller devices to ensure that there are no problems with heat build-up. Lester Eastman, one of the developers, plans to combine four devices, each 2 millimetres long, on a monolithic integrated circuit to make a chip with an output power of 100 watts at a frequency of 10 GHz.\p
\JGallium\j arsenide devices, also developed by Eastman, operate at frequencies well above 12 GHz, but there is a trade-off between power output and frequency: as frequency increases, the power output of a given device drops rapidly. James Shealy makes the crystals, which are grown on a heat sink of either silicon carbide or sapphire. Silicon carbide conducts heat about 10 times as well as sapphire and makes the high power possible, but it is still a very expensive material, so the research depends on donations of wafers of silicon carbide from two companies which are also working to develop \Jgallium\j nitride transistors, Northrop Grumman and Cree Research.\p
The usual procedure in making a transistor involves "doping" very pure semiconductor material with a few atoms of another material which either creates free electrons or "holes" in the crystal. This makes the semiconductor into a conductor which can be switched on and off by the application of a small voltage.\p
Shealy and his colleagues have decided to make crystals in which a very thin layer of \Jgallium\j \Jaluminium\j nitride is laid on top of a base of \Jgallium\j nitride. The bond between the two layers places a strain on the upper layer that allows free electrons to flow into the \Jgallium\j nitride layer, a phenomenon known as a piezoelectric effect.\p
This in turn causes enormous charge densities, and produces a material with very low resistance, allowing higher voltages. The crystals are then used by Eastman to produce chips in a special clean room, and for now, the main problem is with small imperfections in the crystal, which they hope to iron out soon.\p
#
"Communications meteors",740,0,0,0
(Aug '98)
A scheme originally developed to keep US military communications going after a nuclear strike may now be used to manage an ambulance service. The \Jcold war\j scheme relied on bouncing radio signals off \B\1meteor\b\c trails, and now it is being used to keep track of the vehicles in a private ambulance service, according to a \INew Scientist \ireport in mid-August.\p
Each \Jday\j, more than a million dust specks enter the \Jearth\j's \Jatmosphere\j from space. Once they come in contact with the \Jatmosphere\j, friction heats the particles until they burn or vaporise, leaving a trail of ionised particles that radio signals can bounce off. Each trail may only last for a small part of a second, but with so many of the particles arriving, there will usually be a good enough reflective layer to support a ground-based communications system.\p
The system, known as "\JMeteor\j Burst", was cancelled when the Cold War ended. Some of the scientists involved then set up a company called StarCom Technologies, which has developed a civilian version as a cheap alternative to \Jsatellite\j systems. The transmitters send continual probe signals, and when one senses a return signal, it sends a rapid burst of digital data at frequencies between 40 and 50 megahertz that can be picked up over a wide area.\p
Data transfer rates are usually limited to about 20 kilobits per second, mainly because each \Jmeteor\j offers only a few hundred milliseconds of transmission time, enough to operate a low-information system that does no more than monitor the positions of vehicles. An ambulance firm in Washington state and Oregon has now fitted StarCom transceivers to a quarter of the 80 vehicles that it uses in the Seattle area to transfer patients around.\p
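Those two figures set the capacity of the system: each trail carries one short burst. A minimal sketch, assuming a 300-millisecond trail, an illustrative value within the "few hundred milliseconds" quoted above:\p

# Payload per meteor trail: bits = data rate * trail duration
rate_bps = 20000       # about 20 kilobits per second
trail_seconds = 0.3    # assumed trail duration of 300 milliseconds
payload_bytes = rate_bps * trail_seconds / 8
print(payload_bytes)   # 750.0 bytes - ample for a short vehicle position report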
#
"A new threat to networks",741,0,0,0
(Aug '98)
People running networks under Windows 95 or Windows 98 have been alarmed to discover that there is a hacking program called Back Orifice which can wreak havoc on their systems. The name is a play on Microsoft's Back Office suite.\p
By mid-August, more than 50 thousand people had downloaded copies of the program from web sites controlled by "the \JCult\j of the Dead Cow". A copy of the program may be placed on a machine, or a user may be tricked into downloading the program under some other name.\p
Back Orifice allows a remote user to take over the operation of the attacked computer, doing everything a legitimate user can do, even down to trapping and viewing passwords, without the legitimate user's knowledge. Virus protection firms are rushing to release software that will check for the program.\p
The makers claim that their aim is to reveal flaws in the security of Microsoft's Windows operating system, but the flaws are not in fact unique to Windows. The \JInternet\j was alive with messages in August warning users of the threat, many with an introduction such as: \I"I was not sure about publishing this one to this group, but security by obscurity never works and it is just too nasty if you miss it."\i\p
Microsoft has been providing advice which avoids a number of the issues, suggesting that a specific chain of events has to happen if the trojan is to be installed, calmly ignoring the possibility that a malicious user will deliberately install it and walk away, coming back to use it later.\p
Microsoft also asserts that the attacker needs to know a user's IP address, but most users capable of using such a program know several ways of accessing this information in "2 seconds flat". Microsoft also suggests that the attacker can be kept out by a firewall, but one large segment of networks, those in educational establishments, sees the most likely attackers already on the inside.\p
The rest of the year could be interesting for network administrators.\p
#
"Ultrasound imaging of prostate cancer",742,0,0,0
(Aug '98)
In western societies, about one man in a thousand will be diagnosed each year with a localised cancer in his \B\1prostate \Jgland\j\b\c. The preferred treatment has switched from surgical removal of the prostate, or external-beam radiation therapy, to radioactive seed implantation, or brachytherapy. The only problem: reliable identification of those eligible for the new treatment. A paper, soon to appear in the journal \IIEEE Transactions on Medical Imaging\i, and reported on the \JInternet\j in August, describes how an assessment can now be made in minutes rather than days, and at a fraction of the cost, using a new \B\1ultrasound\b\c imaging technology.\p
As a plus, the new imaging method will also help surgeons to place the radioactive seeds more accurately. In the USA, about 10-15% of patients with localised prostate cancer are currently being treated with brachytherapy, and this proportion is likely to rise. Surgeons use an external template and ultrasound imaging to implant the seeds according to a pre-determined plan, but the needle needs to pass under the patient's pubic arch bone to reach all parts of the prostate \Jgland\j. In 20 to 40% of cases, the arch interferes with access, making brachytherapy useless.\p
The ultrasound study reveals the size of the prostate: in general, if it is no more than 40 cubic centimetres, pubic arch interference is rarely encountered, but if the volume is 60 cc or more, interference will nearly always occur. In the intermediate range (40 - 60 cc), a separate x-ray \B\1computerized \Jtomography\j\b\c (CT) scan has been required, up until now, to map the pubic arch bone. This has then been combined with an ultrasound image to produce an overall map.\p
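The volume rule described above is simple enough to state directly. A minimal sketch - the function name and the wording of the labels are ours, while the thresholds are the ones quoted:\p

# Triage rule for pubic arch interference, based on prostate volume in cc
def arch_interference(volume_cc):
    """Classify likely pubic arch interference from an ultrasound volume."""
    if volume_cc <= 40:
        return "interference rarely encountered"
    if volume_cc >= 60:
        return "interference nearly always occurs"
    return "intermediate - map the pubic arch before deciding"

print(arch_interference(35))   # rarely encountered
print(arch_interference(50))   # intermediate: this is where imaging is needed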
CT images are expensive, only available in large clinics, and take a week or more to prepare. The new method does away with this, using ultrasound to map the pubic arch bone. This has been made possible by the use of image-enhancing technology to reduce the visual noise and significantly increase contrast in ultrasound imaging of the pubic arch bone - previously, the bone scattered enough of the ultrasound to cause significant visual noise.\p
Now a surgeon can see the prostate \Jgland\j and pubic arch bone in a single ultrasound image, and can assess pubic arch interference within minutes of the patient scan: at the moment, the biggest holdup is in getting data from the ultrasound machine to a computer for image processing. In the future, say the researchers, the new technology will be installed within the ultrasound machine itself, so doctors can view the combined prostate \Jgland\j-pubic arch image in real time as the patient is being scanned.\p
One other interesting plus: the CT scan requires patients to lie in a prone position for imaging, which can change the position of the prostate relative to the pubic arch. Since the brachytherapy treatment is given in the "tucked-knee" position, and since ultrasound can be done in this position, the results will be more reliable. In fact, trials have given a failure rate of less than 1%.\p
\BKey names\b: Peter Grimm, Yongmin Kim and Sayan Pathak.\p
#
"Erlichiosis can be detected with x-rays",743,0,0,0
(Aug '98)
X-rays can now be used to detect at least one disease spread by tick bite. Lyme disease is now well-known in a number of parts of the world, where it is spread by ticks, and in the USA, Rocky Mountain spotted fever is also known to the public as a disease spread by ticks, but ticks are also able to spread a variety of other diseases, including tick paralysis, relapsing fever, Q fever, tularemia and certain forms of \Jencephalitis\j.\p
Ehrlichiosis was first described as a disease in Algerian dogs in 1935. In 1956, Japanese scientists found that a bacterium caused a human form of the illness. It was only recognised as a tick-borne disease in 1986, when three children in North Carolina caught the disease, including one who died in 1996. Two forms are recognised in the US: human monocytic ehrlichiosis, which occurs in the central and south-eastern United States, and human granulocytic ehrlichiosis, chiefly found in the north central states.\p
Now Dr. Lynn A. Fordham has discovered that the illness shows up on chest x-rays as increased fluid in the lungs. This matters because early diagnosis and treatment are vital. The most effective treatment, an antibiotic called doxycycline, has the side effect of discolouring teeth, and for this reason, doctors often avoid prescribing it unless they are certain of the need.\p
The report, which will appear in the November issue of the \IAmerican Journal of Roentgenology\i, is written by Fordham and four other authors: Chung, Specter, Merten and Ingram.\p
Fordham points out that the fluid build-up seen on the x-rays may also be caused by organ failure, severe burns and drug reactions, but in the absence of any of these conditions, patients with exposure to ticks and the fluid build-up ought to be looked at very carefully indeed.\p
Even a few hours' head start on a child who is gravely ill in an intensive care unit can be significant in the race between life and death. Of course, careful observation of those bitten by ticks, and avoiding tick bites in the first place, are also important. If a child suffers tick bite, this should be noted on a calendar, and reported if the child becomes sick in the next few weeks.\p
#
"Salt hunger",744,0,0,0
(Aug '98)
Salt is a dietary problem. Too much of it is bad for you, causing problems such as \Jhypertension\j, but too little salt is also harmful, so using the condiment wisely is an important matter. The trouble is that some people experience a craving for salt which goes well beyond their bodily needs. According to a recent report in the journal \IAppetite\i, if you have this sort of craving, it may be a result of your mother's \B\1morning sickness\b\c.\p
Two University of Washington psychologists, Ilene Bernstein and Sue Crystal, say that people's preference for salt may have been imprinted while they were still in their mother's womb. So if you hanker after chips, popcorn, pretzels and the other snack foods rich in salt, and if you reach for the salt shaker first, it may not be your fault.\p
The two researchers say they have found a link between people's salt preference and the level of morning sickness experienced by their mothers when they were pregnant. They found that 16-week-old infants whose mothers suffered moderate to severe nausea and vomiting in early \Jpregnancy\j showed a greater preference for salt-\Jwater\j solutions than babies whose mothers experienced mild or no morning sickness. \p
They had previously found that salt preference in young adults was stronger in people whose mothers reported moderate or severe morning sickness. The preference measure was based on self-reporting on salt use and food choices, and also on salt intake under laboratory conditions. The offspring of women who had mild or no symptoms showed a reduced preference for salt.\p
Nearly two-thirds of pregnant women suffer the symptoms of morning sickness and Bernstein believes it is the \Jdehydration\j associated with vomiting that seems to be the key in shaping a fondness for salt. As fluid is lost in vomiting, \B\1hormone\b\c systems spring into action to restore fluid balances: either these \Jhormones\j cross the placenta, or there is a "knock-on" effect, with the baby responding to \Jdehydration\j by releasing its own \Jhormones\j.\p
Young adults are able to assist in a study of this sort, but how do you test a 16-week-old baby? Simple: you squirt small amounts of distilled \Jwater\j, 0.6 percent salt \Jwater\j and 1.2 percent salt \Jwater\j, into the babies' mouths, and watch their facial and other reactions, which can range from grimaces to licking happily. The filmed reactions were coded later by observers who had no knowledge of which solution a given baby was being given in any test.\p
Another test involved giving the babies a bottle with 20 millilitres of one of the test solutions for one minute, or until they rejected the bottle, and the amount of fluid consumed was then measured.\p
The babies of mothers who had experienced moderate to severe morning sickness preferred the 1.2% salt solution - not as salty as \Jtomato\j juice or chicken soup, but about the level people usually feel comfortable gargling with, which is saltier than mother's milk or human \Jsaliva\j. Mild morning sickness in this study involved vomiting once or twice during the \Jpregnancy\j, while moderate to severe vomiting ranged from once every other \Jday\j for at least one week to two and three times a \Jday\j for three weeks.\p
#
"A quick fix for H. pylori",745,0,0,0
(Aug '98)
\I\1Helicobacter pylori\b\i\c, the bacterium which causes something like 90% of all \Jstomach\j ulcers, usually abbreviated to \IH. pylori\i, is now treatable in a much shorter period than was previously thought possible. The most commonly prescribed treatments run 14 to 28 days and involve multiple doses of pills each \Jday\j, while recent research suggests that much shorter periods of treatment with the right drug combination can do the job just as well.\p
The problem is in the phrase "drug combination". Patients need to take the right drugs, \Jday\j by \Jday\j, at the right times, and the longer a course of treatment, the more likely it is that the patient will forget one or more doses. And the shorter treatment is also less costly . . .\p
In fact, the study, reported in the \IArchives of Internal Medicine \iduring August, suggests that a 10-\Jday\j treatment is as good as a 14-\Jday\j course of drugs. The standard "triple therapy" dosage involves a combination of three drugs: a powerful acid-inhibiting drug called a proton-\Jpump\j inhibitor (such as lansoprazole) along with amoxicillin and clarithromycin. Based on this research, the U.S. Food and Drug Administration recently reduced the recommended length of treatment for \IH. pylori \ifrom 14 to 10 days.\p
The treatments showed an 85% clearance rate for the 14-\Jday\j treatment, and 84% for the 10-\Jday\j treatment, a difference which has no statistical significance. It seems that people acquire their \IH. pylori \iinfections as children, probably from poor hygiene, and the bacterium exists harmlessly in many adults' digestive systems. In recent decades, higher hygiene standards have reduced the incidence of infections, but they are still common. Studies have shown that \Jstomach\j \Julcer\j patients who are free of the bacterium have only a 15 - 20% chance of suffering another \Julcer\j, compared with 75% for those who still carry the bacterium.\p
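Why does a one-point gap carry no statistical significance? With samples of trial size, the sampling error on each percentage is larger than the difference between them. A minimal sketch of a two-proportion z-test, assuming 150 patients per arm - a hypothetical figure, since the study's group sizes are not given here:\p

import math

# Two-proportion z-test: is an 85% vs 84% clearance rate a real difference?
def two_proportion_p(p1, n1, p2, n2):
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1.0 / n1 + 1.0 / n2))
    z = abs(p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))   # two-sided p-value

print(two_proportion_p(0.85, 150, 0.84, 150))   # ~0.81, nowhere near the usual 0.05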
#
"Influenza in the news",746,0,0,0
(Aug '98)
One of the mysteries of the 1918 \B\1influenza\b\c epidemic, which killed 20 million people around the world, may have been solved, according to an August report in the \IProceedings of the National Academy of Sciences \i(\IPNAS\i). An unusual molecular mechanism amplifies the disease-causing power of the influenza A virus, allowing it to attack parts of the body other than the usual attack sites in the respiratory tract.\p
It looks as though this could provide a better understanding of how the influenza A virus suddenly becomes virulent, and might be a useful marker for any future emerging pandemic. \p
There are two surface proteins on the influenza A virus, referred to as haemagglutinin (HA) and neuraminidase (NA). The virus becomes infectious when the HA molecule is cut into two sub-units which attach it to human cells. The protease enzymes which cut the HA molecule are provided by us, with the enzymes being common in the lungs and \Jtrachea\j but not in other tissues. As a result, these are the tissues which are usually attacked by the virus.\p
Yoshihiro Kawaoka and Hideo Goto, both from the University of \JWisconsin\j School of Veterinary Medicine have found a strain of virus which uses a more common human \Jenzyme\j, plasmin, to split the HA molecule.\p
The viruses they studied are descended from the 1918 pandemic strain, and have adapted to grow in the brains of mice. This strain's NA molecule traps and holds plasminogen, a plasmin precursor (a molecule which can be turned into plasmin). This serves to accelerate the rate of HA cleavage, speeding up the rate of infection.\p
After studying ten other strains of human, bird and pig flu virus, they have found no other sign of this ability to gather plasminogen, suggesting that this ability is the key to the severe virulence seen in the 1918 pandemic influenza A strain. There are two features which give the NA molecule this power, and Goto and Kawaoka suggest that any strain of influenza whose NA molecules carry this feature should be regarded as potentially dangerous.\p
A number of \Jbacteria\j, including the group A streptococci, carry plasminogen-binding proteins, which make infection easier. This is the first time the mechanism has been found in a virus. Now it will be necessary to check other pathogenic viruses for the same mechanism.\p
#
"How bacteria protect themselves against the immune system",747,0,0,0
(Aug '98)
A report to appear in the September \IProceedings of the National Academy of Sciences \iwas announced at the end of August. It reveals a major mechanism used by \Jbacteria\j to get around the \B\1immunity\b\c which we are given by the human immune system. This in turn opens the way for the development of a new class of \Jantibiotics\j to fight infection.\p
It appears that flavohaemoglobin, a bacterial protein, gives protection against nitric oxide, a toxic chemical secreted by the immune system to help kill disease-causing microorganisms. The protein, which the researchers have re-named nitric oxide dioxygenase (NOD), detoxifies nitric oxide, providing protection against infection. \p
The nitric oxide (NO) is produced by inflammatory cells, and then kills the \Jbacteria\j. If the \Jbacteria\j can break the NO down with NOD, the \Jbacteria\j will survive.\p
NOD is an ancient protein, having been around since about the time oxygen first appeared in the \Jearth\j's \Jatmosphere\j. It belongs to the same family of proteins as \B\1haemoglobin\b\c, the protein in red blood cells which carries oxygen. While the presence of a form of haemoglobin in \Jbacteria\j has been known for some time, its actual function has always been a mystery - the only thing that could be ruled out was that it was used to carry oxygen. Now that a role has been found for the protein, its renaming as NOD is justified.\p
The discovery is also important evolutionary news, showing us what role the haemoglobin family played before the proteins were needed to carry oxygen.\p
Medically, the importance lies in the fact that \Jbacteria\j vary remarkably in their ability to withstand attack from NO. If the protein can be attacked, the \Jbacteria\j will be left with almost no defences - and hopefully, no easy way to get around the attack, either.\p
Interestingly, \Jtobacco\j smokers are continually filling their lungs with huge concentrations of NO, and \Jbacteria\j react to increased levels by increasing their defence capability. Could this explain why smokers suffer more from infection?\p
\BKey names\b: Andrew Salzman, Paul Gardner\p
#
"Double danger from Fen-phen",748,0,0,0
(Aug '98)
Two diet drugs, Fenfluramine and Phentermine, are often taken together, but during August the combination was described as potentially toxic. The report, due to be released at an \B\1obesity\b\c congress on September 2, appeared on the \JInternet\j several days earlier. Used together, the two drugs take away the body's ability to control the amount of \B\1serotonin\b\c in blood plasma. High \Jserotonin\j levels damage blood vessels, especially in the lungs, and may also damage heart valves.\p
The two drugs were commonly combined between 1992 and 1997, when fenfluramine was voluntarily withdrawn by its manufacturer. The combination was never checked for approval, as the two drugs were supplied as two pills. The researchers have suggested that phentermine may cause problems when paired with antidepressants such as Prozac, and also with common cold remedies such as pseudoephedrine, phenylpropanolamine, and ephedrine.\p
The body usually controls \Jserotonin\j by absorbing it into \B\1platelets\b\c which use it for clotting, or by destroying the \Jserotonin\j with an \Jenzyme\j, monoamine oxidase (MAO). Fenfluramine, like many antidepressant drugs, stops plasma \Jserotonin\j from being taken up into \Jplatelets\j, while phentermine inhibits the MAO that destroys \Jserotonin\j. Phentermine was identified as a MAO inhibitor in the 1970s, but this fact did not appear on the drug's label, pointing to a need for drug labels to be complete and up-to-date. The US standards for labels, which tend to apply in other parts of the world, were established before the MAO inhibition by phentermine was known. Later, when this became apparent, the labels were not updated. The researchers also sounded a note of caution about two herbs, St. John's Wort and ma huang, which may also be MAO inhibitors. \p
#
"Diabetes and oral contraception",749,0,0,0
(Aug '98)
The so-called "mini-pill", the progestin-only \Jbirth control\j pill, may be a problem for women who develop \B\1diabetes\b\c during \Jpregnancy\j, as it may increase the risk that they will later develop type-2 diabetes, also called non-\Jinsulin\j dependent diabetes mellitus, or NIDDM, according to a report in the \IJournal of the American Medical Association \i(\IJAMA\i).\p
On the positive side, the same report indicates that the more common low-dose combination oral contraceptives appear to be a relatively safe method of contraception for these very high-risk women. These pills contain a mix of \B\1oestrogens\b\c and progestin. The research is based on a study of 904 Latina women who had developed gestational diabetes mellitus (GDM) during a recent \Jpregnancy\j, all of whom had returned to normal sugar processing after the birth of their child.\p
The women were about equally divided between those choosing hormonal oral contraceptives and those choosing non-hormonal contraception. Of those using hormonal contraceptives, 383 were prescribed low-dose combination \Jbirth control\j pills, and another 78 women who were breast-feeding received the progestin-only contraceptives. This choice is common in women who wish to breast-feed, because the progestin-only oral contraceptives do not interfere with milk production. These women switched to low-dose combination contraceptives when they stopped breast-feeding.\p
Across the sample, 169 women developed chronic diabetes during the study, but the distribution across groups was striking. In the progestin-only sample, the rate of type-2 diabetes was 26.5% a year, while the rate was 11.7% per year for the low-dose combination oral contraceptive, and for women using non-hormonal contraception, the rate was just 8.7%.\p
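Put against the non-hormonal baseline, the contrast is easier to see. A minimal sketch of the arithmetic on the annual rates quoted above:\p

# Annual rates of progression to type-2 diabetes, relative to baseline
rates = {"progestin-only": 26.5, "low-dose combination": 11.7, "non-hormonal": 8.7}
baseline = rates["non-hormonal"]
for group, rate in rates.items():
    print(group, round(rate / baseline, 2))   # progestin-only comes out ~3x baseline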
Type-2 or adult-onset diabetes can be asymptomatic in its early stages, so many cases are only diagnosed with a blood test. Most people with this condition develop the disease in middle age. \JObesity\j, having a close relative with the disease, and being a member of certain ethnic groups (including African-American, Latino and Native American) are known risk factors. This form of diabetes is treated by lifestyle changes in diet and exercise, and some oral medications, rather than by \Jinsulin\j.\p
\BKey names\b: Ruth Peters and Siri Kjos\p
#
"New HIV strain identified",750,0,0,0
(Aug '98)
A new strain of the \B\1HIV\b\c-1 virus which causes \B\1AIDS\b\c has been reported from the \JCameroon\j. The new strain is more similar to the SIV (simian immunodeficiency virus) than the two previously known strains, M ("majority") and O ("outlier"). The strain was isolated from a \JCameroon\j woman who died of the disease in 1995, and seems to be rare and localised. The strain is described in the September issue of \INature Medicine\i, and was found during a study led by virologist François Simon of the Bichat \JHospital\j in Paris. Checks had failed to find any sign of M or O strains, but showed positive to SIV. After researchers isolated the patient's virus and sequenced its genetic code, they concluded that it belongs to a previously unknown third HIV-1 group, which they called the N group.\p
The new strain is also identified as YBF30, and is genetically closer to \Jchimpanzee\j SIV than either of the other HIV-1 strains, suggesting that its evolutionary ancestors might have been transmitted from chimps to humans. YBF30 is both rare and localised, on present information, with only three out of 700 blood samples from HIV-1 infected patients in \JCameroon\j testing positive for the strain. That, however, is no reason to be complacent, and standard \B\1HIV\b\c tests will now need to be modified to pick up the new strain as well as the others.\p
#
"Measles outbreak in South America",751,0,0,0
(Aug '98)
The campaign to rid the Americas of \B\1measles\b\c by the year 2000 has received a setback, with more than 4000 new cases in Argentina, \JBolivia\j and \JBrazil\j. While \Jmeasles\j cases in north and south America were cut from 250,000 in 1990 to 2109 in 1996, a break-out like the present one could undo all of that progress in a very short time.\p
So far, eleven children have died in Argentina, and another eleven in \JBolivia\j, which may seem like a small threat, but across the world, a million children die each year from \Jmeasles\j and its complications.\p
The race was on during August to use \Jvaccination\j as a means of containing the outbreaks. Numbers in the rest of the Americas were satisfactorily low, with just 7 cases in Canada and \JColombia\j during 1998 to date, 47 in the US, with Venezuela, \JParaguay\j and \JGuatemala\j reporting 3, 2 and 1 case so far this year. \p
\JMeasles\j vaccine can be provided for about 10 cents a dose, while the triple vaccine against \Jmeasles\j, \Jmumps\j and rubella costs just 49 cents, which sounds like a very good investment. The Pan-American Health Organization maintains a web site at \Bhttp://www.paho.org\b, where more information may be obtained.\p
#
"A new weapon against cancer",752,0,0,0
(Aug '98)
In 1991, scientists found a \B\1protein\b\c in frog eggs which can kill \B\1cancer\b\c cells. Now a related protein has been found in mammals which has the same potential to fight cancer cells. According to a report in the \IProceedings of the National Academy of Sciences\i, ribonuclease A, a digestive protein made by the \Jpancreas\j, can be genetically altered to kill cancer cells. This might open the way to fight cancer with chemicals while avoiding the side effects of standard \Jchemotherapy\j.\p
The original ribonuclease protein was found in the Northern leopard frog, and is now being manufactured as Onconase, which is currently undergoing clinical trials. Intravenous treatment with Onconase has shown promise against malignant mesothelioma (an \Jasbestos\j-related cancer, related to \B\1asbestosis\b\c), and the drug also seems to inhibit the replication of the \B\1HIV\b\c virus.\p
The new research was aimed at finding out why the frog protein is so effective against cancers, and researcher Ron Raines decided to compare the molecule with a "cousin", a similar ribonuclease protein found in cows. Bovine ribonuclease is very similar to the human form of the protein.\p
The mammalian forms of the ribonuclease bind to a molecule called a ribonuclease inhibitor (RI), which attaches to the \Jenzyme\j and stops it from attacking and breaking down RNA in the cell. RI is found in just about every human cell, but Onconase binds only weakly to RI, leaving the frog protein free to attack cells. Luckily, Onconase seems only to attack cancer cells, while leaving normal human cells alone.\p
This remains a mystery to be solved, although Raines speculates that there may be \Jreceptors\j on the outside of cancer cells which bind more tightly to ribonucleases. But even if the mechanism is not explained, Raines and his colleagues have been able to create two variants of the bovine ribonuclease which do not bind tightly to RI, and which have proved, in laboratory tests, to attack cancer cells. In other words, the property behind the effect is not special to Onconase, and every human ribonuclease has the potential to wipe out cancer cells.\p
Raines' lab is currently working on creating variant strains of human ribonuclease that can produce the same cancer-fighting effects. "Using a human protein is less problematic than integrating a substance that is foreign to the human body", he says.\p
#
"New breast cancer gene",753,0,0,0
(Aug '98)
The September issue of \INature \JGenetics\j\i describes a new gene which causes a small increase in the risk of hereditary breast cancer in women of Ashkenazi Jewish descent. The gene has an especially pronounced effect in those who already carry the other well-known BRCA mutations which are linked to the disease.\p
The \Jmutation\j, which has been named APC I1307K, increases the risk of inherited breast cancer by 50% in women of Ashkenazi Jewish descent, which translates to about a 15% lifetime breast cancer risk for this group of women with the APC \Jmutation\j. This compares with a risk of about 10% for women in the general population. \p
In a study of 632 women with breast cancer, 10.1% carried the APC \Jmutation\j, but in those with BRCA mutations, the APC level approached 19%, while the level was only about 7% in a much larger sample of women of Ashkenazi Jewish descent, based on the analysis of blood samples. These were healthy women who were taking part in another epidemiological study.\p
The APC gene has been previously linked to an increased risk of developing colon cancer, and some of the individuals identified in that study had a family history of breast cancer, alerting researchers to the possible link which now seems to be confirmed. \p
The APC gene \Jmutation\j could be a genetic modifier of breast cancer risk in women who are already under a hereditary risk of breast cancer. Oddly, the women with the APC \Jmutation\j did not seem to develop breast cancer at an earlier age. For the moment, there would seem to be no need or reason to undertake general screening of women for the \Jmutation\j, especially as hereditary breast cancer appears to account for only 5-10% of all breast cancers.\p
\BKey name\b: Kenneth Offit\p
#
"Jumping DNA and evolution",754,0,0,0
(Aug '98)
Where did the human immune system come from? Based on a report in \INature\i this month, it is possible that it evolved from a mobile piece of DNA that inserted itself into the forerunner of the mammalian \Jgenome\j more than 450 million years ago. The report describes evidence that tiny gene particles, which are vital to the task of producing millions of different kinds of \Jantibodies\j, act like a gene segment that can "jump" into foreign DNA. These genes are known in lower organisms, but this is the first cut-and-paste "transposase" ever found in humans.\p
Only the jawed vertebrates have a second, adaptive immune system, in addition to the innate immune system that all other species have, suggesting that it was some ancestor of the higher vertebrates which was fortunate enough to acquire this gene. Alone among the vertebrates, the \B\1lamprey\b\c and \B\1hagfish\b\c lack this adaptive immune system.\p
The key point to remember is that our adaptive immune system has two different methods of defence, both based on a class of white blood cells called lymphocytes, which are found in the blood and lymphoid organs. One group, the B lymphocytes, produce \B\1antibodies\b\c that bind tightly to a foreign molecule, inactivating it or marking it for destruction by other cells in the immune system. The T lymphocytes, on the other hand, detect the presence of foreign molecules inside special "processing" cells, once those cells have displayed a fragment of the foreign molecule (such fragments are called antigens) on the processing cell's surface. They do this because T-cell \Jreceptors\j on the surface of the T \Jlymphocyte\j bind strongly to the \Jantigen\j.\p
Both \Jlymphocyte\j types generate an almost unlimited number of \Jantibodies\j and T-cell \Jreceptors\j, using the same genetic mechanism. This uses a smaller number of gene segments that can be shuffled and joined to one another to produce many distinct combinations. In a sense, each recombination is a new gene to add to an almost unlimited \Jdatabase\j of genetic information from which to generate \Jantibodies\j and T-cell \Jreceptors\j.\p
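For readers who want to see how a modest kit of gene segments can yield millions of antibodies, here is a minimal sketch in Python. The segment counts are illustrative round figures of the kind often quoted for the human heavy and kappa light chain loci, not exact values from the report.\p

# Rough combinatorial arithmetic behind antibody diversity.
# Segment counts are illustrative approximations, not exact figures.

heavy_V, heavy_D, heavy_J = 40, 23, 6   # assumed heavy-chain segment counts
light_V, light_J = 40, 5                # assumed kappa light-chain segment counts

heavy_combinations = heavy_V * heavy_D * heavy_J   # one V, one D, one J per chain
light_combinations = light_V * light_J

# An antibody pairs one heavy chain with one light chain.
paired = heavy_combinations * light_combinations
print(f"heavy-chain combinations: {heavy_combinations:,}")   # 5,520
print(f"light-chain combinations: {light_combinations:,}")   # 200
print(f"paired combinations:      {paired:,}")               # 1,104,000

# Junctional imprecision when RAG1/RAG2 cut and rejoin the segments
# multiplies this figure by several orders of magnitude again.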
Two closely linked genes, RAG1 and RAG2 (standing for recombination-activating genes 1 and 2), code for proteins which promote this genetic recombination. Together, RAG1 and RAG2 work as a transposase, an \Jenzyme\j which takes pieces of DNA from one location in a \Jchromosome\j and moves (transposes) these pieces to another place. This mechanism seems to be central to the way vertebrates create millions of different \B\1immunoglobulin\b\c and T-cell \Jreceptors\j from a limited number of genes.\p
Lampreys and hagfish lack the system of B and T lymphocytes, and have neither RAG1 nor RAG2, nor any close relatives of these genes, but all other vertebrates do have this characteristic. While it is possible that the condition evolved more than once, it is usual to assume a single occasion on which the RAG transposase was a virus-like particle which inserted itself into the germ-line of an ancestral jawed vertebrate, whose descendants later became all of the jawed vertebrates on \Jearth\j today.\p
The only catch: the transposition can be observed in laboratory preparations, but so far, there is no evidence that it works the same way in the human body, where the RAG genes now just work to slice and connect pieces of genes without inserting the excised piece of DNA in a different location. Of course, wandering around with genes like this inside you could be as dangerous as rolling around in a barrel full of razor blades, so the next step will need to be finding out how the gene is stopped from shuffling our genetic decks until we are randomly killed.\p
#
"The structure of cytochrome bc1",755,0,0,0
(Aug '98)
When the structure of DNA was revealed by Watson and Crick in 1953, nobody could have predicted that, a generation and a half later, we would have the sorts of genetic \Jengineering\j that are now commonplace. The structure of \Jcytochrome\j bc1, revealed in July by Bing Jap, may not be as fruitful, but it is still extremely promising. The \Jcytochrome\j is found in mitochondria, the organelles often called the powerhouses of the cell, where most food is broken down into smaller \Jenergy\j units, and the structure of this molecule may provide a key to treating Alzheimer's disease or any of a number of other degenerative diseases - including aging.\p
The report reveals the crystal structure of the molecule, also known as complex III, which plays a critical role in the relay of electrons for \Jenergy\j production. It is one of four protein complexes in the mitochondrial respiratory chain. The solution took some eight years, and used x-ray \Jcrystallography\j to produce structural images of all 11 subunits of \Jcytochrome\j bc1 at a resolution of approximately 3 angstroms, using \Jcytochrome\j bc1 from cow heart cells.\p
Every plant and animal cell contains hundreds of mitochondria, which generate about 90% of the \Jenergy\j the cell needs. The \Jenergy\j is generated by the transfer of electrons from food molecules. The electrons are passed through the respiratory chain and into the production of ATP, \B\1adenosine triphosphate\b\c. If anything interferes with the \Jelectron\j transfer process, \Jenergy\j production drops away, which is not good for the maintenance of normal life. Aside from anything else, the drop-off also leads to the production of oxygen free-radical molecules which can cause mutations in DNA, both in the mitochondria and also in the nucleus.\p
\JCytochrome\j bc1 is revealed as a dimer, a complex of two molecular chains called monomers. These monomers determine the shape of the dimer, and in particular, of a hollow between the two monomers, and the shape of the dimer controls the operation of the \Jcytochrome\j. With the structure revealed, the hollow, previously no better than a hypothesis, is now confirmed.\p
In living mitochondria, the dimer is a non-crystalline protein embedded in the lipids of the mitochondrial inner membrane, so the first challenge was to crystallise the protein, because x-ray \Jcrystallography\j (as its name implies) depends on the existence of a crystal. The first step was to break the membrane up by using just the right amount of just the right detergent.\p
The small complex II protein and the very large complex I protein remain to be solved, and Jap and his colleagues will now join the effort, already under way, to solve complex I, using a combination of x-ray and \Jelectron\j \Jcrystallography\j techniques. Jap describes this as involving "multiple crystal averaging and phase information from both heavy atom derivatives and \Jelectron\j micrographs".\p
\BOther key names\b: Joong Lee, John Kyongwon Lee\p
#
"Spotting the bullies early",756,0,0,0
(Aug '98)
According to an article in the August issue of the \IArchives of General \JPsychiatry\j\i, size is important when observers are trying to predict which children will turn into school-yard bullies. In an American study, researchers have found that a height difference as small as half an inch (1.3 cm) at age three can translate into a higher level of aggression when the same children are eleven years old, at least in \JMauritius\j.\p
Toddlers who are more fearless and stimulation-seeking than their peers are also more likely to show aggression at eleven, and earlier research has already shown that the most aggressive children at age 11 are more likely than normal to become violent criminals as adults - regardless of their height at that age, though there is no evidence that taller 3-year-olds become violent criminals.\p
Adrian Raine, a clinical neuroscientist and lead author on the study, is wary of going too far, and warns that the findings cannot be used to accurately predict whether any particular child will or will not grow up to be a criminal. What his team have found, he asserts, is that the early markers should be regarded as warning signs that some children may be more inclined to aggression.\p
Raine believes there may be a critical period in development - sometime after age 3 but before age 11 - when a child learns to use his physical advantage to aggressive ends. Parents of tall toddlers, he suggests - especially those who are very stimulation-seeking and fearless - need to take extra care to drive home the message that there are better ways than physical force to get what they want in life.\p
The study was based on height and weight measurements on 1130 male and female three-year-olds in \JMauritius\j, a racially mixed island nation in the Indian Ocean off the coast of \JAfrica\j. The island has low emigration rates, so the researchers were able to track the children as they matured.\p
Children were ranked at the start on several stimulation-seeking scales, including a four-point scale that tested their willingness to explore toys independently of their mother in a laboratory setting, ranging from 1) passively clings to mother through to 4) actively explores toys without returning to mother. They were also rated on a five-point fearlessness scale.\p
Then at age 11, their mothers answered questionnaires to measure the youths' aggressiveness ("fights," "is cruel," "swears" and "threatens"). The youths ranked by their mothers in the top 15% on an aggression scale were found to have stood, as toddlers, an average of one-half inch above their peers. There was also a weaker link to weight.\p
Raine hypothesises that the cause may be a matter of \B\1testosterone\b\c, with taller people usually having higher testosterone levels, given that a number of studies have shown that violent offenders have higher than normal testosterone levels. Yet if the whole issue could be explained that simply, the most aggressive 11-year-olds might be expected to stand taller than their less aggressive peers, but no research has ever borne out such a connection.\p
The link to stimulation-seeking may be as simple as children who seek stimulation getting away from home more, and getting into more trouble, since one of the strongest predictors of delinquency is a lack of parental supervision.\p
#
"Liverworts got there first",757,0,0,0
(Aug '98)
A mid-August report in \INature\i indicates that the earliest land plant was a \B\1liverwort\b\c. The first evidence for plants coming up out of the sea is in the form of mid-Ordovician \Jspore\j tetrads, fossilised groups of four spores, about 476 million years old, but it has always been uncertain which plant group owned the spores. There are no associated megafossils, large pieces of plant material, to make the identification clear, and a number of different plant groups can produce such spores.\p
At different times, different phylogenetic analyses have identified the \Jliverwort\j, the \B\1hornwort\b\c group and some kind of \B\1bryophyte\b\c as each being the first group of land plants, but a new survey of 352 diverse land plants shows that three mitochondrial group II introns are present, with occasional losses, in mosses, hornworts and all major lineages of vascular plants.\p
These introns are entirely absent from liverworts, green \Jalgae\j and all other \B\1eucaryotes\b\c. These results suggest that the liverworts arose first and came ashore, and that only after the liverwort lineage had branched off did a common ancestor of all the other land plant groups acquire the introns, before those groups in turn moved from \Jwater\j onto the land.\p
#
"Fruit flies should take their time",758,0,0,0
(Aug '98)
A fruit fly has only a short life as an adult, and logic would suggest that, in evolutionary terms, the most "sensible" thing an adult fruit fly could do would be to mate as soon as possible, in order to ensure that it produced offspring. Nature is a little more subtle than that, and the wham-bam style of reproduction has now been shown not to pay off - not if you are \IDrosophila melanogaster\i, a fruit fly, anyway. A report at the end of August in the \IProceedings of the National Academy of Sciences\i shows that when female fruit flies are given a choice between mates, their offspring live longer as adults than do the offspring of females who have only a single mate from which to choose.\p
The results seem to suggest that even something as small as a female fruit fly is somehow able to detect those potential mates who offer the best "chemistry" - or in scientific terms, choose males which carry the genes most likely to complement her own genes.\p
The method or methods that the females might use to identify a "good" mate remain a mystery. Many, if not most, animal signals may fall outside the range of unassisted human perception, and may involve smells, movements, or appearance in some part of the spectrum which we do not see. Most previous studies have been restricted to observable appearance and how it relates to female choosiness or male attractiveness.\p
In the present study, the researchers created two sets of artificial selection lines of \ID. melanogaster\i. In the first line, named the "S" line, they reduced the opportunity for sexual selection by mating one virgin male and female in each vial. In the second ("M") line, they placed one female in a vial with five males so that female choice and male competition would come into play.\p
In some of the "S" vials, the female rejected the only available male, but with replication over ten generations, the researchers were then able to rate the offspring for a number of survival characteristics, including wing size, larval competitive ability, and the number of "teeth" in the sex comb, a structure resembling a stunted hair comb with long teeth, which is found only on the legs of males and which is used to sense females.\p
Most importantly, they looked at measures of adult age-specific survival, and found that the males and females in the "M" lines lived longer than the flies in the "S" lines. Perhaps the most interesting aspect of this work is that it was carried out over a number of generations, allowing the "\Jmagnification\j" of any slight genetic effect. So for fruit flies, at least, arranged marriages would seem not to be the way to go.\p
\BKey names\b: Daniel Promislow, Emily A. Smith and Louise Pearse\p
#
"The effects of high CO2 levels",759,0,0,0
(Aug '98)
The first year's results from a Duke University research facility that exposes open-air forests to high carbon dioxide levels suggest that South-eastern US forest trees could grow up to 12 percent faster in the higher CO\D2\d \Jatmosphere\j expected by 2050 from \Jfossil\j fuel combustion and other human activities.\p
The researchers said that they did not believe such increased growth would serve to absorb the extra CO\D2\d being produced around the world, and they were doubtful that such high growth rates would be sustained as the experiment continues. William Schlesinger of Duke University, one of the researchers, commented: "I would not be at all surprised if the growth response is somewhat lower in '98. I certainly expect it's going to decline after a few years as the forest adjusts."\p
The research was reported at the annual meeting of the Ecological Society of America, held during August. It is based on two sets of three ring-shaped plots: in one set, three patches of loblolly pine-dominated woodland are being enveloped around-the-clock by the CO\D2\d equivalent of 21st-century air, delivered by computer-controlled rings of towers. Another three identical tower rings, each encircling similar patches of pine forest, provide no extra carbon dioxide. The gas-less rings thus serve as "control" sites that scientists can compare to the high-CO\D2\d plots.\p
While rising levels of atmospheric carbon dioxide might make plants grow faster, they could be bad news for plant-eating insects, according to a report in \INew Scientist\i magazine. A \JFlorida\j biologist has found that subtle increases in CO\D2\d can kill leaf-eating moths by reducing the nutritional value of the leaves they feed on.\p
\BOther key names on the Duke project\b: Shawna Naidu, Jacqueline Mohan, Elke Naumburg, David Ellsworth (Brookhaven), Andrew Allen and Adrien Finzi\p
#
"Africa cold . . .",760,0,0,0
(Aug '98)
Late in August, \IScience\i carried a report about evidence of a series of global \Jice age\js which may have nearly snuffed out life about 700 million years ago. The report is based on isotopic and geological evidence from \B\1Namibia\b\c. Rock deposited on the edge of a long-vanished ocean in what is now south-west \JAfrica\j has provided the clues.\p
In the first place, long before any glacial debris was deposited, the isotopic ratio suggests that carbon was being removed from the world ocean through about half chemical and half biological processes. Then as the \Jice age\j developed, the biological proportion of the loss dropped, as would be expected as ice spread slowly across the globe. Isotopic levels are not preserved in glacial debris, but a layer of carbonate rock over the top of the debris suggests that biological productivity had dropped all the way to zero. Later, biological productivity appears to have recovered slowly.\p
And how were we saved? It seems that an outbreak of volcanic activity broke the cycle and got the \Jearth\j's \Jweather\j going again. There must have been breaks in the ice where life survived, the researchers suggest, until volcanic carbon dioxide got the \B\1\Jgreenhouse effect\j\b\c going again.\p
\BKey names\b: Paul Hoffman, Galen Halverson, Daniel Schrag and Alan Kaufman.\p
#
"Africa hot",761,0,0,0
(Aug '98)
Two weeks earlier, also in \IScience\i, another research group reported the results of an isotopic analysis of the sediments from Hausberg Tarn, a small lake 4350 metres above sea level on a slope of Mt. \JKenya\j, a dormant \Jvolcano\j in East \JAfrica\j whose top, another 300 metres up, is covered by permanent glaciers. The conclusion is that a sudden warming of climate, lasting several centuries, took place in equatorial \JAfrica\j some 2,000 years ago. The researchers took a core from the lake covering the period from 2250 BCE to 750 AD, based on carbon-14 dating.\p
The core contained fossils of various \Jalgae\j, and the researchers next looked at the oxygen isotope ratios in "biogenic \B\1opal\b\c", the siliceous remnants of the \Jalgae\j. Cooler \Jwater\j leads to the collection of more O-18 in the biogenic opal than will be found when the \Jalgae\j are living in warmer \Jwater\j.\p
The lake \Jwater\j warmed significantly, by about 4 degrees \JCelsius\j, between the years 350 BCE and 450 AD, reflecting a warming of climate in equatorial East \JAfrica\j, which seems to match up with warm periods occurring during approximately the same period in two other parts of the world - in the Swedish part of Lapland and in the northeastern St. Elias Mountains (southern \JYukon\j Territory and \JAlaska\j). \p
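As a purely illustrative sketch of the kind of conversion involved, the Python fragment below turns a shift in the oxygen isotope ratio into a temperature change, using an assumed linear calibration slope in the range often quoted for biogenic silica; the actual calibration used in the Hausberg Tarn study is not reproduced here.\p

# Illustrative only: converting a shift in the oxygen isotope ratio
# (delta-O-18) of biogenic opal into a temperature change. The slope
# below is an assumed round figure, NOT the study's own calibration.

SLOPE = -0.3   # assumed per-mil change in delta-O-18 per degree Celsius

def temperature_change(delta_o18_shift):
    """Temperature change (degrees C) implied by a delta-O-18 shift (per mil)."""
    return delta_o18_shift / SLOPE

# Under this assumed calibration, a warming of about 4 degrees C,
# as reported for Hausberg Tarn, would show up as roughly a 1.2
# per-mil *decrease* in delta-O-18.
print(temperature_change(-1.2))   # -> 4.0 degrees of warming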
This finding gives us some insight into how the climate fluctuated naturally long before modern industries began releasing large quantities of gases, which would boost the normal natural \B\1\Jgreenhouse effect\j\b\c, into the \Jatmosphere\j. This in turn may allow us to distinguish between natural climate variability and the global warming which most scientists believe to be affecting our \Jplanet\j in recent years due to human factors.\p
\BKey names\b: Aldo Shemesh, Wibjorn Karlen and Miri Rietti-Shati\p
#
"Most scientists but not all . . .",762,0,0,0
(Aug '98)
" The great tragedy of science - the slaying of a beautiful hypothesis by an ugly fact.", said T. H. Huxley, many years ago. The global warming hypothesis is still alive and well, but there are a few ugly facts sniffing around which need either to be dealt with or explained.\p
The problem is that \Jsatellite\j data on \Jearth\j temperatures do not reflect the patterns which seem to be present in other data. Either there is a fault in the instruments, or the global warming phenomenon is more complex than we thought. Scientists around the world were reminded of this by a paper published in \INature\i in mid-August by Wentz and Schabel.\p
A number of systematic errors have been detected in results obtained from different sources, and even after these have been corrected, there remain some discrepancies between data gathered on the surface, and those gathered from higher up. The arguments remain too technical to digest easily here, but while the best data seem to reflect a small overall warming of about a hundredth of a degree \JCelsius\j each year, other data suggest cooling levels almost as great. Whether or not global warming is happening, the argument about the phenomenon is certainly generating a great deal of heat.\p
We will keep you posted on this dispute, once the data begin to make a bit more sense, and once the heat has gone off the polemic a little.\p
#
"Turtles win standing",763,0,0,0
(Aug '98)
In an uproarious story which sounds suspiciously fictional, but which really happened, three turtle species have won a legal battle to appear in court. They are a loggerhead turtle (\ICaretta caretta\i), a \B\1green turtle\b\c (\IChelonia mydas\i) and a \B\1leatherback turtle\b\c, and they won their appeal in a suit filed under the Endangered Species Act against a \JFlorida\j county, in the 11th U.S. Circuit Court of Appeals.\p
Named by species as lead parties, the first two turtles complained through their attorneys that by allowing beach driving, the county inadequately protected turtle hatchlings that emerge at night on the Atlantic Ocean beach. They lost their case at the District Court level, but won all three issues on appeal in a 2-1 majority decision. The Appeals Court also allowed the leatherback to join as a \Jplaintiff\j in the litigation, once again reversing a District Court ruling.\p
The Endangered Species Act allows \B\1\Jendangered species\j\b\c or threatened animals to file suit, and the turtles join the marbled murrelet, a threatened seabird that sued Secretary of the Interior Bruce Babbitt, and the palila, an endangered bird that sued Hawaii environmental regulators, as animals with legal standing.\p
The main problem in the case is lighting. Hatchlings emerge at night during the summer and early fall (autumn) months, and instinctively head towards light. In an undeveloped area, the light source would be moonlight reflecting off the surf, but vehicles on the beaches use lights and confuse the young turtles, which may then head away from the sea, and die.\p
Further court cases are likely to follow, but for now, the turtles appear to be ahead on points.\p
#
"Loggers may not be so harmful after all",764,0,0,0
(Aug '98)
Conventional wisdom has it that logged \Jrainforest\j is damaged \B\1rainforest\b\c, and that even selective logging causes a forest to suffer a decline in tree species diversity. A late-August report in \IScience\i questions the usual assumption that young trees do not have much of a fighting chance against the erosion, soil compaction, and loss of tree canopy shelter caused by logging. In fact, it goes further, and states that within a decade of selective logging, forests can recover levels of tree species diversity similar to those in unlogged areas.\p
The research was carried out in Kalimantan (Indonesian \JBorneo\j), where researchers counted the number of species of trees on parcels of land in the rain forest that had been selectively logged for tall trees, either one or eight years ago, and in nearby areas which had not been logged because they were inaccessible to heavy machinery. Areas that had been logged one year before had 43% fewer tree species than unlogged forests. Forests logged eight years ago, however, had a diversity similar to forests that were not logged.\p
The conclusion of Charles Cannon, the main researcher involved, is that while logging can never improve the forest, it would be a mistake to write off forest which has been selectively logged as good for nothing but conversion to agricultural land.\p
Indonesian government rules allow the logging only of trees with a dbh (diameter at breast height) of at least 50 cm, so the researchers looked ahead to the next generation of logging, and recorded younger trees with a dbh of between 20 and 30 cm. Puzzlingly, while the one-year sites showed lower levels of diversity, after just eight years, this deficiency had been made good somehow.\p
(The diameter at breast height, or dbh, is taken to be the diameter of the tree trunk at a height of 4'3" or 130 cm above ground level. It is measured with a "dbh tape", which is wrapped around the tree trunk at the right height - every forester can show you where that is on his or her body. The tape is marked in graduations which are 3.1416 cm apart on a metric tape, and 3.1416 inches apart on a non-metric tape, so that the \Jcircumference\j is automatically converted to a diameter measurement, so long as we assume that the trunk is circular in cross-section.)\p
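A minimal sketch of the dbh tape's arithmetic, for the curious - the function and the sample figure below are illustrative only:\p

# How a dbh tape works: the tape measures circumference, but its
# gradations are pi (3.1416) units apart, so the figure read off is
# the diameter, assuming a circular trunk cross-section.
import math

def dbh_from_circumference(circumference_cm):
    """Diameter at breast height implied by a measured circumference."""
    return circumference_cm / math.pi

# A trunk with a 157.1 cm circumference reads as a dbh of about 50 cm,
# right on the Indonesian legal logging threshold mentioned above.
print(round(dbh_from_circumference(157.1), 1))   # -> 50.0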
\BOther key names\b: David Peart and Mark Leighton\p
#
"Dioxins still a problem",765,0,0,0
(Aug '98)
The most worrying organic pollutants today are the persistent chlorinated organic compounds, such as DDT, hexachlorobenzene, the \B\1dioxin\b\c group and the \B\1PCBs\b\c (polychlorinated biphenyls). While the release of these compounds is now being limited, the threat is still there, because the compounds are long-lasting, the Swedish Environmental Protection Agency has warned. Worse, there are other similar pollutants around.\p
The organochlorines, as a group, are not new. Benzene hexachloride or hexachlorocyclohexane (a different compound from the hexachlorobenzene mentioned above), also known as Lindane, gammexane, or BHC, was discovered in 1825 by \B\1Michael \JFaraday\j\b\c. It is formed by treating benzene with \Jchlorine\j in the presence of light, but the actual product is a mixture of nine \Jisomers\j, with only the gamma isomer being active. The insecticidal power of the mix was found in 1943: between 13 and 18% of the product is the gamma isomer.\p
The Swedish Environmental Protection Agency produced a report, \IMonitor 16\i, during August. This is a broad-based popular account of ten years of Swedish research in the field of persistent organic pollutants, which has also been published in English. It was written by Claes Bernes at the Swedish Environmental Protection Agency. The report was released in the run-up to \JDioxin\j'98, a Stockholm conference attended by 650 scientists from around the world from August 17 to 21. (The formal title, which may be useful for literature searches, is Halogenated Environmental Organic Pollutants, \JDioxin\j'98.)\p
The \JDioxin\j'98 Highlights included:\p
\BLinda Birnbaum\b looked at the harmful effects on the immune systems and sexual development of a number of animal species exposed to low doses of these pollutants.\p
\BJanneche Utne Skaare\b argues that in general, more northerly animals in Scandinavia have higher levels of pollutants, based on studies of \Jpolar bear\js, porpoises, glaucous gulls and \JArctic\j fox, although this view is questioned by other researchers.\p
\BBjörn Helander\b reported that the white-tailed eagle of the Baltic Sea has gone from suffering the highest concentrations of toxic pollutants among birds of prey in the world to much lower levels. Some signs remain - female birds born during the 1960s and 1970s are still laying, but fewer of their eggs hatch than those of younger females, and this remains unexplained for the moment.\p
\BCourtney Sandau\b reported on blood levels of pollutants in \JInuit\j (Eskimo) people from Canada, concentrating on people who have a diet largely of fatty fish contaminated with PCBs. The blood levels of OH-PCBs, reaction products formed from PCBs in the blood, are on average 30 times higher than in other populations (control group), but levels of up to 100 times higher than "normal" can occur. \p
\BLinda Spoonster Schwartz\b indicated that women Vietnam war veterans who were exposed to Agent Orange show poorer health, even today. Agent Orange, a \Jdefoliant\j/\Jherbicide\j mixture used during the Vietnam war, contained the most toxic of the \Jdioxin\j group, a compound known as TCDD.\p
\BLars Rylander\b shows that women smoking ten or more cigarettes a \Jday\j and eating Baltic Sea fish contaminated with PCBs seem to have reduced fertility. In addition to this, women who frequently eat fish from the Baltic Sea run an increased risk of having children with low birth weight.\p
Several reports dealt with endocrine (hormonal) problems caused by pollutants. Tetrabromobisphenol-A (TBBP-A) is the most widely employed brominated flame retardant. Unfortunately, it competes with thyroxin, a hormone produced by the thyroid \Jgland\j, and essential to a proper metabolic balance. TBBP-A and other brominated flame retardants have been shown to accumulate in the brains of unborn mice, as do some PCBs.\p
The second report deals with Swedish and \JLatvian\j men who eat fish: they showed significant levels of chemicals expected to affect hormonal levels, although no disturbances in hormonal levels were discovered.\p
Thyroxin also influences blood sugar levels, and Geoffrey Calvert described a study of American workers exposed to dioxins and their tendencies to develop diabetes. The study showed that workers with a high exposure had higher blood sugar levels than those with low exposure, and a slight increase in the number of diabetes cases was noted in the most exposed workers.\p
\BPer Eriksson\b reports that ten-\Jday\j-old mice exposed to a few micrograms of DDT will suffer permanent damage to the central \Jnervous system\j. There are no external signs of damage, but the mice show reduced learning ability and hyperactive behaviour, suggesting that there has been permanent brain damage. PCBs and the brominated flame retardants can also cause similar effects, even in small doses.\p
\BJacob de Boer\b has found the flame retardants present in a variety of whale, seal and dolphin species, in amounts that he describes as alarming. The brominated substances have also been found in sperm whales, which seek food at great depths, indicating that these pollutants are found at several hundred metres depth in the Atlantic.\p
\BKoidu Norén\b found brominated flame retardants in mother's milk from Swedish women.\p
\BDwain Winters\b has been studying American canned food which can be accurately dated, some going back as far as 1908, and he finds that the levels found in food over that time are similar to those in the environment, peaking in the late 1960s, and falling since then.\p
#
"Canola oil hits the market place",766,0,0,0
(Aug '98)
Canola oil (see \BSafer oil\b, January 1998) is now closer to being available over the counter. Duane Johnson, who developed the canola-based \Jlubricant\j, has just signed an agreement with a manufacturer. The oil product, which also contains soybean oil, is now claimed to be almost 100% recyclable, and Johnson says the waste from crushing the canola seeds can be used as a \Jcattle\j feed.\p
In use, says Johnson, the oil produces 30% less hydrocarbon \Jpollution\j than \Jpetroleum\j-based oils, and the oil can be more easily disposed of, once it has been used - or recycled into greases and chain oils. In the first trials, the oil will be available in \JMichigan\j (USA), with every chance of the market extending after that, according to the distributors.\p
The vexed question of gaining certification from the American \JPetroleum\j Institute has progressed no further since January, because the API sees the oil as a non-\Jpetroleum\j product. Unfortunately, \Jautomobile\j manufacturers require this before they will agree to the oil's use in warrantied engines. Johnson reports that many government agencies are testing the oil for use in government engines.\p
#
"Aerosonde: good news and bad news",767,0,0,0
(Aug '98)
Last month (see \BTaking the automatic pilot a step further\b, July), we brought you news of the plans to fly a pilotless \Jaircraft\j across the Atlantic Ocean. The first attempt, \Jaircraft\j Trumper, never arrived at its destination, and may have crashed "anywhere from 30 to 3000 km away", according to reports sent out to a select band of \JInternet\j fans who were following the events, your reporter among them.\p
A second craft, Piper, was launched about three hours later, to give the team two chances of using the fine \Jweather\j that was available then. Piper was lost about one minute after take-off, due to a problem involving autopilot initialisation when manual control commands are blocked by communications fading. The reason why communication fades are being seen in the immediate vicinity of the take-off point at Bell Island airport remains a mystery.\p
The third \Jaircraft\j, Laima, was readied for launch at the next fair-\Jweather\j opportunity. A \Jday\j or so ahead of Laima's launch, \Jweather\j looked good at Bell Island, and we were told that "the en route winds are expected to be well within the acceptable range, and the Benbecula \Jweather\j for Friday, while far from fair, also is expected to be quite acceptable".\p
Laima's launch time was 0959 UTC (0729 local time) on August 20. The Aerosonde fleet brought to Bell Island included Millionaire, Insitu's 3-year-old workhorse. In view of Piper having been lost to an accident, the team put Millionaire into the transatlantic adventure. It was launched some three hours after Laima, but never arrived at Benbecula.\p
The team reported to us that " . . . it went out in a good cause. We came to the North Atlantic prepared to lose \Jaircraft\j in order to make our point; next we must concentrate on making the design reliable, so that long-distance operation will not be a matter of uncertainty and suspense."\p
Laima's launch weight was 13.1 kg, including 4.9 kg of fuel, and it landed at 1244 UTC on August 21, making it the first pilotless \Jaircraft\j to cross the Atlantic, as well as the smallest \Jaircraft\j, and the slowest flight, ever to do so. The \Jweather\j en route had good tailwinds with lots of heavy rain, the trip was 3270 km, and the amount of fuel used was probably between 3.8 kg and 4.3 kg - the Benbecula crew did not have a scale suitable to weigh the plane, and figures have not yet been posted.\p
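A quick back-of-the-envelope calculation from the figures above - the reported times, the distance, and the stated probable fuel range - gives Laima's average speed and fuel economy:\p

# Rough figures from the numbers reported above; the fuel burn is the
# stated probable range, so the results are approximate.
from datetime import datetime

launch = datetime(1998, 8, 20, 9, 59)    # 0959 UTC, 20 August
landing = datetime(1998, 8, 21, 12, 44)  # 1244 UTC, 21 August
distance_km = 3270.0
fuel_kg_low, fuel_kg_high = 3.8, 4.3     # reported probable fuel burn

hours = (landing - launch).total_seconds() / 3600
print(f"flight time:   {hours:.2f} h")                   # ~26.75 h
print(f"average speed: {distance_km / hours:.0f} km/h")  # ~122 km/h
print(f"fuel economy:  {distance_km / fuel_kg_high:.0f}"
      f"-{distance_km / fuel_kg_low:.0f} km per kg of fuel")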
Perhaps we should let the team's correspondent close this story: "The flight time, and the position vs time as logged onboard, were very close to the estimates made using \JAviation\j model from the US National \JWeather\j Service - so perhaps we don't need Aerosondes after all! On the other hand the \Jweather\j at landing, as far as I can tell from the video, was a good deal better than forecast - windy but without the expected overcast."\p
#
"SOHO found again",768,0,0,0
(Aug '98)
SOHO, the Solar and Heliospheric Observatory, has been contacted again by ground controllers. Lost on June 25, SOHO was contacted sporadically on 3 August. The transmissions came in 10- to 15-second bursts, as SOHO's solar panels rotated in and out of the sunlight, and on 8 August, controllers told SOHO to use the power from its panels to charge up one of its batteries. This succeeded, and by August 9, full communication for a period of several minutes was achieved.\p
\JTemperature\j readings suggest that the \Jsatellite\j is cold enough to have its fuel tanks frozen. Now an expert team of engineers will begin to bring the \Jspacecraft\j back from the brink. They will try to find a way to gradually return power to SOHO's many heaters, which normally keep the craft at around room \Jtemperature\j, without damaging frozen components.\p
#
"The Wife of Bath's chloroplasts",769,0,0,0
(Aug '98)
When \B\1Geoffrey Chaucer\b\c wrote his \ICanterbury Tales\i, he wrote in English, rather than the Latin that scholars and literary people favoured in those times (and in some cases, for several centuries more). Luckily, the English tongue had just turned the corner, so what he wrote then is intelligible to modern readers, so long as they try hard. Less luckily, printing had not been invented when he wrote, so all the early copies were individually made from existing hand-written copies. Just as English has evolved since Chaucer's time, so did the text of his work, as copying progressed, with the errors mostly being carried forward into new generations of copies.\p
That raises the question: which of the many copies is the one that we should accept today, as being closest to Chaucer's original? According to an article by Christopher Howe in late August in \INature\i, the answer may lie in the methods used by evolutionary biologists to construct biological family trees - of chloroplasts, for example - by comparing inherited changes in DNA.\p
The reasoning is simple: each copy is a descendant of the earlier version it was copied from, and unique changes and errors, when transmitted, will show the path of descent. Later, once printing existed, copies could be compared against a standard edition, but for the manuscript era, a computerised \Jdatabase\j of 58 copies of "The Wife of Bath's Prologue," from \IThe\i \ICanterbury Tales\i, all transcribed before 1500, could be a useful starting point, according to Howe. The computer has identified 11 copies that seem to have no close relatives. The unique pattern of changes and errors suggests that these may have been copied directly from Chaucer's lost original, or from early copies of the original.\p
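The logic is easy to sketch. The Python fragment below treats each manuscript as a list of readings at agreed comparison points and scores pairwise differences, exactly as a biologist would score DNA changes; the readings shown are invented placeholders, not real variants from the database.\p

# A minimal sketch: score pairwise differences between manuscript
# copies and report each copy's closest relative. The "readings"
# below are invented placeholders, not real Canterbury Tales variants.

manuscripts = {
    "CopyA": ["whan", "aprill", "shoures", "soote"],
    "CopyB": ["whan", "aprille", "shoures", "soote"],
    "CopyC": ["when", "aprille", "showres", "sote"],
}

def distance(a, b):
    """Number of comparison points at which two copies disagree."""
    return sum(1 for x, y in zip(a, b) if x != y)

for name, readings in manuscripts.items():
    others = [(distance(readings, r), n)
              for n, r in manuscripts.items() if n != name]
    d, nearest = min(others)
    print(f"{name}: closest relative {nearest} ({d} differences)")

# Copies that share unusual readings cluster together; a copy with no
# near neighbours may descend directly from a lost early exemplar.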
#
"The earth's magnetic field",770,0,0,0
(Aug '98)
Data were published in \INature\i during August, giving a number of new measures of the intensity of the \JEarth\j's magnetic field over the past 160 million years. Such information is difficult to extract from rocks, and gets harder as the rocks get older, so that almost a third of the previous data related to the last 5 million years. There were only about a hundred data points in the 5-160 Myr range; now another 21 have been added, based on submarine \Jbasalt\j glasses collected from locations throughout the world's oceans - previous data tended to come from restricted parts of the world.\p
The main finding is that the average magnetic field intensity (average dipole moment), rather than being similar to what we experience today, may have been only half of that. In itself, this sounds rather unimportant, but geologists trying to model the convective processes in the \JEarth\j's core which have been responsible for generating the magnetic field, will find this important in unravelling what is happening at the \Jearth\j's centre.\p
#
"Rich couple pays to have pet cloned",771,0,0,0
(Aug '98)
While the extension of Dolly-the-sheep methods to humans is definitely off-limits for all scientists, a rich American couple has just contributed US$2.3 million to cover the costs of \Jcloning\j Missy, their beloved dog, a 12-year-old, whose traits suggest a mix of border collie and husky.\p
The scientist, Mark Westhusin, has already succeeded in producing puppies by transplanting embryos into a surrogate mother, putting him ahead of his US competitors, but a lot of basic research will be needed before any \Jcloning\j attempt, since far more is known about cows, sheep, and mice than about canine reproductive \Jbiology\j. Nonetheless, succeed or fail, the effort will provide a lot of useful information.\p
Samples of Missy's skin and mucosal cells have been taken, and a Web site has been created to provide further information about progress and ethical standards on the project, dubbed Missyplicity. The site's address is \Bhttp://www.missyplicity.com\b\p
\BOther key names\b: Bob Burghardt and Duane Kraemer\p
#
"The moon's atmosphere",772,0,0,0
(Aug '98)
According to most people, the moon lacks an \Jatmosphere\j, and while this is good enough for most purposes, it is not strictly true. The Apollo program identified \Jhelium\j and \Jargon\j atoms there, and \JEarth\j-based observations of the moon added sodium and \Jpotassium\j ions to the list in 1988, but these four elements seem to account for only about 10% of the density of the lunar \Jatmosphere\j. Now a report which will appear soon in \IGeophysical Research Letters\i has added even more atoms to the inventory of the moon's thin \Jatmosphere\j. (Unusually, this research has been released on the \JInternet\j before being published in a printed journal.)\p
The Suprathermal Ion Spectrometer (STICS) instrument aboard the WIND \Jspacecraft\j has identified ions of several elements, including oxygen, silicon, and \Jaluminium\j, though only in small amounts. The quality and quantity of the STICS measurements will increase considerably in November 1998, when WIND will spend an extended period of time near the Moon.\p
#
"Neutrino discoverer dies",773,0,0,0
(Aug '98)
Physicist Frederick Reines, who shared the 1995 \B\1Nobel Prize\b\c for his discovery of the neutrino, died on August 26 after a long illness. Reines was one of the founders of neutrino physics, and one of the pioneers in developing the huge detectors that are used to ensnare these elusive particles.\p
#
"September, 1998 Science Review",774,0,0,0
\JMore magnetar news\j
\JA black hole at the centre of the Milky Way\j
\JThe Leonids are coming!\j
\JMaglev as a cheap way into space\j
\JTwo new planets\j
\JPlanets galore?\j
\JJupiter's rings explained\j
\JWhat are little planets made of?\j
\JPhotonic crystals open the door to faster computing\j
\JModems with attitude\j
\JHiding in a crowd\j
\JMathematical news\j
\JJet-lagged blue-green algae?\j
\JSpreading the genes\j
\JNew prostate cancer gene\j
\JBowel cancer gene\j
\JTake two aspirin and call me . . .\j
\JDiabetes and cancer\j
\JDiabetes and fat\j
\JThe Russian TB is coming!\j
\JGreen coffee beans to cure AIDS?\j
\JMad cow disease detection\j
\JInfluenza -- the pigs' disease\j
\JNew flu drug\j
\JAnd yet another flu drug\j
\JTransplant news -- the Frankenstein option?\j
\JMalaria outbreak in Honduras\j
\JKiss of illness?\j
\JAnorexia and zinc\j
\JMaking smokers\j
\JDid we get a kick-start in winter?\j
\JNorthern exposed dinosaurs\j
\JCoelacanths in a new home\j
\JA new look at adaptation\j
\JWhy is South Africa so high?\j
\JFinding fault\j
\JOzone hole gets larger\j
\JSafer air conditioning for cars\j
\JLead in the environment\j
\JLadybird, ladybird, please stay at home\j
\JSnowballs may be figments after all\j
\JLaima update\j
\JSOHO looking good\j
\JRichard Leakey rises again\j
\JJonathan Mann and Mary Lou Clements-Mann\j
#
"More magnetar news",775,0,0,0
(Sep '98)
Although an intense wave of gamma rays, coming from a catastrophic magnetic flare on a magnetar 20,000 light years away, struck the \JEarth\j's \Jatmosphere\j on August 27, 1998, it was not until late September that any information was released on the subject. (See \BMagnetic Quakes Shake Neutron Stars\b, May 1998 for earlier details)\p
The gamma rays posed no health risk to humans -- by the time they reached those sleeping beneath them, the power was about one-tenth that of a dental x-ray. All the same, as they hit the night side of our \Jplanet\j, the rays produced a strong blast of ionisation, up to normal daylight levels, and powerful enough to drive sensitive detectors to maximum or off-scale on at least seven scientific \Jspacecraft\j in \JEarth\j \Jorbit\j and around the \Jsolar system\j. In one case, the Rossi X-ray Timing Explorer (RXTE) was actually pointed in a different direction, and the RXTE's Proportional Counting Array detected the wave through the shielding intended to keep stray radiation out. Then the array locked up because of the overload.\p
The magnetar, SGR1900+14, was already under close scrutiny from a team of scientists led by Dr. Chryssa Kouveliotou of NASA's Marshall Space Flight Center in \JHuntsville\j, \JAlabama\j, because, as a soft gamma repeater (the SGR of the star's name), it was also emitting faint x-rays, which pulsed regularly in intensity every 5.16 seconds. More importantly, data from other sources showed that the x-ray pulses were slowing down, very gradually, but at a rate which can be measured.\p
This indicates that the SGR has a magnetic field about 800 trillion times stronger than \JEarth\j's magnetic field, and about 100 times stronger than any other field found anywhere in the Universe, which is what Robert Duncan and Christopher Thompson predicted in their 1992 magnetar theory, but before this confirmation could be published, SGR1900+14 flared up.\p
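The arithmetic behind such a claim can be sketched with the standard magnetic-dipole estimate for a spinning neutron star, B of about 3.2 x 10^19 times the square root of P x Pdot, in gauss. The spin period is the observed 5.16 seconds; the spin-down rate used below is an assumed round figure of the order reported for SGR1900+14, not an exact published value:\p

# Hedged illustration of inferring a field strength from pulse timing,
# using the standard dipole estimate B ~ 3.2e19 * sqrt(P * Pdot) gauss.
import math

P = 5.16        # spin period in seconds (observed)
P_DOT = 1e-10   # assumed spin-down rate, seconds per second (illustrative)

B_gauss = 3.2e19 * math.sqrt(P * P_DOT)
EARTH_FIELD = 0.5   # typical surface field of the Earth, in gauss

print(f"inferred field: {B_gauss:.1e} gauss")
print(f"= {B_gauss / EARTH_FIELD:.1e} times the Earth's field")

# ~7e14 gauss, hundreds of trillions of times the Earth's field --
# broadly consistent with the figure quoted above.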
The ionisation event also showed evidence of a 5.16 second pulsation, and it may explain earlier ionisation events noted in 1983 and again in 1996. The actual cause seems to have been an out-of-control magnetic field realigning itself in a manner similar to what happens inside solar flares, and it now implies that there may be millions of "dead" magnetars out in space, sitting quietly, without radiating, within the remnants of supernovae.\p
Anomalous X-ray Pulsars (AXPs) are neutron stars whose slow periods and odd x-ray spectra indicate they are old, yet they are associated with young \Jsupernova\j remnants. We now can fairly safely assume that magnetars spend several millennia spinning and creating a great, erratic fuss, bursting in gamma radiation as SGRs. Then they slow down as their "juice" is depleted and glow as AXPs for several tens of thousands of years. \p
Finally, they fade to near invisibility, explaining why so many \Jsupernova\j remnants appear to have no pulsars at all, and at the same time explaining where some of the universe's missing \1\Jdark matter\j\c (see next story as well) has gone.\p
According to Duncan and Thompson, ultra-strong magnetic fields are generated only in neutron stars that are born spinning very rapidly, and these become magnetars. Those rotating more slowly at birth become radio pulsars, according to this model.\p
#
"A black hole at the centre of the Milky Way",776,0,0,0
(Sep '98)
What makes our galaxy go around? Because of the speed at which it rotates, astronomers have believed for some time that there is a large amount of \1\Jdark matter\j\c at the centre of the \1Milky Way\c, perhaps even a \1black hole\c, and this speculation now appears to be confirmed. Andrea Ghez, of the University of \JCalifornia\j-Los Angeles, reported to the Central Parsecs Galactic Center Workshop '98 in Tucson, \JArizona\j, during September on her studies of the centre of our galaxy.\p
Black holes cannot be seen, of course, but they can still be detected indirectly. Ghez started tracking the movement of 200 stars near the galactic centre in 1995, using the Keck I \JTelescope\j on top of Mauna Kea in Hawaii. At least twenty of the stars showed clear signs of influence by extreme gravitational forces. They are spiralling around the black hole at speeds of up to 5 million km/hr (3 million mph), about 10 times the speed at which stars typically move, and this implies that there must be a massive object -- Ghez estimates it to be 2.6 million times the mass of our \Jsun\j, all in one black hole.\p
Ghez used a method called infrared speckle interferometry to analyse thousands of high-speed, high-resolution snapshots by computer, about the only way to see action which is happening some 24,000 light-years away. This results in an image that has at least 20 times better resolution than those made by traditional earthbound imaging techniques. In 1995, Ghez managed to witness the disappearance of a star that was, at the time, the closest object to the black hole, but whether the star was sucked into the black hole, or simply went behind it, scientists may never know. But at least we know where a bit more of the missing \Jdark matter\j is, now.\p
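For a roughly circular orbit, the enclosed mass follows from M = v^2 r / G. In the sketch below, the speed is the figure quoted above, while the orbital radius is an assumed illustrative value, since none is given in the report:\p

# How orbital speeds translate into an enclosed mass, assuming a
# roughly circular orbit: M = v**2 * r / G.

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
SOLAR_MASS = 1.989e30    # kg
PARSEC = 3.086e16        # m

v = 5e6 * 1000 / 3600    # 5 million km/h, converted to m/s (~1389 km/s)
r = 0.006 * PARSEC       # assumed orbital radius of a close-in star

mass_kg = v**2 * r / G
print(f"enclosed mass: {mass_kg / SOLAR_MASS:.1e} solar masses")

# ~2.7e6 solar masses for these inputs, close to the 2.6 million
# quoted above; the result scales linearly with the assumed radius.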
#
"The Leonids are coming!",777,0,0,0
(Sep '98)
November 1998 should see an excellent display of meteors around the 17th. The \1Leonids\c are a \Jmeteor\j shower which is an annual event, caused by debris from the \Jcomet\j Tempel-Tuttle, and reaches a peak every 33 years or so. They were memorably recorded in 1833 as a spectacular sound and light show, even making hissing, crackling and popping sounds, probably 'electrophonic sounds', produced by radio waves generated in the meteoroids' wakes.\p
The meteors all seemed to radiate out from the same point in the \Jconstellation\j Leo, but this is merely an illusion of perspective, as the pieces are all travelling on parallel paths: they seem to radiate from one point in space exactly as parallel railway lines appear to radiate from a point on the horizon.\p
This was not the first appearance of the Leonids, which had put on a smaller show in the previous year, as the \Jearth\j passed through the \Jorbit\j of the \Jcomet\j, and earlier showers had been recorded in 1799 by the Prussian scientist and explorer Alexander von Humboldt from his camp in Cumaná, Venezuela. Humboldt wrote that there was "no part of the sky so large as twice the Moon's diameter not filled each instant by meteors." Other sightings were also reported from \JFlorida\j and \JGermany\j on the same night (November 12), and Humboldt was told by locals of a similar event observed in South America in 1766. Later studies took the commencement date back to at least the year 902.\p
The Leonid displays have been poor for almost two decades now, but already they are starting to build up again. While we may yet be disappointed in 1998 (as people were in 1899), we may do better in 1999 or 2000, because that pattern was set when 1900 and 1901 turned on spectacular displays.\p
The sounds of 1833 may present a problem, since their existence implies that some of the meteoroids must have been over a metre across -- too big to escape the \Jcomet\j in the usual way, by sublimation pressure, but large enough to make an awful mess of a \Jsatellite\j or space station. In 1966, we had very little hardware in space, but now we rely more and more heavily on space-based communications.\p
Will it be worth watching? Probably, but only if you are in the right place. Asia should probably see thousands of meteors per hour, in the early hours of the morning. \JAustralia\j will probably have good views as well, although the major population centres on the east coast will have to wait until just before sunrise to see anything. \p
Note the frequent use of "probably" in the details above -- the only thing that is certain is that somewhere around 19.45 hours (GMT, or Universal Time), give or take a few hours, people who are outside and looking upwards will see some "shooting stars", and they are quite likely to see quite a lot. Around the world, amateur astronomers are already organising themselves into \Jtelephone\j "hot line" networks, just in case . . .\p
#
"Maglev as a cheap way into space",778,0,0,0
(Sep '98)
\1Magnetic levitation\c, says NASA, may be the answer to getting into space cheaply. At the moment, rocket technology requires us to raise a huge mass of material up into the air, just to lift a very small payload into space. NASA's Advanced Space Transportation Program at the Marshall Space Flight Center in \JHuntsville\j, \JAlabama\j, is proposing that magnetic levitation technologies could reduce the cost of going to space so dramatically that everyday people could leave the \Jplanet\j, perhaps even just for a thrill ride. \p
What is more, tickets could be on sale just after the turn of the century, according to a report from NASA in late September. The most expensive part of any mission to low-\JEarth\j \Jorbit\j is the first few seconds when the payload is getting off the ground, but Maglev offers a low-cost alternative for space transportation, because it leaves the first-stage propulsion system on the ground.\p
The thrill ride option is a serious one. NASA and an industry partner are teaming with an amusement ride manufacturer and a British university for research into magnetic levitation, to "build the highway to space", while also providing some shorter-term cash flow. Technically a maglev launch-assist system, the "ride" would use electromagnetic forces to drive a space vehicle down a track. The carrier could be similar to a flatbed railcar, and when the system reaches 1000 kilometres an hour (600 mph), the magnetically levitated vehicle would catapult from the ground and switch on a rocket engine to reach \Jorbit\j.\p
The beauty of the system, says NASA, comes from the lack of moving parts and freedom from contact: with little friction and nothing to wear out, a single maglev system could help launch a \Jspacecraft\j from a typical airport runway to low-\JEarth\j \Jorbit\j every 90 minutes, and keep on working for thirty years.\p
A test track at the University of Sussex in Brighton, England, is already running a 2-foot-long (60 cm) sled at 120 mph (200 kph) along a 20-foot (6 metre) electromagnetic track. This track is actually an advanced linear induction motor that provides thrust, lift and guidance of the launch vehicle. The linear induction motor, effectively a rotary motor split in half and rolled out flat, produces thrust in a \Jstraight line\j, rather than by turning a shaft or gears.\p
During 1999, two larger tracks measuring 50 and 400 feet (15 and 120 metres) are planned in \JHuntsville\j. Design plans are scheduled to be finalised within two years for a one-mile (1.6 km) track capable of launching a 40,000-pound (18-tonne) payload at a test site.\p
As early as 2007, a maglev launch assist system could be used to launch very small communications satellites for thousands of dollars per pound. Within 20 years, this technology could be used to help launch much larger payloads to \Jorbit\j for only hundreds of dollars per pound. NASA regards this as a welcome contrast to today's launch costs of US$10,000 per pound. And when the cost gets down to the lower end, we can expect to start seeing tickets going on sale for a ride that would be literally off the \Jplanet\j.\p
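For a feel of what such a track would mean for its passengers, here is a minimal back-of-the-envelope sketch in Python. It assumes constant acceleration from rest to the quoted 1000 kilometres an hour along the one-mile track -- our simplifying assumption, not NASA's published design:\p
# Kinematics of the proposed one-mile maglev launch-assist track,
# assuming (illustratively) constant acceleration from a standing start.
v = 1000 / 3.6           # exit speed: 1000 km/h in metres per second
d = 1600.0               # track length: one mile is about 1600 m
a = v**2 / (2 * d)       # from v^2 = 2ad for constant acceleration
t = v / a                # time spent on the track
print(f"acceleration: {a:.1f} m/s^2 (about {a / 9.81:.1f} g)")
print(f"time on track: {t:.1f} s")
At roughly two and a half g for about eleven seconds, the figures really are closer to a roller-coaster than to a conventional rocket launch.\p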
#
"Two new planets",779,0,0,0
(Sep '98)
A report in \IScience News,\i late in September, identifies two new planets orbiting \Jsun\j-like stars. This brings to an even dozen the official number of such orbiting bodies, although this number is expected to increase quite soon.\p
One of the planets has an average distance from its star (HD210277, 68 light-years away from \JEarth\j) rather similar to that of our own \Jplanet\j. In other ways, the \1planet\c is very different from \Jearth\j, being at least 1.36 times as massive as Jupiter, and having a much more elongated \Jorbit\j. At its closest, the \Jplanet\j gets closer to its star than Venus does to our \Jsun\j, while it recedes to a distance farther away than Mars' average distance.\p
The other new \Jplanet\j orbits its parent star (HD187123, 156 light-years away from \JEarth\j) more closely than any other \Jplanet\j found so far, having a circular \Jorbit\j which lies at a distance less than one-ninth the average separation between the \Jsun\j and Mercury. More information can be found at a Web site called The Extrasolar Planets Encyclopaedia.\p
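That closeness implies a remarkably short "year". As a rough check, Kepler's third law (P^2 = a^3, with P in years and a in astronomical units) gives the sketch below; treating the star as having about one solar mass, and taking Mercury's mean distance as 0.387 AU, are our assumptions for the purpose:\p
# Kepler's third law: P^2 = a^3 (P in years, a in AU, one solar mass assumed)
mercury_a = 0.387                   # Mercury's mean distance from the Sun, AU
a = mercury_a / 9                   # "less than one-ninth" of that separation
period_days = (a ** 1.5) * 365.25
print(f"a = {a:.3f} AU  ->  orbital period roughly {period_days:.1f} days")
A planet skimming that close to its star completes each orbit in a little over three days.\p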
\IKey Names:\i R. Paul Butler and Geoffrey W. Marcy\p
#
"Planets galore?",780,0,0,0
(Sep '98)
Most stars do not form as isolated stars, but exist in multiple systems of stars orbiting each other. While \Jscience fiction\j has long accepted the idea of planets in multi-star systems, real scientists used to think that such systems would not form planets. A report in \INature\i in late September indicates that a system called L1551 IRS5 is a binary star, still in the process of forming, and around each star there is a small disc of gas and dust that may be turning into planets.\p
This is an exciting discovery, because the standard view has been that planets form, or at least occur, in the discs around young solar-type stars. Until now, it seemed likely that the dynamics of a binary system would affect the structure of the disk and perhaps interfere with \Jplanet\j formation. The discs were found by studying the core of the star-forming region L1551. \p
By achieving a linear resolution of seven astronomical units (less than the diameter of Jupiter's \Jorbit\j), the researchers were able to establish that the core of L1551 contains two distinct discs. These discs have a separation of 45 AU, suggesting that they are associated with a binary system. Both discs are spatially resolved, with semi-major axes of about 10 AU, about a factor of ten smaller than the discs around isolated stars. The disc masses are of the order of 0.05 solar masses, which could be enough to form planetary systems like our own.\p
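To put that resolution in perspective: the parsec is defined so that one astronomical unit subtends one arcsecond at a distance of one parsec, so the angle in arcseconds is simply the size in AU divided by the distance in parsecs. Assuming L1551, in the Taurus star-forming region, lies at roughly 140 parsecs (about 450 light-years) -- our adopted figure -- a short sketch gives the angular scales involved:\p
# Angular size in arcseconds = size in AU / distance in parsecs
d_pc = 140.0                          # assumed distance to L1551, in parsecs
for size_au in (7.0, 10.0, 45.0):     # resolution, disc radius, disc separation
    print(f"{size_au:4.0f} AU  ->  about {size_au / d_pc:.3f} arcseconds")
Separating structures a twentieth of an arcsecond apart is roughly like resolving a small coin at a distance of some 80 kilometres.\p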
And two weeks earlier, \INature\i carried another planetary report, this time of planets around a pulsar, PSR1257+12, which shows that planets may form around post-main-sequence stars.\p
#
"Jupiter's rings explained",781,0,0,0
(Sep '98)
\1Jupiter\c has a diameter of approximately 143,000 kilometres (86,000 miles). It has a ring system which begins about 92,000 kilometres (55,000 miles) from Jupiter's centre and extends to about 250,000 kilometres (150,000 miles) from the \Jplanet\j. In the late 1970s, NASA's Voyager \Jspacecraft\j first revealed the structure of Jupiter's rings: a flattened main ring and an inner, \Jcloud\j-like ring, called the halo, both composed of small, dark particles. One Voyager image seemed to indicate a third, faint outer ring.\p
The new \1Galileo\c data reveals that this third ring, known as the gossamer ring because of its transparency, consists of two rings. One is embedded within the other, and both are composed of microscopic debris from two small moons, Amalthea and Thebe. In summary, Jupiter's ring system is formed by dust kicked up as interplanetary meteoroids smash into the giant \Jplanet\j's four small inner moons, while the outermost ring is actually two rings, one embedded within the other. There is now every reason to expect that we will see similar systems at Saturn and the other giant planets, since similar faint rings are probably associated with their many small moons.\p
It appears that dust is kicked off the small moons when they are struck by interplanetary meteoroids, or fragments of comets and \Jasteroids\j, at speeds greatly magnified by Jupiter's huge gravitational field. The small moons are particularly vulnerable targets because of their relative closeness to the giant \Jplanet\j. By the time it reaches one of the moons, a typical meteoroid is going so fast it buries itself deep in the moon, where the \Jenergy\j is used to vaporise the meteoroid, blasting off debris at such high velocity that it escapes the moon's gravitational field.\p
To be a dust source, the moon has to be small enough to have a low gravitational field, and with a diameter of just eight kilometres (five miles) and an \Jorbit\j that lies just at the periphery of the main ring, tiny Adrastea is "most perfectly suited for the job" according to NASA scientists. Once they are separated from their moon, the particles are still left travelling in much the same \Jorbit\j as the source \Jsatellite\j, and this causes them to fill the orbital space around Jupiter with a disc of particles. NASA's \JGalileo\j \Jspacecraft\j has been orbiting Jupiter and its moons for more than two years, and NASA maintains an excellent information site on the Web at http://www.jpl.nasa.gov/\Jgalileo\j\p
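Just how feeble the gravity of such a moon is can be estimated in a few lines, treating Adrastea as a uniform sphere; the rock-like density of 2000 kg/m^3 is our illustrative guess, not a measured value:\p
# Escape velocity of an Adrastea-sized moon, modelled as a uniform sphere
import math
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
r = 4000.0                # radius in metres (eight kilometres across)
rho = 2000.0              # assumed bulk density, kg/m^3
M = (4.0 / 3.0) * math.pi * r**3 * rho
v_esc = math.sqrt(2 * G * M / r)
print(f"escape velocity: about {v_esc:.1f} m/s")
The answer, about four metres per second, is no more than human running speed, so almost any impact debris leaves the moon for good.\p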
#
"What are little planets made of?",782,0,0,0
(Sep '98)
The inner planets of the \Jsolar system\j are, according to all of the standard theories, assumed to be formed from the same material. So why do the inner planets exhibit different mean densities? The question has been around for fifty years or more, but it was back in the news in mid-September when a Science report by Connie Bertka and Yingwei Fei of the Carnegie Institution's Geophysical Laboratory and Center for High Pressure Research revealed a new analysis of data from the Mars \1Pathfinder\c Mission.\p
According to the report, one current theory explaining density variations is wrong, and it goes on to suggest that future modellers of \Jplanet\j formation close to the \Jsun\j ("inner \Jsolar system\j \Jaccretion\j") must be able to explain a set of inner planets with differing elemental compositions.\p
It now appears that the bulk elemental composition of Mars does not match the composition of a type of primitive \Jmeteorite\j called a C1 carbonaceous chondrite. Until now, the C1 chondrites have been taken as the standard for the abundance ratios of non-volatile elements in the inner \Jsolar system\j, especially the iron/silicon (Fe/Si) ratio. The C1 chondrites show refractory element abundance ratios similar not only to those of the \Jsun\j's \Jatmosphere\j, but also to lunar and terrestrial samples.\p
The inescapable conclusion seemed to be that the C1 chondrites represent the original parent material from which the inner \Jsolar system\j was built, and that the terrestrial planets (with the exception of Mercury) reflect the same basic non-volatile element composition. The differences between the mean densities of the different planets were set down to the form in which the elements occurred: metallic iron is much more dense than the same material when it is incorporated into silicate minerals.\p
Previously, scientists thought that Mars might be a problem for the C1 model, but the killer came when the Mars Pathfinder mission brought home a definitive value for Mars's moment of \Jinertia\j, usually referred to as C. The value of C is one of the items of data needed to work out a \Jplanet\j's bulk composition.\p
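Why does C constrain composition? The moment-of-inertia factor C/MR^2 is exactly 0.4 for a uniform sphere, and falls as mass concentrates towards the centre. The two-layer toy planet below, with Earth-like numbers of our own choosing (an illustration of the principle, not the Pathfinder analysis itself), shows the effect:\p
# C/MR^2 for a planet with a uniform core inside a uniform mantle
import math

def inertia_factor(R, r_core, rho_core, rho_mantle):
    M = (4 / 3) * math.pi * (rho_core * r_core**3 + rho_mantle * (R**3 - r_core**3))
    C = (8 / 15) * math.pi * (rho_core * r_core**5 + rho_mantle * (R**5 - r_core**5))
    return C / (M * R**2)

print(inertia_factor(6371e3, 1.0, 5500, 5500))       # no core: exactly 0.400
print(inertia_factor(6371e3, 3480e3, 11000, 4500))   # dense core: about 0.35
A dense iron core pulls the factor well below 0.4, so a measured C, together with a planet's mass and radius, limits how much iron can be hidden inside it, and in what form.\p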
Taking the C value for \JEarth\j and its mean density, and given an understanding of high-pressure mineral phase transitions in its interior, a good case can be made for "a calculated non-volatile element bulk composition equivalent to that of a C1 chondrite". In other words, the C1 chondrite theory held up.\p
Unfortunately, the C value for Mars cannot meet a C1 composition and still conform to known geophysical and geochemical constraints (including the new value for C and a bulk composition derived from a set of Martian meteorites). However you look at it, the elemental composition of Mars is clearly different from that of C1, and of \JEarth\j. And once the model fails on one \Jplanet\j, there is no good reason to apply it to the other planets. As they used to say in the early days of space exploration, "back to the drawing-boards".\p
#
"Photonic crystals open the door to faster computing",783,0,0,0
(Sep '98)
The US Department of \JEnergy\j's Sandia National Laboratories announced an interesting development during September. It involves interlocking tiny slivers of silicon into a lattice, which appears to have solved a major technical problem: how to bend light easily and cheaply without leaking it, no matter how many twists or turns are needed for optical communications or (potentially) optical computers.\p
Because the lattice has a regular repeating structure, the researchers have called this a photonic crystal, and current models work at infrared wavelengths of approximately 10 micrometres. Two researchers at Sandia, Shawn Lin and Jim Fleming, are now working on a 1.5 micrometre (near-infrared) crystal, since this is the wavelength at which almost all the world's optically transmitted information is passed. They hope to reach this target late in 1999.\p
The importance of this development is that it makes possible tinier, cheaper, more effective waveguides to combine or separate optical frequencies at the beginning or end of information transmissions, because it bends far more light in far less space at considerably less cost than current commercial methods.\p
The key is careful design. By choosing the spacing between the "logs" of the lattice carefully, the researchers arrange for a chosen wavelength to be reflected instead of passing out of the space, as longer or shorter wavelengths can. In other words, Lin and Fleming have created the equivalent of a photonic "band gap" that forbids certain frequencies of light from exiting the lattice. (A "band gap" is a term usually applied to electrons, not photons, and in that context, signifies a range of energies in which electrons are absent because their presence would contradict the laws of \1quantum mechanics.\c) \p
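As a crude one-dimensional analogy (our simplification, not the Sandia design), a periodic stack reflects most strongly at normal incidence when the wavelength satisfies lambda = 2 x n_eff x a, where a is the lattice period and n_eff an effective refractive index. A quick sketch shows why the telecommunications wavelength is the harder target:\p
# First-order Bragg reflection: lambda = 2 * n_eff * a, so a = lambda / (2 * n_eff)
for wavelength_um in (10.0, 1.5):        # current crystal vs telecom target
    for n_eff in (1.5, 2.5):             # assumed effective refractive indices
        a = wavelength_um / (2 * n_eff)
        print(f"lambda {wavelength_um:4.1f} um, n_eff {n_eff}: period about {a:.2f} um")
Whatever the exact effective index, moving from 10 micrometres to 1.5 micrometres shrinks the required lattice period from a few micrometres to a fraction of a micrometre -- a far tougher fabrication job.\p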
The nearly leak-proof lattices form a cage that traps and guides approximately 95 percent of the light sent within them, as compared with approximately 30 percent for conventional waveguides, and they take only one-tenth to one-fifth the space to bend the light. And most importantly, standard integrated-circuit manufacturing technology can be used to create tens of thousands of waveguides from a single 6-inch silicon wafer, giving densities 10 to 100 times greater than for the standard \Jgallium\j arsenide used in current commercial technology.\p
The main potential is seen now as lying in low-\Jenergy\j lasers, photonic computers, and communications, but nobody is losing sight of a distant goal which has suddenly become slightly closer: the optical computer, which runs faster and cooler than existing computers. Anybody who has looked inside their new computer and seen the fan attached to the CPU will realise that the heat generated in the latest chips is fast becoming a limiting factor in chip speed, but photons are not only faster, they run cooler as well.\p
The problem has been that no one could bend useful frequencies of light around tight corners (of the sort navigated by electrons through a million turns on a computer chip the size of a postage stamp) without incurring large losses of information: with previously used techniques, light leaks, and leaks badly, the more tightly it is turned.\p
#
"Modems with attitude",784,0,0,0
(Sep '98)
Plans were announced during September for trials in \JFlorida\j of a system called universal asymmetric digital subscriber line, or universal ADSL for short. The trials will be carried out by BellSouth on about a hundred University of \JFlorida\j students and staff. The ADSL system allows access speeds some 30 times those of any existing dial-up \1modem\c, with downstream speeds up to 1.5 Mbps (megabits per second) and upstream speeds up to 160 kbps (kilobits per second).\p
Universal ADSL uses an ordinary phone line, and provides high-speed, always-on \JInternet\j connections via the user's \Jtelephone\j line while permitting simultaneous use of the line for voice, \Jfax\j or dial-up data. In other words, both regular \Jtelephone\j use and high-speed data access are possible for the first time over a single \Jtelephone\j line without the use of a line splitter device.\p
The University of \JFlorida\j students and staff in the trial have volunteered to take part, and will be installing the equipment themselves, as well as taking part in the evaluation. The \Jtelephone\j line is used to connect a home computer to the university network, while still allowing normal phone calls. At the moment, BellSouth plans to extend the Universal ADSL service to other parts of \JFlorida\j in late 1998.\p
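What do those speeds mean in practice? A small sketch makes the comparison; the 5-megabyte file and the 56 kbps dial-up figure are our illustrative assumptions:\p
# Ideal transfer times for a 5 MB file at the quoted speeds
file_bits = 5 * 8_000_000                              # 5 megabytes, in bits
for name, bps in (("56 kbps dial-up", 56_000),
                  ("ADSL downstream (1.5 Mbps)", 1_500_000),
                  ("ADSL upstream (160 kbps)", 160_000)):
    print(f"{name:28s} {file_bits / bps / 60:6.1f} minutes")
The downstream ratio, 1.5 Mbps against 56 kbps, is about 27 to one, which is where the "some 30 times" figure comes from.\p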
#
"Hiding in a crowd",785,0,0,0
(Sep '98)
A piece of software, designed to disguise the e-mail address of an \1Internet\c user, was announced in early September. Many Web sites use "cookies" to track visitors and the pages they visit, so that if you are a regular visitor, the operator can build up a profile of you, which can then be used to target e-mail advertising at you later.\p
The software, called Crowds, promises to allow users to share identities: a request for data does not go directly to the site you choose, but travels along a randomly determined route through the crowd, so that it usually arrives at your chosen Web site with somebody else's address attached to it.\p
While you may then be identified as a visitor to a site you did not visit, the operators at the other end will be unable to build up a valid profile on your usage -- or that of anybody else who is part of the crowd. The next step, it seems, is up to the website operators.\p
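A toy simulation conveys the idea; this is our sketch of the scheme, with an invented forwarding probability, not the released software. Each member who receives a request either passes it to another randomly chosen member or sends it on to the server, so the address the server sees is rarely the originator's:\p
# A toy "crowds" walk: who does the web server think made the request?
import random

def last_hop(crowd_size, p_forward=0.75):
    member = random.randrange(crowd_size)   # the originator hands off at random
    while random.random() < p_forward:      # each holder may pass it on again
        member = random.randrange(crowd_size)
    return member                           # this member's address reaches the server

trials = 10_000
hits = sum(last_hop(20) == 0 for _ in range(trials))   # member 0 originates them all
print(f"server saw the true originator in {100 * hits / trials:.1f}% of requests")
With twenty members, the server pins the right address on the originator only about one time in twenty -- no better than a blind guess.\p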
#
"Mathematical news",786,0,0,0
(Sep '98)
Sadly, we have to confess to missing a news item: a new Mersenne prime, the 37th known, was found last February, but the news slipped very quietly onto the \JInternet\j, causing hardly a ripple, and we missed it. The new largest-known prime was found by Roland Clarkson, one of over 4000 volunteers around the world who have been participating in the Great \JInternet\j Mersenne Prime Search (GIMPS). This prime number is the third record prime found by the GIMPS project.\p
Mersenne primes, named after \1Marin Mersenne\c, all have the form 2^p - 1, where p is itself a prime. Only some of the numbers in this form are prime numbers, but finding a new Mersenne prime is a sure-fire way of finding the highest known prime number at any point in time.\p
The new Mersenne prime number, 2^3021377 - 1, is 909,526 digits long. Clarkson used a 200 MHz Pentium computer, running part-time for 46 days, to prove the number prime. Running uninterrupted, the machine would have taken about a week to test the number's primality. (GIMPS volunteers are given software to run the tests, and are assigned numbers, so that everybody has an equal chance of being the next discoverer.)\p
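The test in question is the Lucas-Lehmer test, which GIMPS runs in a heavily optimised form. A bare-bones Python version -- fine for small exponents, hopelessly slow at GIMPS scales -- looks like this:\p
def is_mersenne_prime(p):
    """Lucas-Lehmer test: is 2**p - 1 prime, for an odd prime exponent p?"""
    m = (1 << p) - 1             # the Mersenne number 2^p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m      # the Lucas-Lehmer recurrence, reduced mod m
    return s == 0

# Quick checks: 2^7 - 1 = 127 is prime; 2^11 - 1 = 2047 = 23 x 89 is not.
print(is_mersenne_prime(7), is_mersenne_prime(11))    # True False
The real work in GIMPS lies in squaring numbers hundreds of thousands of digits long, which the project's software does with fast Fourier transform multiplication.\p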
Anybody can join in if they have a computer: the Web address is http://www.mersenne.org/prime.htm and, as the organisers say, they have ". . . something for it to do between keystrokes and mouse clicks, and many more machines are needed for the next, even larger, prime."\p
In October 1997, we reported on the discovery of the 36th Mersenne prime, and asked if one of our readers would find number 37 -- now we must ask if one of them will be the proud discoverer of Mersenne prime number 38. We must, however, caution readers against following the lead of Aaron Blosser, a computer contractor at US West, an American telecommunications company in \JDenver\j, \JColorado\j.\p
Blosser set up the company's computers to do some extra searching when they were not otherwise being used, but according to US West, he got it wrong, so that directory information services were forced to wait up to a minute to access a \Jdatabase\j, rather than getting information in fifteen seconds.\p
In late September, \INew Scientist\i magazine reported that Blosser's home had been searched by the FBI, and his computer equipment had been confiscated -- a common over-reaction by law enforcement agencies to anything they regard as a computer-related crime, even though his home computers, unlike the network at US West, had not been used in the search. No charges have been laid against Blosser so far, but sadly, even though he used 2500 computers in the network, Blosser did not find a new Mersenne prime.\p
#
"Jet-lagged blue-green algae?",787,0,0,0
(Sep '98)
The Cyanobacteria, more commonly and incorrectly known as blue-green \1algae\c, are the simplest organisms to have a circadian rhythm, a "body clock" which allows them to keep time with their surroundings. In humans, our \1circadian rhythm\c is what causes us to be jet-lagged, but up until now, nobody has really been too sure about how it is controlled.\p
The standard approach of experimental \Jgenetics\j is to take the simplest organism showing a characteristic, and try to unravel the mechanism there. The standard evolutionary assumption is that such mechanisms usually only arise once, and are then passed on to all of the descendants, and this assumption is usually (though not always) borne out.\p
The first step has just been taken, with a report in \IScience,\i early in September, revealing no fewer than three genes essential to circadian rhythms in cyanobacteria, the simplest organisms known to have such "internal clocks". Researchers began with the knowledge that the cyanobacteria need some mechanism to pace the 24-hour cycles of \Jnitrogen\j fixation and amino acid uptake which are crucial for life.\p
The obvious problem: how do you recognise the activity of circadian rhythms in something this simple? The answer is to use a gene for a bioluminescent \Jenzyme\j to indicate the activity of another gene that the researchers knew was controlled by the circadian clock. So whenever the circadian clock is working, the cell will make bioluminescent proteins in a rhythmic pattern. This glowing pattern in the cell is predictable throughout the \Jday\j, and makes it easier for researchers to identify which cyanobacteria have working circadian clocks.\p
Like Sherlock Holmes' dog which did not bark in the night, the researchers had to concentrate on the cyanobacteria which did not glow in the right way. These were the cyanobacteria without such clocks, or whose clocks did not keep the correct time, and so the ones which had faulty copies of the key genes. Then they chopped the cyanobacterial \Jgenome\j and searched for pieces that would restore the normal rhythms when introduced into the mutant \Jbacteria\j, until they zeroed in on the genes responsible.\p
Altogether, they isolated more than 100 strains with mutations that either abolished or altered the organism's daily activity cycles, and from this, they found a cluster of three genes, which they named kaiABC, from the Japanese word for cycle, "kai". The kaiABC cluster contains the information the cell will use to make proteins called KaiA, KaiB and KaiC. The Kai proteins, they believe, are integral components of the \Jfeedback\j loop which drives the circadian clock in the cyanobacteria.\p
It looks as though KaiC plays a critical role in setting the phase of the clock. Levels of expression of the kaiC gene (geneticists' jargon for levels of the KaiC protein) increase during daytime and decrease during night time, but an overabundance in either period can measurably shift the timing of the clock. Adding too much KaiC while the amount of the protein is naturally rising, as happens in daytime, causes the clock to advance, while adding too much KaiC at night, when the levels ought to be falling, causes the clock to slow. Continued high levels of KaiC protein can leave the cell in a state of perpetual twilight. \p
KaiA protein turns up the kaiB and kaiC genes, while KaiC protein turns them down. This suggests a scenario in which, early in the \Jday\j, the kai genes begin to produce RNA that is translated into protein. As KaiA protein accumulates, it turns up the activity of the kaiB and kaiC genes. Then, after a delay, KaiC begins to exert the opposite effect, turning the kaiB and kaiC genes off. Once that happens, KaiB and KaiC protein levels fall, KaiC stops repressing the genes, and they come on again.\p
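Such a delayed negative feedback loop can run as a self-sustaining oscillator. The toy Goodwin-style model below is our own illustration of the principle, not the authors' model of the Kai proteins: a gene product accumulates, is converted into a repressor of its own transcription, and the system settles into a steady rhythm:\p
# A toy delayed-negative-feedback oscillator, integrated with Euler steps
dt = 0.01
mrna, prot, rep = 0.1, 0.1, 0.1                # RNA, early protein, repressor
r1 = r2 = 0.0                                  # the two previous repressor levels
for step in range(60_000):                     # 600 units of model time
    d_m = 1.0 / (1.0 + rep**10) - 0.2 * mrna   # transcription, shut off by rep
    d_p = mrna - 0.2 * prot                    # translation
    d_r = prot - 0.2 * rep                     # slow build-up of the repressor
    mrna += dt * d_m; prot += dt * d_p; rep += dt * d_r
    if r2 > r1 and r2 > rep and step > 20_000: # just passed a local peak
        print(f"repressor peaks near t = {(step - 1) * dt:.1f}")
    r1, r2 = r2, rep
The printed peak times come out evenly spaced -- the signature of a free-running rhythm -- and, as in the real cell, pushing the repressor level up or down shifts the phase of every cycle that follows.\p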
The Kai system may not explain the whole of the clock's operation, but a single \Jmutation\j in any of the kai genes can alter, or even halt, the timing of the clock. One of our regular evolutionary assumptions is that mechanisms like this usually evolve once only, but this seems to be one of the exceptions. The kai genes do not resemble those previously seen in the circadian workings of mammals, fruit flies and the mould \INeurospora,\i although the pattern is similar. For this reason, the researchers believe that the cyanobacteria clocks may well have features in common with all the other clocks.\p
The last word belongs to one of the researchers: "Outlining the mechanisms in the simplest creatures known to have a circadian clock is likely to impact our thinking about how all clocks, even ours, function," said Johnson. "From cyanobacteria, we can picture how the circadian rhythms first evolved -- when \Jbacteria\j first learned the time of \Jday\j."\p
\IKey Names:\i Masahiro Ishiura, Takao Kondo, Carl Johnson, Susan Golden \p
\BControlling E. coli O157:H7\b\p
A report in \IScience\i during September indicates that a simple change in the diet of \Jcattle\j in the days before slaughter may reduce the risk of \IEscherichia coli\i infections in humans. The usual American grain-based \Jcattle\j diets promote the growth of \IE. coli\i that can survive the acidity of the human \Jstomach\j and cause intestinal illness. \p
\IE. coli\i \Jbacteria\j are commonly found as natural gut inhabitants in humans, but only in their proper place. If they reach the upper parts of the gut, the \Jbacteria\j can cause illness, or worse: disease-causing strains such as \IE. coli\i O157:H7 produce toxins that cause bloody diarrhoea or even kidney failure in humans. \p
Yet just by feeding hay to \Jcattle\j for about five days before slaughter, the number of acid-resistant \IE. coli \ican be dramatically reduced. Most \Jbacteria\j are killed by the acid of \Jstomach\j juice, but \IE. coli \ifrom grain-fed \Jcattle\j are resistant to strong acids. Mature \Jcattle\j are unaffected by \IE. coli \iO157:H7, but occasional contamination of meat in slaughter-houses can leave it carrying a lethal bacterial dose. Thorough cooking or irradiation kills the \Jbacteria\j, but they may survive the preparation of a rare steak, and minced or ground meat may have the killer strain lurking inside, safely away from the searing heat that would kill it on the surface of the meat.\p
Humans have a natural barrier that kills food-borne \Jbacteria\j -- the acidic, gastric juices of the \Jstomach\j -- but \IE. coli \ibacteria can withstand "acid shock" if they have grown in the presence of fermentation acids. Fermentation acids increase when \Jcattle\j are fed large amounts of grain. \JCattle\j fed grain have very large numbers of acid-resistant E. coli. The \IE. coli \iof hay-fed \Jcattle\j are acid-sensitive and are easily killed by gastric juice. \p
Some \Jcattle\j are fed starch-containing grains to increase growth rate and produce more tender meat. Because the bovine gastrointestinal tract digests starch poorly, some undigested grain reaches the colon, where it is fermented. When the grain ferments -- and acetic, propionic and butyric acids accumulate in the animal's colon -- a large fraction of \IE. coli \iproduced are the acid-resistant type, and this includes a proportion of \IE. coli \iO157:H7 types.\p
The grain does not specifically promote the growth of \IE. coli \iO157:H7, but it increases the chance that at least some \IE. coli \icould pass through the acid \Jstomach\j of humans, and as few as 10 viable \IE. coli \iO157:H7 can cause infection in humans. In some outbreaks, death rates have been as high as 30% of those infected, and once in a human population, the dangerous strains can be passed on by person-to-person infection in \Jday\j-care centres and nursing homes. Recent work in America indicates that swimming pools can also be contaminated with \IE. coli \iO157:H7.\p
Another possible source of \IE. coli \iO157:H7 contamination, the use of \Jcattle\j \Jmanure\j as a fertiliser on fruit crops, would not be avoided by this \Jcattle\j-feeding method. \p
#
"Spreading the genes",788,0,0,0
(Sep '98)
When crop plants can stand up to the attacks of pests, or when they produce lots of seeds, that is a plus, and a great deal of genetic \Jengineering\j is aimed at producing just those results. But what happens when less desirable plants acquire these same abilities?\p
In a "letter to \INature\i", Joy Bergelson has indicated that the transfer of useful genes to weeds is a very real prospect. (A "letter to \INature\i" is, in fact, an important scientific communication, although most non-scientists assume it is nothing more than a "letter to the editor": the usage is a 19th century one which has died out everywhere except in this August journal.)\p
Bergelson argues that artificially created plants, just like wild plants, can breed with closely related species to produce \1hybrids\c in what is called out-crossing. For example, corn, which is a grass, can cross with timothy grass, a common American weed. If the corn contains a gene that confers resistance to a pesticide, the resultant "weedy" hybrid may become a pesticide-resistant nuisance that can compete with crops for \Jwater\j and \Jnutrients\j.\p
Usually this is not a concern, because most crop plants self-fertilise, making it less likely that their genes would migrate to other species. Bergelson has shown that plants thought to be "selfing" can outcross with closely related species, and more importantly, the rate of outcrossing appears to be enhanced by the fact that they are transgenic.\p
Bergelson used the common genetic workhorse, thale cress, \IArabidopsis thaliana,\i a selfing mustard plant, and planted three different kinds together on a plot in central Illinois. Two of the types were resistant to a \Jherbicide\j, chlorsulfuron, one of them by way of a point \Jmutation\j in the Csr1-1 gene, the other by having a resistant form of the gene engineered into it, along with another marker gene.\p
After the three types had been grown side-by-side, the seeds from the normal strain were tested to see if they were resistant to chlorsulfuron or not, and if they were, which form of resistance they carried. Bergelson found that twenty times as many resistant progeny received their resistance from the transgenic plants as from the mutant plants.\p
While this study involves transfer within a species, and while the reason for the difference is unclear, Bergelson feels justified in sounding a note of caution that all may not be as simple as we had thought. She warns that the widespread use of transgenic crops could directly cause the creation of weeds with traits intended to increase the fitness of crops, generating a need for new pesticides. \p
We do, however, have a hint of how the genes are travelling, as Bergelson had noticed flies visiting the Arabidopsis flowers, and had wondered if they might not be transferring \Jpollen\j to other plants. This led her to undertake the study in the first place, and while the result is unexpected, it raises some interesting questions.\p
#
"New prostate cancer gene",789,0,0,0
(Sep '98)
The news of a report about a gene for prostate cancer, to appear in October's Nature \JGenetics\j, was released onto the \JInternet\j at the end of September. The gene, located on the X \Jchromosome\j, may account for as much as 20% of all cases of the disease in families with a strong history of the cancer. \p
Your gender is determined by your \1sex chromosomes.\c Women inherit an X \Jchromosome\j from each of their parents, while men inherit a Y \Jchromosome\j from their father. So by definition, men inherit their single X \Jchromosome\j from their mothers, explaining an observation, first made in the 1960s, that family cases of prostate cancer showed a stronger link between brothers than between father and son. This inheritance pattern, a form of \1sex \Jlinkage\j\c, is typically found in conditions which are carried on the X \Jchromosome\j.\p
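The logic is easy to check with a toy simulation; this is our illustration of the inheritance argument only, and the 10% risk-allele frequency is invented for the purpose. A father never passes an X chromosome to his son, while two brothers each draw one of the same mother's two X chromosomes:\p
# How often do relatives share an X-linked allele?
import random

FREQ = 0.1                        # assumed frequency of an X-linked risk allele

def family():
    mum = [random.random() < FREQ, random.random() < FREQ]   # mother's two X's
    dad = random.random() < FREQ                             # father's single X
    son1, son2 = random.choice(mum), random.choice(mum)      # each son gets one
    return dad, son1, son2

n = 100_000
father_son = brothers = 0
for _ in range(n):
    dad, s1, s2 = family()
    father_son += dad and s1      # father and first son both carry the allele
    brothers += s1 and s2         # both brothers carry it
print(f"father+son: {100 * father_son / n:.1f}%   both brothers: {100 * brothers / n:.1f}%")
Carrier pairs turn up about five times as often between brothers (around 5.5%) as between father and son (around 1%), which is just the pattern the family studies report.\p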
In a typical western population like Britain, the US or \JAustralia\j, there are around 1500 new cases of prostate cancer for every million men in the population each year, and about 10% of these cases appear to be inherited. The gene, labelled HPCX, was found when nearly 1000 men with earlier-than-usual onset of the disease and a strong family history of prostate cancer were studied. Researchers found a specific area on their X chromosomes that was shared more frequently than statistical analysis would predict.\p
An earlier gene for prostate cancer, known as HPC1, was found by the same research group in 1996, located on \Jchromosome\j 1, one of the 22 pairs of non-sex chromosomes. So far, while the HPCX gene has been mapped to the identified region on the X \Jchromosome\j, the actual gene has still to be isolated and identified. After that, the next step will be to look for diagnostic tests and then, perhaps, for specific therapies.\p
\IKey Names:\i William Isaacs, Jeffrey Trent\p
#
"Bowel cancer gene",790,0,0,0
(Sep '98)
A gene called APC has long been known to be found in families with a history of bowel cancer. In early September, the story was advanced another step when a report in the \IProceedings of the National Academy of Sciences\i revealed that two variants (called I1307K and E1317Q) of APC make people more susceptible to developing pre-cancerous polyps and thus predispose them to bowel cancer. Somewhere between 15% and 30% of all colorectal cancers appear to be hereditary.\p
Doctors from the Imperial Cancer Research Fund in London say that although the variants are hereditary, their effect is not so strong as the other mutations which have already been identified with familial bowel cancer. So some people carrying the gene may have no family history of bowel cancer, yet still be at risk of developing bowel cancer. \p
The previously known APC condition is rare, occurring once in 8000 people, but the condition it triggers, FAP or Familial Adenomatous Polyposis, accounts for a significant number of bowel cancer cases. The new variants may be much more common, making it worthwhile screening for them, even though having the gene only means a five-fold increase in the likelihood of developing the cancer, compared with a forty-fold increase for those carrying the FAP form of the gene.\p
The researchers believe that once the carriers of the gene are identified, it may be possible to find out if environmental factors such as diet cause some people to develop bowel cancer while others are free of the condition. At present, people with a family history of the disease are advised to undergo a colonoscopy every three years to check for polyps, but this could help to refine screening methods to concentrate more thoroughly on those most at risk.\p
At this stage, a clinical test for the gene variants appears to be still some years away, leaving people with no choice but to undergo the colonoscopy procedure. Bowel cancer is described by the researchers as the second most common cause of cancer death in the UK, but it is curable when it is detected at an early stage. It is also preventable by the removal of pre-cancerous polyps.\p
\IKey Names:\i Sir Walter Bodmer, Ian Frayling, Ian Tomlinson\p
#
"Take two aspirin and call me . . .",791,0,0,0
(Sep '98)
A report in the \IProceedings of the National Academy of Sciences\i in mid-September indicates that aspirin may prevent the development of a particular type of common hereditary colorectal cancer in those at high risk for the disease. Aspirin has been known to act against some types of cancer, but there was no convincing explanation of its action until now. The headache drug seems to interfere with colorectal cancer development in those individuals who carry particular gene mutations that make them very likely to get the disease.\p
The researchers examined human colon cancer cell lines with defective mismatch repair genes, which are necessary to fix normal cell damage that occurs when cells divide and multiply. These genes were shown in 1993 to be the cause of the most common form of hereditary cancer known as Hereditary Nonpolyposis Colorectal Cancer (HNPCC).\p
In the present study, two drugs -- aspirin and sulindac, both non-steroidal anti-inflammatory drugs and known cancer preventatives -- were used on colon \Jtumour\j cells. The drugs largely suppressed the genetic instability that underlies the development of cancer in HNPCC. Cancer cannot happen without mutations, and the aspirin suppresses the accumulation of mutations. It seems to select for cells that are genetically stable, providing a true genetic selection against such forms of cancer.\p
So will it work in live humans? A group of researchers are currently organising the international hereditary colorectal cancer prevention trial using aspirin. It will be necessary to balance any gains against the effects of aspirin on the gastro-intestinal tract and of sulindac on the liver, to see if the overall gain is worth it.\p
\IKey Names:\i Richard Fishel, Josef Rüschoff\p
#
"Diabetes and cancer",792,0,0,0
(Sep '98)
A controversial \1diabetes\c drug is causing a certain amount of headache among medical practitioners: separate studies reported in the September issue of \INature Medicine\i indicate that troglitazone (sold as Rezulin) can both boost and suppress colon cancer in mice. The drug, approved by the US Food and Drug Administration in March 1997 for treatment of type II diabetes, is now being questioned. It has been widely prescribed in the United States, but the drug has been associated with at least 26 deaths from liver failure.\p
Type II or adult-onset diabetes sufferers continue to manufacture their own \Jinsulin\j, but it loses its effect. Troglitazone is thought to help indirectly by binding to a protein called PPAR-gamma that, among other things, helps speed the maturation of fat cells, making them more effective at removing \Jglucose\j from the blood. This same maturing/aging effect has been of some interest to cancer researchers, since cancer cells gain a kind of immortality and reproduce uncontrollably.\p
One set of studies showed that the drug slowed growth in human cancer cells in culture by 80%, and that mice injected with \Jtumour\j cells and treated with troglitazone had tumours after five weeks that were only half the size of those in mice not given the drug.\p
The other study involved mice with a flawed \Jtumour\j-suppression gene. People with mutations in this gene have a high risk for a type of colon cancer called familial adenomatous polyposis coli (FAP). The mutant mice given troglitazone were more likely to develop tumours than either of two control groups: mutants not given the drug and normal mice.\p
While diabetes researchers are suggesting that it would be unwise to deprive diabetics of a useful drug based solely on a mouse model, cancer researchers believe they can explain the effect. PPAR, or peroxisome proliferator-activated receptor, may be the molecular switch which allows dietary lipids, or fats, to do their dirty work of promoting colorectal cancer. Troglitazone interacts with PPAR and, in essence, mimics some of the effects of a high-fat diet.\p
PPAR is one of a family of proteins found in the nucleus of cells that bind to certain chemicals, or ligands, and subsequently modulate the expression of specific genes. These \Jligand\j-activated nuclear \Jreceptors\j generally remain dormant until they are switched on in response to biochemical stimuli, which in the case of PPAR include high levels of lipids and drugs such as troglitazone.\p
\IKey Names:\i Bruce Spiegelman, Ronald Evans, Johan Auwerx\p
#
"Diabetes and fat",793,0,0,0
(Sep '98)
Curiously, given that troglitazone is described as being able to mimic some of the effects of a high-fat diet, a report in \JMetabolism\j in September shows that it is possible to reverse type II diabetes in mice simply by feeding them a very low-fat diet, and the researchers believe the same potential exists in humans.\p
While it has long been known that weight loss can control diabetes, this new research is the first scientific study to show that type II diabetes can be completely reversed in animals by lowering dietary fat. Evidence in the past has been anecdotal and open to question, but now a genetic strain of diabetes-prone mice has been placed on a diet cutting fat from 40 percent to 10 percent of their total \Jcaloric\j intake.\p
This caused complete reversal of their diabetes, and while the mice lost weight, this weight loss alone could not account for the reversal, say the researchers. They point out that \Jinsulin\j and \Jglucose\j (blood sugar) levels in the mice began to decrease before their weight did, suggesting that fat reduction acts on \Jinsulin\j and \Jglucose\j levels independent of weight loss.\p
The apparent contradiction with the troglitazone results comes down to which effect of a high-fat diet is being mimicked: there is a difference between having a large number of matured fat cells and having a large amount of fat. \p
#
"The Russian TB is coming!",794,0,0,0
(Sep '98)
An urgent joint statement in September on what they call "Ebola with Wings," issued by the Public Health Research Institute, Doctors Without Borders (Médecins Sans Frontières) and the Medical Emergency Relief Network International (MERLIN), warns that a major threat to world health may develop in the event of a Russian economic collapse. The threat comes from drug-resistant \1tuberculosis\c (MDR-TB) which is reaching epidemic proportions in \JRussia\j.\p
They say that this local humanitarian disaster is already a direct global public health threat, and they are calling for an urgent worldwide campaign to raise the $100 million needed to prevent the imminent epidemic of MDR-TB in \JRussia\j. They point out that drug-sensitive TB is curable through proper drug therapy, but that this does not apply to MDR-TB. The resistant disease is potentially much more dangerous even than diseases like Ebola, because TB spreads through the air and can move from patient to patient in public places.\p
They believe the current Russian economic crisis will further reduce the already strained resources of public medicine. The resulting shortage of anti-TB drugs in a free-fall economy will inevitably lead to the massive application of substandard antibiotic treatment for patients with TB, which is the principal cause of MDR-TB. Standard treatment of regular TB consists of a daily regimen of four different \Jantibiotics\j for six months. When this treatment is incomplete or interrupted, a patient can easily develop MDR-TB and then spread this potentially lethal form of TB to other people.\p
In simple terms, the resistant form of the bacterium is favoured when the patient is given an "on again off again" treatment, and as this spreads, it becomes no respecter of class, wealth or privilege, so that the well-off are just as likely to carry the germs as the poor, and if these people flee a declining \JRussia\j, they will take their disease with them.\p
Russian prisons are at present home to some 20,000 cases of MDR-TB, while a further 100,000 inmates with regular TB are subjected to "inappropriate, MDR-causing treatment protocols". Outside the prisons, TB patients undergoing treatment are often required to pay for their own drugs, even in state-run hospitals, and this burden can only add to the number of cases of MDR-TB, because most people will discontinue prescribed treatment as soon as symptoms subside, and before the disease is defeated.\p
The cost of a minimum nationwide emergency program is estimated at around $100 million. This cost, say the organisations, is small in comparison to the loss of life and potential global economic damage that can be anticipated in the near future if the problem of MDR-TB in \JRussia\j is not dealt with promptly.\p
#
"Green coffee beans to cure AIDS?",795,0,0,0
(Sep '98)
A September report in the \IJournal of Virology\i describes an interesting chemical, found in \Jcoffee\j beans, which appears to halt a key \Jenzyme\j used by \1HIV\c particles when they are infecting human cells. Researchers already knew that this happened when the chemical and the \Jenzyme\j were mixed in the test tube, but now they have shown the same effect occurs inside cells.\p
The chemical, chicoric acid, is identical to substances found in medicinal plants that Bolivian (Kallawaya Indian) shamans have used for more than 1,500 years to treat a variety of disorders. In the test tube, at least, chicoric acid renders the \Jenzyme\j, HIV integrase, harmless. The researchers have been pursuing more than sixty plants used as remedies by shamans, and found that several, including chicoric acid, have an effect on the AIDS virus in the laboratory.\p
Two other key HIV enzymes, HIV protease and HIV reverse transcriptase, have been the targets of anti-AIDS "cocktails" for several years. These are mixtures of several chemicals that arrest the action of HIV and are now prescribed routinely for \1AIDS\c patients, but the addition of an attack on the third \Jenzyme\j looks like a promising new avenue for AIDS research.\p
Most importantly, chicoric acid is non-toxic, unlike most of the treatments in use at the moment. In any case, the current "cocktails" soon lose their power because HIV quickly develops resistance to individual chemicals, so any addition which attacks HIV without increasing the toxicity must be doubly welcome.\p
The chicoric acid works by halting the takeover of healthy cells by HIV. The virus does this by weaving its genetic material in with the healthy cell's DNA, in a process called \Jintegration\j, and is then able to use the cell's own machinery to rapidly make multiple copies of itself, furthering the progression of the AIDS infection. So stopping the process of \Jintegration\j keeps HIV from reproducing and infecting other cells.\p
Can we take \Jcoffee\j to cure AIDS? No, on two counts. Firstly, chicoric acid by itself does not act as a cure, but more importantly, roasting the \Jcoffee\j beans destroys the chicoric acid. But even a retardant brings time, and hope, to those infected with the virus.\p
\IKey Names:\i Edward Robinson, Peter King\p
#
"Mad cow disease detection",796,0,0,0
(Sep '98)
\1BSE\c or bovine spongiform encephalopathy may be detectable earlier than we thought. A late August issue of \IThe Lancet\i describes the case of Tony Barrett, a 45-year-old \Jcoastguard\j from southwest England, who died of nvCJD (new-variant \1Creutzfeldt-Jakob Disease\c) in June. The name nvCJD is being applied to cases where a BSE infection is suspected.\p
Barrett had his appendix removed at a local \Jhospital\j, eight months before he first showed symptoms of the disease, and a check of the stored tissue has since revealed clear signs of misshapen prions, the tell-tale mark of nvCJD. A worry hanging over British health authorities may now be cleared up, since nobody has known whether or not the 27 diagnosed cases of nvCJD were all that would be seen, or whether they were the mere tip of a much larger epidemic \Jiceberg\j.\p
The problem is that nvCJD is considered to have a long latency period, as much as ten years, but now a study of tissue specimens stored in hospitals around Britain may reveal the truth about the future. In the first instance, only about a thousand samples will be studied, under strict anonymity, to avoid ethical problems about whether or not to warn patients whose discarded \Jtonsils\j and appendices may carry advance warning of an impending death sentence.\p
\BDetecting prions\b\p
Meanwhile, American researchers reported at the end of September that they had developed a highly sensitive, rapid technique for detecting the infectious agents that cause \1prion disease\c. In time, they expect this system to be useful in detecting the prions which cause "mad cow" disease and \1Creutzfeldt-Jakob Disease\c in humans.\p
The report, in the October issue of \INature Medicine,\i extends beyond the hope for an effective screening tool. With automation, they say, the tool could be applied to commercial testing of meat, biological and pharmaceutical products, but more importantly, it offers an insight into the nature of prions.\p
The test tube immunoassay has been used already to detect very low levels of infectious prion protein, within eight hours, which compares favourably with existing methods. The present standard involves planting suspect protein in the brain of a laboratory animal, then waiting two to six months for a reaction to occur, or not to occur. The time taken rules out any prospect of large-scale commercial applications, but the new method is likely to lend itself to robotic operation.\p
All mammalian brain contains prion protein in its normal harmless form, known as PrPC, but this only becomes a problem when the PrPC changes shape, from a coiled structure to a flat sheet. And this change only happens when the PrPC comes in contact with the infected form, known as PrPSc, so that the PrPC "flips" over into the PrPSc shape, making it able to "infect" other protein molecules with which it comes in contact.\p
The chemical properties of protein molecules depend on their shape, so the "flipped" molecule has different properties, and this is apparently where the trouble starts, but it is also where the assay starts. A portion of the folded PrPSc is detected by a fluorescently labelled antibody, and as this portion is only found on the "flipped" molecule, any \Jfluorescence\j indicates that the antibody has found and attached to a PrPSc molecule.\p
Tests using eight different strains of prions in hamster brains have revealed what scientists had suspected: that the variations in infection rates indicate different protein shapes: each of the eight different strains of infectious prions had unique shapes. According to Fred Cohen, one of the researchers, "We know that PrPC and PrPSc have very distinct shapes. What has become clear is that while all of the strains contain a common molecular sequence, each protein strain has a distinct shape."\p
More importantly, it turns out that the prion strains are all sensitive, to some extent, to proteases, enzymes which destroy proteins, and that the more sensitive prion strains are the ones which take longest to bring about a full disease attack. Previously, the prions were thought to be protease-resistant, but if the body is able to fight back, this suggests an avenue of attack that might help the body to boost its response and defeat the foreign proteins.\p
The problem comes back to "protein X", the so-called chaperone molecule, which is believed to be involved in bringing the infectious molecule in touch with other molecules. They believe that the rate-limiting step in prion replication has little to do with PrPSc, and far more to do with the role of "protein X", so sorting this out becomes the top priority.\p
\IKey Names:\i Jiri Safar, Fred E. Cohen and Stanley B. Prusiner\p
#
"Influenza -- the pigs' disease",797,0,0,0
(Sep '98)
According to a report in the September \IVirology,\i pig cells have two kinds of \Jreceptors\j: one targeted by \1influenza\c strains that infect birds, the other by flu strains that infect people. Many of the worst flu epidemics of the past are thought to have come from birds to humans by way of pigs, so this discovery has been seized on as an explanation of the pigs' role as middlemen.\p
Only rarely do we see a strain of influenza jumping from birds directly to humans -- the Hong Kong outbreak (Panic as new flu virus appears, December 1997) was one such rare case. The flu strains which attack humans link to cell \Jreceptors\j called α2,6 linkages, while the ones that infect birds recognise α2,3 linkages.\p
Other molecules will also link to these \Jreceptors\j, so Yoshihiro Kawaoka and his colleagues used two lectins (sugar-binding molecules) with fluorescent dyes attached. One of the lectins binds only to the sugars in an α2,3 bond, while the other binds only to the sugars in an α2,6 bond. After they had flooded pig tracheal cells with one or the other of the solutions, they washed the cells to remove anything that did not bind, and found that both of the lectin hybrids had attached themselves to the pig tracheal cells.\p
In other words, avian influenza can get into pigs, and safe inside the porcine tissues, find other viral forms, and swap genetic material with them, and sometimes, breed dangerous new strains.\p
#
"New flu drug",798,0,0,0
(Sep '98)
All may not be lost, and we may still be able to have bacon for breakfast, ham sandwiches for lunch and roast pork for dinner. In late September, reports came in on a new anti-viral drug which is inhaled, and which appears to stop \1influenza\c dead in its tracks. Because it is inhaled, Zanamivir goes straight to the sites where influenza infects the body -- the breathing passages. The drug has just been through phase III clinical trials, conducted in \JAustralia\j, New Zealand and South \JAfrica\j.\p
From these trials, it appears that patients treated with the inhaled drug Zanamivir recovered from the illness on average as much as 2.5 days earlier than those who did not receive the drug, mainly due to a significant reduction in the severity of their symptoms such as feverishness, cough, weakness, aches and pains. \p
Those patients at risk of developing complications (which can include \1pneumonia\c, bronchitis and \Jpleurisy\j) as a result of having influenza experienced a 70 per cent reduction in complications if given Zanamivir. \p
Another trial, conducted in the USA, also showed excellent results, and a phase II treatment trial in the Netherlands showed that although the flu virus can become readily resistant to the effects of some other drugs, resistance to Zanamivir did not develop in this study.\p
#
"And yet another flu drug",799,0,0,0
(Sep '98)
BCTP, another anti-microbial agent, was revealed during September. This is described as being "a quick and efficient killer of influenza A virus in cell cultures and in the nasal passages of laboratory mice" in a press release from the manufacturers. A milky-white \Jemulsion\j of tiny lipid droplets suspended in solvent, BCTP is described as being made of "\Jwater\j, soybean oil, Triton X 100 detergent and the solvent tri-n-\Jbutyl\j phosphate".\p
#
"Transplant news -- the Frankenstein option?",800,0,0,0
(Sep '98)
During September, a New Zealand man had a donor hand and forearm \1transplantation\c operation, performed by an Australian surgeon in \JFrance\j, in an attempt which may or may not prove successful (the news was still good in mid-October). The operation involved joining bone, muscle, nerves and blood vessels, and was intended to replace a hand and forearm lost in an industrial accident. The location was chosen because all dead bodies in \JFrance\j can be treated as a source for donations, while \JAustralia\j, where the man lives, only has voluntary donation.\p
An attention-grabbing headline at the end of September reported that it may soon be possible to replace an entire face, the beneficiaries being people injured and disfigured by disease or accidents. In each case, the prospects of success have been heightened by the development of more powerful drugs to prevent rejection, and better microsurgery techniques. The arm transplant has been attacked by a number of medical researchers who say that the level of drugs required to make the arm "take" leaves the patient with about the same defences as somebody with AIDS.\p
And in the United States, doctors have successfully rejoined a patient's pelvis to her lumbar spine with an innovative prosthetic device. Surgeons implanted a prosthesis in the patient, a 49-year-old Californian woman whose \Jsacrum\j (or tailbone) had been destroyed by a giant cell \Jtumour\j. At least that one will not demand heroic levels of anti-rejection therapy.\p
#
"Malaria outbreak in Honduras",801,0,0,0
(Sep '98)
A dramatic increase in falciparum \1malaria\c in northern \JHonduras\j has been reported in the \IPan American Journal of Public Health\i in September. Reported cases of \Jmalaria\j rose from 52,110 in 1994 to 75,565 in 1996, a 45% increase. \IPlasmodium vivax\i is reported to be the infecting strain in some 98% of Honduran \Jmalaria\j cases, but there now seems to be a much higher proportionate increase of falciparum \Jmalaria\j in northern \JHonduras\j. This concerns authorities, as this type of \Jmalaria\j is a much more severe form of the disease.\p
A recent survey of 202 blood samples found that 47% of those sampled were positive for \Jmalaria\j parasites, with the milder \IPlasmodium vivax\i present in 79% of the infected rural patients and 100% of the infected city patients. The more severe \IPlasmodium falciparum\i was present in the remaining 21% of the rural patients.\p
The cause is still unclear, but it may be related to any or all of a long rainy season in the area, an increase in the non-immune human population, development of chloroquine resistance by \IPlasmodium falciparum\i parasites, and earlier under-diagnosis, which is now being corrected.\p
\IKey Names:\i Carol J. Palmer, John F. Lindo \p
#
"Kiss of illness?",802,0,0,0
(Sep '98)
Meningococcal disease, a form of \1meningitis\c caused by the bacterium \INeisseria meningitidis,\i is commonly passed on to family members and kissing contacts of patients with the disease, according to a report in the \IBritish Medical Journal,\i in early September. Professor Bjorn-Erik Kristiansen from the University of Tromsø and colleagues recommend the use of \Jantibiotics\j to prevent the disease developing in such people, but not in less close contacts, where the benefits may be marginal.\p
#
"Anorexia and zinc",803,0,0,0
(Sep '98)
For many young, weight-conscious women, and often for the elderly, not eating becomes a state of mind, and is considered a serious psychiatric disorder, given the name anorexia. A series of studies published in \INutritional Biochemistry\i (January) and the \IJournal of \JNutrition\j\i (January, July) suggests that there may be a link between anorexia and a deficiency of zinc in the diet, and that zinc supplements may help people who are trying to recover from anorexia.\p
This is still some way from asserting that a zinc deficiency causes anorexia, but it suggests an avenue for further enquiry. Zinc is an essential trace element, usually obtained by eating red and white meat and shellfish. In addition to the link to anorexia, chronic zinc deficiency can result in reduced growth and sexual development, a weakened immune system, hair loss and skin lesions.\p
#
"Making smokers",804,0,0,0
(Sep '98)
A report in the September \IJournal of Health and Social Behavior\i warns us of one of the unexpected consequences of divorce: boys and girls whose parents divorce are more likely to smoke as adults than are children from intact families. Oddly, only the sons of divorced parents face a higher probability of becoming problem drinkers.\p
The data for these conclusions were derived from the US National Opinion Research Center's General Social Survey, which interviewed more than 11,000 people representing a cross-section of households in the United States between 1977 and 1994.\p
The risk of becoming a smoker in adulthood increased by about one-third. In another gender difference, if their mothers remarried, the divorce-linked effect on smoking was dampened somewhat for girls, but not for boys. Among the men -- the only group whose drinking habits were affected -- those whose mothers remarried had the same level of problem drinking as those who grew up in intact families.\p
\BKey name:\b Nicholas H. Wolfinger.\p
#
"Did we get a kick-start in winter?",805,0,0,0
(Sep '98)
A fearfully cold "volcanic winter" of the sort considered by \1Luis W Alvarez\c may have happened 71,000 years ago. Followed by the coldest 1,000 years of the last Ice Age, it may have brought widespread \Jfamine\j and death to modern human populations around the world and, just incidentally, forced the development of modern humans. That at least is what Stanley Ambrose argued in the June issue of the \IJournal of Human \JEvolution\j,\i a view which is now attracting some interest on the \JInternet\j.\p
Ambrose believes that the eruption of Mount Toba in \JSumatra\j caused a bottleneck -- a sharp decrease -- in our ancestors' populations, which in turn brought about the rapid "differentiation" or genetic divergence of the surviving populations. Geneticists have long believed that humans must have passed through a bottleneck of this sort, but this is the first time anybody has tried to attach either a date or a cause to the notion.\p
Ambrose calls his model the Weak Garden of Eden/Volcanic Winter model, and it is an offshoot -- with significant additions -- of the Weak GOE model proposed by Henry Harpending and others. The Weak GOE model proposes an African origin for modern humans about 130,000 years ago, with the invention and spread of advanced stone tool technology, 40,000 to 50,000 years ago, to account for population growth \Iafter\i the bottleneck, some time after the \Jearth\j started to warm once more.\p
In Ambrose's model, a volcanic winter resulting from the super-eruption of Toba "caused the bottleneck", and populations "may have expanded in response to climatic warming 10,000 years before the advent of modern technology".\p
A group of vulcanologists (Michael Rampino, Stephen Self, Greg Zielinski and colleagues) have shown that Toba's blow-out not only created the modern Lake Toba in northern \JSumatra\j, but caused a volcanic winter that lasted six years and significantly altered global climate for the next 1,000 years. According to geneticists, the combination of a substantial lowering of global temperatures, \Jdrought\j and \Jfamine\j, led to a global human population crash during which no more than 15,000 to 40,000 people survived.\p
If the populations left behind were separated, the well-known genetic principles of "founder effect" and genetic drift could account for the racial differences we see among human populations today.\p
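To make the founder effect and drift argument concrete, here is a minimal sketch in Python -- our own illustration, with invented population sizes, not anything from Ambrose's paper -- of how small, separated populations wander apart genetically by pure chance:\p
import random

def next_generation(freq, size):
    # One round of random mating: each of the 2*size gene copies in the
    # next generation is drawn at random from the current allele pool.
    copies = sum(1 for _ in range(2 * size) if random.random() < freq)
    return copies / (2 * size)

def drift(freq, size, generations):
    for _ in range(generations):
        freq = next_generation(freq, size)
    return freq

random.seed(1)
# One large pre-Toba population crashes to five isolated bands of 250 people.
starting_freq = 0.5   # frequency of one variant of some gene before the crash
bands = [drift(starting_freq, size=250, generations=400) for _ in range(5)]
print([round(f, 2) for f in bands])
# Each band ends up at a different frequency; some may lose the variant (0.0)
# or fix it (1.0) entirely -- genetic drift and the founder effect at work.
Run repeatedly, the bands diverge further the smaller they are and the longer they stay separated, which is all the "racial differences" argument requires.\p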
#
"Northern exposed dinosaurs",806,0,0,0
(Sep '98)
In recent years, \1dinosaur\c fossils have been found in a part of \JAustralia\j which once lay close to the south pole, providing support for the belief that dinosaurs were warm-blooded. In the interests of fair dealing, your reporter should disclose that he shares this suspicion. September brought further evidence in the form of trackways: a significant variety of single \Jdinosaur\j footprints and sets of prints from five different types of dinosaurs in the foothills of \JAlaska\j's Brooks Range.\p
The impressions and casts from the dig, carried out over the northern 1998 summer, cover a total of 13 new track sites located along a 75-mile (120 km) stretch of the Colville River in the National \JPetroleum\j Reserve-\JAlaska\j. Previously, there was only one, poorly documented trackway on the Alaskan Peninsula.\p
These new prints are at least 25-million years older than the well-known \Jdinosaur\j bone beds farther to the north, and provide the first direct evidence that dinosaurs were numerous and diverse in the \JArctic\j 90-110 million years ago.\p
The importance of this find is that the dinosaurs -- meat-eating, plant-eating, duck-billed and armoured forms -- could only have lived this far north if they had some way of withstanding the cold of northern winter nights. It also supports the view that these dinosaurs were migrating across from Asia to North America during early Cretaceous time. The finders' claim that "These new \Jdinosaur\j finds . . . establish \JAlaska\j as the premier high latitude \Jdinosaur\j region of the world" has to be considered a little too enthusiastic, given the Australian finds, but the tracks are certainly important.\p
#
"Coelacanths in a new home",807,0,0,0
(Sep '98)
Up until 1938, the \1coelacanth\c was considered to have been extinct for 80 million years, but then a population was found near the \JComoros\j archipelago in the western Indian Ocean, and the species was promoted to \1"living \Jfossil\j"\c status.\p
Now specimens of the fish have been discovered 10,000 km away, off \1Sulawesi\c (formerly \JCelebes\j) in \JIndonesia\j. The fish are probably members of a separate population.\p
#
"A new look at adaptation",808,0,0,0
(Sep '98)
Adaptation, the way in which a species changes to meet the demands of its environment, is central to our understanding of \Jevolution\j. In the 1930s, R. A. Fisher (later Sir Ronald Fisher) asserted that \Jevolution\j worked by small changes in genes, a gradual process of becoming better able to cope. \p
Since it was first proposed in 1972, the "punctuated equilibrium" of Niles Eldredge and Stephen Jay Gould has gained favour with many theorists. Under this model, a species has a long quiet period, followed by a short burst of massive change, punctuating the previous equilibrium. As the theory's supporters see it, life does not evolve gradually but intermittently, with long periods of inactivity, interrupted by bursts of change which are characterised by mass extinctions and the emergence of many new species.\p
Support for the model comes mainly from palaeontologists who see exactly this pattern in the \Jfossil\j record. Others, clinging to the Fisher model of gradual change, suggest that while the \Jfossil\j record may look like that, the punctuated pattern may lie in the way fossils happen to be laid down, rather than in the underlying pattern of adaptation.\p
The August issue of \IEvolution,\i released in early September, features an argument from Allen Orr for a compromise position, with adaptation being a mix of minor genetic tweaks, some more moderate mutations, and a few major changes. He starts with Fisher's belief that major mutations are always deleterious, rather like random tinkering with the circuitry of a \Jtelevision\j set. While minor adjustments may improve a set's performance, wholesale axe-work usually produces a sound-free, vision-free TV, and in the same way, major mutations usually result in an individual which cannot survive.\p
Orr goes on to show that the distribution of mutations causing adaptation neatly fits an exponential curve. A few major mutations are needed, but minor mutations become exponentially more numerous as their individual effects shrink, provided you are looking at a population well-positioned to adapt to environmental pressures.\p
In other words, it now appears that the Punctuated Equilibrists, like the Fisher Loyalists, were right all along -- except to the extent that each declined to recognise the importance of the other side's case.\p
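Orr's exponential picture is easy to visualise with a small Python sketch -- again our own illustration, with an invented scale parameter, not Orr's data: draw adaptive mutations whose effect sizes follow an exponential distribution, then count how many are minor and how many are major.\p
import random

MEAN_EFFECT = 0.05   # invented scale: the average effect size of a mutation

def sample_effect():
    # Exponential distribution: density proportional to exp(-s / MEAN_EFFECT)
    return random.expovariate(1.0 / MEAN_EFFECT)

random.seed(2)
effects = [sample_effect() for _ in range(10000)]
minor = sum(1 for s in effects if s < MEAN_EFFECT)
major = sum(1 for s in effects if s > 3 * MEAN_EFFECT)
print("minor (below the mean effect):", minor)        # about 63% of the total
print("major (over three times the mean):", major)    # only about 5%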
#
"Why is South Africa so high?",809,0,0,0
(Sep '98)
Southern \JAfrica\j is a craton, the oldest part of the African continent. Most other continents have one or more cratons, but they are no higher than 400 or 500 metres above sea level. Southern \JAfrica\j, however, lies more than 1,000 metres above average sea level. A September report in \INature\i appears to explain this odd situation.\p
A hot "upwelling" which begins at the core-mantle boundary causes the mantle above it to flow, all the way to the base of the African plate, where it elevates the southern part of the continent, according to the report. Like a bubble in a vat of maple syrup, they say, it causes the "syrup" (southern \JAfrica\j) to flow as it rises, and also raises the surface.\p
Previous explanations for the anomalous elevation have centred on near-surface phenomena such as episodes of volcanic heating and/or lithospheric thinning, but without much success. The researchers believe that they can see evidence for this active upwelling in images from seismic \Jtomography\j. This has revealed "a large, low-velocity seismic anomaly in the lower mantle directly below the African Plate", and lower seismic velocities generally reflect hotter, and consequently lighter than normal material.\p
Calculations based on the \Jtomography\j data produced a model with a pattern of surface \Jtopography\j that looked remarkably like southern \JAfrica\j's actual \Jtopography\j, strongly suggesting a causal link. Perhaps more interestingly, the upwelling may be strong enough to drive tectonic plates. While some geologists have assumed that the driving forces for \Jplate tectonics\j arise directly or indirectly from the plates themselves, as they sink back into the mantle, others have favoured upwellings as the motive power, a view which the researchers favour.\p
\BKey Names:\b Carolina Lithgow-Bertelloni and Paul Silver.\p
#
"Finding fault",810,0,0,0
(Sep '98)
Scientists from the U.S. Geological Survey, the \JUtah\j Geological Survey, \JArizona\j State University and Cambridge University began the work of digging a trench into the face of the Bulnay fault in northwestern \JMongolia\j during September. The fault is about 800 km (500 miles) west of the Mongolian capital of Ulan Bator. Because the site is so far from paved roads and rail lines, all the excavation will be done by hand.\p
The American \1earthquake\c scientists are interested in comparing the great ruptures on the Bulnay fault that have occurred in the 20th century with what is known about the 1811-1812 "great" earthquakes that were centred near New Madrid, Missouri.\p
The face of the New Madrid fault is buried deep beneath the Mississippi River, but the researchers say they believe the Bulnay fault is very much like the New Madrid fault. Better still, it is exposed on the surface, enabling them to study the offsets and other characteristics that they might see on the New Madrid, if only they could get to it.\p
During the 20th century the faults of western \JMongolia\j have produced three "great" earthquakes of \Jmagnitude\j 8 or larger, and six events of about \Jmagnitude\j 7. These earthquakes have occurred about midway, east to west, in the Eurasian plate, just as the New Madrid earthquakes occurred about midway, east to west, in the North American plate. It is important to find out more about the character of large earthquakes that occur in crustal plate interiors, and how often those events might be expected to repeat themselves.\p
#
"Plants and the ozone layer",811,0,0,0
(Sep '98)
The hole in the \1\Jozone layer\j\c over \JAntarctica\j this southern summer is now two and a half times the size of Europe, the largest yet, according to reports in the opening days of October. But is it all the fault of the \Jchlorofluorocarbons\j, the CFCs? Perhaps not, according to a report in the 1 October \IGeophysical Research Letters,\i announced in late September.\p
When former President \1Ronald Reagan\c slipped up and told us that trees can pollute the air, it turns out he wasn't far off the mark. According to the new report, some leafy green plants churn out methyl bromide, a chemical that helps destroy \JEarth\j's protective shield of ozone. Methyl bromide, an effective soil fumigant, was once used in museums as a way of keeping unwanted life forms out of precious exhibits, but its use has been discontinued because of the damage it causes to the \Jozone layer\j. The substance is to be phased out over the ten years from 1995 to 2005.\p
In the upper \Jatmosphere\j, sunlight breaks off bromine atoms from the methyl bromide molecule, and these then consume about 20% as much ozone as does the \Jchlorine\j from banned chlorofluorocarbon compounds. This is not the whole story, for only about a fifth of the methyl bromide released into the air comes from industrial production. Atmospheric scientists had suspected that oceans and forest fires were the main additional sources, but now it appears that plants produce large amounts as well.\p
Soil scientists at the U.S. \JSalinity\j Laboratory in Riverside, \JCalifornia\j, measured methyl bromide emissions from \Ibrassica\i plants, a group that includes broccoli, cabbage, and other important crops. The plants were grown in sealed glass jars in a greenhouse, and while the soil in the jars broke down some of the methyl bromide, some still escaped into the air.\p
Trace amounts of bromide are commonly found in soils, so the potential is there for large-scale biological production of the molecule, but more research will be needed to see which plant groups are the most efficient producers.\p
#
"Safer air conditioning for cars",812,0,0,0
(Sep '98)
A car air conditioning system uses a small refrigeration system to keep the car cool in summer. At present, car systems use a synthetic hydrofluorocarbon refrigerant known as R134a, a replacement for the \1CFCs,\c and while it was developed as an ozone-safe substitute for the chlorofluorocarbon R12, R134a may prove inadequate for future vehicles. It is a very good refrigerant, but it works poorly as a heating fluid, making it something of a dead end for future research.\p
The problem comes from new \Jenergy\j-efficient engines which may not yield enough waste heat to warm a car in winter, which means that the refrigeration unit has to operate as a heat \Jpump\j. This operates like an air conditioner in reverse, and it is in this role that R134a falls down.\p
Now researchers in Illinois are working on a system which uses carbon dioxide as the refrigerant, reducing the risk to our \1\Jozone layer\j.\c A prototype has been tested at the University of Illinois Air Conditioning and Refrigeration Center, and has already shown excellent thermal performance, according to a press release issued during September.\p
#
"Lead in the environment",813,0,0,0
(Sep '98)
A report in the journal \IScience\i in mid-September shows that humans have been polluting their environment with heavy metals for longer than anybody imagined. An international group of researchers has been looking at the \1lead\c levels in a Swiss \Jpeat\j bog in the Jura Mountains, and they find that the last 6000 years of a 14,000-year sample all show signs of lead \Jpollution\j.\p
The surface layers of a \Jpeat\j bog are effectively isolated from the waters below, so any build-up of heavy metals in the bog can be assumed to have arrived as atmospheric dust, and the researchers have found that everybody, from the first plant cultivators in Europe, to miners of the Roman Empire, to medieval silversmiths in \JGermany\j, to European \Jpetroleum\j companies in the 1970s, has left a mark on the \Jpeat\j bog they studied.\p
The first 8000 years of the record show the natural background; the most recent 6000 years, with metal-using humans at work, show the effect of that extra source of atmospheric lead. The researchers were even able to judge whether the lead came from Scandinavian dust released as the glaciers receded, or from lead ore brought into Europe from \JAustralia\j in recent times and released through industrial use.\p
The original build-up probably came as people began to clear land and plant crops, 6000 years ago. The record also shows large fluctuations that match up very well with historical events such as the rise and fall of the Roman Empire, and onset of the Industrial Revolution in Europe. The analysis suggests that lead levels appear to actually be decreasing today, as lead has been removed from petrol (\Jgasoline\j), and lead emissions from industrial sources have been reduced, but they still remain at levels several hundred times higher than the natural levels the researchers found in samples over 6000 years old.\p
In the same issue of \IScience,\i another report looked at lead levels and children's health in America, suggesting that too little attention is being paid to questions of children's health. A paediatrician has argued that federal standards regarding prevention of lead in the home ignore existing scientific evidence and are unlikely to protect inner-city children from lead poisoning.\p
The effects of lead exposure are largely irreversible, yet authorities are accused of waiting until children are exposed before taking action. According to Bruce P. Lanphear, the proposed EPA standards set the allowable level of lead in floor dust too high, thereby failing to protect children, and there is limited data showing that level to be either safe or beneficial for the vast majority of children with elevated blood lead levels.\p
Lead toxicity, defined as a blood lead level of 10 micrograms per decilitre or higher, is estimated to affect one of every 20 children in the United States. Studies have demonstrated serious harmful and irreversible effects of low-level lead exposure on brain function, such as lowered intelligence and diminished school performance, especially from exposures that occur early in life.\p
Since 1980, average blood lead levels in children in the United States have fallen more than 90 percent, due largely to the elimination of lead from \Jgasoline\j (petrol) and dietary sources, although these are related, since a great deal of the lead in food originated in the fuel tanks of cars, being belched out of exhaust pipes to fall on nearby fields. Still, nearly 900,000 preschool children in the United States are believed to have a blood lead of 10 micrograms per decilitre or higher.\p
#
"Ladybird, ladybird, please stay at home",814,0,0,0
(Sep '98)
A \1ladybird,\c also called a ladybug, is a gardener's best friend, because the larvae and the adults all eat pests on plants. There is only one problem: the adult ladybirds are beetles, so they can fly, and all too often, they fly away from your garden in search of food somewhere else. A report in \INew Scientist\i at the end of the month describes a solution: French researchers have patented a way of breeding ladybirds that can't fly, which ensures that they stay around to eat the pests.\p
Predator control is less harmful to the environment than spraying, but flying predators can spread through the environment even faster than pesticide sprays. When an American species was introduced onto lime trees in the Netherlands, for instance, most of the 30 million adults flew away in just three weeks.\p
André Ferran and his colleagues at the National Institute of Agronomic Research in Antibes have now found a way to produce mutant multicoloured Asian ladybirds (\IHarmonia axyridis\i) which cannot fly. They used ionising radiation and a mutagen called \Jethyl\j methanesulphonate on a wild population, then selected the individuals which could not fly, and by selective breeding, developed a population of 95 per cent flightless ladybirds.\p
While \1mutants\c like this occur naturally among ladybirds in the wild, most of them have deformed wings, and so are useless for pest control; Ferran's mutants, though, have a normal wing morphology and reproduce normally. The intention is to introduce limited numbers onto crop plants and let them breed up where more food -- pests -- can be found. So far, it is unclear just what pests they will control, or whether they will be as efficient in the field as they have seemed in greenhouse tests so far.\p
#
"Snowballs may be figments after all",815,0,0,0
(Sep '98)
The "small comets" of Louis Frank (see Raindrops keep falling on my head, May 1997 and follow-ups in the following three months, and again in December 1997) are to be called in question. Announcements and preprints of two papers to appear in early October have circulated during September. The papers use Frank's own data to demonstrate that the "atmospheric holes" in \Jsatellite\j imagery are caused by instrument noise in the \Jspacecraft\j's own cameras, not by the presence of comets the size of a house bombarding the \JEarth\j's \Jatmosphere\j every few seconds. \p
One paper investigates the characteristics of the dark pixels in relation to the noise expected from the individual components of the two cameras, and shows that the dark pixels seen in the \Jsatellite\j data from both cameras are entirely consistent with instrumental noise. The second investigates the distribution of the dark pixels by altitude, and shows that there is no appreciable height dependence: the apparent size of the "comets" does not vary as the altitude of the observing \Jspacecraft\j changes.\p
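The logic of that second test is simple enough to sketch in a few lines of Python -- our illustration, with invented numbers, not the authors' analysis. Real objects of fixed physical size should look smaller from a higher orbit, while camera noise should not care about altitude at all:\p
import random

OBJECT_SIZE_KM = 0.01   # a "house-sized" comet, roughly 10 metres across

def real_object_size(altitude_km):
    # Angular size (radians) of a real object seen from a given altitude.
    return OBJECT_SIZE_KM / altitude_km

def noise_pixel_size(altitude_km):
    # A noise pixel's apparent "size" is set by the camera alone, so it is
    # the same at any altitude, apart from a little random scatter.
    return 0.001 + random.gauss(0, 0.0001)

random.seed(3)
for altitude in [600, 1200, 2400, 4800]:
    print(altitude, "km:",
          round(real_object_size(altitude), 6), "rad (real object) vs",
          round(noise_pixel_size(altitude), 6), "rad (noise)")
# The real object shrinks each time the altitude doubles; the noise stays
# flat -- and flat is what the dark pixels in Frank's data turned out to be.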
\IKey Names:\i Forrest S. Mozer, James P. McFadden \p
#
"Laima update",816,0,0,0
(Sep '98)
Laima, the trans-Atlantic pilotless plane, arrived back at the laboratories during September so researchers were finally able to measure the fuel and oil. The fuel remaining was just on 1 kg, so the total burn for the trip was 4 kg -- though the researchers say they used a postal scale which "makes the uncertainty in this figure 100 gm or so."\p
The oil remaining was 190 cc, meaning total loss of just 30 cc. Their initial belief was that there was zero oil remaining, but they say now that they were mistaken. It "is fairly dark around the tank, and the level was so high that the free surface could not be seen clearly."\p
#
"SOHO looking good",817,0,0,0
(Sep '98)
In early September, NASA and European Space Agency (ESA) engineers announced that they have not detected permanent damage to the Solar and Heliospheric Observatory (SOHO), which spun out of control and lost power after a series of ground control errors in June. By late September, they were able to report that the craft's thrusters were back in operation, and the probe was once again pointing at the \Jsun\j.\p
It will be late October before they know for sure whether excess cold or heat has damaged any of the telemetry, power, or control systems of the $1 billion craft. Getting control of the thrusters has been a delicate process, involving the use of electric heaters to thaw SOHO's frozen hydrazine propellant. Engineers first thawed the main tank and then warmed the pipes connecting the hydrazine tank to the thrusters outside. This had to go slowly, over about two weeks, since a quicker thaw could have burst the pipes, but in the end, everything worked, and the controllers were able to stop the craft's slow spin.\p
A NASA/ESA panel investigating the temporary loss of the probe released a report at the early September press conference confirming that human error, not mechanical failure, caused the \Jspacecraft\j's problems. The report blamed overloaded staff and data displays that were "not user-friendly", a problem that was recognised in 1994 but never solved.\p
#
"Richard Leakey rises again",818,0,0,0
(Sep '98)
At the end of September, \JKenya\j's President Daniel arap Moi reappointed one of his most prominent critics, anthropologist Richard Leakey, to the job of overseeing some of \JAfrica\j's best known wildlife parks.\p
Leakey had resigned as head of the \JKenya\j Wildlife Service (KWS) in 1994 after complaining of political interference by Moi's cronies, and the reappointment follows close on Moi's sacking, rehiring and resacking of Leakey's successor in the post, conservationist David Western.\p
Leakey says he has an assurance from Moi that the service will be insulated from political meddling, but most observers believe that the appointment has more to do with politics, and Moi's efforts to shore up his sagging regime and \JKenya\j's shattered economy. The service has an accumulated deficit of US $3.5 million, and declining tourist income, as well as problems with crime and poaching in the parks. It will not be an easy ride for Leakey.\p
#
"Jonathan Mann and Mary Lou Clements-Mann",819,0,0,0
(Sep '98)
As reported last month, this husband-and-wife \1AIDS\c research team was killed in the September 2 crash of a Swissair jet on its way to Geneva from New York. Mann was a highly-regarded leader who had to resign in 1990 as director of the \1World Health Organization\c Global Program on AIDS (GPA), due to internal wrangling and politics within WHO. He was totally dedicated to the fight against AIDS, resulting in his sometimes scathing criticisms of public health leaders, especially Hiroshi Nakajima, the former WHO chief, accused by Mann of lacking commitment to fighting AIDS. Nakajima was finally replaced by Dr. Gro Harlem Brundtland as Director-General of WHO, earlier this year.\p
At the time of his death, Mann held the post of Dean of Allegheny University School of Public Health in Philadelphia. Mary Lou Clements-Mann was working at Johns Hopkins University in Baltimore, where she was regarded as an expert on AIDS vaccine development. The two were travelling to attend a meeting in Geneva on AIDS vaccines convened by UNAIDS, the United Nations' special program on AIDS, which replaced the GPA in 1994. \p
UNAIDS executive director Peter Piot praised Mann as "a visionary global leader in the fight against AIDS" who "tirelessly promoted a response to the epidemic based on respect for human rights and human dignity." \p
Jonathan Mann was asked to set up the GPA in 1985, and until 1990, it grew rapidly under his energetic leadership. Nakajima forced him out of his office after his resignation, before he had completed his contractual term, and in 1993, Nakajima was re-elected for a second term, almost entirely on the votes of African and South American countries, leading to allegations that \JJapan\j had used trade and aid as a lever to "buy votes", an allegation which was predictably denied by \JJapan\j.\p
Mann did not cease to work and campaign during those years, but it is a matter of regret to all those who have met him and fallen under his spell (including your reporter) that his contributions to this struggle have been cut short. All deaths diminish us to some extent, but some deaths diminish the whole of humanity.\p
#
"October, 1998 Science Review",820,0,0,0
\JNobel Prize for Physiology or Medicine\j
\JNobel Prize for Chemistry\j
\JNobel Prize for Physics\j
\JNobel Prize for Economics\j
\JNobel Prize for Peace\j
\JNobel Prize for Literature\j
\JAnd now, the Ig-Nobels!\j
\JWhere are the scientists and engineers working?\j
\JScience education goes back a step?\j
\JMonsanto takes the blame\j
\JGenetic researchers ahead of target\j
\JChlamydia sequenced\j
\JGenetic control of malaria sought\j
\JSpeech and inheritance\j
\JDoes a virus cause autism?\j
\JDelivering p53\j
\JSuper-penicillin?\j
\JPioneer's odd movements explained\j
\JAnd today's weather on Neptune . . .\j
\JEuropa fly-by\j
\JAn odd brown dwarf\j
\JA new particle?\j
\JA new planet is born\j
\JAerogel and John Glenn\j
\JDIY MRI\j
\JStopping bullets with goat milk and spider's web\j
\JFake pots\j
\JReptile nests in Arizona\j
\JThe first Europeans?\j
\JKennewick man to be examined\j
\JWhitewater iguanas\j
\JHow well can a bat 'see' with sound?\j
\JThe smile on the face of the dinosaur\j
\JThe African landscape shaped by single magma plume\j
\JLayers in the earth's core\j
\JTermite controls\j
\JNitrates in streams may be coming from bedrock\j
\JKill the whales to save the otters?\j
\JNo butterfly effects around El Niño\j
\JGiant iceberg spotted\j
\JSheba free at last\j
\JLemur news for 1998\j
\JVampires and rabies\j
#
"Nobel Prize for Physiology or Medicine",821,0,0,0
(Oct '98)
The names of the winners of the \INobel Prizes\i for 1998 were released during October. The winners will receive their awards, an amount of 7.6 million Swedish crowns (not the 1.6 million crowns, as mistakenly reported last month), on December 10, the anniversary of the death of Alfred Nobel in 1896. The prize is equivalent to just under US$1 million, or about AUD$1.6 million.\p
\BNobel Prize for \JPhysiology\j or Medicine\b\p
This prize was awarded jointly to Robert F. Furchgott, Louis J. Ignarro and Ferid Murad for their discoveries concerning "nitric oxide as a signalling molecule in the cardiovascular system". In layman's terms, they have managed to uncover the fact that \Initric oxide\i, NO, can carry a signal from one cell to another, and they have explained how the signal is carried.\p
Enthusiastic press coverage made much of the fact that their discovery led directly to the development of the drug Viagra. The tabloid press, finding little else to report about the chemistry and physics prizes, seized on this.\p
Oddly enough, the work leading to the prize began with \Initroglycerine\i, the explosive that earned Alfred Nobel his vast wealth and set up the funding for the Nobel Prizes. \p
\JNitroglycerine\j has long been used to treat heart pain because it makes blood vessels dilate, meaning they expand to a larger diameter. Nobel made his fortune from dynamite, in which the explosion-prone \Jnitroglycerine\j is controlled by being absorbed in kieselguhr, a porous soil rich in shells of diatoms. The effects of \Jnitroglycerine\j were known even in Nobel's time, and when he was taken ill with \Jheart disease\j, his doctor prescribed \Jnitroglycerine\j. \p
Nobel declined the treatment, writing "It is ironical that I am now ordered by my physician to eat \Jnitroglycerine\j". But if the use of the explosive for heart problems was known 100 years ago, it is only now that its operation is fully understood, though unravelling the story has taken two decades so far.\p
In 1977, Ferid Murad showed that \Jnitroglycerine\j relaxes smooth muscle cells around the blood vessels by releasing a gas, nitric oxide, which is better known as a pollutant released by motor vehicles. Other drugs often had an effect on blood vessels as well, but the effects were often contradictory.\p
Some time later, Furchgott was studying the different effects of drugs which dilate blood vessels, and he concluded that \Jacetylcholine\j could only dilate a blood vessel if the cells lining its inner surface -- the endothelium -- were functioning. If the endothelium was undamaged, the blood vessels dilated, but when the endothelial cells were damaged, the blood vessels stayed as they had been.\p
Furchgott concluded that blood vessels are dilated because the endothelial cells produce an unknown signal molecule that makes vascular smooth muscle cells relax. He called this signal molecule EDRF, the endothelium-derived relaxing factor.\p
Murad believed that nitric oxide might also be involved in other hormonal actions, but the evidence was missing, and remained missing until Louis Ignarro became involved in a study aimed at identifying the chemical nature of EDRF. In 1986, he established that EDRF was identical to NO, both on his own, and also in collaboration with Furchgott.\p
They revealed their findings at a conference in July 1986, starting an avalanche of research activities in many different laboratories around the world, including the work which led to Viagra. This was the first discovery that a gas can act as a signal molecule in the organism, but now we have a much more complete view of the ways in which the gas is, or may be, used in living things. \p
From one point of view, nitric oxide may seem like an improbable chemical messenger, since it is unstable, and converted to \Jnitrate\j and \Jnitrite\j within 10 seconds. NO was known to be produced in \Jbacteria\j, but this simple molecule was not expected to be important in higher animals such as mammals. Yet looking at the matter logically, it seems inevitable that a messenger molecule would have this instability, since there needs to be some way of stopping the signal from being detected.\p
Among other things, we now know that NO acts as a signal molecule in the \Jnervous system\j, as a weapon against infections, as a regulator of blood pressure, and as a gate keeper of blood flow to different organs. NO is present in most living creatures and is made by many different types of cells. \p
The operation of NO in dilating blood vessels is comparatively simple. The NO is produced by the innermost cell layer of the arteries, the endothelium, and then it rapidly spreads through the cell membranes to the underlying muscle cells. Their contraction is turned off by NO, and the arteries dilate. Since the NO comes from the endothelium, this explains Furchgott's finding that the endothelium had to be intact if a drug was to have any effect at all.\p
Nitric oxide also plays a part in controlling \Jbacteria\j. When it is produced in white blood cells such as macrophages, huge quantities are produced and the NO becomes toxic to invading \Jbacteria\j and parasites. In extreme bacterial infection, this increased level of NO in the blood can lead to large-scale dilation, and a drop in blood pressure, the effect that we call shock. Armed with this knowledge, medical workers may be able to treat shock with drugs which act as NO inhibitors.\p
White blood cells also attack cancerous growths, and research is now under way to see if NO can be used to stop tumours, since the gas is able to induce programmed cell death, also called \Iapoptosis\i.\p
Intensive care patients may be able to be dosed with NO if they are suffering from dangerously high blood pressure in the lungs, a situation which sometimes happens with infants. Controlling the dosage is important here, as the gas can be toxic at high concentrations.\p
When people suffer hardening of the arteries, called \Jatherosclerosis\j, the endothelium has a reduced capacity to produce NO. In such a case, NO can be provided by treatment with \Jnitroglycerine\j, but now the search is on for more powerful and selective cardiac drugs, based on our new knowledge of NO as a signal molecule. \p
Then there is the "Viagra effect". Nitric oxide can initiate erection of the penis by dilating the blood vessels to the erectile bodies. Finally, there is the controversy: strict rules say that no more than three researchers can be rewarded for a discovery, and many scientists active in the field have criticised the committee for not recognising pharmacologist Salvador Moncada at University College London, who carried out key experiments in the chain of discovery.\p
#
"Nobel Prize for Chemistry",822,0,0,0
(Oct '98)
For their work in quantum computational chemistry, this prize went to Walter Kohn, University of \JCalifornia\j at Santa Barbara, USA and John A. Pople, Northwestern University, Evanston, Illinois, USA, a British citizen.\p
Using the laws of quantum physics, it is possible to understand and calculate how electrons and atomic nuclei interact to build up matter in all its forms. Quantum chemistry exploits this knowledge to describe the molecular system. The citation reads: "to Walter Kohn for his development of the density-functional theory and to John Pople for his development of computational methods in quantum chemistry". The two Nobel Laureates have each made pioneering contributions in developing methods for theoretical studies of the properties of molecules and the chemical processes in which they are involved.\p
One of the keys to understanding chemistry involves understanding how bonds between the atoms in molecules function. For the most part, this means being able to calculate the properties of molecules, and how they interact.\p
Once quantum mechanics developed as a branch of physics at the start of the 20th century, the potential was there for new methods to be developed, but applications within chemistry were long in coming. In simple terms, it was not practically possible to handle the complicated mathematical relations of quantum mechanics for such complex systems as molecules.\p
As far back as 1929, one of the "greats" of quantum physics, \IP. A. M. Dirac\i, wrote that "The fundamental laws necessary for the mathematical treatment of large parts of physics and the whole of chemistry are thus fully known, and the difficulty lies only in the fact that application of these laws leads to equations that are too complex to be solved". (An equation that Dirac regarded as "too complex" must have been \Iseriously\i complex!) \p
By the start of the 1960s, computers had developed enough power to deal with this complexity, so that quantum chemistry, the application of quantum mechanics to chemical problems, could begin. Now, more than thirty years later, all of chemistry is being revolutionised by this new field of science. Kohn's theoretical work provided a basis for simplifying the \Jmathematics\j used to describe the bonding of atoms. While this provided the starting point for many of today's calculations, Pople developed the entire quantum-chemical methodology which is now used in various branches of chemistry. \p
In simple terms, Kohn showed that, rather than trying to consider every \Jelectron\j in a calculation, chemists simply need to know the average number of electrons located at one point in space. This opened up the option of a much simpler method of calculation, the \Idensity-functional theory\i. Putting it another way, if the electrons' spatial distribution (\Ielectron density\i) is known, the total \Jenergy\j for a system described by the laws of quantum mechanics can be theoretically calculated. The question then becomes a matter of working out how the \Jenergy\j depends on the density. \p
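In symbols -- a standard textbook formulation, added here for reference, not anything quoted from the Nobel citation -- the Kohn-Sham version of the theory writes the total energy (in atomic units) as a functional of the electron density alone:\p
E[\rho] = T_s[\rho] + \int v_{\mathrm{ext}}(\mathbf{r})\,\rho(\mathbf{r})\,\mathrm{d}^3r + \frac{1}{2}\iint \frac{\rho(\mathbf{r})\,\rho(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,\mathrm{d}^3r\,\mathrm{d}^3r' + E_{\mathrm{xc}}[\rho]
Here T_s is the kinetic energy of a reference system of non-interacting electrons, the single integral is the attraction of the density to the nuclei, the double integral is the ordinary electrostatic repulsion of the electron cloud with itself, and E_xc is the "exchange-correlation" term that gathers up everything else and must be approximated.\p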
Kohn gave important clues based on what this dependence looked like in an imaginary system with free electrons. The advantage of this type of solution is that very large and complicated molecules can be tackled. This means that a researcher wanting to explain how an \Jenzyme\j works, can do so with a computer, once practical methods have been found for applying the theory, and this was John Pople's contribution. The first version of Pople's GAUSSIAN program was published in 1970, and it has since been developed and is now used by thousands of chemists in universities and commercial companies all over the world. Versions which have followed since have been progressively upgraded until the early 1990s, when Pople incorporated Kohn's density-functional theory into the most recent versions.\p
Pople took the fundamental laws of quantum mechanics, and used these to develop the practical methods which put Kohn's insight into effect. A computer can be given the details of a molecule or a \Jchemical reaction\j, and the computer can then proceed to describe the properties of the molecule, or how a \Jchemical reaction\j may take place.\p
\BThe uses of quantum chemistry\b\p
Quantum chemistry offers us quantitative information on molecules and their interactions, and the theory also affords deeper understanding of molecular processes that cannot be obtained from experiments alone. This allows chemists to combine theory and experimentation as they try to understand what is going on inside a molecule.\p
Take a comparatively simple example, the amino acid cysteine, which is made up of a carbon atom attached to an amino group (NH\D2\d), a carboxyl group (COOH), a \Jhydrogen\j atom, and a thiol-bearing methyl group (CH\D2\dSH). If these are plotted onto a computer screen from a simple menu system, a rough model can be created immediately. Then the computer can be instructed to calculate the \Jgeometry\j of the molecule with a quantum-chemical calculation.\p
The point to remember here is that the chemical properties of a molecule, especially in biochemistry, depend largely on the shape of a molecule, its size, and the charge distribution on it. So the computer can now give us a quick look at a rough result after a minute or so, or if we need high accuracy and high definition, it may take as long as a \Jday\j to find the detailed structure.\p
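As an illustration of how routine such calculations have become, here is a minimal Python sketch using the open-source PySCF package -- our choice of tool for illustration, not the GAUSSIAN program mentioned above -- computing the quantum-mechanical energy of a water molecule, chosen only because it is small; a cysteine input would follow exactly the same pattern, just with more atoms:\p
from pyscf import gto, scf

# Build the molecule: element symbols and rough coordinates in angstroms.
mol = gto.M(
    atom="""O  0.000  0.000  0.117
            H  0.000  0.757 -0.470
            H  0.000 -0.757 -0.470""",
    basis="sto-3g",   # a deliberately small basis set, for a fast rough answer
)

mf = scf.RHF(mol)     # restricted Hartree-Fock, the simplest standard method
energy = mf.kernel()  # iterate the self-consistent field equations
print("Total electronic energy:", round(energy, 6), "hartree")
Swapping the tiny basis set for a larger one, or the simple method for a more sophisticated one, is what turns the minute-long rough look into the day-long detailed answer described above.\p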
Given the same sort of information for a number of amino acids, we can begin to build up a highly detailed picture of an entire protein, so that the sort of modelling work carried out over many months by \IWatson\i and \ICrick\i when they were unravelling the structure of DNA, can now be done much faster.\p
Another application is found when astronomers try to identify the chemicals in clouds between the stars. While we can get some information from absorption spectra, the key details are in the radiation emitted by these space molecules as they rotate. This radiation tells us enough to identify the composition and appearance of the molecules, but this is an immensely difficult task, especially since the suspected molecules cannot always be produced in the laboratory to give us a basis for comparison and confirmation.\p
Using quantum chemistry methods, an assumed structure can be analysed, and the predicted radio emission frequencies can be directly compared with data collected by the radio \Jtelescope\j.\p
The technique can also be used to work out the way in which the \Jearth\j's \Jozone layer\j is attacked by various molecules, so all in all, the efforts of Walter Kohn and John Pople seem to be remarkably deserving of the recognition they have now achieved.\p
Walter Kohn was born in Vienna in 1923. He was a professor at the Carnegie Institute of Technology in \JPittsburgh\j, USA between 1950 and 1960 and at the University of \JCalifornia\j in San Diego from 1960 to 1979. He was Director of the Institute of Theoretical Physics in Santa Barbara, where he is still active, from 1979-1984. \p
John A. Pople was born in Burnham-on-Sea in Somerset, U.K. in 1925. He is a British citizen. He gained his Ph.D. in \JMathematics\j at Cambridge, U.K., in 1951. In 1964 he became Professor of Chemical Physics at Carnegie-Mellon University, \JPittsburgh\j, USA and subsequently Professor of Chemistry at Northwestern University, USA, in 1986, where he is still active.\p
#
"Nobel Prize for Physics",823,0,0,0
(Oct '98)
This prize went to Professor Robert B. Laughlin, Stanford University, \JCalifornia\j, USA, Professor Horst L. Störmer, Columbia University, New York and Bell Labs, New Jersey, USA, and Professor Daniel C. Tsui, Princeton University, Princeton, New Jersey, USA.\p
They are being awarded the Nobel Prize for discovering that electrons acting together in strong magnetic fields can form new types of "particles", with charges that are fractions of \Jelectron\j charges. The citation reads: "for their discovery of a new form of quantum fluid with fractionally charged excitations". In effect, what they did was to come up with a fascinating new aspect of the \IHall effect\i.\p
In 1982, Störmer and Tsui carried out an experiment with unusual results, at low temperatures, using extremely powerful magnetic fields. Within a year, Laughlin had explained their observations, providing an exciting new insight into the realm we call quantum physics.\p
While nobody has ever succeeded in dividing up the charge on an \Jelectron\j, the 1982 experiment seemed to point to that happening, as the crowds of electrons in a semiconducting solid "danced" together as if they carried those forbidden fractional charges. The puzzling result was dubbed the "fractional quantum Hall effect", and reported to the world.\p
Taking up the issue, Laughlin showed that the electrons in a powerful magnetic field can condense to form a kind of quantum fluid, related to the quantum fluids that occur in \Jsuperconductivity\j and in liquid \Jhelium\j. The key point is that the events which take place inside a drop of quantum fluid tell a much fuller story about the general inner structure and dynamics of matter than events happening in normal matter.\p
Here, we need to digress and consider the original, non-quantum, Hall effect, which was discovered in 1879. Edwin Hall found that if a thin gold plate is placed in a magnetic field at right angles to its surface, an electric current flowing along the plate can cause a potential drop at right angles both to the current and the magnetic field. The voltage drop happens because the charged electrons are moving in a magnetic field, which puts a force on them and deflects them sideways. The electrons try to \Jorbit\j around the field lines, which drives them across the conductor, so that charge builds up on one side, producing a voltage difference from one side of the conductor to the other.\p
In most cases the voltage drop is linearly proportional to the magnetic field for a given current, making the Hall effect a useful measuring tool. In fact, the Hall effect can be used to determine the density of charge carriers (negative electrons or positive holes) in conductors and semi-conductors. As a result, it has become a standard tool in physics laboratories around the world.\p
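For reference -- the standard textbook relation, our addition rather than anything from the citation -- the Hall voltage across a plate of thickness t carrying current I in a perpendicular magnetic field B is:\p
V_H = \frac{I B}{n q t}
where n is the number density of the charge carriers and q the charge on each. With I, B and t known, measuring V_H gives n directly, which is exactly why the effect makes such a convenient laboratory tool.\p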
\IKlaus von Klitzing\i, a German physicist and 1985 Nobel laureate, showed in 1980 that there was a quantum Hall effect. He found this in a "gas" of highly mobile electrons trapped between two layers of semiconductor in a strong magnetic field. \p
As he increased the strength of the magnetic field, von Klitzing found the voltage drop across the layer increased in steps rather than steadily. Clearly, these steps are derived from the electrons' quantum mechanical nature.\p
Just as the electrons in an atom are restricted to certain energies, so, too, are the \Jelectron\j orbits around the magnetic field lines limited to certain sizes. Increasing the magnetic field strength had no effect on the \Jorbit\j sizes until a critical level was reached; then a new permissible \Jorbit\j opened up, changing the way the electrons drift.\p
So now we have the Hall effect, and the quantum Hall effect. In 1982, Störmer and Tsui found that when there are even greater \Jelectron\j mobilities and stronger magnetic fields, the resistance steps developed subdivisions, giving us the fractional quantum Hall effect.\p
According to Laughlin, this was because the fixed sizes of the orbits lead to the formation of a kind of incompressible quantum fluid of electrons and magnetic fields. As this fluid is fed \Jenergy\j -- as it becomes excited, in the language of the physicists -- waves develop, and the interaction of these waves produces behaviours quite different from those of either the particles or the fields alone: the fluid performs like a collection of "quasiparticles" which appear to have fractional charge.\p
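In the usual notation -- again a textbook summary we add for reference -- each plateau in the Hall resistance sits at:\p
R_H = \frac{h}{\nu e^2}
where h is Planck's constant and e the electron charge. In von Klitzing's experiment the filling factor \nu is a whole number; the subdivisions found by Störmer and Tsui correspond to simple fractions such as \nu = 1/3 or 2/5, and Laughlin's quasiparticles at \nu = 1/3 behave as though they carry charge e/3.\p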
#
"Nobel Prize for Economics",824,0,0,0
(Oct '98)
There is no Nobel Prize for economics in the strict sense, but the 1998 Nobel Memorial Prize in Economic Science has been awarded to an Indian scholar who pioneered the theory behind the economics of poverty. The citation praises 64-year-old \IAmartya Kumar Sen\i for his contributions to the field of welfare economics and for restoring "an ethical dimension to the discussion of vital economic problems".\p
Sen earned a doctorate from Cambridge in 1959, and has spent three decades on problems ranging from how individuals make spending choices to how poverty levels are best calculated. His best-known work, and presumably the major work that led to his prize, is found in his fresh look at the economics of \Jfamine\j.\p
In studies of disasters in India, \JBangladesh\j, \JEthiopia\j, and Saharan \JAfrica\j in the 1970s, Sen challenged the conventional view that famines are almost always caused by food shortages. Instead, he showed that other factors, such as declining wages and rising food prices caused by bad \Jweather\j, influence the distribution of food and aggravate \Jfamine\j conditions for the poorest people.\p
Sen has excelled in applying theory to practical problems such as his poverty work, where he developed new ways of calculating how many people fall below a nation's poverty line and how far below the line they are. This, of course, is essential information for policy-makers as they devise solutions to economic woes: if economics is the "dismal science", at least it can now quantify some of the most dismal aspects, allowing them to be undone.\p
#
"Nobel Prize for Peace",825,0,0,0
(Oct '98)
This prize was awarded to John Hume and David Trimble for their work in bringing about something close to peace in \INorthern Ireland\i, culminating in the peace agreement signed on Good Friday, 1998. A number of other leaders were also recognised in general under the heading: "other Northern Irish leaders, and . . . the governments of Great Britain, Ireland and the United States".\p
#
"Nobel Prize for Literature",826,0,0,0
(Oct '98)
As mentioned last month, Jose Saramago of \JPortugal\j, a novelist and poet who lives in the Canary Islands, is the first Portuguese language author to receive the Nobel Prize for Literature. \p
#
"And now, the Ig-Nobels!",827,0,0,0
(Oct '98)
While the Nobel Prizes celebrate the best and brightest humans, the Ig-Nobel awards confer a somewhat more dubious honour on their winners. Some of the prizes are given because the work involved sounds funny-to-hilarious, like Peter Fong's work on feeding Prozac to clams (see \BCalm clams?\b, March 1998), or the work of another researcher in making a suit of armour to wear while engaging in close contact with grizzly bears. Other work is listed for recognition because the Ig-Nobel group (closely related to the \IAnnals of Improbable Research\i) regard the work as suspect.\p
You can normally judge which work falls into which category by looking at the responses of the winners. Troy Hurtubise was proud of his safety \Jengineering\j prize for designing a high-tech suit of armour to wear during encounters with grizzly bears, and appreciated the way in which the committee was particularly impressed by Hurtubise's thorough testing methods, which included throwing himself off a cliff while wearing the armour, being smacked by \Jbaseball\j bats, bashed by a truck at 65 kilometres an hour, and blasted with an AK-47 at a 3 metre distance. \p
A previous winner of the Ig-Nobel chemistry award took out his second Ig-Nobel for claiming not only that \Jwater\j has memory but also that this information can be transmitted via e-mail. Not to be outdone, the La Jolla, \JCalifornia\j, New Age guru Deepak Chopra took out the physics prize for his interpretation of quantum physics, "as it applies to life, liberty, and the pursuit of economic happiness".\p
#
"Where are the scientists and engineers working?",828,0,0,0
(Oct '98)
According to a survey released by the US National Science Foundation, a majority of graduates in those disciplines are actually working in non-science/\Jengineering\j positions. Those working outside their field of training outnumbered those working in the science/\Jengineering\j field by 2 to 1 in 1995.\p
The US had a science and \Jengineering\j (S&E) workforce of nearly 3.2 million in 1995, with 2.6 million or 83% of them having their highest qualifications in the S&E field. But another 4.7 million people with their highest qualification in that area were working in other, non-S&E occupations.\p
Engineers made up 42% of the workforce, while mathematicians and computer scientists made up 30%, with life scientists and social scientists each making up about 10%, and physical scientists about 9%. More than half the S&E degree holders employed in non-S&E occupations were in fields such as management/administration, sales and marketing, and non-S&E-related teaching.\p
Most of these said that their work was at least partly related to their degree. And the good news: S&E degree holders had less than half the US average level of unemployment.\p
#
"Science education goes back a step?",829,0,0,0
(Oct '98)
The single state of \JCalifornia\j has a large enough education budget to influence the form and content of textbooks across America, and indirectly, in much of the rest of the English-speaking world. So science educators were more than a little alarmed to learn in early October that Californian third graders will in future be taught about the \Jperiodic table\j, while 6th graders will have to cope with "lithospheric plates". That, says the state Board of Education, is how it is going to be.\p
The decision was made in a unanimous vote, in spite of a flurry of lobbying and letter-writing by professional scientists and science educators, who argued mainly that the decision targeted factual knowledge too heavily, with no attention to concepts.\p
According to Bruce Alberts, president of the US National Academy of Sciences (NAS), the Californian scheme does not match the national standards put forward by the NAS in 1996, and may in fact force teachers to concentrate harder on factual material, at the expense of more in-depth learning activities that would give students a better understanding of the scientific process.\p
While the standards had some (fairly weak) support, the November election of Gray Davis, a Democrat, as Californian state governor, may see the standards scrapped.\p
#
"Monsanto takes the blame",830,0,0,0
(Oct '98)
According to a story in \INew Scientist\i in late October, Monsanto is being blamed by other biotechnology firms for the bad press that genetic \Jengineering\j is getting at the moment. Monsanto, they say, has triggered a consumer backlash that could cripple the prospects for genetically engineered food in Europe. \p
In 1997, people in Europe learned that stocks of Monsanto's \Jherbicide\j-resistant Roundup Ready \Isoya bean\i had been shipped to Europe mixed with ordinary soya. Since then, consumer acceptance of engineered food has collapsed throughout Europe, as consumers interpreted the move as a ploy to force transgenic soya down European throats. \p
Monsanto counters that it was farmers, not the chemical giant, who chose not to keep the stocks separate. That being said, the company concedes that it was wrong when it assumed that the calm US acceptance of the new soybean would be matched in Europe. All the same, whoever made the decision, the whole "transgenic" industry is bearing the consequences.\p
While there is no evidence that "genetically engineered" crops are any more (or less) dangerous than other specially bred strains, there is an element of "\JFrankenstein\j fear", which has led other major players to label their products, allowing consumers to opt not to use them if they wish. There is also a more legitimate concern that the Roundup Ready beans may contain larger amounts of residues from the \Jherbicide\j Roundup, making the beans potentially more dangerous.\p
One result has been that British retailers have been buying more soya from \JBrazil\j and Argentina, two countries where the modified strains have been banned, but \JBrazil\j has recently approved the use of Roundup Ready soya beans, and Monsanto aims to capture 20 per cent of the Brazilian market within three years.\p
#
"Genetic researchers ahead of target",831,0,0,0
(Oct '98)
The US Department of \JEnergy\j's Joint \JGenome\j Institute (JGI) reported during October that it came out ahead of its ambitious goal of sequencing 20 million base pairs for fiscal year 1998. In fact, the researchers were delighted to report an unprecedented ten-fold increase in production output over the previous year.\p
The JGI, established in 1996, is a consortium of scientists, engineers and support staff from the Lawrence Berkeley, Lawrence Livermore, and Los Alamos National Laboratories. \p
The \IHuman \JGenome\j project\i involves an international effort to determine all 3 billion base pairs that comprise the human \Jgenome\j. At the 1998 rate, it would take 150 years for the JGI alone to complete the entire sequence, but other laboratories are also involved, and the pace is still accelerating: in 1999, the JGI plans to sequence another 70 million base pairs, 30 million finished bases and 40 million "draft" bases. By the year 2000, they hope to break the "100 million barrier". All of this builds on an existing database standing at some 195 million base pairs at the end of October 1998 -- about 7% of the total.\p
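That 150-year figure is a straightforward piece of arithmetic:\p
\frac{3 \times 10^{9}\ \text{base pairs}}{20 \times 10^{6}\ \text{base pairs per year}} = 150\ \text{years}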
The first step in sequencing involves building a low-resolution "map" which captures features along vast stretches of DNA, from thousands to millions of individual units. The next step involves methodical sequencing of portions of that map.\p
While the first high-quality set of human \Jgenome\j sequences was expected to be completed by 2005, experts now predict that they will reach this target by 2003. There should be a "working draft" of the genomic sequence by the year 2001 and a highly accurate and comprehensive "finished product" by 2003. \p
The release of GeneMap '98 on the \JInternet\j during October is yet another sign of progress in unravelling the human \Jgenome\j. This compilation highlights the "gene-rich" regions of the chromosomes, showing the regions which will best reward sequencing efforts.\p
The new map identifies the positions of more than thirty thousand genes of the estimated sixty to eighty thousand that scientists expect to find. It can be found at http://www.ncbi.nlm.nih.gov/genemap. The site, which has been in existence since 1996, has gained some three million "hits" in that time.\p
The updated version is likewise expected to attract numerous scientists, physicians and others interested in \Jgenetics\j. A special section for the public, entitled "Genes and Disease", is featured on the website. \p
#
"Chlamydia sequenced",832,0,0,0
(Oct '98)
The entire \Jgenome\j of \IChlamydia trachomatis\i, the bacterium described as the leading cause of venereal disease in the United States, was published in the journal \IScience\i in late October. Nearly 4 million new cases of \Ichlamydia\i are reported in the US each year, affecting almost 2% of the population, and costing some US$2 billion a year.\p
Untreated, the infection can cause \Iurethritis\i and pelvic inflammatory disease, leading in some cases to ectopic (tubal) \Jpregnancy\j, and sterility in some women. In \JAfrica\j and Asia, the disease is transmitted more by hand-eye contact among children than by sexual intimacy, and it is a leading cause of \Jblindness\j when it triggers off \Itrachoma\i. Chlamydial infections also increase a person's vulnerability to HIV infection.\p
\IChlamydia\i only grows inside another cell, rather like a virus, so it cannot be cultured in the laboratory like other \Jbacteria\j, making the task of vaccine production harder: the sequence of 1,042,519 base pairs may serve to assist in developing a vaccine, as the sequence also reveals details of the surface proteins on the bacterial cell.\p
Curiously, this procaryotic bacterium turns out to have around twenty genes which are typical of eucaryotic life, suggesting that, over time, it has "borrowed" genes from its host or hosts. Other sequenced \Jbacteria\j have either had no eucaryotic genes, or many fewer, suggesting that the \Ichlamydia\i \Jgenus\j has been associated with eucaryotic cells over very long evolutionary periods.\p
One of the questions which fascinates biologists is the pathway that led to a parasitic existence. Here, the sequence gives us some hints, because the eucaryotic genes are like plant genes, suggesting that the organism developed first as a parasite of plants.\p
Another interesting finding is that, contrary to the belief of many biologists, \Ichlamydia\i can make \Iadenosine triphosphate\i (ATP), the basic fuel for every cell. Up until now, biologists have believed that the parasitic \Ichlamydia\i had to live inside a eucaryotic cell so as to get ATP. There is clearly something that the cell needs, but for now, we can rule out ATP as that certain "something".\p
\BKey names:\b Richard S. Stephens, Ronald Davis, web site: http://www.stdgen.lanl.gov\p
#
"Genetic control of malaria sought",833,0,0,0
(Oct '98)
With \Imalaria\i making a comeback all over the world, a number of researchers are chasing after novel strategies to control the disease. One solution may be to find a weak point in the genes of the mosquitoes which spread the disease. \p
A mosquito is infected when it bites an infected human or other host. Of the hundreds of species of mosquito, only about 70 are able to support the parasite over its two-week life cycle inside the insect. The rest are somehow able to resist the parasite, and this ability is most probably coded in the genes of the mosquito. \p
There are eight drugs commercially available to treat \Jmalaria\j, and parasites have been found which resist each of these drugs. At the same time, many of the carrier mosquitoes are developing resistance to the normal control methods. To make matters worse, many of the countries where \Jmalaria\j is on the rise are also countries where there is considerable social breakdown.\p
Shirley Luckhart of Virginia Tech is one of the experimenters searching for a solution right now. Describing her work on the \JInternet\j during October, she said that she believed the answer was to develop transgenic mosquitoes which are incapable of transmitting the parasite. The first step is to identify the genes that affect the life cycle of the parasite. Next, she will need to find ways to manipulate the gene and ways to introduce the modified gene into a population of mosquitoes.\p
According to Luckhart, the immune response in mosquitoes involves \Initric oxide\i (see Nobel Prize in Medicine or \JPhysiology\j, this month), and humans also produce nitric oxide in response to malarial infections. While the nitric oxide kills some of the parasites, others survive to be transmitted back into a new host. Enough nitric oxide to kill all the parasites may well be enough to kill the mosquito as well, but perhaps, she says, it may be possible to change the timing, so that the nitric oxide is produced earlier.\p
The gene for the \Jenzyme\j controlling nitric oxide production in mosquitoes is similar to the human gene, so the next question would seem to be unravelling the signal that turns on NO production in mosquitoes. In the longer run, she speculates that the methods which work against malarial parasites may also work on other pathogens such as viruses. Like most good science, there is an interesting question to be answered, but what it will lead to is anybody's guess at this stage.\p
Watch out for the news, next month, of the complete \Jgenome\j of \IPlasmodium falciparum\i, which was announced in early November.\p
#
"Speech and inheritance",834,0,0,0
(Oct '98)
The world's largest study of language acquisition in twins suggests that speech acquisition is under strong genetic control among children at the low end of the scale of development. In other words, where children are in the bottom 5% of the range for learning to speak, the ability to develop speech seems to be largely controlled by their genes.\p
The study, published in \INature Neuroscience\i, looked at more than 3000 pairs of twins aged two years, across England and Wales. In general, twins, whether they were identical or fraternal, scored very similarly on language acquisition. But where one twin scored in the bottom 5%, an identical twin sibling had an 81% probability of also falling in that 5% range, while a fraternal twin sibling had only a 42% probability of also falling in that range. The key to interpreting this information is that identical twins have the same genetic make-up, while fraternal twins have only a 50% common inheritance.\p
For most children the important factor is "nurture", their environment, while "nature" (their genes) plays a much smaller part. It is only in the bottom range that the genes come into effect as the major cause of variation, and this is why it is important to distinguish between the two types of twin. The contribution of the genes may be even greater, if we make the reasonable assumption that parents treat twins, whether they are fraternal or identical, in the same way. And given that the 42% probability for fraternal twins holds up whether they are of the same or opposite sexes, it appears that boys and girls are treated similarly.\p
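One way to make the twin comparison concrete is Falconer's classic rule of thumb, which estimates heritability as twice the difference between identical and fraternal twin concordances. The published study may well use a more sophisticated model; the Python sketch below is purely illustrative, plugging in the two percentages quoted above.\p
# Falconer's rough heritability estimate from twin concordances
# (illustrative only; not necessarily the model used in the study)
c_identical = 0.81   # identical co-twin also in the bottom 5%
c_fraternal = 0.42   # fraternal co-twin also in the bottom 5%

heritability = 2 * (c_identical - c_fraternal)
print(heritability)  # 0.78: at the low end, most variation looks genetic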
Yet while there is an apparent genetic link to a delay in learning language for some children, the researchers emphasise the need to work on improving the environment in which children learn language. \p
The study uses a subset of the 7756 pairs of twins born in England and Wales in 1994: 1044 pairs of identical twins, 1006 pairs of same-sex fraternal twins, and 989 pairs of opposite-sex twins. The study involved parents checking the words used by each twin, ticking off words on a representative sample of 100 words commonly used by two-year-olds. Some two-year-olds will already use all of these test words; across the sample, the average number used was 48, with an average of just 4.2 words per child among the lowest 5% of the sample. Overall, 61 children produced no recognisable words.\p
The aim of the program is to identify, as early as possible, the half of the slow developers who will later show signs of language problems; the remainder will develop normally. But if the delay is partly genetic, there may be less that can be done for those children at risk.\p
\BKey names:\b Robert Plomin, Philip Dale\p
#
"Does a virus cause autism?",835,0,0,0
(Oct '98)
\IAutism\i is a developmental disorder which affects brain function, interfering with reasoning ability, imagination, communication, and social interaction. Autistic children start talking later than other children, and when they do speak, their communication skills are limited.\p
According to a report in the October issue of \IClinical \JImmunology\j and Immunopathology\i, there is now evidence to suggest that the condition may be partly caused by an autoimmune response to a viral infection. When this happens, the body's normal immune defences go wrong, so that they attack the body's own cells, rather than attacking the invading pathogens which set off the reaction.\p
Researchers found that autistic children who had been exposed to certain viruses (\Jmeasles\j and human herpes virus-6) in the past showed unusually high levels of \Jantibodies\j to brain proteins, suggesting an autoimmune response. The \Jantibodies\j are anti-MBP and anti-NAFP, which react to \Jmyelin\j basic protein and neuron-axon filament protein.\p
While the virus antibody levels in autistic children are similar to those in other children, there was a close association between the levels of viral \Jantibodies\j in the autistic children and the levels of brain \Jantibodies\j, and none of the non-autistic subjects had brain autoantibodies. The strongest link was between \Jmeasles\j \Jantibodies\j and anti-MBP.\p
One controversial aspect: parents of autistic children have claimed in the past that the first signs of autism came soon after the children were immunised with \Jmeasles\j-\Jmumps\j-rubella (MMR) or \Jdiphtheria\j-pertussis-tetanus (DPT) vaccines, but no scientific studies have shown a link between vaccines and autism. Almost all the autistic children in the study had been given MMR immunisations, and none had ever been diagnosed with a case of \Jmeasles\j. This is an area which will need to be closely investigated, as it is possible that the children were infected by \Jmeasles\j, but never developed the symptoms of the disease. Right now, the jury is still out.\p
#
"Delivering p53",836,0,0,0
(Oct '98)
The p53 gene works in our bodies as a general-purpose housekeeper, repairing damaged cells, or destroying them if they are beyond repair. Many solid tumours are caused by p53 mutations, and more than half of all ovarian cancers involve p53 mutations.\p
Now a \JTexas\j research group is looking at using the p53 gene, inserted into an inactive common-cold virus to attack ovarian cancer in a patient. Ovarian cancer patients have large amounts of abdominal fluids, and during an operation to withdraw this fluid, a saline mixture of the engineered virus can be introduced into the \Jabdomen\j.\p
This delivers the operative p53 gene to all cells, including the cancerous ones. Once inside, the housekeeper goes to work: when it recognises the cancer cell's DNA damage, the p53 gene "condemns it to death", in a process called \Iapoptosis\i.\p
A trial is under way to assess the best mix of engineered viruses, the sequence of doses, the length of treatment and the manner in which the patient reacts to it. \p
One participant commented that the worst she has suffered so far is something like a bad case of the flu, but with the benefit that her hair did not fall out, as had been the case when she had \Jchemotherapy\j.\p
\BKey names:\b Carolyn Muller, Robert Coleman \p
The same cleverly engineered adenovirus is also being injected directly into head and neck cancers in another study. The patients in this study have exhausted all other options. As with the ovarian cancers, more than half of all head and neck cancers are linked to mutations in the p53 gene.\p
#
"Super-penicillin?",837,0,0,0
(Oct '98)
The first \Jbacteria\j with the power to resist the threat of \Ipenicillin\i were reported just a few years after \Jpenicillin\j was brought into use. Over the years, germs have grown more and more capable of withstanding \Jpenicillin\j, but now the medical scientists are fighting back.\p
In an October Web edition of \IThe Journal of Organic Chemistry\i, published by the American Chemical Society, new work was reported on altering \Jpenicillin\j's structure and making it effective against antibiotic-resistant strains of \Jbacteria\j.\p
The researchers say that their prototype structure offers a "checkmate stratagem" against resistant \Jbacteria\j. Researchers at the University of Limerick, led by Timothy Smyth, have looked at the way in which \Jbacteria\j resist \Jpenicillin\j. The first step is for the bacterium to develop an \Ienzyme\i which splits the \Jpenicillin\j molecule, so it becomes ineffective. The standard response is to build a chemical into the administered drug which attacks the bacterial \Jenzyme\j, but at least some \Jbacteria\j have developed resistance to this as well.\p
The Limerick solution is to build a unique fragment into the \Jpenicillin\j, so that when the bacterium tries to cleave the \Jpenicillin\j molecule, the result is a fragment which kills the \Jbacteria\j anyway. Meanwhile, normal \Jbacteria\j are attacked by the \Jpenicillin\j in the normal way. This, say the researchers, is a first step, and while they have some distance to go before they can offer a successful drug, they believe they are onto a winner.\p
#
"Pioneer's odd movements explained",838,0,0,0
(Oct '98)
In the far reaches of the \Jsolar system\j, the two Pioneer \Jspacecraft\j have been moving in odd ways, slowing more than gravity alone can explain, leading some scientists to speculate that a strange force was acting on them, perhaps even a force unknown to science. Einsteinian relativity might need updating, suggested some . . .\p
Now it looks as though the solution is nothing more than a tiny gas leak in the control system. As valves on the \Jspacecraft\j open and close to adjust the craft's spin and orientation, there have been some anomalous speed changes which tie in with the gas flows, suggesting that gas leaks are braking the craft's progress. It is therefore likely that the remaining slow-down is caused by other, more constant, gas leaks.\p
As well, the European "Ulysses" craft is slowing down in an anomalous way, and is known to be "leaking all over the place", probably because of sticking valves. The effect is tiny but measurable, and could be accounted for by a leak of just grams of gas per year.\p
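It is easy to see why so small a leak would be enough. In the Python sketch below, the anomalous deceleration, the spacecraft mass and the speed of the escaping gas are all round figures assumed for illustration; none of them is taken from the studies mentioned here.\p
# Order-of-magnitude estimate: gas needed to brake a distant spacecraft
accel = 8e-10       # anomalous deceleration in m/s^2 (assumed round figure)
mass = 240.0        # spacecraft mass in kg (assumed round figure)
gas_speed = 300.0   # speed of the escaping gas in m/s (assumed round figure)

force = mass * accel              # braking thrust required, in newtons
mass_flow = force / gas_speed     # kilograms of gas vented per second
print(mass_flow * 3.15e7 * 1000)  # roughly 20 grams over a year's seconds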
#
"And today's weather on Neptune . . .",839,0,0,0
(Oct '98)
It makes sense for a \Jplanet\j close to the \Jsun\j like our \JEarth\j to have \Jweather\j. All that solar \Jenergy\j has to go somewhere, but for a \Jplanet\j as far away as Neptune, where the \Jsun\j's \Jenergy\j levels are reduced to just 1/900 of what they are at the \JEarth\j's \Jorbit\j, there simply should not \Ibe\i any \Jweather\j.\p
Theory often turns out to be less than accurate, and this theory is no different. Observations from powerful ground-based and space-based telescopes have revealed some of the wildest and weirdest \Jweather\j in the \Jsolar system\j, and it is all on Neptune. Simultaneous observations of Neptune from the Hubble Space \JTelescope\j and NASA's Infrared \JTelescope\j Facility on Mauna Kea in Hawaii have captured images of monster storms and equatorial winds of 1,500 kph (900 mph).\p
The images have been used to build a time-lapse picture of Neptune's \Jweather\j, and the results are puzzling scientists. Putting it bluntly, the \Jweather\j there seems to run on no \Jenergy\j at all. Some of the clouds appear to be higher than others, and from this, observers are now building up a picture of the wind speeds involved. Interestingly, while the planetary probe \IVoyager\i detected the Great Dark Spot, a pulsating feature nearly the size of the \JEarth\j itself, in 1989, Hubble observations two years ago showed that the spot had disappeared and that another, smaller spot had emerged. The new spot appears to be trapped at a fixed latitude, and may even be declining in intensity.\p
There are distinct bands of \Jweather\j that run parallel to the Neptunian equator, and scientists are now speculating that these may be similar to the equatorial region of the \JEarth\j where tropical heat provides abundant \Jenergy\j to make clouds. Some regions of latitude seem to generate bright clouds consistently, but there is no clear explanation for this.\p
\BKey names:\b Lawrence Sromowsky, Pat Fry, Sanjay Limaye, Kevin Baines and Timothy Dowling\p
#
"Europa fly-by",840,0,0,0
(Oct '98)
On September 25, 1998, the \Jspacecraft\j \JGalileo\j completed a close-up fly-by of \IEuropa\i on schedule and on target. In early October, information was received from \JGalileo\j's skim-past, at just 3624 km (2226 miles) above Jupiter's icy moon. The fly-by was performed in cruise mode, without \JGalileo\j's gyroscopes, because the gyros had activated a fault protection program a week earlier.\p
This meant using the craft's on-board scanner to maintain \JGalileo\j's orientation, but the flight has now been declared a success. Scientists are increasingly confident that Europa has a deep, underground (or under-ice) ocean of liquid \Jwater\j, where life may be found. The surface of Europa is around -160°C (-260°F), but deeper down, the tidal friction generated by Jupiter and the other moons could be keeping large parts liquid. Similar tidal friction is generally believed to be behind the volcanic activity on Europa's neighbour Io.\p
Images of Europa from the \JGalileo\j \Jspacecraft\j reveal a complicated terrain of grooved linear ridges and crustal plates which seem to have broken apart and rafted into new positions, supporting the belief that there is a subsurface layer of \Jwater\j or slush. Further images were retrieved during October, and on into November. There will be another fly-by of Europa on November 22, 1998.\p
We already know that the strangest places on \JEarth\j carry hardy microbes called the \Iarchaebacteria\i (or "Archaea"), and scientists have started to assume that places like the hypothetical seas of Europa may carry similar microbes, a belief that seems to be strengthened by the knowledge that there is a lake of \Jwater\j, deep under the ice in \JAntarctica\j, which may be as much as a million years old.\p
The Archaea have their genetic material floating freely through the cell like procaryotes, but they are otherwise genetically more like the eucaryotes of \Jearth\j. But as we have seen this month (\BThe smile on the face of the \Jdinosaur\j\b), scientists often look for the familiar in unfamiliar surroundings, and it is unlikely that Europa ever offered the conditions which led to the Archaea evolving on \Jearth\j. All the same, the conditions on Europa seem to be in the range which might have supported the \Jevolution\j of living forms. Any life which may one \Jday\j be found in Lake \JVostok\j certainly did not evolve there in the last half-million or million years.\p
The lake was found by seismic surveys in 1996, and a hole is slowly being drilled down to the liquid: the most recent samples came from just 100 metres above the liquid, which means 3600 metres (more than two miles) into the ice. But Lake \JVostok\j is at least reachable by human operators: NASA is looking at a possible Europa orbiter which might use radar soundings to find what is below the ice. This project might launch in 2003, and is seen as a preliminary survey for a later trip to send undersea explorers into the Europan oceans.\p
Watch for the update on further picture analysis and the November 22 fly-by in next month's science updates.\p
#
"An odd brown dwarf",841,0,0,0
(Oct '98)
German astronomers at the Max Planck Institute for Extraterrestrial Physics and at the European Southern Observatory reported in \IScience\i in early October that they had achieved the first X-ray detection of a \I\Jbrown dwarf\j\i, and at the same time, found the youngest \Jbrown dwarf\j known so far.\p
The \Jbrown dwarf\j, called Cha Ha 1, is a very young member of the \IChamaeleon\i I dark \Jcloud\j (Cha I), a star-forming region located 550 light-years away from the \JEarth\j.\p
\JBrown dwarf\js, with a maximum mass of around 7.5% that of our \Jsun\j, which is about eighty times the mass of Jupiter, are obviously in the mass range between stars and planets. With such a mass, they cannot reach the \Jtemperature\j and pressure at their centres needed to get nuclear reactions going.\p
\JBrown dwarf\js do not even have enough mass to collapse under their own weight, so they merely fade away, and because they are of low luminosity at the best of times, they are very hard to detect, so that the first confirmed \Jbrown dwarf\j was only identified in 1995. Even now, only about a dozen \Jbrown dwarf\js are known.\p
The glow that comes from a \Jbrown dwarf\j is generated as it shrinks, releasing gravitational potential \Jenergy\j. This makes brown dwarfs, in their own dim world, comparatively bright and hot when they are young, which is why astronomers have been searching in areas where stars are forming, like Chamaeleon I. Cha Ha 1, from the data gathered so far, has a mass of only 4-5% of the \JSun\j's, and an age of one million years. The next youngest \Jbrown dwarf\j seen so far is estimated to be between three and ten million years old.\p
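These percentages are easier to picture in Jupiter masses. The conversion below assumes the textbook value of roughly 1,050 Jupiter masses to one solar mass; that ratio is not a figure taken from the report itself.\p
# Converting brown dwarf masses from solar units to Jupiter units
SUN_IN_JUPITERS = 1050   # Sun/Jupiter mass ratio (assumed textbook value)

print(0.075 * SUN_IN_JUPITERS)   # about 79: the brown dwarf upper limit
print(0.045 * SUN_IN_JUPITERS)   # about 47: mid-range estimate for Cha Ha 1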
#
"A new particle?",842,0,0,0
(Oct '98)
Theory places an upper limit on the \Jenergy\j of \Icosmic rays\i arriving from more than 60 million light-years away. Called the Greisen-Zatsepin-Kuzmin (GZK) cutoff, this theoretical result holds that cosmic rays above a certain \Jenergy\j cannot travel more than about 60 million light-years, because the protons that form them steadily lose \Jenergy\j as they collide with photons which have littered the cosmos since the Big Bang.\p
A theory is only as good as the data that surrounds it, and in late October, physicist Glennys Farrar suggested in \IPhysical Review Letters\i that the very highest \Jenergy\j cosmic rays appear to come from extremely distant quasars. With astronomer Peter Bierman, Farrar has been studying the available data on the five most energetic cosmic rays ever detected on \JEarth\j.\p
The researchers have been able to retrace the trajectory of each of the five cosmic rays, using data collected by the Fly's Eye detector in \JUtah\j, England's Haverah Park detector and \JJapan\j's AGASA shower array. The points of origin in each case coincide with the location of a compact radio loud \Jquasar\j. These quasars are among the most powerful sources of \Jenergy\j in the universe, but the nearest of these quasars is 4-5 billion light years away. The furthest is 13-16 billion light years away.\p
So if Farrar is right, either the GZK cutoff is somehow being violated, or there is a need for a new assumption, and that is what Farrar has suggested. The rays travel as if they have suffered no interactions, either with particles or with intergalactic magnetic fields, suggesting that they are not made of protons. Farrar's proposed alternative is a possible new particle, the S0, a neutral particle made up of three quarks (up, down and strange) bound together with a gluino.\p
A neutral particle would be less likely to interact with the big-bang photons, and its predicted mass, around 2-3 times that of a proton, would also help the particle resist \Jenergy\j loss.\p
The \Jcorrelation\j between the directions of these cosmic rays and the position of these unusual quasars is extremely unlikely to be a chance event. If any future energetic cosmic rays prove to come from such sources, then the case will be even stronger: right now, there is only one chance in 200 that the \Jcorrelation\j happened by chance.\p
#
"A new planet is born",843,0,0,0
(Oct '98)
The busy Hubble Space \JTelescope\j has spotted dust coalescing in discs around stars in Orion, and astronomer Larry Esposito and his graduate student Henry Throop believe that they may be witnessing the birth of new planets. They looked at dust discs against the bright background of the Orion nebula, and an infrared camera revealed that the rings around three stars contain dust particles at least 10 micrometres across. \p
This is nearly 100 times the size of interstellar dust in the region, and suggests that the region is at a critical phase in planetary \Jevolution\j. Stars form when interstellar dust clouds collapse, but some of the dust remains in \Jorbit\j around the new star, and the dust particles, which are a fraction of a micrometre across, should clump together into \Jmillimetre\j-wide particles within 10,000 years. Over tens of millions of years these particles can stick together to form planets.\p
Throop announced the results at a meeting of the American Astronomical Society's Division for Planetary Sciences during October, and pointed out that as many as 30% of the young stars in the Orion nebula appear to have discs, making it seem as though \Jplanet\j formation may be relatively common. Another brief October report in the journal \INew Scientist\i indicates that Helen Walker of the Rutherford Appleton Laboratory has evidence from the Infrared Space Observatory of even larger dust particles, 200 micrometres across, around the star Vega.\p
#
"Aerogel and John Glenn",844,0,0,0
(Oct '98)
A fairy tale of science was created in late October when veteran \Jastronaut\j John Glenn returned to space, 36 years after his first flight, and came safely home again at the end of the month. While presented as a scientific study of the effects of space flight on older people, the trip also served as a convenient vehicle to showcase NASA.\p
Among other activities, Glenn and the other astronauts were scheduled to make a first attempt at space-manufacture of a product called aerogel. Aerogel is the lightest known solid, so much like air that it is sometimes called "frozen smoke": it is only three times as dense as air. Its insulating properties are nothing short of remarkable, protecting virtually anything from heat or cold. The hope was that aerogel, when formed in zero gravity, might be a fundamentally different material, and the name has already been chosen: "astrogel".\p
Existing aerogel offers good \Jinsulation\j, but its use in windows and skylights is limited because it is not transparent, having a slight blue haze to it. Space-manufactured aerogel, on the other hand, should be more transparent, and might be useable in place of window glass. \JEarth\j-formed aerogel contains tiny, irregular pores that make it hard to see through, but these irregularities should be smaller in zero gravity.\p
Aerogel has an index of \Jrefraction\j in the range from 1.0 to 1.05, and it was first used to insulate the Sojourner rover during the Mars Pathfinder mission. The first \Jsilica\j aerogels were manufactured in space in April 1996 on a Conquest rocket. A 2.5 cm (1 inch) pane of aerogel has the same insulating power as 32 panes of ordinary glass.\p
#
"DIY MRI",845,0,0,0
(Oct '98)
The sort of magnet that you need for \Imagnetic \Jresonance\j imaging\i (MRI) in a \Jhospital\j is rather superior to the magnet you use to hold your memos on a filing cabinet or \Jrefrigerator\j. These are the sorts of magnets that strip the data from your credit cards from the other side of the room, and snatch the screwdriver you are holding from your hands at a distance of a metre.\p
That, at least, was the picture in the past: now it looks as though do-it-yourself MRI may one \Jday\j be a possibility, given a report in \IPhysical Review Letters\i in late October. It said that high-quality images, of at least some parts of the body, such as the lungs, can be made using a magnet 10 times weaker than those that pin papers to refrigerators. Well, maybe "DIY" is a bit of an exaggeration, but the advance could well lead to cheaper and more portable MRI devices in the future.\p
Traditional MRI involves feeding the patient into a giant superconducting magnet with a field some 20,000 times stronger than \JEarth\j's. This field aligns the nuclei of \Jhydrogen\j atoms inside the body, leaving the atoms spinning like tiny synchronised bar magnets. A brief electromagnetic pulse starts the spinning atoms wobbling like a run-down top. If they are then tweaked by other fields, investigators can map the density of \Jwater\j molecules, revealing the structure of internal tissues and organs.\p
The new method involves first aligning the atomic nuclei in rubidium gas with a \Jlaser\j, then allowing the rubidium atoms to mix with \Jhelium\j-3 gas (that is, normal \Jhelium\j minus a neutron). As the atoms collide, some of the alignment is passed across to the \Jhelium\j atoms, after which time the rubidium atoms (which are toxic) can be removed from the gas.\p
\JHelium\j is an inert gas, and this means that the \Jhelium\j atoms can remain aligned for minutes at a time, even without any magnetic field. Then, at least in one of the tests carried out so far, the aligned \Jhelium\j gas can be used to inflate an organ such as a rat's lung. The result was an image with a resolution of about one \Jmillimetre\j, using a magnetic field just 40 times stronger than \JEarth\j's.\p
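To put those comparisons into everyday units, the sketch below assumes the textbook figure of roughly 50 microtesla for the Earth's field at the surface; neither resulting tesla value appears in the report itself.\p
# Converting the quoted field comparisons into tesla
EARTH_FIELD = 5e-5   # Earth's surface field in tesla (assumed textbook value)

print(20_000 * EARTH_FIELD)   # 1.0 T: a conventional superconducting MRI magnet
print(40 * EARTH_FIELD)       # 0.002 T: the field used for the rat-lung image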
The next stage could see the \Jhelium\j gas replaced with \Jxenon\j. \JXenon\j is absorbed into the bloodstream in much the same way as oxygen is absorbed, allowing detailed and cheap images of arteries and the brain.\p
#
"Stopping bullets with goat milk and spider's web",846,0,0,0
(Oct '98)
"Biosteel", made from the milk of genetically engineered goats, may protect people from bullets in the next century. It may even become the green alternative to the high-strength plastics used to package shampoos, or to make commercial fishing nets, according to an October report in \INew Scientist\i.\p
The material is biodegradable, meaning that it would need to be sealed in an inert container when it is used in critical applications such as body armour or \Jspacecraft\j, because \Jbacteria\j could get in and digest it. The first goats carrying the \Jspider\j gene for the protein in their mammary cells are expected to start producing the protein in the next few months. Nobody has yet made a fabric of the fibre, but the promoters believe it shows a great deal of promise.\p
\JSpider\j web protein needs to be almost invisible, yet strong enough to hold a large insect, stopping it dead when it hits the web. The \Jspider\j's silk is made of a rock-solid protein, capable of making many bonds with its neighbours. After the \Jspider\j has spun the web, the silk dries and pulls tight, so the proteins become a nearly crystalline and completely insoluble cable. Natural \Jspider\j silk can be stronger and more elastic than high-tensile steel or the \IKevlar\i commonly found in today's body armour, making \Jspider\j silk an ideal product for use where steel or Kevlar is used now.\p
The problem lies in making the protein. Spiders do not make enough, but if the \Jspider\j's genes are inserted into \Jbacteria\j, the germs will make the protein, but it forms a disordered, insoluble mess. The protein's ability to link with its neighbours turns out to be its undoing.\p
Luckily, mammals make milk proteins in much the same way that spiders make silk proteins. In each case, the proteins are produced in skin-like epithelial cells, then held in a space, or lumen, where shear stresses on the protein are minimised. With luck, goats with the proteins will produce similar material to \Jspider\j's web.\p
#
"Fake pots",847,0,0,0
(Oct '98)
The 18th century was a time when decorative pottery became more common in Britain, with exports to many other parts of the world. Many of these pieces are now highly regarded as "collectible", especially in the US and \JAustralia\j, where there are strong historical ties to the later part of the century.\p
But are these antiques all that they appear? A \Jgeology\j student in Georgia believes, based on \Jelectron\j microscopy of sherds found at archaeological sites in the US and Canada, that many pieces of "18th century \IStaffordshire\i-style earthenware" may have been fakes, imitations passed off to unwary colonial North Americans as the real thing.\p
Michael Douglas has been looking at the \Joxides\j used in the pots, and believes that cheap knock-offs were being made and passed off as real Staffordshire pottery. Manufacturers would usually keep their production processes secret, so that in the absence of a maker's mark, the \Joxides\j present should give a hint of the maker's identity. But there seem to be some remarkable variations, Douglas told the Geological Society of America in Toronto during October.\p
Fragments of Staffordshire-type earthenware are common in digs from the north American colonial period, and they are useful as a starting point in dating and describing a site. The main difference between products was in the concentrations of the metal \Joxides\j used to achieve the same colour in the pots of different makers. In some cases, pieces apparently from the same maker showed the same sort of variation, and while there may be other explanations, a degree of fakery seems to be a reasonable suspicion to harbour. Even in those rather primitive times, scientifically speaking, quality control should not have been a problem when it came to selecting \Joxides\j to produce a particular colour.\p
Staffordshire pottery comes from the English Midlands, an area where fine clays are common. Colours on these pots generally come from \Jcobalt\j for blue, and tin for green. Douglas found that the proportions used varied considerably, supporting the view that there was some dirty work afoot.\p
#
"Reptile nests in Arizona",848,0,0,0
(Oct '98)
The same meeting of the Geological Society of America in Toronto was told about the discovery of scores of ancient \Jreptile\j nests in \JArizona\j's Petrified Forest National Park. The nests, dated to 220 million years ago, are said to be the oldest such nests ever found. In one step, the find has doubled the \Jfossil\j history of \Jreptile\j nests: the previous oldest finds were sea turtle nests dated to 110 million years ago and hadrosaur nests dating back 90 million years.\p
The find consists of 62 bowl-like depressions in \Jsandstone\j, formed above the waterline on the shore of an ancient river. The depressions are around 30 to 45 cm (12 to 18 inches) across, similar to modern \Jcrocodile\j and turtle nests. They seem to have been made by hole-nesting reptiles such as phytosaurs (primitive, \Jcrocodile\j-like animals), aetosaurs (armoured reptiles from that period) or possibly ancient turtles.\p
No eggs or egg shells have been found in the depressions, but there appear to be body impressions above some of them, and some of the \Jfossil\j chambers contain what appear to be scratch marks from digging activity and egg impressions.\p
\BKey names:\b Stephen Hasiotis and Anthony Martin\p
#
"The first Europeans?",849,0,0,0
(Oct '98)
Human remains found in a cave called Gran Dolina, in the Atapuerca Mountains of northern \JSpain\j, were dated in 1995 to around 780,000 years ago. The remains may have been direct ancestors of modern humans, or a dead end like the Neanderthals. No \Jhominid\j fossils or tools older than 500,000 years had ever been discovered in Europe before the Gran Dolina find, yet Josep Pares, an expert on palaeomagnetic dating, remains convinced that his date is accurate.\p
Pares described his work at the Geological Society of America meeting in late October. He dated the Gran Dolina fossils by measuring the orientation of magnetic minerals in the rock layer in which the fossils were found. The magnetic minerals behave like tiny compass needles, pointing toward the direction of the \JEarth\j's magnetic field at the time the rock was formed.\p
While the \JEarth\j's magnetic field points north these days, we now know that in the past it has "flipped" and changed direction every so often. The last major switch from south to north, which scientists call the Matuyama/Brunhes reversal, took place about 780,000 years ago. The Gran Dolina fossils were found below the Matuyama/Brunhes boundary, making them older than that reversal.\p
#
"Kennewick man to be examined",850,0,0,0
(Oct '98)
Who owns "Kennewick Man"? Once the matter was unimportant, because this apparently European skeleton was assumed to be quite recent. Views changed suddenly when dating showed the remains to be 9300 years old. The skeleton is now at the centre of a legal wrangle between the Umatilla tribe of Oregon, a group claiming the skeleton as an "ancient Viking", the U.S. Army Corps of Engineers, which had \Jjurisdiction\j over the site where the remains were discovered in 1996, and scientists.\p
Because the skeleton has obvious European features (a narrow skull, a light-boned face, and a receding forehead), it is of considerable interest to scientists studying the origins of humanity in the Americas, but the Umatilla tribe want the remains buried without any further study. At the end of October, the remains were transferred to the Burke Museum of Natural History and Culture on the University of Washington campus in Seattle for "a private inventory that is expected to take about nine hours."\p
U.S. Magistrate John Jelderks ordered the transfer, and what are described as non-invasive tests to be performed at the Burke. These are to be conducted by an independent scientist, and are intended to help determine whether the remains are American Indian. This is, to say the least, regrettable, as the best available test would involve gathering mitochondrial DNA from tooth enamel, and that would have to be regarded as an invasive test.\p
The probability that any descendants of Kennewick Man still live anywhere in the area where the remains were found is almost as remote as the remains being those of an ancient Viking. While the present Umatilla people may feel strongly about the matter, their descendants will not thank those who sought to destroy evidence that belongs to all Native Americans, a source of information about the origins of all of them. With luck, extreme political correctness, fuelled by ignorance, will not win the \Jday\j.\p
#
"Whitewater iguanas",851,0,0,0
(Oct '98)
How do large animals get from one land mass to another? \ICharles Darwin\i was interested in this problem, recording spiders coming aboard H. M. S. \IBeagle\i, far from the nearest land. But while spiders can parachute, and some animals can fly, reptiles and land mammals do not have access to the same modes of transport.\p
The standard theory has been that animals trapped by a flood may take shelter in tangled masses of vegetation as it washes down a river. Then, at the mercy of wind and currents, some of the animals may cling on long enough to make a landfall somewhere else. But while "rafts" have been seen forming, and have been seen at sea, complete with inhabitants, no case of animals travelling long distances had ever been observed, until now.\p
A report in \INature\i during October describes the adventures of a group of at least 15 green \Iiguanas\i, \IIguana \Jiguana\j\i, who travelled from \JGuadeloupe\j to \JAnguilla\j on a mat of vegetation after hurricanes Luis and Marilyn in the Caribbean in 1995. The journey, 300 km (200 miles) long, was carried out on a mat of logs and uprooted trees, and its arrival was seen by local fishermen who reported seeing iguanas on both the beach and the logs in the bay. Now, more than two years later, the iguanas are still flourishing in their new home, where they had never previously been seen.\p
\BKey names:\b Ellen Censky, Karim Hodge, Judy Dudley\p
#
"How well can a bat 'see' with sound?",852,0,0,0
(Oct '98)
Insect-eating bats hunt down their prey by \Iecholocation\i, making high-pitched sounds, and homing in on the echoes from those sounds. This has been known for many years, but now comes the news that this system is amazingly accurate.\p
A report in the \IProceedings of the National Academy of Sciences\i during October describes research on an insect-eating bat, \IEptesicus fuscus\i, which can process overlapping echoes that arrive just 2 microseconds (two millionths of a second) apart. Translated, this means that bats can use their \Jecholocation\j system to distinguish objects which are just 0.3 millimetres apart, about the width of a pen line drawn on paper.\p
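The conversion from a 2-microsecond delay to a 0.3-millimetre spacing is simple, once you remember that an echo travels out and back, so any difference in range is covered twice. The Python sketch below assumes the standard textbook speed of sound in air, about 343 metres per second, a figure not quoted in the report.\p
# From echo delay to spatial resolution
SPEED_OF_SOUND = 343.0   # m/s in air (assumed textbook value)
delay = 2e-6             # seconds between overlapping echoes

path_difference = SPEED_OF_SOUND * delay   # extra distance the later echo travels
print(path_difference / 2 * 1000)          # about 0.34 mm, halved for the round trip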
Bats "see" their environment by listening to the echoes from their \Jsonar\j, sending out high-frequency sound waves and registering returning echoes from surrounding objects, whether those are buildings or insects. Depending on its size, an object returns a varying spectrum of reflected echoes. The bat uses this spectrum to achieve fine time resolution.\p
The experiments involved big brown bats, which sat on a Y-shaped elevated platform and made their normal \Jsonar\j sounds. These were picked up by microphones to their right and left, then played back after a delay as altered artificial echoes from loudspeakers. The controlled echoes were provided either as single or double sounds with various two-point spacings. A bat's task was to decide whether the electronic echoes varied in delay or were stationary in delay from one broadcast to the next. \p
The researchers rewarded the bat with a mealworm when it correctly identified the "correct origin" by moving toward the direction of variable echoes rather than the other side, where the echoes were regularly spaced.\p
The next step was to test the limits of the bat's powers of discrimination by playing the echoes back closer and closer together. Eventually, the animal could no longer distinguish between them, signalling that the limit of discrimination had been reached.\p
By way of comparison, the best human-made sonars can discriminate delays of between 5 and 10 microseconds at best. It now appears that bats can form \Jsonar\j "images" which are of a higher quality, and suited to a wider variety of orientation tasks than just catching insects. It follows that the bat's brain must be rather more complex than scientists had assumed, pointing to an interesting direction for further research.\p
The discovery has also interested a number of naval researchers in the US, given the emphasis on submarine warfare in the US Navy, coupled with the use of dolphins' \Jsonar\j and artificial \Jsonar\j to identify objects such as mines.\p
\BKey names:\b James Simmons, Michael Ferragamo, Cynthia Moss\p
#
"The smile on the face of the dinosaur",853,0,0,0
(Oct '98)
Mr Spielberg's model makers and graphic artists are not going to like it, but it appears they may have got the dinosaurs wrong. According to Lawrence Witmer, \ITyrannosaurus rex\i probably didn't have lips and \Itriceratops\i most likely didn't have cheeks. Witmer thinks that the finding may be less important to palaeontologists than it would be to museum preparators, toy manufacturers, movie set designers and artists whose recreations of dinosaurs now seem to be inaccurate.\p
While this would not be the first time that a \Jdinosaur\j reconstruction has been revised, it is a major change because of the popularity of the dinosaurs being revised. Witmer dropped his bombshell at the annual meeting of America's Society for Vertebrate Paleontology in early October in \JUtah\j when he described his research, based on high-tech scanning of \Jdinosaur\j fossils and dissection of their modern-\Jday\j relatives. He told the society that his method may be a better, more accurate way to rebuild dinosaurs using basic comparative anatomy. \p
What has happened, he suggests, is that artists who are reconstructing extinct animals draw something that "looks right" because it resembles animals we see today.\p
Palaeontologists have equally fallen into the analogy trap, assuming that a member of the \Iornithischia\i, the size of a sheep and eating plants, will have the same muscular cheeks as a sheep. Both \Itriceratops\i and \Ileptoceratops\i have been assumed to have muscular cheeks, partly from their rough equivalence with sheep, but more importantly because \Jfossil\j skulls reveal features on their jaw bones that require explanation: "excavated areas on the upper and lower jaws resulting in the teeth being set in from the surface of the skull".\p
Since the presence of cheeks would explain this jaw structure, scientists' claim that ornithischians had cheeks was strengthened, but Witmer believes that these jaw features supported an extended beak, similar to the beaks on eagles or crocodiles.\p
In a similar way, Witmer questions the evidence for the tyrannosaurs' lips. But while lips may matter to toy and movie makers, their presence or absence makes no difference to the ecological role played by the tyrannosaurs as large and vicious hunters, lipped or unlipped.\p
Using CT scans of \Jfossil\j \Jdinosaur\j material and dissections of fresh bird and \Jcrocodile\j material, Witmer says he has come up with an alternative interpretation, but points out that the method can be tested, and so has within it the possibility of being proved wrong. "I like to think it can survive testing, but at least it can be tested", says Witmer in a press release.\p
#
"The African landscape shaped by single magma plume",854,0,0,0
(Oct '98)
The structure of \JAfrica\j is a continuing fascination for geologists (see \BWhy is South \JAfrica\j so high?\b, last month), but some of the most interesting parts of all are further north than our September focus. The slopes of snowy Mount \JKilimanjaro\j, the Great Rift Valley that was the cradle of humanity, and the burning deserts of \JEthiopia\j: all of these demand an explanation.\p
A report in \INature\i during October seems to provide the required explanation. At the conclusion of a geophysical modelling project, two researchers believe that all these, and many of the other striking geological features found in north and central \JAfrica\j, are the result of a single giant plume of \Jmagma\j which rose up from \JEarth\j's mantle about 45 million years ago and which is still present today. \p
If this model is confirmed, it will strengthen the hand of those \Jearth\j scientists who believe that large \Jmagma\j plumes have played a major role in creating many of the \Jplanet\j's outstanding geological features. This model sees a world in which giant plumes of hot liquid rock rise up from the molten mantle to within 15-150 km (10 to 100 miles) of \JEarth\j's surface to form "hot spots". \JLava\j melted from this material rises up every so often and drives a hole through the \Jlithosphere\j, the upper layers of \JEarth\j that behave like a solid.\p
When a rigid tectonic plate above such a hot spot moves, the resulting episodes of vulcanism occur at different places, and evidence for this can be found in a sequence of extinct volcanoes in \JAustralia\j covering more than 10 degrees of latitude from \JQueensland\j to Victoria, and in the formation of the Hawaiian islands. In each case, the pattern is consistent with a tectonic plate sliding over the top of a stationary plume. The African plume, according to the report, is comparable in size with the Hawaiian one. Some geologists do not accept the hot spot/plume model, but this single theory seems to explain a wide range of observations.\p
Still, ask the critics, how does one plume influence geological events over a land-mass as large as the continental United States or \JAustralia\j? The answer, say the plume supporters, lies in the hot \Jmagma\j being channelled under the \Jlithosphere\j, sometimes for 2000 kilometres, before it bursts through. Right now, the plume is believed (by its supporters) to be still active, and located near the southern border between \JEthiopia\j and \JSudan\j, but the seismic network that might locate it more accurately is unavailable in that poverty-stricken corner of the world. And until that network is available, the critics of the plume theory will be on reasonable ground as they look for other explanations of the observations.\p
\BKey names:\b Norman H. Sleep and Cindy J. Ebinger\p
#
"Layers in the earth's core",855,0,0,0
(Oct '98)
Our view of the \IEarth's structure\i may need to be slightly revised. A report in \IScience\i at the end of October reveals that the \JEarth\j has a core with upper and lower regions having different material properties, and is not a uniform crystal of iron, as scientists had thought. These findings will probably change the way scientists explain the origins of the \JEarth\j's magnetic field.\p
The new model is derived from \Jearthquake\j data, and it shows that the 2500 km-wide (1500 mile) sphere of solid iron, rotating in an outer core of liquid iron, has two distinct parts: a lower region surrounded by a thin, uneven upper layer with different material properties. Seismic data from 11 historic earthquakes were used to infer the new structure. The researchers gathered German data on \Jearthquake\j waves travelling west to east from earthquakes around \JFiji\j and other South \JPacific islands\j, while waves travelling south to north from South America were measured in \JAlaska\j and Canada.\p
The key result was the discovery that seismic waves travel faster from south to north than from east to west, because the iron crystals in the \JEarth\j's inner core are arranged in a way that allows waves to move faster in one direction than the other. Even so, the waves from certain earthquakes reached some northern seismographs before others. Computer analysis of the data indicated that there had to be a transition point 200 km (120 miles) below the surface of the inner core. Above the transition layer, the inner core is \Jisotropic\j, meaning that waves can travel equally fast in any direction, but below the transition layer, the inner core is anisotropic, a label used to describe material which has different properties in different directions. In this case, the anisotropy takes the form of waves being hindered more in some directions than in others.\p
\BKey names:\b Xiaodong Song and Don V. Helmberger \p
#
"Termite controls",856,0,0,0
(Oct '98)
In \JAustralia\j, houses are often placed at risk by a native \Itermite\i (or "white ant") called \ICoptotermes acinaciformis\i. This termite burrows long distances through soil, looking for more timber to eat, often finding its way to fences or buildings. In the US, another member of the \Icoptotermes\i \Jgenus\j, \IC. formosanus\i, has been posing a similar threat since it was introduced into the US in war materiel being returned after World War II.\p
It took a long time for the Formosan termites to become a problem, but they are now a significant problem in southern port cities, particularly New Orleans and Lake Charles, Louisiana; Charleston, South Carolina; San Diego, \JCalifornia\j; and \JHonolulu\j, Hawaii.\p
Now a new baiting system may mean an end to the threat. Developed by Dr. Gregg Henderson and Dr. Jian Chen of the Louisiana State University, the bait system lures termites into a feeding chamber and then entices them into a second chamber that contains toxin-laced material, which the invaders carry back to their nest to kill the entire colony.\p
The baiting system consists of a plastic tube containing baited cardboard, sealed behind paper. The researchers have been "conditioning" the unbaited end by placing termites inside. The tube is then placed near a mud-walled shelter tunnel which the termites use for travel between their colony and food sources. The tunnels are easier to find than the nest itself, which may be a long way off and difficult to locate.\p
The conditioned termites will find their way into the tunnels, laying down a scent trail that will lead other termites back to the food source, where they will break down the paper barrier and carry the bait back into the nest. The actual poison is a \Jchitin\j inhibitor, which makes it safe around humans and other animals which do not have any chitinous parts; a moulting termite, however, will be unable to form an exoskeleton, its outer shell, and so it will eventually die, though the action is slow.\p
The one drawback is that the inhibitor may reduce the size of the nest to a point where the termites no longer reach the bait, so the system may offer more of a control than an elimination, but even that will be welcome to Americans faced with choosing between the destruction of their homes or the use of dangerous poisons.\p
#
"Nitrates in streams may be coming from bedrock",857,0,0,0
(Oct '98)
A major \Iconservation\i problem in bodies of fresh \Jwater\j comes from the influence of high \Jnitrate\j levels. High \Jnitrate\j levels can cause massive algal blooms that rob surface waters of oxygen and lead to large fish kills. As well, high \Jnitrate\j levels in drinking \Jwater\j have been linked to some human cancers and also to an infant blood disorder commonly known as "blue baby syndrome".\p
The blame for these \Jnitrate\j levels has been directed at a variety of causes: atmospheric emissions, livestock feeding, agricultural runoff, timber harvest and even industrial discharges, but now it appears that part of the problem may be due to nitrates being leached out of local bedrock. A report in \INature\i in late October describes a study of a watershed in \JCalifornia\j's Sierra Nevada mountain range, southeast of Sacramento. The aim of the work was to pinpoint the primary source of elevated \Jnitrate\j levels in downstream reservoirs.\p
Samples from 35 streams showed varying \Jnitrate\j levels, and a careful geological study revealed that there appears to be a direct \Jcorrelation\j between high levels of stream-\Jwater\j \Jnitrate\j and the presence of \Jnitrogen\j-rich bedrock. The highest concentrations of stream-\Jwater\j \Jnitrate\j occurred in the lower portion of the watershed, an area dominated by metavolcanic and metasedimentary rocks such as phyllite, slate, biotite schist, breccia and greenstone. These rocks were found to contain a significant amount of \Jnitrogen\j that could be released as the rocks weathered.\p
Higher up in the watershed, where the rocks were mainly diorite and granite, both low in \Jnitrogen\j, the stream-\Jwater\j \Jnitrogen\j levels were lower. They also found \Jnitrogen\j peaks in early fall (autumn) and winter, suggesting that early rains flush out \Jnitrogen\j which has weathered from the rocks during the previous summer. \p
This rules out the suspicion that the \Jnitrate\j levels were somehow related to \Jforestry\j activities in the upper part of the watershed, although one possible solution may be to engage in careful \Jforestry\j lower down. The researchers suggest that the native oaks, once found in the lower Mokelumne River watershed, could be restored. The oaks have deep root systems which can draw up the \Jnitrogen\j released into the soil by weathering of the rocks. When the oak wood or leaves fall to the ground, they will eventually be burned in periodic fires, returning the \Jnitrogen\j to the \Jatmosphere\j.\p
\BKey names:\b JoAnn Holloway, Randy Dahlgren \p
#
"Kill the whales to save the otters?",858,0,0,0
(Oct '98)
Early November saw Japanese "scientific whaling" fleets setting forth to plunder Antarctic whales in the only form of science where you get to eat the evidence. The 1996-97 catch of 2000 tonnes was derived from around 500 whales; it was wholesaled for 3.5 billion yen, and probably retailed for three times that amount.\p
Science has not been kind to the pretence of "scientific whaling", with DNA testing showing that some of the meat on sale came from totally protected species of whale, and not from the \IMinke whale\i that they claim to be studying all the way to the bank.\p
Yet it could be argued that whales may be in need of some sort of control, sooner or later, but only if they are killer whales. There has been a sudden drop in the numbers of Steller sea lions and harbour seals in western Alaskan waters, possibly because fisheries have been declining. As a result, killer whales are believed to be eating large numbers of sea otters, according to a \IScience\i report in mid-October. Jim Estes, one of the biologists involved, says "we estimate that between 40,000 and 45,000 sea otters have died since 1990 from \Ikiller whale\i predation in roughly 3,300 kilometers of shoreline". \p
Killer whales' usual diet is mainly Steller sea lions and \Iharbour seals\i taken in the ocean, but the numbers of these animals have dropped. The cause is uncertain, but is usually assumed to be related to over-fishing, as both species eat fish. The problem may also be a change in the make-up of fish species, caused partly by human fisheries, compounded by rising ocean temperatures and a drop in the number of large whales, but whatever the cause, the killer whales have switched to coastal predation on sea otters.\p
A drop in the number of sea otters means there are more \Isea urchins\i flourishing in the \Ikelp\i forests along the coast, as the otters normally eat these animals. This in turn means that the kelp is taking a battering from the plague of sea urchins. With each killer whale eating up to five sea otters each \Jday\j (based on the whales' known \Jenergy\j needs), the ecological balance is unlikely to be restored for some time - a little arithmetic, sketched below, shows why. On balance, it would probably be better if "scientific harvesting" were kept to a minimum.\p
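Here is that check - our own back-of-envelope arithmetic, using only the figures quoted above, so the numbers should be read as rough assumptions rather than the researchers' model:\p
```python
# Back-of-envelope check (our own arithmetic, not the researchers' model):
# how many killer whales, feeding on nothing but sea otters at the quoted
# rate, would account for the reported losses?
otters_lost = 40_000          # lower bound of the 40,000-45,000 estimate
years = 8                     # roughly 1990 to 1998
otters_per_whale_per_day = 5  # from the whales' known energy needs, as above

whales_needed = otters_lost / (years * 365 * otters_per_whale_per_day)
print(f"about {whales_needed:.1f} whales")   # about 2.7 whales
```
In other words, only a handful of whales need to have made the switch for the effect on the otter population to be dramatic.\p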
Other typical causes of die-offs such as this can be ruled out by the lack of dead bodies washed up on the shore, which is what usually happens when toxins, disease or starvation are involved. The increase in sea urchin numbers also indicates that starvation is an unlikely cause.\p
As well, about a dozen killer whale attacks on sea otters have been witnessed since 1991, although killer whales had often been seen near sea otters over the previous twenty years without attacks ever taking place. Significantly, while sea \Jotter\j populations have fallen 76% in Kuluk Bay in the west-central Aleutian archipelago, sea \Jotter\j numbers held steady from 1993 to 1997 in Clam Lagoon, an adjacent area uniquely inaccessible to the whales. Tagged otters have provided evidence that marked animals do not move between the two areas.\p
The problem is that the sea \Jotter\j is a "keystone species", the top predator in a three-level food web made up of the sea \Jotter\j, sea urchin and kelp. Just as taking the keystone from an arch brings the whole arch down, taking out the keystone species brings the whole \Jecosystem\j down. Kelp densities have dropped twelve-fold in some cases.\p
In contrast, the killer whale is a top predator in the oceanic ecosystems. By shifting to the sea \Jotter\j as a food source, the whale makes a fourth level in the sea \Jotter\j's coastal system, and upsets the balance in the coastal food web.\p
\BKey names:\b Jim Estes, Terrie Williams, Tim Tinker and Daniel Doak\p
#
"No butterfly effects around El Ni±o",859,0,0,0
(Oct '98)
For more than thirty years, chaos theory and the "butterfly effect" of \IEdward Lorenz\i have been taken to rule out \Jweather\j forecasting over any long period. We might be able to predict \Jweather\j for up to a week, but in the longer term, the effects of a Brazilian butterfly flapping its wings could be magnified into a major storm somewhere, and the standard view has been that there was no way to make scientific and valid long-term \Jweather\j forecasts.\p
That view has been questioned by a report in \IScience\i during October, which argues that fluctuations in the \Jearth\j's climate from year to year, such as those associated with \IEl Niño\i, are considerably more predictable than had previously been believed. It appears there are important exceptions to the butterfly effect.\p
Research by J. Shukla (no first name is given) and his colleagues has shown that, although \Jweather\j cannot be predicted more than a few days ahead, atmospheric circulation and precipitation, averaged over an entire season, are potentially predictable. According to Shukla, there is predictability in the midst of chaos, so that the large-scale effects of all future major El Niño events should be predictable several months in advance.\p
Tropical seasonal averages can be predicted because the tropical \Jatmosphere\j responds directly to slowly varying conditions at the \Jearth\j's surface. Shukla and his colleagues say they have run models of the global climate to show that seasonal mean \Jweather\j conditions are determined by sea surface \Jtemperature\j, soil wetness, vegetation and snow cover. Most importantly, variations in sea surface \Jtemperature\j, such as those associated with El Niño, can significantly alter \Jweather\j in the tropics for an entire season, or longer.\p
#
"Giant iceberg spotted",860,0,0,0
(Oct '98)
It was reported in mid-October that a large \Jiceberg\j, labelled A-38 and covering roughly 6,500 square kilometres, had "calved off" from \JAntarctica\j's second-largest ice shelf, the Ronne Ice Shelf, in the southern \IWeddell Sea.\i It is only the second \Jiceberg\j of this size to have split off from an ice shelf in the past eleven years.\p
The \Jiceberg\j was spotted by \Jsatellite\j, and like any \Jiceberg\j with a long axis greater than 10 nautical miles, it is to be tracked by the US National Ice Center (NIC). The mapping will continue, so long as the \Jiceberg\j is sighted at least once every 30 days.\p
The name of an \Jiceberg\j is derived from the quadrant it is found in: \p
A = 0-90W (Bellingshausen/Weddell Sea)\p
B = 90W-180 (Amundsen/Eastern Ross Sea)\p
C = 180-90E (Western Ross Sea/Wilkes Land)\p
D = 90E-0 (Amery/Eastern Weddell Sea).\p
The number which follows is issued in a simple sequence. If A-38 breaks up, the pieces will then become A-38A, A-38B etc. The whole convention is simple enough to capture in a few lines, as the sketch below shows.\p
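Here is a minimal sketch of that convention in code - an illustration of the rules as described here, not official National Ice Center software, and the handling of boundary longitudes is our own assumption:\p
```python
# Illustrative sketch of the NIC iceberg naming scheme described above.
# Boundary handling is an assumption; this is not official NIC code.

def quadrant_letter(lon):
    """Map a longitude in degrees east (-180..180) to the quadrant letter."""
    if -90 <= lon < 0:
        return "A"  # 0-90W: Bellingshausen/Weddell Sea
    if lon < -90:
        return "B"  # 90W-180: Amundsen/Eastern Ross Sea
    if lon >= 90:
        return "C"  # 180-90E: Western Ross Sea/Wilkes Land
    return "D"      # 90E-0: Amery/Eastern Weddell Sea

def fragment_names(parent, n):
    """Pieces of a broken-up berg get letter suffixes: A-38A, A-38B, ..."""
    return [parent + chr(ord("A") + i) for i in range(n)]

print(quadrant_letter(-52))       # 'A' - a Weddell Sea berg such as A-38
print(fragment_names("A-38", 2))  # ['A-38A', 'A-38B']
```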
#
"SHEBA free at last",861,0,0,0
(Oct '98)
The Canadian Coast Guard icebreaker \IDes Groseilliers\i (see \BSHEBA's winter\b, September 1997) is finally out of the ice floe which has been its home for more than a year, and it anchored off Prudhoe Bay in mid-October to offload personnel and cargo. Over the past year, scientists have monitored the ice, the \Jatmosphere\j and the ocean over a full annual cycle covering the physical variables in all three systems.\p
The ship was deliberately frozen in to provide a base for Ice Station SHEBA, a slightly contrived \Jacronym\j for the Surface Heat Budget of the \JArctic\j Ocean. The \JArctic\j ice pack goes through huge changes every year, swinging from a winter high when the ice pack is about the size of the continental United States, down to summer, when there is only half as much ice. Understanding what controls this annual freeze and meltdown is a key to predicting future climate change and assessing the toll of global warming.\p
The \Jtemperature\j changes affect all the life forms, from the microscopic \Jalgae\j on the \Jsea ice\j to the 750 kg (1600-pound) \Jpolar bear\js that roam the surface. Observing all of the changes called for a station which included a collection of plywood research huts, cold-\Jweather\j tents, meteorological towers, automatic buoys and stands of instruments surrounding the ship.\p
The last time a ship was "frozen in" like this, it was the Norwegian \IFram\i in 1893, and the crew were stuck there for three years. Life this time was less onerous than in the 19th century: small planes able to land on the ice, together with U.S. icebreakers, allowed scientists and crew to be rotated during the experiment. The number of scientists working on site ranged from 15 in mid-winter to about 35 last northern spring. Altogether, more than 170 scientists were at the station for varying periods of time. One American stayed six months straight and one Canadian biologist never left the station.\p
The station was less than static during that time, moving along a 1700 km course and ending up about 700 km north-west of where it started. The ship began at 75 degrees North and 143 degrees West, described as being about 300 miles north of Deadhorse, \JAlaska\j, and it completed the experience at 80 degrees north and 166 degrees west - a displacement easily checked with the standard great-circle formula, as sketched below. As well as the overall drift, there were local rearrangements, and large changes could occur when the ice moved. One blizzard moved some of the station components 400 metres (a quarter mile) from the ship overnight. The coldest \Jtemperature\j recorded at the station was about -42°C on New Year's Eve, while the warmest was 0.8°C on July 20, 1998.\p
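The check below is our own, using the standard haversine formula and assuming a spherical Earth of radius 6371 km; it gives a little under 800 km, in fair agreement with the "about 700 km" quoted:\p
```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance between two points, spherical-Earth assumption."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

# SHEBA start (75N, 143W) and finish (80N, 166W); west longitude is negative.
print(round(haversine_km(75, -143, 80, -166)), "km")  # about 774 km
```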
#
"Lemur news for 1998",862,0,0,0
(Oct '98)
Late in October, four more black-and-white ruffed lemurs (see \BAnd the lemurs are doing well . . .,\b November 1997) began their journey back into the wilderness of their homeland in Madagascar in the second year of a project to replenish wild stocks of their species. This is part of a three-year project by the international Madagascar Fauna Group (MFG) to systematically repatriate as many as 20 of the adaptable animals to their ancestral island nation.\p
Two of the animals came from Hogle Zoo in Salt Lake City, while the others are from the Wildlife Conservation Society breeding facility on St. Catherine's Island, off the coast of Georgia, USA. One of the females, Tricia, will be released near one of last year's males, Sarph, in the hope that they will mate. Two other females, Dawn and Jupiter, will be released with the male Barney in an area of the reserve where no ruffed lemurs live.\p
Two of the five 1997 animals have since died: Letitia was killed in March by a \Ifossa\i, a \Jpanther\j-like animal that is itself endangered, and in July, Janus died from a broken neck, apparently from a fall. The encouraging aspect is that the animals had managed to survive for quite some time after their release. Better still, the female Praesepe came into oestrus and mated, and since the animals' reproductive receptivity is based on the daily photoperiod, this means that she has already shifted her cycle from a North American to a Southern Hemisphere \Jday\j length. Unfortunately, the newly released animals had not had time to mix with the wild lemurs, and Praesepe bred with her brother Zuben'ubi.\p
#
"Vampires and rabies",863,0,0,0
(Oct '98)
In late September, too late for our September updates, came the news that a Spanish neurologist, Juan Gómez-Alonso, had suggested in the journal \INeurology\i that a major \Irabies\i epidemic in Europe in the 1700s lies behind the \Ivampire\i legends which developed at that time. He claimed in various news stories to have come up with this theory while watching a Dracula film, "more as a doctor than as a spectator", after which he says he became impressed by some obvious similarities between vampires and what happens in \Jrabies\j, such as aggressiveness and hypersexuality.\p
\JRabies\j is seven times as common in men as in women, and vampires are usually male. Again, about a quarter of all rabid men have a tendency to bite others, and Gómez-Alonso noted also that early tales of vampirism often came at the same time as reports of \Jrabies\j outbreaks in the area around the Balkans, stretching back to a particularly devastating epidemic of \Jrabies\j in dogs, wolves and other animals in \JHungary\j from 1721 to 1728.\p
Even vampires' reactions to \Jgarlic\j and mirrors can be explained through \Jrabies\j. Rabid men often react to stimuli such as smells, \Jwater\j, and light with spasms of the facial and vocal muscles that can cause hoarse sounds, bared teeth and frothing at the mouth of bloody fluid. In fact, he notes, one test for a man being rabid was to see if he could stand the sight of his own image in a mirror.\p
\JRabies\j acts on the brain, influencing both sleep cycles and sexual behaviour, and according to Gómez-Alonso, "the literature reports cases of rabid patients who practiced intercourse up to 30 times in a \Jday\j". The vampire's fatal kiss can also be explained through \Jrabies\j, for the \Jrabies\j virus is found in \Jsaliva\j and other bodily secretions. Vampires are said to roam at night, sometimes appearing in the form of dogs, wolves or bats. This association of bats (and maybe even werewolves) with the vampire legend could be related to the part these animals play as carriers of the disease.\p
#
"November, 1998 Science Review",864,0,0,0
\JInternational Space Station makes a start\j
\JEuropa's frosty plains and hidden seas\j
\JParkes telescope finds 1000th pulsar\j
\JA breakthrough with human embryonic stem cells \j
\JPhantom pain explained\j
\JTardigrades may make organ transplantation easier\j
\JArsenic can fight cancer\j
\JGarlic in the news\j
\JMalarial genetic code starts to become available\j
\JA new Phylloxera strain in California\j
\JHow does a nematode tell its sex?\j
\JBeyond Jurassic Park: real science with ancient DNA\j
\JGetting advance notice of colon cancer\j
\JPhototropism explained\j
\JHow plants get the message\j
\JLoggerheads are good for the coastline\j
\JAlum in the water is safe after all\j
\JTeraflops and terabytes\j
\JIt's in the contract?\j
\JA chemical bloodhound to sniff out landmines\j
\JCro-Magnon Venus figures explained\j
\JDinosaur eggs, dinosaur embryo skin\j
\JNew predatory dinosaur\j
\JThe world's oldest flowering plant - for now\j
\JFinding a hotspot\j
\JPlumbing the depths\j
\JAntarctic volcano\j
\JThe evaporation paradox\j
#
"International Space Station makes a start",865,0,0,0
(Nov '98)
The main news in November must surely be the launch in late November of the first section of the International Space Station, which went up from the Baikonur Cosmodrome in Kazakhstan on Friday November 20. Zarya (Russian for sunrise), also known as the Functional Cargo Block or, from its Russian \Jacronym\j, FGB, is the first component launched for the International Space Station, and will provide the station's initial propulsion and power.\p
Zarya weighs in at 42,600 pounds, around 19,400 kg, and the pressurized module was launched by a Russian Proton rocket. The Zarya module is 41.2 feet (about 12.6 metres) long and 13.5 feet (4.1 metres) wide at its widest point, and has much the same layout as the earlier Mir space station's core module. Zarya has an operational lifetime of at least 15 years. Its solar arrays and six nickel-cadmium batteries can provide an average of 3 kilowatts of electrical power. \p
According to NASA, the living accommodation on the Service Module includes " . . . personal sleeping quarters for the crew; toilet and hygiene facilities; a galley with a \Jrefrigerator\j-freezer; and a table for securing meals while eating. The module will have a total of 14 windows, including three 9-inch diameter windows in the forward Transfer Compartment for viewing docking activities; one large 16-inch diameter window in the Working Compartment; an individual window in each crew compartment; and additional windows positioned for \JEarth\j and intra-module observations. Exercise equipment will include a NASA-provided treadmill and a stationary \Jbicycle\j."\p
The good news for those on board: their waste \Jwater\j and condensation \Jwater\j will be recycled for use in oxygen-generating devices on the module, but it is not planned to be recycled for use as drinking \Jwater\j.\p
Early December saw the first shuttle payload for the station launched into space. A number of leading astronomers have slammed the project, arguing that it will not offer a hard vacuum because of leakage, that it will offer zero gravity only at the station's centre of mass, and that vibration will interfere with observations.\p
Your reporter shares the dream of the International Space Station, while wishing to retain scientific impartiality. Here is what he gleaned from an \JInternet\j correspondent (and friend) who is part of the ISS project: \p
\I"This is a wonderful opportunity to promote international cooperation to build something that could benefit all mankind. To tell you the truth, we don't really know all the far-reaching benefits that will come from this space station. Just like when we first started building Mercury capsules, we didn't know that our research would give the world new materials, processes, computer technology, teeny tiny electronic chips, Velcro, Nomex, etc. Even the mere fact that we all sit around these days in our homes and yak with one another via email is due directly to the space industry, to our need for smaller and more compact and reliable electronic systems.\p
We hope that living and working on the space station will bring new insight into our \Jphysiology\j; that medical experiments can lead to new breakthroughs in finding cures for life-threatening illnesses; that new science will emerge which can make our lives better and help us save this \Jplanet\j.\p
This is what we hope for. But if history has a way of repeating itself, we shall come away with much, much more. Things we never dreamed of."\i\p
So even if all of the criticisms proved to be valid, even if none of the hoped-for chance discoveries came off, the hope that is raised by this project, its symbolic worth, should serve to make it all worthwhile. It will be a sad \Jday\j for humanity if Russian economic problems stop the station, which will require about forty more launches over the next five years. And to some dreamers, it will be worse, the loss of a dream which has been a long time coming.\p
The idea of humans in \JEarth\j \Jorbit\j dates from just after the US Civil War. In 1869, American writer \IEdward Everett Hale\i published a \Jscience fiction\j tale called "The Brick Moon" in the \IAtlantic Monthly.\i Hale envisaged the manned \Jsatellite\j as a navigational aid for ships at sea. Oddly enough, the fictional designers of the Brick Moon encountered many of the same problems with redesigns and funding that NASA would with its station a century later.\p
In 1923, \IHermann Oberth,\i a visionary and pioneer of rocketry, coined the term "space station." Oberth's station was the starting point for flights to the Moon and Mars. Herman Noordung, an Austrian, published the first space station blueprint in 1928.\p
In April 1961, the Soviet Union launched the first human, Yuri Gagarin, into space in the \JVostok\j 1 \Jspacecraft\j. President John F. Kennedy reviewed many options for a response to prove that the US would not yield space to the Soviet Union, including a space station, but putting a man on the moon won out. It was left to the Russians to develop space stations, through the \ISalyut (Salute) space station\i series, and on to the \IMir (Peace) space station,\i but now the world is joining in.\p
\BLate report:\b as these notes were being completed in mid-December, the first people had entered the newly joined first segments of the space station, and been filmed, free of their space suits, in the spacious quarters that are already available to them.\p
#
"Europa's frosty plains and hidden seas",866,0,0,0
(Nov '98)
The evidence for Europa's oceans continues to grow. In early November, some of the pictures taken during the September 26 Europa flyby were still being analyzed. The terrain types shown in the photographs provide "strong evidence of a liquid ocean or mobile substance similar to an \Jiceberg\j found on \JEarth\j."\p
The next Europa flyby was scheduled for November 22, and took place as planned. Some of the results were released onto the \JInternet\j in early December as these updates were being prepared: once again, the evidence is pointing at oceans on Europa, and there will be a review article on the search for life on Europa next month.\p
#
"Parkes telescope finds 1000th pulsar",867,0,0,0
(Nov '98)
A group of researchers using the Parkes radio \Jtelescope\j, run by \JAustralia\j's CSIRO, announced in early November that they had found the thousandth \Ipulsar\i known to science. No other \Jtelescope\j in the world has found so many, with the Australian \Jtelescope\j's tally now standing at over 200, and increasing by about one for each hour that the \Jtelescope\j is in use for this search.\p
Which has to raise the question: with a thousand found, and even though our galaxy may contain 300,000 of them, why do the astronomers bother? The answer is that just as biologists hunt for new species to build up a picture of the \JEarth\j's \Ibiodiversity,\i so astronomers hunt for new pulsars to understand astrodiversity. They will never find all of the possibles: many have signals that are too weak to pick up, or their beams are not pointing towards us, but there is every chance that some of those not yet found will reveal interesting new facts.\p
There are many different types of pulsar, and we have only a few examples of some types, so the continuing surveys will, at the very least, fill out our picture of the rarer types of pulsar. For example, the current survey has already found a pulsar orbiting another neutron star, only the sixth such object known.\p
The other important question relates to the rotation speed of the pulsars. The center of a pulsar is denser than an atomic nucleus, and the equations which describe pulsar matter put a limit on how fast a pulsar can spin without breaking apart. The fastest pulsar known so far spins at around 600 times a second. If the survey found one spinning faster, say at 1200 times a second, that would better pin down what pulsars are made of. A rough estimate of where such a limit comes from is sketched below.\p
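The sketch is a Newtonian back-of-envelope estimate - ours, not the relativistic calculation the researchers would actually use. A star starts shedding mass from its equator once the centrifugal acceleration (2πν)²R exceeds the gravitational pull GM/R², so the limiting spin rate is ν = √(GM/R³)/2π. The mass and radius used are conventional textbook values, assumed here for illustration:\p
```python
from math import pi, sqrt

# Newtonian "mass-shedding" spin limit for a neutron star: the equator
# flies off when (2*pi*nu)^2 * R = G*M / R^2, so nu_max = sqrt(GM/R^3)/2pi.
# Mass and radius are assumed textbook values, not survey data.
G = 6.674e-11        # gravitational constant (m^3 kg^-1 s^-2)
M = 1.4 * 1.989e30   # assumed mass: 1.4 solar masses (kg)
R = 1.0e4            # assumed radius: 10 km (m)

nu_max = sqrt(G * M / R**3) / (2 * pi)
print(f"breakup spin ~ {nu_max:.0f} Hz")  # about 2200 Hz for these values
```
Realistic models of ultra-dense matter bring the limit well below this naive figure, which is why finding a pulsar spinning at, say, 1200 times a second would constrain what pulsars are made of so effectively.\p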
#
"A breakthrough with human embryonic stem cells",868,0,0,0
(Nov '98)
Early November saw a great deal of excitement when it was announced that for the first time, human embryonic stem cells had been cultured in the laboratory. The November 6 issue of \IScience\i carried reports on the breakthrough from Geron, the company which holds the licenses to the technology, from Johns Hopkins University, and from the University of \JWisconsin\j-Madison.\p
Human embryonic stem cells (hES cells) have the potential to form any sort of cell at all in the human body. In technical terms, these unique cells have now been successfully derived and maintained in culture for the first time. Because the stem cells are the source of all cell types, they hold great promise for use in transplantation medicine, drug discovery and even in human developmental \Jbiology\j.\p
The way is now open, for example, to develop the methods which will let us grow a new organ to replace a diseased or missing organ, or to produce heart muscle cells for use in repairing the tissue damage inflicted by heart attacks, or blood forming cells for use in bone marrow transplantation procedures for cancer patients, and nerve cells for use in treating patients with Parkinson's disease, stroke or Alzheimer's disease.\p
As a human or any other embryo develops, it forms into a clump of around 140 cells, in a shape called a blastocyst. This is the preimplantation embryo in mammals, a sphere of cells with an outer cell layer, a fluid-filled cavity, and a cluster of cells on the inside which is called the inner cell mass. Some of the cells of the inner cell mass become separated: these are the embryonic stem cells.\p
Embryonic stem cells are described as either "totipotent" or "pluripotent" (see note below), meaning that they have multiple potential, because they can develop into virtually any and all cells and tissues in the body. And more importantly, there is no limit on their ability to divide, over and over again. This ability is enhanced by Geron's previously developed "cell immortalization" technology. This stops the process of aging which eventually makes cells wear out.\p
At the end of each \Jchromosome\j in our cells, there are telomeres, non-coding pieces of DNA which sit at the ends of chromosomes. The telomeres are made of the same six-base sequence (TTAGGG), repeated thousands of times. Scientists believe that each replication removes somewhere between five and twenty of these six-base units, leading to a "life" of between sixty and a hundred generations.\p
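Taken at face value, the figures above define a simple division counter. The sketch below is purely illustrative - the budget of expendable repeats is our own assumption, chosen so that losing 5 to 20 repeats per division reproduces the quoted 60-100 generation range:\p
```python
import random

def generations(expendable_repeats=1000, loss_range=(5, 20), seed=1):
    """Count cell divisions until the expendable telomere repeats run out.

    Assumes a budget of ~1000 expendable TTAGGG repeats (our illustrative
    figure) and the quoted loss of 5-20 repeats per division.
    """
    random.seed(seed)
    divisions = 0
    while expendable_repeats > 0:
        expendable_repeats -= random.randint(*loss_range)
        divisions += 1
    return divisions

print(generations())  # of the order of 80, inside the quoted 60-100 range
```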
While stem cells and other embryonic cells start out with complete telomeres, replication of the chromosomes slowly erodes the telomeres, until they are worn away, leaving a cell which can no longer survive. In laboratory cultures, unless special steps are taken, the cells in a culture will eventually run out of telomeres - and when that happens, they run out of steam.\p
The telomeres seem to be needed to protect the chromosomes in some way, to allow them to replicate and divide. This effect puts a natural limit on the number of cell "generations" possible, away from the germ cell line, which suggests that the telomere erosion process must be a powerful tool in protecting the telomeres' owner from the effects of copying errors - and also preventing the explosive growths that we call cancer.\p
One way of stopping this erosion is an \Jenzyme\j found in the cell, called telomerase. Telomerase is made up of a section of protein and a small \IRNA\i template which provides the genetic code needed to replace a missing telomere sequence. All human cells carry the gene for this \Jenzyme\j, but few of our cells actually switch it on to produce telomerase. In cancers, the cells of virulent tumors do switch on the telomerase gene, and current theory says that the \Jenzyme\j helps the cancer cells to achieve something close to immortality by preserving the telomeres, or by rebuilding them.\p
This lack of telomerase activity may be useful in some parts of the body, but mammalian testes produce new sperm cells every \Jday\j of the adult life of a male \Jmammal\j. So it is hardly surprising that telomerase is found in testicles, and no matter how many times the cells divide in a testis, the telomeres always stay the same size.\p
In simple terms, there is now good evidence supporting the view that the "clock" which counts cell divisions is in the telomere length. Aside from the evidence that the telomeres are maintained in areas such as the testicles, where cell division rates are high, cases have been recorded of abnormally fast telomere shortening in tissues where the demand for cell replacement is often high. Examples include the inside walls of middle-aged arteries which are subjected to high levels of stress.\p
The hES cells reported this month are self-renewing, making them a potentially limitless source of cells. This self-renewing power comes from an application of telomere and telomerase technology. Geron already has plans to produce a variety of different cell types for use in drug screening and \Jtoxicology\j testing. In addition, hES cells can potentially be engineered to create in vivo models of human disease for drug development as a superior alternative to current mouse models. \p
From here, Geron's research will be directed at "enabling technologies," which they describe as including the techniques for the production and scale up of hES cells; the identification of cell differentiation factors; techniques for genetically \Jengineering\j hES cells, and the development of models to test proposed transplant products. \p
\BThe legal and ethical minefield that is science\b\p
Everything about this story reveals the care that scientists have to exercise at the end of the twentieth century. Normal embryonic stem cells in other animals are referred to as "totipotent," because they are capable of being cultured up to a complete organism. For obvious ethical reasons, this possibility will not be explored with human cells, and so the alternative term "pluripotent" is used. Instead of asserting that the cells have all powers (as they probably do), the researchers only claim that the cells have multiple powers. A careful semantic distinction will, they hope, keep them safe from the attacks of those who oppose such research.\p
As well, the releases which have appeared carry careful legal disclaimers to avoid anybody jumping in and investing in the company, and then later suing Geron if the work turns out to be less effective than the company believes. "The company desires to take advantage of the 'safe harbor' provision of the Private Securities Litigation Reform Act of 1995. Specifically, the company wishes to alert readers that the matters discussed in this press release constitute forward-looking statements that are subject to certain risks and uncertainties . . ."\p
Patent issues:\p
We might also consider here the patent involvements, listed as including:\p
\I"A worldwide license to the stem cell patent estate of Dr. James Thomson and the \JWisconsin\j Alumni Research Foundation. Licensed patent applications include primate and human embryonic stem cells and related diagnostic and therapeutic products. A broad claim has already been allowed in the US from this estate covering primate embryonic stem cells with the characteristics described in the Thomson Science paper just published. \p
A worldwide license to the stem cell patent estate of Dr. John Gearhart and the Johns Hopkins University. This license includes all applications of human embryonic stem cells including diagnostic and therapeutic products. \p
A worldwide license to patent applications of Dr. Roger Pedersen and the Regents of the University of \JCalifornia\j covering diagnostic and therapeutic products based on hES cells. \p
A license of the Genpharm patents on genetic modification of human and primate embryonic stem cells by homologous recombination (gene targeting). This estate consists of four issued US patents and eight pending US patent applications. \p
Geron-owned patent applications including screens for hES cell growth factors, media formulation and preferred growth conditions, and also relating to telomerase gene transfer and recombinant telomerase expression in embryonic stem cells and their derivatives."\i\p
Ethical safeguards:\p
Geron's press releases stress, over and over again, that they are being extra-careful with the ethical issues:\p
\I"These hES cells are derived from in vitro fertilized blastocysts. IVF clinic patients voluntarily donated the blastocysts with informed consent. The University of \JWisconsin\j-Madison Institutional Review Board approved all of the research protocols. Geron's intended hES cell research as well as that of its collaborators is conducted within the suggested guidelines of the 1994 report by the NIH Human Embryo Research Panel. \i\p
They go on to point out that hES cells are derived from the blastocyst inner cell mass by a laboratory process and are not the cellular equivalent of an embryo. The small cluster of cells (inner cell mass) within the in vitro fertilized blastocyst from which the hES cells are derived has not yet differentiated and is unspecialized. \p
This is, of course, a key issue. Given the emotive problems of technology which is in any way "tainted" by being associated with cells taken from "a fetus", Geron rightly see it as essential that they distance themselves from any such accusation. Over the past twenty years, geneticists have learned too many harsh lessons from assuming that the public will understand complex issues.\p
It simply is not safe to assume that protest groups will approach the question with scientific understanding and detachment. So Geron have set their position out in this way:\p
\I"The inner cell mass does not form discrete parts of an individual embryo as one or more of these inner mass cells can be removed from the blastocyst (for preimplantation diagnosis) without affecting subsequent fetal development after transfer to the uterus. If hES cells were to be transferred to a uterus, the hES cells would not form an embryo because other cells necessary for implantation and embryogenesis have been lost in the derivation process. \p
Geron has adopted the conclusion of the NIH Human Embryo Research Panel in its 1994 report. The cells will not be used for (i) \Jcloning\j humans, (ii) transferring to a uterus, or (iii) generating human-human or human-animal chimeras (live animal hybrids produced by mixing embryonic stem cells of different individuals or species)."\i\p
\BTechnical issues 1: Pluripotency\b\p
The hES cells can form virtually any cell in the body. They can form derivatives of all three layers found in the early embryo: endoderm tissues such as gut \Jepithelium\j, mesoderm tissues such as cartilage, bone, and smooth and striated muscle, and also ectoderm tissues like neural \Jepithelium\j, embryonic ganglia and stratified squamous \Jepithelium\j. This distinguishes the hES cells from later stage human stem cells, which have only a limited capability to form certain cell types such as blood cells (CD34+ stem cells) or connective tissue (mesenchymal stem cells). \p
\BTechnical issues 2: Self-renewing\b\p
In culture, the hES cells can be made to reproduce themselves without any differentiation into committed tissues. If this ability can be maintained, hES cells may provide a continuous source of normal pluripotent human stem cells. \p
\BTechnical issues 3: Telomerase expression\b\p
Telomerase is an RNA-dependent DNA polymerase demonstrated by Geron Corporation and its collaborators to be the \Jenzyme\j which operates in normal cells to allow their continual proliferation. The hES cells normally express the \Jenzyme\j telomerase, and this continual steady level of activity of telomerase in the hES cells gives them the next best thing to immortality. Other stem cells express telomerase at low levels or only periodically, and so they age and stop dividing with time. \p
A normal \Jchromosome\j structure (karyotype) is maintained in the hES cells, which keep a structurally normal set of chromosomes (including the sex chromosomes, XX or XY) even after prolonged growth in culture. They do not, for example, have any additions, deletions or rearrangements in their chromosomal structure as is commonly found in cell lines "immortalized" by viruses.\p
\BImplications and prospects\b\p
Some of the suggested applications for the new hES cells would have been dismissed as \Jscience fiction\j just a few years ago. Now, we must take them seriously, but the time-scale must still be regarded as one of years, or perhaps even a decade, before we see them in practical operation. The following list of prospects is probably the best explanation of the excitement which greeted the announcement.\p
For example, blood-forming stem cells could be developed from hES cells as has been done using mouse embryonic stem cells in 1997. This would increase the availability of these cells and reduce reliance on donors. Further, these hES cell-derived hematopoietic stem cells could perhaps be genetically engineered to resist infection by such agents as the HIV virus and used in a transplant setting for the treatment of AIDS, or possibly used for the treatment of patients with sickle cell anaemia.\p
Endothelial cells, which can form blood vessels, have been observed in hES-derived teratomas in mice and mouse endothelial cells have been derived from mouse embryonic stem cells. These blood vessel forming cells could be generated from hES cells and used to re-line blood vessels for the purpose of treating \Jatherosclerosis\j.\p
It is possible that \Jinsulin\j-secreting \Jpancreas\j (Islet of Langerhans) cells might be produced from hES cells, allowing people with IDDM (\JInsulin\j Dependent Diabetes Mellitus) to be cured for life.\p
Mouse neurons were derived from mouse embryonic stem cells in 1995. Neurons derived from hES cells potentially could be prepared for the treatment of people who suffer from Parkinson's disease, those who suffer a stroke, and even Alzheimer's disease victims.\p
Gene targeting may also be possible with hES cells. This is a sophisticated technique of genetic \Jengineering\j which allows the gene of interest to be "targeted to" a specific and desired location in the \Jchromosome\j. This allows the gene of interest to be controlled by the normal regulators of that gene's expression. Because gene targeting requires several rounds of selection, each requiring multiple cycles of cell division, only immortalized cells can survive long enough to be suitable for this form of genetic \Jengineering\j. \p
Other applications range from sources of cartilage to treat osteoarthritis to the development of normal human cell cultures on which new pharmaceuticals can be tested. New and suspected teratogens can be screened in an ethical way, and many fertility and \Jpregnancy\j problems may be unravelled through the use of this technology.\p
\BKey names:\b James A. Thomson, John D. Gearhart, Roger A. Pedersen, Geron Corporation. \p
#
"Phantom pain explained",869,0,0,0
(Nov '98)
Many amputees find that they are dogged by a lifetime of disturbing effects as their \Isenses\i send them what seems to be false information in the form of \Ipain.\i For them, a touch on the face feels like a touch on the lost limb; the missing fingers or toes seem to be moving toward the remaining stump; and pain can persist in the limb that is long gone.\p
A report in \IScience\i in early November described a new way of looking at the brain, pain, and how we perceive these effects. Edward Jones and Tim Pons studied the brains from eight monkeys in which, 12 to 20 years before their deaths, the nerves in one arm had been surgically severed at the spinal column. All the animals were from a group called the Silver Spring monkeys, which were the subjects of an intense debate in 1985 about animal experimentation.\p
They found that brain cells which would normally have received messages from the cut-off area reduced in size, and were then replaced by neighbouring nerve cells that normally carried information from the face. This meant that sensory messages from the face were carried to the part of the cerebral \Jcortex\j that normally receives sensations from the arm. There was also increased activity in adjacent brain cells that normally carry painful messages.\p
Combined with recent news that adults, even people in their 60s and 70s, grow new brain cells, the report adds to the growing awareness that the adult brain is far more responsive and adaptable than had been believed. The next step: to try to find ways to harness this understanding, so the phantom pain problem can be reduced.\p
#
"Tardigrades may make organ transplantation easier",870,0,0,0
(Nov '98)
A tardigrade is a tiny animal, a close relative of the \Iarthropod\is, with four pairs of stumpy legs which end in claws, and no definite circulatory or respiratory systems. Sometimes called the tortoises of the \Jinvertebrate\j world, the slow-moving tardigrades are very small, and they are typically found in the sediment of rain gutters, in moss, and in other moist environments.\p
They feed by piercing the cell walls of individual plant cells, sucking out the cell contents. But most importantly for this story, tardigrades can survive for long periods in suspended animation, sometimes for more than a century. Taking their cue from these animals, Japanese researchers have invented a new technique for storing human organs for transplant.\p
Organs which have been removed from a body can usually be stored for only 30 hours before they have to be used, and hearts and lungs have a time limit of just four hours. The obvious answer, using cold storage, has a major drawback: \Jwater\j damages cell membranes at low temperatures, but removing the \Jwater\j from tissues usually causes at least as much damage.\p
Kunihiro Seki and his colleagues at Kanagawa University in Hiratsuka-shi, \JJapan\j, noted that tardigrades withstand extreme conditions by losing most of the \Jwater\j in their bodies, and then springing back to life again. In one case, a moss sample which had been stored for 120 years in a museum was flooded with \Jwater\j, and soon after, tardigrades were found crawling over the moss.\p
The tardigrades use a sugar called trehalose to stabilize the structure of their cell membranes, and this gave the Japanese team an idea for preserving mammalian organs. They flushed rat hearts with trehalose solution before packing them in \Jsilica\j gel to remove the \Jwater\j from their cells, and then stored them in perfluorocarbon, a biologically inert compound, at 4°C in airtight jars.\p
Ten days later, they took the hearts out and revived them: within half an hour, the hearts were beating again, and measurements of the hearts' electrical activity suggested that the heart cells had survived intact. The team are now planning a longer-term test, storing organs for up to a year. If this is successful and the method can be extended to human organs, they believe that transplant recipients could have a greater choice of donor organs, improving their chances of a good match.\p
#
"Arsenic can fight cancer",871,0,0,0
(Nov '98)
Following up on a Chinese report from 1996, American researchers have shown that treatment with low doses of \Iarsenic\i trioxide induced remission in patients with acute promyelocytic leukemia (APL). The confirmation was reported in an early November edition of the \INew England Journal of Medicine.\i APL is a potentially fatal type of cancer that affects the blood and bone marrow.\p
Eleven of twelve patients who had relapsed from conventional therapy achieved remission anywhere from 12 to 39 days after treatment started, experiencing only mild side effects. The single patient who failed to reach remission died from a complication related to the disease five days after \Jarsenic\j treatment began and could not be evaluated in the study.\p
The courses of \Jarsenic\j trioxide therapy were repeated every three to six weeks thereafter, and after two cycles, only three of the eleven surviving patients tested positive for molecular evidence of the disease and later relapsed with APL. The remaining eight tested negative for molecular evidence of APL and remained in remissions that lasted as long as 10 months. So far, several patients have received up to six courses of \Jarsenic\j treatment without experiencing cumulative side effects.\p
The treatment is not a cure, but \Jarsenic\j trioxide appears to be better than any other treatment for the condition. It works by killing the cancerous cells that cause APL, including those that have become resistant to the most successful form of treatment, a drug called all-\Itrans\i retinoic acid, which forces the cancerous cells to mature and die normally.\p
Around 30% of the patients who receive all-trans retinoic acid develop resistance to it and relapse with the disease. \JArsenic\j trioxide appears to bypass this resistance by forcing APL cells to partly mature, and then causing them to self-destruct.\p
\BKey names:\b Raymond P. Warrell, Jr., Steven Soignet\p
#
"Garlic in the news",872,0,0,0
(Nov '98)
November was a big month for \Igarlic\i. Dr. Yu-Yan Yeh reported on a group of chemicals in \Jgarlic\j which decrease \Jcholesterol\j production by rat liver cells in laboratory tests, Glenn Birrenkott recommended it as the perfect material to kill the odor of large poultry and pig farms, and two Penn State colleagues of Yu-Yan Yeh, Kun Song and John A. Milner, celebrated the powers of \Jgarlic\j over cancer, and the ways these powers can be affected by cookery.\p
Yeh's paper, "Allyl Sulfur Compounds of \JGarlic\j Inhibit \JCholesterol\j Biosynthesis," was written with Lijuan Liu, a doctoral candidate at Penn State. They identified a group of three \Jwater\j soluble, sulfur-containing, \Jgarlic\j constituents (S-allyl cysteine, S-\Jethyl\j-cysteine and S-\Jpropyl\j cysteine) that decreased \Jcholesterol\j production in cultured rat liver cells by 40% to 60%.\p
While fresh \Jgarlic\j was used, deodorized aged \Jgarlic\j extract consists mostly of the same \Jwater\j soluble, sulfur-containing chemicals, said Yeh, presenting their paper at a conference on "Recent Advances on the Nutritional Benefits Accompanying the Use of \JGarlic\j as a Supplement" at the Marriott Newport Center, Newport Beach, \JCalifornia\j.\p
Birrenkott recommends feeding chickens 3% \Jgarlic\j powder, to make "the poultry house smell like a pizzeria instead of \Jmanure\j." Your reporter has had a pizza like that, but we will let that pass. As urban populations expand into rural areas, Birrenkott says, there is an increased risk of conflict, with people wanting meat and eggs in their \Jrefrigerator\j, but not in their backyards. The \Jgarlic\j masks the smell of the chicken waste materials, but also serves to make better tasting eggs!\p
Tasters reported that the \Jgarlic\j eggs were milder than the eggs from the control hens, leading Birrenkott to speculate that the \Jgarlic\j might be reducing the sulfur content of the eggs. It took three weeks of \Jgarlic\j feeding to reduce the poultry house odor below that of a control group of laying hens, but there is no indication of how long it takes to get the eggs up to the new \Jgarlic\j-mild standard.\p
Pigs, on the other hand, refused to eat the \Jgarlic\j-supplemented food for several days, and then "cooperated", with a loss in smell following. Tests are now under way, with chemical analyses of the \Jcholesterol\j content of the eggs and pork being carried out. The cost of the \Jgarlic\j pork and eggs is higher, due to the cost of the \Jgarlic\j supplement, but if there is a health advantage, producers may be able to charge a premium for their eggs and pork.\p
Kun Song and John Milner have shown that \Jgarlic\j's established anti-cancer powers are destroyed by microwave heating or roasting - unless the herb is chopped or crushed, and allowed to "stand" for at least 10 minutes before cooking. They presented their results at the same conference which featured Yu-Yan Yeh.\p
The research was the first to show that as little as one minute of microwaving or 45 minutes of oven roasting can completely block \Jgarlic\j's ability to retard the action of a known cancer-causing agent in rats. \JGarlic\j's anti-cancer activity was retained, however, if the herb was first chopped or crushed and allowed to stand for 10 minutes before being heated. In the case of roasted whole \Jgarlic\j, anti-cancer activity was partially retained if the top of the bulb was sliced off prior to heating.\p
Song believes that the 10-minute "standing period" after chopping or crushing the \Jgarlic\j enables an \Jenzyme\j naturally present in certain \Jgarlic\j cells to come in contact with and act on chemicals in other cells. Chopping or crushing the \Jgarlic\j opens the cells and enables the \Jenzyme\j to start a reaction that produces chemicals called allyl sulfur compounds that possess anti-cancer properties. If the \Jgarlic\j is heated or roasted before the enzymes get out of the cells and drive the reaction, the \Jenzyme\j is de-activated by the heating process and \Jgarlic\j's anti-cancer effects are blocked.\p
#
"Malarial genetic code starts to become available",873,0,0,0
(Nov '98)
The entire sequence of \Jchromosome\j 2 of the \Imalaria\i parasite \IPlasmodium falciparum\i was announced in \IScience\i in early November. Some reports (including an advance notice here last month) suggested that the entire \Jgenome\j had been teased out, but an international consortium of agencies is still at work on efforts to sequence the other 13 chromosomes of this parasite. Work is also proceeding on the sequencing of \IPlasmodium vivax\i, another of the four different \IPlasmodium\i parasites that cause human \Jmalaria\j, and also of \IP. berghei\i, which causes \Jmalaria\j in rodents and provides a laboratory model of the human disease.\p
This is the first report to describe the complete genetic sequence of a parasite \Jchromosome\j, and the work offers us the hope of treatments which may even terminate the threat to human health caused by \Jmalaria\j in many parts of the world. There are between 300 and 500 million new cases of \Jmalaria\j each year, mainly in sub-Saharan \JAfrica\j. Every 30 seconds, a child dies of the disease, somewhere in the world.\p
The researchers say they have identified more than 200 genes, many of which are probably essential to parasite functions, including genes that mediate interactions with host cells and contribute to disease. As an example, they found a group of genes that encode a large family of surface proteins called rifins, which may help the parasite escape the immune response. If this is correct, even that single \Jchromosome\j sequence may offer a solution to the \Jmalaria\j problem.\p
One of the reports on this work carried the news of a new communications network on the World Wide Web for the \Jmalaria\j research community. The URL is http://www.malaria.org\p
\BKey names:\b Malcolm J. Gardner, Stephen L. Hoffman, NMRC, TIGR\p
#
"A new Phylloxera strain in California",874,0,0,0
(Nov '98)
Late in November, a new strain of the underground insect \Iphylloxera\i was reported. The existing strains have cost \JCalifornia\j's premium wine-grape industry more than $1 billion in replanting costs during the last decade, but now this work will pay off, since the new type of phylloxera does not pose a major threat to growers who have switched to resistant rootstock.\p
The news broke at an address to grape-growers at the Napa Valley Viticultural Fair, where its habits were described in detail. The new form of phylloxera is a leaf-feeding pest. Its tell-tale galls form on the underside of fresh grape leaves, although it also attacks roots like the typical \JCalifornia\j strains of the pest. The finds thus far have all been on the leaves of rootstock varieties. To the relief of the grape-growing community, however, the pest does not appear to have an affinity for vinifera varieties such as merlot, chardonnay and cabernet sauvignon.\p
This leaf-attacking habit gives the strain its name, "foliar phylloxera." The type has been seen on wild grape plants in Southern \JCalifornia\j, and it is common on the US east coast and in Europe, but it seems to need a combination of high \Jhumidity\j with summertime rainfall. Nobody is sure where it came from, but some people suspect that it traveled with rootstock from the east coast.\p
The phylloxera is a small orange-colored, \Japhid\j-like \Jlouse\j which sucks \Jnutrients\j out of the roots until they crack and split and become vulnerable to soil fungi and \Jbacteria\j, eventually killing the vine. They reproduce asexually, by \Jcloning\j, which is usually believed to slow down \Jevolution\j, but it appears that the insects are slowly adapting.\p
The foliar-feeding phylloxera burrows into the leaf, where one female lays a couple of hundred eggs. It flies poorly, so it must rely on transport by humans moving infested roots around. The foliar forms may move inside bud scales or under the bark near the buds of roots being transported from one area to another.\p
#
"How does a nematode tell its sex?",875,0,0,0
(Nov '98)
Mammals have no trouble telling what sex they are, because if a Y \Jchromosome\j is present, a developing embryo becomes male. \INematode\i worms lack that convenient marker, so it has been assumed that their solution was to count the X chromosomes in their cells, which is not an easy task.\p
According to a paper in \INature\i in mid-November, Barbara Meyer and her colleagues at the University of \JCalifornia\j, Berkeley have found a simpler answer, at least for the standard genetic test animal, \ICaenorhabditis elegans.\i It is in the form of a signal, which they call SEX-1. This is a hormone receptor found in the cell nucleus. It belongs to a family of proteins widely used by animals to translate environmental cues into decisions on how to regulate gene expression. A similar molecule may be involved in the first steps of mammalian sex determination, they say.\p
\IC. elegans\i worms which have two X chromosomes, XX animals, usually develop into hermaphrodites, while those with a single X \Jchromosome\j, known as XO animals, develop as males. Hermaphrodites prefer to mate with males, but unlike females of other species, the \Jhermaphrodite\j worms produce sperm which they can use to fertilize their own eggs in the absence of males.\p
The mechanism appears to work like this: the worm's X \Jchromosome\j has at least four genes, probably five or six, which signal the presence of an X \Jchromosome\j. The X signals are believed to reduce the activity of a key protein, called XOL-1, which processes the incoming X signals and steers the organism's sexual fate. In other words, when the X signals are strong enough to overwhelm XOL-1, a worm will develop as a \Jhermaphrodite\j. In XO animals, the single X provides too few signals to suppress XOL-1, which is then free to trigger normal male development. The logic amounts to a simple threshold switch, as sketched below.\p
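Here is a minimal conceptual sketch of that switch - our own cartoon of the logic as described, not the Berkeley group's model; the signal strengths and the threshold are arbitrary illustrative numbers:\p
```python
# Cartoon of the C. elegans sex-determination switch described above:
# X-linked signal genes (SEX-1 among them) repress XOL-1; active XOL-1
# means male development, repressed XOL-1 means hermaphrodite.
# The numbers below are arbitrary illustrations, not measured quantities.
SIGNALS_PER_X = 4   # the article suggests four to six signal genes per X
THRESHOLD = 6       # assumed: one X's worth of signal cannot repress XOL-1

def sexual_fate(n_x):
    x_signal = n_x * SIGNALS_PER_X
    xol1_active = x_signal < THRESHOLD
    return "male (XO)" if xol1_active else "hermaphrodite (XX)"

print(sexual_fate(1))  # male: XOL-1 escapes repression
print(sexual_fate(2))  # hermaphrodite: X signals overwhelm XOL-1
```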
The entire \Jgenome\j sequence of this \Jnematode\j was completed and published in early December: more on this next month.\p
#
"Beyond Jurassic Park: real science with ancient DNA",876,0,0,0
(Nov '98)
The \IJurassic Park\i dream of recreating one of the \Idinosaurs\i seems to be just that, at least for the moment (see \BJurassic Park ruled out\b, April 1997). Even recreating a simple bee from its DNA after it was trapped in amber for millions of years is too hard a task. But if the entire creature cannot be recreated, a bacterium extracted from the \Jabdomen\j of these long-dead bees has, for the first time ever, been successfully brought back to life.\p
Dr. Raul Cano, of the \JCalifornia\j \JPolytechnic\j State University, has revived \Jbacteria\j from the gut of bees preserved in 20-45 million year old amber from the Dominican Republic. He revealed this during a symposium titled, "Beyond Jurassic Park: Accessing Genetic Information Hidden in Herbaria and Archival Plant, Microbe and Insect Specimens," during a joint meeting of the American Phytopathological Society and the Entomological Society of America in mid-November in Las Vegas.\p
Cano suggests hopefully that such work might provide new organisms that could produce life-saving pharmaceuticals or be used in valuable industrial processes.\p
In another paper at the same conference, Dr. Craig Liddell described how he has been working with colleagues to sift through herbarium (dried and pressed plant) specimens collected since 1891 to identify those with a specific rust disease, caused by the \Jfungus\j \IPuccinia grindeliae\i. Examining this century-old \Jfungus\j and analysing its genetic \Jevolution\j over that time could help identify potential biocontrol uses for eradicating the range-land weed.\p
#
"Getting advance notice of colon cancer",877,0,0,0
(Nov '98)
There is a genetic abnormality, "loss of \Jimprinting\j" or LOI, which is often found with common forms of colon cancer, and which may be found in patients' normal cells. This discovery may offer medical workers the possibility of predicting as many as 40% of new colon cancer cases before they start.\p
LOI refers to an abnormal switching-on or off of genes. People inherit copies of the same gene from both parents, and both genes or sets of genes work together in a person's cells. In normal development, in a process called \Jimprinting\j, the copy from one parent, in certain genes, is turned off. This inherited preferential silencing usually holds throughout all cells in a person's body.\p
In cancer cases, some genes can lose their \Jimprinting\j. LOI has been repeatedly found in cancers, especially in genes that either encourage cell growth or that suppress cancer. In this new study, researchers focused on a specific growth-promoting gene called IGF2 (for \Jinsulin\j-like growth factor 2), one of several cell growth genes normally imprinted so that only the father's gene gets expressed. In 44% of the cancer patients, that \Jimprinting\j was lost in their tumor cells.\p
Genetic \Jimprinting\j was first linked to cancer about a decade ago, with the childhood tumor, rhabdomyosarcoma. LOI appears frequently in a cancer called Wilms' \Jtumour\j, which is the most common non-blood cancer in children, and it is also definitely implicated in Prader-Willi syndrome, which causes retardation, and LOI can also be seen in some adult lung cancers.\p
Now researchers have detected the appearance of LOI in cancer patients' healthy colon tissue and blood cell samples. As well, four of 31 control patients with no sign of cancer also tested positive for LOI in colon tissues or blood. The inference is that these are patients likely to develop colon cancer later, but researchers are being careful, saying that more research will be needed to establish LOI's predictive value.\p
Another defect found in actual cancers is called microsatellite instability (MSI). This involves an unusual duplication of certain stretches of DNA - a sign of a glitch in the way DNA reproduces or repairs itself. Typically, MSI is found in cancers that come at a relatively early age and run in families. Some oncologists use MSI to predict a patient's outcome.\p
The researchers at Johns Hopkins found that those patients with abnormal \Jimprinting\j were "for the most part, the same people with MSI." And, they added, those same patients got colon cancer 14 years earlier than the others, on average.\p
So does LOI cause the cancer, or does it just follow along? Nobody knows yet, but it would be nice if LOI were the cause, because unlike conventional mutations, LOI is potentially reversible.\p
\BFootnote:\b The most striking case of large-scale \Jgenome\j \Jimprinting\j involves crosses between horses and donkeys. If you cross a female horse and a male donkey, you get a mule, but cross a male horse and a female donkey and you get a hinny, altogether a different creature. Clearly, the same genes act out different roles, depending on whether they come from the mother or the father.\p
\BKey names:\b Andrew P. Feinberg, Hengmi Cui, Stanley R. Hamilton and Rolf Ohlsson\p
#
"Phototropism explained",878,0,0,0
(Nov '98)
Plants have an accurate sense of where the light is. A sprouting \Jpotato\j, placed in a closed cardboard box with several cardboard baffles in the way, will drive its shoot through the simple maze, and out into the full light. This sort of thing is well known, and written up in textbooks, under the name phototropism, where a \Itropism\i is any growth or movement made in response to a directional stimulus, and phototropism is simply a tropism involving light.\p
Plants rely on light to gather \Jenergy\j, so this ability is obviously important. But while we refer to the shoots on a \Jpotato\j as "eyes," plants have no organs of vision, so how do they detect the light? Late in November, John Christie and Winslow Briggs reported in \IScience\i that they had isolated the protein that is the UV-A/blue light photoreceptor for phototropism.\p
Plants have a number of photoreceptors, and the red light photoreceptors have been studied for many decades, but until we can identify all of them, we have little chance of understanding how plants grow, and how that growth is regulated. All of the known photoreceptors are proteins with some colored compound, known as a chromophore, bound to them. It is the chromophore that absorbs visible light.\p
One of the bones of contention in this area is related to the phototropism chromophore, which might have been a carotenoid or a flavin (flavins are related to the B-vitamin riboflavin). Now Briggs and Christie have shown that the protein NPH1, acting with so-called LOV domains, binds a flavin, and is the long-sought photoreceptor.\p
The LOV domains are found in all sorts of proteins that detect changes in redox status as affected by light, oxygen, or voltage, hence the name LOV. In the plant, NPH1 becomes phosphorylated (which is a complicated way of saying that phosphate groups are added to the protein) when it is exposed to blue light. This is one of the earliest biochemical steps in phototropism.\p
Earlier work had shown that NPH1 is essential to phototropism. When Briggs and Christie expressed the NPH1 gene in insect cells, they found that a protein was produced with exactly the same light sensitivity as the protein from the plant, and they established that a flavin was bound to the protein as its chromophore. No other plant protein was present in the insect cell, so NPH1 has to be the photoreceptor for the phosphorylation reaction.\p
#
"How plants get the message",879,0,0,0
(Nov '98)
So now that we have agreed that the humble plant has the ability to "see," should we be all that surprised to find that plants can send messages in a way similar to the biochemical signalling methods used by animals? A \INature\i report in mid-November reveals that \IArabidopsis,\i a plant which is frequently used in laboratory research, uses one of the same systems of communication as does the human brain. A group of New York University researchers have turned up the sequence for proteins called glutamate \Jreceptors\j in a gene hunt through the DNA of the lowly weed.\p
In our brains, glutamate is a chemical messenger which also plays a role in everything from acquiring and storing memories to possibly contributing to certain mental health ailments. In the past, post mortem studies have revealed suggestions of glutamate overload in the brains of people with \Jschizophrenia\j, and faulty glutamate signalling has also been linked to Alzheimer's disease.\p
Glutamate and other \Ineurotransmitter\i chemicals are squirted out by nerve cells and exert their effects through protein molecules called \Jreceptors\j that are nestled within the outer layers of nearby nerve cells, where they serve as sentries that permit the passage of only certain molecules. There are other plant chemicals which trigger \Jreceptors\j in our brains, \Jcaffeine\j and \Jcocaine\j among them, but scientists have always assumed that the \Jreceptors\j for these molecules, which are essential if the chemical is to transmit messages between cells, were found only in animals. Now, with a single discovery, that assumption has come unstuck.\p
The discovery began when Dr. Gloria Coruzzi noted that the uptake of \Jnitrogen\j from the environment into glutamate and other related molecules was regulated by light. This suggested to her that some plant cells might have a light-activated glutamate "sensor." With her collaborator at the Chinese University of Hong Kong, Dr Hon-Ming Lam, she hunted for glutamate receptor genes in the \IArabidopsis\i \Jgenome\j, looking for gene sequences in the plant similar to the human glutamate receptor sequence, which was already known in part through efforts of the Human \JGenome\j Project.\p
The next step was to investigate what effect the plant \Jreceptors\j in \IArabidopsis\i cells have, compared with the human \Jreceptors\j in brain cells. The easy way to get an answer to something like this is to turn the effect off. When plants were grown in the presence of a compound called DNQX, which blocks human glutamate \Jreceptors\j, the plants grew noticeably longer stems and made less than half the normal amount of \Jchlorophyll\j, the pigment that gives green plants their characteristic color. This discovery suggests that inactivating glutamate \Jreceptors\j blocks the ability of a plant to respond to light.\p
While this is an interesting effect in its own right, further development may provide us with what the scientists call a "plant model" for the investigation of glutamate \Jreceptors\j. Even more importantly, many plants produce neurotoxins, substances which damage animal \Jnervous system\js. It is quite possible that these substances arose first as simple communication systems, which turned out to have the useful side effect of discouraging predators.\p
And if that seems a bit dry, Coruzzi has another suggestion. "We might be able to use these plants as a screen for new drugs," she says, suggesting that growing \IArabidopsis\i in the presence of candidate drugs and simply keeping an eye out for longer stem growth may be a useful and cost-effective first pass at sifting through thousands of potentially therapeutic compounds.\p
\BKey names:\b Gloria Coruzzi, Hon-Ming Lam\p
#
"Memories and false memories"
(Nov '98)
How far can a witness be trusted? How about a witness who has been confused in some way? According to a US study published in the October 1998 issue of the \IJournal of Memory and Language\i, the answer is that you should not trust witnesses very much at all. If you are a swindler, will there be another victim along soon? Almost certainly!\p
According to Kathleen McDermott and Henry L. "Roddy" Roediger III, even if you give people fair warning that you are about to trick them into recalling something that never happened, most of your subjects will still fall prey to the deception, creating "illusory" or "false" memories that sometimes include vivid details.\p
Psychologists have been using word association tests, asking questions such as "What is the first word that comes to mind when I say the word dog?" to construct lists of related words since early this century. Roediger and McDermott were the first to use these lists to study false memories, in work which they published in 1995, and which they are still following up.\p
Undergraduate students (after rats, probably the most common experimental animal, although arguably less intelligent) exhibit "remarkable levels of false recall and false recognition" when asked to identify words that were included in a previously viewed list of associated words. Ask them to look over a list of related words such as bed, dream, blanket, doze and pillow, and as many as half of the students will incorrectly answer "yes" when asked if the word "sleep" was in the list.\p
In this most recent work, students were fully informed about the procedures, and even given simple concrete examples which should have left them in no doubt about what to expect. Even so, they came up with a regular blast of false recollections. The researchers suggest that this happens because the mind has a strong compulsion to make inferences and fill in blanks as it processes incomplete data.\p
#
"Loggerheads are good for the coastline",880,0,0,0
(Nov '98)
Nesting sea turtles may be doing more than just ensuring that their species continues. Research from the University of \JFlorida\j suggests that eggs laid by threatened loggerheads along \JFlorida\j's Melbourne Beach hold essential \Jnutrients\j that may strengthen vegetation along the shore and could be preserving the delicate dune \Iecosystem.\i\p
A nesting \Ibeach\i is typically nutrient poor because sandy soils are unable to retain \Jnutrients\j, and salt spray can limit vegetation growth. So if the eggs and their remains are injecting \Jnutrients\j such as \Jnitrogen\j and \Jphosphorus\j, and also adding lipids and \Jenergy\j, this could have a strong influence on the way the beach ecosystems are maintained.\p
It is a common error to argue that "beaches are sand, and deserts are sand, so beaches must be deserts." Apart from the fact that even deserts are often full of tough life, beaches are complex ecosystems, with huge variations in \Jwater\j availability, \Jtemperature\j range and \Jsalinity\j, but most importantly, the life forms there have to be able to scrounge, to take advantage of even the smallest pieces of food. So any regular visitor which leaves material on the beach is an important aspect of the \Jecosystem\j.\p
Sea turtles nest along the east coast of the United States from \JFlorida\j up into the Carolinas, with about 65,000 nests laid along \JFlorida\j's beaches each year. The turtles come from as far away as 2500 km (1500 miles), carrying \Jnutrients\j from their feeding grounds near the \JBahamas\j, \JCuba\j, the Dominican Republic, and the \JFlorida\j Keys, to the sandy beaches.\p
Some of the eggs may be broken by predators such as raccoons, crabs or birds that eat the eggs and scatter them across the dune, while other eggs may be damaged when the roots of plants grow towards them and break through the shells to reach the \Jnutrients\j inside. Even if the eggs survive and hatchlings emerge, the fluid inside the eggs remains in the ground and still provides nourishment for the dune \Jecosystem\j.\p
The implications of a loss of turtles could be enormous. If dune plants rely on the \Jnutrients\j from eggs, and if these \Jnutrients\j are not available, this could lead to a loss of plant growth, which can lead to dune destabilization and erosion. There are other sources of \Jnutrients\j, such as material washed ashore, and wastes from terrestrial animals on the dunes, but these appear to be minor when compared with the turtles' contribution.\p
Research is continuing on these matters.\p
\BKey names:\b Sarah Bouchard, Karen Bjorndal, Archie Carr Center for Sea Turtle Research\p
#
"Alum in the water is safe after all",881,0,0,0
(Nov '98)
One solution to fine sediment in the \Iwater supply\i is to add \Ialum\i to drive it out of suspension. The drawback has been a lingering suspicion that \Jaluminium\j dissolved in \Jwater\j may be associated with \IAlzheimer's disease.\i A report from \JAustralia\j's CSIRO indicates that this is not a problem. Only 1-2% of our daily intake of \Jaluminium\j comes from \Jwater\j and of this, only the barest trace is absorbed, according to the report's author. Much of the \Jaluminium\j that is absorbed is then excreted in urine.\p
While the \Jaluminium\j-Alzheimer's link has never been better than a suspicion, some conflicting evidence in earlier studies suggested that \Jaluminium\j that is left in treated drinking \Jwater\j may be more readily taken up by the body than \Jaluminium\j from other sources.\p
If \Jaluminium\j from \Jwater\j could cause a significant increase in the total amount of \Jaluminium\j in the human body, it would have to be in a form that is much more easily absorbed into our bloodstream. In technical terms, the \Jaluminium\j would have to be far more bioavailable than \Jaluminium\j in food, which has low bioavailability. The reasoning is simple: much the greater proportion of our daily intake of \Jaluminium\j comes from food, with no apparent ill effects - \Jaluminium\j is the third most common element in the \JEarth\j's crust and occurs naturally in food and \Jwater\j.\p
Most of our \Jaluminium\j intake passes right through the body as feces, so the study targeted that much smaller part of the \Jaluminium\j intake which is absorbed into the blood. The report says that the \Jaluminium\j obtained from alum-treated drinking \Jwater\j would contribute less than 1% to our body burden of \Jaluminium\j over a lifetime. A related study on food shows that even the \Jaluminium\j we get from food is well within the safe limits determined by the World Health Organization.\p
The study analysed the amount of \Jaluminium\j in the blood and urine of 29 healthy volunteers aged between 36 and 76, while on a strictly controlled diet and consuming alum-treated drinking \Jwater\j. It was funded by the \JWater\j Services Association of \JAustralia\j (WSAA) and conducted by the Centre for Advanced Analytical Chemistry at CSIRO \JEnergy\j Technology.\p
In brief, the results of blood and urine analysis showed that there was no significant detectable increase in the concentration of \Jaluminium\j in the blood of volunteers after consuming food or alum-treated drinking \Jwater\j. Age and gender made no difference to the amount of \Jaluminium\j absorbed from drinking \Jwater\j.\p
\BKey names:\b Jenny Stauber, Fiona Cumming, Jane Allen\p
#
"Teraflops and terabytes",882,0,0,0
(Nov '98)
November saw a supercomputing milestone achieved when a team of researchers tweaked their simulation of metallic magnetism to run at 1.002 teraflops - more than one trillion calculations per second. They got to this result by using a 1,480-processor Cray T3E supercomputer at the manufacturer's facility in \JMinnesota\j, capping an already remarkable scaling up of the code to run on increasingly powerful massively parallel supercomputers.\p
This bettered a result of 657 Gigaflops (billions of calculations per second) on a 1024-processor Cray/SGI T3E supercomputer some months ago.\p
A gigaflop is defined as a thousand million (10 to the power of nine) floating point operations per second, while a teraflop is a million million (10 to the power of 12) floating point operations per second. Normal home computers have their speeds measured in terms of megaflops (10 to the power of six floating point operations per second).\p
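To put those units in perspective, here is a minimal back-of-the-envelope sketch in Python (the workload of 10 to the power of 15 operations is an invented example, not a figure from the record-setting run):\p
  # Rough flops arithmetic: how long would 1e15 floating point
  # operations take at various machine speeds? (Illustrative only.)
  ops = 1e15                     # hypothetical workload
  speeds = {
      "home PC (100 megaflops)": 100e6,
      "657-gigaflop run": 657e9,
      "1.002-teraflop run": 1.002e12,
  }
  for name, flops in speeds.items():
      seconds = ops / flops
      print(f"{name}: {seconds:,.0f} s (about {seconds/3600:.1f} hours)")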
As recently as the mid-1990s, the "Teraflop Club" was a mythical association of people who consumed outrageous amounts of computer time in order to produce a few simple pictures of glass balls with intricate ray-tracing techniques. Now, it seems, the club has just enrolled its first members.\p
(Note: the alternative speed rate, MIPS, or Millions of Instructions Per Second, is so frequently misused that it is often cynically translated as "Meaningless Indication of Processor Speed" or in other unflattering ways, except by people trying to sell computers.)\p
But if scientists need very rapid number-crunching computers to replace physical testing with computer models, the needs of business are a little different. What they demand is very rapid sorting of huge amounts of data. To do this, you need clever algorithms as well as monster machines, but given these, fast rates are just around the corner. \p
In fact, in early November, a 144-processor system was used to sort a terabyte of data - which is about the information contained in a million unabridged dictionaries or 1600 CD-ROMs - in under 50 minutes. The previous record, on a shared-memory supercomputer rather than a cluster of industry-standard computers, was 2.5 hours.\p
The system was made of a cluster of 72 off-the-shelf dual Pentium II Proliant servers running Windows NT and linked by Compaq's ServerNet - a matter which will no doubt be mentioned in a number of advertisements over the next few months, or until an even better performance is noted somewhere else. The system has been dubbed Kudzu, because, like the plant of the same name, it can grow in any direction.\p
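As a quick sanity check on those sorting figures, a few lines of Python give the effective throughput of each record (a rough sketch that ignores start-up and I/O details):\p
  # Effective sort throughput: one terabyte in 50 minutes versus 2.5 hours.
  TB = 1e12                          # bytes in a (decimal) terabyte
  kudzu = TB / (50 * 60)             # bytes per second, new record
  previous = TB / (2.5 * 3600)       # bytes per second, old record
  print(f"Kudzu: {kudzu/1e9:.2f} GB/s, previous: {previous/1e9:.2f} GB/s")
  print(f"Speed-up: {kudzu/previous:.1f}x")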
#
"It's in the contract?",883,0,0,0
(Nov '98)
One of the standard things every schoolchild learns somewhere in their science class is that things expand when they are heated. There are some alloys like \IInvar\i, which have very low \Iexpansion coefficients\i, but everything expands at least a small amount, or so we are taught.\p
Now physicists at the Johns Hopkins University and Bell Labs are reporting that they have found clues that subvert what once seemed to be a natural law. In a report in \INature\i in mid-November, they state that several \Iceramics\i show thermal contraction - they shrink as they are heated, and expand as they cool. Only one substance, zirconium tungstate, shows the odd behaviour at something close to room \Jtemperature\j and acts in this peculiar way at a constant clip. The overall shrinkage rate is stated to be 0.0005 per cent for each degree from -375 degrees to 700 degrees Fahrenheit, which translates to 0.0009% for each degree \JCelsius\j in the range from about -225°C to 370°C.\p
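The unit conversion is easy to verify: a Celsius degree spans 1.8 Fahrenheit degrees, so a per-degree-Fahrenheit rate is multiplied by 9/5 to give the per-degree-Celsius rate. A short Python check:\p
  # Convert a thermal contraction coefficient quoted per degree Fahrenheit
  # to the equivalent rate per degree Celsius, and convert the range limits.
  rate_per_F = 0.0005                # per cent shrinkage per degree F
  rate_per_C = rate_per_F * 9 / 5    # one degree C spans 1.8 degrees F
  def f_to_c(f):
      return (f - 32) * 5 / 9
  print(f"{rate_per_C:.4f}% per degree C")                      # 0.0009
  print(f"range: {f_to_c(-375):.0f} C to {f_to_c(700):.0f} C")  # -226 C to 371 C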
The explanation is that the material's atoms vibrate at low frequencies, causing the material to fold in on itself when heated. Because zirconium tungstate can counteract unwanted shrinkage or expansion effects in other materials, it should have many industrial applications. One possible application is forming composite-based components in next-generation fiber optic technology for optical networking.\p
Zirconium tungstate was first synthesized in 1959. But it has only recently attracted the interest of scientists. Its shrinkage was noted in 1968, but ignored because those researchers were seeking materials which did not expand or shrink at all. The material has now been studied in neutron scattering experiments, where neutrons are fired at the material, and the angles and speeds of particles flying off the material are noted.\p
With some fairly solid \Jmathematics\j, the arrangement of the atoms in the base unit can then be worked out. From these studies, the researchers discovered that the material vibrates at very low frequencies.\p
But what causes the low frequencies? It seems that one corner, or atom, of the material's pyramid-shaped building block is untethered and this gives it a low vibrational \Jenergy\j. As temperatures increase, this untethered atom begins pulling in its neighboring atoms, and the whole structure shrinks. On the other hand, in a closely packed structure, which is what you find in most materials, the atoms repel each other as the \Jtemperature\j increases, because there are no available spaces for the atoms to vibrate in, and so the material expands.\p
Lucent Technologies say they are evaluating a zirconium tungstate composite material as a potential packaging material for a "filter," or grating, used in glass optical fiber. The material's unique shrinkage properties would compensate exactly for variations in the glass fiber as temperatures change. This is needed to avoid multiple wavelengths, or channels, of light transmitted through a fiber becoming a scrambled mess.\p
This development has the potential to be very important as dense wavelength-division multiplexing (DWDM) becomes more common - this involves increasing the number of transmission channels to boost the capacity of fiber optic transmissions.\p
\BKey names:\b Collin Broholm, Art Ramirez, Debra Fleming, Gabriele Ernst and Glen Kowach\p
#
"A chemical bloodhound to sniff out landmines",884,0,0,0
(Nov '98)
Landmines commonly use TNT and related explosives to do their horrid work. Because modern landmines contain so little metal, they are very hard to detect, providing a new role for the faithful \Ibloodhound.\i Now, newly engineered polymers may be able to work as well as dogs in locating the landmines by detecting trace vapors of TNT and its derivatives. There were estimated to be 120 million unexploded landmines in mid-1998, and rather too few dogs to carry out the search tasks.\p
A report in the \IJournal of the American Chemical Society\i, published in late November, describes the new polymers. They are fluorescent polymers, activated by light to have high \Jenergy\j electrons. The polymer units are mounted on a chemical backbone, and when a molecule of TNT vapor passes between two of the units, it momentarily steals an \Jelectron\j, decreasing the \Jfluorescence\j and sending a signal to the detector's "brain." The system is expected to be very sensitive because one TNT molecule can deactivate many of these electrons.\p
Other applications could also develop from this technology, with the inventor suggesting that they could be used in a system that carries out spot-checks for anything from drugs to explosives.\p
\BKey names:\b Timothy M. Swager\p
#
"Cro-Magnon Venus figures explained",885,0,0,0
(Nov '98)
One of the oddest findings from early human sites of the \ICro-Magnon\i era is the so-called Venus figures. These are grotesquely over-endowed female figures, and people have argued for them as signs of a fertility \Jcult\j, and even as a form of \Jpornography\j, though this last notion seems rather unlikely. In late November, anthropologist Randall White announced on the \JInternet\j that he was about to publish in \IScience\i his theory that the female statuettes carved in various parts of Europe during the Paleolithic period were intended primarily to protect the health of mother and child during childbirth.\p
(This announcement before publication may seem odd, but he also presented his views at a conference at the University of \JPennsylvania\j, and it was in this context that the \IScience\i publication was foreshadowed.)\p
White has carried out a comprehensive examination of some 100 female statuettes which were carved in Europe between 30,000 and 20,000 years ago. The statuettes were excavated from five sites: Brassempouy in \JFrance\j, Grimaldi in \JItaly\j and Gagarino, Kostienki I and Avdeevo in \JRussia\j. He argues that the statuettes emphasize the various physical effects of \Jpregnancy\j on the mother. They do not emphasize or seem to encourage procreation. In those statuettes that have explicit sexual features, the reference is to \Jobstetrics\j. For example, statuettes from the Grimaldi excavation depict dilation for birthing.\p
White adds that where statuettes were well excavated, they were found buried in carefully dug pits as if they were being ritually offered. Perhaps more importantly, they showed little evidence of long-term use. All of the marks from the tools used to make them were well-preserved, and there is no evidence of polishing by handling that might lead one to believe that they were passed on from one generation to the next. This suggests that they were ritually buried shortly after production.\p
From these observations, White concludes that their purpose was not so much fertility as it was protecting the health of mother and child during birth. Furthermore, the primary focus would seem to have been on the well-being of the mother. He suggests that this makes sense, as fertility levels were not a concern with hunters and gatherers, as they would later become with animal herders, and more particularly with farmers, who typically have an excess of food, and are happy to have more mouths to feed if those mouths are attached to working hands.\p
#
"Dinosaur eggs, dinosaur embryo skin",886,0,0,0
(Nov '98)
In mid-November, \INature\i featured the news of a massive find of \Jdinosaur\j eggs from a \Jdinosaur\j nesting site, which dates from the late Cretaceous. It is approximately 70 to 90 million years old, and it is located near Auca Mahuida, in the Patagonian badlands of Argentina. The nesting ground was strewn with eggs of sauropods, plant-eating \Idinosaurs.\i \p
Researchers say the eggs were so plentiful in what they refer to as "the square-mile [about 2.6 square kilometers] nesting site" that it was virtually impossible to walk in the area without crushing egg shell fragments under foot. Details were also scheduled to appear in the December 1998 issue of \INational Geographic.\i\p
The new site has been named "Auca Mahuevo" for its tremendous abundance of eggs (\Ihuevos\i in Spanish). Dozens of the eggs still have unhatched \Jdinosaur\j embryos inside - with only six previously known fossilized \Jdinosaur\j embryos, this is a major find. As well as teeth and tiny embryonic bones, many of the fossilized eggs contain patches of delicate fossilized skin, providing the first glimpse of the soft tissue covering baby dinosaurs. The \Jfossil\j skin reveals a scaly surface, much like the skin of a modern-\Jday\j lizard. One of the fossils has a distinct stripe of larger scales near its center, which probably ran down the animal's back.\p
Most probably, the dinosaurs which laid the eggs were from the sauropod group we call the titanosaurs. The name is entirely appropriate, because while the embryos were only about 40 cm (15 inches) long, the adults would have reached a length of 13 meters, or 45 feet. The titanosaur identification comes from the teeth of the embryos, which are tiny and shaped like pencils: the only sauropods alive at the end of the Cretaceous period with teeth this shape were the titanosaurs.\p
The sauropods are best known to us through one of their members, \IBrontosaurus\i (more correctly, \IApatosaurus\i), and they flourished from 200 million to 65 million years ago. Previously, some scientists had speculated that the sauropods may have given birth to live young, but these eggs are clearly identified as being laid by sauropods.\p
Scientists working at the site believe that egg clusters were laid in the floodplain of ancient streams that periodically overflowed, burying the unhatched eggs on their banks in a layer of mud. The silt covering protected the eggs, and some of their contents, from scavengers and disintegration by the elements. Repeated cycles of egg laying and flooding could have produced the super-abundance of \Jfossil\j eggs and embryos found at the site.\p
Adult titanosaurs had bony, armored plates embedded in their skin. The skin of the embryos does not show any signs of armored plates, suggesting that these grew only after the dinosaurs had hatched. This growth pattern mirrors that seen in modern armored lizards and crocodiles, the juveniles of which lack the bony patches in the skin that are present in adults.\p
\BKey names:\b Luis M. Chiappe, Rodolfo A. Coria, Carmen Funes and Lowell Dingus\p
#
"New predatory dinosaur",887,0,0,0
(Nov '98)
One of the more spectacular \Idinosaurs\i, a spinosaurid, with curved claws like giant meat-hooks and a long, narrow, \Jcrocodile\j-like skull, has recently been excavated in the Ténéré Desert of central \JNiger\j. The 11-meter (36 foot) monster is both a new species and a new \Jgenus\j. The spinosaurids are part of the two-legged theropod group of dinosaurs that includes those favorite monsters, \ITyrannosaurus\i and \IVelociraptor.\i They are a peculiar group of fish-eating dinosaurs which have long, narrow jaws studded with cone-shaped teeth, a fin-like sail varying in height along their backs, and large, sickle-shaped thumb claws.\p
The new \Jdinosaur\j, which flourished around 100 million years ago, was described in \IScience\i in mid-November, when its name, \ISuchomimus tenerensis\i, was revealed. (The Greek for \Jcrocodile\j is "souchos", and so the \Jgenus\j name means "\Jcrocodile\j-mimic," while \Itenerensis\i refers to the desert where the skeleton was found.)\p
\ISuchomimus\i would have been something to write home about. An adult human would have stood at eye-level with the thigh of the \Jdinosaur\j's hind leg. The find was also important: while researchers knew of a couple of spinosaurid fragments found near their site, uncovering the most complete spinosaurid skeleton yet was something of a surprise.\p
The first spinosaurid, \ISpinosaurus\i, was discovered in \JEgypt\j in 1912 but it was destroyed during the bombing of Munich in World War II. More bones of \ISpinosaurus\i-like predators have since been found in \JNiger\j, \JBrazil\j, and Europe.\p
The new find has raised some new puzzles at the same time that it has delighted scientists. Two genera of spinosaurid were previously known from southern sites, and these were quite different from the European \IBaryonyx\i. The obvious and simple solution was that the spinosaurids were originally distributed across the enormous landmass of \JPangaea\j. As \JPangaea\j rifted and the Tethys Sea formed between the northern and southern halves (which would become Laurasia and \JGondwanaland\j, and later still the continents of the Northern and Southern hemispheres), \IBaryonyx\i evolved separately on the northern continent and the two more closely related spinosaurids evolved to the south.\p
The puzzle arises from the fact that \ISuchomimus\i is more like \IBaryonyx\i than it is like the southern spinosaurids found in \JEgypt\j and \JBrazil\j. The similarities between the two dinosaurs suggest that \ISuchomimus'\i ancestors evolved in the north. Later, spinosaurids from the north must have colonized the Southern continent via a land bridge across the Tethys Sea.\p
#
"The world's oldest flowering plant - for now",888,0,0,0
(Nov '98)
The history of \Iplant fossils\i has just been given a good shake, with the discovery of a \Jfossil\j plant in China. Before this find, the oldest-known flower was a 115-million-year-old specimen found in \JAustralia\j in the late 1980s. The Yixian formation in which the \Jfossil\j was found has been dated to about 120 million years old by Canadian researchers, but Chinese researchers have used \Jradiometric dating\j to place the age of the plant at between 142 million and 148 million years old.\p
Either way, the plant remains the oldest flowering plant \Jfossil\j, although the discoverers think it possible that the record will eventually extend back a few more million years. The report, which featured as the cover story in \IScience\i in late November, describes one of a line of plants related to the \Jmagnolia\j. The flower lacked petals, but in botanists' terms, it is a true flower, because it had carpels, or leaf-like pods which opened to release seeds. Non-flowering plants have no such structures. "It's what we would call a pre-floral flower," commented an American botanist, writing in the same issue of \IScience.\i\p
The \Jfossil\j consists of two stems, about 7 cm (3 inches) long. It came from a \Jlimestone\j deposit in the Yixian formation, made famous during the last decade or so for its large number of pristine fossils of dinosaurs, insects, fish and other ancient life. The formation, in the Liaoning province of north-east China, was the site of an amazing find, with a \ISinosauropteryx\i which contained a fossilized \Jmammal\j carcass in its gut, and an egg in its oviduct.\p
A key in identifying it as an early flowering plant was the presence of seeds which were preserved in the \Jfossil\j. The seeds provided solid proof that this was, indeed, a flowering plant, but that is about as far as we can go. There is no evidence to indicate if the \Jfossil\j is a tree or a shrub, though most palaeobotanists believe that the early flowering plants were probably woody.\p
\BKey names:\b David Dilcher and \JSun\j Ge\p
See also \ILargest \Jdinosaur\j unearthed!,\i May 1997 updates\p
#
"Finding a hotspot",889,0,0,0
(Nov '98)
Scientists have believed for many years that the pattern of volcanic islands in the Pacific Ocean marks places where "hot spots" have operated, bringing hot \Jmagma\j from deep within the \Jearth\j up towards the surface. The idea is that the crust moves across a "plume" of hot material, and where the plume hits the crust, volcanoes occur. As often happens in science, when the discoveries start to come, they come in a rush.\p
It was only in mid-1997 (see \BMapping \Jfossil\j hot spots\b, May 1997) that the hot spot theory got a real boost, when hard evidence became available. In August 1997, we reported a view that \Jmagma\j plumes may be found on Venus (\BVenus still active?\b), while February 1998 saw us reviewing the subject (\BHot spots in the news\b). We have carried two stories on the way plumes have shaped \JAfrica\j in the past two months.\p
Now scientists at the University of \JCalifornia\j, Santa Cruz, may have located the origin of the Hawaiian plume at the boundary between \JEarth\j's mantle layer and its metallic core. This plume has probably been there for about eighty million years, building \JPacific islands\j such as Hawaii throughout that time, and now, with a report in \INature\i in mid-November, the plume must be regarded as an established fact.\p
The researchers found evidence of the Hawaiian plume at the very base of the mantle, at 2,890 km (about 1,800 miles) beneath the surface of the \JEarth\j. At this depth, the molten outer layer of the \JEarth\j's core heats the overlying rock at the base of the mantle. They used seismic waves generated by earthquakes to probe these deep layers of the \JEarth\j, and Sara Russell identified a structural pattern in the boundary layer between the mantle and the core which suggests material is flowing horizontally toward the base of the Hawaiian hotspot and then rising vertically. In the cautious language of the scientist, she says "We're seeing a change from horizontal to vertical structure that seems to be related to the Hawaiian plume".\p
To other observers, off the record and less constrained, Russell has proved beyond doubt that the plume exists. The Hawaiian hotspot is the most productive plume-related hotspot in the world - Hawaii's Mauna Loa, for example, is described as the most massive mountain on \JEarth\j, occupying 10,000 cubic miles and rising 30,000 feet from the seafloor. (In metric terms, that is some 40,000 cubic kilometers, piled up to a height of more than 9 kilometers.)\p
The Pacific plate is traveling over the hotspot, and a series of underwater mountains, island remnants, extending north-west from Hawaii, trace former positions of the hotspot. Russell used seismic waves from earthquakes in the region of \JTonga\j and \JFiji\j to detect the deep structures. The earthquakes, with magnitudes between 5.0 and 7.0, occurred deep in the \JEarth\j and generated seismic waves that passed through the deep mantle beneath the Hawaiian hotspot before being detected by seismic instruments on the West Coast of the United States.\p
Signals from events of this size can be detected anywhere in the world with sensitive seismometers. They are strong enough to be felt on nearby islands, but few of them get into the news. In any case, the data which are needed can be obtained from \Jseismology\j recording stations in \JCalifornia\j and Oregon. Many new stations have been added in this area in recent years, allowing a more detailed analysis of deep structure beneath Hawaii than ever before.\p
The standard method used, seismic \Jtomography\j, relies on the same types of computations that generate \Icomputerized \Jtomography\j (CT)\i scans of the body from x-rays passing through at different angles. As seismic waves radiate outward from the epicenter of an \Jearthquake\j, their speed and other properties are affected by the different types of rock they encounter. By comparing seismic waves that take different paths through the \JEarth\j, scientists can begin to build a sensible model of the \JEarth\j's internal structure.\p
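For readers who want a feel for the computation, here is a deliberately tiny Python sketch of travel-time tomography (the block layout and numbers are synthetic, not real seismic data): if we know how far each wave travels through each block of rock, and we measure the total travel times, we can solve for the slowness (the reciprocal of velocity) of each block.\p
  import numpy as np
  # Toy travel-time tomography: 4 rock blocks, 5 ray paths.
  # L[i, j] = distance (km) that ray i travels through block j.
  L = np.array([[10.,  0.,  5.,  0.],
                [ 0., 12.,  0.,  3.],
                [ 6.,  6.,  0.,  0.],
                [ 0.,  0.,  8.,  8.],
                [ 4.,  0.,  4.,  4.]])
  true_slowness = np.array([0.125, 0.125, 0.2, 0.1])  # s/km (v = 8, 8, 5, 10 km/s)
  times = L @ true_slowness                           # "observed" travel times
  # Recover the slowness of each block by least squares,
  # then convert back to velocities.
  slowness, *_ = np.linalg.lstsq(L, times, rcond=None)
  print("velocities (km/s):", np.round(1 / slowness, 2))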
Russell's study included an analysis of how the seismic vibrations are polarized. This polarization analysis is sensitive to flow-induced structural changes which happen when liquid \Jmagma\j is flowing under pressure. This is how Russell revealed, for the first time, variations at the base of the mantle which suggest a localized transition from horizontal to vertical flow.\p
Interestingly, an analysis published recently in \IGeophysical Journal International\i predicted that the base of the Hawaiian plume would be exactly where Russell has located it, if it extended into the lower mantle layer. Because the plume is rising through layers of rock in the mantle that are moving horizontally, the plume will be deflected as it rises through the shifting rock.\p
\BKey names:\b Sara Russell, Thorne Lay and Edward Garnero\p
#
"Plumbing the depths",890,0,0,0
(Nov '98)
In other words, we now have a new picture of what is happening beneath the \JEarth\j's surface, but is it over yet? Probably not - certainly not if Bruce Marsh is right. The standard view of \I\Jigneous rock\j\i is that it begins as a liquid with no crystals, then it begins growing crystals that eventually produce recognizable, intricate layering in the body. All of the \Jgeology\j textbooks explain magmatic differentiation, the way in which the crystals sort themselves out by floating upwards or sinking, based on the idea that the original \Imagma\i had no crystals in it, an idea that was constructed by \INorman Bowen.\i\p
Now Bruce Marsh, professor of \Jearth\j and planetary sciences at Johns Hopkins University, is questioning this, and his questions are striking a responsive chord with other geologists. For more than 25 years, Marsh has been working to understand the deep underground systems that move \Jmagma\j into the \JEarth\j's crust. The problem is that the effort to "get into the boiler rooms," as he calls it, has been particularly difficult because so little of it is actually visible.\p
"You find pieces of systems visible all over," Marsh said. "But most places only show you chunks of the system, like finding bits of a furnace strewn along the side of a road. You never see how, say, you hook the blower up to it or the fuel pipes and the electric lines and everything else. How does it all fit together? What's it look like when the whole system is connected? That's been one of the big problems."\p
The breakthrough came with a chance find in 1993 of large crystals in the rocks of the Dry Valleys of \JAntarctica\j. There, in a scattered rubble pile in the Dry Valleys, he found evidence that the basic assumption of crystal-free \Jmagma\j had to be questioned. Marsh had found his complete "furnace" exquisitely preserved, frozen in time.\p
Lacking vegetation and insects, with very little weathering to strip away the evidence, the cold Antarctic desert is the one place on \JEarth\j where large deposits of \Jmagma\j have been injected in long, layered seams called "sills," which run for hundreds of miles and which are visible to the naked eye.\p
The main point behind Marsh's ideas is that the long-held notion of giant subterranean chambers containing pure, crystal-free \Jmagma\j does not accurately portray the true underground systems that carry and process the primordial ooze shaping the \JEarth\j's crust. Instead, Marsh is methodically reporting a stream of evidence showing that the \Jplanet\j's internal plumbing comprises a plexus of smaller vertical columns of interconnected sheet-like chambers that transport a magmatic mush replete with crystals previously formed and recycled.\p
#
"Antarctic volcano",891,0,0,0
(Nov '98)
A study of drill cores taken from the sea floor off \IAntarctica\i's Victoria Land coast near Cape Roberts shows surprising evidence of enormous volcanic eruptions some 25 million years ago. Scientists believe that these eruptions would have significantly altered global temperatures at the time.\p
The evidence points to a major eruption, several times greater than the \IMount St. Helens\i eruption, and possibly even comparable with the eruption of \IVesuvius\i that destroyed \IPompeii\i in AD 79. The thickest distinct layer of volcanic debris is 1.2 meters thick, which suggests an eruption as dramatic as that of \IKrakatoa\i (also called, more correctly, Krakatau) in 1883. These layers contain volcanic pumice up to 1 cm in size, which suggests that the \Jvolcano\j was located within 50 to 100 kilometers of the drilling site and that it erupted in a style reminiscent of Vesuvius.\p
The evidence was gathered as part of the Cape Roberts Project. For the past two Antarctic field seasons, (US) National Science Foundation-sponsored scientific teams, working in collaboration with scientists from \JAustralia\j, Britain, \JGermany\j, \JItaly\j and New Zealand, have been drilling the Antarctic seabed. Their aim has been to obtain samples that would provide evidence about the climatic and geologic history of \JAntarctica\j during the last 100 million years.\p
Drilling this year had reached a depth of approximately 110 metres below the seafloor when this unexpected evidence of volcanic activity was encountered. It came in the form of layers of volcanic debris that were erupted explosively into the \Jatmosphere\j, and then settled through the air and into the ocean onto the seafloor. The thickness and coarseness of the main debris layer indicates a large-volume eruption that generated an ash \Jcloud\j that reached into the \Jstratosphere\j.\p
The layers of volcanic debris lie within beds of muddy sands, indicating a relatively quiet seafloor with occasional weak currents before and after the eruptions. This relatively quiet environment was disrupted at least twice and possibly as many as four times, by large and rapid inputs of volcanic debris (mostly pumice). The debris was supplied by voluminous eruptions from a nearby source, but the exact location and characteristics of the \Jvolcano\j are still unknown.\p
Eruptions like this probably had a significant impact, not only on the Antarctic environment, but also on the global environment of the time. A modern example of a similar, though much smaller, event is Mount Pinatubo, which cooled world climate by 0.5°C for a year after its 1991 eruption.\p
#
"The evaporation paradox",892,0,0,0
(Nov '98)
Scientists have no shortage of data in support of a trend to \Iglobal warming\i: \Jtemperature\j, precipitation, stream flow and \Jcloud\j cover records all indicate that warmer, rainier \Jweather\j is now more common in many regions of the world. There is just one problem: the measurements of \Jevaporation\j at \Jweather\j stations, taken from the simple metal pans used to measure this variable, indicate that less moisture has been rising back into the air from these pans. This has been dubbed the "\Jevaporation\j paradox."\p
\JWeather\j stations throughout the world usually place \Jwater\j-filled metal pans, about a foot deep and three feet in diameter, outdoors on wooden platforms. Each \Jday\j, a technician measures how much \Jwater\j has disappeared from the pan, and the evidence has been that the \Jevaporation\j figures in many parts of the world do not seem to align with other climate change indicators. \p
A paper in \INature\i in early November suggests that the \Jevaporation\j pan records do support predictions made by the climate computer models that are used to simulate the impact of increased greenhouse gases. The problem, say Marc B. Parlange and Wilfried Brutsaert, is that many researchers have misinterpreted the \Jevaporation\j measurements.\p
The basic principle of \Ihydrology\i is the \JEarth\j's \Jwater\j cycle, which begins with \Iprecipitation\i: \Jwater\j falls to the ground as rain or snow, soaks into the soil or moves into bodies of \Jwater\j, then evaporates to form clouds that start the cycle once again. Over the past quarter century, data collected from the United States, the former Soviet Union, India and Venezuela have all indicated that the pace of this cycle is picking up. In other words, it has been taking less time to complete each cycle of precipitation through \Jevaporation\j.\p
The reason for this quickening pace is uncertain: it may be a natural phenomenon or one that humans have triggered by releasing more carbon dioxide. Either way, the existence of an accelerated \Jwater\j cycle is now widely accepted.\p
So why would \Jevaporation\j figures have decreased in areas where rainfall and cloudiness have become more prevalent? If \Iless\i \Jwater\j vapor is moving into the \Jatmosphere\j, how could more rain and clouds be forming? The answer, it seems, is remarkably simple: the puzzled scientists had their eye on the wrong ball, and were asking the wrong question.\p
The rate of \Jevaporation\j from a free \Jwater\j surface depends on a whole range of factors, such as wind speed, \Jtemperature\j, and local \Jhumidity\j. The key issue is \Jhumidity\j: with everything else the same, a pan in a dry desert will record higher \Jevaporation\j rates than another pan in a rain forest. So if an area has more snow and rain, the \Jatmosphere\j may be much more humid, and this will reduce the \Jevaporation\j from the test pan. So "regional land-surface moisture" must be factored in when climate researchers interpret raw measurements from \Jevaporation\j pans.\p
If this \Jhumidity\j effect is taken into account, the paradox disappears, and \Jevaporation\j rates fall in line with other signs of climate change. In fact, in many situations, decreasing pan \Jevaporation\j actually provides a strong indication of increasing terrestrial \Jevaporation\j!\p
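One way this idea is often formalised in the hydrology literature is Bouchet's "complementary relationship" between pan-style (potential) evaporation and actual terrestrial evaporation. Here is a rough Python sketch, with invented numbers purely for illustration:\p
  # Bouchet's complementary relationship (a common formalisation of this
  # argument): actual evaporation E and potential (pan-like) evaporation
  # E_pot move in opposite directions around the wet-environment rate E_wet:
  #     E = 2 * E_wet - E_pot
  # All figures below are illustrative only (mm/day).
  E_wet = 4.0
  for E_pot in (7.0, 6.0, 5.0):          # pan evaporation falling...
      E = 2 * E_wet - E_pot
      print(f"pan {E_pot:.1f} mm/day -> actual {E:.1f} mm/day")  # ...actual rising
The sketch makes the paradox's resolution visible: as the pan reading falls, the inferred terrestrial evaporation rises.\p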
#
"December, 1998 Science Review",893,0,0,0
\JThe top 10 breakthroughs of 1998\j
\JA new Australopithecus fossil\j
\JWas Stone Age art the work of autistics?\j
\JLarge asteroid in Argentina -- how many killed?\j
\JThe first complete Gorgon\j
\JEarth's mean surface temperature expected to rise\j
\JWest Antarctic icesheet still stable \j
\JFossil signs of an old greenhouse\j
\JDo glaciers slow the volcanoes?\j
\JCalifornia's next volcano?\j
\JEarth's innermost core is solid \j
\JUnder pressure\j
\JLooking for alternatives to methyl bromide\j
\JTamoxifen and estrogen\j
\JExplaining green tea and cancer\j
\JHamburger with a cherry on top?\j
\JDiet and cancer\j
\JWine as a weapon against stroke\j
\JHow leprosy attacks us\j
\JLifeguard lung\j
\JPenicillium infections in HIV patients\j
\JThe genome of a nematode\j
\JEight calf clones\j
\JAnother CP violation?\j
\JResearch spending\j
\JThe great census debate\j
\JThe farthest quasar\j
\JA new pulsar\j
\JNEAR finds problems, solutions\j
\JRadiation belts bad for satellites\j
\JRobosurgeon does heart bypass\j
\JHeart tremors, earth tremors, where's the difference?\j
\JNo danger in implants\j
\JPedagogical agents\j
\JPalmPilots to rule the world?\j
\JThe oldest tropical ice core yet\j
\JGrandmothers can reduce NHS costs?\j
\JFly fishing could be costing the NHS \j
\JChocolate and sweets can make you live longer?\j
#
"The top 10 breakthroughs of 1998",894,0,0,0
(Dec '98)
In mid-December, having decided that nothing major would emerge between then and the new year, \IScience\i journal published its list of the top ten scientific breakthroughs for 1998. Top of the pops was the determination by physicists and astronomers that the universe is expanding at an accelerating rate, based on their observations of distant, ancient exploding stars (see \BHow the Universe Will End\b, October 1997, \BUniverse to Keep on Going\b, January 1998). This supports the existence of a mysterious, self-repelling property of space first proposed by \IAlbert Einstein,\i which he called the cosmological constant.\p
The work has been carried out by the High-Z \ISupernova\i Search Team, a collaboration of 21 scientists and 11 institutions around the world. The loosely organised team was started in 1995 by Brian Schmidt of the Mount Stromlo and Siding Spring observatories in \JAustralia\j and has since been joined by the \JSupernova\j \JCosmology\j Project, based at Berkeley Lab and headed by Saul Perlmutter. Working in friendly competition, the two groups are able to spark off each other, and guarantee that everything is double-checked, but so far, they appear to be "in remarkably violent agreement".\p
The two groups have studied one type of \Jsupernova\j, which essentially is the blast that happens when a dead star transforms into a natural thermonuclear bomb. In particular, they have been looking at Type Ia supernovae, very bright astronomical "standard candles" that all have the same intrinsic brightness, so that their apparent brightness gives us a direct estimate of their absolute distance.\p
Light from the supernovae being studied has been travelling 5 billion to 7 billion years, or between one-third and one-half the life of the universe. By comparing the distance of a Type Ia supernova with the redshift of its home galaxy, astronomers can calculate how fast the universe was expanding at different times in its history, revealing whether or not the expansion rate is accelerating. Good results depend upon observing many Type Ia supernovae, both near and far.\p
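The "standard candle" logic can be made concrete with the standard distance-modulus formula, m - M = 5 log10(d / 10 pc). A minimal Python sketch follows; the observed apparent magnitude is an invented example, and cosmological corrections are ignored:\p
  import math
  # Distance from a standard candle: if every Type Ia supernova peaks at
  # roughly the same absolute magnitude M, then a measured apparent
  # magnitude m fixes the distance d via m - M = 5 * log10(d / 10 pc).
  M = -19.3          # approximate peak absolute magnitude of a Type Ia
  m = 22.5           # hypothetical observed peak apparent magnitude
  d_parsecs = 10 ** ((m - M + 5) / 5)
  print(f"distance: {d_parsecs:.2e} parsecs "
        f"({d_parsecs * 3.26 / 1e9:.1f} billion light years)")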
A Type Ia \Jsupernova\j is a rare event. In a typical galaxy, Type Ia supernovae may occur only two or three times in a thousand years, and they need to be detected while they are still growing brighter. The problem for astronomers was getting \Jtelescope\j time when there was no assurance that there would be any success, yet without the \Jtelescope\j time, they could not show that success would follow.\p
The method now used involves making images of 50 to 100 patches of sky, just after a new moon, when the sky is dark. Each patch contains roughly a thousand distant galaxies. Three weeks later the same patches are imaged again, at which time supernovae occurring anywhere in these fields show up as bright points of light. The typical catch averages around two dozen, and because these supernovae will not yet have reached their brightest over the three weeks, this guarantees that the next observation period, just before the new moon, will have some excellent supernovae to study.\p
One advantage of working with the Type Ia supernovae is that they are so similar, whether nearby or far away, that the time at which an explosion started can be worked out just by looking at its spectrum. In fact, Type Ia supernovae which exploded when the universe was half its present age behave in exactly the same way as they do today.\p
In 1994, most astronomers believed the universe was slowing down, and the questions asked were, how quickly is it slowing? What is the mass density of the universe? Is there enough mass to reverse expansion and eventually end the universe in a Big Crunch?\p
In 1917, when Einstein was trying to balance his equations of General Relativity and preserve a picture of a balanced universe that would neither expand nor collapse on itself, he proposed that there might be a property of empty space, which he called the cosmological constant, which was somehow involved in balancing the universe between expansion and collapse. When Edwin Hubble showed that the universe is expanding, Einstein dismissed his cosmological constant idea, calling it "the biggest blunder of my life."\p
First instincts are often best, and now it looks as though Einstein spoke too soon. If things are being pushed apart faster now than when the universe was young, something has to be doing the pushing, and the cosmological constant, known by the Greek letter lambda, looks like the prime suspect. And barring change in the value of lambda, whose exact nature remains a mystery, the universe will expand forever.\p
The Z symbol in High-Z refers to the \Jredshift\j of the supernovae, and the researchers are now searching for more supernovae with high redshifts to get more information about the early universe. They are also looking for nearby supernovae, those with low redshifts, to make sure that type Ia supernovae remain essentially the same at all ages, and make dependable "standard candles".\p
A few press reports have referred to the cosmological constant as "antigravity". While it is indeed a component of gravity that pushes things apart instead of pulling them together, this is not the sort of force you can harness for antigravity boots. We are talking serious theory here, but theory which still has to be tested: physicists are now toying with the idea that even the emptiest space contains gravitational \Jenergy\j which helps make the universe fly apart. If even empty space has \Jenergy\j in it, that brings us to totally new physics, which might lead anywhere if it is correct, and nowhere very much if it is wrong -- though the replacement theory is going to be very interesting indeed.\p
On the other hand, Jeffrey Peterson, a Carnegie Mellon astronomer, told a conference in Paris on December 18 that his results, using a microwave \Jtelescope\j in \JAntarctica\j that measured the dimensions of extremely distant gas clouds, indicated that the material of the universe was given just the right kick by the Big Bang to expand forever, never collapsing, but also never becoming so dilute that gravity can be ignored.\p
In other words, Peterson believes that the universe will slow down, though never by enough to stop completely. The \Jtelescope\j, Viper, is operated by the Center for Astrophysical Research (CARA) in \JAntarctica\j at US National Science Foundation's Amundsen Scott South Pole Station. It is used to make images of the faint structure, an anisotropy, seen in the sky. According to current astrophysics, the glowing clouds of gas that Peterson observed with Viper would be, in astronomical terms, relatively close by and would measure as much as one-half degree of arc across the sky -- at least, they would be, if the universe is slowing down.\p
This size, the same as the apparent diameter of our \Jsun\j or our moon, is exactly what the clouds measure, and as this is the size predicted by \Jinflation\j theory, it looks as though we have a contradiction. The puzzle remains.\p
The runners-up were also interesting. \ICircadian rhythms\i (\BSeeing the light\b, January 1998, \BCircadian clockworks uncovered\b, May 1998, \BJet-lagged blue-green \Jalgae\j\b, September 1998) came second, because the effect has featured in a number of discoveries relating to the mechanisms behind these behaviour patterns. Even more interesting, it appears that fruit flies and mice, separated by nearly 700 million years of \Jevolution\j, share the same timekeeping proteins.\p
\BThe remaining runners-up, in no particular order, include:\b\p
Combinatorial chemistry (\BA saving of time\b, April 1998);\p
Genomics (see \BThe \Jgenome\j of a \Jnematode\j\b, this month, among many others, most containing the word "\Jgenome\j");\p
Neutrino mass (see \BNeutrinos -- do they have mass\b, June 1997, \BNeutrinos have a cycle too\b, December 1997, and \BNeutrinos oscillate, so they have mass\b, June 1998)\p
Teleportation in quantum leaps (see \BTeleportation a tiny step closer\b, December 1997)\p
Biochips (see, for example, \BNew sensor for \Jbacteria\j\b, April 1998)\p
Cancer therapies based on drugs such as Tamoxifen, Herceptin, and Raloxifene (see \BTamoxifen and estrogen\b, this issue);\p
\BOther hot issues include:\b\p
\IPotassium channel structure\i, the way cell membranes manage to allow in or out certain ions that are essential for sending messages along nerves - the basic connection that allows people to see, think, taste, and touch;\p
\IMolecular \Jmimicry\j\i, where we now have evidence of a link between autoimmune disorders and infections such as Lyme disease or herpes simplex virus 1. Researchers have now confirmed earlier suspicions that the infection causes the immune system to attack the body's own molecules. This development may lead to better understanding and treatment of a number of autoimmune disorders including diabetes and multiple sclerosis.\p
\BAnd the best bets for 1999?\b\p
Photonic band gap materials and devices, aging, millennial-scale climate change, carbon sinks and the global carbon budget, bioterrorism, and allergies. More on this in December 1999.\p
#
"A new Australopithecus fossil",895,0,0,0
(Dec '98)
The world's first-ever entire \Jape\j-man skull has been discovered at Sterkfontein, near \JJohannesburg\j in South \JAfrica\j, by Dr Ron Clarke of the Wits Palaeo-\JAnthropology\j Research Group. The date has been variously quoted at 3.5 or 3.6 million years, but for the moment, it can only be assigned to an age ranging from 3.22 mya to 3.58 mya.\p
Not only is the skull complete, with both its lower and upper jaws and even its teeth in contact, but the skull is with its skeleton. The skull, limbs and torso belong to a four-foot-tall (1.2-metre) creature which lived in a wooded area and was capable of climbing trees. Past finds of \IAustralopithecus\i have been either a partial skull or a partial skeleton, but never both from the same creature.\p
It is also the first time that such a complete ancient tibia with \Jfibula\j (lower leg) has been found - or such a complete \Jape\j-man foot, and radius (outer forearm). Phillip Tobias, Professor Emeritus of the Wits Anatomical Sciences Department, has assigned the specimen to \IAustralopithecus africanus\i, linking it to such finds as the \I\JTaung skull\j\i, and "Mrs Ples". Tobias, however, is well-known as a "lumper", who once blandly assured your reporter that he regards "Lucy" as a member of \IA. africanus,\i while most workers, even the lumpers, class Lucy as a member of \IAustralopithecus afarensis\i. For the moment, most others seem to be referring to the specimen just as \IAustralopithecus.\i\p
Clarke, described by colleagues as "his own man", is reported to be withholding judgment on just where the \Jfossil\j fits in the \Jhominid\j line. The foot bones have both apelike and human features, suggesting the individual was comfortable climbing trees and walking upright. At the same time, it has "massive" cheekbones, suggesting that it had large jaw muscles, so the skull does not appear to match known examples of the "gracile" variety of \IAustralopithecus\i known as \IA. africanus.\i Nonetheless, he is currently assigning it to \IAustralopithecus,\i rather than to \IParanthropus,\i as the robust hominids are usually called these days.\p
Clarke says it will be a year before the rest of the skeleton is unravelled. The find began with the identification of four \Jfossil\j foot bones of a creature that Tobias dubbed 'Little Foot'. Clarke pointed out in 1994 that the owner of these \Jfossil\j feet had some \Jape\j-like features and some human features. It walked upright yet was also a tree-climber. More recently, Clarke found some lower leg bones which fitted the foot bones, and he began to consider that the rest of the skeleton might well still be in the rocks at Sterkfontein.\p
Armed with a cast of the fragment of the newly-found shinbone, Clarke's two sharp-eyed assistants, Nkwane Molefe and Stephen Motsumi, set out to search the cave walls for a matching cross-section. Amazingly, within two days they found their "needle in a haystack". It was at the opposite end to where they had previously excavated, but the fit was perfect, and long sessions of chiselling the concrete-like rock exposed the lower leg bones and a complete forearm bone.\p
Then came a blank, with no further bones appearing, until Clarke noticed a displacement in the thick lime layers and deduced the rest of the skeleton could have fallen further down. He changed the search area. Soon, some more bone appeared in the rock, and then they realised that it was the back of the mandible, the lower jaw. Then suddenly, the searchers saw a glint of enamel, the sign that they had found a tooth. More importantly, it was an upper tooth, revealing that they had the whole of the skull. It was, says Clarke, a palaeontologist's dream come true.\p
When the foot bones were originally dated, they had been found in a loose block, and while the block appeared to be from a Silberberg Grotto layer known as Member 2, there was still some doubt about the true location, and so about the date. Member 2 in the Silberberg Grotto is about 10 metres below (and therefore older than) the deposit from which such famous specimens as Mrs Ples were excavated, which is known as Member 4.\p
The \Jantelope\j bones which are present in Member 4 are of animals found in East \JAfrica\j between about 2.5 and 2.7 million years ago, so Member 2 must be older than this -- a 1995 estimate suggested that the gap probably represented a period of several hundred thousand years, making the foot bones somewhere around 3 million years old.\p
Because the rest of the skeleton has now been located \Iin situ\i in the Silberberg Grotto, we know that it comes from Member 2, and we also have extra information available. There are several layers of flowstone or stalagmite material, and flowstone carries a record of the \Jearth\j's magnetic field as it was when the stone was formed.\p
Tim Partridge, the expert who has been doing the dating work, took careful samples of all the flowstones, and sent them off for analysis at the Geomagnetism Laboratory of the University of Liverpool by John Shaw and Dave Heslop. Their measurements indicated that no fewer than five changes in magnetic polarity occurred during the time represented by the deposition of these flowstones (in other words, the position of the \Jearth\j's magnetic pole changed from the "normal" to the "reversed" position, or \Ivice versa\i, on five separate occasions).\p
The timing and duration of each change in the \Jearth\j's magnetic field for the past 118 million years is known with great accuracy. Placing any local sequence of changes, like that in the Silberberg Grotto, within the complete record means first working out broad upper and lower age limits from other evidence. In this case, the \Jantelope\j remains retrieved from Member 4 provide an upper limit of 2.7 million years; while a lower limit of about 4.0 million years may be assumed from the oldest (rather fragmentary) \Jhominid\j remains yet found in East \JAfrica\j.\p
Given these assumptions, the matching of the flowstone layers above and below the skeleton to the global palaeomagnetic time scale appears quite clear: the \Jhominid\j remains are positioned between the Gauss-Gilbert reversal boundary and the end of the Mammoth event. This is why Partridge has been able to conclude that the specimen can be assigned a palaeomagnetic age of between 3.220 and 3.580 million years.\p
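To see how this bracketing works, here is a minimal sketch in Python -- not the researchers' actual code -- using rounded published ages for the polarity intervals that fall between the broad limits. The interval names and dates are standard, but the data structure and the function are purely illustrative.\p
# A sketch only: polarity intervals inside the broad 2.7-4.0 Ma limits,
# listed youngest first as (name, top age in Ma, base age in Ma, polarity).
GLOBAL_RECORD = [
    ("Gauss (upper)",  2.70, 3.04, "N"),
    ("Kaena",          3.04, 3.11, "R"),
    ("Gauss (middle)", 3.11, 3.22, "N"),
    ("Mammoth",        3.22, 3.33, "R"),
    ("Gauss (lower)",  3.33, 3.58, "N"),
    ("Gilbert (top)",  3.58, 4.00, "R"),
]

def placements(n_reversals):
    """Every contiguous run of intervals showing exactly n_reversals
    polarity changes; each run is a possible placement for the whole
    flowstone sequence. (Adjacent intervals always alternate polarity,
    so a run of k+1 intervals contains exactly k changes.)"""
    runs = []
    for i in range(len(GLOBAL_RECORD) - n_reversals):
        run = GLOBAL_RECORD[i:i + n_reversals + 1]
        runs.append([interval[0] for interval in run])
    return runs

# Five polarity changes were measured in the Silberberg Grotto flowstones,
# and within the 2.7-4.0 Ma limits there is only one way to place them:
print(placements(5))
# [['Gauss (upper)', 'Kaena', 'Gauss (middle)', 'Mammoth',
#   'Gauss (lower)', 'Gilbert (top)']]
# With the placement fixed, the flowstones above and below the skeleton
# match the end of the Mammoth event (3.22 Ma) and the Gauss-Gilbert
# boundary (3.58 Ma), reproducing the age range quoted above.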
The find will now cause people to question the standard assumption that East \JAfrica\j was the cradle of humankind. The oldest skeleton discovered so far has been that of Lucy, a female whose remains were unearthed in \JEthiopia\j, but this find takes the South African \Jhominid\j record back even earlier, and possibly back as far as the hominids which left their footprints at \ILaetoli.\i\p
The Sterkfontein \Jfossil\j site lies near the summit of a low hill to the south of the Bloubank River valley, some 9.5 km north-west of Krugersdorp, about 50 kilometres from \JJohannesburg\j. The \Jfossil\j cave is at an elevation of about 1480 metres, and the highest point of the hill is 1486 metres. The main \Jfossil\j cave overlies extensive underground caves developed at an elevation of 1450 metres; these caves boast an underground 'lake' and many beautiful stalactite and stalagmite formations.\p
The caves are extensive dolomite deposits which proved to be one of the richest sources of early \Jhominid\j fossils in \JAfrica\j after mining began there in 1896 to obtain \Jlimestone\j for industrial purposes. The fossils formed after the caves first opened, around 3.5 million years ago, as \Jgroundwater\j dissolved away some of the \Jlimestone\j. When remains fell into the caves, they were cemented by limey \Jwater\j to form breccia infills, time capsules which retain a record of the fauna of the time, down to this \Jday\j. As well as hominids, the infills have revealed fossils of sabre-toothed cats, giant leaf-eating monkeys, small primitive baboons and extinct antelopes.\p
The cave mouth probably had trees around it -- more than 300 fragments of wood have been located, and these indicate that a tropical gallery forest fringed by savanna existed at Sterkfontein over 2.6 million years ago. The picture we get is of carnivores sitting in trees, eating their prey and dropping the remains, where the remains either fell right into the cave mouth, or were washed in by later rain. We can only guess at how a whole \Jhominid\j ended up in the cave, but luckily for us it did.\p
#
"Was Stone Age art the work of autistics?",896,0,0,0
(Dec '98)
Standard theory says that the art work of the people we call \ICro-Magnon Man\i reveals their power of \Icommunication:\i without speech, say the experts, they could not have expressed themselves as they did. An interesting alternative view has just been set down in the \ICambridge Archaeological Journal,\i where Nicholas Humphrey of the New School for Social Research in New York argues that cave art was produced by people with underdeveloped minds.\p
Humphrey is a psychologist, and he says he has found uncanny similarities between the cave paintings at two sites in \JFrance\j -- Chauvet and \JLascaux\j -- and the drawings of an autistic girl called Nadia, born in \JNottingham\j in 1967. Despite having severe learning difficulties, Nadia drew with startling realism from the age of three. Many of her drawings are reproduced in a 1977 book about Nadia by Lorna Selfe, a British psychologist.\p
Both sets of pictures show striking realism, and both capture animals in motion, using lines to contour body shapes. The cave artists also often drew animals haphazardly on top of one another, as Nadia did. Humphrey draws other parallels as well, but the balance of opinion seems to be against him -- for now, anyhow. The consensus seems to be that many children lack language, but very few of them draw like Nadia. In short, a lack of language does not automatically produce great art.\p
#
"Large asteroid in Argentina -- how many killed?",897,0,0,0
(Dec '98)
Did \Iasteroids\i kill Argentinian fauna? Around 3.3 million years ago, a massive asteroid hit coastal south-eastern Argentina, according to a mid-December report in \IScience.\i Peter Schultz and Marcelo Zarate reached this conclusion after looking at cliffs of windblown dust deposits called loess near the coastal town of Miramar. The cliffs contain a layer of glassy, bubble-filled slabs up to 2 metres across. First reported in 1865, these rocks are called escoria and had been attributed to everything from \Jlightning\j strikes to ancient human-tended fires. The escoria lies in a layer stretching 30 km (18 miles) along the cliffs.\p
Chemically, the glass resembles the loess, but with all the right impact signatures: unusually high levels of \Jmagnesium\j oxide and \Jcalcium\j oxide, significant amounts of iridium and \Jchromium\j, and only the tiniest traces of \Jwater\j. The glass has twists and folds, streaky flow patterns typical of rapidly cooled impact glass, and there are mineral breakdown products that require temperatures even hotter than those of \Jlightning\j and volcanoes, so the researchers suggest that a body, about a \Jkilometre\j or so in diameter, hit just offshore, producing a now-buried crater.\p
The radiometric date, 3.3 million years ago, is within 100,000 years of an abrupt, temporary 2°C cooling of ocean bottom waters recorded in Atlantic and Pacific sediments. The evidence for this is found in the ratios of heavy to light oxygen \Jisotopes\j in sediment cores from the nearby ocean floor.\p
There was a sudden \Jextinction\j which affected, among others, 35 genera of mammals and a flightless bird, mostly kinds known only from that region. A dusty layer just above the glass layer shows \Jfossil\j evidence of the \Jextinction\j event; the victims include large \Jarmadillo\j-like creatures, ground sloths, several groups of related hoofed mammals and a flightless carnivorous bird. Other fauna later appeared in their place.\p
For the moment, the event is being treated as an interesting \Jcorrelation\j, but with high rates of \Jextinction\j in that time, there is no proof yet that the asteroid actually caused the extinctions.\p
#
"The first complete Gorgon",898,0,0,0
(Dec '98)
South \JAfrica\j produced another remarkably complete \Jfossil\j at almost the same time: a therapsid -- either a mammalian \Jreptile\j or a reptilian \Jmammal\j -- from a group called the gorgonopsids.\p
The \IGorgon\i of ancient \Jmythology\j was a hideous monster. The gorgonopsid of palaeontology was a ferocious predator with both reptilian and mammalian characteristics that became extinct 250 million years ago. While there have been many partial finds of gorgonopsids, palaeontologists from the South African Museum and the University of Washington have discovered what appears to be the first complete \Jfossil\j of a gorgonopsid. The find, in mid-November, was reported on the \JInternet\j in early December.\p
Searching on a mile-high plateau in South \JAfrica\j's \JKaroo\j region, on the Permian-Triassic boundary, the team found a small bone fragment jutting from a sedimentary layer associated with the latest part of the Permian period. Excavation revealed the complete skeleton of an individual, probably from the \Jgenus\j \IRubidgea.\i The skeleton was lying down, with its head curled to the right and all four limbs tucked beneath the body. The lower parts of the skeleton, including the pelvis and lower limb bones, extended into underlying rock.\p
In the past 150 years, more than 12,000 fossils have come from the \JKaroo\j region, including several hundred \Jgorgon\j fragments, but no complete specimen has ever been found, and while \Jgorgon\j fossils also have been found in China and \JRussia\j, none of those is complete.\p
The specimen will be given to vertebrate anatomists for study before it is prepared and mounted for display in the South African Museum in Cape Town. So far, we know that the skull is 75 cm (30 inches) long, the backbone is two metres (6.5 feet) long, and the overall length is more like three metres (around ten feet).\p
In the late Palaeozoic, the era just before the dinosaurs, gorgonopsids were the largest predators around. Their heads appeared somewhat dog-like, with large sabre-tooth upper canine teeth up to 10 cm (4 inches) long. Though they had a somewhat mammalian appearance, their eyes were set at the sides of the head like those of a lizard, and the body was probably covered with scales rather than hair. The gorgons would have resembled a cross between a lion and a large monitor lizard.\p
The therapsids were one of the major groups of vertebrates. They shared a common descent with the reptiles, but the gorgons and other therapsids were on a line that gave rise to mammals rather than dinosaurs, lizards, turtles or birds. They were wiped out in the world's most severe \Jextinction\j event, the Permo-Triassic mass \Jextinction\j of 250 million years ago that killed 80% to 90% of all species on \JEarth\j. The new specimen, coming from the very end of the Permian, must have been one of the very last gorgons on \Jearth\j.\p
\BKey names:\b Roger Smith, Peter Ward\p
#
"Earth's mean surface temperature expected to rise",899,0,0,0
(Dec '98)
A new modelling study at the US National Center for Atmospheric Research (NCAR) in Boulder, \JColorado\j predicts that the \Jearth\j's mean surface \Jtemperature\j will rise 0.2 kelvin (0.36 degrees Fahrenheit) per decade over the next four decades. Results were derived from a two-century simulation of \Jearth\j's climate. Other results expected by the end of 1998 include information on climate changes related to precipitation, cloudiness, and large-scale run-off.\p
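Taken at face value, the projected trend compounds to just under one kelvin of warming by around 2040:\p
\[ \Delta T \approx 0.2\ \mathrm{K\,decade^{-1}} \times 4\ \mathrm{decades} = 0.8\ \mathrm{K} \approx 1.4\ {}^{\circ}\mathrm{F} \]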
The climate simulations were driven by observed changes in atmospheric trace-gas concentrations for the period 1870 to 1990 and by two projected trace-gas scenarios for the period 1990 to 2100. The greenhouse gases included in the model are carbon dioxide, \Jmethane\j, nitrous oxide and \Jchlorofluorocarbons\j 11 and 12. Emissions of sulfur dioxide (SO\D2\d) resulting from human activity are also included, with projected increases over time. Natural SO\D2\d emissions were assumed to be constant. SO\D2\d is important because it is converted in the \Jatmosphere\j into sulfate \Jaerosol\j, which reflects some sunlight back into space and may slow or reverse global warming trends in certain regions.\p
#
"West Antarctic icesheet still stable",900,0,0,0
(Dec '98)
A report in the journal \IScience\i indicates that the interior of the West Antarctic ice sheet, the largest grounded repository of ice on the \Jplanet\j, is stable. It is not melting rapidly, it is reasonably stable and it has been so for more than a century. An international team of scientists has analysed five years of \Jsatellite\j radar measurements covering a large part of the southernmost continent to reach this conclusion. (See \BGlobal Warming -- Impact on \JAntarctica\j,\b March 1997)\p
At most, they say, the ice sheet may be melting at the rate of 1 \Jcentimetre\j a year, which converts to a sea-level rise of 1 \Jmillimetre\j, when the extra \Jwater\j is spread over the world's oceans. Most of the East Antarctic ice sheet sits on bedrock above sea level, while much of the West Antarctic ice sheet is grounded below sea level. The part of the ice which already displaces sea \Jwater\j adds little to sea level when it melts, but a complete melt of the West Antarctic ice sheet would still pour enough new \Jwater\j into the oceans to raise sea levels significantly.\p
Two European Space Agency remote sensing satellites, ERS-1 and ERS-2, were used to measure ice altitudes from 1992 through 1996. The 800-\Jkilometre\j (497-mile) orbits of the satellites reached to 81.5 degrees South, allowing them to regularly monitor at least 60% of the continent's grounded ice. The satellites can measure sea level to within 5 cm, but they are less accurate over ice. To overcome this inaccuracy, researchers had to devise new algorithms to decipher the raw ice sheet altimetry data and correct for several variables such as radar penetration below the ice surface and snow accumulation.\p
#
"Fossil signs of an old greenhouse",901,0,0,0
(Dec '98)
A small \Jvolcano\j erupted in \JIceland\j during December, but it was expected to die out by the end of the month, with no great harm being done. This was not the case 90 million years ago, when volcanoes around the world poured carbon dioxide into the \Jatmosphere\j, until the \JArctic\j was "as warm as present-\Jday\j \JFlorida\j". So at least once in \JEarth\j history, high amounts of the greenhouse gas warmed \JEarth\j to much higher temperatures than usual.\p
That, at least, is the story told by fossils of champsosaurs, as reported in \IScience\i in mid-December. The exciting part was the discovery of bones from a 2.4 metre (eight-foot) champsosaur, an extinct \Jcrocodile\j-like beast with a long snout and razor-sharp teeth, in rocks spanning somewhere between several hundred thousand and a few million years, suggesting that the warm conditions held for some time.\p
The two dozen bones the team found include a tibia, a \Jfemur\j, and ribs and vertebrae from both small and large champsosaurs, as well as turtle shells and several bones from fish. The champsosaurs lived in a freshwater environment on Axel Heiberg Island, a Canadian island west of \JGreenland\j, where they needed an extended warm period each summer to survive and reproduce. The island was then a little further south than it is today, but it was still well inside the \JArctic\j Circle, at 79 degrees north.\p
Based on the available \Jfossil\j evidence, the annual mean \Jtemperature\j in the \JArctic\j during the late Cretaceous period, from about 92 to 86 million years ago, was about 14°C (around 57°F). That would mean it was rarely, if ever, freezing during the winter (and for reptiles to survive, this is an absolute requirement). Summer temperatures must have consistently reached into the 30s on the \JCelsius\j scale, or the 80s and 90s on the Fahrenheit scale.\p
The fossils come from a four-metre (12 foot) deep sedimentary layer lying on a thick (300 metres, 1000 feet) layer of \Jbasalt\j, showing that the champsosaurs were in the area immediately after extreme volcanic activity. The next layer above is a black marine shale which is common in the \JArctic\j.\p
The volcanic eruptions were not the spectacular eruptions which feature on the TV news. Rather, they were "basaltic", where huge masses of \Jlava\j oozed out, and carbon dioxide floated skyward. Besides huge amounts of \Jlava\j in the \JArctic\j, where hardened \Jlava\j rock today measures more than a \Jkilometre\j thick in some places, \Jmagma\j oozed from volcanoes in the Caribbean, the Pacific Ocean northeast of \JAustralia\j, the Indian Ocean, off the coasts of Madagascar and \JBrazil\j, in South \JAfrica\j and in the Southwestern United States.\p
\BKey names:\b John Tarduno, Donald Brinkman, Paul Renne and Pat Castillo.\p
#
"Do glaciers slow the volcanoes?",902,0,0,0
(Dec '98)
The effects of volcanoes on the climate are well-known, but new evidence indicates that the opposite also occurs, with glaciers now accused of having prompted eruptions after they retreated north across North America.\p
Allen F. Glazner, from the University of North Carolina at Chapel Hill, told the American Geophysical Union's meeting in San Francisco in early December that they had found that volcanoes tended to erupt between glacial cycles, based on evidence from almost the last million years of geologic activity in eastern \JCalifornia\j. When the glaciers were present, he said, the volcanoes were pretty much subdued. Then, when the glaciers retreated, volcanoes became much more active.\p
So far, the control mechanism is unknown, says Glazner. Possibly the extra weight of the glacial ice holds the \Jmagma\j, or molten rock, in place. Then, when the ice melts, there is less weight on the \Jearth\j's crust, and this triggers volcanoes to erupt. Areas like \JCalifornia\j's Mammoth Lake, he suggests, become more plausible as sites of future Californian volcanoes -- and this was no mere random suggestion, as the next article reminds us.\p
#
"California's next volcano?",903,0,0,0
(Dec '98)
Most people associate volcanic activity with sulfur fumes, yet carbon dioxide is often released by volcanoes and volcanic activity. For the past nine years, since an \Jearthquake\j swarm occurred beneath the \Jvolcano\j in 1989, carbon dioxide has been seeping out of the ground in areas of Mammoth Mountain, killing trees and posing a health hazard in this resort area.\p
The same meeting of the American Geophysical Union heard from John Rogie, Penn State graduate student in geosciences, that the gas flow is much more complicated than previous measurements indicated. Previous measurements were taken at yearly intervals, but Rogie measured gas levels in the area for a 24-hour period and found that the flux of carbon dioxide varied by up to a factor of three throughout the \Jday\j.\p
His target area was a 12-hectare (35-acre) tree-kill area near Horseshoe Lake. Rogie took continuous measurements of carbon dioxide flux and other environmental variables from August until the equipment was removed in early November, when snow levels reached more than two feet (60 cm). As the snow accumulated, the carbon dioxide concentration at the snow/ground interface, and in the snowpack itself, built up enormously to the level where it could overwhelm the sensors on the instruments, thus posing a risk to people straying into the area.\p
Rogie found that the average carbon dioxide flux at the instrument was 2000 grams of carbon dioxide per square metre per \Jday\j, but that levels could fall as low as 50 grams or go as high as 5000 grams. The carbon dioxide flux showed a daily, cyclic pattern, rising in the afternoon and falling again by morning. This is a puzzle, as Rogie believes he has ruled out \Jtemperature\j as the cause of the cycle. There is also a second, twelve-hour cycle, which may be linked to barometric pressure oscillations.\p
Rogie is now exploring the possibility that the 24-hour cycle is linked to \Jearth\j tides, caused by the effects of the moon's gravity on the solid portion of the \JEarth\j in the same way that ocean tides are the effects of the moon's gravity on the liquid parts of the \JEarth\j. \JEarth\j tides are very small, but they do affect the width of fractures and faults, which could affect carbon dioxide degassing rates. Soil moisture, wind speed and barometric pressure are also suspects.\p
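Separating a 24-hour and a 12-hour component from a flux record is a routine piece of signal analysis. Here is a minimal Python sketch with synthetic numbers standing in for the real record, which is not reproduced here; the amplitudes are invented for illustration.\p
# A sketch only: least-squares fit of daily and twice-daily cycles.
import numpy as np

hours = np.arange(0, 96, 0.5)            # four days, half-hour sampling
# Synthetic flux in grams of CO2 per square metre per day: a mean of
# 2000 with invented daily and twice-daily components plus noise.
flux = (2000
        + 600 * np.sin(2 * np.pi * hours / 24)
        + 250 * np.sin(2 * np.pi * hours / 12)
        + np.random.normal(0, 100, hours.size))

# Design matrix: a constant plus sine/cosine pairs at both periods.
columns = [np.ones_like(hours)]
for period in (24.0, 12.0):
    columns += [np.sin(2 * np.pi * hours / period),
                np.cos(2 * np.pi * hours / period)]
A = np.column_stack(columns)
coef, *_ = np.linalg.lstsq(A, flux, rcond=None)

for period, (s, c) in zip((24.0, 12.0), (coef[1:3], coef[3:5])):
    print(f"{period:>4} h cycle amplitude: {np.hypot(s, c):.0f} g/m2/day")
# The fitted amplitudes recover the two cycles; applied to real data, the
# same fit would quantify how much variation each cycle explains.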
Mammoth Mountain is considered to be a dormant \Jvolcano\j with its last eruption some 200 years ago, but it may produce as much carbon dioxide as the active \Jvolcano\j Kilauea in Hawaii, with estimates ranging from 400 to 1200 tonnes per \Jday\j. The CO\D2\d levels found in the ground and snow within tree kill areas are much higher than normal levels, up to 99%, making the area very dangerous.\p
But while Mammoth Mountain is seen as dormant, a gravity study also reported at the AGU meeting shows that the nearby Long Valley Caldera is on the rise, as it has been since 1980, and a new study confirms that it is \Jmagma\j, not \Jwater\j, that is forcing it upward. The \Icaldera\i is located on the east side of the Sierra, with Mammoth Mountain perched on its western rim and the town of Mammoth Lakes inside the caldera itself.\p
The study compared gravity measurements from 1983 with 1998 levels, and showed changes which can most easily be explained by assuming an intrusion of \Jmagma\j into the shallow subsurface during the 15 years between measurements. The less worrying hypothesis, that \Jwater\j was being forced into the dome, can now be ruled out.\p
To put the situation in context, Long Valley Caldera is a 17-by-32 \Jkilometre\j hole in the ground, created 750,000 years ago by a volcanic eruption of such huge proportions that it blew 500 cubic kilometres of rock into the air -- shifting 500 times as much material as Mt St Helens, and sending airborne ash all the way to \JNebraska\j. There have been a few other minor eruptions since, the last one 500 years ago, but after that the caldera settled down until 1980, when there were four earthquakes of \Jmagnitude\j 6, followed by small tremors on a daily basis ever since. Since then the dome in the middle of the caldera has risen by about 70 centimetres (a little over 2 feet), and carbon dioxide outgassing on Mammoth Mountain has stepped up.\p
Hot \Jwater\j would be unlikely to cause anything more worrying than the \Jgeyser\j activity seen at \JYellowstone\j, but a shallow \Jmagma\j dome might blow as powerfully as the 1994 Rabaul Caldera in Papua New Guinea, which fitted this description.\p
If large masses move under the ground, the local gravitational field varies very slightly -- not enough to notice, but enough to measure. Because \Jwater\j is less dense than \Jmagma\j, the expected differences can be calculated and dissected out of the gravitational effects. Right now, the data indicate that there is a growing dome of molten rock somewhere under the surface, but don't cancel your holiday plans just yet: an eruption at some stage is now more likely, but that is as far as anybody is willing to go.\p
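The size of the signal can be estimated with the standard infinite-slab approximation; the 100-metre thickness and the densities below are illustrative assumptions, not figures from the study:\p
\[ \Delta g = 2\pi G \rho t \approx 2\pi \times 6.67\times10^{-11} \times 2600 \times 100 \approx 1.1\times10^{-4}\ \mathrm{m\,s^{-2}} \approx 11\ \mathrm{mGal} \]
The same slab filled with \Jwater\j, at about 1000 kilograms per cubic metre, would produce only about 4 milligals, a difference well within the reach of modern gravimeters.\p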
More information can be found on the Web at: \Bhttp://pangea.stanford.edu/~maurizio/maurizio.html\b\p
#
"Earth's innermost core is solid",904,0,0,0
(Dec '98)
The \IEarth's structure\i was in the news again. Two researchers told the same AGU meeting that they had obtained the first direct evidence that -- inside a liquid core -- the very centre of the \JEarth\j is solid. This, they say, came from analysis of seismic waves generated by the June 1996 \Jearthquake\j in \JIndonesia\j and recorded at a large-array seismic network spread across \JFrance\j.\p
We refer our readers to \BLayers in the \JEarth\j's Core,\b October 1998, for evidence that this news appears to be less than novel.\p
#
"Under pressure",905,0,0,0
(Dec '98)
But what would that solid central core be like? Scientists at the Carnegie Institution's Geophysical Laboratory have used X-ray \Jdiffraction\j techniques with diamond anvil cells to determine the elastic properties of iron under ultra-high pressure. They conclude that the inner core propagates sound in the same manner as highly compressed iron near its melting \Jtemperature\j.\p
As we reported in October, seismologists have found that sound waves travelling through the inner core in an east-west direction are slower than those travelling in a north-south direction. This difference, called seismic anisotropy, could arise from a texture in the core, rather like the grain in wood, in which case there is a parallel with the ease with which wood will split in one direction.\p
The researchers found that the elasticity of hexagonal-close-packed iron, which appears to be the form of iron under these extreme pressures, can explain the slow seismic wave velocity at the inner core. They also suggest that the speed of the seismic waves points to an inner core that is close to melting. The authors note that an alternative scenario is possible: additional components with low shear-wave velocities and densities similar to iron could also be present in the inner core.\p
\BKey names:\b Ho-kwang Mao, Russell Hemley\p
#
"Looking for alternatives to methyl bromide",906,0,0,0
(Dec '98)
The fumigant methyl bromide, a chemical implicated in the depletion of the \JEarth\j's \Istratospheric \Jozone layer\j,\i is due to be phased out under the Montreal Protocol, which calls for methyl bromide use to be reduced in increments, beginning with 25% less in 1999, 50% less in 2001, 75% less in 2003 and then eliminated in 2005. Methyl bromide is commonly used to fumigate soil, but it is also used in odd places, such as museums where artefacts need to be protected against attack from small invertebrates.\p
The fumigant solves an array of problems for farmers. Applied in the field before planting, the gas kills most old plant roots, weed seeds, \Inematode\i worms, soil fungi and \Jbacteria\j, all of which can reduce crop yield. It is also used to ensure that produce being shipped to markets worldwide is free of pests. So the race is now on to seek alternatives, and a range of achievements by a small army of University of \JCalifornia\j scientists were posted on the \JInternet\j during December. Here is a brief account of some of the solutions being explored.\p
Methyl iodide is proving useful in crops such as strawberries, vegetables, melons and nursery products -- it is also safer for the farm workers using it. Methyl bromide is used as a gas, which spreads through the \Jatmosphere\j more readily and is more difficult to contain than a liquid -- and methyl iodide is a liquid. The ozone depletion potential of methyl iodide is insignificant compared to that of methyl bromide because ultraviolet light in the \Jatmosphere\j decomposes nearly all methyl iodide before it can reach the \JEarth\j's protective \Jozone layer\j in the \Jstratosphere\j. And best of all, methyl iodide is actually better at controlling the delightfully named fungal disease of melons, \IMonosporascus cannonballus.\i The chemical is not yet registered as a pesticide in the US.\p
Solarizing the soil, which involves heating it by covering it with clear plastic, is just as effective as methyl bromide in killing weed seeds, nematodes and fungal pathogens in warm, inland valleys or desert areas. It is also remarkably cheap. There are great differences in heat sensitivity among these organisms, so research is now under way to create a "thermal death \Jdatabase\j", with the long-term aim of creating an \JInternet\j-accessible \Jdatabase\j telling growers how to kill off specific species of nematodes, fungal pathogens and weeds.\p
Nursery pots can also be solarized: when the pots are filled with planting mix, they are placed on a pallet and covered with a double layer of clear plastic that is held up like a tent by a frame or wire hoops. The potting mix in the pots gets much hotter than in an open field, hot enough so that after just 30 minutes, the treatment is complete. Nursery potting mix needs to be treated to get rid of weed seeds which occur in soil, but composted green waste, shredded pine bark and broken pecan shells have been found to be just as effective at suppressing weeds in nursery container soil as pre-emergent herbicides in a recent two-year study.\p
When farmers change their "permanent" crops, any live roots left in the soil provide support for damaging nematodes until the new plant roots can be attacked, severely stunting the growth and vigour of the new crop. Methyl bromide has been used to kill, among other things, living plant roots left behind when old trees and vines are removed. If this is unavailable, farmers may need to chop off trees or vines above the soil line, paint a \Jherbicide\j on the cut trunk, then wait one year before planting. This loses an entire growing season, but the organisms which thrive on live roots are overtaken during that time with organisms that thrive on dead roots. Those dead-root organisms will leave the new plants alone.\p
Electronic methods are also being explored, using ultraviolet \Jlaser\j technology to zap food-spoiling microorganisms, and electromagnetic \Jenergy\j in the form of radio waves to elevate soil temperatures just enough to kill nematodes, fungi and other pathogenic organisms.\p
Methyl bromide, as was mentioned in the introduction, is used by packing houses in the USA to fumigate harvested fruits and vegetables exported to certain foreign countries and sometimes to other states. \JCalifornia\j strawberries, for example, must be fumigated before being shipped to \JJapan\j, and the same requirements apply to other produce being shipped across borders. Research is under way into the use of carbon dioxide at cold-storage temperatures on pests found in table grapes -- western flower thrips, Pacific \Jspider\j mites, omnivorous leafroller and grape mealybug.\p
#
"Tamoxifen and estrogen",907,0,0,0
(Dec '98)
A report in a late December issue of \ICell\i describes the molecular mechanism by which Tamoxifen blocks the effects of \Ioestrogen/estrogens\i, a process that has been shown to prevent breast cancer in some women at high risk. This may offer valuable clues about ways to design new, more effective disease-preventing medications with fewer side effects.\p
Estrogen delays the buildup of artery-clogging plaque, prevents the bone loss that leads to \Josteoporosis\j, and may even postpone the onset of \IAlzheimer's disease.\i But these benefits all disappear at menopause, when women stop producing estrogen. Hormone replacement therapy can reduce the "hot flushes" of menopause and maintain the other benefits of estrogen, but continued exposure to the hormone increases a woman's risk of breast or uterine cancer.\p
Selective estrogen receptor modulators (SERMs), also called "designer estrogens", are a class of drugs that function like estrogen in some tissues but block estrogen's actions in others. The best-known of these, Tamoxifen, is used for breast cancer prevention, and while it works well against heart disease, \Josteoporosis\j and breast cancer, it retains estrogen's tendency to promote uterine cancer.\p
We have known since the 1950s that estrogen works by binding to a specific receptor, found only in certain types of tissues. We now know that once it is contacted by estrogen, the receptor relays the message to turn on specific genes.\p
The action is not a simple on-off signal. When the \Iligand\i occupies its binding site on the receptor, it triggers a change in the three-dimensional shape of the receptor. This creates a docking site for either co-activators or co-repressors, other proteins which can enhance or suppress the activity of the receptor when they are bound to it. In other words, the whole operation becomes extremely complicated, especially as the shape change varies according to the nature of the compound, whether it's a full estrogen, a full anti-estrogen, or a mixed estrogen/anti-estrogen such as Tamoxifen.\p
Tamoxifen causes one region of the receptor to rotate out of place, not only preventing it from functioning but packing it into a groove, where a co-activator protein is thought to bind.\p
Another SERM, Raloxifene, appears to have similar benefits to Tamoxifen, with less risk of uterine cancer. The two will be compared with each other in a large clinical trial beginning next year, but neither is perfect, so any clue on how they interact with the estrogen receptor would allow researchers to enhance their benefits and reduce their drawbacks. \p
#
"Explaining green tea and cancer",908,0,0,0
(Dec '98)
As we have reported several times (\BGreen tea of benefit,\b September 1997, \BGreen tea kills cancer,\b December 1997), green tea appears to be useful against cancer, but now there is a scientific explanation for the effect. Two American researchers, Dorothy Morre and D. James Morre, have told the 38th annual meeting of the American Society for Cell \JBiology\j in San Francisco that epigallocatechin gallate, or EGCg, a compound in green tea, inhibits an \Jenzyme\j required for cancer cell growth and can kill cultured cancer cells with no ill effect on healthy cells.\p
Green tea leaves are rich in this anti-cancer compound, with concentrations high enough to induce anti-cancer effects in the body, and Dorothy Morre suggests that drinking more than four cups of green tea a \Jday\j could provide enough of the active compound to slow and prevent the growth of cancer cells. Epidemiologists have previously found that people who drink more than four cups a \Jday\j of green tea seem to have a lower overall risk of cancer, but scientists were unsure how the tea produced these effects.\p
All teas come from the same botanical source, but green tea differs from black tea or other teas because of the way the tea leaves are processed after they are picked. To make green tea, the leaves are not oxidised, but instead they are steamed and parched to preserve the natural active substances of the leaf better.\p
The Morres, who both work at Purdue University, have shown how green tea interacts with an \Jenzyme\j on the surface of many types of cancer cells including breast, prostate, colon and neuroblastoma. This \Jenzyme\j, called quinol oxidase, or NOX, helps carry out several functions on the cell surface and is required for growth in both normal and cancerous cells. Normal cells express the NOX \Jenzyme\j only when they are dividing in response to growth hormone signals, whereas cancer cells can express NOX activity at all times. This overactive form of NOX, known as tNOX, meaning \Jtumour\j-associated NOX, has long been assumed to be vital for the growth of cancer cells, because drugs that inhibit tNOX activity also block \Jtumour\j cell growth in cancer.\p
A black tea infusion will inhibit tNOX activity at dilutions of one part tea to 100 parts of \Jwater\j. Green tea is between 10 and 100 times as potent, inhibiting the activity of tNOX at dilutions ranging from one part tea per 1000 to 10,000 parts of \Jwater\j. They found that EGCg was capable of inhibiting the tNOX activity of cancer cells at low doses, the levels that could be derived from drinking several cups of green tea per \Jday\j, but did not inhibit the NOX activity of healthy cells. They also found that in the presence of EGCg, the cancer cells failed to grow or enlarge after division. Then, presumably because they did not reach the minimum size needed to divide, they underwent programmed cell death, or apoptosis.\p
#
"Hamburger with a cherry on top?",909,0,0,0
(Dec '98)
It may sound like a bizarre combination, but cherry hamburgers may yet catch on around the world. Right now, the concoction is on school menus in 16 states as part of the U.S. Department of Agriculture National School Lunch Program, according to Ray Pleva, the northern \JMichigan\j butcher and cherry grower who created the product.\p
Earlier this year, a member of the \JMichigan\j Legislature proposed, unsuccessfully, that the product should be proclaimed as "the official \JMichigan\j state burger". Now, we learn that adding cherries to hamburger meat retards spoilage and reduces the formation of suspected cancer-causing compounds known as HAAs (heterocyclic aromatic amines). The news was released on the web edition of the \IJournal of Agricultural and Food Chemistry\i in November, and in the print version during December. The \JMichigan\j link was maintained, as the researchers behind the study are from \JMichigan\j State University.\p
In the study, MSU researchers found that ground beef patties containing tart cherry tissue had "significantly" fewer HAAs when pan fried, compared to patties without cherry tissue added. Measurements showed that the fat content of cherry patties was lower than that of regular patties, but the moisture content was greater, thereby confirming earlier findings.\p
Combining cherry tissue with ground beef has previously been shown to produce a patty which is lower in fat, yet juicier and more tender than pure beef burgers. HAAs are formed naturally during cooking. Many of them have been determined to cause cancer in some animals and are suspected to be carcinogenic in humans. Lipid \Joxidation\j, a primary cause of off-flavour, also causes discolouring and texture change in meat during storage.\p
#
"Diet and cancer",910,0,0,0
(Dec '98)
A British study in the \IBritish Medical Journal\i in December points out that around half of UK cancer deaths are due to tumours of the lung, bowel, breast and prostate. At the same time, these cancers are virtually absent in many countries in the developing world, but increase in incidence within one or two generations when migrants move from low to high risk areas.\p
It follows that we should attribute many cancers common in Western populations to environmental factors, meaning that they should be largely preventable. In fact, diet may account for up to 80% of all cancers of the bowel, breast and prostate. The authors of the study highlight red and processed meats and alcohol as the highest risk foods and cite a diet rich in vegetables and fruit as the most protective. This is not to suggest that all cancers are diet-related: other environmental factors known to affect susceptibility to cancer include: physical activity, reproductive and sexual behaviour, infection with hepatitis B or C viruses, infection with \IHelicobacter pylori\i and exposure to sunlight, ionising radiation and chemicals.\p
They conclude with a short list of suggestions for avoiding cancer risks: don't smoke; take regular exercise; don't be sexually promiscuous; avoid prolonged exposure to direct sunlight and avoid contracting \Ihepatitis\i B and C viruses.\p
#
"Wine as a weapon against stroke",911,0,0,0
(Dec '98)
Other studies have suggested that moderate alcohol consumption may reduce one's risk of having a \Istroke.\i A Danish study, reported in \IStroke: Journal of the American Heart Association,\i and published in early December, concludes that wine, but not beer or spirits, may offer a protective effect.\p
The study followed 13,329 people, 6067 men and 7262 women between the ages of 45 and 84, over a 16-year span, during which 833 people had strokes. Those who said they drank wine on a weekly basis, about one to six glasses per week, had a 34% lower risk of stroke than those who never or hardly ever drank wine. Those drinking wine on a daily basis had a 32% reduction in risk, while people who drank beer or spirits did not have any statistically significant reduction in stroke risk.\p
The researchers think wine may offer protection because it contains flavonoids and \Jtannins\j, \Jnutrients\j which have been shown to help inhibit the development of \Jatherosclerosis\j, the plaque obstructions that lead to heart attacks and strokes. On the other hand, the wine effect may simply reflect different drinking patterns in wine drinkers, as wine may be consumed with meals to a greater extent than beer and spirits; the latter two may be consumed irregularly throughout the \Jday\j. The researchers think these differences in 'timing' may be important.\p
While men drank beer and spirits more often than women, there was no difference between the sexes in regard to amount or frequency of wine intake. Study participants were asked whether they drank beer, wine or spirits and how frequently they drank - "never/hardly ever," "monthly," "weekly" or "daily." People who drank wine had a statistically significant decreased risk of stroke in all four frequency groups compared with those who never or hardly ever drank wine. In contrast, no similar effect from drinking either beer or spirits was found in any of the frequency groups: those who drank beer weekly had a 9% higher risk of stroke and those who drank spirits weekly had a 3% lower risk of stroke, but these results were not statistically significant.\p
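For readers who want to see what lies behind phrases like "34% lower risk" and "not statistically significant", here is a minimal Python sketch. The counts are hypothetical, chosen only to give a risk ratio close to the reported one; the paper's real tables are not reproduced here.\p
# A sketch only: relative risk with an approximate 95% confidence interval.
import math

def relative_risk(cases_a, n_a, cases_b, n_b):
    """Risk in group A divided by risk in group B, with a 95% CI
    from the usual log-normal approximation."""
    rr = (cases_a / n_a) / (cases_b / n_b)
    se = math.sqrt(1/cases_a - 1/n_a + 1/cases_b - 1/n_b)
    return rr, (rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se))

# Hypothetical: 120 strokes among 3000 weekly wine drinkers versus
# 300 strokes among 5000 people who never or hardly ever drank wine.
rr, (low, high) = relative_risk(120, 3000, 300, 5000)
print(f"RR = {rr:.2f}, 95% CI = {low:.2f} to {high:.2f}")
# RR = 0.67, 95% CI = 0.54 to 0.82: a 33% lower risk, "statistically
# significant" because the interval does not include 1.0.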
#
"How leprosy attacks us",912,0,0,0
(Dec '98)
A report in \IScience\i in mid-December reveals the way \Jleprosy\j (\IHansen's disease\i) targets the peripheral nerves, the crucial step leading to nerve damage in this disease. About 800,000 people worldwide suffer from \Jleprosy\j, which is caused by the bacterium \IMycobacterium leprae,\i but the finding is even more important than that. The new insight may help prevent nerve damage in \Jleprosy\j, but it will also help us understand how nerve damage starts in other neurodegenerative diseases such as muscular dystrophy, multiple sclerosis and various types of peripheral nerve disease. And just for good luck, the researchers showed that two haemorrhagic fever viruses, one of them Lassa fever virus (LFV), use the same route to attack cells.\p
The \Iperipheral \Jnervous system\j\i includes all the nerves that fan out from the central \Jnervous system\j to the skin, muscles and internal organs. As the cells develop, Schwann cells surround the nerve fibres and wrap around the axon to form the \Imyelin\i sheath. The \Jmyelin\j acts as an insulator, allowing the nerves to function with much greater reliability. If the \Jmyelin\j is damaged, the nerve fibres are no longer insulated and nerve impulses cannot be conducted efficiently.\p
In the peripheral \Jnervous system\j, this Schwann cell-axon unit is surrounded by a protective layer called the basal lamina, which is secreted by the Schwann cells. The basal lamina is made of molecules called laminin-2, which hooks onto the Schwann cell by a "hook" called a laminin receptor. Another molecule, called dystroglycan (DG), is also involved. It belongs to a common family of molecules called glycoproteins, which are often embedded in cell membranes.\p
Together, these molecules manage communication between the Schwann cell and its environment. This is where the \Jleprosy\j bacterium attacks, using a cell binding area of laminin-2, called the G domain, to attack the Schwann cells. Using the bacterium, the researchers also identified, for the first time, the site on the laminin-2 molecule that interacts with alpha-DG. This is important, because the main cause of some types of muscular dystrophy involves the disruption of the laminin-2-DG \Jlinkage\j.\p
\JLeprosy\j can be treated with multidrug therapy that kills most of the \IM. leprae\i in a few weeks. The nerve function loss caused by \IM. leprae\i invasion of Schwann cells is irreversible, and this leads to complications if patients lose all sensation in the hands and feet, so that accidental burns or injuries are not noticed. This can lead to serious scarring or even to the loss of fingers or toes and facial disfiguration, so knowledge which may help prevent this sort of damage would be extremely useful.\p
In a second report, researchers describe studies using mutated mouse cells that lacked DG. These were able to resist infection with an arenavirus called lymphocytic choriomeningitis virus (LCMV), until the researchers reintroduced the gene for DG into the DG-deficient cells, when they once again became vulnerable to viral infection. It appears that both LFV and LCMV bind to DG on the surface of mouse cells.\p
It may seem odd for dystroglycan to be involved in both bacterial and viral infection, but it is a major glycoprotein sticking out of the cell. As such, it ought to be a prime target for microbes, which typically use such a path to attack our cells.\p
\BKey names:\b Anura Rambukkana, Kevin Campbell\p
#
"Lifeguard lung",913,0,0,0
(Dec '98)
Lifeguards at indoor swimming pools with \Jwater\j spouts and sprays, waterfalls and \Jwater\j slides may contract a lung disease after breathing \Jbacteria\j suspended in \Jwater\j droplets small enough to be inhaled into the lungs. \p
In one study reported in the \IAmerican Journal of Public Health,\i 65% of a group of lifeguards at an indoor pool at a large municipal recreation centre complained of symptoms such as frequent cough; recurrent wheezing or chest tightness; laboured, difficult breathing; and/or fever that occurred during and after work hours.\p
Lifeguards spend long periods of time in the pool area, breathing potentially contaminated aerosols. Pool users may also be at risk in some cases, but so far only people working in the pool area have suffered from the illness. Properly named granulomatous pneumonitis, this disease results in inflamed nodules in the lungs. The disease is caused when the immune system in the lungs reacts against an inhalant, because tiny \Jwater\j droplets can get deep into the lungs. If the disease is not treated properly, permanent lung scarring can occur.\p
\JChlorine\j may kill the \Jbacteria\j themselves, but their remains are still capable of causing an immune reaction in the lung. From monitoring, it appears that bacterial byproducts measured in the air at the indoor pool were 27 to 162 times higher than in two control pools which lacked \Jwater\j sprays, and 25 times higher than outside air. Concentrations of airborne bacterial byproducts were higher near the lifeguard's "crow's nest", about 8 feet (2.4 metres) above the surface of the pool, than at pool level itself.\p
#
"Penicillium infections in HIV patients",914,0,0,0
(Dec '98)
We tend to think of the \Jfungus\j \IPenicillium\i as a source of wonder drugs, but not all species of \IPenicillium\i are equally human-friendly. A large number of HIV patients in \JThailand\j have become ill from a previously very rare \Jfungus\j called \IPenicillium marneffei.\i This is a potentially fatal infection which can be treated with itraconazole, according to a December report in the \INew England Journal of Medicine.\i\p
In northern \JThailand\j, \IP. marneffei\i is the third most common opportunistic infection in HIV patients. The infection causes weight loss, fever, anaemia, and skin lesions. It has a 20% fatality rate even with appropriate treatment but is 100% fatal if not diagnosed and treated.\p
#
"The genome of a nematode",915,0,0,0
(Dec '98)
We reported in August 1997 (\BWorms in the news\b) that the \Jgenome\j of the \Inematode,\i \ICaenorhabditis elegans,\i was "due to be completed soon". In fact, sequencing the animal's 97 million-base \Jgenome\j took a little longer than that, but the final result was published in the December 11 issue of the journal \IScience.\i\p
While several of them will fit on the head of a pin, the 1 \Jmillimetre\j-long roundworm has, like humans, a \Jnervous system\j. It also digests food, and has sex: no other advanced animal has ever been completely sequenced like this. Some nematodes are parasites, but \IC. elegans\i lives happily in rotting vegetation, or in the laboratory, where it will live in Petri dishes on a steady diet of the bacterium \IE. coli\i throughout its life cycle of two to three weeks.\p
The \Jnematode\j came into prominence in the 1960s, when molecular biologist \ISydney Brenner\i of the Medical Research Council Laboratory of Molecular \JBiology\j in Cambridge decided to use the tiny worm for developmental studies. A fully-grown animal has just 959 cells, including a \Jnervous system\j made up of about 300 neurons, and the cells are all transparent, so each one can be observed directly.\p
As Brenner's team studied the worm, they began to build up a map of the whole \Jgenome\j, based on inferences they were able to draw about various \Jgenome\j fragments. That work led to this year's completion of the entire \Jgenome\j sequence.\p
At one stage, the \Jnematode\j sequence was to be completed as a "dry run" for the human \Jgenome\j project, which was inspired by the successes researchers found in the earlier stages of the \IC. elegans\i project. The skills developed there have certainly been transferred to the much larger human \Jgenome\j project, and that project's managers announced recently that they expect to complete the 3 billion-base pair human \Jgenome\j sequence two years ahead of time, partly because the worm sequencers had established such successful methods for complex genomes.\p
Like humans, the worms reproduce sexually, though unlike humans, they are able to fertilise themselves. A digestive tube runs the full length of the worm's body, and the tiny \Jnervous system\j can detect odour and taste, as well as respond to \Jtemperature\j and touch. The worms develop from an embryo, form complex tissues and organs, eat, digest their food, grow old and die. In more than thirty years of study, researchers have mapped every connection in the worm's \Jnervous system\j, and the lineage of each cell in the adult animal's body has been tracked from the moment of fertilisation.\p
The worm's genetic material is located on six chromosomes. The \Jgenome\j carries 19,099 protein-coding genes -- about one every 5,000 DNA bases -- and there are also around 800 genes that have other functions. This is several times the number of genes that classical genetic studies would suggest, indicating that even now, there are things that the geneticists do not fully understand. About 40% of the 19,099 genes match those of other organisms, including humans, with the remainder sitting in the "don't know" basket for the moment. The chromosomes also contain large amounts of repeated DNA which does not encode proteins. Geneticists assume that it probably plays some role in \Jchromosome\j function, or in organising genes, or in regulating their activity.\p
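The quoted gene density is just the \Jgenome\j size divided by the gene count:\p
\[ \frac{97{,}000{,}000\ \text{bases}}{19{,}099\ \text{genes}} \approx 5{,}079\ \text{bases per gene} \]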
\BKey names:\b Robert Waterston, John Sulston\p
#
"Eight calf clones",916,0,0,0
(Dec '98)
Mid-December saw a report in \IScience\i that \Jcloning\j veteran Yukio Tsunoda of Kinki University in Nara, \JJapan\j, has succeeded in efficiently \Jcloning\j a large \Jmammal\j. Tsunoda used the "Dolly method" to clone eight calves from one adult cow with a success rate of 80%, which is extraordinarily high.\p
When Ian Wilmut cloned Dolly (see \BSheep \JCloning\j a Success,\b February 1997), he started with 400 eggs to get 29 embryos and one cloned animal. The Japanese group used 125 cumulus and oviduct cells as their starting points to get eight calves, four of which died at or soon after birth. Oviduct cells are part of the lining of the female reproductive tracts, while cumulus cells usually attach to the embryo in the womb and help feed it. In each case, the cells were starved into a resting state, and then fused with cow egg cells from which the nuclei had been removed.\p
The cells need to be in a resting state (quiescence) so that as many genes as possible are switched off, returning the nucleus as far as possible to the state found in the nucleus of a fertilised egg. The fusion is achieved by an electric shock.\p
Japanese researchers have an excellent tradition of \Jcloning\j research, although continued failures had caused cutbacks. After "Dolly", work started up again, and in recent months, five groups have produced a total of 19 calves cloned from adults. There is one major difference between Japanese research and that in the rest of the world: with the best beef selling in \JJapan\j for US$100 a pound (about $220 a \Jkilogram\j), the emphasis in \JJapan\j is on improving beef quality, while the rest of the world seeks to produce transgenic cows which secrete various pharmaceuticals into their milk.\p
Tsunoda is reported to be working on a second batch of cloned calves produced using cells from twenty different tissues, including the liver, kidney, and heart cells.\p
#
"Another CP violation?",917,0,0,0
(Dec '98)
Why does the universe contain more matter than antimatter? According to standard theory, the two types of particles should exist in equal amounts. One possible answer is to be found in \ICP violation,\i first seen in 1964 in the decay of particles called kaons. But since that time, the effect has not turned up in any other particle, and physicists have been driven to speculate whether CP violation is a general principle of nature or somehow restricted to a single system.\p
News broke in December, however, that scientists at the giant Fermi National Accelerator Laboratory (Fermilab) particle accelerator have observed something which may represent a second case of CP violation. The work will be reported "soon" in \IPhysical Review Letters,\i and it involves a slight difference in the rate at which B mesons and anti-B mesons decay. So far, researchers are being careful: the difference only involved a small sample, so that the result has, at best, weak statistical confidence. All they will claim is anomalous behaviour, but this will obviously be an area to watch.\p
#
"Research spending",918,0,0,0
(Dec '98)
Mid-December saw an assortment of announcements from around the world on research funding by various governments. China has launched a US$300 million program to run through to the end of 2003, covering fifteen areas, and Premier Zhu Rongji hailed the program (equivalent to about 5 cents a year per head of population) as a "top concern". Emphasis is to be on rapidly growing fields, such as information sciences and biotechnology, that officials expect to contribute significantly to the country's economic prosperity, and awards will be made under the title Program 973, named for the year and month (March 1997) it was proposed by the National Committee of the Chinese People's Political Consultative Conference.\p
A couple of days earlier, more than 800 members of the national governing committee of CNRS, \JFrance\j's giant basic research agency, were crammed into the ornate House of Chemistry to argue against research minister Claude Allègre's proposals to reshape the French research establishment. This is only the fourth time the CNRS's full science committee has convened since it was created in 1945, and the first time it has met at the request of the researchers. If future ministers have any sense, it may also be the last such meeting.\p
Allègre's plan seeks to create closer ties between the CNRS, universities, and industry, but the researchers disagreed heatedly with this notion. Research conditions, they say, have been poor for many years, and budgets have been stagnant. The real damage for Allègre came from what many see as a heavy-handed attempt to put the CNRS under the control of the universities, which have a poor reputation among research scientists. The meeting has had one positive effect, however, in uniting the scientists as never before, though whether they will move together in the direction proposed by Allègre remains to be seen.\p
Russian science has gone from a nosedive to a plummet over the past few months. The 1999 budget, submitted to the lower house of Parliament, or Duma, by new Science Minister Mikhail Kirpichnikov in mid-December, represents a 70% cut in hard currency terms: federal science spending has dropped from around $10 billion in 1990 to $1.83 billion in 1998, and to just 11 billion roubles, or $520 million, in 1999. The amount spent per researcher had already dropped from $9000 in 1997 to $5000 in 1998, less than 4% of expenditures typical in the West.\p
#
"The great census debate",919,0,0,0
(Dec '98)
An argument which we outlined (\BUnited States 1997 \JCensus\j,\b June, 1997) some time back has now reached the US Supreme Court. Earlier this year, two U.S. District courts ruled unanimously that sampling violated the \JCensus\j Act, and the US \JCensus\j Bureau has appealed both cases to the Supreme Court.\p
The bureau plans to use sampling to estimate some 10% of the nation's population, which it says will be more accurate and save $675 million. House Republicans and others on the conservative side of the fence have called for a traditional person-by-person count, arguing that the sampling procedure is subjective and would be open to partisan tampering.\p
Democrats suggest that the Republicans are more interested in avoiding a system which would count disadvantaged people, who tend to support the Democratic Party, more reliably. If these people were reliably counted, redrawn districts would be expected to return more Democrats at the next House of Representatives election in 2000, when present indications are that a voter backlash over the Clinton \Jimpeachment\j moves may well see the Republicans crushed in any case.\p
The Democrats' self-interest argument is borne out by comments such as that of the conservatives' attorney, Michael Carvin, who told the court "We do not contest that it's more accurate, but sampling is illegal and unconstitutional." On the other side, Solicitor General Seth Waxman argued for the \JCensus\j Bureau that the case should be thrown out because neither the House nor the other plaintiffs had standing to bring the case, as they have not been legitimately harmed by sampling.\p
Waxman also pointed out that the \JCensus\j Act reads, in part, that the Secretary of Commerce shall conduct the \Jcensus\j \I"in such form and content as he may determine including the use of sampling procedures and special surveys."\i\p
In the end, even if the Supreme Court were to find that sampling, using statistical methods to add people missed in a head count, is neither illegal nor unconstitutional, the debate will go back to the House of Representatives, where legislators who oppose sampling may try to withhold funding for the effort. Democratic supporters of sampling say that if they lose in court, they would move for a two-number \Jcensus\j, as the case before the courts applies only to the apportionment of House seats among the states, and the sample estimates might still be used for creating congressional districts within states and for the allocation of federal funds.\p
One thing is certain: either way, US news in early 1999 will carry a great deal of \Jrhetoric\j about the reliability of statistics, close to what mathematicians mean when they speak of "statistical abuse".\p
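For the record, the statistical machinery at the heart of the Bureau's plan is dual-system (capture-recapture) estimation: count an area once, survey it again independently, and use the overlap between the two lists to estimate how many people both efforts missed. A minimal Python sketch, with invented figures (the Bureau's real procedure involves far more stratification and adjustment):\p

def dual_system_estimate(counted: int, resurveyed: int, in_both: int) -> float:
    """Classic Lincoln-Petersen capture-recapture estimate of the true
    population size from two independent counts and their overlap."""
    return counted * resurveyed / in_both

# Invented example: the census finds 90,000 people in an area, an
# independent follow-up survey finds 50,000, and 45,000 people appear
# on both lists.
print(f"{dual_system_estimate(90_000, 50_000, 45_000):,.0f}")  # 100,000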
#
"The farthest quasar",920,0,0,0
(Dec '98)
The Sloan Digital Sky Survey is still getting under way, but the new sky-mapping technology being used in the survey has already located three of the four most distant quasars known. Two astronomers, Fan and Strauss, operated the 3.5-metre \Jtelescope\j at Apache Point Observatory in New Mexico over the \JInternet\j to follow up on \Jquasar\j candidates from the Sky Survey data, and they reported their results at a conference in early December.\p
As their time on the \Jtelescope\j ran out, they turned to one of the last promising high-\Jredshift\j \Jquasar\j candidates. As soon as they saw the spectrum, they say, they knew they had a record-breaking \Iquasar.\i The finds have redshifts of 5.0, 4.9, and 4.75, with two of the three eclipsing the former record holder, discovered in 1991 at a \Jredshift\j of 4.89.\p
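For those who want the numbers behind the excitement: redshift z stretches every emitted wavelength by a factor of (1 + z), so the Lyman-alpha line of hydrogen (rest wavelength 121.6 nanometres, the usual marker in quasar spectra) lands deep in the red for these objects. A quick Python check:\p

LYMAN_ALPHA_NM = 121.6  # rest wavelength of hydrogen's Lyman-alpha line

def observed_wavelength(rest_nm: float, z: float) -> float:
    """Wavelength at which a line of rest wavelength rest_nm is seen
    from a source at redshift z."""
    return rest_nm * (1.0 + z)

for z in (4.75, 4.9, 5.0):
    print(f"z = {z}: Lyman-alpha seen at "
          f"{observed_wavelength(LYMAN_ALPHA_NM, z):.0f} nm")
# At z = 5.0 the line sits near 730 nm, at the red edge of visible light.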
Extrapolating from the find, astronomers say that with just 1% of the sky surveyed, along the celestial equator, they should find some 500 quasars with \Jredshift\j greater than 4.75. The Sloan Survey scans millions of objects, and uses special software to select particularly interesting ones for a follow-up look. Strauss and Fan were given 19 \Jquasar\j candidates by the software, and follow-up spectra confirmed 12 of them as actual quasars, a success rate of about 63%. That far exceeds the 10% success rate typical of \Jquasar\j hunts.\p
\BKey names:\b Michael Strauss, Xiaohui Fan, Heidi Newberg and Brian Yanny\p
#
"A new pulsar",921,0,0,0
(Dec '98)
Colleen Wilson-Hodge, an astrophysicist at NASA's Marshall Space Flight Center, has just scored two for the price of one: rather than discovering one new star, she has found a \Ipulsar\i orbiting a massive star. This is not her first claim to fame: see \BScientist Finds 2-in-1 Burster\b (25 March, 1998), where she is identified as Colleen Wilson.\p
The massive companion is a superhot blue-white star (type \IB[e]\i) about 8 to 15 times larger than our \Jsun\j, with a distinctive signature in the form of emission lines caused by glowing \Jhydrogen\j and oxygen blown off the star. The pulsar is known as XTE J1946+274, and also as GRO J1944+26; the pair appears to lie about 13,000 light years away.\p
The object may have been detected by Britain's Ariel 5 \Jsatellite\j in 1976, as a recorded object, 3A 1942+274, lies within the RXTE and BATSE error boxes. (The 1942 is the right \Jascension\j, 19 hours 42 minutes, and the +274 refers to the \Jdeclination\j, the angle above or below the celestial equator: +27.4 degrees.)\p
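The designation can be unpacked mechanically, as this small Python sketch shows (assuming the usual HHMM+DDd convention for such catalogue names):\p

def parse_designation(name: str) -> tuple:
    """Split a catalogue name like '1942+274' into right ascension and
    declination, both in decimal degrees."""
    sign = max(name.find("+"), name.find("-"))
    ra_part, dec_part = name[:sign], name[sign:]
    hours, minutes = int(ra_part[:2]), int(ra_part[2:])
    ra_deg = (hours + minutes / 60.0) * 15.0  # 24 hours = 360 degrees
    dec_deg = int(dec_part) / 10.0            # +274 means +27.4 degrees
    return ra_deg, dec_deg

print(parse_designation("1942+274"))  # (295.5, 27.4)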
Wilson-Hodge found a 15.8-second period in the data, clearly making the object a pulsar. It has a burst pattern which makes it the 33rd member of a small but growing clan of transient, \Iaccretion\i-powered pulsars. (See \BEinstein's Frame Dragging,\b November 1997.) It was a recent giant outburst which made the pulsar apparent, when it reached a peak of 35 milliCrabs (3.5% of the brightness of the Crab Nebula). The next step is to wait and observe some "normal" bursts, to see if researchers can determine the orbital period, and some of the pulsar's other secrets.\p
#
"NEAR finds problems, solutions",922,0,0,0
(Dec '98)
In mid-December, things were looking good for the NEAR \Jspacecraft\j (see \BMathilde - not your average asteroid?\b, July 1997, and \BNEAR but far,\b June 1998) to make interplanetary history, with a scheduled arrival at asteroid 433 Eros and the first close-up and comprehensive study of an asteroid set for January 10, 1999. Then on December 20, the initial rendezvous burn of NEAR was aborted. All is not lost, however, as the craft remains close to Eros, and will be able to \Jorbit\j it later.\p
The rescheduling of the NEAR mission was made necessary by the abort of a planned 20-minute engine burn on December 20, 1998. The \Jspacecraft\j aborted the firing after the gentler settling burn was complete, and just seconds after initiation of the main (bipropellant) burn, causing communications with the \Jspacecraft\j to be lost for about 27 hours. Contact was reestablished early on December 22 when NASA's Deep Space Network locked onto a radio signal from the NEAR \Jspacecraft\j.\p
NEAR then began downloading stored data, which revealed "that the brief engine burn exceeded certain safety limits associated with the onboard system that autonomously controls the \Jspacecraft\j". In other words, the \Jspacecraft\j decided that something was wrong when the main engines fired: the fault protection software identified a problem and switched NEAR to a safe mode. The limits have now been altered in the program, and a burn on January 3 was expected to get the craft back on target, after a December 23 flyby gave the mission managers at The Johns Hopkins University Applied Physics Laboratory (APL) extra information.\p
Within days of the abort the NEAR team developed a complicated command sequence for the December 23 flyby of Eros, to obtain multi-colour images, near-infrared spectra, and magnetic field measurements. The commands were uploaded to the \Jspacecraft\j in record time and executed as planned, producing images of the asteroid and other valuable data.\p
Mission designers now expect the rendezvous with Eros will take place by May 2000. The planned burn will last 24 minutes and will increase the \Jspacecraft\j's speed by 939 metres per second (2,100 mph), putting it close to the same speed as Eros. The burn will be divided into an initial 3-minute, small hydrazine settling burn that will change the velocity by only 5 metres per second (11 mph) and a 21-minute, bipropellant main engine burn that will provide the rest of the velocity change.\p
NEAR will then travel in an \Jorbit\j around the \Jsun\j that nearly matches that of Eros. For the next year NEAR will travel behind Eros in a slightly closer \Jorbit\j to the \Jsun\j. By mid-February 2000, NEAR will catch up to Eros. The \Jspacecraft\j will then enter \Jorbit\j around Eros and begin its year-long study of the asteroid.\p
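The burn figures quoted above convert directly between units; this short Python check reproduces the mph values and the settling/main split.\p

MPS_TO_MPH = 3600 / 1609.344  # metres per second to miles per hour

total_dv, settling_dv = 939.0, 5.0   # metres per second
main_dv = total_dv - settling_dv     # left to the main engine

for label, dv in [("total", total_dv), ("settling burn", settling_dv),
                  ("main engine burn", main_dv)]:
    print(f"{label}: {dv:.0f} m/s = {dv * MPS_TO_MPH:,.0f} mph")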
Thomas B. Coughlin, NEAR project manager at APL, reported on the \JInternet\j: "We're very confident that we've found the problems associated with the December 20 abort. The abort lost us time but the flyby gave us valuable information about Eros' shape and mass that we wouldn't have had -- information that will help us during our orbital phase a little more than a year from now."\p
Web information, with regular updates, is available at \Bhttp://near.jhuapl.edu.\b\p
#
"Radiation belts bad for satellites",923,0,0,0
(Dec '98)
The \IVan Allen radiation belts,\i once thought to be quiet doughnut-shaped (or bagel-shaped) regions of electrons and protons centred thousands of miles above \JEarth\j's surface, now appear to be much more energetic, according to what Daniel Baker told the American Geophysical Union in San Francisco in early December. Baker said new findings indicate that electrons in the Van Allen radiation belts circling \JEarth\j are energised to speeds much higher than researchers had thought.\p
He suggests they should be seen as powerful, energetic \Jparticle accelerators\j, and a source of "killer electrons", like those which probably played an important role in the failure of the Galaxy 4 \Jspacecraft\j in May 1998. The event led to a temporary loss of pager service to 45 million customers in America.\p
This is a concern, as the next solar maximum period, when the \Jsun\j is most active, is expected in late 2000 or early 2001, and will see the belts vary far more wildly. A study is now under way which may help operators protect satellites by powering them down, or by switching to back-up systems, during such storms.\p
#
"Robosurgeon does heart bypass",924,0,0,0
(Dec '98)
December saw a world first -- a robotically-assisted heart bypass surgery on a 70-year-old female patient, using the ZEUS \UTM\u Robotic Surgical System. This allowed the surgeon to perform delicate and critical suturing on the heart through tiny ports, a technique not possible with conventional open-heart surgery.\p
In normal open-heart surgery, a 12 to 15 inch (30 to 38 cm) incision is required to split a patient's breast bone and give the surgeon direct access to the heart. In the near future, says the surgeon involved, Ralph J. Damiano Jr., they should be able to perform completely closed-chest heart bypass surgery, making cardiac surgery even less invasive. The "ports" are pencil-sized openings: a small endoscope camera is inserted into the chest through one port and is held by a voice-controlled robotic arm, while the other two robotic arms manipulate surgical instruments under the surgeon's direct control.\p
The main advantages to the patient are reduced pain and trauma, shorter recovery times and convalescent periods, and an overall improved outcome, says Damiano, who will be carrying out more operations in early 1999. Movements of the surgical instruments are controlled through handles which resemble conventional surgical instruments. The movements of the instrument handles are scaled, and any hand tremor is filtered.\p
#
"Heart tremors, earth tremors, where's the difference?",925,0,0,0
(Dec '98)
Moving on to tremor of another sort, a report in the \IAnnals of Biomedical \JEngineering\j\i in December describes a method of applying the same \Jmathematics\j used for measuring the \Jearth\j's seismic activity to finding early signs of heart trouble. The study was conducted in pigs, whose hearts are structurally similar to human hearts.\p
The technique involves mapping the components of electrical waves recorded at the heart's surface with each heartbeat. The researchers take a wave front, a two-dimensional structure, and break that complex wave into its individual components, using a one-centimetre-square grid of 144 electrodes to map the simultaneous electrical activity occurring beneath each electrode.\p
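The report does not spell out the mathematics, so the following Python sketch is only a generic illustration of the idea: treat a snapshot of potentials from the 144-electrode grid (presumably 12 x 12) as a sampled two-dimensional field, and separate it into plane-wave components with a spatial Fourier transform, the same family of techniques seismologists apply to arrays of ground-motion sensors. The data here are synthesised purely for the demonstration.\p

import numpy as np

# Synthetic snapshot of potentials on a 12 x 12 electrode grid
# (144 channels over one square centimetre): two plane waves plus noise.
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.arange(12), np.arange(12))
snapshot = (np.sin(2 * np.pi * 2 * x / 12)          # wave along x
            + 0.5 * np.sin(2 * np.pi * 3 * y / 12)  # weaker wave along y
            + 0.1 * rng.standard_normal((12, 12)))  # measurement noise

# Each peak in the spatial Fourier spectrum is one plane-wave component.
spectrum = np.abs(np.fft.fft2(snapshot))
strongest = np.argsort(spectrum.ravel())[::-1][:4]
print("Dominant (ky, kx) components:",
      [tuple(map(int, np.unravel_index(i, spectrum.shape))) for i in strongest])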
\BKey names:\b Timothy A. Johnson, Wayne Cascio.\p
#
"No danger in implants",926,0,0,0
(Dec '98)
The debate about breast implants continues (see \BBreast Implants Dangerous?,\b May 1997, \BImplants Safe in Britain,\b July 1998). Now a court-appointed scientific panel in the USA has concluded that, based on the available evidence, silicone breast implants do not appear to cause immune diseases such as \Isystemic lupus erythematosus.\i The report was released at the end of November, 1998.\p
Judge Sam J. \JPointer\j of the U.S. District Court in \JAlabama\j appointed the panel two years earlier to review the scientific evidence in lawsuits by women claiming their implants caused debilitating symptoms ranging from fatigue to sore joints. \JPointer\j asked the panel members (a toxicologist, an immunologist, an epidemiologist, and a rheumatologist) to consider whether the plaintiffs' expert testimony "provide[s] a reliable and reasonable scientific basis" for concluding that silicone breast implants "cause or exacerbate" systemic diseases, such as lupus or connective tissue disease, that might account for the reported symptoms.\p
The panel considered 40 studies in their fields and heard from scientific witnesses, before concluding that they could find no links between implants and disease. Dow Corning, which is in \Jbankruptcy\j, proposed a US$3.2 billion settlement of its lawsuits recently, and while plaintiffs can still opt to go to trial, this finding is likely to have some influence on thousands of pending cases.\p
#
"Pedagogical agents",927,0,0,0
(Dec '98)
Most advances in educational technology fail for one of two reasons. Some of the advances are announced and hyped long before they become reality, while many of the others are developed and tested in the most suitable of all environments, then announced and hyped in much the same way.\p
Typically, new developments which are actually created are then tested on arithmetic or other simple areas of \Jmathematics\j, or in areas such as \Jengineering\j or physical chemistry, where learning is essentially a chain of linear steps. In short, the new developments of the past have all been quite good for linear training where learning proceeds along a single dimension, but next to worthless when they are used for any sort of education in more than one dimension.\p
Two pedagogical agents, or softbots, announced on the \JInternet\j during December may have managed to break through the dimensional barrier. Steve (Soar Training Expert for Virtual Environments) hovers nearby as you move electronically through a maze of controls in the engine room of a virtual U.S. Navy surface ship. He provides explanations, answers questions, gives demonstrations, and offers helpful hints when you are stumped. He never makes a mistake and never tires; yet he has infinite patience with human fallibility and fatigue -- or so claim Steve's developers at the Information Sciences Institute (ISI) at the University of Southern \JCalifornia\j's School of \JEngineering\j.\p
Steve appears when you put on a head-mounted display, a helmet containing tiny computer screens in front of each eye, and a pair of data gloves with built-in position and touch sensors. You are now equipped to enter the stereoscopic, three-dimensional environment of the virtual engine room he inhabits. \p
Steve is not merely programmed to follow a fixed script: he can make decisions as a task is being performed. You can also watch from multiple angles as Steve demonstrates a task and explains it to you verbally. Steve can also work with several students, representing the ship's crew, at once. He understands how the crew members' roles interact. Moreover, he can coordinate his instruction with other virtual agents who can assist individual "crew members" or play the role of a missing seaman.\p
Steve has one limitation: he runs on a powerful Silicon Graphics workstation. The other new softbot, Adele (Agent for Distance Education Learning Environments), can run on an ordinary PC. Adele is a two-dimensional, animated persona implemented as a Java-based applet for medical students taking on-line courses. (An applet is a software program that is downloaded automatically off the World Wide Web and run on the student's computer. Because it is written in Java, an applet is much safer to download and use, and it is platform-independent, running on any machine with suitable software.)\p
Adele wears a white medical coat, carries a clipboard, and has a stethoscope draped around her neck as she monitors students while they examine a virtual patient on their computers. Adele can be programmed to teach a variety of subjects. While her first assignment has been to instruct medical students and physicians, she is already being adapted for dentistry coursework and developers are investigating the possibilities for an electrical \Jengineering\j course.\p
But will softbots replace teachers? Not in the immediate future, say the development team. Teachers are most effective when they work one-on-one with students, they say. Even so, human teachers cannot work one-on-one with everyone in the class at once. Software agents can, and they can be available all the time, freeing up teachers to handle the more important aspects while the agents carry out routine instruction, and even grade tests.\p
Teachers will still need to intervene when students run into difficulties. And for the foreseeable future, softbots will not be able to read facial expressions, although research is being carried out in this area. Now the developers are beginning to take serious note of questions such as whether or not softbots should express disapproval of a student's failing efforts, or show it in their own facial expressions.\p
One lesson has already been learned from Steve, who started out as a face and torso, with hands that only came into the picture when they had to do something. The team found that his lack of arms bothered too many people, so they have been adding more body parts. For the moment, voices will be synthesised locally, using a text-to-speech generator. Even if this gives an obviously artificial voice, audio clips take up too much memory. In the same way, the x-rays that Adele refers to are downloaded in advance and stored locally.\p
For more information about Steve and Adele and the Center for Advanced Research in Technology for Education (CARTE), point your browser to \Bhttp://www.isi.edu/isd/carte/\b\p
#
"PalmPilots to rule the world?",928,0,0,0
(Dec '98)
The PalmPilot, the popular handheld information manager, can do rather more than its specifications suggest. During December, European users reported success in getting the infrared-equipped Palm III to emulate remote-controlled car keys. The trick is to use freely available software called OmniRemote, designed to let infrared-equipped PalmPilots emulate the behaviour of any \Jtelevision\j remote control.\p
OmniRemote samples the digital code sent out via infrared by any sort of remote-control system. Once the sample has been captured, the unit can be trained to send out remote signals to control TVs, stereos, and VCRs. All a thief needs to do is spot an approaching car owner and surreptitiously move into the path of the infrared beam as the person unlocks the car, capturing the key's code for later replay.\p
Manufacturers say they have the system under control: cars less than three years old have a changing code system, called "rolling code" or "code shift", which means that the lock and the key set themselves to a new sequence after each use. PalmPilot's makers, 3Com, ran for cover a bit, pointing out that the problem has been around for years, in the form of universal remote controls and infrared-equipped laptop computers. They say they are no more to blame than paper clip manufacturers are if a paper clip is used to pick a lock. Interestingly, PalmPilot hackers agree with the company.\p
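The principle of a rolling code is simple enough to sketch in a few lines of Python. This is a generic illustration, not any manufacturer's actual scheme: fob and lock share a secret key and a counter, each button press transmits a keyed hash of the counter, and the lock accepts codes slightly ahead of its own counter (in case some presses went unheard) but never one it has already seen, so a replayed code fails.\p

import hmac, hashlib

SECRET = b"shared-at-manufacture"  # hypothetical pre-shared key

def code_for(counter: int) -> bytes:
    """The code broadcast for a given counter value: a truncated keyed hash."""
    return hmac.new(SECRET, counter.to_bytes(8, "big"),
                    hashlib.sha256).digest()[:4]

class Lock:
    def __init__(self, window: int = 16):
        self.counter, self.window = 0, window

    def try_unlock(self, code: bytes) -> bool:
        # Accept codes up to `window` steps ahead, never behind.
        for step in range(1, self.window + 1):
            if hmac.compare_digest(code, code_for(self.counter + step)):
                self.counter += step  # resynchronise with the fob
                return True
        return False

lock = Lock()
first_press = code_for(1)            # a thief records this transmission
print(lock.try_unlock(first_press))  # True: legitimate first use
print(lock.try_unlock(first_press))  # False: the replay is rejected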
Meanwhile, Canadian hackers have discovered that the PalmPilot can also be used for free phone calls. Once again, it is a new variation on an old trick, exploiting a weakness in older pay phones that hackers have used for years. The new hack, in the form of RedPalm software, plays tones through the PalmPilot's speaker that can fool some phones into believing that callers have deposited quarters.\p
The name RedPalm refers to the machine carrying the software which emulates a "red box", a device made from kit parts, or even from a recordable greeting card, which performs the same function.\p
#
"The oldest tropical ice core yet",929,0,0,0
(Dec '98)
As we indicated in June 1997 (\BEquatorial ice cores\b), the possibility of gathering ice cores from a \Jglacier\j atop a Bolivian \Jvolcano\j has had scientists interested for some time. A recent report in the journal \IScience,\i based on an analysis of ice cores drilled there, paints a vivid picture of climate conditions in the tropics over the past 25,000 years, and describes a tropical climate rather different from what many researchers have thought. The ice at the bottom of the cores was formed during the last glacial maximum, the coldest part of the last \Jice age\j, making it the oldest core ever recovered from the tropics.\p
These findings go a long way to help build a global climate record that reaches from the North to the South Pole, and the results are turning in some surprises. Until very recently, most researchers believed that only the polar and mid-latitude regions experienced drastic cooling during that period and that the tropics were largely unaffected. The new cores, however, suggest that the tropics were much cooler during the last glacial stage.\p
They also reveal that the area was eight times less dusty during the height of that last \Jice age\j than it is now. A decrease in dust indicates that the regional climate was much wetter at the time when the ice was laid down.\p
Gathering the data involved researchers from Ohio State, \JPennsylvania\j State University, the Russian Academy of Sciences and ORSTOM in \JEcuador\j. Together, they climbed Sajama, a 6542-metre (21,463 feet) extinct \Jvolcano\j towering over the Bolivian \JAltiplano\j, a vast plateau of high desert. Once there, they used a solar-powered drill to bore through the ice cap at the summit and retrieve two cores. Each core reached to bedrock: one at 132.4 metres (434.3 feet), the other at 132.8 metres (435.6 feet).\p
The base ice in the cores dates back 25,000 years, into the coldest period of the last \Jice age\j -- the last glacial maximum (LGM). The cores also contained insects and paper-thin bark fragments from the polylepis trees that populate the lower flanks of Sajama, allowing the first-ever confirmatory carbon-dating of ice cores. Organic material less than 1.5 metres (4.9 feet) from the bottom of the cores was dated to 24,000 years ago.\p
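Assigning ages to the rest of a core is, at its crudest, interpolation between such dated tie points. The Python sketch below uses only the two figures quoted above and ignores the strong compression of annual layers with depth that real glaciologists must model, so it illustrates the idea rather than the team's actual method.\p

import numpy as np

# Tie points from the deeper Sajama core: organic material about 1.5 m
# above the 132.8 m base dated to 24,000 years; base ice put at 25,000.
depths_m = np.array([131.3, 132.8])
ages_yr = np.array([24_000, 25_000])

def age_at(depth_m: float) -> float:
    """Linearly interpolated age for a depth between the tie points."""
    return float(np.interp(depth_m, depths_m, ages_yr))

print(f"Ice at 132.0 m: about {age_at(132.0):,.0f} years old")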
In the June 1997 report, we indicated that the cores were to be brought down by hot air \Jballoon\j, but one of the expedition has advised us by e-mail that this did not come off. "The \Jballoon\j delivery from the top of Sajama failed to materialise. A combination of extreme winds at the summit and altitude sickness among the flight crew precluded that approach. In the end, the core sections were man-hauled down from the summit." According to Lonnie Thompson, they had to build a road through a Brazilian national forest so that the trucks could get to the base of Sajama.\p
\BKey names:\b Lonnie Thompson, Bruce Koci, V.S. Zagorodnov, Ping-Nan Lin, V.N. Mikhalenko, T.A. Sowers and B. Francou.\p
#
"Grandmothers can reduce NHS costs?",930,0,0,0
(Dec '98)
Grandmothers are an interesting phenomenon, as we noted in \BExplaining Grandmothers,\b April 1998. Now a mid-December report in the \IBritish Medical Journal\i suggests that grandmothers are of economic value to Britain's \INational Health Service (NHS).\i It seems that grandmothers can provide reassurance for families with young children, thereby reducing the likelihood of parents taking children to accident and emergency (A&E) departments unnecessarily.\p
Dr Emma Fergusson and colleagues from the Royal Free \JHospital\j, London say that children who have a grandmother involved in their care are less likely to visit A&E with minor or trivial conditions. This appears to be a serious story, but the news perhaps needs to be taken in the context of the next story -- it is, after all, the silly season.\p
#
"Fly fishing could be costing the NHS",931,0,0,0
(Dec '98)
Described as the CRACKPOT study in evidence-based \Jtrout\j fishing, this is a \Jparody\j of standard research design as used in the medical form of the \Iscientific method.\i Ineffective \Jtrout\j fishing flies may be costing the British NHS valuable resources, say researchers from Oxford in the December 17 \IBMJ.\i\p
In a study conducted on the River Kennet in Berkshire, Julian Britton, Consultant Surgeon from the John Radcliffe \JHospital\j, and colleagues from the Radcliffe Infirmary and Wadham College in Oxford, find that doctors are poor predictors of which fly to use when fishing. They extrapolate this finding and suggest that doctors who fish for \Jtrout\j as a hobby may be spending unnecessarily lengthy periods of their leisure time on the river rather than reading Department of Health circulars and composing letters of helpful advice to their Minister.\p
In a deadpan account, \IBMJ\i publicists say the authors concede that their study may be regarded as nothing more than a fishing expedition as it is not based on an agreed hypothesis and conclude that their findings call for the urgent funding of a definitive, large multi-river trial.\p
#
"Chocolate and sweets can make you live longer?",932,0,0,0
(Dec '98)
And in the same issue of the \IBMJ,\i good news for those whose Christmas dinners were too large, and augmented by too many other goodies. Indulging in sweets a few times a month can help you to live longer, suggest I-Min Lee, Assistant Professor in the Department of \JEpidemiology\j, Harvard School of Public Health, \JBoston\j, and Ralph Paffenbarger, also from the Harvard School of Public Health.\p
They say that a study of 7841 men who commenced their studies at Harvard between 1916 and 1950 shows that those who ate candy (chocolates or sweets) lived almost a year longer than those who abstained. Consumption of candy was assessed in 1988, when men were aged 65 years on average.\p
In a longitudinal study, over the next five years, mortality rates were lowest among those indulging one to three times a month and highest among those who abstained, even after accounting for confounding factors. But in bad news for Christmas binge enthusiasts, they found that those who indulged three or more times a week did not reap as much benefit as men eating chocolates or sweets one to three times a month and therefore caution that "as with most things in life, moderation seems to be paramount". Even so, those eating sweets three or more times a week still did better than abstainers.\p
Lee and Paffenbarger say that the presence of antioxidant \Iphenol\i compounds in chocolate, which are also present in red wine, could be helping to reduce the risk of coronary \Jheart disease\j. They also speculate that cacao, from which chocolate is made, can inhibit \Joxidation\j of low density lipoprotein \Jcholesterol\j as well as enhance immune function, leading to decreased risks of \Jheart disease\j and cancer.\p
Sadly, as so often happens with research today, the authors have been forced to admit a potential conflict of interest, as they each tend to be partial to a chocolate bar a \Jday\j.\p
See also: \IChocolate and \Jcocoa\j\i\p
#
"1999 Science in Review",933,0,0,0
\JJanuary, 1999 Science Review\j
\JFebruary, 1999 Science Review\j
\JMarch, 1999 Science Review\j
\JApril, 1999 Science Review\j
\JMay, 1999 Science Review\j
\JJune, 1999 Science Review\j
\JJuly, 1999 Science Review\j
\JAugust, 1999 Science Review\j
\JSeptember, 1999 Science Review\j
\JOctober, 1999 Science Review\j
\JNovember, 1999 Science Review\j
\JDecember, 1999 Science Review\j
#
"January, 1999 Science Review",934,0,0,0
\JThe body clock reaches further\j
\JWhat use is a theoretical breakthrough?\j
\JSunscreen ingredient can cause DNA damage \j
\JResistant bacteria\j
\JCDC plan for resistant bacteria\j
\JHeart research Top 10 for 1998\j
\JNew diagnostic test for new variant Creutzfeldt-Jakob disease\j
\JUnderstanding Taxol\j
\JDietary soy set to be the new miracle food\j
\JBut there's more . . .\j
\JThe smoker's gene\j
\JThomas Jefferson and Sally Hemings\j
\JWolf Prize to Eric R. Kandel\j
\JThe planet makers\j
\JIs Pluto really a planet?\j
\JHolding up ET\j
\JDark matter report\j
\JBiggest in the universe . . .\j
\JThe bonds in water\j
\JVirtual sculptures\j
\JApples which don't turn brown\j
\JPhotonic crystals in the news\j
\JEarthquake deaths\j
\JAncient Greek quakes\j
\JQuestioning the Cambrian explosion\j
\JReconstructing the theropods\j
\JDeep treasure\j
\JFormaldehyde emissions\j
\JA cause for war\j
\JWhat is a quarantine system worth?\j
\JA bigger Arctic ozone hole in 1997\j
\JTwenty-year temperature record revealed\j
\JSOHO in more trouble\j
\JMajor NEAR engine burn completed \j
\JTelomerase does not make cells cancerous\j
#
"The body clock reaches further",935,0,0,0
(Jan '99)
A report in \ICell\i during January indicates that the same genetic machinery which controls the inner movements of the biological clock (or \B\1biological rhythm\b\c) may also drive the basic rhythms of the body, such as the rise and fall of body \Jtemperature\j, blood pressure, \Jhormones\j and the sleep-wake cycle. \p
According to the report, the clock consists of at least six proteins which are involved in a series of elegant and precisely timed stages which result in genes being switched on and off over a 24-hour cycle. This timepiece centres on a structure called the suprachiasmatic nucleus (SCN), which lies behind the eyes.\p
The researchers have discovered that the gene for arginine vasopressin, a \Jpeptide\j that is released rhythmically in specific brain regions over the course of the \Jday\j, contains the same 'on switch' as one of the six genes in the central clock mechanism. The switch for this output gene appears to be turned on and off by the same proteins, or transcription factors, which control the central clock genes.\p
#
"What use is a theoretical breakthrough?",936,0,0,0
(Jan '99)
Your reporter frequently finishes an article on yet another exciting breakthrough of the month with a list of areas where the breakthrough may have application over the next few years. Scientists usually supply this sort of information as quickly as possible, partly because they feel driven to justify the enquiries that they began simply out of curiosity.\p
Sometimes, the justification for a breakthrough is hard to make, because the end use will eventually come from a completely unexpected direction. Two standard legends, told variously about many scientists, have a scientist responding to the question "what use is it?" in two ways. The first answer is "what use is a baby?", while the second runs along the lines of "one \Jday\j, you may be able to tax it". (See \B\1electrical measurement\b\c.)\p
Neither of these answers, it seems, is sufficient for today's decision-makers, which means science needs to take a careful look at itself, to see how the actions of basic researchers can be justified. Given the present emphasis from bean counters and politicians on "accountability" (meaning immediate returns on research), the mixture of pure and applied research, and the connection between technology and basic research, in the two developments discussed below is worth studying.\p
Professor Robert M. White from Carnegie-Mellon University published an interesting reflective article in the journal \IChemtech\i in January, looking at how two conspicuous technologies, \B\1magnetic \Jresonance\j imaging\b\c and global positioning systems (see \B\1Car Navigation System Update\b\c, January 1997), were developed. According to White, the connection is "serendipitous and asynchronous". In some cases, he says, innovation is asynchronous, with technology developed in a practical way before the theory is understood; in other cases, the theory comes first, and the applications come later.\p
White looks first at a view of scientific research put up by Donald Stokes, in his \IPasteur's Quadrant\i (Brookings Institution Press: Washington, DC, 1997). Stokes maps research onto a plane where one axis measures the extent to which the use of the research is considered, while the other dimension reflects the extent to which research is aimed at fundamental understanding.\p
The plane can be divided into four quadrants, three of which are named for scientists whose work describes that area. \B\1Louis Pasteur\b\c stands in the "use-and-understanding-inspired" quadrant, \B\1Niels Bohr\b\c represents the more "curiosity-driven" quadrant, and the \B\1Thomas Alva Edison\b\c quadrant includes research primarily directed toward solving practical problems without particular reliance on the underlying science, driven by the need to find things which could be used. \p
The fourth quadrant, where research is driven by neither use nor understanding, is closest to the gathering and organisation of data. White's two examples both involved establishing the fundamentals first, though in each case, extra work was required to make the technologies work. In other words, he is concerned with the importance of Bohr's quadrant.\p
MRI, he points out, began as an attempt to measure the magnetic property of the nucleus of \Jhydrogen\j by exciting it with a radio frequency field, using what we now call nuclear magnetic \Jresonance\j, or NMR. Much later, NMR was found to distinguish between healthy and cancerous tissue, from outside the body. So the curiosity-driven research came first, and the applications followed later.\p
Again, practical MRI needed a very intense magnetic field over a large volume, and Raymond Damadian was fortunate to find that the high-\Jenergy\j physics community had just developed superconducting magnets for its own curiosity-driven purposes, including one with a 135 cm (53 inch) bore.\p
So how do politicians satisfy themselves that such basic research is really worthwhile, when the benefits may not appear for thirty years or more? The answer, suggests White, lies in the way that scientists judge themselves and their community, where pointless research and research proposals are eliminated by the peer review process. Science and scientists, he points out, operate under a great deal of self-discipline. In simple terms, research that lacks impact soon finds that it lacks funds. Even if there is no simple way of measuring effectiveness, that does not mean that it cannot be assessed.\p
White's second case study is GPS, which derived from an attempt to discover better ways of measuring time. Building on the knowledge that atoms could be excited in an atomic beam by electromagnetic radiation, Isidor Rabi developed the first \B\1atomic clock\b\c, and over time, these devices were refined. In part, this work was done in the hope of improving the measurements of red shifts in \Jastronomy\j.\p
Once satellites became common, the opportunity arose to merge the new technologies, and let tiny differences in timings identify a unique position on the globe, based on data from a number of satellites. Now, GPS systems are common in cars.\p
In other words, says White, the "reservoir of knowledge established in Bohr's quadrant" is the source of the real gains in technology, but nobody can say where or when the next advance will occur. Congress should concern itself more, he says, with "the innovation process itself, which includes science education, standards, and tax policy that allow all the pieces of a technology to be brought together".\p
Breakthroughs like the proof of Fermat's Last \JTheorem\j, the exploration of the surface of Mars, the discovery of the "top" quark, and the synthesis of the \Jsoccer\j-ball form of carbon known as "buckyballs" have yet to offer any useful applications. The work which led to NMR was rewarded in 1952 when Purcell and Bloch were awarded the Nobel Prize, long before anybody realised what the work would lead to.\p
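As a footnote to White's GPS example, the timing idea can be made concrete with a toy calculation. In the Python sketch below, the satellite positions and receiver location are invented, the travel times are exact, and there is no clock error (real GPS must also solve for the receiver's clock offset as a fourth unknown); the position is then recovered from the implied ranges by iterative least squares.\p

import numpy as np

C = 299_792_458.0  # speed of light, metres per second

# Invented satellite positions (metres, Earth-centred coordinates).
sats = np.array([[15e6, 10e6, 21e6], [-12e6, 18e6, 17e6],
                 [20e6, -8e6, 16e6], [-5e6, -19e6, 18e6]])
receiver = np.array([1.2e6, -0.3e6, 6.2e6])  # the "unknown" answer

travel_times = np.linalg.norm(sats - receiver, axis=1) / C  # measured

x = np.zeros(3)  # initial guess: centre of the Earth
for _ in range(20):
    ranges = np.linalg.norm(sats - x, axis=1)
    residual = travel_times * C - ranges       # metres still unexplained
    unit_vecs = (x - sats) / ranges[:, None]   # d(range)/d(position)
    dx, *_ = np.linalg.lstsq(unit_vecs, residual, rcond=None)
    x = x + dx

print(np.round(x))  # recovers [1200000, -300000, 6200000] to high precision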
#
"Sunscreen ingredient can cause DNA damage",937,0,0,0
(Jan '99)
A report, released on the \JInternet\j in late December, and published in \IChemical Research in \JToxicology\j\i in January, warns that a major active ingredient in some \B\1sunscreen\b\c formulations damages DNA when exposed to sunlight. This effect has been observed only in the laboratory at this stage, but researchers say that if similar damage occurs within skin cells, it could destroy them, or possibly initiate changes leading to skin cancer.\p
The ingredient is PBSA (2-phenylbenzimidazole-5-sulfonic acid), which is commonly used in sunscreens on sale in some parts of the world, such as the U.S. and Europe. It protects skin by strongly absorbing harmful high-\Jenergy\j UV-B (290-320 nm) light in natural sunlight, but in the process PBSA becomes energised and, in principle, capable of damaging adjacent skin tissue.\p
PBSA which has been exposed to light is able to damage the \Jguanine\j base sites in DNA, and this is how it \Imay\i increase the risk of developing skin cancer. For now, the authors stress, the risks associated with using such a sunscreen are far less than the risks of not using a sunscreen at all.\p
There is currently no evidence, say the authors, that PBSA actually enters human skin cells, but the report notes that "this new information regarding the photosensitising properties of PBSA sounds a cautionary note: it may be safer to replace it with another ultraviolet filter that does not attack DNA".\p
#
"Resistant bacteria",938,0,0,0
(Jan '99)
\IStreptococcus pneumoniae,\i bacteria which can cause life-threatening infections in adults and especially children, are rapidly becoming resistant to \Jpenicillin\j and cephalosporins, but luckily they are not increasing in virulence, according to an article in November's issue of \IPediatrics\i, written by Moshe Arditi.\p
Yet while the new strains are no more virulent, their resistance to most \B\1antibiotics\b\c makes them a much greater threat to victims, and the risk is accelerating. Over three years, Dr Arditi recorded resistance to ceftriaxone (a cephalosporin) jumping from 1.7% to 5% to 15%, while \Jpenicillin\j resistance jumped from 13% in the second year to 27% in the third year of the study, 1996. Less than a decade earlier, incidences were reported to be around the 0.02% level.\p
Many varieties (different serological types) of pneumococci exist. The \Jbacteria\j can cause \Jpneumonia\j and many other infections, such as sinusitis. They are also the most common cause of acute middle ear infections and of invasive bacterial infections such as \Jpneumonia\j and meningitis, causing half a million cases of \Jpneumonia\j in adults and children and 5000 cases of bacterial meningitis in children in the USA each year.\p
(Bacterial meningitis is a bacterial infection of the meninges, the membranes that envelop the brain and spinal cord. Of the 180 children in this study, 14 died, while 25% developed neurological problems and 32% suffered moderate to severe hearing loss.)\p
While the prospect looks grim, there is good news as well. Dr. Arditi says that pneumococcal polysaccharide vaccines, which use a \Jcarbohydrate\j from the cells of the \Jbacteria\j to provide immunity, have been developed and are in use. There is a snag, however: the vaccines are safe and effective for adults, but they have not worked well in protecting young children, especially those younger than two years of age. Now newer vaccines, called multivalent pneumococcal conjugate vaccines, are available for use on younger patients. These combine vaccines against several strains of pneumococcus and attach a carrier protein, which causes children to respond better to the vaccines.\p
Ten years ago, the most common cause of bacterial meningitis was \IHemophilus influenzae\i type B (Hib), but since the introduction of a specific Hib vaccine in 1989, \IStreptococcus pneumoniae\i has become the most common cause of \Jmorbidity\j and mortality resulting from bacterial meningitis. \p
#
"CDC plan for resistant bacteria",939,0,0,0
(Jan '99)
In mid-January, the Centers for Disease Control and Prevention (CDC) published new recommendations in \IThe Pediatric Infectious Disease Journal\i for treating middle ear infections (acute otitis media) caused by resistant strains of \IStreptococcus pneumoniae\i (DRSP). \p
While amoxicillin remains the preferred oral antibiotic for DRSP, a SmithKline Beecham \JInternet\j press release followed soon after the article, drawing special attention to a SKB antibiotic, Augmentin (amoxicillin clavulanate), which is on the CDC list, together with the products of two rivals, Ceftin (cefuroxime axetil) and Rocephin (ceftriaxone). There are 13 other drugs which may be used at times in treating otitis media, but these are less effective against DRSP.\p
Three \Jbacteria\j cause most earaches: \IS. pneumoniae\i causes about half of all ear infections while \IHemophilus influenzae\i accounts for 20-30% and \IMoraxella catarrhalis\i accounts for 10-15%. The three CDC-nominated drugs are effective against each of these, and also against DRSP, but Augmentin is mentioned many more times than either of the other products, Ceftin and Rocephin.\p
While the release identifies its source and is completely ethical and responsible, it highlights one of the problems of using the \JInternet\j to gather information like this, where other companies may not be quite so careful. It is only natural that a pharmaceutical company will highlight the usefulness of its own product -- but how many are as open as SKB in underlining their financial interest?\p
#
"Heart research Top 10 for 1998",940,0,0,0
(Jan '99)
The American Heart Association (AHA) announced its "top 10" on December 30, 1998. This is the third year the list has been offered, and it recognises "achievements in basic and clinical research that may have the greatest impact in improving the prevention and treatment of cardiovascular disease".\p
The list includes gene therapy to create a "natural" bypass to circumvent plaque obstructions in the heart's blood vessels, the use of the so-called "super aspirin" platelet IIb/IIIa receptor blockers which keep blood \Jplatelets\j from clumping and forming blood clots that can trigger a heart attack or stroke, and the use of simple anti-inflammatory drugs such as aspirin to help prevent the blood clots that can trigger heart attacks and strokes.\p
Other honourable mentions were given to non-surgical imaging technology, and the discovery that heart cells may recover as a result of the left ventricular assist device or LVAD, an auxiliary \Jpump\j which is used to help patients with severe heart failure who are awaiting heart transplantation surgery. In some cases, use of the LVAD has allowed some patients to make do without a heart transplant.\p
Other developments on the list include the finding that smoking fewer than 10 cigarettes daily dramatically increases the risk of death from \Jheart disease\j, new evidence on diet and "bad" \B\1cholesterol\b\c, and public education campaigns which are succeeding in getting people to recognise symptoms and seek treatment earlier.\p
#
"New diagnostic test for new variant Creutzfeldt-Jakob disease",941,0,0,0
(Jan '99)
An effective test for variant Creutzfeldt-Jakob disease (vCJD), the human form of the disease we call \B\1scrapie\b\c in sheep and \B\1bovine spongiform encephalopathy (BSE)\b\c in cows, was reported in \IThe Lancet\i in mid-January. While the disease could previously only be identified by a post mortem examination, or by a brain \Jbiopsy\j, the new test requires minor surgery to take a tonsil sample, which is then analysed to detect a rogue form of 'prion' protein. The test was developed by a team led by Professor John Collinge at the Imperial College School of Medicine at St. Mary's \JHospital\j in London. \p
Interestingly, only patients with vCJD, and not classical CJD, have the rogue protein detectable in the tonsil. This shows that the prion infection in vCJD behaves quite differently from that in classical CJD. The test can be used on anonymous samples of tonsil tissue taken during routine surgery, to establish how common vCJD infection is in Britain. The rogue prion turns up in the \Jtonsils\j well before other disease symptoms appear, but as yet, there is no cure for vCJD.\p
#
"Understanding Taxol",942,0,0,0
(Jan '99)
Scientists studying the molecular structures involved in cell division have explained at least part of the operation of the anti-cancer drug, \B\1Taxol\b\c, according to a report in the \IJournal of Molecular \JBiology\j\i during January. Since the mid-1970s, scientists have known that Taxol targets microtubules in the cell. The microtubules act like cables, pulling the two new cells apart during cell division. The action of Taxol on the microtubules prevents the cells from dividing, which then triggers apoptosis, a cellular mechanism also known as programmed cell death.\p
Apoptosis operates as a natural mechanism that kills malfunctioning cells as part of the body's defence against cancer: the excess cells formed in a \Jtumour\j are destroyed by apoptosis, and so a cancer can only take hold if it can somehow prevent the process.\p
The researchers were studying the structure of microtubules, and they found that Taxol also attacks a second target in cancer cells, knowledge which may make it easier for others to develop more efficient anti-cancer drugs. To do this, they created a huge set of bacterial viruses, each genetically engineered to exhibit a fragment of a different cellular protein. Then they screened this collection to see which viruses bind to Taxol.\p
While the viruses with microtubule proteins bound to Taxol, other viruses carrying a protein called Bcl-2, a molecule first discovered in human B-cell leukemias, also bound to Taxol. Bcl-2 acts to block the cell from completing the process of apoptosis. When Taxol binds to the protein, it stops the Bcl-2 from working, and this allows apoptosis to continue in the normal way.\p
\BKey names:\b Lee Makowski, Bonnie Wallace\p
#
"Dietary soy set to be the new miracle food",943,0,0,0
(Jan '99)
In recent times, \B\1soya bean\b\c products have been shown to lower \Jcholesterol\j in humans. The food is a source of omega-3 fatty acids and \Jcalcium\j, and it also supplies most of the essential amino acids people need to make proteins. There are also suggestions, still to be confirmed, that soybeans may lessen some symptoms of menopause. In late January, J. Mark Cline reported on the effects of soy on post-menopausal monkeys. Estrogen-replacement therapy has an unfortunate side-effect: it causes cell proliferation, where cells divide uncontrollably.\p
This division sets the scene for cancer, in both the mammary glands of the breast and in the endometrial tissue of the uterus. When Cline fed his monkeys on soy products, the cell proliferation was stopped, suggesting that the soy protein has the same role as progestin does in common estrogen/progestin formulations, reducing the risk of cancer. Some combinations were effective, but Cline reported that the relative dose of soy protein -- which contains plant estrogens (called phytoestrogens) -- and estrogen replacement therapy may be critical. Some rats whose ovaries had been removed, on low doses of estrogen replacement therapy, were given the dose of soy estrogen found in natural soy, and this dose caused an increase in cell proliferation in the breast tissue.\p
At a higher dose of estrogen replacement therapy, the soy reduced the cell proliferation in both the uterus and breast. In other words, we have an interesting effect here, one which will undoubtedly trigger a lot of research during 1999. One key issue is the effect that eating soy foods has on people generally. Epidemiologists know that Asians living in Asia have very different cancer rates from their cousins living in the United States or \JAustralia\j. They also know that Asian diets typically contain less fat and a higher proportion of vegetables, including soy protein, when compared to Western diets. \p
It has been shown that American-born children of Asian immigrants have a 60% higher risk of developing breast cancer than do people born in Asia who immigrate to America and take up Western diets, suggesting that the protective effect happens early in life.\p
#
"But there's more . . .",944,0,0,0
(Jan '99)
As well, a recent report in \INeuroscience Letters\i indicates that soya beans may bring relief from pain -- or at least they may do so if you are a rat. In the study, rats fed a diet containing soy meal developed far less pain after nerve injury than rats on soy-free diets. The finding may even go further, as rats and humans have a number of similarities in the way they perceive pain, suggesting that the effect in rats may also be seen in humans. The discovery came from a study on Yoran Shir's animal model for pain, while he was studying in America. After nerve injury, human patients can feel pain from even a gentle touch, and Shir's rats had been treated to simulate this effect. In the earlier work in Israel, the sciatic nerve to one foot was partly severed surgically in anaesthetised rats, and researchers then measured sensitivity to pain by touching the foot with fine probes of varied size.\p
At Johns Hopkins, Shir tried to repeat this work, but most of the laboratory rats showed no heightened sensitivity to pain. The researchers investigated every possible factor, looking at rat strains, surgical techniques, and even the rat food. In the end, it was the rat food that proved to be the culprit, with American rat food pellets containing more soy protein than the original Israeli food. When the soy-fed rats were transferred to a soy-free diet, the pain sensitivity returned.\p
So far, researchers have no idea what part of the soy meal controls the sensitivity to pain, or what the mechanism might be. Two possibilities are current favourites: either soy proteins may interfere with the ways cells transmit pain messages, or the phytoestrogen plant \Jhormones\j may be involved. Either way, the discovery looks interesting. James Campbell, the other researcher involved, comments that the ". . . concept explains why aspirin isn't particularly useful for pain from nerve injury, for example, but works for \Jinflammation\j". \p
#
"The smoker's gene",945,0,0,0
(Jan '99)
Why do some people have more trouble giving up smoking than others? The 20% or so of smokers who have broken their addiction to \Jtobacco\j like to suggest that they have stronger will power, but a report in \IHealth \JPsychology\j\i during January identifies what has been called "a gene for smoking".\p
In fact, it would more correctly be called a gene for not smoking, since researchers have found that people carrying a particular version of the \Jdopamine\j transporter gene (SLC6A3-9) are less likely to start smoking before the age of 16. They are also more likely to be able to quit smoking if they start. \p
The study looked at 289 smokers and 233 non-smokers, and found a significant difference in the frequency of the SLC6A3-9 \Jgenotype\j among non-smokers. A person with the SLC6A3-9 \Jgenotype\j has lower novelty-seeking traits than a person without this \Jgenotype\j, according to the study, and since novelty seeking has been linked with smoking, this seems to explain the difference.\p
The SLC6A3-9 gene is not a complete answer, rather it is ". . . an influence on the individual's general need and responsiveness to external stimuli, of which \Jcigarette\j smoking is but one example".\p
#
"Thomas Jefferson and Sally Hemings",946,0,0,0
(Jan '99)
In November 1998, political correctness was to the fore as scientists offered a match of sorts for the Clinton \Jimpeachment\j case by asserting that \B\1Thomas Jefferson\b\c must have fathered an illegitimate child by his slave Sally Hemings. In early January, the same researchers admitted in a letter to \INature\i that they were wrong, and that there are at least five of Jefferson's family who could also have fathered the child. The analysis was based on DNA sequences in the Y \Jchromosome\j of the Jefferson family that matched DNA from the Hemings family, linked to old rumours about Jefferson's sex life. \p
Herbert Barger, a descendant of Jefferson, pointed out that Thomas' younger brother Randolph "reputedly liked to party in the slave quarters", and was a likely candidate for the role of Eston Hemings' father. Randolph's sons were also possible fathers, as they had the same Y \Jchromosome\j as their father and their famous uncle. \p
The researchers concede that Barger could be right, but say they left the new suspects out of consideration because the Jefferson descendants had previously claimed that the sons of Thomas Jefferson's sister, Peter or Samuel Carr, were the most likely candidates -- and the Y \Jchromosome\j evidence ruled them out. Two articles published in \INature\i in November are now recognised as "misleading", says the recantation. The excuse: the articles were hurried into print to beat the popular media, which had learned about their results and were poised to publish.\p
The whole analysis was even more flawed than the letter admits, since an unknown illegitimate Jefferson son (from any of the candidates, or their fathers or paternal grandfathers) could also have been the father of Eston Hemings, as indeed could an unrecognised third cousin, so long as he was descended in a direct male line from the Jefferson line.\p
#
"Wolf Prize to Eric R. Kandel",947,0,0,0
(Jan '99)
Professor Eric Kandel, of Columbia University and the Howard Hughes Medical Institute, a pioneer of memory research, has earned the 1999 Wolf Foundation Prize in Medicine for four decades devoted to discovering what molecular changes take place in cells when an organism learns a new behaviour.\p
The prize, Israel's most prestigious award, has been given each year for 20 years, and so far, 17 of the recipients have gone on to receive a Nobel Prize. Kandel's work looks at what happens in the brain when a memory is formed or when learning takes place. Kandel and his collaborators looked at the "sea hare", \IAplysia\i, a large slug-like sea snail. This animal has a small, flattened shield-like shell running down its back, hidden beneath the mantle.\p
The animal has a very simple \Jnervous system\j, but it can be taught to retract its gills in response to a stimulus, so researchers were able to discover which nerve cells were involved in the action. They also traced the formation of long-term memory to the nucleus of the neuron, and showed that the set of biological signals which leads to a memory ends with a molecule called CREB. This molecule switches on dozens of genes which stimulate the growth of synapses. This growth then leads to the formation of persistent, long-term memories. \p
In 1990, they discovered that if they blocked CREB, these events were halted, demonstrating that CREB is a key to the genetic basis of long-term memory. A later discovery was labelled CREB-2, leading to the original CREB being renamed CREB-1. CREB-2 is an inhibitor of CREB-1, stopping the formation of long-term memory, so the sea hare must produce CREB-1 and remove CREB-2. Kandel is now extending his work to study genetically modified mice.\p
#
"The planet makers",948,0,0,0
(Jan '99)
The national meeting of the American Astronomical Society heard in January that a three-year spectroscopic survey reveals a group of stars near our \Jsolar system\j that have a much greater allotment of heavy elements than other nearby stars that are like our \Jsun\j. All 12 stars in the group have giant, Jupiter-mass planets orbiting them.\p
The concentrations of heavy elements, those heavier than \Jhydrogen\j and \Jhelium\j, are typically two to three times greater than in the \Jsun\j, according to Guillermo Gonzalez. In fact, two of the stars in the sample have the highest heavy element abundances reliably measured in any star in the universe. This supports the theory that stars formed in heavy-element-enriched interstellar clouds are more likely to produce a giant \B\1planet\b\c. It is possible, however, that the giant planets move toward the parent star in a process called \Jplanet\j migration, dragging any remaining inner disk material or smaller planets with them, and enriching the star with heavy elements. Gonzalez added that his results could be useful to \Jplanet\j hunters, who can use the knowledge of a star's heavy element abundance as a criterion in preparing a list of target stars to search.\p
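Astronomers usually quote these abundances on a logarithmic scale relative to the \Jsun\j, so "two to three times the solar value" corresponds to roughly +0.3 to +0.5 on the standard [Fe/H] scale:\p
\[ \mathrm{[Fe/H]} = \log_{10}\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{*} - \log_{10}\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\odot}, \qquad 10^{0.3} \approx 2, \quad 10^{0.5} \approx 3.2 \]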
#
"Is Pluto really a planet?",949,0,0,0
(Jan '99)
There was a brief flurry of anxiety in January when Mike A'Hearn, an American astronomer, conducted an email poll of 500 astronomer members of the International Astronomical Union, asking them whether they considered \B\1Pluto\b\c to be a genuine \Jplanet\j or not. Pluto is much smaller in diameter than the other planets, its \Jorbit\j is more elliptical than the other planets' orbits, and its composition is unlike that of the inner rocky planets, and just as unlike the outer gas giants.\p
So if Pluto is not a \Jplanet\j, what is it? It may be one of the "trans-Neptunian objects" (TNOs). These are small icy objects beyond Neptune's \Jorbit\j, more than 70 of which have been discovered recently. Some of them have elliptical orbits like Pluto's, and some people believe that Pluto is just the largest of the new group. At the end of January, there was no news available on the vote, but whatever the outcome, Pluto will remain a most unusual object in the sky.\p
Footnote: in early February, Brian Marsden, head of the International Astronomical Union's Minor \JPlanet\j Center, announced a sensible compromise. Pluto is to be listed as a TNO, but it will retain its planetary status, at least so far as astronomers are concerned.\p
#
"Holding up ET",950,0,0,0
(Jan '99)
Gamma ray bursts may be even more important than we think: an article in \INew Scientist\i in late January suggests that they may provide the answer to the paradox of \B\1Enrico Fermi\b\c about extraterrestrial civilisations.\p
Fermi pointed out that if there are other intelligences out there, they should have reached us by now. Our galaxy is about 100 000 light years across, so even if a space-faring race could explore the galaxy at just a thousandth of the speed of light, it would take them only 100 million years to spread across the entire galaxy. Since the galaxy is around 10 billion years old, the ETs should have been here by now, even from other galaxies.\p
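Made explicit, the arithmetic runs:\p
\[ t_{\mathrm{cross}} = \frac{100\,000\ \mathrm{light\ years}}{0.001\,c} = 10^{8}\ \mathrm{years}, \]
about one-hundredth of the galaxy's 10-billion-year age, which is what makes the silence so puzzling.\p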
Gamma-ray bursts (GRBs), generated by collisions between collapsed stars, may be the answer. James Annis, appropriately an astrophysicist at Fermilab near \JChicago\j, has considered the effects of GRBs on life, and concluded that any extraterrestrial civilisations may only now be beginning to explore their own galaxies. Writing in the \IJournal of the British Interplanetary Society\i, he argues that cataclysmic gamma-ray bursts often sterilise galaxies, wiping out life forms before they have evolved sufficiently to leave their \Jplanet\j. He points out that if a GRB went off in the Galactic centre, humans, two-thirds of the way out on the Galactic disc, would be exposed to a wave of powerful gamma rays lasting a few seconds.\p
The present rate of GRBs seems to be about one burst per galaxy every few hundred million years, but theory predicts that GRBs were more common in the past, perhaps frequent enough to prevent intelligent life arising. The only problem: GRBs are too brief to wipe out a whole \Jplanet\j, since the \Jplanet\j's mass would protect almost half the population. Annis suggests that GRBs are likely to have many indirect effects, such as wrecking the \Jozone layer\js that protect planets from deadly levels of ultraviolet radiation. It can only be a matter of time before people blame GRBs for one or more of the \Jearth\j's \Jextinction\j events.\p
#
"Dark matter report",951,0,0,0
(Jan '99)
A report in late December 1998 suggests that a newly discovered binary system in the Smaller \B\1Magellanic \JCloud\j\b\c (the SMC), a nearby galaxy, may help us to understand the "missing \Jdark matter\j" problem better. The report, which is to appear in \IAstrophysical Letters\i, was placed on the \JInternet\j in preprint form just a few days before Christmas.\p
Astronomers spotted the system when it acted as a gravitational \Jlens\j to magnify the light from another star. If a heavy object passes between a star and us, the force of gravity bends light coming from the star, making it appear at first brighter and then dimmer, in what is called a "lensing event".\p
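The geometry behind such events is standard microlensing theory rather than anything specific to this report: the lens magnifies the star while the star lies within the lens's Einstein radius,\p
\[ \theta_{E} = \sqrt{\frac{4GM}{c^{2}} \cdot \frac{D_{s} - D_{l}}{D_{l} D_{s}}} \]
where M is the mass of the lens and the D terms are the distances to the lens and to the source star. The event lasts as long as the lens takes to cross this angle, which is why the speed deduced for the lens depends on how far away it is assumed to be.\p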
The lensing event happened in June 1998, but required observation and detailed analysis before the nature of the lensing object could be estimated. The two most likely forms of the missing \B\1\Jdark matter\j\b\c, if it is not in the form of undetected black holes, are massive compact halo objects (MACHOs) and weakly interacting massive particles (WIMPs). Gravitational lensing is one of the few ways astronomers may detect the presence of dark, massive objects in our galaxy. If many such objects exist, they could account for the missing mass of the universe, but we still need to know if we are dealing with MACHOs or WIMPs.\p
In the first place, observations revealed that the object causing the lensing was a binary system, while further analysis revealed the speed at which the binary system was moving through space. A typical \Jlens\j within the SMC would be travelling about ten times slower than a typical \Jlens\j in the halo. The researchers concluded that if this \Jlens\j is in the SMC, the measurements imply a speed of about 75 km/sec, quite plausible for an SMC object, while if it is in the halo, it must be going at only about 35 km/sec, an improbably slow speed for a halo object. That makes it unlikely that the binary system is part of the missing \Jdark matter\j.\p
In other words, the results poke a small hole in the MACHO theory, without offering any support to the WIMP theory.\p
#
"Biggest in the universe . . .",952,0,0,0
(Jan '99)
In early January, researchers told a meeting of the American Astronomical Society that they have located the largest structure so far seen in the universe, a ribbon of very rich clusters of galaxies, stretching over some 400 million light-years.\p
Galaxies like our own seem to clump into groups of a few to several dozen galaxies, and these groups seem to be close to other groups, forming what astronomers call clusters, while clusters seem to be gathered up into superclusters, leaving much of space almost empty.\p
The new supercluster was found in the \Jconstellation\j Aquarius by a team led by David Batuski, using a 3.6-metre \Jtelescope\j at the European Southern Observatory in \JChile\j. At one point, a knot of six rich clusters within the chain is crammed into a space just a few tens of millions of light-years across, about a hundred times more tightly packed than other galaxies in the region. \p
So what forms superclusters like this? It is probably a leftover from some minor irregularity or irregularities in the Big Bang, say astronomers.\p
#
"The bonds in water",953,0,0,0
(Jan '99)
To say the least of it, \Jwater\j is a peculiar substance. In fact, according to standard theory, it should not even be a liquid, but it is a liquid because of the \Jhydrogen\j bonds which tie individual H\D2\dO molecules into larger clumps. The same effect explains why ice floats on \Jwater\j.\p
As far back as the 1930s, \B\1Linus Pauling\b\c predicted that the weak \B\1hydrogen bond\b\c between \Jwater\j molecules must derive part of its identity from the stronger \Jcovalent\j bonds within the \Jwater\j molecule. Now a new experiment has supported this prediction, according to a report in \IPhysical Review Letters\i in mid-January.\p
\JHydrogen\j bonds are found in many complex molecules -- the two halves of the DNA helix are held together by these bonds, for example, but the effects of the \Jhydrogen\j bond are at their most obvious in plain simple \Jwater\j.\p
Although the force of the \Jhydrogen\j bond is quite weak, it is enough to ensure the unusual properties of \Jwater\j. The \Jhydrogen\j bond means that a large amount of heat is needed to raise the \Jtemperature\j of \Jwater\j by one degree, giving \Jwater\j a specific heat much greater than that of most other substances. As a result, the ocean holds huge amounts of heat, allowing the seas to moderate climate fluctuations.\p
More importantly, \Jwater\j expands when it is cooled below 4 degrees \JCelsius\j, whereas most liquids only expand when they are heated. As a result, when ice forms, it forms on the surface of the \Jwater\j, making a "thermal blanket" of ice to insulate the \Jwater\j below from freezing. Therefore, life can survive in small ponds, even when the surface may be frozen for many months, and the seas are prevented from slowly filling with sinking ice.\p
The \Jhydrogen\j and oxygen atoms in H\D2\dO are linked by \Jcovalent\j bonds, where electrons are best thought of as "shared" between two atoms. The shared \Jelectron\j pair, called a sigma bond, helps to fill each atom's outer "valence" shell of electrons, a situation which makes the bond very stable. These bonds can really only be described by quantum mechanics, because the electrons do not "belong" to any single atom any more.\p
The \Jhydrogen\j bond is thought of as a simpler matter, similar to the electrostatic attraction which makes plastic attract dust after it has been rubbed with a cloth. The electrons in the sigma bond are more often near the oxygen, so that \Jwater\j molecules are polar, with a negative charge on one side, and a positive charge on the other.\p
That was the comfortable image that Pauling questioned when he suggested that the distinction was rather less clear-cut. He suggested that the \Jhydrogen\j bonds between \Jwater\j molecules would also be affected by the sigma bonds within the \Jwater\j molecules. So, in a sense, the \Jhydrogen\j bonds would partially assume the identity of these bonds! \p
One of the basic rules of science is that everything in nature moves towards its lowest available \Jenergy\j state. \p
Electrons do this by minimising their total \Jenergy\j, which includes their kinetic \Jenergy\j, their \Jenergy\j of motion. In doing so, an \Jelectron\j's velocity is reduced, which means it has a lower momentum. According to a well-established principle of quantum physics, the Heisenberg Uncertainty Principle, when an object reduces its momentum, it must spread out in space. This spreading out, or delocalisation, happens to electrons in many other situations, so what we see in \Jwater\j also applies in superconductors.\p
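In symbols, this is the familiar uncertainty relation\p
\[ \Delta x \, \Delta p \ge \frac{\hbar}{2}, \]
so when an \Jelectron\j's spread in momentum shrinks, its spread in position must grow.\p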
Turning again to quantum physics, we can also think of electrons as waves, and in the \Jwater\j molecule, the \Jelectron\j waves on the sigma and \Jhydrogen\j bonding sites overlap, which means that the electrons cannot be distinguished, and the two types of bonding become enmeshed, and interdependent, with each of the bonds taking on some of the properties of the other.\p
But how far does the effect go? How much are the \Jhydrogen\j bonds affected? Until the recent experiment, this has been a matter for conjecture and argument, but now we have clear experimental evidence.\p
Using the European \JSynchrotron\j Radiation Facility (ESRF) in \JGrenoble\j, \JFrance\j, a US-\JFrance\j-Canada research team has used the ESRF's ultra-intense x-rays to produce Compton scattering from ice crystals. The effect is named after its discoverer, \B\1Arthur Holly Compton\b\c. When a \Jphoton\j encounters matter containing electrons, some of the \Jphoton\j's \Jenergy\j is transferred to an \Jelectron\j, and the \Jphoton\j is re-emitted as an x-ray travelling in a different direction, with a lower \Jenergy\j. If we measure many Compton-scattered photons, we can deduce a great deal about the electrons which were struck.\p
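The underlying relation is textbook physics rather than anything peculiar to this experiment: the wavelength shift of the scattered \Jphoton\j depends only on the scattering angle,\p
\[ \lambda' - \lambda = \frac{h}{m_{e} c} \left( 1 - \cos\theta \right), \]
where h/m\De\dc is the Compton wavelength of the \Jelectron\j, about 2.4 picometres. Because bound electrons are themselves moving, the scattered x-rays are also Doppler-broadened, and it is this broadening, the "Compton profile", that carries the information about the electrons' momentum distribution.\p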
The most important thing about the Compton scattering method is that it gives direct information on the low-\Jenergy\j state of an \Jelectron\j in an atom or molecule. This lets us reconstruct the \Jelectron\j's "ground-state wave-function", the complete quantum-mechanical description of an \Jelectron\j in a \Jhydrogen\j bond in its lowest-\Jenergy\j state. \p
The aim of the experiment was to detect any overlapping of the \Jelectron\j waves in the sigma and \Jhydrogen\j bonding sites. The researchers looked at ice crystals, rather than \Jwater\j, because in solid ice, the \Jhydrogen\j bonds are pointing in only four different directions, as the \Jwater\j molecules are frozen in a repetitive pattern. Even then, the effect was expected to be small, because only 10% of all the electrons in ice are associated with the \Jhydrogen\j bond or sigma bond. The rest are electrons which do not form bonds. The Compton scattering provides information on all of the electrons, not just those involved in bonds, and this tends to bury the information.\p
This is where the ESRF comes in: it can produce very intense beams of x-ray photons, so that there are enough Compton scattering events to allow useful statistical analysis. The experimenters also aimed x-rays at the ice from different directions, and the differences they found allowed them to eliminate the effects of the non-bonding electrons.\p
The result: they found wavelike fringes corresponding to interference between the electrons on neighbouring sigma and \Jhydrogen\j bonding sites, and Pauling's prediction was confirmed. \p
The future applications of this work have physicists salivating. The new knowledge is likely to change the way nanotechnologists work, because many of the "self-assembling" modules they use rely on \Jhydrogen\j bonds to build themselves, but it also has applications in other areas. We may come to understand the structure of DNA better, and the methods used may also be applied to materials which are free of \Jhydrogen\j bonds, such as superconductors and switchable metal-insulator devices, in which one can control the amount of quantum overlap between electrons in neighbouring atomic sites. \p
\BKey names\b: E. D. Isaacs, A. Shukla, P.M. Platzman, D. R. Hamann, B. Barbiellini, and C. A. Tulk \p
#
"Virtual sculptures",954,0,0,0
(Jan '99)
Art lovers can already look at the world's great paintings on the \JInternet\j, but how do you call up a sculpture on a computer, and view it from any angle, or zoom in on it? Up until now, the whole idea would have been dismissed as impossible, but very soon, the Digital \JMichelangelo\j Project is set to change all that.\p
The idea is to harness the magic of advanced computer graphics to let art lovers examine highly realistic, three-dimensional images of the statues of \B\1Michelangelo\b\c on display screens at their local art museum, or even on their personal computers. Ideally, users will be able to look at a sculpture from different angles, zoom in on chisel marks, and even change the lighting conditions to see how it affects the appearance of a three-dimensional object.\p
The only snag: it may be a while before you can do this on the \JInternet\j, because the whole process will be very information-intensive, with huge amounts of data required to support even one statue. Most of the works are likely to involve terabytes of data.\p
The Digital \JMichelangelo\j Project looks as though it will not only make virtual copies of \JMichelangelo\j's statues available to the world: it should also set a new standard for the computer representation of three-dimensional physical objects.\p
Developers were set to begin scanning sculptures in the Gallerie dell'Academia in \B\1Florence\b\c during January, moving on to the Medici Chapel in February, and then on to \JRome\j. The scan data will be converted into an accurate, three-dimensional computer model. They will then create a matching overlay that contains additional data about surface colour and characteristics required to reproduce its image accurately.\p
The main research work comes from Stanford's Computer Graphics Laboratory, which created a "3-D \Jfax\j" in 1996, when a small statue was scanned and sent electronically to a stereolithography plant in Southern \JCalifornia\j that used the information to produce a detailed plastic replica of the original statue. One interesting aspect may come from electronic restoration of statues like \JMichelangelo\j's "Moses" in San Pietro in Vincoli, \JRome\j. The beard of this statue has been worn down by the reverent touch of generations of Jewish visitors. In the Digital \JMichelangelo\j Project, these damaged areas could be reconstructed.\p
The data might also be used to show how \JMichelangelo\j solved the challenge of placing a grown man in a woman's lap when he created his "\B\1Pietà\b\c". The answer was to give Mary very long legs, and once the data are stored, animation could be used to have Mary stand up, revealing the distortion that he used.\p
Following on the 1996 "3-D \Jfax\j", the method could also allow accurate replicas to be made to any scale, and the developers point out that many ancient Greek statues were originally painted in bright colours, which could be reinstated digitally.\p
Historical trivia: reference books usually report that \JMichelangelo\j's giant statue, "\B\1David\b\c", is 434 cm (14 feet 3 inches) tall, but the actual height is 517 cm (17 feet), according to the project's early measurements. This created a problem for the project, as the gantry of the larger of their two scanners was sized on the reported height, and so had to be extended.\p
#
"Apples which don't turn brown",955,0,0,0
(Jan '99)
Researchers at the U.S. Department of Agriculture reported in the \IJournal of Agricultural and Food Chemistry\i in January that they have been able to stop cut apples from browning for five weeks under "normal atmospheric conditions," a significant jump over the current five to seven \Jday\j shelf life for most cut fruit that has been treated against browning. \p
In the past, slices have been preserved by dipping them in ascorbic acid and then storing them at low temperatures. The new treatment is said to involve ". . . combinations of enzymatic inhibitors, reducing agents and anti-microbial compounds containing \Jcalcium\j".\p
#
"Photonic crystals in the news",956,0,0,0
(Jan '99)
Researchers Shawn Lin and Jim Fleming at the US Department of \JEnergy\j's Sandia National Laboratories say they have created a microscopic three-dimensional lattice which confines optical light in the range of wavelengths between 1.35 and 1.95 micrometres. They reported the result in \IOptics Letters\i in early January.\p
The technique they used offers an effective way to bend light entering or emerging from optical cables. With silicon rods just 0.18 micrometres wide, this optical lattice is just a tenth the size of a similar device, operating at infrared wavelengths and reported recently (see \B\1Photonic crystals open the door to faster computing\b\c, September 1998). Just as a semiconductor passes electrons at some energies while holding back others, the photonic crystal is a semiconductor for light, passing some wavelengths while reflecting others.\p
The photonic crystal takes its name from its internal structure, which repeats regularly on a scale comparable with the wavelength of the target light. The internal dimensions are controlled in this way to ensure that the "crystal" has a far greater ability to select desirable wavelengths. It traps light as though reflecting it between mirrors, and allows light to be bent and transmitted in a new direction, with negligible loss. The immediate applications are likely to be in the communications area, but in the future, these photonic crystals might provide a basis for light-based ultrafast computing.\p
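The scale involved follows from the textbook Bragg condition (our gloss, not part of the Sandia report): strong reflection from planes a distance d apart occurs when\p
\[ m\lambda = 2nd\sin\theta, \]
so for first-order reflection at normal incidence, the repeat distance needs to be about half the wavelength of the light as measured inside a material of refractive index n.\p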
A report in \IScience\i in mid-January describes work carried out by Samson Jenekhe and Linda Chen of the University of Rochester. They have been working on photonic crystals which assemble themselves in a process called "hierarchical self-assembly".\p
The crystals are a three-dimensional composite of air and plastic, forming a thin film on a glass slide. Self-assembly starts with polymer molecules of poly(phenylquinoline)-block-\Jpolystyrene\j in solution, and these self-organise into spheres which then come together in a precise, ordered way to form a larger periodic structure. The researchers describe the process as being rather like bricks forming themselves up into a wall, with no template required.\p
The devices cover about a square \Jcentimetre\j, and they are about 30 micrometres thick, which makes them truly three-dimensional. The polymer is encoded to make the particles assemble into the required shapes: once the polymers are prepared, it takes them "just minutes or hours to organise into photonic crystals".\p
#
"Earthquake deaths",957,0,0,0
(Jan '99)
Just a few days before an \B\1earthquake\b\c in \JColombia\j killed an estimated one thousand people, the U.S. Geological Survey National \JEarthquake\j Information Center (NEIC) reported preliminary figures for \Jearthquake\j deaths worldwide in 1998. The total stands at 8928, roughly three times the 1997 total of 2907, and more than twenty times the 1996 total of 419. All of these figures, they noted, were below the long-term average of around 10,000 deaths a year. They also pointed out that even 1994 and 1995 were below average.\p
Two events in the border area of \B\1Afghanistan\b\c and \B\1Tajikistan\b\c accounted for at least 6323 fatalities, though in seismic terms, with magnitudes of less than 7.0, neither was classified as a major \Jearthquake\j, and there were stronger quakes in other parts of the world. The Balleny Islands region, between \JAustralia\j and \JAntarctica\j, had an \Jearthquake\j on March 25, with a strength of 8.3, while the year's other "great \Jearthquake\j" (greater than 8.0) occurred on November 29 in the Ceram Sea, near \B\1Sulawesi\b\c, \JIndonesia\j; it was rated at 8.1.\p
Only ten major earthquakes with magnitudes from 7.0 to 7.9 were recorded in 1998, against an annual average of twenty such events. While there is a public perception that the number of earthquakes is getting greater, this seems to have arisen from better seismological reporting, better news gathering, and from the natural increases in deaths and amounts of property damage which follow on from a growing world population.\p
More information is available on the NEIC Homepage at \Bhttp://wwwneic.cr.usgs.gov/\b\p
#
"Ancient Greek quakes",958,0,0,0
(Jan '99)
Gisela Walberg, an archaeologist, believes she knows what led to the destruction of Midea, one of the three late Bronze Age citadels of the Greek \B\1Argolid\b\c, the other better-known ones being \B\1Tiryns\b\c and \B\1Mycenae\b\c. All three were destroyed somewhere around 1200 BC, and the cause is still a matter for debate, but Walberg told the Archaeological Institute of America in late December about her excavations at Midea, some 90 km (55 miles) south-west of \JAthens\j.\p
At Tiryns, a new palace main hall (or megaron) was built below the site's main hill after the destruction, while at \JMycenae\j, there is no evidence of rebuilding at all. Walberg has discovered that the Midea megaron was rebuilt on the same spot and restored to its original state after just a brief interruption, with only the interior plans changed. Excavation evidence from Tiryns supported the notion that an \Jearthquake\j destroyed all three sites, and the fact that the megaron was rebuilt in the same fashion makes it likely that the same "owners", rather than invaders, restored it. As well, Walberg found evidence of collapsed walls and fire that points to an \Jearthquake\j as the cause.\p
#
"Questioning the Cambrian explosion",959,0,0,0
(Jan '99)
About 530 to 540 million years ago, most of the phyla of modern animal groups appeared quite suddenly in the \Jfossil\j record. It has always seemed quite reasonable to assume that this sudden appearance, called the "Cambrian Explosion", was the result of a massive divergence of life forms shortly before, perhaps 600 million years ago. (See \B\1Cambrian explosion\b\c, July 1997)\p
While the \Jfossil\j record seems to support this assumption, it now appears that the actual process of divergence may have started some 600 million years earlier still. So the Cambrian explosion was in fact well and truly Pre-Cambrian. The evidence for this revision comes from a large-scale study of gene sequences in a large sample of animals, published in \IScience\i in late January.\p
The researchers set out to identify gene sequences which appear to have accumulated mutations at a constant rate over time. Some mutations are neutral, affecting bases in the DNA which do not change the end result, and these mutations should simply be a measure of time, unaffected by evolutionary forces. Other mutations may have some evolutionary effect, but may still provide "clock data", and with enough genes, a clear pattern ought to emerge, telling us how far back certain groups separated out. This lets us search back into the time when life forms did not leave clear fossils, well back into the Pre-Cambrian.\p
In all, they identified 75 nuclear genes which had accumulated mutations at a fairly constant rate relative to one another during their \Jevolution\j. The genes were from species representing three major taxonomic groups, or phyla, of animals (arthropods, chordates, and nematodes), plus plants and fungi. The actual \Jcalibration\j point for their molecular clocks was an evolutionary event well established by \Jfossil\j studies, the divergence of birds and mammals about 310 million years ago, the time when the reptilian ancestors of these two groups first appear in the \Jfossil\j record.\p
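As a sketch of how such a \Jcalibration\j works, here is a minimal molecular-clock calculation in Python. The distances are hypothetical and the method is deliberately simplified; the actual study averaged evidence across 75 genes:\p
CALIBRATION_TIME_MYA = 310.0   # bird-mammal divergence, fixed by the fossil record

def divergence_time(distance, calibration_distance):
    """Estimate a divergence date (millions of years ago) from sequence distances.

    Both arguments are substitutions per site separating two lineages; each
    pair accumulates change along two branches, so the factor of two cancels
    when we take the ratio.
    """
    rate = calibration_distance / CALIBRATION_TIME_MYA   # subs/site per My
    return distance / rate

# Hypothetical figures: 0.62 subs/site between birds and mammals for some
# gene, and 2.4 subs/site between arthropods and chordates for the same gene.
print(divergence_time(2.4, 0.62))   # about 1200 million years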
Based on this information, the researchers have not only pushed back the Cambrian explosion, they have also set a date for the time when three kingdoms, animals, plants, and fungi, first diverged, about 1.6 billion years ago.\p
If these results are correct, then we need to ask why the fossils did not appear until the Cambrian. Most probably, the answer is that something changed at that time, to favour hard body parts that would fossilise. The cause may have been changes in the \Jearth\j's \Jatmosphere\j, or it may have been the development of eyes in some groups, making them more efficient as predators, or perhaps the hard-bodied ancestors were so tiny that their fossils have not yet been found -- only time can tell.\p
#
"Reconstructing the theropods",960,0,0,0
(Jan '99)
A \IScience\i report in late January describes the conclusions reached by a group of Oregon researchers who have been looking at a \Jfossil\j they call the world's best-preserved theropod, a meat-eating \B\1dinosaur\b\c. The animals appear to have been cold-blooded, suggesting that they were not likely, after all, to be the ancestors of the birds. More importantly, the high quality of the \Jfossil\j means that the researchers have been able to "hang some meat" on the \Jfossil\j, building up a reconstruction of its actual body. From this, they conclude that the theropods had low metabolic rates while at rest, which is an excellent strategy for conserving \Jenergy\j.\p
On the other hand, its enhanced lung ventilation capacity gave it the potential for the type of aggressive, extended activity typical of birds and mammals. So the theropods were fast and dangerous, not at all the sort of animal you would want to meet in a dark alley. The researchers are just the second group to study the remains of a baby \IScipionyx samniticus\i, a meat-eater which lived about 110 million years ago and bore some similarity to a velociraptor (see \B\1Dinosaurs in the news\b\c, March 1998). As well as an intact skeleton, the \Jfossil\j shows remnants of liver, large \Jintestine\j, \Jwindpipe\j and even muscles, probably because the animal died in a shallow, still, saltwater marsh which preserved its structure incredibly well.\p
The \Jfossil\j reveals a divided body cavity, with the lungs and heart in one section, and the liver and intestines in the other. This is only seen in living animals which use an active diaphragm to help ventilate their lungs, such as mammals and crocodilians. In this \Jdinosaur\j, the liver helped \Jpump\j the lungs, acting as a "hepatic piston", probably giving it the same high levels of oxygen exchange, and fast-paced activity as some of the mammals. \p
This system works well in warm climates, but the advantage would have been lost when the world's climate cooled. At the same time, the hepatic piston system is unlike the internal form of the birds, and the animal's nasal turbinate bones are unlike those of warm-blooded animals.\p
#
"Deep treasure",961,0,0,0
(Jan '99)
Scientists from the University of \JMichigan\j and \JUtrecht\j University have reported in \INature\i that they have located a slab of former ocean-bed, perhaps 200 million years old, some 2500 km (1550 miles) below the \Jearth\j's surface in \B\1Siberia\b\c. Now the slab lies at the bottom of the layer of superheated rock which makes up the \JEarth\j's mantle, in a place where it can only be reached by seismic tomographic imaging, below \B\1Lake Baikal\b\c.\p
The slab was forced down, or subducted, when the Siberian and Mongolian plates converged between 200 million and 150 million years ago, sinking at the rate of 1 cm (0.4 inch) per year. For the first time, geologists have proof that subducted slabs do indeed reach the bottom of the mantle, that parts of the eight major and dozen minor plates which make up the \Jearth\j's surface can sink down, deep beneath the \Jplanet\j.\p
The researchers chose the Lake Baikal area for their study because it is the site of an ancient, well-documented subduction zone, and also because it is the site of many earthquakes, as well as having an extensive network of seismic monitoring stations. \p
Seismic tomographic imaging works by measuring the time taken for \Jearthquake\j sound waves to travel through different rocks, because the sound waves travel more slowly through warmer rock. The measurements can be converted by a supercomputer into an image similar to a CAT scan, in which the cooler subducted rocks show up in surprisingly clear contrast to the older mantle rocks.\p
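In outline, the inversion behind such images is a large linear least-squares problem: each travel time is the sum, along the ray path, of path length times "slowness" (inverse velocity) in each cell of a model grid. A toy sketch, ours rather than the researchers' code:\p
# Toy travel-time tomography: t = A @ s, where A[i, j] is the path length
# of ray i inside cell j, and s is the slowness (1/velocity) of each cell.
# Real seismic tomography works the same way, at vastly larger scale.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_rays = 9, 40
A = rng.uniform(0.0, 10.0, size=(n_rays, n_cells))   # ray path lengths (km)

s_true = np.full(n_cells, 1 / 8.0)    # background: 8 km/s mantle rock
s_true[4] = 1 / 8.4                   # one cold, seismically fast cell (the slab)

t_obs = A @ s_true                    # observed travel times (s)

# Least-squares inversion for the slowness model
s_est, *_ = np.linalg.lstsq(A, t_obs, rcond=None)
print(np.round(1 / s_est, 2))         # recovered velocities, km/s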
The position of the slab, almost directly below the site of its original subduction, indicates that the Siberian continental plate has moved very little in the last 150 million years. \p
#
"Formaldehyde emissions",962,0,0,0
(Jan '99)
\B\1Formaldehyde\b\c (methanal, CH\D2\dO) is a suspected human \B\1carcinogen\b\c, yet it is used in the manufacture of a wide range of products such as building materials, permanent press clothing, paints, floor finishes, wallpaper and fingernail polishes. Even low level exposure to \Jformaldehyde\j can cause irritation of the eyes and mucous membranes, and long-term exposure may cause respiratory difficulty, \Jeczema\j and \Jhypersensitivity\j. So: are \Jformaldehyde\j-based products safe to use?\p
The answer, according to a report in January's issue of \IEnvironmental Science & Technology\i, seems to be "no". In some cases, these products emit more \Jformaldehyde\j than uncovered particle-boards and veneer plywood, always regarded as the most serious \Jformaldehyde\j problem. Acid-cured floor finishes, which are applied to wooden floors, showed levels up to 1000 times greater than from actual wood products, and even after 24 hours drying, the levels were still five to ten times greater than "the worst wood product".\p
Fingernail hardeners and polishes are also high emitters, but the areas covered are much smaller, and the emissions from new clothing are on a par with those from particle-boards.\p
Before readers get too alarmed, they should note that wood products emit at a steady rate for months on end, while emissions from cosmetics, paints and wallpaper drop rapidly within the first hours after application. Of course, the user is more likely to be exposed to paint, wallpaper, clothes and cosmetics at close range, while the wood emissions are spread more evenly through the house. The quick fix: wash new clothing when you buy it, and ensure proper ventilation while using the other products.\p
#
"A cause for war",963,0,0,0
(Jan '99)
Humans fight wars over the strangest things, or so it seems on the surface. Such strange causes for war include \B\1the War of Jenkins' Ear\b\c, the eruption of World War I over the assassination of an Archduke, and wars that began because newspaper proprietors wanted something to write about, such as the \B\1\JCrimean war\j\b\c and the \B\1Spanish-American War\b\c. Below the surface, though, most wars are about possession of resources: scarce resources such as raw materials or oil, or even farming land. \p
Now a new scarcity has introduced a new cause for war in the future, according to Klaus Toepfer, director-general of the United Nations Environment Program. Writing in the scientific journal \IEnvironmental Science & Technology\i in January, Toepfer argues that the lack of potable \Jwater\j, good quality drinking \Jwater\j, will lead to future conflicts. This view, he says, is also held by former U.N. Secretary-General \B\1Boutros Boutros-Ghali\b\c.\p
The solution, he suggests, is to manage existing \Jwater\j stocks better, to share them equitably, and to seek new ways of managing and promoting \Jwater\j conservation. Toepfer took up his current position with the UN in February 1998. He is a former minister of the environment for \JGermany\j. \p
#
"What is a quarantine system worth?",964,0,0,0
(Jan '99)
The answer for the United States is that a \B\1quarantine\b\c system might be worth a staggering $123 billion a year, based on a report to the AAAS in early January, when Cornell University ecologist David Pimentel evaluated the damage caused in the USA by alien species which have found their way there from other parts of the world.\p
Aside from the economic costs, outlined below, more than 40% of species on the U.S. Department of the Interior's endangered or threatened species lists are at risk primarily because of non-indigenous species. At the same time, some 98% of US food production, valued at around $500 billion, comes from introduced species, so the picture is not all grim when it comes to exotic plants and animals. One thing that the crop species and the pest species have in common: they were mainly introduced deliberately.\p
The \B\1mongoose\b\c, brought into Puerto Rico and Hawaii (and also into other West Indian islands as well as \JFiji\j) in the late 1800s, was intended to kill pests in the sugar cane. Instead, it has preyed on (and wiped out) reptiles, amphibians, and ground-nesting birds. It also carries \B\1rabies\b\c.\p
The \B\1rat\b\c is a slightly different story. The introduced \IRattus rattus\i (the European, black or tree rat) and \IRattus norvegicus\i (the Asiatic, \JNorway\j or brown rat) escaped from ships, and there are now a billion of them in the United States. That, at least, is a pest that a serious quarantine service might have kept out.\p
Pimentel said that it was too late to send the pest species back, and commented that most of the non-indigenous species have arrived only in the last 70 years. The US, he concluded, would be lucky to escape further damage. He finished his paper with a detailed list of the most damaging groups, and later released this information onto the \JInternet\j:\p
\BAnnual economic costs of some introduced species in the United States\p
\bFrom David Pimentel, Lori Lach, Rodolfo Zuniga and Doug Morrison, College of Agriculture and Life Sciences, Cornell University \p
Weeds in crops $29,000,000,000 \p
Diseases in crops $23,500,000,000 \p
Rats $19,000,000,000 \p
Insects in crops $14,500,000,000 \p
Weeds in forages, gardens, etc. $6,500,000,000 \p
Human diseases $6,500,000,000 \p
Cats $6,000,000,000 \p
Plant diseases in gardens $3,000,000,000 \p
Zebra mussels $3,000,000,000 \p
Insects in gardens $2,500,000,000 \p
Insects in forests $2,100,000,000 \p
Birds $2,100,000,000 \p
Asiatic clam $1,000,000,000 \p
Fishes $1,000,000,000 \p
Other plants $250,000,000 \p
Pigs $200,000,000 \p
Dogs $136,000,000 \p
Dutch Elm disease $100,000,000 \p
\JMongoose\j $50,000,000 \p
Green crab $44,000,000 \p
Gypsy moth $22,000,000 \p
Fire ants $10,000,000 \p
Horses and burros $5,000,000 \p
Reptiles and amphibians $604,000 \p
#
"A bigger Arctic ozone hole in 1997",965,0,0,0
(Jan '99)
\JArctic\j \B\1ozone\b\c depletion, to give the "hole" its correct name, has always been less than ozone depletion in the Antarctic, but a report in the \IJournal of Geophysical Research\i in January says ozone losses reached record high levels during the northern winters of 1995-96 and 1996-97. The second winter also featured a very long-lived \JArctic\j polar stratospheric vortex. This is a low pressure system occurring between 14 and 35 kilometres (9-22 miles) above the \Jearth\j. The vortex forms during the polar winter, when the lack of sunlight lowers the \Jtemperature\j and produces a circular wind system. The vortex is almost certainly a key step in setting up the chemical conditions which lead to ozone loss.\p
Typically, the Antarctic is colder than the \JArctic\j, so the stratospheric clouds which initiate ozone loss are more common there. The vortex in 1996-97 produced low temperatures and ozone losses at levels similar to those of the Antarctic in the early 1980s. Up to half the ozone was lost, compared with long-term averages. The vortex lasted into May, keeping the ozone hole going until then, and raising ground-level ultraviolet radiation in northern Europe.\p
Two chemical processes were involved. The better-known \Jchlorine\j activation (mainly from \B\1CFCs\b\c) in the polar stratospheric clouds in winter was replaced by other reactions powered by nitric oxide (NO) and \Jnitrogen\j dioxide (NO\D2\d), thanks to the persistent vortex.\p
The researchers say that the long-lasting winter vortices may persist, if they are caused by changing climatic conditions. This would lead to ever greater low-ozone air masses, driven by early spring halogen chemistry or by summertime \Jnitrogen\j oxide (NOx) chemistry, leading to significant effects in the northern hemisphere.\p
The note of alarm being sounded by these better-informed authors contrasts interestingly with the complacent attitudes encountered by your (southern hemisphere) reporter in Edinburgh in 1993, when the ozone hole was dismissed by a geologist as something that need only worry a few penguins in \B\1Antarctica\b\c.\p
#
"Twenty-year temperature record revealed",966,0,0,0
(Jan '99)
A January meeting of the American Meteorological Society (its 79th) was given an adjusted version of the \Jtemperature\j records of the \Jearth\j's \Jatmosphere\j for the past twenty years. Collected by nine different satellites in polar orbits, all monitoring the microwave emissions from oxygen in the \Jatmosphere\j, the data present special problems when they are combined.\p
Imagine trying to develop a single record from nine different thermometers, all made with slightly different technology, when you cannot get the instruments back to check their \Jcalibration\j. Imagine the extra problems when the satellites carrying the instruments begin to drift in their orbits, and you can begin to see some of the problems the scientists faced.\p
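One standard remedy, and almost certainly a part of any such analysis, is to use periods when two satellites flew together to estimate and remove their relative bias before splicing their records. A hedged sketch with made-up numbers, not the actual processing:\p
# Illustrative merging of two overlapping satellite temperature records.
# Each instrument has an unknown constant bias; the overlap period lets
# us estimate and remove the difference before splicing the series.
import numpy as np

rng = np.random.default_rng(1)
truth = 0.01 * np.arange(120) + rng.normal(0, 0.05, 120)  # 120 months

sat_a = truth[:80] + 0.30    # satellite A: months 0-79, bias +0.30
sat_b = truth[60:] - 0.15    # satellite B: months 60-119, bias -0.15

# Months 60-79 are observed by both; their mean difference estimates
# the relative bias between the two instruments.
offset = np.mean(sat_a[60:80] - sat_b[:20])
sat_b_adjusted = sat_b + offset

# Splice into one 120-month record in satellite A's reference frame.
merged = np.concatenate([sat_a, sat_b_adjusted[20:]])
print(f"estimated offset: {offset:.3f} (true value 0.45)")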
The conclusions: the lower \Jtroposphere\j (the lowest 8 km or 5 miles) showed neither warming nor cooling over the period from 1979 to 1997, while the lower \Jstratosphere\j (around 15-20 km or 9-12 miles above the \Jearth\j) actually cooled by about 0.6 degrees C per decade.\p
The 1998 El Niño event lifted average temperatures in April and May of 1998 to record levels above the base period mean of 1982-91. The other two events over the study period were both associated with volcanic eruptions: Mt. Pinatubo in 1991 and El Chichon in 1983, but the 1998 event was not preceded by any eruption.\p
The cooling of the lower \Jstratosphere\j is consistent with ozone depletion, but what are we to make of the tropospheric data? "Warming sceptics" suggest that this is clear evidence that there is no global warming, but others are not so sure. Surface \Jtemperature\j measurements in 1998 showed it to be the warmest year of the century, following gradual warming over the past twenty years.\p
#
"SOHO in more trouble",967,0,0,0
(Jan '99)
Just months after miraculously rescuing the Solar and Heliospheric Observatory (SOHO) from a near-fatal accident, NASA researchers reported in early January that the \Jsatellite\j's last stabilising \Jgyroscope\j had failed. \p
After producing stunning images of the \Jsun\j and excellent data about its workings, the joint U.S.-European craft went out of control in June 1998, and it took five months to regain control. The instruments on the craft survived the mishap, but \Jtemperature\j swings caused fine wires to break in two of the three gyroscopes. The last \Jgyroscope\j gave out on December 21, and SOHO is now having to burn fuel in order to maintain its position. At the present rate, it will use up a ten-year supply of fuel in about six months, leaving it out of control. Engineers are working on a software work-around to the problem.\p
#
"Major NEAR engine burn completed",968,0,0,0
(Jan '99)
The NEAR \Jspacecraft\j is set for a February 2000 rendezvous with 433 Eros. This will come after a 24-minute burn of the large bipropellant engine on January 3, to increase the \Jspacecraft\j's speed. The asteroid overtook the \Jspacecraft\j during December, when the craft went out of control as a previously planned burn was aborted. At the time of the January burn, NEAR was more than half a million miles (900 000 kilometres) away from the asteroid, and falling further behind. Now the craft is travelling 2100 mph (940 metres/sec) faster than before.\p
As the asteroid zipped past, the craft was able to take some pictures, allowing planners to work out "where not to point the camera", meaning the large shadowed areas of the surface which are visible in the images.\p
#
"Telomerase does not make cells cancerous",969,0,0,0
(Jan '99)
The January issue of \INature \JGenetics\j\i carries a report that cells immortalised with telomerase, now more than 220 generations past their normal life span of 75 to 80 divisions, remain young and vigorous. This means that the way is now clear to start applying telomerase in medical treatments.\p
Obviously, there will need to be further safety checks, say researchers in the field, but they are now confident that telomerase can be used to treat conditions such as aging and cancer. The cells in the culture, according to the report, show none of the characteristics usually associated with cancer cells, such as \Jchromosome\j instability, serum-independent growth, loss of contact inhibition and loss of cell-cycle checkpoint controls. \p
A second report, from researchers at the Geron Corporation, indicates that the cells with introduced telomerase do not produce tumours in mice. Most normal cells do not contain telomerase, and these cells have a finite life span, while telomerase is found in around 90% of cancer cells, so it is important to establish that telomerase does not behave as an oncogene, a substance which can start new cancers in some way.\p
Instead, it appears that the abnormalities seen in cancer cells are due to other mutations, and telomerase merely allows the cells to keep dividing.\p
\BKey names\b: Jerry Shay, Choy-Pik Chiu \p
#
"February, 1999 Science Review",970,0,0,0
\JA predictable volcano\j
\JNew Indian Ocean eddies found\j
\JNew catalogue of mouse genes and gene fragments\j
\JHow smart are snakes?\j
\JMalaria and AIDS vaccine advances\j
\JArteries, knee cartilage, anything\j
\JWhat is killing America's elephants?\j
\JVitamins in the news\j
\JDo wine drinkers just happen to eat right? \j
\JPutting off dialysis with a low-protein diet\j
\JMaking drinking water safe\j
\JThe return of Minamata\j
\JWorld's smallest Web server \j
\JMaking tasty soy cereals and snacks a reality\j
\JSick buildings and a household fungus\j
\JA better magnet for better fusion\j
\JTempering glass to halt cracks \j
\JThe Australia Prize goes to solar cell researchers\j
\JA grim prospect\j
\JThe coming Boom Doom for Western society\j
\JPhysics societies win in court\j
\JMartian dunes?\j
\JAntigravity and NASA\j
\JSOHO still a source of new science\j
\JA new asteroid near the earth\j
\JSorting out the molecular clock\j
\JNEAR/Eros update\j
\JProject Stardust\j
\JAntibiotic resistance to vancomycin\j
\JWorld population to hit 6 billion soon\j
\JA sixth sense for feeding under wet sand\j
\JGlenn Seaborg dies at 86\j
#
"A predictable volcano",971,0,0,0
(Feb '99)
A group of British researchers says that the volcanic ash covering much of \1Montserrat\c, a Caribbean island, may pose a serious health risk to people who have been exposed to it. The Soufriere Hills \Jvolcano\j has been pumping out ash in large plumes since 1995, and the researchers say it contains large amounts of cristobalite, a \1silica\c mineral which can cause the lung disease, silicosis.\p
Their report was published in \IScience\i in late February, and it was accompanied by a second report, describing the work of a combined British-American team, also studying the same \Jvolcano\j. This group believes that it may be possible to predict the short-term activity of the \Jvolcano\j. \p
This predictability might make it possible to help some of Montserrat's inhabitants return from the north of the island to the affected southern parts, but if the ash is as dangerous as the first study suggests, it may never be safe to reclaim the south of the island.\p
When small grains of cristobalite are inhaled, the mineral has been known to cause \1silicosis\c, a disease which is typically considered a hazard for workers in occupations such as \Jquartz\j mining and other stone cutting, blasting and drilling work.\p
Once the tiny particles of crystalline \Jsilica\j reach the \Jalveoli\j, they become embedded in the lungs. As more particles settle, they cause thickening and scarring of the tissue, and eventually the lungs cannot supply oxygen to the blood efficiently. The effects are slow and cumulative, so it is not possible to predict what the end result of existing exposures at Montserrat may finally be. \p
The Soufriere Hills \Jvolcano\j produces ash in two ways. The ash may be formed when \Jmagma\j erupts violently from the interior of the \Jvolcano\j, or when \Jlava\j builds up into a dome which grows until the sides of the dome become too steep. Then the sides collapse, setting off violent (and more frequent) eruptions. \p
This second type of eruption has produced surprisingly high levels of cristobalite particles in the ash, suggesting that the crystals were growing inside the \Jlava\j dome between eruptions and were further concentrated in ash eruptions. A more general principle here appears to be that volcanoes with \Jlava\j domes may pose a greater risk for silicosis than other types of volcanoes. \p
Worse, it looks as though human activity may increase the number of dangerous particles in the air. Like snowflakes, the ash particles clump together, but when ash is swept away, or walked on or driven over, the clumps are broken up, and are more likely to be thrown into the air again, where they may be breathed.\p
The second report deals with the cyclic behaviour of the Soufriere Hills \Jvolcano\j in particular, but the details may also be applicable to other andesite volcanoes. While the \Jlava\j dome was growing between 1996 and 1998, the researchers tracked the cycles of earthquakes, changes in steepness of the dome walls, degassing, and explosive eruptions. \p
They then modeled the cycles, and worked out the following sequence: first, there is a system of open conduits, tubes that allow \Jmagma\j to rise. These conduits are plugged by congealing \Jmagma\j, which causes the pressure below to build up, making the dome inflate. \p
The \Jmagma\j becomes sticky after some of the \Jwater\j in the melted rock boils off, causing mineral crystals to start forming. Partially crystallized \Jmagma\j is much more viscous than uncrystallized \Jmagma\j. As a result, the thick, sticky \Jmagma\j forms a plug in the upper part of the \Jvolcano\j's conduit.\p
The \Jinflation\j of the dome then sets off a series of earthquakes, until the increasing pressure forces the plug out, letting the gas escape. \JMagma\j is then thrown out rapidly, commonly causing the collapse of the surface \Jlava\j mound and triggering ash hurricanes. Then a new batch of \Jmagma\j rises to form a new plug, starting the process all over again.\p
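The sequence behaves like a classic relaxation oscillator, and a toy model (ours, purely illustrative, in arbitrary units) reproduces the rhythm of pressure slowly building and abruptly venting:\p
# Toy relaxation-oscillator model of the Soufriere Hills cycle: steady
# magma influx raises conduit pressure until the plug fails, pressure
# vents, a new plug forms, and the cycle repeats. Units are arbitrary.
INFLUX = 1.0          # pressure build-up per hour from rising magma
FAIL_AT = 30.0        # pressure at which the plug is forced out
RESIDUAL = 5.0        # pressure remaining after the gas escapes

pressure, events = RESIDUAL, []
for hour in range(200):
    pressure += INFLUX
    if pressure >= FAIL_AT:       # plug fails: eruption, ash venting
        events.append(hour)
        pressure = RESIDUAL       # a new plug starts to congeal

print(events)   # evenly spaced eruptions, here every 25 hours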
The discovery was partly a matter of luck, as the team had just the right equipment in place to monitor the changeover from a steady \Jmagma\j flow to a cyclical one. Instead of trying to reconstruct the changes later, they were able to watch the changes happening in real time. It was also partly a matter of the researchers' developing skills in predicting where and when the \Jvolcano\j would erupt, allowing them to put instruments in the path of the eruptions.\p
The last very large eruption occurred on December 26, 1997, when the south side of the whole \Jvolcano\j collapsed, and an explosive blast completely destroyed the two towns on that side of the island. But even if the area now seems to be more or less safe from a seismic viewpoint, the new information on silicosis risks suggests that the islanders may find themselves locked out of their homes forever.\p
\BKey names:\b Peter Baxter, Barry Voight \p
#
"New Indian Ocean eddies found",972,0,0,0
(Feb '99)
A huge ocean eddy system, "larger than \JTasmania\j" (about the size of Maine, for American readers) has been located in the \1Indian Ocean\c by scientists working for \JAustralia\j's CSIRO and the Scripps Institution of \JOceanography\j. The eddies are a turbulent system, reaching down 1 km (0.6 miles) into the ocean, and spinning away from the South Equatorial Current, which is fed by waters from the northern and southern Indian Oceans, as well as \Jwater\j flowing from the Pacific Ocean through the Indonesian archipelago. The current flows across the Indian Ocean towards \JAfrica\j, and it carries as much \Jwater\j as 250 Amazon Rivers.\p
The eddies serve to mix the waters, and this alters the characteristics of the ocean in a region which is a source for rainfall across southern and western \JAustralia\j, making the find particularly important. A better understanding of the system is likely to help scientists understand rainfall patterns in \JAustralia\j, and may also help them in unraveling new \1El Niño\c and La Niña features which extend into the Indian Ocean.\p
Scientists were then able to confirm the eddy's existence using observations from the French/US ocean monitoring \Jsatellite\j TOPEX-Poseidon, orbiting 1,300 kilometers (800 miles) above the \Jearth\j's surface. The \Jsatellite\j is fitted with a sophisticated altimeter, so sensitive that it can detect elevation changes of just a few centimeters across thousands of kilometers of ocean surface.\p
\BKey names:\b Susan Wijffels, Nan Bray and Jackson Chong.\p
#
"New catalogue of mouse genes and gene fragments",973,0,0,0
(Feb '99)
A February report in \INature \JGenetics\j\i reveals a brilliant new tool for geneticists working on the biologists' favorite model organism for studying mammalian development, the laboratory mouse. A multi-laboratory project has produced a free, publicly accessible catalogue, an \JInternet\j \Jdatabase\j of mouse gene fragments which will " ... help ensure that the mouse remains an important model in the genomic era."\p
Located on the \JInternet\j at \Bhttp://genome.wustl.edu/est/mouse_esthmpg.html\b, the \Jdatabase\j should also be a valuable tool for interpreting and comparing the \Jgenome\j sequences of mouse and human, as vast stretches of \Jchromosome\j sequence are worked out by ever more powerful sequencing systems. The \Jdatabase\j provides \Iexpressed sequence tags\i (ESTs), fragments of sequences from messenger RNA molecules, which reflect the underlying genes. ESTs provide quick access to the genes, and the \Jdatabase\j includes 93 per cent of all mouse ESTs available in the public domain.\p
Today, researchers routinely create "knock-outs" in yeast and in \Jnematode\j worms, model organisms whose genomes are now thoroughly sequenced. The idea is to "knock out" a gene, stopping it from operating, so researchers can see what is missing in an organism which lacks that gene, and so link the gene to a specific function. Once the mouse \Jgenome\j is similarly sequenced, geneticists may be able to inactivate selected mouse genes to try to determine their functions. The capability is there now, but using knock-outs is not yet a routine process in mouse \Jgenetics\j.\p
With some 360,000 ESTs in the \Jdatabase\j, the next task will be to try to discover just how many genes are covered: the best estimate is that a mouse has fewer than 100,000 genes, so presumably many of the ESTs represent different portions of the same gene. In the longer term, researchers will need the entire mouse \Jgenome\j as well, but when that is available, the ESTs will be a massive help in analyzing the structure and organization of the \Jgenome\j.\p
\BKey name:\b Marco Marra.\p
#
"How smart are snakes?",974,0,0,0
(Feb '99)
Most students of \1behaviorism\c who have bothered to enquire are convinced that snakes are unintelligent animals, unable to carry out even simple tasks. But according to David Holtzmann, an American neuroscientist, people may have set the snakes tasks which failed, in snake terms, to make sense. Unlike rats, snakes did badly in mazes, but that was because the snakes were not interested in the end result. Give snakes a goal they want, shelter from bright lights, for example, and it can be a different story.\p
In the January issue of \IAnimal Behaviour,\i Holtzmann reports that snakes have a much greater capacity for learning than earlier studies suggested. He set 24 captive-bred corn snakes (species \IElaphe guttata guttata\i) the task of escaping from a black plastic tub the size of a child's wading pool. Various visual (sight) and tactile (touch) clues were set up to help the snakes find their goal: holes in the tub's bottom that offer a dark, cosy spot to hide.\p
Some of the snakes took 700 seconds to find a hole on the first \Jday\j, but this came down to 400 seconds on the fourth \Jday\j, with some snakes finding a refuge in as little as 30 seconds. In other words, instead of confronting the snakes with an unrealistic task, Holtzmann gave them a problem that they might easily encounter in real life.\p
Young snakes, those up to three years old, appear to be more adaptable and resourceful, using a variety of clues to find their way to the exit, says Holtzmann, while older snakes tend to rely more on visual clues, becoming a bit confused if the brightly colored card marking the exit hole is tampered with. The bright orange and red snakes could not be observed directly during experimental runs, but once fitted with tiny metal foil hats, they could be tracked with a video camera.\p
#
"Malaria and AIDS vaccine advances",975,0,0,0
(Feb '99)
Nearly half a million Ugandans have already died of \1AIDS\c, out of a population of some twenty million. About a million children have been orphaned as a result, since African HIV infections are mainly passed on by heterosexual activity. Current infection levels range from 4-10 per cent in rural areas to 10-25 per cent in urban areas of \JUganda\j.\p
After years of preparation, a Phase 1 trial of an "AIDS vaccine" began in \1Uganda\c in February. The trial is part of a long-term, joint program that began in 1991 to develop a safe and effective vaccine against HIV, the human immunodeficiency virus.\p
The vaccine, ALVAC vCP205, is based on a canarypox virus that cannot cause disease in humans. Canarypox-based HIV vaccines have already been tested in more than 800 volunteers in the United States and \JFrance\j. No serious side effects have been reported.\p
The trial is a small but symbolic step in developing an effective vaccine for \JAfrica\j. The trial, known as HIVNET 007, involves 40 healthy, HIV-negative adults between 18 and 40 years old who are at low risk of becoming infected with HIV. The volunteers, who have been fully informed about the processes, will be randomly assigned to one of three groups. Half of the volunteers will receive the HIV vaccine; ten will serve as controls by receiving a similar experimental canarypox vaccine for \Jrabies\j; and ten additional control individuals will receive a \Jplacebo\j that does not contain any vaccine. \p
Each person will have four injections over six months in a double-blind study, where neither the study participants nor the health professionals involved will know which type of injection each volunteer receives. The study is planned to last for a year, with a further one-year follow-up period.\p
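As a purely illustrative sketch of how such a blinded allocation can be generated (the group sizes are those described above, but this is not the trial's actual randomization procedure):\p
import random

# Illustrative only: allocate 40 volunteers to the three arms described
# above (20 HIV vaccine, 10 rabies-vaccine controls, 10 placebo),
# then hide the assignments behind coded labels.
arms = ["HIV vaccine"] * 20 + ["rabies control"] * 10 + ["placebo"] * 10
random.shuffle(arms)

# Only the unblinding key links a volunteer's code to an arm; neither
# the volunteers nor the clinic staff see this mapping during the study.
unblinding_key = {f"V{i:02d}": arm for i, arm in enumerate(arms, start=1)}
print(list(unblinding_key)[:3])   # e.g. ['V01', 'V02', 'V03']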
The follow-up observations will track reactions to the vaccines and to the tests. They will also look for immune responses directed against HIV itself (neutralizing \Jantibodies\j) or against cells infected with HIV (cytotoxic T lymphocytes, often called CTLs) (see \BAIDS Updates,\b May 1997).\p
The ALVAC vCP205 vaccine should be completely safe, because the vaccine contains just three HIV genes, which by themselves cannot produce an infectious virus. As well, these genes are inserted into a weakened version of the canarypox virus, which serves solely as the gene carrier, or vector, to safely express specific HIV proteins known to make the body develop immune responses against HIV. Most importantly, the canarypox virus cannot reproduce in human cells, so no new canarypox virus particles can be formed.\p
The HIV genes come from the virus strain known as clade B, which is the predominant type of HIV in the United States and Europe, and so the study will be looking for reactions to this strain. The researchers on the trial will also look for immune responses to clades A and D, because these two subtypes cause most HIV infections in \JUganda\j. Recent studies show that CTLs from people naturally infected with clade A or D viruses can recognize clade B viruses in the laboratory, in a process called \Icross-reactivity.\i\p
Laboratory studies suggest that some vaccines of this type can stimulate the formation of "broadly reactive CTLs," and if this sort of cross-reactivity is found in the volunteers, it will be the signal to extend the trials to a larger group of individuals. If there is a low level of cross-reactivity, or none at all, it will be necessary to develop vaccines based on the strains common in \JUganda\j.\p
The work has been sponsored by NIAID (the National Institute of \JAllergy\j and Infectious Diseases), a part of the National Institutes of Health in the USA. More information is available on the NIAID Web site at \Bhttp://www.niaid.nih.gov\b\p
NIAID is also associated with a report of hopeful signs on the \Jmalaria\j front, appearing in the \IProceedings of the National Academy of Sciences USA\i during February. The paper describes a new broad-based \Jmalaria\j vaccine which has shown up well in laboratory tests, though it still has to be tried out on people.\p
The worst malarial attacks are caused by the tiny parasite, \IPlasmodium falciparum,\i which is transmitted to humans by mosquitoes, and then passes through a complex life cycle, making it hard for any one vaccine to stop the whole population of parasites - there always seems to be at least one stage which can avoid the vaccine for one reason or another.\p
The new vaccine combines 21 different segments from nine \IP. falciparum\i proteins to form a single recombinant protein, which has been used to immunize rabbits. Each of the 21 segments, or peptides, was selected because it was recognized by the immune systems of people with \Jmalaria\j, as shown in earlier studies in \JKenya\j. For good measure, the selected peptides target different branches of the immune system: B cells, helper T cells and cytotoxic T lymphocytes (CTLs).\p
In the laboratory, the vaccine makes the immunized rabbits form \Jantibodies\j which recognize the parasite at different stages of the life cycle. The \Jantibodies\j also block \IP. falciparum\i invasion into liver cells and inhibit growth of the organism in blood, and the researchers are now looking at the T-cell responses in the vaccinated animals.\p
Multicomponent vaccines, as these products are called, are effective for two reasons. First, they attack the parasites at a number of stages of the life cycle. More importantly, while a parasite which changes a single surface protein may escape a conventional vaccine, plenty of other weapons remain in a multicomponent vaccine.\p
The next stage involves trials on monkeys and then on humans: if those trials are successful, the researchers anticipate the development of variations of the vaccine targeted at the malarial varieties in different parts of the world.\p
#
"Arteries, knee cartilage, anything",976,0,0,0
(Feb '99)
A new biomaterial, a hydrogel called "Salubria," was announced on the \JInternet\j during February. Developed at the Georgia Institute of Technology, Salubria could help patients needing artery or knee cartilage replacement, and the inventors even hope to see it used to speed the repair of damaged nerves in patients with spinal cord injuries and as the basis for an implantable drug delivery system.\p
It will be five to seven years before the material can gain final approval, and while announcements like this often go nowhere, the product is interesting for the thinking revealed in the reports released on the \JInternet\j. Salubria is biocompatible with body tissue because it attracts \Jwater\j. It is made from an organic polymer, rather than silicone, and it is possible to adjust its mechanical strength, which is enough to stop the material bursting under normal physiological conditions.\p
Even more importantly, Salubria is said to have enough elasticity and compliance that it will pulsate in rhythm with the heart. It has been tested on rats, dogs and sheep, and \Jplatelets\j do not adhere to it in significant quantities. This is important, because \Jplatelets\j \Ido\i adhere to Dacron, which surgeons have used for artery replacement in the \Jabdomen\j and legs since its development in the 1950s. \p
\BKey name:\b David Ku\p
#
"What is killing America's elephants?",977,0,0,0
(Feb '99)
This may not be a question everybody has asked in the past twelve months, but it has been a serious concern in zoo circles in the US, where a dozen young and healthy zoo elephants have died after hemorrhaging from a previously unknown form of herpes virus which apparently jumped from African elephants to the Asian species. \p
That, at least, is the conclusion of a report published in \IScience\i during February. The virus appears to be latent in African elephants, which means that it may be found in other parts of the world in zoo populations, offering a risk to the captive members of the endangered Asian species.\p
The good news is that quick detection and treatment with antiviral drugs can save the sick elephants' lives, once people are aware of the problem. Just 34 Asian elephants were born in zoos in North America from 1983 to 1996. Of these, seven have died from the virus, and two more with incomplete records are suspected to have died from it. Most of the infected elephants were young.\p
The case began with the mysterious death of an elephant called Kumari, followed by a check of old zoo records, and the analysis of tissue samples preserved from the earlier deaths. Three more cases have since been detected in \JCalifornia\j, Missouri and \JFlorida\j, and the antiviral drug famciclovir is now established as a successful treatment.\p
With assistance from scientists in Zimbabwe and South \JAfrica\j, the researchers have shown that the virus is present in blood and tissue samples from healthy African elephants. It appears that the virus is present in, but non-lethal to, wild African elephants. \p
\BKey names:\b Gary Hayward, Laura Richman, Richard Montali\p
#
"Vitamins in the news",978,0,0,0
(Feb '99)
At the start of February, the American Heart Association (AHA) published a new "scientific advisory" in their journal, \ICirculation,\i which says that the current evidence is not strong enough to recommend vitamin pills containing \1antioxidants\c for the general public as a \Jheart disease\j preventative.\p
A number of studies have shown that fruits, vegetables and whole grains which contain \Jantioxidants\j may lower an individual's risk for \Jheart disease\j, but the AHA believes it is still unclear whether antioxidant supplements, taken as vitamin pills, have a similar benefit. It is better, they suggest, to increase the consumption of antioxidant-rich foods such as vegetables, fruits, and whole grains.\p
The logical link between \Jantioxidants\j such as vitamins E, C and beta-carotene (a precursor of vitamin A) on the one hand, and \Jheart disease\j prevention on the other, comes from the belief that oxidants may be important in causing \Jatherosclerosis\j. Yet the same foods high in vitamins C, E, and/or beta-carotene are also lower in saturated fat and \Jcholesterol\j, and higher in fiber, so the link to the vitamins could be a false one. As well, those foods also carry other possibly important \Jnutrients\j such as minerals, flavonoids and other carotenoids.\p
While the evidence is unclear, there seems to be a stronger case for vitamin E supplements where the person has already had a heart attack or stroke, but the links between the vitamin and preventing a first stroke or heart attack appear less reliable.\p
A Johns Hopkins report in the last issue of the \IBritish Medical Journal\i for February indicates that women in \JNepal\j had their risk of death from \Jpregnancy\j lowered by about 40 per cent after taking dietary supplements of vitamin A or beta-carotene, compared with women who did not take the supplements. Mortality rates from \Jpregnancy\j in rural south Asia often run at 50 to 100 times the rates in industrialized countries, so this finding points the way to a major advance.\p
The deaths usually arise from severe bleeding, obstructed labor or infection, with \Jmalnutrition\j, especially vitamin A deficiency, also playing a critical role. Night \Jblindness\j, a condition which is attributed to vitamin A deficiency, often occurs in 10 per cent or more of women during \Jpregnancy\j in the study area, the southeast plains of \JNepal\j, where medical, prenatal and obstetric care are practically non-existent. \p
The women were either given the recommended dietary amounts of vitamin A, or beta-carotene, or a \Jplacebo\j. The report is at pains to stress that the trial's purpose was explained at community meetings, and written consent to participate was obtained from subdistrict leaders during the year prior to starting. Verbal consent to participate, they tell us, was obtained from the women.\p
The study involved more than 44,000 women over three and a half years, during which time just over 20,000 of them had a total of just over 22,000 pregnancies, resulting in 110 maternal deaths. Few of these deaths involved a medical practitioner, so the researchers had to interview the families to discover the causes of death.\p
Mortality rates per 100,000 for the \Jplacebo\j, vitamin A, and beta-carotene supplemented groups were 704, 426, and 361 respectively, compared with a rate in the USA of just 8 per 100,000. \p
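The rates quoted are simple deaths-per-pregnancy figures scaled to 100,000, as this minimal sketch of the arithmetic shows; the trial-wide totals are those given above, while the per-group death counts were not quoted in the report:\p
def maternal_mortality_rate(deaths, pregnancies):
    """Deaths per 100,000 pregnancies."""
    return deaths / pregnancies * 100_000

# Across the whole trial: 110 deaths in just over 22,000 pregnancies.
print(round(maternal_mortality_rate(110, 22_000)))   # about 500

# That overall figure sits between the per-group rates of 704, 426
# and 361 quoted above, as a weighted average should.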
\BKey name:\b Keith West\p
#
"Do wine drinkers just happen to eat right?",979,0,0,0
(Feb '99)
We reported in November 1997 (see \BWine Good for the Heart,\b November 1997) that wine drinkers seem to have fewer heart problems. In December 1997, in \BWhy Red Wine is Good For The Heart,\b we offered a hypothesis that resveratrol might be involved. But the "French paradox" has remained a problem for many heart researchers (it is named for the lower incidence of \Jheart disease\j in \JFrance\j).\p
A recent study of diet and drinking patterns published in the \IAmerican Journal of Clinical \JNutrition\j\i reveals that people who drink wine tend to choose foods that are healthier for the heart. So it now appears that studies showing wine-drinkers to have a lower incidence of ischemic \Jheart disease\j may be attributable not only to the wine itself, but also to their other dietary choices.\p
The study used a random sample of 48,763 Danish men and women who were questioned about their drinking and eating habits. For the purposes of this study, a "healthy diet" was defined by high intake of fruit, vegetables, salad, and fish; reduced intake of saturated fat; and the use of olive oil in cooking. On that basis, moderate wine drinkers (1 to 3 glasses per \Jday\j) consumed the most heart-healthy diet.\p
Further research is needed: first to see if this is purely a "Danish paradox," and also to sort out the competing effects, if any, of red wine with its anti-oxidants and the healthy diet itself. The simple message for consumers: if you are going to switch to red wine, pay some attention to what you are eating when you sip.\p
#
"Putting off dialysis with a low-protein diet",980,0,0,0
(Feb '99)
Patients with failing \Jkidneys\j are eventually forced to rely on \1dialysis\c to clear their blood of wastes containing \Jnitrogen\j which are usually excreted in the person's urine. These wastes come from the breakdown of \1protein\c, and research reported in the \IJournal of the American Society of Nephrology\i reveals that reducing the amount of protein in the diet can help these patients put off needing a \Jdialysis\j machine by up to a year.\p
A group of 76 patients were put on a very low-protein diet with essential amino acid and/or ketoacid supplements. These supplements provided the protein building blocks which the patients' bodies could not produce. The diet consisted mostly of fruits and vegetables, and entirely excluded high-protein foods such as meat, fish, poultry, cheese and milk.\p
The pre-\Jdialysis\j mortality was only 2.5 per cent per year, much lower than the 24 per cent annual mortality on \Jdialysis\j reported nationwide, according to Mackenzie Walser, M.D., lead author of the study. The authors say that the cost of the amino acid tablets averages less than that of the high-protein foods they replace. And although some patients were taking in 25 to 50 per cent more protein than recommended, that amount was still substantially less than what they were eating before.\p
#
"Making drinking water safe",981,0,0,0
(Feb '99)
The \1blue-green \Jbacteria\j\c, or Cyanobacteria, represent a serious health danger in many parts of our over-populated world. Growing as a bloom on fresh \Jwater\j, and invariably referred to as "blue-green \Jalgae\j" or "pond scum," these organisms produce a set of toxins called microcystins. The toxins are a threat to humans and livestock drinking the \Jwater\j, as they attack the liver.\p
Acute exposure to microcystins can cause liver damage, and in extreme circumstances it is fatal: in one 1996 incident, around 50 \Jdialysis\j patients in \JBrazil\j died after microcystin-contaminated \Jwater\j was used in their treatment. The good news: there is a way of neutralizing the toxins, according to a report posted on January 26 to the \JInternet\j Web edition of the journal \IEnvironmental Science & Technology.\i The research article is scheduled to appear in the March 1 print issue of the semi-monthly journal, which is one of a large group of peer-reviewed journals published by the American Chemical Society, both in print and on the Web.\p
Scottish researchers have shown that microcystins can be treated by adding \Jtitanium\j dioxide to the \Jwater\j, and then exposing it to light. Strictly known as cyanobacterial hepatotoxins, with microcystin-LR the form studied here, the toxins are very difficult to destroy with normal \Jwater\j purification methods. The WHO standard for safe \Jwater\j requires a microcystin level of less than one microgram per liter.\p
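Photocatalytic destruction of a dissolved toxin is commonly modeled, at low concentrations, as first-order decay, and that model shows how the WHO threshold enters such a calculation. In the sketch below, the starting concentration and the rate constant are our own assumptions, not figures from the Scottish study:\p
import math

# Assumed first-order photocatalytic decay: C(t) = C0 * exp(-k*t).
c0 = 50.0        # initial microcystin-LR, micrograms/liter (hypothetical)
who_limit = 1.0  # WHO safe-water standard, micrograms/liter
k = 0.1          # assumed decay constant per minute under UV + TiO2

t = math.log(c0 / who_limit) / k
print(f"{t:.0f} minutes to reach the WHO limit")   # about 39 minutes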
\JTitanium\j dioxide is most commonly used as a whitener in paint and in sunscreens, and the light source used in the laboratory tests was a \Jxenon\j UV lamp. Tests will need to be carried out to show that the \Jwater\j treated with \Jtitanium\j dioxide is safe to drink. If that is the case, the researchers say their detoxification method is quicker and more effective than currently used methods.\p
#
"The return of Minamata",982,0,0,0
(Feb '99)
Minamata disease is an illness of the \Jnervous system\j caused by mercury poisoning. Most people who have heard of the disease know it only as a textbook example of what can happen when \Jpollution\j controls break down. \p
Now the disease has reappeared, not in \JJapan\j, but in fishing communities of the remote Amazon \Jrainforest\j. The immediate cause is a huge increase in levels of methyl mercury in fish in the area, but the source of the mercury is still a matter for argument: it almost certainly comes either from gold mining, or else it leaches from soils following deforestation. \p
This is the first time Minamata disease has been diagnosed outside the Japanese town of the same name. Masazumi Harada of Kumamoto University and Junko Nakanishi from the \JYokohama\j National University say they have seen the same symptoms showing up in the Amazon, and a number of reports appeared on the \JInternet\j during February.\p
Harada says that he found three people with the nervous symptoms peculiar to Minamata disease, including fits of trembling, from a sample of 50 people with known high levels of methyl mercury in their bodies. Previous cases of mercury poisoning in the Amazon have been found among the one million gold miners in \JBrazil\j, who use mercury to purify gold, and their families. In general, these cases arose from people inhaling mercury fumes directly, which does not cause the Minamata symptoms.\p
The new cases, and the Minamata cases in particular, are happening hundreds of miles from the nearest mine, and they involve methyl mercury, which is passed along the food chain, and concentrated in higher-order members of the food chain. Methyl mercury attacks the \Jcerebellum\j, which coordinates voluntary movements, and destroys the personality. The victims have consumed methyl mercury in fish, converted into the more toxic methylated form by \Jbacteria\j living in oxygen-starved conditions in river sediments, just like the \Jbacteria\j which converted industrial discharges in Minamata Bay.\p
The chances of cleaning up the mercury are remote. It cost the Japanese Government millions of dollars to remove the polluted sediment from Minamata Bay, and this sort of money is unlikely to be available to clean up remote parts of the Amazon basin. There is one hope: to change the eating patterns of the people who rely on fish, getting them to target plant-eating fish, further down the food chain, which carry smaller amounts of methyl mercury.\p
#
"World's smallest Web server",983,0,0,0
(Feb '99)
A Stanford professor of computer science, Vaughan Pratt, has created the world's smallest Web server, a matchbox-sized device that is small enough to "fit into a shirt pocket." Using off-the-shelf components, he has squeezed the hardware and software needed to operate a Web site into a package less than 1.75 inches high, 2.75 inches wide and 0.25 inch thick (44 x 70 x 6 mm). It performs all the basic functions of a typical desktop computer that occupies three thousand times the space.\p
The matchbox computer consists of a 66MHz AMD 486-SX chip, 16 megabytes of RAM, and 16 megabytes of flash ROM. It is connected to the \JInternet\j through a parallel port and runs a cut-down version of Linux, a popular \JUnix\j-like operating system. Because the machine is used as a Web server, it does not need a keyboard or a display, but can be operated from another computer over the Web connection.\p
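To give a sense of how little software a bare-bones Web server really needs, here is a minimal single-page server in Python. This is our own illustration, not the code running on Pratt's matchbox machine:\p
import socket

# Minimal single-page HTTP server, purely illustrative.
PAGE = b"<html><body>Hello from a very small server</body></html>"
RESPONSE = (b"HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n"
            b"Content-Length: " + str(len(PAGE)).encode() + b"\r\n\r\n" + PAGE)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("", 8080))   # serve on port 8080
server.listen(5)

while True:
    conn, _ = server.accept()
    conn.recv(1024)        # read (and ignore) the request
    conn.sendall(RESPONSE)
    conn.close()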
The server was first connected to the Web on January 22, and the server's Web page includes useful items such as a picture of the computer posed alongside a most unusual Russian matchbox, and a detailed description of the tiny computer. It also gives instructions on how computer hobbyists can build the server themselves. The project is a product of Pratt's new Wearables Lab, which, like a similar lab at MIT, is developing computer technology that can be incorporated directly into clothing.\p
According to their publicity, "The Wearables group is already working on a more powerful server, one based on an Intel Pentium chipset. They intend to combine a credit-card-size Pentium motherboard that Cell Computing introduced last fall with a new 340 megabyte hard drive from IBM that is a fraction of an inch thick and less than 2 inches on a side."\p
Computing may never be the same again - once somebody comes up with a compact method for inputting data. Right now, Pratt and doctoral student Greg Defouw are working on a special glove that can recognize a digital sign language, called Thumbcode, which they have developed to replace the bulky keyboard. The planned Intel Pentium system, they say, will be powerful enough to run the complete Windows operating system and one of the voice-recognition programs currently on the market, so if Thumbcode does not take off, perhaps we will simply build our computers into our cellular phones.\p
Curiously, Pratt was born in Melbourne, \JAustralia\j, though he has been in the USA for the past thirty years. The previous record-holding server was made by \1Phar Lap\c, a company named after a famous Australian \Jracehorse\j which died in the USA, and whose stuffed body is now on museum display in Pratt's birth town.\p
\BReference: http://wearables.stanford.edu\b\p
#
"Making tasty soy cereals and snacks a reality",984,0,0,0
(Feb '99)
Researchers from the University of Illinois say they are closing in on the problems of turning soy material into \Jcereals\j and snacks. In the January issue of the \IJournal of Agricultural and Food Chemistry,\i they say they hope soon to deliver soy's potential health benefits in products that pass public scrutiny on taste and texture.\p
The problems come from the way most breakfast \Jcereals\j are made, by a process called extrusion, in which the ingredients are subjected to "aggressive mixing and heating." Isoflavones in the soy are thought to be the reason why soy offers protection against \Jheart disease\j, breast and prostate cancers, and \Josteoporosis\j, as well as relieving menopausal symptoms and reducing bad \Jcholesterol\j levels, but it was not clear whether these compounds would stand up to the extrusion process.\p
So long as the extrusion lasts less than a minute, the key isoflavones, the genistein and daidzein series, lose very little of their structural profile when processed in a mix of 80 per cent corn and 20 per cent soy.\p
Other papers in recent issues of \ICereal Chemistry\i and the \IJournal of Food Science\i have reported on taste testing results, and work is under way on the effects of extruded soy-containing \Jcereals\j on cancer cells.\p
#
"Sick buildings and a household fungus",985,0,0,0
(Feb '99)
"Sick Building Syndrome" is most commonly associated with new buildings, and has often been linked to new plastics, which often emit a variety of organic chemicals into the building's air. Now it appears that there is another cause, one which can also be found in older buildings, a \Jfungus\j called \IStachybotrys chartarum.\i This is especially likely if the building has been flooded, or if it has been affected by \Jwater\j damage.\p
People suffering the syndrome typically report that they are suffering shortness of breath, headaches, or sometimes just that they are not feeling quite right, but without any specific cause. Some say they find it hard to concentrate, or they feel easily fatigued.\p
The American Phytopathological Society (APS) has targeted this \Jfungus\j on its Web site for February, which can be found at \Bhttp://www.scisoc.org\b. The evidence of the past 20 years has been reviewed and outlined there. In brief, \Jwater\j damage from flooding or from broken pipes, roof leaks, sewage backups, or condensation can trigger an outbreak of the \Jfungus\j, which occurs naturally in the soil, with spores being carried in by wind, or by the waters themselves.\p
Once in the house, it will attack the paper on wallboards, wallpaper and other paper products, carpet made from natural fibers, and just about any other organic material. The \Jfungus\j generally looks black, slightly shiny when wet and powdery when dry, but much of the time it is present yet invisible, hidden behind walls or in ceiling spaces.\p
The APS recommends that if you suspect a problem, you need to carry out careful inspections, and then treat any outbreak as a toxic mold. Surface disinfection does nothing against fungal growth within the materials being attacked, so the mold simply returns. Home-owners then need to correct the moisture problem to prevent further mold development.\p
#
"A better magnet for better fusion",986,0,0,0
(Feb '99)
A model weighing 40 tonnes may seem a bit excessive, but the magnet just completed by an MIT-led team in the United States is to be combined with another similar magnet in \JJapan\j to form a testbed for the design of a 1300-tonne magnet for use in experiments on \1nuclear fusion\c. The completion of the magnet was announced during February, and the assemblage will soon be on its way to the Japanese Atomic \JEnergy\j Research Institute (JAERI). Once there, it will be combined with a second magnet to form a device called the Central \JSolenoid\j Model Coil (CSMC).\p
The superconducting magnets will be used to investigate superconducting performance parameters and manufacturing methods for the full-size magnet planned for the International Thermonuclear Experimental Reactor (ITER) project. The 1300-tonne ITER magnet will provide the magnetic fields needed to initiate and sustain the plasma, or electrically charged gas, necessary for the generation of \Jenergy\j from the fusion reaction.\p
#
"Tempering glass to halt cracks",987,0,0,0
(Feb '99)
Few things are as fragile as glass. We have had chemically and heat tempered glasses for many years, and these glasses can take more stress before breaking than untreated glass, but when they break, they usually break catastrophically. Another problem with tempered glass is that while each individual piece of glass becomes stronger, the variability of strength between parts of the glass increases dramatically. Engineers choosing glass for specific purposes must allow for this wider range of strengths, but perhaps this is about to change. According to a \IScience\i report in late February, it may be possible to engineer the glass so it has a "specific compression profile" which makes the final product stronger and less variable. \p
Conventional tempering of glass changes the outer surface of the glass so that it is under compression, and glass under compression can withstand higher levels of stress before reaching the failure point. The researchers used the chemical tempering process on sodium aluminosilicate glass when they tested their theory, but they believe it will work with other tempering methods and other glass types.\p
Chemical tempering involves replacing some of the sodium atoms near the surface with \Jpotassium\j atoms. The \Jpotassium\j atoms are slightly larger than the sodium atoms and so they compress the layer in which they are substituted, by crowding the other atoms, a bit like three fat people replacing three thin people in a crowded room.\p
Chemical tempering usually occurs only in the outer millimeter of the piece of glass. The new tempering method involves placing the maximum compression layer below the surface, so when cracks spread from the flaws on the surface, they reach the tempered layer and stop. The researchers created these internal compressed layers by subjecting the glass to chemical processing where \Jpotassium\j substituted for sodium, but then they replaced some of the \Jpotassium\j near the surface with sodium atoms. This created glass with an untempered surface, but with a tempered, compressed layer below. \p
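One way to picture the difference between the two approaches is to model compressive stress as a simple function of depth. The toy profiles below are our own illustration of the idea of a buried compression maximum, not the engineered profile reported in \IScience:\i\p
# Toy compression profiles (arbitrary units), illustration only.
depths = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]   # mm below the surface

def conventional(d):
    # Conventional tempering: compression greatest at the surface.
    return max(0.0, 1.0 - d)

def subsurface(d):
    # New method: maximum compression buried at an assumed 0.4 mm,
    # leaving a relatively untempered surface above it.
    return max(0.0, 1.0 - abs(d - 0.4) / 0.6)

for d in depths:
    print(f"{d:.1f} mm  conventional={conventional(d):.2f}  buried={subsurface(d):.2f}")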
Interestingly, the glass made in this way exhibits multiple cracking, with many small cracks developing and stopping at the tempered layer, producing a surface crazing which can be used as a warning that the glass is approaching its breaking point and needs to be replaced. \p
Chemically tempered glass is used in spectacles and sunglasses, while thermally tempered glass is used in \Jautomobile\j windshields (windscreens). This new tempering method could allow thinner glass to be used in such things as photocopying machines, scanners and electronic displays because it would make the glass stronger and lighter. \p
\BKey names:\b David J. Green, R. Tandon, V.M. Sglavo\p
#
"The Australia Prize goes to solar cell researchers",988,0,0,0
(Feb '99)
The \JAustralia\j Prize awards were announced during February, with the winners being Professor Martin Green and Professor Stuart Wenham, of the University of New South Wales (see \BGreen cells for a greener \JAustralia\j,\b January 1997).\p
Last year solar cells produced by their team at UNSW achieved 24.5 per cent efficiency, the current world record for production cells.\p
#
"A grim prospect",989,0,0,0
(Feb '99)
The fear of bioterrorism was in the air during February, with press releases hitting the \JInternet\j from a number of sources, but most of them triggered by a "National Symposium on Medical and Public Health Response to Bioterrorism" in the United States.\p
High on the list of fears: repeats of the sarin nerve gas attack on the \JTokyo\j subway system in 1995 (see \BTokyo Subway Attack,\b 1995), growing evidence of work carried out in the past in the former Soviet Union, and a series of recent discoveries of \JIraq\j's large-scale efforts to produce and "weaponize" biological agents.\p
The symposium was sponsored by the Johns Hopkins University's Center for Civilian Biodefense Studies (CCBS), whose director is D. A. Henderson, M.D., the person credited with leading the World Health Organization's successful fight to eradicate \1smallpox\c from the world. Henderson also had a paper on the subject due to appear in \IScience\i just after the symposium.\p
He argues that, just as the medical community changed people's thinking by drawing attention to the threat of nuclear war in the 1980s, it must now alert the community to the risk of biological warfare. Henderson says that while it might seem risky even to broach such issues, for fear of setting an event off, it now seems clear that the likely terrorists could already envisage every possible scenario. In that case, it is better to have matters out in the open, and to plan a response.\p
The United States, like most other nations, is ill-prepared to cope with a bioterrorist attack, most experts agree. In fact, it could take days to weeks (depending on the microbe) before physicians or public health officials even realized an attack had been made. The first sign of attack is likely to be people becoming sick or dying in emergency rooms or clinics.\p
The proposed plan would involve setting up stores of curative drugs and vaccines, with these being made available first to primary care doctors, emergency room physicians and nurses, infectious disease specialists, infection control professionals, \Jhospital\j epidemiologists, health officers, and laboratory directors and staff, the people who would then be expected to ensure the health of others. Up until now, any military planning appears to have left such people largely out of consideration.\p
The CCBS will focus first on those biological weapons that pose the greatest threat to civilian populations, including \1anthrax\c, smallpox, plague and viral hemorrhagic fevers. Any of these could trigger an epidemic which might spread well beyond the areas actually targeted by bioterrorists. \p
#
"The coming Boom Doom for Western society",990,0,0,0
(Feb '99)
American gerontologist Edward L. Schneider warned in \IScience\i in early February that the US needs to spend more of its research dollars on the study and prevention of the diseases of aging. Health care for the aged already consumes a third of the USA's trillion-dollar annual health budget, yet federally funded research on aging stands at about a billion dollars.\p
"No corporation that spent a mere 0.3 per cent of its revenues on research would last long in a competitive marketplace," Schneider said. Describing a trend also to be found in the rest of the industrialized world, Schneider said that aging baby boomers and continuing increases in life expectancy will swell the number of Americans aged 65 or older to 35 million in 2000 and 78 million in 2050. \p
Middle estimates from the US \JCensus\j Bureau project about 18 million people who will be 85 or older by 2050, but many demographers believe the bureau's higher projections of 31 million very old Americans are more likely to come true. The solution, he suggested, was to conquer many of the debilitating conditions affecting older people with more research, so that "the average health of a future 85-year-old in the year 2040 resembles that of a current 70-year-old with relatively modest needs for acute and long-term care." \p
#
"Physics societies win in court",991,0,0,0
(Feb '99)
In August 1997, the glossy Swiss-based Gordon and Breach Publishing Group objected in court to a physicist's finding that \IPhysics Today\i and \IThe Bulletin\i were hands-down winners on a simple test of cost-effectiveness. After losing the case against the American Physical Society (APS) and the American Institute of Physics (AIP), the publishers said that they would appeal.\p
In late January, the US Court of Appeals for the Second Circuit unanimously affirmed the decision by US District Judge Leonard B. Sand that the publication and promotional use of a survey of journal prices did not constitute false or misleading advertising. The promotion, based on an analysis of costs per thousand printed characters, showed that G&B's physics journals were, on average, far more expensive. The 1997 decision upheld the methods used, and noted that:\p
"Barschall's methodology has been demonstrated to establish reliably precisely the proposition for which defendants cited it - that defendants' physics journals, as measured by cost per character and by cost per character divided by impact factor, are substantially more cost-effective than those published by plaintiffs." (The impact factor mentioned here is a measure of citation frequency.)\p
The societies hailed the decision as a win for free speech in scholarly discussion, but they failed to gain an order covering their costs in the case. Other related court cases are continuing in \JFrance\j and \JSwitzerland\j.\p
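The measure at issue is straightforward to compute, as this minimal sketch shows; the journal figures below are entirely hypothetical, not Barschall's survey data:\p
# Barschall-style cost-effectiveness, lower being better: dollars per
# thousand printed characters, then discounted by impact factor.
journals = {
    # name: (annual price USD, millions of characters/year, impact factor)
    "Journal A": (300.0, 30.0, 2.5),
    "Journal B": (4000.0, 25.0, 0.8),
}

for name, (price, mchars, impact) in journals.items():
    cost_per_kchar = price / (mchars * 1000)   # $ per thousand characters
    print(name, round(cost_per_kchar, 4), round(cost_per_kchar / impact, 4))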
#
"Martian dunes?",992,0,0,0
(Feb '99)
High-resolution pictures of the Martian surface taken by the orbiting Mars Surveyor \Jspacecraft\j have revealed structures looking very like the sand dunes we know from our own \Jplanet\j's deserts. There are also some small bright sand dunes which may be formed of a sulfate mineral such as \1gypsum\c. These dune fields are "only a few miles" across, and they appear bright in photographs because of the way in which the sand particles reflect light.\p
By comparison, the much larger dark dune fields, "up to tens of miles across," are most probably made of grains eroded from \Jlava\j flows, according to scientists. It seems that Mars has winds strong enough to shift sand grains at certain times of the year, when Mars is closest to the \Jsun\j as it travels on its eccentric \Jorbit\j. One picture even shows "bright stuff blowing off the dunes." The detailed analysis was published in \INature\i in mid-February.\p
The pictures were taken by the Mars Orbiter Camera (MOC) on board the Mars Global Surveyor \Jspacecraft\j. This took images periodically while shrinking its \Jorbit\j toward the final mapping \Jorbit\j reached during February. The \Jspacecraft\j entered Mars \Jorbit\j in September 1997, and it now takes just 118 minutes to circle the \Jplanet\j. The MOC takes images of the surface with a resolution up to 50 times greater than the previously available pictures taken by the Viking \Jspacecraft\j more than two decades ago.\p
The Viking pictures also showed dunes, but now the dark dunes are extremely clear, with the bright dunes as a surprise package, never before seen. The scientists' speculation about \Jgypsum\j is based on Viking and other data which have revealed a lot of sulfates in the Martian soil. \JGypsum\j would be bright and soft enough to erode quickly downwind as the images indicate. It is possible, though, that the bright dunes could be formed of particles cemented by sulfates, or other minerals.\p
#
"Antigravity and NASA",993,0,0,0
(Feb '99)
Antigravity, we all know, is just a \Jscience fiction\j dream. Now it appears that NASA thinks there may be something in the idea after all, and they are about to spend US$600,000 on trying to duplicate the controversial experiments of a Russian scientist who claims to have invented a device that blocks the force of gravity. \p
E. E. Podkletnov, a materials scientist at the Moscow Chemical Scientific Research Centre, reported several years ago that a spinning, superconducting disc lost some of its weight. Since then, he has indicated, in an unpublished paper on the weak gravitation shielding properties of a superconductor, that the disc can lose as much as 2 per cent of its weight. \p
While this is still a long way from the antigravity that supposedly powers UFOs, any device that shields a rocket from the \JEarth\j's gravity is of great interest to NASA. Think of the payload advantages if you only needed a gentle push to get a rocket up through the \Jatmosphere\j and into space. \p
It is unlikely that they will get something for nothing. The logic of standard science, of course, says that you must put at least as much \Jenergy\j into raising a rocket as you can get back by dropping it to \Jearth\j again. But rockets are inefficient, and it may well prove a great deal more efficient to rely on Podkletnov's method, using a smaller amount of \Jenergy\j to set up the superconducting disc, and then to set it spinning.\p
NASA first tried a "small disc, four to five inches in diameter," but found no gravitational effect that could be distinguished from background noise in the nanogee range. Now they say they will try a twelve-inch (30 cm) disc, to see if they have better luck that way. The researchers will also set the experiment up to feed radio-frequency signals into the disc: the RF signals used by Podkletnov varied from 100 to 1000 megahertz, and the NASA team believes that if they are to test his claims properly, they will need to replicate his methods.\p
Many physicists think that the logic is all wrong, "because gravity comes from mass, not from quantum effects," but science never progressed by people saying "this cannot work, so we will not try it." Rather, it works by people trying out the consequences of strange and bizarre effects.\p
#
"SOHO still a source of new science",994,0,0,0
(Feb '99)
Although the SOHO \Jspacecraft\j is now regarded by most observers as a lost cause, it remains in the scientific news, as new analysis of old data continues to change our ways of thinking. A February report in \IScience,\i using data from SOHO's Solar Ultraviolet Measurements of Emitted Radiation (SUMER) spectrometer, has identified regions on the \JSun\j where the high speed \1\Jsolar wind\j\c originates. The \Jsolar wind\j flows come from the edges of honey-comb shaped patterns of magnetic fields at the surface of the \JSun\j.\p
The extreme ultraviolet images were taken in September 1996, and they reveal gas at 1.5 million degrees shaped by magnetic fields. There are bright regions which indicate hot, dense plasma loops with strong magnetic fields, and dark regions which imply an open magnetic field \Jgeometry\j. These fields are the source of the high speed \Jsolar wind\j. The winds are accelerated to over 1.5 million kilometers per hour as they stream toward the \JEarth\j.\p
The delay came about because, while the SUMER images are ideal for studying atmospheric motions and turbulence, a careful analysis was required before these slow motions could be identified near the \JSun\j. \p
Because SOHO is located 1.5 million kilometers (about a million miles) out in space, it is able to detect ultraviolet light which is blocked out by the \Jatmosphere\j, and which is, in any case, invisible to our eyes. The \Jsolar wind\j is a high speed stream of electrically charged gas flowing from the surface of the \JSun\j, but its actual source has long been a mystery.\p
When the hot gas is moving towards the observer, the wavelengths appear shorter, due to a phenomenon called the \1\JDoppler effect or shift\j\c, and by analyzing this shift, the speed of the gas can be closely estimated. Where the gas is moving towards us, we see a blue-shifted area in the image, and we infer a presumed source of the \Jsolar wind\j.\p
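The underlying arithmetic is the classical Doppler relation: the speed is the speed of light multiplied by the fractional wavelength shift. The spectral line and shift below are hypothetical values, chosen only to show the scale involved:\p
# Classical Doppler estimate: v = c * (wavelength shift / rest wavelength).
c = 299_792.458          # speed of light, km/s
rest_wavelength = 77.0   # nm, a hypothetical extreme-ultraviolet line
shift = -0.002           # nm; negative = blue shift, gas moving towards us

v = c * shift / rest_wavelength
print(f"{v:.1f} km/s")   # about -7.8 km/s, an outflow of a few km/s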
The novel aspect of the report is that the source areas appear to be concentrated in specific patches at the edges of the honey-comb shaped magnetic fields. There are convection cells just below the \JSun\j's surface, each one associated with a magnetic field, and the \Jsolar wind\j is breaking out around the edges of the convection cells. One of the co-authors, Dr Helen Mason, was quoted on the \JInternet\j as saying that the wind was coming through, rather like grass popping out between paving stones. But, she added, most grass does not travel at 8 km/s, let alone the 800 km/s that the fastest \Jsolar wind\js can reach.\p
#
"A new asteroid near the earth",995,0,0,0
(Feb '99)
A piece of rock some 30 to 50 meters across has been found circling the \JSun\j in an \Jorbit\j close to \JEarth\j's. The small \1asteroid\c is probably a chip off the moon, according to astronomers. Labeled 1999 CG9, the object was spotted on February 10 by an automated asteroid-hunting \Jtelescope\j in New Mexico called Linear.\p
With an \Jorbit\j just 9 million kilometers larger than our own, the asteroid circles the \JSun\j in slightly more than a year in an unusual, nearly circular \Jorbit\j. Most stray rocks, comets and \Jasteroids\j have very eccentric orbits, and the shape of the \Jorbit\j is the reason why astronomers believe the rock may have come from the moon. With its low gravity, the moon is far more likely to have material blasted off the surface at more than the body's escape velocity. As evidence of this, 12 confirmed lunar meteorites have been found on \Jearth\j.\p
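The escape-velocity argument is easy to check from standard planetary figures, using v = sqrt(2GM/R); the Moon's escape velocity comes out at less than a quarter of the Earth's:\p
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(mass_kg, radius_m):
    # v = sqrt(2*G*M/R), returned in km/s.
    return math.sqrt(2 * G * mass_kg / radius_m) / 1000

print(f"Earth: {escape_velocity(5.972e24, 6.371e6):.1f} km/s")  # about 11.2
print(f"Moon:  {escape_velocity(7.342e22, 1.737e6):.1f} km/s")  # about 2.4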
An earlier object with a similar \Jorbit\j, 1991 VG, was originally suspected of being a \Jspacecraft\j which had escaped from \JEarth\j's gravity, but 1999 CG9 is considerably brighter, suggesting that it is larger, and certainly too large to be the final stage of any rocket. Further studies, aimed at detecting a spectrum from the object, may tell us more about its composition, which may in turn reveal its origins.\p
#
"Sorting out the molecular clock",996,0,0,0
(Feb '99)
How old is the placental \1mammal\c group? This is the group which includes dolphins, bats, moles, elephants and humans, and on the \Jfossil\j evidence, the group arose about 65 million years ago, around the time when the dinosaurs became extinct at the end of the \1Cretaceous period\c. On the other hand, if we look to the "molecular clock," the evidence from genetic and other biochemical data, this same group of mammals should have appeared 130 million years ago, quite early in the Cretaceous period.\p
One possibility is that the \Jfossil\j record is inadequate, and simply fails to show the early diversification, and a report in \IScience\i in late February looks at exactly this possibility. The conclusion: the quality of the \Jfossil\j record is something like 10 to 100 times greater than the low standard that would be required if we were to allow this hypothesis of missing species diversity.\p
The researchers behind the study believe they are justified in calling the whole molecular clock into question, but there is a major flaw in their logic. The two mammalian groups could have divided 130 million years ago, yet remained closely similar animals for many millions of years, until the dinosaurs went, leaving room for the chemically different but anatomically similar mammalian groups to diverge, with one group becoming the modern placentals only at that time.\p
The whole area is a minefield of opinionated scientists arguing for their own sides (and your reporter tends to side with the "molecular clock" people), so that every piece of work needs to be scrutinized very carefully - even this account, and the research it describes. The work began with a paper published in \INature\i (see \BThe spread of the mammals,\b April 1998) which argued that various groups of plants and animals, not just placental mammals, must have had a long history that is simply not found in the \Jfossil\j record.\p
Now we have an analysis which shows that the \Jfossil\j record is more complete than many people have assumed: the next maneuver, no doubt, will come from the molecular clock-makers.\p
\BKey names:\b Mike Foote, John Hunter, Christine Janis and Jack Sepkoski\p
#
"NEAR/Eros update",997,0,0,0
(Feb '99)
Analysis of data gathered in 222 photographs and supporting spectral observations during the NEAR fly-by in December (see \BNEAR finds problems, solutions,\b December 1998 and \BMajor NEAR engine burn completed,\b January 1999) reveals that asteroid 433 Eros is slightly smaller than predicted, with at least two medium-sized craters, a long surface ridge, and a density like that of the \JEarth\j's crust. NEAR remains on course for its meeting with the asteroid which begins in mid-February 2000. \p
Eros has been known for over a century, and had already been identified as an S-type asteroid with high concentrations of silicate minerals and metal, but it is hard to gather any more detailed information than that at a distance. The flyby has shown us variations in surface color and reflected light which suggest that the asteroid has a "diverse surface makeup," and scientists hope that the closer approaches (as close as 15 km/9 miles) in 2000 will give us the fine detail on these surface patches. \p
The size of Eros is now officially 33 x 13 x 13 kilometers (21 x 8 x 8 miles), as opposed to a previous estimate of 40.5 x 14.5 x 14 km (25.3 x 9 x 8 miles). We now know that the asteroid rotates once every 5.27 hours and has no moons, so far as we know.\p
The asteroid's specific gravity is now set at 2.7, or roughly twice that of 253 Mathilde, a C-type, carbon-rich asteroid that NEAR flew past in June 1997 (see \BMathilde - Not Your Average Asteroid,\b July 1997). The \JGalileo\j \Jspacecraft\j flew past another S-type asteroid, 243 Ida, in 1993, and Ida's specific gravity proved similar to that of Eros, and close to the average density of \JEarth\j's crust.\p
There is a ridge which extends 20 km (12 miles) along the asteroid, and this feature, combined with the high specific gravity, leads scientists to believe that Eros is a homogeneous body rather than a "collection of rubble" such as Mathilde appears to be. It may even be a remnant of a much larger body which was shattered by an impact, they say.\p
The surface of Eros has many craters. The two largest craters are 8.5 and 6.5 km (5.3 and 4 miles) in diameter, which is less than half the size of asteroid Mathilde's largest craters. This may indicate that Eros has a relatively young surface when compared with Ida.\p
#
"Project Stardust",998,0,0,0
(Feb '99)
Stardust, a mission to bring back samples of dust from \Jcomet\j Wild-2, took off on schedule on February 5. The mission (see \BGetting ready to collect \Jcomet\j dust,\b July, 1998) is to intercept Wild-2 in 2004, to capture tiny bits of \Jcomet\j dust and debris, and then return them to \JEarth\j for analysis in 2006. It will be the first \Jcomet\j rendezvous mission since the European \IGiotto\i \Jspacecraft\j's fly-by of \JComet\j Halley (1986) and \JComet\j 26P/Grigg-Skjellerup (1992), and the first ever to attempt to return a \Jcomet\j sample to \JEarth\j.\p
Until quite recently, Wild-2 circled the \JSun\j in an \Jorbit\j between Jupiter and Uranus, but this changed in September 1974 when Wild-2 passed within 0.006 AU of Jupiter. This altered Wild-2's \Jorbit\j so that its closest approach to the \JSun\j now lies just inside the \Jorbit\j of Mars. In other words, until recently, the \Jcomet\j has been located in a cold region of space, far from the \JSun\j, meaning that it should be rich in the more volatile materials that astronomers wish to sample. \p
These materials will boil off as the \Jcomet\j comes closer to the \JSun\j, ready to be sampled by the Stardust craft, but because the \Jcomet\j has only passed close to the \JSun\j a few times, it should still provide a good yield. The \Jcomet\j is not only interesting but also superbly placed, allowing the craft almost to match velocities with it, out beyond Mars, with a relative velocity as low as 21,800 kph (13,600 mph).\p
This is about six times the speed of a high velocity rifle bullet, but in space terms, this is almost slow motion, and it means that the Aerogel collectors on the craft will catch and hold particles, rather than having them blast right through.\p
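The bullet comparison is simple to verify; the muzzle speed below is our own round figure for a high-velocity rifle:\p
closing_speed_kph = 21_800
closing_speed_ms = closing_speed_kph / 3.6   # about 6,056 m/s

bullet_ms = 1_000   # assumed muzzle speed of a high-velocity rifle bullet
print(round(closing_speed_ms / bullet_ms, 1))   # about 6.1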
\BCorrection:\b the July 1998 reference inadvertently referred to the craft's mission as catching and returning planetary dust, when in reality, the dust is cometary, but potentially the same as the dust that may go to make up new planets around a star.\p
#
"Antibiotic resistance to vancomycin",999,0,0,0
(Feb '99)
A mid-February report in the \INew England Journal of Medicine\i (NEJM) describes a novel method of dealing with infections caused by multiresistant strains of \IStaphylococcus aureus\i (MRSA) that also show partial resistance to vancomycin, a drug often described as the "last resort" antibiotic against these dangerous pathogens.\p
A team from Rockefeller University used a combination of two commonly used \Jantibiotics\j, oxacillin and vancomycin. This combination produced a synergistic effect and killed the vancomycin-resistant MRSA very efficiently in the test tube. The \Jbacteria\j had been isolated from a 79-year-old patient who died in a New York metropolitan area \Jhospital\j last March. The researchers believe this observation could lead to effective treatments in patients.\p
\BKey names:\b Alexander Tomasz, Richard B. Roberts, Krzysztof Sieradzki and Stuart W. Haber.\p
#
"World population to hit 6 billion soon",1000,0,0,0
(Feb '99)
The latest projection for the world's population sets June 16, 1999, as the most likely date for the world's human population to reach six billion.\p
#
"A sixth sense for feeding under wet sand",1001,0,0,0
(Feb '99)
A \1knot\c is a kind of \1sandpiper\c found, among other places, in the Netherlands. Knots can locate their favorite food, shellfish, under wet sand by inserting their beak half a centimeter into the sand for a few seconds. The Australian \1duck-billed \Jplatypus\j\c uses electrical sense organs in its "bill" to locate live animals in mud, but the method used by the knots remained a puzzle until now. The Netherlands Organization for Scientific Research (NWO) has reported on the \JInternet\j about studies carried out at its Netherlands Institute for Sea Research (NIOZ), together with researchers from the universities of \JGroningen\j and \JLeiden\j.\p
It seems that the birds' ability is based on a hydro-dynamic principle. In short, they can detect the shellfish by sensing minute differences in pressure. The researchers showed that the knots were able to detect and uncover small stones hidden in the sand. Stones do not move, their \Jtemperature\j does not differ from that of the sand, and they have no smell or electro-magnetic field, so they could not be detected by any normal sense organs. Tame knots used in the experiments were unable to find hidden stones in dry sand, suggesting that \Jwater\j must be important in the sensing process.\p
The knots have clusters of 10 to 20 corpuscles of Herbst in the bone below the horny layer of the end of their beak. These are sensitive to differences in pressure, and when the bird sticks its sensitive beak into the sand at low tide, it produces a pressure wave because of the \Jinertia\j of the \Jwater\j in the gaps between the particles. \p
The rapid up-and-down movements of the bird's beak loosen the grains of sand, which then pack together more tightly, displacing the \Jwater\j between them and increasing the residual pressure around a stone (or shellfish). Backing this theory, knots do not bother searching sands with only a few shellfish, and they ignore sands which contain stones, no matter how many shellfish may be found there.\p
#
"Glenn Seaborg dies at 86",1002,0,0,0
(Feb '99)
\1Glenn Theodore Seaborg,\c Nobel Laureate chemist and discoverer of 10 chemical elements, including \Jplutonium\j and the element that now bears his name, seaborgium (officially given that name in 1997), died on February 25, at the age of 86. He had been recovering at home in Lafayette, near Berkeley, after suffering a stroke on August 24, 1998, while in \JBoston\j for the national meeting of the American Chemical Society.\p
At that meeting, Seaborg was honored by being named one of the "Top 75 Distinguished Contributors to the Chemical Enterprise." His fame stretched to some strange places: he appears in the \IGuinness Book of World Records\i for having the longest entry in "Who's Who in America."\p
Seaborg advised a number of American presidents, and was himself president of both the American Association for the Advancement of Science and the American Chemical Society. He was survived by his wife Helen Griggs Seaborg, and five of their six children.\p
#
"March, 1999 Science Review",1003,0,0,0
\JFermilab physicists find new matter-antimatter asymmetry\j
\JOfficial: 1998 the warmest year of the millennium\j
\JBacteria against global warming \j
\JRunaway feedback in the Arctic?\j
\JEcosystem disruption is easy\j
\JBrown tree snakes in the Pacific Islands\j
\JWhooping cranes take flight in Florida\j
\JTurning frying oil into diesel fuel\j
\JProstate cancer and gene switching\j
\JYet another treatment for MRSA\j
\JAn arm and a leg\j
\JHow a virus reproduces\j
\JNew gene may help scientists understand more about how the body grows\j
\JFungal disease threatens global banana production\j
\JNASA looks at extremophiles\j
\JBringing Mars into the Iron Age\j
\JFastrac full-engine, hot-fire test successful\j
\JA map for all seasons\j
\JDrilling a million years of history\j
\JGeophysicists propose a new model of earth's mantle \j
\JA satellite with a view\j
\JWhy deep-sea vents glow\j
\JA new bioluminescent octopus\j
\JFlatworms, symmetry, and us\j
\JA mammal ancestor, older than the dinosaurs\j
\JHelping the blind to navigate\j
\JKeeping fire away from planes\j
\JDyed in the silkworm\j
\JMaggots to the rescue!\j
\JWolves bounce back\j
#
"Fermilab physicists find new matter-antimatter asymmetry",1004,0,0,0
(Mar '99)
"Our result,'' Peter Shawhan said, "is that epsilon prime over epsilon equals 28 plus or minus 4.1 times 10 to the minus 4.'' That may sound to lay readers like an entry in the "most boring first sentence of a novel" competition. Shawhan was addressing a standing-room-only audience at a seminar at Fermilab on February 24, 1999, and it could just be that this dry mathematical statement is the most exciting thing to happen in physics for 35 years.\p
The statement refers to a process called \1CP violation\c, which is shorthand for charge-parity violation. The value e'/e, or "epsilon prime over epsilon," is a measure of CP violation in matter-antimatter interactions, with an expected value of zero, but we will return to that in a moment.\p
The key point is that every particle has its anti-particle. The \Jelectron\j matches the \Jpositron\j, or anti-\Jelectron\j, and every quark has its antiquark as well, or at least, that is what the current theory of the fundamental structure of matter, the Standard Model, says. Some time back in the past, when the universe was young, matter and antimatter were equally abundant; but today the universe appears to be made entirely of matter. Antimatter shows up only in cosmic ray interactions and at \Jparticle accelerators\j such as Fermilab's Tevatron, where \Jantiparticles\j are produced in high-\Jenergy\j particle collisions.\p
So what happened to the antimatter? If matter and antimatter meet, they annihilate each other, leaving a flash of \Jenergy\j, as most \Jscience fiction\j readers know. But if the antimatter has gone, an equal amount of matter should have gone with it, and this is clearly not the case. That is a problem for physicists.\p
High-\Jenergy\j collisions produce mesons among other things. Where protons and neutrons, made of three quarks, survive quite a while, mesons are short-lived pairings of a quark and an antiquark. One type of \Jmeson\j, the neutral kaon, is a combination of a strange quark or antiquark and a down quark or antiquark. In 1964, \1James Cronin\c and \1Val Fitch\c did Nobel Prize-winning work on neutral kaons, discovering that a slight but definite asymmetry existed in the behavior of the neutral kaon and its antiparticle, an asymmetry which we now call CP violation. Until that discovery, physicists had believed that particles and \Jantiparticles\j behaved symmetrically, like mirror reflections of each other.\p
Cronin and Fitch were looking to make a much better test of CP invariance, but found instead that it turned out not to be invariant after all. This CP-violating effect can be described as an asymmetry in the mixing (or quantum-mechanical fluctuation) of the neutral kaon with its antiparticle, and other signs of CP violation all trace back to the same original effect.\p
One way to explain all this lies in something called the Superweak Theory, which posits only mixing effects, with no CP violation in the decays of neutral kaons into other particles. Since 1964, physicists have been trying to spot an asymmetry in the decay of the neutral kaon, rather than in its mixing, and this has been assessed by measuring e'/e, the ratio of different modes of decay of neutral kaons into two \Jpi\j mesons, or pions. Any non-zero value would indicate a new direct form of CP violation, and in fact the Standard Model seems to predict a non-zero, but small, effect.\p
An earlier experiment at \1CERN\c, NA31, led by Heinrich Wahl, gave a similar result, but with a significance of 3.5 standard deviations, it was not yet enough to say definitively that e'/e was non-zero. The new result has a significance of seven standard deviations, and the excitement stems not only from the confirmed non-zero value, but also from the size of the value, which is larger than expected.\p
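For readers who want to see where "seven standard deviations" comes from, the significance of a non-zero measurement can be roughed out by dividing the central value by its uncertainty. A minimal sketch, using the figures quoted above (the NA31 numbers are illustrative values consistent with the 3.5 standard deviations mentioned; real analyses combine statistical and systematic errors more carefully):\p
```python
# Rough significance of a measurement: central value / uncertainty.
# Figures are those quoted in the text; the NA31 central value is an
# illustrative assumption consistent with the 3.5 sigma mentioned above.

def significance(value, uncertainty):
    """Return how many standard deviations the value lies from zero."""
    return value / uncertainty

ktev = significance(28e-4, 4.1e-4)   # KTeV: epsilon'/epsilon = (28 +/- 4.1) x 10^-4
na31 = significance(23e-4, 6.5e-4)   # NA31: illustrative figures only

print(f"KTeV: {ktev:.1f} standard deviations from zero")   # about 6.8, "seven"
print(f"NA31: {na31:.1f} standard deviations from zero")   # about 3.5
```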
According to physicists, the finding absolutely rules out the Superweak Theory as the sole source of CP violation. More importantly, while the Standard Model predicts a non-zero effect, the size of the new result raises questions about whether it can be accommodated within the Standard Model.\p
Once again, science shows up as that strange pursuit where people get more excited about a prediction that is wrong than they would ever be about a prediction that is right.\p
The only problem: while the result is in close agreement with CERN NA31, a previous Fermilab result was much smaller, and a detailed analysis of the earlier result cannot explain the difference. A new CERN experiment, NA48, is due to be reported soon, and that should shed some light on the e'/e value.\p
Only time will tell whether \I"Our result,'' Peter Shawhan said, "is that epsilon prime over epsilon equals 28 plus or minus 4.1 times 10 to the minus 4.''\i will go into the hall of fame for all-time great scientific quotations.\p
Finally, here is some background provided by Fermilab:\p
The KTeV experiment (for Kaons at the Tevatron) is an 85-member collaboration of experimental groups from the University of \JArizona\j, the University of \JCalifornia\j at Los Angeles, the University of \JCalifornia\j at San Diego, the University of \JChicago\j, the University of \JColorado\j, Elmhurst College, Fermilab, Osaka University, Rice University, Rutgers University, the University of Virginia, and the University of \JWisconsin\j. \p
The new experiment began construction in 1992 and took its first data in 1996. It used a beam of protons from Fermilab's Tevatron to create two parallel beams of neutral kaons to search for CP violation. An innovative particle detector constructed of cesium iodide crystals gave experimenters unprecedented precision in making experimental observations. Other technological innovations allowed the collaborators to rule out background events and collect data at very high rates. \p
#
"Official: 1998 the warmest year of the millennium",1005,0,0,0
(Mar '99)
A mid-March report in \IGeophysical Research Letters\i says that the 1990s appear to be the warmest decade of the century, with 1998 the warmest year of the decade. The warming trend of the twentieth century reverses a 1,000-year trend of cooling, making this the warmest century of the millennium.\p
The study involved a careful examination of the natural \Jarchives\j (see \BA few hot years\b, April, 1998) provided by tree-ring data and ice cores from \JGreenland\j and the \JAndes\j Mountains. The tree-ring data included three sets of 1,000-year-long tree-ring records from North America, plus tree rings from northern Scandinavia, northern \JRussia\j, \JTasmania\j, Argentina, Morocco, and \JFrance\j. \p
These "proxy indicators," as they are called, allow researchers to extend the short instrumental records that we have, well back into the past. The calibrated natural \Jarchives\j can then be subjected to sophisticated computer analysis and statistics to reconstruct yearly temperatures and their statistical uncertainties, going back to the year AD 1000. \p
While the older data may be less certain, the trends remain clear. And on those trends, 1998 was the hottest year of the millennium, and the turn-around has been remarkably rapid. It seems that \1global warming\c is really happening.\p
\BKey names:\b Michael Mann, Raymond Bradley and Malcolm Hughes \p
#
"Bacteria against global warming",1006,0,0,0
(Mar '99)
\INew Scientist\i carried a story in mid-March about a newly-discovered bacterium which digests \1methane\c, doing its bit to slow global warming. The good news is that it exists quite happily in acidic wetlands in the northern hemisphere, where almost half the world's \Jmethane\j emissions are formed. The bad news is that the bacterium is being poisoned by industrial pollutants, especially \Jnitrate\j and sulphate \Jpollution\j from industry and traffic.\p
Most of the \Jmethane\j-producing \Jbacteria\j live in acidic \1wetlands\c in the northern hemisphere, but these conditions were always thought to be unsuitable for the \Jbacteria\j that take up the \Jmethane\j and extract \Jenergy\j from it. The new discovery was made when scientists noticed that some wetlands in Europe were only producing about half as much \Jmethane\j as expected, and went looking for the cause.\p
They report that the \Jmethane\j output of acidic wetlands is now higher than it was before the industrial revolution, and they attribute this to the decline of the \Jmethane\j-eater, which so far has no name.\p
\BKey names:\b Nikolai Panikov, Svetlana Dedysh and Werner Liesack\p
#
"Runaway feedback in the Arctic?",1007,0,0,0
(Mar '99)
A new problem for climatologists, revealed recently in the journal \IArctic and \JAlpine\j Research,\i relates to the effect of rising \JArctic\j temperatures on carbon dioxide release. Under most conditions, the \Jearth\j's ecosystems seem to be conservative, resisting change, but apparently not this time. Global warming looks set to increase the releases of carbon dioxide from the \JArctic\j \Jtundra\j, promoting even more \1global warming\c.\p
In a controlled experiment, plots of \JArctic\j \1tundra\c had their \Jtemperature\j raised by 2 degrees C, and this caused an increase in carbon dioxide emissions of between 26% and 38%. Snowfall levels may also increase under global warming, according to some experts, so the researchers simulated that as well.\p
When increased snowfall was simulated, emission levels rose by between 112% and 326%. The carbon dioxide was produced in the soil beneath the snow, and the \JArctic\j, covering about a fifth of the \Jearth\j, holds almost a third of the world's stored soil carbon. Each year, the \JArctic\j carbon emissions from \Jrespiration\j by plants and soil micro-organisms are greater than the amount of carbon dioxide fixed in \Jphotosynthesis\j during the short summer growing season.\p
When the snowfall increases, the snow will take longer to melt, reducing the growing season still more, and making the emission problem even worse. In the study, the deep snow sites took about four weeks longer to completely melt than the normal-snow sites.\p
The snow levels were manipulated with snow fences which helped to accumulate more snow, while temperatures were managed with small open-top fiber-glass chambers which acted like tiny greenhouses to warm the air, so that soil temperatures increased by about 2 degrees C, or 3.6 degrees F, compared with untouched sites.\p
As the researchers had expected, the results showed that moist \Jtundra\j emitted more carbon dioxide than dry \Jtundra\j, although losses at both types of sites were significant, with both snow levels and temperatures having an effect.\p
\BKey names:\b Michael Jones, Jace Fahnestock, Jeff Welker, Donald Walker and Marilyn Walker\p
#
"Ecosystem disruption is easy",1008,0,0,0
(Mar '99)
Two studies from Oregon were described in the scientific press during March, each showing just how easy it is to bring about disruption in an \Jecosystem\j. Both studies were carried out on the marine rocky intertidal shores of the Oregon coast. \p
Many ecologists would argue that as many as half the species in an environment could be lost without much harm -- the problem is knowing which are the dispensable ones, and which are the keystone species -- the term refers to the central stone in an arch, which locks all the others in place.\p
The first report, published in the journal \INature,\i suggests that measures to protect \Jecosystem\j health and function must consider not only the keystone species, but also many less prominent species which, at various times, may actually be highly significant. \p
It challenges the conventional assumption, quoted above, about the interchangeability of the background species. This often translates into the faith that, if the keystone species are maintained, the larger \Jecosystem\j will be protected and the lesser species can be ignored or even exploited, because they are of little ecological significance. \p
The dog \1whelk\c is a marine snail which preys on \1mussel\c and \1barnacle\c species, and the study involved manipulating the abundance of whelks on part of the intertidal zone, and comparing the impacts of strong versus weak species effects. Strong predation on mussels had a consistent effect on the prey, but the effects of weak predation were highly variable under all the conditions examined. In some cases, even weak predation had a strong effect on mussel numbers, but the effects varied considerably.\p
Some "weak predation plots" were dominated by mussels, while others had no mussels at all, so that while the weak predation effects averaged out to about zero, it would be a serious mistake to treat the effects as minor or insignificant. \p
The author suggests that there must be other effects involved, and that \Jecosystem\j management requires close attention to many more interactions than normal. He argues that \Jecosystem\j health is likely to depend on the interactions of many species, the abundant ones as well as the rare ones, the keystones and the "not-so-keystones." \p
The second study, published in the journal \IScience,\i indicates that some ecological impacts of global warming might be abrupt, significant, and generally underestimated. Small changes in ocean \Jtemperature\j may affect keystone species, and trigger large, relatively rapid changes in intertidal \Jecology\j. \p
The study looked at the ochre seastar or \1starfish\c, which feeds on the \JCalifornia\j mussel and, in the Pacific Northwest intertidal \Jecosystem\j, holds this dominant competitor in check. It seems that the rates of predation by this seastar are very sensitive to small changes in \Jwater\j \Jtemperature\j associated with episodes of wind-driven upwelling.\p
So where we would expect a \Jtemperature\j change to bring about a slow and stately progression of species along the coast, it seems that this may not always be the case. Given an important species which is highly sensitive to \Jtemperature\j, the effects of small \Jtemperature\j changes on an \Jecosystem\j can be amplified by species interactions, because in this case, the predator is affected, and not the prey. An impact like this, says the report, could cascade through the community, having a rapid system-wide impact.\p
\BKey names:\b Eric Sanford (seastars and mussels) and Eric Berlow (whelks)\p
#
"Brown tree snakes in the Pacific Islands",1009,0,0,0
(Mar '99)
An instructive case of environmental disruption may have flared up on the island of \1Saipan\c during March. It shows how much damage a single introduced species may do, and it also illustrates some of the ways used by scientists to solve such problems. A number of \JPacific islands\j have a growing problem with the brown \1tree snake\c \I(Boiga irregularis),\i which originated in Papua New Guinea. Probably no island has suffered more than \1Guam\c from the ravages of this snake, but it now appears that Saipan may also be among the islands carrying a population of the snake, and plans were set in place during March to test this suspicion.\p
The snake has caused health and economic hazards for \JGuam\j's residents, as well as devastating native bird, lizard, and bat populations. Saipan has no native snakes, and 400 traps have been set up in an attempt to document whether a population of snakes has been established on the island. The techniques to be used on Saipan are those already established on other islands, mainly \JGuam\j.\p
The snakes have been seen once or more on Saipan, Tinian, Rota, Kwajalein, Wake, \JOahu\j, Pohnpei, \JOkinawa\j, and Diego Garcia, but so far, the snake seems not to have become established on any of those islands. Your reporter was on Pohnpei in the Federated States of \1Micronesia\c some years ago when a snake was sighted near the airport, and the response was, to say the least, strong. It had to be, because the residents of all of the islands under threat know of the economic and ecological damage which has happened on \JGuam\j.\p
The snake probably reached \JGuam\j in ship cargo from the New Guinea area, some 50 years ago. With no natural predators and no competition from other species, it has thrived on the island. Twelve bird species have disappeared from the island, several others are approaching \Jextinction\j, and lizards, small mammals and domestic poultry are under threat. Even forest trees are at risk, now that the pollinating birds have been wiped out. Native bat numbers are also down. Yet in spite of all this damage, it was only about 10 years ago that the snake was identified as the cause of the decline in bird numbers.\p
The snakes can reach three meters (10 feet) and 4.5 kilograms (10 pounds). They are only mildly venomous to humans, but they can seriously injure infants and small children. Since 1978, the snakes have caused more than 1,600 power outages. Happily, this annoying problem offers a useful set of data, because the pattern of outages matches seasonal increases in snake frequency and activity.\p
In the snake's original range, northern \JAustralia\j, the Solomon Islands, and New Guinea, the snake does not cause the same problems. The current theory is that introduced lizards on \JGuam\j may provide a diet for the young snakes, increasing the numbers that reach adulthood. Studies are now under way in the "home" area, in the hope of finding some method of population reduction -- capture methods, toxicants, or diseases -- with the fewest environmental risks.\p
Studies on \JGuam\j have revealed that the snake can move from areas with trees, along power lines and over mowed lawns. This sort of information is needed to guide the development of traps, barriers, and bait stations in natural, urban, and industrial situations. \p
One interesting question is how trapping efficiency can be assessed: without population density data, the effectiveness of traps, baits and lures is no better than a guesstimate.\p
Another interesting line of inquiry is the possibility of finding viruses or \Jbacteria\j which may kill the snakes, because they are sufficiently different from their prey that the same diseases are unlikely to affect birds and mammals to any significant extent. While careful testing would be needed, such a disease would be quick to reduce populations once it was released, so that we can expect zoo disease outbreaks in the \Jreptile\j houses of the world to draw interest from unusual quarters.\p
Poisons are also a possibility, although any possible chemical would need to be assessed as a risk to other non-target species or to the environment in general.\p
With more and more tourism in the Pacific, better methods of detecting snakes on \Jaircraft\j are needed. The use of sniffer dogs is one possibility, but there are still problems with odor masking, and the effectiveness of dogs in dealing with closed containers.\p
\BKey names:\b Thomas Fritts, Gordon Rodda, Michael McCoid\p
\BPostscript:\b in an e-mail in early April, Thomas Fritts told your reporter:\p
"Saipan is now thought to have a high probability of having an incipient snake population (snakes are likely to be present and reproducing, but in such low numbers or localized situations that they are difficult to detect, and hence control).\p
"In recent weeks we moved traps, vehicle and personnel to Saipan to work closely with biologists of the Commonwealth of the Northern Marianas Islands in an intensive effort to: 1. find traces of the snake population; 2. trap or interdict as many individuals as possible; and 3. learn what special problems we will have in trapping snakes on Saipan (as opposed to \JGuam\j where snakes are so abundant and some prey species are now quite rare).\p
"Operational trapping efforts by Wildlife Services (an Agency of the USDA) primarily focused on reducing risks of snake moving to other islands in ship and air traffic from \JGuam\j have resulted in the capture of more than 17,500 snakes in the 1994-98 period, but every snake that we may remove from the incipient population on Saipan as a result of our experimental trapping effort at present may represent 100 or 1,000 snakes that will not need to be captured a decade from now if the population expands as it did on \JGuam\j.\p
". . . clearly the benefits of documenting their presence and reducing their numbers at the earliest possible moment could slow the infestation or even make it possible for the population to die out rather than becoming another ecological disaster."\p
See also: \1acclimatization\c\p
#
"Whooping cranes take flight in Florida",1010,0,0,0
(Mar '99)
During March, eight young \1whooping crane\c chicks were released into the wild at the Kissimmee release site, not far from the metropolitan sprawl of Orlando. Here, they will find open prairie, brush, and marshlands where they may be able to establish themselves, if the bobcats let them. The birds, described as " . . . products of the \Jgenotype\j \IGrus americana,\i . . . are also the product of modern conservation science," and it is this science that is under scrutiny in this report.\p
The birds will, hopefully, enter the breeding population of wild birds, but bobcats are over-abundant in \JFlorida\j due to the decline of larger predators such as the \JFlorida\j \Jpanther\j, and may well take some of the birds: unless they survive six months in the wild, researchers will not count them as part of the population. They recognize that at least some of the birds will die, but they have done their best to shade the odds in favor of the cranes.\p
The migratory cranes reached a low of just 21 birds in the wild in 1941, and numbers in the wild stayed low for many years. At that point, a single storm could wipe out the entire group, as happened in 1940 to a small, non-migratory population of whooping cranes in Louisiana. There were also risks of inbreeding in the tiny population. The recovery program aims to establish a viable, non-migratory whooping crane population in \JFlorida\j, and a second migratory population that will winter in \JFlorida\j and breed somewhere in the north-central United States or southern Canada. Each task depends on the ability of biologists to breed whooping cranes in captivity, and raise chicks suitable for release into the wild.\p
In the wild, whooping cranes lay two eggs, but usually only one chick survives. For more than 30 years, biologists have exploited this by harvesting one egg from each nest and raising the chicks in captivity. This increased the numbers in zoos, and produced a captive breeding population, established at Patuxent in the 1970s, which now stands at 15 to 20 breeding pairs. But producing whooping crane chicks that could survive and breed in the wild required new physiological and behavioral conditioning techniques.\p
Two other captive breeding programs are now getting under way, one at the \JCalgary\j Zoo and one at the headquarters of the International Crane Foundation in \JWisconsin\j, and the Patuxent workers believe that there will be 50 cranes released each year after 2002. \p
Eggs have now been collected from all the known Canadian nesting sites, and genetic probes are used to identify how closely related individuals are, so that closely related birds do not mate, and to ensure that rare genetic lines are maintained in this critical phase. The eggs are raised in an incubator, and by the time they hatch, their genetic pedigree has identified them as either breeding stock or release stock.\p
Release stock must never be allowed to see humans, as the species is open to \1imprinting\c, and may grow up preferring human companionship to that of their own kind. They need to acquire a crane identity, so chicks must be fed and cared for by cranes - or as close an approximation as the recovery team can devise. Before they hatch, eggs are played tapes of the brood calls that parent cranes give to their hatching young. Then, crane-costumed scientists, using a hand puppet shaped like a crane head, do all the feeding of chicks. After a few days, chicks are placed in small pens adjacent to live whooping crane adults, which serve as "role models" for proper \Jimprinting\j.\p
The only time chicks see real humans is when there is an unpleasant experience coming their way, like a blood sampling - on those days, researchers leave the crane suit behind. So when the nine-month-old birds are ready for release in \JFlorida\j, they have formed some distinct opinions. They have also been given access to a small pond that has been stocked with insects, where they learn to probe for food. It is a long haul: most of the birds released in March will not enter the breeding population for about eight years, but things are looking up for the whooping cranes of \JFlorida\j.\p
\BKey names:\b George Gee, Tom Stehn\p
#
"Turning frying oil into diesel fuel",1011,0,0,0
(Mar '99)
Converting used frying oil to fuel ("biodiesel") is nothing new. This \1alternative \Jenergy\j\c source is attractive because it uses waste as its raw material, and produces a biodegradable fuel (handy in case of spills) which burns cleaner and produces fewer potential carcinogens. The conversion also saves the cost of hauling the oil to landfills or "yellow oil" markets, but existing biodiesel is more expensive than petrodiesel.\p
Now a new process has been developed which makes the conversion even more attractive. The current method involves a batch process, with a base and an acid being used, producing a volume of waste \Jwater\j three times that of the fuel, with low-grade \1glycerol\c as a by-product. The new method does away with the base and the acid, cutting the waste \Jwater\j production, and the \Jglycerol\j by-product is high-grade and saleable. Better still, the new process is a continuous one, relying on a catalyst.\p
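The report does not spell out the chemistry, but biodiesel is conventionally made by transesterification: a triglyceride reacts with methanol to give methyl esters (the fuel) and glycerol. A back-of-envelope sketch of the mass balance, modeling the oil as pure triolein (an assumption; used frying oil is a mixture, so the figures are indicative only):\p
```python
# Back-of-envelope mass balance for biodiesel transesterification:
#   triglyceride + 3 methanol -> 3 methyl esters (biodiesel) + glycerol
# The oil is modeled as pure triolein; real frying oil is a mixture.

M_TRIOLEIN = 885.4   # g/mol, C57H104O6
M_METHANOL = 32.04   # g/mol
M_ESTER    = 296.5   # g/mol, methyl oleate C19H36O2
M_GLYCEROL = 92.09   # g/mol

def per_tonne_of_oil(oil_kg=1000.0):
    moles_oil = oil_kg / M_TRIOLEIN          # kmol of triglyceride
    methanol  = moles_oil * 3 * M_METHANOL   # kg of methanol consumed
    biodiesel = moles_oil * 3 * M_ESTER      # kg of methyl esters produced
    glycerol  = moles_oil * M_GLYCEROL       # kg of glycerol by-product
    return methanol, biodiesel, glycerol

m, b, g = per_tonne_of_oil()
print(f"methanol in: {m:.0f} kg, biodiesel out: {b:.0f} kg, glycerol: {g:.0f} kg")
# roughly 109 kg methanol -> 1005 kg biodiesel + 104 kg glycerol per tonne of oil
```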
One interesting side-light: the burned fuel produces a smell rather like fried chicken, leading park rangers in \JYellowstone\j National Park to worry that if they used the fuel in the "Truck in the Park" tour buses, bears would chase the vehicles in the mistaken perception that they were chasing finger-licking-good meals on wheels. It appears that this will not be a problem, so long as nobody starts feeding the bears on fried chicken.\p
With the \Jglycerol\j sale factored in, biodiesel may now be able to sell for the same price as petrodiesel. The researchers have secured a provisional patent for the technology, but now they are looking for a new funding source.\p
\BKey names:\b Bob Fox and Dan Ginosar\p
#
"Prostate cancer and gene switching",1012,0,0,0
(Mar '99)
The start of March saw a fascinating report on cancer in the \1prostate \Jgland\j\c in \INature Medicine,\i suggesting that 90% of prostate cancers are caused by a process called gene switching. Previously, the cancers were thought to have been caused when a gene, called pp32, was somehow mutated, but now it appears to be a simple matter of the gene having been switched off in cancerous cells.\p
The implication is that if the gene has merely been switched off, this must have happened because of some kind of chemical trigger. That in turn means that there could be another trigger which turns the gene back on again. So instead of complicated gene therapy, the condition might be treated by a chemical, a drug, which restores the full health of cancerous cells by switching the pp32 gene back on again.\p
The key question to be answered is whether or not gene switching is reversible. The key question after that is whether gene switching is involved in other genetically controlled cancers like breast cancer, making these cancers equally open to attack with simple reverse switches.\p
There are certainly some cancers caused by mutations that run in families, such as colon cancers, but this will be an interesting field to watch in the next ten years or less -- one of the researchers thinks they may be able to produce reversing switches within two years.\p
Switching occurs when certain members of a family of genes are switched on while others in the family shut down. The process is common in the development of an embryo, but this seems to be the first time gene switching has been linked to cancer.\p
Studies using molecular probes (see \1DNA fingerprinting\c) have shown that the pp32 gene is "on" in normal cells, and generally switched off in cancer cells. The pp32 gene seems to act as a cancer suppressor, keeping cells from turning malignant, though closely similar genes such as pp32r1 and pp32r2 are present and active in about 90% of the prostate cancers they studied. While this looked like a \Jmutation\j at first, it now looks as though gene switching has happened. Researchers say they have now seen a similar-looking pattern in some breast cancers.\p
The discovery may also lead to a new test for prostate cancer diagnosis. Prostate specific \Jantigen\j (PSA) is the present common screen for prostate cancer, but it is not completely reliable in older men, so perhaps a test for the activity of pp32 may improve on this.\p
\BKey names:\b Shrihari S. Kadkol, Gary R. Pasternack, Jonathan R. Brody, Jining Bai, and Jonathan Pevsner\p
#
"Yet another treatment for MRSA",1013,0,0,0
(Mar '99)
Multiply resistant \IStaphylococcus aureus,\i or MRSA, and other resistant \Jbacteria\j are a serious problem when the germs begin to stand up against vancomycin, the "antibiotic of last resort", but a new drug, linezolid, may offer a solution to this sort of \IStaphylococcus\i infection.\p
Linezolid is one of a new class of antibacterial drugs called oxazolidinones that are chemically unrelated to currently used \Jantibiotics\j. These chemicals are strong attackers of gram-positive \Jbacteria\j, including enterococci that are resistant to vancomycin.\p
\IEnterococcus faecium\i is highly resistant to vancomycin as well as other \1antibiotics\c, leaving patients who are infected with the bacterium few choices. A report in the March issue of the journal \IClinical Infectious Disease\i indicates that linezolid offers hope to such patients. The case involved a patient who had undergone \Jchemotherapy\j for lymphoma, so that she had a decreased level of neutrophils, cells that play an important role in fighting infection. \p
She then developed a vancomycin-resistant enterococcal (VRE) bloodstream infection that was resistant to all the standard treatments, and so she was given linezolid, which was administered under a "compassionate use" exemption, in combination with gentamicin, another antibiotic. \p
Blood cultures tested positive for VRE in the four days before treatment and on the first \Jday\j of treatment, and negative after that time.\p
Gary Noskin, who led the study, says that in the absence of this medicine, the patient would have died from her infection. But while this work confirms his expectations, he cautions that more work will be needed on a larger sample of patients to confirm that the drug works. Until then, the emphasis needs to remain on preventing VRE infections in the first place.\p
\BKey names:\b Gary A. Noskin, Lance R. Peterson, Farida Siddiqui, Valentina Stosor, and Julie Kruzynski. The linezolid, a new investigational drug, was supplied through a compassionate use program by Pharmacia and Upjohn, Kalamazoo, Mich. \p
#
"An arm and a leg",1014,0,0,0
(Mar '99)
What makes the difference between a forelimb and a hindlimb in animals? Clearly, birds' wings and legs are based on a standard plan with differences, and just as clearly, our arms and legs are linked by a common plan.\p
Researchers at Harvard Medical School revealed in \IScience\i in mid-March that they have found the molecular instructions that control the differences in chickens, though their discovery can be applied also to mice and humans. The researchers have identified a \1gene\c that can partially transform the upper limb of a vertebrate into a structure that resembles its lower limb.\p
The procedure was comparatively simple: they took a gene that is normally only active in legs and transferred it to the forming wings of chick embryos. The structures which resulted looked in many ways more like a leg than a wing: feathers were gone, claws appeared at the ends of the digits, and leg-specific muscles were clearly identifiable.\p
The point to keep in mind here is that the same signals for limb patterning are present in all four "limb buds," the small outgrowths on the sides of an embryo's trunk that represent the beginnings of the future limbs. So if these all have the same instructions, asked the experimenters, why do wings and legs end up different? The answer had to be in the genes that operated in the different limb buds.\p
The gene they used, Pitx1, is one of three genes believed to shape the limbs. The others, Tbx4 and Tbx5, are members of the same gene family and are active only in the leg or only in the wing, respectively. With two genes found in the leg, the researchers decided that Pitx1 appeared first, and probably switched on Tbx4, so they chose to concentrate on that gene.\p
They used a virus to deliver the Pitx1 gene into the cells of the future wings of chick embryos, and found that Tbx4 was activated, proving that they had guessed right. Pitx1 also activated two other genes, HoxC10 and HoxC11, which are normally found only in legs later in development. At the same time, delivering Pitx1 to the wing did not affect Tbx5, the gene that is normally present only in the wing. \p
So wing limb buds dosed with Pitx1 carried Tbx4, which brings about leg-like characteristics, while still carrying Tbx5, which causes wing-like characteristics, explaining why the altered limbs were only partly transformed. The "wings" were still dramatically altered by the treatment. Normal wings have just three digits of different size, while the Pitx1 wings had digits that were much the same size, and sometimes there were four, as with the leg. As well, many of the "wings" had claws at the ends of their digits, and muscle patterns were also transformed. \p
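The combinatorial logic described above is simple enough to caricature in a few lines of code. This toy model is only a sketch of the argument, not the authors' analysis: Pitx1 switches on Tbx4, Tbx4 promotes leg characteristics, Tbx5 promotes wing characteristics, and a bud carrying both develops as a partial transformation:\p
```python
# Toy model of the limb-identity logic described above -- a caricature
# of the argument, not the authors' analysis.

def limb_identity(has_pitx1, has_tbx5):
    has_tbx4 = has_pitx1   # Pitx1 switches on Tbx4
    wing = has_tbx5        # Tbx5 promotes wing characteristics
    leg = has_tbx4         # Tbx4 promotes leg characteristics
    if wing and leg:
        return "partially transformed wing (claws, leg-like digits, leg muscles)"
    return "wing" if wing else "leg" if leg else "undetermined"

print(limb_identity(has_pitx1=False, has_tbx5=True))   # normal wing bud
print(limb_identity(has_pitx1=True,  has_tbx5=False))  # normal leg bud
print(limb_identity(has_pitx1=True,  has_tbx5=True))   # the experiment: Pitx1 delivered to a wing bud
```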
Looking beyond chickens, all three of the limb-identity genes are present in mice, and are similarly restricted to the forelimbs or hindlimbs. In humans, mutations in the Tbx5 gene are associated with Holt-Oram syndrome, which results in malformed, often shortened, forelimbs.\p
In an interview released on the \JInternet\j, Clifford Tabin commented that this was just the beginning of yet another stage in our struggle to find out how the shape of an organism is controlled by its genes: "For example, what cellular properties are changed by Pitx1 such that the limb tissue condenses to form four digits instead of the three in the absence of Pitx1?" he said. "This is the ultimate goal of the field, to understand how developmental control genes actually regulate morphogenesis." \p
The world's tabloid press went madly out of control over this story, with the word "\JFrankenstein\j" appearing in many headlines, or with silly slants about chickens with four drumsticks. In fact, all we had was a single chicken, with one wing that showed some leg characteristics, showing that we are close to identifying the gene which makes a limb bud turn into a leg.\p
\BKey names:\b Malcolm Logan and Clifford Tabin\p
#
"How a virus reproduces",1015,0,0,0
(Mar '99)
A paper published in the March 1999 issue of \INature Structural \JBiology\j\i outlines the structure of the pseudoknot, revealing the three-dimensional basis for a clever replication technique used by viruses. The method allows a \1virus\c, especially the retroviruses, to take control of cells it has invaded, so it can direct the manufacture of the proteins it needs for its own survival. This knowledge may well lead to new methods of attacking virus infections.\p
The virus is able to create an RNA structure called a pseudoknot, which allows it to control genetic material for its own purposes. The virus does this through a process called ribosomal frame-shifting, but until now, scientists have not been able to explain the process because they did not know the detailed three-dimensional structure of the pseudoknot, which gets its name from the fact that the RNA is not truly knotted, but just tightly bound together and twisted.\p
A pseudoknot forms when two stem-loop "hairpins" of \1RNA\c line up. The hairpin structure is common in RNA. The hairpin form is stable, and it has important folding and protein-recognition properties. When the loops of two hairpins are linked by base-pairing, this is a pseudoknot.\p
The pseudoknot controls a process called frame-shifting, which allows a virus to pack several messages into the same piece of RNA. The \1genetic code\c is made of sets of three bases, so there are three possible points at which "reading" can begin. Usually, two of the three "frames" generate nothing but nonsense, but in a few viruses, more than one reading is possible. As a simple example, consider the words BAD-DON-SAY-EEL-MUD. With a frame-shift, we might extract ADD-ONS-AYE-ELM from the same string.\p
It is hard to give a convincing example, because few combinations of letters work, but most of the 64 possible combinations of bases have a meaning, and in the viruses, where it is important to pack as much information as possible into a small space, \Jevolution\j has produced working frame-shifts in the RNA of viruses. And the trigger for frame-shifting comes from the structure of the pseudoknot.\p
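The letter game above is easy to mechanize. A minimal sketch that reads a string of letters in each of its three possible frames, using the toy message from the text:\p
```python
# Read a "message" in triplets from each of the three possible reading
# frames, using the toy example from the text.

def read_frames(message):
    for shift in range(3):
        codons = [message[i:i + 3] for i in range(shift, len(message) - 2, 3)]
        print(f"frame {shift}: {'-'.join(codons)}")

read_frames("BADDONSAYEELMUD")
# frame 0: BAD-DON-SAY-EEL-MUD
# frame 1: ADD-ONS-AYE-ELM
# frame 2: DDO-NSA-YEE-LMU
```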
Many viruses, including most retroviruses (the most famous \1retrovirus\c is the one responsible for \1AIDS\c), need to synthesize two different proteins. Typically, the first \1protein\c is a structural protein, involved in building the virus, and the second is an \Jenzyme\j, usually a polymerase, used in replicating the virus' nucleic acid, or genetic building blocks.\p
The virus needs many copies of the structural protein and a smaller number of the polymerase proteins, so the virus developed a novel system for regulating the production of these two proteins, involving the use of ribosomal frame-shifting. \p
The new paper describes work which required the researchers to crystallize the RNA pseudoknot formed by the beet western yellow virus. Once that was done, they could work out the pseudoknot's three-dimensional structure.\p
This structure revealed many unusual features, they say, and they have now concluded that the decision to frame-shift or not to frame-shift depends on whether the pseudoknot unravels when it collides with the ribosome.\p
In this case, the beet western yellow virus RNA is able to code for both a polymerase, and for a structural protein. If the pseudoknot does not unravel, the ribosome can slide back one \Jnucleotide\j and then make a fusion protein, involving both the structural protein and the polymerase. On the other hand, if the pseudoknot does unravel, then only the structural protein is made, and not the polymerase. \p
Ribosomal frame-shifting is also used by the AIDS virus, and the researchers hope this work will shed light on the ways in which viruses replicate. And that knowledge, they believe, could lead to the development of new anti-viral agents.\p
#
"New gene may help scientists understand more about how the body grows",1016,0,0,0
(Mar '99)
A report published in the April issue of the journal \IDevelopment\i (which was released during March) describes an interesting new \1gene\c found in frogs. The gene, called derrière, plays a key role in the development of the frog embryo from the neck down, including the neural tube and the muscles around the spinal cord. \p
The authors conclude that derrière controls the formation of the posterior regions of the embryo, that is, the entire body from the neck down. The evidence for this: the embryos that do not have a functional derrière gene developed normal heads, but only had disorganized tissue where the trunk and tail should have been.\p
The derrière gene is a new member of a large family of genes called TGF-beta. This family plays an important role in many biological functions, including development and cancer. \p
Scientists were surprised to find a member of this familiar gene family performing an unfamiliar function, inducing normal posterior development in very early embryos. They are also quite excited by the find because it represents a new window into aspects of development which are still only poorly understood.\p
The gene appears to be needed to form the tissues which give rise to muscle, so it may also turn out to be important in work aimed at regenerating muscle in wasting diseases.\p
The action of the gene was identified using a yeast-based assay, which can detect proteins in the very early stages of development - at a time when a human mother, for example, might not even be aware that she was pregnant. The researchers were looking for new signaling proteins that control formation of the embryo, and when they found the derrière gene active in regions known to be involved in forming the posterior of the embryo, they guessed that it would prove to be important for posterior development. \p
The researchers then carried out two kinds of experiment: "gain-of-function" and "loss-of-function" studies. In the gain-of-function experiments, they tested to see what would happen if derrière is activated in the head region of the embryo, where it is not normally active. Then in the loss-of-function experiments, they looked to see what happens when derrière function is removed from the embryo altogether. \p
When the derrière gene is activated in the future head region of the embryo, it prevents the head from forming normally. When derrière was activated in the belly region, extra muscle and nerve tissue formed where the belly should have been.\p
When researchers prevented the derrière gene from functioning, they found that embryos developed with normal heads but with only disorganized tissue from the neck down: no muscle, no obvious neural tube, and no tail. \p
It is likely that derrière works with other key proteins to establish the formation of the body from the neck down, and the search is now on to identify these proteins. At the same time, researchers will be looking at zebrafish to see if the gene is present there as well, because if it is, then the gene is probably also present in humans.\p
Frogs and zebrafish, of course, are ideal animals for exploring the change from a single-celled egg into a complex, multicellular organism. While the development process in these animals is surprisingly similar to that in human beings, it occurs much faster. And even more importantly, frog and zebrafish embryos are easy to obtain because they grow outside the mother, either in a still pond, or in the laboratory in a Petri dish. \p
\BKey names:\b Benjamin I. \JSun\j, Sara M. Bush, Lisa A. Collins-Racie, Edward R. LaVallie, Elizabeth A. DiBlasio-Smith, Neil M. Wolfman, John M. McCoy, Hazel Sive.\p
#
"Fungal disease threatens global banana production",1017,0,0,0
(Mar '99)
Black Sigatoka, a fungal disease that attacks the leaves of bananas, is spreading through banana production regions of the world. Bananas are a staple food and a major export product in much of the developing world - they are the world's fourth most valuable food crop, so this disease is a serious problem for developing world economies. It is also a problem in the developed world, with recent outbreaks in \JFlorida\j reported during March.\p
Aside from the trade worries, only 10% of the annual banana crop of 86 million tonnes is ever exported anywhere: the rest is eaten as a subsistence crop by growers, or consumed locally. In some parts of the world, bananas are the main source of carbohydrates, fiber, vitamins A, B6 and C, as well as \Jpotassium\j, \Jphosphorus\j and \Jcalcium\j. Modern bananas are mainly hybrids of two species, \IMusa acuminata,\i and \IMusa balbisiana,\i with the heritage of a \Jcultivar\j indicated by the letters A and B for the two original species.\p
Black Sigatoka was first recognized in the Sigatoka Valley of \JFiji\j in 1963, although it was probably common in other areas by then, and it appeared in \JHonduras\j in 1972. It is now common in Asia, the Americas, and also in parts of \JAfrica\j. The disease is caused by an ascomycete \Jfungus\j, \IMycosphaerella fijiensis,\i and it has a wider range than "Yellow Sigatoka," which it seems to be displacing. (Black Sigatoka attacks the ABB cooking bananas which are not usually affected by yellow Sigatoka. It also attacks AAA and AAB varieties.)\p
Chemical controls are possible, but expensive, increasing the cost of export bananas by as much as a third. These costs make fungicides a non-option in many areas, and especially where the banana is a subsistence crop, with spraying costs as high as US$1,000 per hectare per year. As well, the \Jfungus\j is able to develop resistance or tolerance to most systemic fungicides, so the solution is probably to breed bananas that are resistant to the disease, which thrives under the warm, wet conditions that are found in the tropics.\p
Under these conditions, the \Jfungus\j severely damages banana leaves, reducing a plant's ability to capture the \Jsun\j's \Jenergy\j. This damage can reduce fruit production by as much as 50%, but Black Sigatoka also causes premature ripening, which is a serious defect in exported fruit.\p
The disease can spread over long distances when banana leaves are used as packing materials (a common practice in developing countries), or when infected planting material is transported from one place to another.\p
The most common export \Jcultivar\j, Cavendish (AAA), is extremely susceptible to attack. Resistant cultivars, suitable for subsistence farmers, have lower yields, making them less economically valuable, so new hybrids have been produced, and clones are now being distributed. The new hybrids are not quite so attractive, and have a shorter post-harvest shelf-life. In the long run, this may be what consumers have to accept, unless they are willing to pay a large premium for their bananas.\p
\BKey name:\b Randy Ploetz\p
#
"NASA looks at extremophiles",1018,0,0,0
(Mar '99)
NASA has begun a program for schools, called "Life on the Edge," looking at extremophiles, organisms which live in the most extreme conditions, and which seem to flourish there.\p
The idea is to expose children to some of the basic principles of astrobiology and to explore the possibilities for life elsewhere in the \Jsolar system\j. For example, a package of yeast and \Jbacteria\j has been left for several months to a year at the top of \JCalifornia\j's White Mountains, and samples have also been taken from that area in the hope of capturing extremophiles native to the area.\p
Children will test the responses of microbes to extreme environments, and see how well they grow under a variety of conditions. The procedures have been clearly designed to make incidental learning easy, even for very young children.\p
The program is directed at grade-school children. Their registration site at \Bhttp://www.startrails.com/join.html\b allows overseas schools to enter.\p
At the moment, the program is only a couple of months old, but it shows every sign of offering a major boost to children's joy in doing science.\p
See also: \BSome like it hot,\b February 1998, \BLawsuit targets \JYellowstone\j bioprospectors,\b March 1998, \BArchaeans fix DNA better,\b May 1998, \BChainsaw-equipped robot goes after smokers,\b July 1998.\p
#
"Bringing Mars into the Iron Age",1019,0,0,0
(Mar '99)
The standard biological symbol for "male" is a circle with an upward arrow. The symbol, derived from the spear and shield of Mars, the God of War, is also the astrologer's symbol for the \Jplanet\j Mars, and the alchemists' symbol for iron. By an odd coincidence, the \Jplanet\j has a soil that is between 5% and 14% iron oxide, making it almost ore-grade material, and this iron oxide may be the key to humanity opening up the rest of the \Jsolar system\j.\p
The secret lies in a metal-making process known to the ancient Romans, which could be used to make receivers, able to generate electricity from radio waves beamed from a mother ship in Mars \Jorbit\j. While NASA has no program to put humans on Mars at the moment, it is looking at developing technologies and mission concepts for such a possibility. This notion would allow non-nuclear power options for Mars bases and exploration.\p
The savings lie in transferring the manufacturing work to the moon or Mars, so that the overall cost of rocket fuel is reduced far below what would be needed to transport the same materials from \JEarth\j.\p
The ideas under consideration include the Solar Clipper. This combines an ion drive with large solar cell arrays proposed in new NASA studies of \Jsolar power\j satellites. The unmanned Solar Clipper would boost itself to a high \JEarth\j \Jorbit\j over the period of a year, then, after the crew arrives on a small, faster craft, the Solar Clipper would use a chemical rocket to make the final boost to Mars.\p
Because Mars is further away from the \Jsun\j than \JEarth\j, a Solar Clipper would need to be larger at the \Jorbit\j of Mars to give the same power as a similar device orbiting \JEarth\j. On the surface of Mars it would need to be even bigger, due to \Jenergy\j losses through the thin \Jatmosphere\j of Mars. When people started realizing that they were talking about an area the size of several football fields, backed by fuel cells for cloudy days and night-time, the option of receiving power from space would become tempting again.\p
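The size penalty can be put in rough numbers with the inverse-square law, assuming Mars orbits at about 1.52 times Earth's distance from the sun (a standard figure, not taken from the original report):\p
```python
# Inverse-square estimate of how much bigger a solar array must be at
# Mars to collect the same power as one near Earth. Atmospheric losses
# on the Martian surface would push the factor higher still.

MARS_DISTANCE_AU = 1.52   # mean distance of Mars from the sun, in AU

relative_flux = 1 / MARS_DISTANCE_AU**2
area_factor = 1 / relative_flux

print(f"solar flux at Mars: {relative_flux:.0%} of Earth's")          # about 43%
print(f"array area needed:  {area_factor:.1f}x an Earth-orbit array")  # about 2.3x
```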
Another scheme involves an unmanned propellant factory. This craft would refine the Martian carbon dioxide \Jatmosphere\j and mix it with \Jhydrogen\j (brought along or electrolyzed from \Jwater\j ice in the polar caps) to make liquid oxygen and \1methane\c to power the crew's return from Mars. This factory would get its power from \Jmicrowaves\j beamed at a rectifying antenna, or rectenna, which converts them directly into direct current.\p
The problem: you need metal, and lots of it, to make the dipole antennas that gather the \Jenergy\j. While iron is too dense a metal to get much use in space, Mars has only 38% of \JEarth\j's gravitational pull, and hardly any oxygen to rust the iron when it is made. And better still, using \Jwater\j to convert carbon dioxide to \Jmethane\j also yields carbon monoxide, a reducing agent that reacts with rust to produce carbon dioxide and free iron. This is the iron-making process that was known to, and used by, the ancient Romans -- though they made their carbon monoxide rather differently.\p
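The Roman-era chemistry amounts to reducing iron oxide with carbon monoxide. A sketch of the stoichiometry, and of what the 5-14% soil grades quoted earlier would imply, treating the oxide as haematite (an assumption; Martian soil mineralogy is more complicated than that):\p
```python
# Reduction of iron oxide with carbon monoxide, as in Roman-era smelting:
#   Fe2O3 + 3 CO -> 2 Fe + 3 CO2
# Treats the Martian iron oxide as haematite (Fe2O3) -- an assumption.

M_FE2O3 = 159.69                      # g/mol
M_FE = 55.845                         # g/mol
IRON_FRACTION = 2 * M_FE / M_FE2O3    # mass fraction of iron in Fe2O3, ~0.70

def iron_per_tonne_of_soil(oxide_fraction):
    """kg of iron recoverable from one tonne of soil at a given Fe2O3 grade."""
    return 1000.0 * oxide_fraction * IRON_FRACTION

for grade in (0.05, 0.14):   # the 5-14% iron oxide range quoted above
    print(f"{grade:.0%} Fe2O3 soil -> about {iron_per_tonne_of_soil(grade):.0f} kg iron per tonne")
# roughly 35-98 kg of iron per tonne of soil
```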
So given a small start-up supply of rectenna materials, the reactor could process Martian soil and air into rocket propellant and more rectenna parts. The Martian iron would be rolled into strips to make the dipoles, or formed into wire to make a mesh reflector, while the waste, or slag, would be used to make insulator strips going between the \1dipole\c pairs.\p
Solar Clipper would need to be in an \Jorbit\j 17,023 km above Mars, placing it in an areosynchronous \Jorbit\j -- the Martian equivalent of a \1geosynchronous \JEarth\j \Jorbit\j\c. This would allow it to deliver continuous power to the ground-station.\p
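That 17,023 km figure can be checked with Kepler's third law: a synchronous orbit has radius a = (GMT^2/4pi^2)^(1/3). A quick sketch, using standard values for Mars' gravitational parameter and sidereal day (not taken from the article):\p
```python
# Check on the areosynchronous altitude via Kepler's third law:
#   a = (GM * T^2 / (4 * pi^2)) ** (1/3)
# GM, the sidereal day, and the radius of Mars are standard values.
import math

GM_MARS = 4.2828e13        # m^3/s^2, gravitational parameter of Mars
SIDEREAL_DAY = 88_642.0    # s, one Martian sidereal rotation (~24.62 h)
RADIUS_MARS = 3.3895e6     # m, mean radius of Mars

a = (GM_MARS * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - RADIUS_MARS) / 1000

print(f"areosynchronous altitude: about {altitude_km:,.0f} km")
# about 17,000 km, matching the figure quoted above
```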
On current plans, the Mars lander would carry enough materials to make a 1.5 km-wide rectenna that would provide 150 kilowatts of electricity to power the refinery. It would then grow to 20 km in diameter, enough to provide 7 megawatts, while growth beyond that point would only be limited by shipments of small, lightweight \Jrectifier\j circuits from \JEarth\j.\p
Where solar arrays on the \Jplanet\j would need regular cleaning and dusting, the space arrays would not have this problem, and tiny dust particles would have no effect on radio waves. The rectenna would work year-round, night and \Jday\j, with the exception of short eclipses during the spring and autumn equinoxes on Mars.\p
\BFootnote 1:\b did the astrologers and alchemists know something with their choice of symbols? Probably not, as they used the looking glass of Venus as the symbol for that \Jplanet\j and for copper, but there is no evidence that Venus is rich in copper. The red color of the \Jplanet\j Mars would, of course, have hinted at the high "rust" content.\p
\BFootnote 2:\b Areo- words. An areologist is the Martian equivalent of a geologist, the name deriving from the Greek god of war, Ares. While the term has only recently become common, "areology" has been in use since 1881. (The \Jscience fiction\j works of Kim Stanley Robinson are a source where words such as areosynchronous, areobotany, and areophyte can be encountered.)\p
\BKey name:\b Peter Curreri\p
#
"Fastrac full-engine, hot-fire test successful",1020,0,0,0
(Mar '99)
NASA's Fastrac rocket engine began full-engine, hot-fire testing in the middle of March, with a 20-second, full-power test. The Fastrac is a 60,000-pound-thrust engine that will be used for the first powered flight of NASA's X-34 technology demonstrator. The engine is expected to dramatically reduce the cost of launch systems for space transportation. Up to 85 full-engine tests are scheduled for this year.\p
According to NASA, the Fastrac engine is less expensive than similar engines because of an innovative design approach that uses commercial, off-the-shelf parts and fewer of them. Fastrac uses common manufacturing methods, so building the engine is relatively easy and not as labor-intensive as manufacturing typical rocket engines.\p
The Fastrac engine operates with a single turbopump, which includes only two pumps - one for kerosene and one for liquid oxygen - and it uses a gas generator cycle, which burns a small amount of kerosene and oxygen to provide gas to drive the \Jturbine\j, and then exhausts the spent fuel.\p
\BKey names:\b Danny Davis, Summa Technology Inc., Allied Signal Inc., Marotta Scientific Controls Inc., Barber-Nichols Inc., and Thiokol Propulsion.\p
#
"A map for all seasons",1021,0,0,0
(Mar '99)
Need a map of some part of the world or other? No longer a problem -- so long as you have a recent Web browser to run some of the Java components required to gather maps from a new interactive Web site created at Cornell University in the USA. The only drawback: these tend to be maps with a strong geological bias, but then, for some of us, this is the best sort of map in any case.\p
Now anybody with a suitable browser can print out maps that show the major geographic features of a region, along with such information as the location of \Jearthquake\j faults, a record of \Jearthquake\j occurrences and technical data about the events. The system draws on a major \Jdatabase\j, also located at Cornell.\p
The site is located at \Bhttp://atlas.geo.cornell.edu\b\p
\BKey names:\b Muawia Barazangi, Dogan Seber\p
#
"Drilling a million years of history",1022,0,0,0
(Mar '99)
In mid-March, the Hawaii Scientific Drilling Project began in Hilo, \1Hawaii\c. This involves a plan to bore 15,000 feet, almost three miles or about 4.6 kilometers, into the volcanic history of the island. This should provide a continuous sequence from the \Jlava\j flows which formed \1Mauna Kea\c \Jvolcano\j, the famed "white mountain" that rises nearly 14,000 feet above sea level on the Big Island of Hawaii. \p
Ten years in the planning, the drilling program will go as deep -- and as far back -- as technology and available funding will allow. The data collected should provide an insight into the origin of Hawaiian volcanism, a continuous history of \JEarth\j's magnetic field and even into \Jgroundwater\j movements deep within the volcanoes. The main issues will be the chemical compositions of the layers, the isotope ratios in the layers, and any magnetic information buried in the rocks.\p
The drilling site is in an abandoned quarry located near the Hilo airport, between the Mauna Kea and \1Mauna Loa\c rift zones. This is important to reduce the chances of drilling into intrusive lavas or hydrothermal alteration of the subsurface rocks, which might make interpretation more difficult.\p
The first drilling program will take about six months, and hopefully will reach about 2.5 km (8,000 feet). A second program in about three years should complete the drilling task. Preliminary drilling suggests that the base of the completed hole could yield rocks up to a million years old, when Mauna Kea was even younger than Loihi, the submarine \Jvolcano\j forming right now off the Big Island of Hawaii.\p
The drilling program should provide some new knowledge about mantle plumes. Most volcanic activity happens at the edges of tectonic plates, either at the mid-ocean ridges, where plates split apart, or in the Pacific "Ring of Fire" and related features, where plates come together. \p
Hawaii is in the middle of one of the \Jplanet\j's largest plates, and the volcanoes which formed the islands are thought to arise from mantle plumes, jets of very hot, solid rock material that rise through the interior of the \Jearth\j from a depth of almost 3,000 kilometers. On current theory, the plumes melt when they get within about 100 km of \JEarth\j's surface, and this melting produces the \Jmagma\j that is erupted from the volcanoes. \p
#
"Geophysicists propose a new model of earth's mantle",1023,0,0,0
(Mar '99)
A report in \IScience\i in mid-March gives us a new view of the \JEarth\j's mantle. The mantle, nearly 2,900 km (about 1,800 miles) thick and making up 83% of the \Jplanet\j's volume, is solid rock; but heated by its own \Jradioactivity\j and by heat welling up from the core, it circulates slowly, driving the surface motion of tectonic plates, which in turn builds mountains and causes earthquakes.\p
The mantle divides into two parts: the upper mantle reaching from the surface to around 660 km (400 miles) down, and the lower mantle, from there to 2,880 km (1,800 miles), and the question has been: how does the mantle circulate - in one layer or two? \p
Recent studies by seismologists have suggested that tectonic plates sink from the \JEarth\j's surface deep into the lower mantle, supporting the view that the mantle overturns as a single layer. But at the same time, other evidence has pointed to two layers.\p
The \JEarth\j's heat budget and geochemists' studies of the chemical composition of rocks which have erupted at volcanic islands tell us that large sections of the mantle have been isolated from the surface since the \JEarth\j formed, suggesting that there must be two layers in the mantle. The volcanic islands, formed over plumes that are believed to come from deep inside the \JEarth\j, are likely to be our best samples of the lower mantle that we will ever see, and there are subtle differences in the \Jgeochemistry\j of these rocks.\p
The new model has two mantle layers. The tectonic plates do dive very deep into the mantle, they say, but the plates run into a geological barrier at about two-thirds of the distance to the outer core. When the sinking plate reaches this barrier, it is deflected and largely prevented from mixing with the deep-mantle material. At a few places, hot plumes rise up, pulling a bit of deep-mantle material along with the ancient slab to the surface, erupting in volcanic islands, and explaining the geochemical evidence.\p
Somewhere around 1,700 km (1,100 miles) down, the new mantle model has shifts in density, due to increased quantities of iron and silicon but partially offset by skyrocketing temperatures, and these shifts produce the barrier. This new "hybrid convection model" fits the seismic records, and is consistent with the heat budget and geochemical evidence. It assumes, however, that elusive "reservoirs" of high radioactive heat production and distinctive chemical composition reside in the bottom 1,000 kilometers of the mantle.\p
\BKey names:\b Louise Kellogg, Brad Hager and Rob van der Hilst\p
See also: \BMapping \Jfossil\j hot spots,\b May 1997, \BHot spots in the news,\b February 1998, \BThe African landscape shaped by single \Jmagma\j plume,\b October 1998, \BFinding a hot spot,\b November 1998.\p
#
"A satellite with a view",1024,0,0,0
(Mar '99)
On August 18, 1960, the first film canister of aerial shots taken over Eurasia was successfully collected by an \Jaircraft\j, starting the CORONA project which ran until May, 1972, providing 800,000 high-resolution shots of areas of land to which the USA was denied access during the Cold War. By the end of the program, most of the rest of the world had been photographed as well.\p
Now information from the top-secret program has been declassified, revealing some of the ways in which modern \Jearth\j sciences have their roots in the secret parts of the Cold War. While the material was released in 1995, it was only in March that the details became widely known, when two geographers, Keith Clarke and John \JCloud\j, told the 95th annual meeting of the Association of American Geographers in \JHonolulu\j of what they had found in the \Jarchives\j.\p
Apart from anything else, the CORONA shots provide higher resolution material than the Landsat photographs, taken more than a decade later, and they extend the time-line back into the 1960s.\p
Their work also provides some useful hints for writers of espionage tales. The satellites were supposedly carrying science experiments with mice and frogs, launched into \Jorbit\j and returned to \JEarth\j in capsules which were snatched in mid-air by crews from the Air Force's 6594th Test Group, based at Hickam Air Force Base in \JHonolulu\j. \p
The whole project was rated as top secret, but while no details were to be released to civilians, before long, many civilian agencies had their own top-secret labs to process data from CORONA, including the US Geological Survey, the Environmental Protection Agency, and the US Forest Service. The data were used to investigate environmental problems. The two geographers believe that the success of these efforts both assisted and impaired the subsequent civilian \1remote sensing\c \Jsatellite\j programs like Landsat. \p
#
"Why deep-sea vents glow",1025,0,0,0
(Mar '99)
The mysterious ghostly glow that lights up deep-sea \1hydrothermal vents\c may be explained by a simple \Jchemical reaction\j, according to a \INew Scientist\i report in March. The faint light around hydrothermal vents in the ocean floor was first noticed 11 years ago, and most of the light is just thermal radiation from the hydrothermal vent. There is some shorter wavelength light, of an orange-yellow color, which cannot be explained by heat effects.\p
Past explanations have ranged from the tiny flashes that occur as minerals crystallize, to the collapse of bubbles under the pressure of the deep ocean, but these are less than convincing.\p
Now two researchers think they have stumbled on the answer while they were simulating the conditions at the vents, where hot \Jwater\j, rich in dissolved sulfides, reacts with the oxygenated seawater around the vent. \p
When a strong reducing agent such as a sulfide ion reacts with an oxidant, free radicals may be formed, and these free radicals are hazardous to life. The researchers wanted to know if radicals form in the vents. \p
It turns out that the reaction is faintly luminescent, and while the light is too feeble for its spectrum to be measured, it can be detected with a photomultiplier sensitive to visible light. If the glow is indeed caused by chemiluminescence from sulfide \Joxidation\j, this would explain a puzzle -- why the glow at vents sometimes seems strongest 10 centimeters or so away from the vent openings. This would be the region where the sulfide-rich \Jwater\j collides with the oxygenated seawater. \p
One interesting aspect: some biologists speculate that the eerie glow of the deep-sea vents may have driven the first steps in the \Jevolution\j of \Jphotosynthesis\j.\p
Interestingly, a research paper published in \INature\i earlier in March, and reported in \INew Scientist\i a week before that, describes damage to the "thoracic eyes" of blind shrimp around some of the vents. It seems that submersible craft, fitted with floodlights, may be causing damage to the light-sensitive tissue under the shell of some shrimp species.\p
So far, nobody has a good explanation of what the "eyes" do, far below the photic zone where the \Jsun\j can be seen. One speculation is that the shrimp use the "eyes" to detect the scalding hot \Jwater\j of the vents, in which case, the damage done by floodlights could be a threat to the species.\p
\BKey names:\b David Tapley and Malcolm Shick \p
#
"A new bioluminescent octopus",1026,0,0,0
(Mar '99)
An \1octopus\c from the deep ocean, reported in \INature\i in mid-March, is providing new insight into how animals evolve the ability to make light. Bioluminescence is common in \Jsquid\j and \Jcuttlefish\j, but it is extremely rare among octopods, where it has only previously been seen as a glowing ring around the mouths of breeding females.\p
In the news is \IStauroteuthis syrtensis,\i newsworthy because of where the light comes from: the suckers. A live specimen was captured with a submersible, but when it was placed in an aquarium, researchers noticed that its suckers did not stick to anything. And when they turned out the lights, they discovered bright blue light where the suckers should be.\p
Dissection of the suckers on a second specimen showed that while the "suckers" still had sucker-like traits, many of the muscles, which are a prominent feature of suckers, were replaced by light-producing cells. The transition makes sense in an animal that has moved from the shallow sea, where there are rocks to grasp, into the deep ocean, where there are no rocks, and where there is no light.\p
Shallow \Jwater\j species of \Joctopus\j display their suckers for visual signaling, but with no light, only glowing suckers would do, and since the suckers' only remaining value would have been for communication, this would account for the loss of suckering power.\p
The glowing suckers have both a reflective layer at the back of the suckers and light-producing cells. The octopuses appear to live mainly on copepods, tiny crustaceans, and it seems likely that the glowing suckers serve as a lure to attract the copepods, because the diet otherwise makes about as much sense as a domestic cat living on mosquitoes.\p
\BKey names:\b Edie Widder and Sönke Johnsen\p
#
"Flatworms, symmetry, and us",1027,0,0,0
(Mar '99)
Bilateral symmetry originated in some unusual type of \1flatworm\c, according to a report in \IScience\i in mid-March. And what is more, the animals with left and right sides may have arisen before the "Cambrian explosion," pushing the origin of animals back further (see \BQuestioning the Cambrian explosion,\b January 1999). At the same time, the report shows that the animals in question belong in a \Jphylum\j of their own: the Acoelans may look like flatworms, but they are an entirely separate group.\p
Most researchers agree that the first multicellular animals had circular shapes that were "radially symmetrical," just like the \Jjellyfish\j, or sea anemones of today. More complex animals like earthworms or humans could only develop when they became bilateral, having a right and left side.\p
A team of researchers from \JSpain\j and the UK believes that the Acoela, a curiously primitive group of flatworms, are the oldest living ancestors to all "bilateral" animals. Or to be more scientifically precise, today's Acoela are the largely unchanged descendants of the animals that were those ancestors, and their descendants include the rest of the flatworms.\p
Bilateralism was essential to many major developments in \Jevolution\j, and most of the features that allow an animal to move itself from one place to another, such as legs, fins, and wings, can only develop on bilateral organisms. Up until now, the origin of the bilaterians has been assumed to lie in the time of the Cambrian explosion, the period from 540-500 million years ago, when we first found \Jfossil\j signs of nearly all the major modern animal groups, or phyla. Now it seems possible that there were bilaterians even earlier, but that their fossils have not, as yet, been recovered.\p
The Acoela do not fit with the rest of the flatworms, as their name implies (Acoela means "without a cavity" - they lack a true gut). People who specialize in such animals say that the Acoela are "primitive" in other ways as well, and while some researchers have suggested that these are just ordinary flatworms that have lost their digestive tracts, the genetic evidence now seems to be against this.\p
The researchers (named below) took a molecular approach, sequencing the "18S ribosomal DNA" gene for 18 different Acoelans, as this gene had already been sequenced in many other animals, including other types of flatworms. (The data have been stored in the GenBank under accession numbers AF102892 to AF102900 and AJ012522 to AJ012530.)\p
The evolutionary road map that can be drawn from the information assumes that the more different two genes are, the longer the owners have been on independent evolutionary paths, away from a common ancestor. And the message from this "road map:" the Acoela were the first group of organisms to split off from the radial organisms, well before the other flatworms arose in the midst of the Cambrian explosion. From this, the researchers suggest that the Acoela deserve to be classified in their own \Jphylum\j.\p
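The logic of such a "road map" is easy to demonstrate. The short Python sketch below clusters species by average gene difference, so that the later a group joins the tree, the earlier its lineage split away from the rest; the species names and distances are invented for illustration only, and the actual study used far more sophisticated phylogenetic methods on the real 18S sequences.\p
# A minimal sketch of distance-based tree-building (UPGMA-style).
# The species and distances are invented for illustration only;
# real studies use aligned 18S rDNA sequences and richer models.

# Fraction of differing sites between pairs of gene sequences:
dist = {
    ("anemone", "acoel"):    0.30,
    ("anemone", "flatworm"): 0.38,
    ("anemone", "snail"):    0.39,
    ("acoel",   "flatworm"): 0.25,
    ("acoel",   "snail"):    0.26,
    ("flatworm", "snail"):   0.12,
}

def d(a, b):
    """Distance between two clusters = average over member pairs."""
    return sum(dist.get((x, y)) or dist[(y, x)]
               for x in a for y in b) / (len(a) * len(b))

clusters = [("anemone",), ("acoel",), ("flatworm",), ("snail",)]
while len(clusters) > 1:
    # Join the two closest clusters. The later a lineage is joined,
    # the earlier it split from the others: here the acoel joins the
    # flatworm-snail pair before the radial anemone does, mirroring
    # an early acoel split from the rest of the bilaterians.
    pair = min(((a, b) for i, a in enumerate(clusters)
                for b in clusters[i + 1:]), key=lambda p: d(*p))
    clusters = [c for c in clusters if c not in pair] + [pair[0] + pair[1]]
    print("join:", pair)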
There is another important distinction between the radial animals and the bilateral animals. Radial animal embryos are diploblasts that develop from two primary layers of embryonic cells, while the bilateral animals are triploblasts, which means they develop from three layers of cells. The major finding of the study, which included animals as different as a \Jnematode\j worm, an anemone, a snail, a leech, a mealworm, a \Jtoad\j, a mouse, and a sponge, was that the Acoelans, while they are triploblasts, came closest of all to the diploblasts.\p
Other evidence, based on the \Jnervous system\j of the Acoelans and the patterns of cleavage in the embryo, also points to them being a very ancient group, distinct from the other flatworms. They are the most primitive of all the triploblasts, with their origins probably lying at some point before the Cambrian. In lay terms, the Acoelans can no longer be seen as degenerate flatworms: they are relics of the original ancestors of us all, and the surviving Acoelans are the place where we should look for hints about how the radial body plan changed over into the bilateral body plan that we all have today.\p
\BKey names:\b Jaume Baguñà, Iñaki Ruiz-Trillo, Marta Riutort, D. Timothy J. Littlewood and Elisabeth A. Herniou\p
#
"A mammal ancestor, older than the dinosaurs",1028,0,0,0
(Mar '99)
Announced quietly in the \IProceedings of the Royal Society\i of London in late February, a South African \Jfossil\j looks interesting - at least to specialists in vertebrate \Jevolution\j. It has been hailed as "a new species of \Jfossil\j animal 260 million years old, which reinforces South \JAfrica\j as the place of origin of all mammals" by scientists at the University of the \JWitwatersrand\j.\p
The fossilized skull of a sheep-sized animal comes from the most primitive member yet found of a group of plant-eaters on the evolutionary line to mammals. These animals, the Anomodonts, were the dominant land creatures during the Permian period, long before even the dinosaurs had appeared on \JEarth\j. The Anomodonts were therapsid reptiles, and the name Anomodontia is now used to describe animals otherwise known as the dicynodonts, and their relatives.\p
The new \Jfossil\j is named \IAnomocephalus africanus,\i which literally means "lawless-headed one of \JAfrica\j." The Anomodonts, like the other therapsid reptiles, have always been thought to have evolved in \JRussia\j, but now the South African researchers say the South African portion of the supercontinent, Gondwana, was actually an evolutionary hotspot 260 million years ago, and the site where the first reptilian steps to the \Jmammal\j form were taken.\p
"Previously the ancestors of the dicynodonts were the Venjukovioids and all came from \JRussia\j. Now we have two forms from South \JAfrica\j, Patranomodon and Anomocephalus, which are more primitive (basal) than the Russian venjukovioids," said Bruce Rubidge, explaining the find.\p
\BKey names:\b Bruce Rubidge, Sean Modesto, John Nyaphuli and Johann Welman\p
#
"Helping the blind to navigate",1029,0,0,0
(Mar '99)
Two separate papers read to the 95th annual meeting of the Association of American Geographers in \JHonolulu\j in late March looked at new ways of helping the blind. One report described how a system of talking signs and mobile receivers has helped blind travelers in Santa Barbara, \JCalifornia\j, to navigate unfamiliar buildings and transit stops. The system has been developed by Professor Reginald C. Golledge, a blind geographer, and graduate student James Marston of UC-Santa Barbara.\p
This report marks the end of a three-year project aimed at helping the visually impaired by increasing their spatial information, their use of public transit, and their independence. The system relies on high-tech transmitters, costing around US$450 each, and on receivers used by the blind, costing around US$150 each.\p
As little as five minutes of training allowed blind people to use receivers to detect the location of facilities such as ticket machines, exits, information windows, and telephones much more quickly than those without receivers. Following the success of this first phase, a project is to begin in San Francisco in April. An interesting development: receivers can be programmed in a variety of languages, allowing the possibility for international visitors to hear instructions in their native tongue as they explore a new city on their own.\p
A second report from Dan Jacobson, also of UC-Santa Barbara, describes a plan to place sound maps on the \JInternet\j for the visually-impaired, with computerized audio and tactile features that will move users through landscape features, streets, and buildings via their ears and fingers. Blind users who have tested the sound map prototypes obtained better and faster information than a control group using only tactile maps, according to Jacobson.\p
The idea is to allow the blind to access the digital maps via the \JInternet\j on a personal computer. The system uses a conventional computer with Web browser software and a "touch window." Environmental sounds, such as traffic noise to indicate a road, and spoken location names help users to navigate around the map, along with more obvious sound and touch clues, and "Earcons" - tone-based symbols combining pitch and rhythm which provide instructions for moving around the map.\p
Ten people, five visually impaired and five blind, tried the system out, and after 15 minutes of instruction, all were able to navigate the map successfully, even those who had never previously used a computer.\p
Jacobson is now working on a haptic mouse, which delivers force \Jfeedback\j through the hand holding the mouse. This gives a three-dimensional picture of the map through the applications of different haptic effects. A "virtual wall" can offer slight resistance, and so the user has to provide extra force in order to pass through. This allows perception of the shape and layout of the map features. An opposite effect, called a gravity well, pulls a user into a predetermined object.\p
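How might such haptic effects be computed? Here is a minimal Python sketch of a force function combining a "virtual wall" with a "gravity well." The report gives no implementation details, so the simple spring model and all the constants below are our own hypothetical assumptions.\p
import math

# Hypothetical sketch of the two haptic effects described above.
# The report does not say how forces are actually computed, so the
# spring model and constants here are assumptions for illustration.

WALL_X = 5.0        # x-position of a "virtual wall" on the map
WELL = (2.0, 3.0)   # center of a "gravity well" feature
K_WALL = 4.0        # wall stiffness (force per unit of penetration)
K_WELL = 1.5        # strength of the well's pull
WELL_RADIUS = 1.0   # how close the cursor must be to feel the well

def haptic_force(x, y):
    """Return the (fx, fy) force fed back through the mouse."""
    fx = fy = 0.0
    # Virtual wall: push back in proportion to how far the cursor has
    # penetrated, so the wall offers slight resistance but can still
    # be pushed through with extra force.
    if x > WALL_X:
        fx -= K_WALL * (x - WALL_X)
    # Gravity well: within its radius, pull the cursor steadily
    # toward the center of the predetermined object.
    dx, dy = WELL[0] - x, WELL[1] - y
    r = math.hypot(dx, dy)
    if 0 < r < WELL_RADIUS:
        fx += K_WELL * dx / r
        fy += K_WELL * dy / r
    return fx, fy

print(haptic_force(5.5, 0.0))   # pressing into the wall
print(haptic_force(2.3, 3.4))   # caught by the gravity well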
#
"Keeping fire away from planes",1030,0,0,0
(Mar '99)
As you sit in any commercial \Jaircraft\j, just about everything you see is a polymer - perhaps with the exception of the seat supports. In a crash, many of these polymers will not burn, but they do break down, producing gases which are toxic, or which burn. In "survivable" air accidents, 40% of deaths are due to fires, and since air accident deaths are equally divided between survivable and non-survivable crashes, that means that 20% of all air fatalities are caused by fires.\p
The American Chemical Society was told, during March, about a series of studies into new fire-resistant polymers, designed to improve air safety. One polymer, known as PHA (polyhydroxyamide), decomposes very little in contrast to other polymers. More importantly, the PHA that did decompose was converted into water vapor and another, nearly nonflammable polymer, PBO (polybenzoxazole), which is extremely fire-resistant.\p
The obvious solution, using PBO in the first place, is not really an option, as it is too hard to make useful products from it. PHA is a good alternative, because it can be made and processed by mild 'green chemistry' processes, yet when exposed to fire, it converts into strong, stable PBO.\p
The researchers also described new techniques that use just milligrams of a material in order to assess its combustibility. And to cap it off, they added that PHA might also have prospects for fire-safe clothing for military uniforms.\p
\BKey names:\b Phillip Westmoreland, Richard Lyon\p
#
"Dyed in the silkworm",1031,0,0,0
(Mar '99)
In early March, a report in \IGenes & Development\i described a method of producing genetically altered, green fluorescent silk fibers that are spun by the \1silkworm\c, \IBombyx mori.\i The method has potential applications for silk or other economically important proteins: if an insect can produce one "foreign" protein, it can produce others, such as the \Jspider\j silk protein spidroin, which has potential industrial uses ranging from the fibers in bullet-proof vests to parachutes. \p
The silkworm larvae were infected with a genetically engineered insect virus (a baculovirus) that carried an altered version of a silk protein gene. The researchers had fused the gene encoding the light chain of the fibroin protein - a major protein component of silk - to the gene encoding the green fluorescent protein from \Jjellyfish\j.\p
Then, after a process called homologous recombination, the silkworm's natural fibroin gene was replaced with the new altered version. Conveniently, when the silk glands are lit up with ultraviolet light, they glow an eerie green, allowing the successfully infected individuals to be identified.\p
\BKey names:\b Hajime Mori, \JKyoto\j Institute of Technology\p
#
"Maggots to the rescue!",1032,0,0,0
(Mar '99)
A \1maggot\c or two is useful in treating infected or necrotic wounds, according to a Welsh doctor, Dr Steve Thomas, writing in the \IBritish Medical Journal.\i Not just any old maggots, of course, but sterile maggots of the common greenbottle fly, \ILucilia sericata.\i Thomas and his colleagues say that over the past three years the clinical use of maggots has been reintroduced into the UK and elsewhere, in well over 400 centers, with considerable success. \p
Nobody knows quite how the fly larvae kill the \Jbacteria\j in wounds, but the effect is likely due to one or more of several mechanisms: the maggots eating and destroying the \Jbacteria\j, a change in the pH (acidity) of the wound, or the production of a natural antibiotic of some sort - which would make sense, given the conditions in which fly larvae usually find themselves.\p
Most of the time, treatment with maggots is regarded as a last resort, used when all other options have failed. But if \Jmaggot\j treatment is started earlier, Thomas and his colleagues argue, it might even be possible to avoid the use of \Jantibiotics\j altogether.\p
#
"Wolves bounce back",1033,0,0,0
(Mar '99)
The wolves of Isle Royale (see \BThe wolves of Isle Royale strike hard times,\b March 1998) have made an excellent comeback in the past year. A survey in the northern winter of 1997-98 showed plummeting numbers, but this year numbers in the \JMichigan\j park have jumped from 14 to 25, the best count since 1981.\p
Two of the park's packs produced six pups each, and survival was helped by the moose population being in poor health, with many moose calves, and a lot of elderly moose, making easy prey for the wolves.\p
Dead wolves found in the past few years have been disease-free and showed no direct signs of any genetic problem that biologists thought might have caused poor reproduction in past years. Just one wolf has died on Isle Royale in the past year, and biologists established that it had been killed in a territorial dispute by other wolves.\p
#
"April, 1999 Science Review",1034,0,0,0
\JApril 1999\j
\JA new solar system\j
\JA sundial for Mars\j
\JEvidence for a hypernova?\j
\JTesting the Big Bang theory\j
\JNew ways of going to the stars\j
\JA new member of the human family?\j
\JDid Neandertals and modern humans interbreed?\j
\JThe jaw of the primitive shark\j
\JGiant bacteria\j
\JA tree grows in Morocco, long, long ago\j
\JA surprise in the nerves of plankton\j
\JCrosswords the easy way\j
\JMaking faster chips\j
\JAbrasives in your chips?\j
\JExplaining superconductivity\j
\JA vaccine against melanoma\j
\JDo it, but don't overdo it\j
\JWhy a modified cold virus kills cancer cells\j
\JMicrogravity and gene transfer in plants\j
\JControlling apoptosis\j
\JSeeing the core-mantle boundary\j
\JBad news for forests\j
\JNext, now for the weather next century . . .\j
\JThe threat to tropical coral reefs\j
\JCounting sprites\j
\JAntarctic ice shelves breaking up\j
\JTelomerase broadens its scope\j
\JTobacco secrets\j
\JSeabed silt in the Indian Ocean\j
#
"April 1999",1035,0,0,0
Two main stories seized the public imagination in April: the discovery of a star, Upsilon Andromedae, with no less than three planets around it, and the suggestion that a new small-brained \Jhominid\j \Jfossil\j, \I\1Australopithecus\c garhi,\i may be the remains of the first of the human line to use tools. Almost as exciting, researchers have found the largest of all the \1bacteria\c, \IThiomargarita namibiensis,\i in seafloor ooze off the coast of \JAfrica\j - a bacterium as large as a full stop or period. There is also a very cute piece of science, describing how the common cold virus can attack \1cancer\c.\p
As usual, most stories will develop further - when we completed our story on the giant bacterium, there were just four or five references to it on the Web, and last time we looked, the number was close to 100, so the terms and names mentioned in the stories are intended to help readers pursue further links on the \JInternet\j. This applies particularly to the key names which are listed after many of the accounts, as any combination of a topic with one or two of the surnames is likely to tease out further references.\p
#
"A new solar system",1036,0,0,0
(Apr '99)
Stars with a single \1planet\c are now very ordinary news, with almost 20 extrasolar planets now known, but in the middle of April in a paper in the \IAstrophysical Journal,\i news broke of a multi-\Jplanet\j solar system around a star which can be seen from \JEarth\j by the naked eye. The star, Upsilon Andromedae, is not visible right now, as it is "up" in the daytime, but later this year, it will be visible in the night sky once again.\p
Teams from San Francisco State University and the Anglo-Australian Observatory both reported the find. Geoffrey Marcy and R. Paul Butler had reported a near \1Jupiter\c-sized \Jplanet\j orbiting the star in 1996, about 7.5 million km (4.7 million miles) out, but a careful analysis of eleven years' worth of signals gathered at the \1Lick Observatory\c revealed what appear to be two additional planets within the same system. So the Upsilon Andromedae system now has at least three planets, giving us the first evidence ever found of a solar system that mimics our own.\p
The new planets are further away from their star than the original find, which is so close that it whizzes around the star once every 4.6 \JEarth\j days. They are also more massive: the middle \Jplanet\j is estimated at twice the mass of Jupiter, and the outermost \Jplanet\j at four times Jupiter's mass. Like the other extrasolar planets found so far, they are in elliptical orbits.\p
The find was also confirmed by a second team of astronomers from the Harvard-Smithsonian Center for Astrophysics (CfA), working on data gathered at the Smithsonian's Whipple Observatory. Butler is the lead author of the paper, while contributing colleagues include Marcy, Debra Fischer of San Francisco State, and several researchers from the CfA. \p
Astronomers find a new \Jplanet\j by looking for movement in a star. All stars move through space, but if there is a large enough \Jplanet\j orbiting a star, the star "wobbles" under the gravitational pull of the \Jplanet\j, in time with its \Jorbit\j. So with careful accurate measurements over a sufficiently long period, planets can be detected, even when they cannot be seen.\p
The two groups had been collaborating in their analysis, expecting to find a second \Jplanet\j around Upsilon Andromedae, but in fact they found two new planets, not one. The analysis began with observations of the star and, after eliminating the effects of the 4.6-\Jday\j orbital period of the known \Jplanet\j (about 75% of the mass of Jupiter) from the data, another periodic variation of about 1200 days showed up, suggesting a second \Jplanet\j lying about 2.5 Astronomical Units (AU) from the star, while the \Jmagnitude\j indicated a mass four times that of Jupiter.\p
After both these periods had been eliminated from the data, there was still a systematic pattern left in the residual "noise," allowing the scientists to pinpoint a third \Jplanet\j, about twice the mass of Jupiter, orbiting in about 250 days at a distance of just under 1 AU. \p
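The "find a period, subtract it, look again" procedure is easy to illustrate. The Python sketch below applies it to fabricated radial-velocity data built from three sinusoids with the periods quoted above; a real analysis fits full Keplerian orbits to years of precise measurements, so this is only a cartoon of the method.\p
import numpy as np

# Cartoon of the "subtract the known planet, then look again" method,
# using fabricated data. The periods match the story; the amplitudes,
# sampling, and noise level are invented for illustration.

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 4000, 300))      # observation times (days)
v = (70 * np.sin(2 * np.pi * t / 4.6) +     # known 4.6-day planet
     70 * np.sin(2 * np.pi * t / 1200.0) +  # hidden ~1200-day planet
     55 * np.sin(2 * np.pi * t / 250.0) +   # hidden ~250-day planet
     rng.normal(0, 5, t.size))              # measurement noise

def sine_fit(t, v, p):
    """Least-squares fit of a sinusoid of period p (plus an offset)."""
    A = np.column_stack([np.sin(2 * np.pi * t / p),
                         np.cos(2 * np.pi * t / p), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, v, rcond=None)
    return A @ coef

def best_period(t, v, trials):
    """The trial period whose sinusoid explains the most variation."""
    return min(trials, key=lambda p: np.sum((v - sine_fit(t, v, p)) ** 2))

trials = np.linspace(100, 2000, 500)
v1 = v - sine_fit(t, v, 4.6)         # remove the known 4.6-day signal
p2 = best_period(t, v1, trials)      # a ~1200-day signal shows up
v2 = v1 - sine_fit(t, v1, p2)
p3 = best_period(t, v2, trials)      # and then a ~250-day signal
print(round(p2), round(p3))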
The method is biased towards large planets orbiting close to their stars - Jupiter, for example, would be undetectable from Upsilon Andromedae by this method. So while we cannot detect \JEarth\j-sized planets that far away, we \Ican\i say with some confidence that there are no habitable \JEarth\j-sized planets around \Ithis\i star: the simple presence of a massive \Jplanet\j so close to the star would almost certainly have either absorbed any habitable \Jplanet\j, or hurled it right out of the Upsilon Andromedae system. Still, at least we know now that multiple planets can form around other stars, so we do not need to expect that stars will have just one big orbiting object, slurping up the rest of the mass around the star, as some astronomers have suspected.\p
What remains is the question of whether the three planets formed where they are now, or whether they migrated inwards as the result of some event such as a close encounter between two planets or another star passing nearby.\p
#
"A sundial for Mars",1037,0,0,0
(Apr '99)
The well-known American public \Jtelevision\j science presenter Bill Nye unveiled NASA's most recent awareness-raising stunt, a sundial for Mars, during April. Inscribed with the motto "Two Worlds, One \JSun\j," the sundial will travel to \1Mars\c aboard NASA's Mars Surveyor 2001 lander. After the lander arrives in January 2002, its panoramic camera will reveal the solar time on Mars for all on \JEarth\j to see. While the sundial \Imight\i be seen as little better than a stunt, there is a serious scientific purpose, as the sundial will make people much more aware that Mars, like \JEarth\j, is a \Jplanet\j that rotates as it orbits the \Jsun\j, contributing to public understanding and appreciation of science.\p
The sundial, fitted with central black, gray, and white rings, and corner color tiles, will also act as a \Jcalibration\j target, a kind of test pattern, to adjust the brightness and tint of pictures taken by the camera. "Our ancestors made astonishing discoveries about the nature of the heavens and our place in it by closely watching the motion of shadows," said Nye, unveiling the sundial design at a press conference at Cornell University on April 21. "Now, at the dawn of the next century, we can make observations of new shadows, this time on another \Jplanet\j."\p
The sundial will be 3 inches (about 8 cm) square, and will weigh just over 2 ounces (60 grams). It is made of \Jaluminium\j to minimize its weight. The anodized metal surfaces will be black and gold and, as indicated, the photometric surfaces, to be used to calibrate the Mars Lander's Pancam camera, are made of a special silicone \Jrubber\j compound. \p
Children played a part in developing the design, communicating their suggestions by e-mail. One common theme was the suggestion for writing in many languages. In accordance with this principle, the face of the sundial is engraved with the word "Mars" in Arabic, Bengali, Braille, Chinese, Danish, English, French, German, Greek, Hawaiian, Hebrew, Hindi, Inuktitut, Italian, Japanese, Korean, Lingala, Malay-Indonesian, Portuguese, Russian, Spanish, and Thai. \p
Together, these languages are used by more than three quarters of \JEarth\j's population, but even the past is catered for: the ancient Sumerian and Mayan languages are also included, because the \Jplanet\j Mars figured prominently in both cultures.\p
Once the \Jspacecraft\j lands on Mars and the exact orientation of the sundial can be determined, viewers will be able to tell local Martian time from sundial images and a computer-generated overlay which will be posted on the World Wide Web. Mirrored segments along the outer ring of the sundial will also reveal the color of the sky above the lander. Over the course of a \Jday\j, viewers on \JEarth\j will thus see the passage of time on Mars recorded in the sweep of a shadow of the sundial's central post and the changing colors of the Martian sky. The shadow will also reveal the changing Martian seasons throughout the life of the mission. \p
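The arithmetic behind such an overlay is straightforward, at least for an idealized dial on which the shadow sweeps uniformly through 360 degrees in one Martian solar day of about 24 hours 39.6 minutes. The Python sketch below is our own simplification, not the mission team's actual software, and it ignores complications such as the Martian "equation of time."\p
# Rough sketch only: reading local solar time from the shadow angle
# on an idealized dial. The sol length is the only published figure
# used here; everything else is a simplifying assumption.

SOL_HOURS = 24.0 + 39.6 / 60.0   # length of a Martian solar day

def solar_time(shadow_angle_deg):
    """Convert the shadow angle, measured from the local noon line
    (negative = morning, positive = afternoon), to a clock reading,
    assuming the shadow sweeps uniformly through 360 degrees per sol."""
    hours = 12.0 + shadow_angle_deg * (SOL_HOURS / 360.0)
    h = int(hours)
    m = int((hours - h) * 60)
    return "%02d:%02d local solar time" % (h, m)

print(solar_time(0.0))     # shadow on the noon line -> 12:00
print(solar_time(-45.0))   # 45 degrees before noon  -> about 08:55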
#
"Evidence for a hypernova?",1038,0,0,0
(Apr '99)
The first gamma ray bursts, more commonly called GRBs, were detected in the 1960s by the US military Vela satellites. We now know that about once every eight hours, one of these massive flashes of gamma rays will be detected, but astronomers remain puzzled about their origin. The working assumption is that each burst is a localized event, some kind of massive cosmic explosion in an object we now call a "gamma ray burster" - but we do not yet know what a "burster" is. Bohdan Paczynski suggested in 1998 that the bursters might be hypernovae, explosions which are orders of \Jmagnitude\j greater than supernovae.\p
These hypernovae, he suggested, might be associated with the formation of black holes, perhaps during the collapse of massive stars and/or the merger of massive stars with dense or compact objects of large mass. These would have been the most energetic events in the universe - apart from the Big Bang. The recently observed burst, known as GRB 990123, appeared to produce more \Jenergy\j in a single burst than an ordinary star in its entire life, and this might point to the source being a hypernova. Another possibility is that the gamma radiation was emitted in a tight beam, which just happened to "illuminate" the \Jearth\j as it flashed by. In that case, the total \Jenergy\j of GRB 990123 might have been much less.\p
If any evidence of hypernova remnants can be found, perhaps there is no need to search for mechanisms which would explain the "beaming" phenomenon, so there was a certain amount of excitement when Daniel Wang told the High \JEnergy\j Astrophysics Division of the American Astronomical Society in mid-April that he had found just the right sort of tell-tale signature in two objects previously classified as \Jsupernova\j remnants, some 25 million light-years away. The two shell-like nebulae or clouds are in galaxy M101, also known as the Pinwheel galaxy because of its beautiful spiral shape. His work will also appear in the May 20 issue of the \IAstrophysical Journal Letters.\i \p
One nebula, MF83, with a radius of over 430 light-years, is one of the largest \Jsupernova\j remnants known, and the other, NGC5471B, is rapidly expanding at a velocity quoted as "at least 100 miles per second," or about 160 km/second. Each has an x-ray luminosity about an order of \Jmagnitude\j brighter than the brightest \Jsupernova\j remnants known in our galaxy. From the \Jenergy\j required to produce remnants of that size and speed, with the light signature of the x-ray radiation that Wang found, he believes they are good candidates to be hypernova remnants.\p
The area around NGC5471B is a star-forming region, and this makes it likely that the event involved the collapse of a massive star. M101 is one of only a few nearby galaxies with vigorous star formation going on, and this probably explains why one single galaxy could contain two relatively rare hypernova remnants with ages less than about a million years. \p
A hypernova remnant would probably last for tens of millions of years, but it quickly loses most of its brightness. As a result, the two hypernova remnants cannot be seen in a typical visual picture in optical light, though they are visible in x-ray and in emission lines from some specific atomic transitions. \p
See also \BSmall galaxy disappears, not many hurt\b (March 1997), \BGamma ray burst interpreted\b (April 1998).\p
#
"Testing the Big Bang theory",1039,0,0,0
(Apr '99)
In mid-May, a new research \Jsatellite\j called FUSE is due to take off from \1Cape Canaveral\c. Its task will be to peer through the cosmos, to spy out the fossilized record of the origins of the universe. Seeing traces of the past is a hard task, because between us and the story we are looking for, there is a huge assortment of chemicals, all playing a part in concealing the relics left behind by the Big Bang. FUSE, the Far Ultraviolet Spectroscopic Explorer, is able to see past this chemical stew and show us everything from nearby planets to the edge of the galaxy.\p
Ideally, FUSE will shed light on the conditions in the universe just moments after the Big Bang and explain how the basic components of the universe were formed. Because of its position in space, FUSE will be able to detect wavelengths which ground telescopes are blind to, because the wavelengths are absorbed by our \Jatmosphere\j. One targeted data set, relating to the abundance of \1deuterium\c in the first few moments of the universe, may test the limits of the Big Bang theory, according to planners.\p
The data gathered on deuterium in a variety of places, from the inner recesses of our solar system to the furthest parts of the Milky Way, will set an accurate benchmark for the amount of deuterium in the galaxy, and that will allow researchers to calculate conditions in the infant universe moments after the Big Bang. \p
As well, star creation depends on deuterium being used up as \Jhelium\j is formed from the fusion of deuterium nuclei, so a map of deuterium abundances in many regions of our galaxy will give scientists a better understanding of how chemicals are mixed and distributed and destroyed. \p
#
"New ways of going to the stars",1040,0,0,0
(Apr '99)
In an era when technology is progressing quickly, any space mission we send beyond the closest planets will be equipped with outmoded technology by the time it arrives. The simple problem is that it takes too long to get a \Jspacecraft\j to anywhere interesting. More importantly, the cost of getting a payload anywhere is too high. These were the two main issues to emerge from the International Conference on Advanced Propulsion at \1Huntsville\c in April.\p
In some cases, old technology can be converted and reprogrammed, as happened with the \1Voyager mission\c, but in the future, we need to plan for that option, rather than doing something after the event, say David Noever and Subbiah Baskaran, who urged the notion of "Darwinian selection" in the design of \Jspacecraft\j. While their idea has considerable merit, it appears to have little in common with the notions of \1Charles Darwin\c, and more to do with the idea of planning ahead and keeping options open.\p
Yet while the idea of evolving \Jspacecraft\j may seem far-fetched, the most recent NASA missions, like the Mars Pathfinder, the \JEarth\j Observer and the Deep Space 1 Interplanetary Probe, all have artificial intelligence capabilities, and the future may allow us to have craft which grow and improve as they are traveling, assisted by information-uploads from "home."\p
In such a case, "fitness" would be a matter of successfully meeting mission goals, say the visionaries. They suggest that Deep Space 1 will be the first major \Jspacecraft\j that will "learn" during its long and lonely trip through the solar system. The key notion from their paper is that while you can update information in space, updating materials is harder because of the severe payload costs imposed on every gram of extra material sent into \Jorbit\j or beyond.\p
If transport were cheaper, higher levels of redundancy and flexibility could be built in; the researchers proposed the \1dandelion\c as a model. The common dandelion has evolved to produce large numbers of identical seeds which can spread and take root, far from the parent plant. If something disastrous happens to a few of the seeds, there are still plenty more left to carry on the dandelion "mission." In the same way, mini-rovers on the surface of Mars, equipped with their own "thinking" capacity and communications, could work cooperatively while carrying out their own parts of a larger mission.\p
At the same conference, Gary Lyles from NASA described some of the prospects for nuclear-powered rocket systems which might reduce the cost of orbiting material to just US$10 a pound (equivalent to about US$22 a \Jkilogram\j) - about the same cost per pound or per \Jkilogram\j as flying an adult human once around the world. Coupled with reusable "hands-off no-maintenance" vehicles which could fly a hundred times a year, these would open up an entirely new view of space flight.\p
While it will alarm environmentalists, NASA is now looking seriously at NERVA, the Nuclear Engine for Rocket Vehicle Applications as a source of propulsion beyond \JEarth\j \Jorbit\j. This system pumps liquid \Jhydrogen\j over a \Jgraphite\j core reactor powered by uranium 235, producing a superhot exhaust gas.\p
Test models of similar systems were fired in the 1970s, so the technology is well-proven. Other nuclear possibilities include a fission system based on americium-242m, or even a fusion model using deuterium and tritium.\p
The research ideas do not stop there, even spreading, in a controlled and scientific way, to science fictionish ideas like canceling \Jinertia\j or forming wormholes, and even antigravity (see \BAntigravity and NASA,\b February 1999). The future, whatever it holds, looks interesting.\p
#
"A new member of the human family?",1041,0,0,0
(Apr '99)
A new \1hominid\c specimen, tentatively placed in the \Jgenus\j \1\IAustralopithecus\c\i and called \IAustralopithecus garhi,\i after the local Afar people's word for "surprise," has been found in the Afar region of \JEthiopia\j. The find, described in \IScience\i in late April, was turned up by a team of Ethiopian, American, and Japanese researchers. The \Jhominid\j has been widely hailed as "a missing link," because while it has a small brain like "Lucy" and her family, \IAustralopithecus afarensis,\i which flourished around 3.2 million years ago, the cranial and tooth remains are associated with finds which indicate that hominids in the area at that time were walking on humanly proportioned legs and using stone tools to strip meat and scrape marrow from the bones of antelopes and horses.\p
Two papers were published in the same issue of the journal. The first, \IEnvironment and Behavior of 2.5-Million-Year-Old Bouri Hominids,\i describes some of the finds in the area, while \IAustralopithecus garhi: A New Species of Early \JHominid\j from \JEthiopia\j,\i describes the actual find.\p
The tool-users who left cut marks on the bones were not necessarily \IA. garhi,\i but the temptation is strong to assert that it \Iwas A. garhi\i which first used tools, and whoever they were, they represent a very early instance of tool use. The site is dated at 2.5 million years ago, so it is comparable with a 2.6-million-year-old site at nearby Gona, 80 km (50 miles) south, where nearly 3,000 stone tools were found in 1997. \p
The fossils were found near the small village of Bouri, described as "a hard two-\Jday\j drive northeast of Addis Ababa." This is in a harsh, desert region of \JEthiopia\j called the Middle Awash, already famous for other \Jhominid\j finds. As usually happens, the fossils were found because the bone remnants were sticking out of the ground.\p
The researchers were looking for signs of the smaller-toothed \IA. africanus,\i a smallish, upright-walking \Jhominid\j which roamed southern \JAfrica\j two to three million years ago. As it turns out, \IA. garhi's\i big teeth and projecting face best resemble \IA. afarensis.\i It has a small brain case (larger than \IA. afarensis,\i but only about a third that of modern humans), but it was found some 300 meters from the other finds of cut bones and stone tools, which leaves some doubt about any links that may be inferred.\p
The molars and premolars are "huge," while the front teeth are comparatively small. The face projects forward, while the braincase is small and crested - a new combination of features which explains why the specimen has been assigned to a new species name. Tim White from the University of \JCalifornia\j believes that if the skull remnants belong with the limb bones, the individual would have stood about 1.2 meters (4 feet) tall.\p
The arm and leg bones found in the same geological layer are from several other \Jhominid\j individuals, but without associated evidence in the form of more jaws, these fossils cannot reliably be assigned to a species. The limb bones show proportions intermediate between those of apes and humans. Lucy's group at 3.2 million years had relatively long arms, compared with their legs, and \IHomo erectus \iat 1.7 million years ago had the shortened forearms and longer femurs of modern humans. The unidentified Bouri hominids fall right in the middle, showing that the \Jfemur\j lengthened at least one million years before the forearm shortened.\p
One of the \Jhominid\j legs was found buried next to catfish and \Jantelope\j bones, and the \Jantelope\j bones show very clear cut marks from stone tools. The same geological layer yielded \Jantelope\j and three-toed horse bones with similar tell-tale signs of butchery: one \Jantelope\j lower jaw showed marks suggesting that the tongue had been cut out, while leg bones appear to have been purposely fractured at both ends, indicative of marrow extraction. The cuts are consistent with what would be expected from the tools that have been found.\p
Eating meat is an effective way of becoming a much more active species with more spare time to spend on experimentation and discovery. So whether the owners of the arms and legs were hunters or scavengers, this find may represent the start of modern humans: they walked around on human-like legs, and they used stone tools to slice meat and pull marrow from the bones of large animals that thrived in the open, grassy plains once surrounding an ancient lake, as indicated by a variety of evidence such as the catfish bones.\p
This unprecedented access to high-fat meat and marrow would have been a dietary revolution, and it may have been the trigger which allowed later hominids to spill out of \JAfrica\j and spread across the world, but it will take more evidence before we can label \IA. garhi\i as the key species which made the jump.\p
The main problem right now is that only a few isolated tools were found, strewn about the surface of the site, and none during excavation. So the tools lack reliable dates, but with no local features such as large rushing streams with cobbles or rock-outcroppings to provide the material for tools, researchers suspect that the material may have been carried in from the Gona site, and that the Bouri hominids, including \IA. garhi,\i must now be considered strong candidates for the Gona tool-makers.\p
But while the news stories have featured a "missing link" slant, the find leaves even more links missing. We seem to begin the story with what are effectively bipedal big-toothed chimps, and we end up soon after with meat-eating large-brained hominids.\p
Certainly Tim White, co-author of the second paper, has no doubt about what they have found, claiming "It is in the right place, at the right time, to be the ancestor of early humans, however defined." Co-researcher Berhane Asfaw says it is possible that \IA. garhi\i is the direct ancestor of \IHomo\i and the most successful descendant of Lucy.\p
It is important to keep in mind here that \I"Australopithecus"\i is a bit of a dumping ground for primitive hominids which do not fit into the \Jgenus\j \IHomo,\i or into the \IParanthropus\i group, the robust former members of the \Jgenus\j \IAustralopithecus\i who have been split off in the past ten years. So one scientist's \IAustralopithecus\i may be another scientist's \IHomo,\i or even an entirely new \Jgenus\j. For now, though, the name \IAustralopithecus garhi\i will have to do.\p
\BKey names:\b Tim White, Berhane Asfaw, Jean de Heinzelin, J. Desmond Clark (a total of some 40 scientists from 13 countries around the world were involved).\p
#
"Did Neandertals and modern humans interbreed?",1042,0,0,0
(Apr '99)
There are two views of human \Jevolution\j: either the modern type of human developed in \JAfrica\j (the "out of \JAfrica\j hypothesis"), or modern humans developed in a variety of locations around the world (the "regional continuity, or multi-regional hypothesis"). The rather more popular "out of \JAfrica\j" model assumes that a new breed of human developed in \JAfrica\j, and then migrated from there, replacing other local populations.\p
Central to this theory is the belief that there was no interbreeding between the earlier Neandertal people of the Middle East and Europe with the later human types, and some "out of \JAfrica\j" supporters even go so far as to give the Neandertal people a different species name.\p
This explains why the "regional continuity" supporters are so enthusiastic about the discovery of a \Jfossil\j, probably of a four-year-old boy, close to 24,500 years old, and apparently a hybrid between the \1Neandertal\c and Cro-Magnon stocks. The \Jfossil\j, dated by radiocarbon methods, was found on a hillside near Leiria, \JPortugal\j, in the Lapedo Valley 130 km (80 miles) north of \JLisbon\j. It was uncovered by a group of Portuguese archaeologists led by \JPortugal\j's director of antiquities, Joao Zilhao.\p
The skull of the skeleton was crushed when a farmer bulldozed the then-undiscovered site six years ago, and the rest of the \Jfossil\j find was located when an archaeologist stuck his hand down a \Jrabbit\j hole and pulled out the well-preserved skeleton's left forearm. The body was buried in red ochre, with a pierced shell, indicating a ritual burial.\p
Washington University anthropologist Erik Trinkaus (a supporter of regional continuity) says that the skeleton, excavated in December 1998, shows a mix of Neandertal and modern features. The prominent chin appears characteristic of early modern humans, the stocky trunk and short limbs suggest Neandertal origins and other arm bones point to early modern human parentage.\p
Although there is considerable evidence that the people generally referred to as \1Cro-Magnon man\c, who became modern humans, lived side-by-side with and interacted with Neandertals, which died out about 30,000 years ago, a study published in 1997 of \1DNA\c taken from the original Neandertal skeleton, discovered in \JGermany\j's Neander valley in 1856, indicated it was too distant genetically to have been an ancestor of modern humans.\p
Anthropologists who believe in racial replacement have taken this as final evidence that the two populations are distinct, but those who believe in regional continuity argue that all it shows is that the Neandertals were not the same as modern humans. This, they say, we already knew, without the need for any DNA study.\p
Curiously, there is nothing in the basic "out of \JAfrica\j" model which would be ruled out by any slight interbreeding between the two branches of \IHomo,\i except perhaps to establish that the Neandertal people were the same species as the Cro-Magnon people who lived in the same area, and interacted with the Neandertals for some thousands of years.\p
See also: \BArchaeology: was Eve an African?\b\p
#
"The jaw of the primitive shark",1043,0,0,0
(Apr '99)
To paleontologists, the word "primitive" means something slightly different from its everyday sense: a primitive feature is simply one close to the ancestral state, not an inferior one. Having a collar-bone, for instance, is a primitive trait among vertebrates, yet all humans have a \1clavicle\c or collar-bone. Scientists see no contradiction in this, though many believe that we have acquired our collar-bone as a secondary characteristic, because our ancestors traveled by \1brachiation\c.\p
When we say that chondrichthyans, the \1cartilaginous fish\c that we know as the sharks and their relatives, are primitive because of their jaw characteristics, it means we think that sharks are close to the state which was seen in the earliest members of the group. A recent detailed description of a 400-million-year-old primitive shark relative from \JBolivia\j named \IPucapampella,\i offers evidence that this assumption may be wrong.\p
John Maisey, a curator in the Department of Vertebrate Paleontology at the American Museum of Natural History, described the \Jfossil\j at a conference in London during April. The \Jfossil\j contradicts the belief that chondrichthyans are primitive, and it also sheds new light on how the jawed vertebrates evolved from the jawless state, and that was, of course, a crucial first step towards human \Jevolution\j. \p
We know little about how jaws originated, because there is a poor \Jfossil\j record of the critical time when the first jaws evolved, some time before the \1Devonian period\c (about 412-354 million years ago). We have tended to rely on a 370-million-year-old shark called \ICladoselache\i for our model of jaw \Jevolution\j because good fossils of it have been available to study for more than a century. Now \IPucapampella\i has taken the history back a further 30 million years by giving us the ". . . earliest shark braincase that we can actually study in any detail," according to Maisey.\p
The standard view has been that the \1bony fish\c, the osteichthyans, had more "advanced" jaws, presumably building on the primitive chondrichthyan model, and improving on it. That, by definition, makes the chondrichthyan jaw primitive. But \IPucapampella,\i right at the base of the chondrichthyan lineage, right at the start, has its jaw attached to the skull in much the same way as a bony fish - which makes the bony fishes' jaws primitive, and the sharks' jaws advanced.\p
Previous researchers found it difficult to work out how the "primitive" shark jaw could have developed from jawless animals such as lampreys, but that is no longer a problem. Now we need to work out how the jaw of the bony fishes was derived from the lamprey form.\p
Fossils of \IPucapampella\i are found in what was once a part of the southern hemisphere and covered by a cold, shallow ocean unlike the warm, tropical waters modern sharks prefer. Today, the remains of the sea floor are found in \JBolivia\j and South \JAfrica\j, which were geographically closer during the Devonian period than they are today. No doubt there will be more researchers in those areas over the next few years, looking for more pages in the story of life.\p
#
"Giant bacteria",1044,0,0,0
(Apr '99)
A report in \IScience\i in mid-April describes a giant bacterium which may play a significant role in the environment. While sampling sediments off the coast of \1Namibia\c, a group of German, Spanish, and American researchers found the biggest \1bacteria\c ever known - also the largest prokaryotic organisms ever seen. They are typically between 0.1 and 0.3 mm in diameter, with the largest of them around 0.75 mm across - the size of a fruit fly's head, and visible to the naked eye. They are nearly 100 times larger than the previous bacterial record-holder. \p
Dubbed \IThiomargarita namibiensis,\i which means "Sulfur Pearl of \JNamibia\j," the giant bacterium is about the size of a normal period or full stop. The new microbe links two natural cycles that were previously thought to be entirely separate: the sulfur and \Jnitrogen\j cycles. \p
The microbes store \Jnitrate\j in a huge central vacuole, a sac which is 98% of the organism, and elemental sulfur in small pieces, just beneath the cell wall. The \Jnitrate\j levels in the vacuole can be up to 10,000 times those found in surrounding seawater.\p
The \IThiomargarita\i cells grow in loosely-linked strings, rather like a string of pearls, explaining their name. The huge size of the newly-discovered bacterium has seriously excited publicists for the scientists, who point out that if the largest \IThiomargarita\i was as big as a blue whale, then an ordinary bacterium would be slightly smaller than a new-born mouse. And the largest previously known bacterium, \IEpulopiscum fishelsoni,\i which lives in the guts of \1surgeonfish\c, would only be about as big as a lion.\p
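The publicists' comparison is easy to check with a little arithmetic, as in the snippet below; the "typical" sizes used are round-number assumptions for illustration only.\p
# Checking the blue-whale comparison. The sizes below are
# round-number assumptions, not measurements from the paper.
thiomargarita = 750e-6       # largest specimen, about 0.75 mm (meters)
ordinary_bacterium = 1e-6    # a typical bacterium, about 1 micrometer
blue_whale = 30.0            # a large blue whale, in meters

scale = blue_whale / thiomargarita   # blow-up factor: 40,000 times
print(ordinary_bacterium * scale)    # -> 0.04 m, i.e. about 4 cm:
                                     # roughly newborn-mouse sized,
                                     # much as the comparison claims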
The researchers had been looking for two other kinds of sulfur \Jbacteria\j, \IBeggiatoa\i and \IThioploca,\i which they had been studying off the Pacific coast of South America. Each west coast features the same strong ocean currents running parallel to a north-south continental shelf and an upwelling of deep ocean \Jwater\j that is unusually rich with the \Jnutrients\j on which \1phytoplankton\c rely.\p
The upwellings are driven in each case by the apparent force generated by the \1Coriolis effect\c, which is in turn a consequence of the \JEarth\j's rotation. The upwellings provide an environment in which other marine organisms can thrive, but as the tiny algal cells which make up the \Jphytoplankton\j sink to the bottom, they produce a rich anaerobic sediment, high in sulfur, and just right for \Jbacteria\j like \IThioploca.\i \p
Contrary to their expectations, the researchers found few \IBeggiatoa\i and \IThioploca,\i but the sediment, at a depth of more than 100 meters in \1Walvis Bay\c off the coast of \JNamibia\j, teemed with the previously unknown \IThiomargarita.\i Like the closely related \IThioploca,\i the new bacterium has to get around the problem of using sulfide by oxidizing it with the help of nitrates. \IThiomargarita\i gets its \Jenergy\j by stripping the electrons from sulfide ions, but to do so, it needs an \Jelectron\j acceptor. In most sulfur \Jbacteria\j, this \Jelectron\j acceptor would be oxygen, but in the anaerobic sea-floor sediments, the only possible candidate is \Jnitrate\j ions.\p
While \Jnitrate\j ions are found in seawater, they do not normally penetrate far into the oxygen-poor, sulfide-rich sediment where these \Jbacteria\j are found. Their elegant solution is to store both sulfur and nitrates, and this tells us that the apparently independent sulfur and \Jnitrogen\j cycles are more closely coupled - perhaps far more closely - than anybody had realized before. How important the coupling is will depend on how widespread \IThiomargarita\i and its relatives turn out to be. Sulfide ion levels are high in the plankton-rich upwelling regions where these microbes have been found, making it more likely that they are important. The biomass levels for the bacterium, up to 47 grams per square meter, also support this view.\p
While they have many similarities, \IThioploca\i and the giant \IThiomargarita\i have very different ways of scavenging. \IThioploca\i cells form filaments which cling to each other and secrete a sheath of mucous film which surrounds them. The sheath forms a vertical tunnel up through the sediment to the overlying \Jwater\j, allowing the \IThioploca\i filaments to glide up and down, shuttling between their food source and the \Jnitrate\j they need to metabolize it.\p
\IThiomargarita\i microbes, on the other hand, do not form filaments and are not mobile. They exist in strands of single, unattached cells evenly separated by a mucous sheath. Most of the strands observed so far have been linear, containing an average of 12 cells, and the longest chains of 40 to 50 cells tended to break apart easily when they were manipulated. A few chains branched or coiled together in a ball. \p
The \Jballoon\j-like \IThiomargarita\i seems to favor loose, fluffy, plankton-rich sediments where \IThioploca\i does not occur at all, apparently because the vertical sheath tunnels of \IThioploca\i cannot be supported there. Instead, \IThiomargarita\i hangs passively, waiting to come in contact with supplies of \Jnitrate\j when a passing storm stirs up the sediments - apparently the \Jbacteria\j can withstand this form of starvation easily.\p
\BKey names:\b Heide Schulz, Thorsten Brinkhoff, Mariona Hernández Mariné, Andreas Teske, Timothy Ferdelman, and Bo Barker Jørgensen.\p
#
"A tree grows in Morocco, long, long ago",1045,0,0,0
(Apr '99)
Trees are amazing examples of natural \Jengineering\j design. The woody strength built into the successive rings of growth in a tree, the protective bark, the roots - all of these features must have taken millions of years to develop. Now we have some idea of when those millions of years occurred, thanks to a chance find, made some years ago, and analyzed in a paper in \INature\i in late April.\p
\IArchaeopteris,\i an extinct tree that made up most of the forests across the \Jearth\j in the late \1Devonian period\c around 360 million years ago, seems to have had the same structure as modern trees. This has become apparent after decades of study of \Jfossil\j material, but the story really begins with the discovery of a five-meter length of trunk in \1Morocco\c in 1991.\p
Jobst Wendt was studying marine deposits in the Moroccan part of the \1Sahara Desert\c, where he has been mapping marine rock formations for many years. Some years ago, he found logs that had been buried in ancient marine sediment which he says would have been hundreds of miles off the ancient coastline. Now, the shifting desert sands have exposed the sediments.\p
As a result of studies on some of Wendt's finds, a major expedition was mounted in 1998, resulting in a truckload of fossils, more than 150 pieces from three locations in the Mader Basin and Tafilalt Platform. The fossils are presently housed at \1Montpellier\c University. With hundreds of examples now available, the fossils reveal the first evidence of trunk branching on \IArchaeopteris,\i and the researchers also found large roots, which they say had previously been largely a matter of conjecture.\p
Cell details from slices of the trunks reveal that these ancient trees also had lateral buds on their trunks and branches. In other words, this was the only plant at that time that could bud and continue growing after the main axis tip died - an ability that modern seed plants now share.\p
The branches were attached to the trunks in the same way as modern trees, with a swelling at the branch base to form a strengthening collar and with internal layers of wood dovetailed to stop the branches breaking away. This feature was always assumed to be a modern one, but it seems now to have been found in trees, right from the start.\p
The researchers believe that these are not the ancestors of modern trees, which probably stem from a sister line of plants, the "progymnosperms." They think it significant that \IArchaeopteris\i reproduced by releasing spores rather than by producing seeds, and they suggest that their finds are at best the "aunts" of modern trees, and that the group became extinct within a short period of time at the end of the Devonian age. \p
Nonetheless, the trees were probably important. Aside from the evidence they give us now about \Jevolution\j, they were present at a time when carbon dioxide levels were plummeting from 10% down to 1%, and when oxygen was climbing from 5% of the \Jatmosphere\j up to the modern level of 20%. All of this happened over a 50-million year period in the late Devonian age. \IArchaeopteris\i was important because it made up 90% of the forests during the last 15 million years, when these changes accelerated. The trees would also have provided litter to the world's streams, supporting freshwater and marine life, while the trees' roots would probably have had a strong influence on soil chemistry - and \Jecosystem\j changes like these have probably persisted ever since.\p
\BKey names:\b Brigitte Meyer-Berthaud, Stephen E. Scheckler, and Jobst Wendt.\p
#
"A surprise in the nerves of plankton",1046,0,0,0
(Apr '99)
Vertebrates have one big advantage over the invertebrates, according to the \Jbiology\j textbooks. The axon, the long part of a nerve cell in a vertebrate, has \Jinsulation\j in the form of a \1myelin\c sheath, while this is missing in the axons of invertebrates, even those with highly developed nervous systems like squids.\p
A paper in \INature\i in mid-April appears to stand that claim on its head: researchers at the University of Hawaii say that a tiny calanoid \1copepod\c, a small crustacean called \IUndinula vulgaris,\i has nerve cells that are coated with \Jmyelin\j. These copepods are very quick to respond to danger, responding to stimuli about a hundred times faster than humans. This is a major advantage for a small animal that needs to find food and to avoid being food; but how does the copepod achieve it?\p
Part of the answer may be size: a typical calanoid copepod is only 3 millimeters long, so its nerve pathways are short - but that is not enough. Speed matters less for feeding than for escape: most of the group eat only \1phytoplankton\c, which takes no evasive action, but the calanoid copepods are themselves the largest part of the zooplankton in the sea - the largest source of protein in the ocean and a critical link in the marine food chain between the \Jphytoplankton\j on which they feed, and the krill, fish, and whales that feed on them. They play the same role in the ocean food chains as insects do in land-based food chains.\p
Davis was the first to notice the characteristic onion-ring-like \Jmyelin\j circles around axons in copepod cross sections. \JMyelin\j-looking artifacts can occur as chemical fixation anomalies, so she and Carvalho confirmed the results by preparing samples using an ultra-rapid freezing technique. \p
Later tests showed that the copepods with \Jmyelin\j surrounding their axons responded consistently and significantly faster to stimuli in laboratory tests. \JMyelin\j does not occur in other, more primitive copepods, so it appears to have developed independently from the \Jmyelin\j of the vertebrates. It seems reasonable to expect that sections of quite a few invertebrates will be carefully examined over the next few months.\p
\BKey names:\b April Davis, Tina (Weatherby) Carvalho, Petra Lenz, and Daniel Hartline.\p
#
"Crosswords the easy way",1047,0,0,0
(Apr '99)
When Garry Kasparov lost to Deep Blue in 1997 (see \BDeep Blue beats Kasparov,\b May 1997), it looked as though the programmers had succeeded, using brute force, in beating the wit and skill of even the best human chess players. But if brute force can work in chess by analyzing ahead, along many lines of approach, the makers and solvers of crossword puzzles believed that their own intellectual corner was secure.\p
In fact, \IThe New York Times\i crossword puzzle editor, Will Shortz, went so far as to proclaim that machines could never match humans in solving crossword puzzles, a statement that he must now be regretting, after a team at Duke University decided to treat this as a challenge. "Computer honor," said Michael Littman, "was being challenged."\p
"Proverb," a crossword solving program, managed to get 95.3% of the words correct and 98.1% of the letters correct after tackling a sample of 370 crosswords. The puzzles, chosen randomly from \IThe New York Times\i and several other sources, were each solved in less than 15 minutes. \p
The work will be officially unveiled at the annual conference of the American Association for Artificial Intelligence in July, but it was being circulated on the \JInternet\j in April. Previous conferences have featured machines competing with humans in chess, backgammon, and Scrabble, but this year, crossword puzzles will be a key feature.\p
"Proverb" (the name comes from "probabilistic cruciverbalist," meaning a crossword solver based on probability theory) has already taken on human competition at second hand, and done very well. Given the seven puzzles used at the 22nd Annual American Crossword Puzzle Tournament held in Stamford, Connecticut between 12-14 March, and directed by Shortz, the program performed equivalent to the 147th placegetter in the 254-entrant competition. On one puzzle, the software came in 20th, while one difficult puzzle with Spoonerized clues, saw the machine come in at 251st position. Overall, if this one strange puzzle was left out, the program was well in the top half of the best human competitors.\p
Some examples of Proverb's results can be seen at http://www.crosswordtournament.com/1999/art4.htm\p
Proverb depends on extremely fast computers with vast memory and disk storage, and uses tremendous amounts of data in machine-readable form. It uses several different, independently communicating programs - as many as 30 "Expert Modules" running on as many as 14 different computers. The different parts are written in four different computer languages: Java, Perl, C, and C++, but just as computer chess works in a very different way from human chess, so too do the human and computer crossword-solving methods differ.\p
The human puzzler finds a likely answer, writes it in, and then looks at the cross-clues to see what words might fit. After a portion is locked in, expert solvers work out from there. Proverb, on the other hand, analyzes the word and letter clues separately, finding the best choices for each clue, and then attempts to combine them. \p
Each expert module is a different large \Jdatabase\j, some crammed with facts on subjects like music, geography, movies, writers, quotations, and synonyms, while other modules store information on word relationships.\p
And most importantly, experience plays a part. A \Jdatabase\j of around 400,000 crossword clues was used in the design of the system, and since many clues recur, this is a key to Proverb's success -- though it also explains why the Spoonerized crossword was so hard (sample clue: \I"Home is near"\i has to be read as \I"Nome is here,"\i leading to the answer "\JAlaska\j"). Other components of the system merge the clues, using probability to assess various combinations and building up the final solution.\p
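The merging step is the heart of the approach, and a toy version is easy to sketch in Python. Here, two crossing slots each have a weighted candidate list, and the program keeps the highest-probability pair whose crossing letters agree (the words and weights are invented for illustration; the real Proverb handles full grids and far richer probability models):\p
from itertools import product

# Toy grid: a 3-letter Across answer crosses a 3-letter Down answer
# at their first letters. Candidate probabilities would come from
# Proverb's expert modules; these numbers are made up.
across = {"cat": 0.5, "cot": 0.3, "cut": 0.2}
down = {"car": 0.6, "tar": 0.4}

best, best_p = None, 0.0
for (a, pa), (d, pd) in product(across.items(), down.items()):
    if a[0] == d[0]:              # crossing letters must agree
        p = pa * pd               # treat the clue estimates as independent
        if p > best_p:
            best, best_p = (a, d), p

print(best, best_p)               # ('cat', 'car') 0.3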
The project had its genesis in a conversation between Littman and Keim after a seminar on artificial intelligence, and was then taken up by a group of 10 students undertaking a seminar course. But while the crossword solver can produce convincing solutions, it lacks any understanding of the English language or the way the world works which would give it a human sort of confidence that it had found the right answer.\p
A chess computer manipulates numbers to produce winning chess moves, but never visualizes the chess pieces on a board, and the crossword solver does not understand what is on the puzzle grid. When we achieve true artificial intelligence, we can expect to have systems that do indeed have an intuitive feel for the problems they are trying to solve. All the same, "Proverb" is good enough to compete at a high level.\p
Your reporter will only be really impressed when a computer system can both create and solve cryptic crosswords.\p
\BKey names:\b Michael Littman, Greg Keim, and Noam Shazeer.\p
#
"Making faster chips",1048,0,0,0
(Apr '99)
The speed at which a \1chip\c works depends mainly on how densely the components are packed together. A new technique, described in the \IJournal of Vacuum Science and Technology\i dated May/June 1999, offers considerable promise for increasing the density of chips even further. The method, called "ion-assisted trench filling," is used to link components on a chip. (The "trench" in this term is a very tiny slot, etched into the chip's surface.)\p
As the \1lithography\c methods used to lay patterns on the chips get finer and finer, the resolution will go from the current 250 nanometers (billionths of a meter) to 180 nanometers this year, and as low as 100 nanometers by 2006. The components on a chip are connected by filling tiny trenches in the semiconductor chip's surface with conducting metal, usually \Jaluminium\j or \Jaluminium\j alloys. The multiple levels in a chip are linked by conductors, usually of \Jtungsten\j, which pass through the intervening layers to connect with the layers above and below.\p
The problem with \Jaluminium\j is simple: as the scale gets smaller, it simply will not do the job because the resistance gets higher. As well, \Jaluminium\j does not resist electromigration, the drift of metal atoms when the conductor carries high current densities, which can create voids.\p
A further problem is that the conductor needs to be compatible with the new materials of lower \1dielectric constant\c which chip manufacturers have introduced to improve \Jinsulation\j and reduce circuit delays. Copper is a good alternative: it is more conductive, which makes finer wires possible; it is significantly less vulnerable to electromigration than \Jaluminium\j; and it is less inclined to fracture under stress. There is just one problem: copper diffuses easily into silicon, "poisoning" it. The poisoned silicon develops defects that stop a chip from operating.\p
When \JIBM\j and Motorola announced the first commercial copper-wired chips less than two years ago, it was because this diffusion problem had been solved. The new chips use a diffusion barrier which lines the trench and separates the dielectric from the conductor. Motorola uses \Jtitanium\j nitride for the barrier, and other possible barrier materials include tantalum, tantalum alloys, and tantalum nitride.\p
The original solution was to electroplate copper over the diffusion barrier to produce a "copper-wired chip," but the new technique offers a number of advantages. Ion-assisted trench-filling produces a thinner and more uniform layer of metal in a variety of architectures, and it can be used in narrower trenches with higher depth-to-width aspect ratios. It is described by the developers as filling trenches from the bottom up, automatically eliminating uneven deposition that can lead to voids in the metal lines.\p
The technique involves etching trenches into a substrate wafer, then putting the wafer under a plasma source. A pulsed-bias voltage is then used to accelerate ions toward both the sides and bottom of the trench so that a layer builds up either evenly or from the bottom. Once the right thickness is achieved, the process can be stopped. Films made of different layers can be produced by using different cathodes, such as copper, tantalum, or tantalum nitride.\p
#
"Abrasives in your chips?",1049,0,0,0
(Apr '99)
While the results are still preliminary, a report from a University of \JDelaware\j engineer to an April Materials Research Society meeting suggests that the common abrasive, silicon carbide, may help us make chips which can handle hot, high-powered, high-frequency microelectronic and microelectromechanical (MEMS) devices better than silicon. The results were published in January in \IApplied Physics Letters,\i but described in greater detail at the April 6 meeting.\p
\JGermanium\j offers higher speeds than silicon, and silicon carbide can operate at much higher temperatures. Unfortunately, it is hard to add a lot of \Jgermanium\j to silicon because the \Jgermanium\j atoms are so large that they strain the silicon lattice. If carbon is added, this seems to offer a more stable structure.\p
Even adding 1-2% of carbon to a silicon-\Jgermanium\j alloy requires heroic efforts, but starting with silicon carbide, the group has been able to get results as high as 4%. This alloy conducted twice as much current, compared with pure silicon carbide. Silicon carbide is a pure form of the abrasive material used on sandpaper, carborundum. It stands up to very high temperatures, which means that the chips made of the material could be well-suited to use in high-\Jtemperature\j environments such as \Jautomobile\j and jet engines.\p
Up until now, silicon carbide has not been commonly used as a semiconducting material, mainly because researchers have found it difficult to "dope" it with impurities and to make electrical contacts. The researchers describe the new approach as "bombarding the silicon carbide with big \Jgermanium\j atoms, to embed them in the substrate's surface." They ionized \Jgermanium\j atoms with a hot wire, removing electrons and giving the atoms a positive charge. The ions were then fired at the silicon carbide at high speed, forcing them into place. Lastly, a quick blast of heat was used to get the atoms to reorganize themselves within the crystalline lattice. \p
During this implantation process, the relatively translucent, greenish silicon-carbide turned into a gray alloy, while heating seemed to lighten the color to reddish. This change may reflect a rearrangement of the material's electronic structure, or band gap, which alters its ability to absorb light. This development may be worth watching.\p
\BKey names:\b James Kolodzey, Gary Katulka, and Cyril Guedj.\p
#
"Explaining superconductivity",1050,0,0,0
(Apr '99)
When \1superconductivity\c was discovered in 1911, it was only seen in very cold metals, where the \Jtemperature\j was less than 5 Kelvin, about 290 (\JCelsius\j) degrees below room \Jtemperature\j. At these low temperatures, electrons can flow through the material without any resistance. This means that a superconductor carries a current without the \Jenergy\j loss, in the form of heat, that occurs in an ordinary conductor.\p
Yet while this means that currents could potentially flow in a superconductor forever, traditional superconductors have found few practical applications. In simple terms, the problem is that a lot of \Jenergy\j has to be invested to cool them down to the required temperatures. In 1986, when chemical compounds which superconduct at much higher temperatures were discovered, all of that changed. At normal pressure, the best of these superconductors remain superconducting up to 135 Kelvin, which means much less cooling is required to initiate and maintain \Jsuperconductivity\j.\p
These "high \Jtemperature\j" superconductors have layers of copper and oxygen, with other layers sandwiched between them. But while we know this, we lack any real theoretical understanding of high \Jtemperature\j \Jsuperconductivity\j in the copper \Joxides\j. \p
The theory of low \Jtemperature\j \Jsuperconductivity\j in ordinary metals was developed in 1956 and is now well accepted. It goes like this: electrons that normally move through the material individually, losing \Jenergy\j by colliding with impurities, become paired up in the superconducting state. Electrons have "spin," a tiny magnetic moment, but the spins of the two electrons in a pair are lined up in antiparallel fashion, so the pair is actually non-magnetic. \JElectron\j pairs, which can move through the material without dissipating \Jenergy\j, also exist in high \Jtemperature\j superconductors.\p
But what keeps the pairs together in the copper \Joxides\j? That remains a mystery, but most theorists now agree that the mechanism that leads to pairing in traditional superconductors, vibrations of the atomic nuclei, cannot be responsible for high \Jtemperature\j \Jsuperconductivity\j.\p
An experiment reported in \INature\i in mid-April provides some clues about what may take the place of atomic vibrations in pairing up the electrons. Using neutrons to excite and detect fluctuations of the \Jelectron\j spins in a particular high \Jtemperature\j superconductor of chemical formula Bi\D2\dSr\D2\dCaCu\D2\dO\D8\d, they found a "collective" spin excitation in the material. In other words, the \Jelectron\j spins suddenly begin to move in unison in the superconducting state. \p
Usually this type of collective spin excitation is only found in magnetically ordered materials such as iron. So if similar effects occur in high \Jtemperature\j superconductors, this suggests that there is a magnetic pairing mechanism there as well.\p
As yet, there is still no comprehensive theory to explain high \Jtemperature\j \Jsuperconductivity\j, but at least we now appear to be several steps closer - and that may mean we are several steps closer to discovering even better materials which superconduct at even higher temperatures, perhaps even at room \Jtemperature\j. \p
#
"A vaccine against melanoma",1051,0,0,0
(Apr '99)
A paper read in April to an American Association for Cancer Research meeting in Philadelphia suggests a novel approach to malignant \1melanoma\c, treating it with a custom-made vaccine created from a patient's own cancer tumor cells. The treatment appears to have promise in treating patients where the \Jmelanoma\j has spread to two \Jlymph\j node areas. The treatment was already known to be effective in \Jmelanoma\j patients generally, but this study shows its usefulness in patients with a poor prognosis. Each patient in the study had had the visible growth removed surgically, but had secondary growths in two \Jlymph\j node areas.\p
As a rule, 39% of patients in this category are expected to live three years, but in a sample of 62 patients, 36 were still alive after 55 months. This gave projected five-year relapse-free and overall survival rates for these patients of 45% and 58% respectively, comparing favorably with the 15-25% survival rates for patients treated by surgery alone. A larger phase III trial involving 400 patients in the USA is now underway.\p
\BKey name:\b David Berd.\p
#
"Do it, but don't overdo it",1052,0,0,0
(Apr '99)
Two psychologists reported an interesting discovery about sexual activity to a meeting of the Eastern Psychological Association in Providence, Rhode Island during April. Based on immunoglobulin A (IgA) levels, one or two sexual encounters a week are better for you than no sex, or a great deal of sex. IgA is an antibody found in \Jsaliva\j and mucosal linings, and higher levels of IgA indicate a boost to the immune system. \p
IgA binds to pathogens at all the points of entry to the body, then calls on the immune system to destroy them, so people with higher levels are likely to be healthier. Carl Charnetski and Frank Brennan carried out the study by asking 111 Wilkes University undergraduates, aged 16 to 23, how frequently they'd had sex over the previous month. They also measured levels of IgA in the volunteers' \Jsaliva\j.\p
Those participants who claimed to have had sex less than once a week had a tiny increase in IgA over those who abstained completely. Those with one or two sexual encounters each week had a 30% rise in levels of the \Jantigen\j, but those reporting very frequent sex, three times a week or more, had lower IgA levels than the abstainers.\p
One explanation may be that sexually active people are exposed to more pathogens and infectious agents, and that the immune system responds by producing more IgA. The low IgA levels in the most active group are more puzzling: stress and anxiety are known to make IgA levels drop, so these people may have been stressed for some reason, but this remains a speculation.\p
An alternative explanation could be that health levels, as indicated by IgA, determine how many encounters people report, with unhealthy people having lower self-esteem, and so claiming a higher incidence. Only time will reveal which interpretation is the most reliable.\p
#
"Why a modified cold virus kills cancer cells",1053,0,0,0
(Apr '99)
The American Association for Cancer Research (AACR) annual meeting in Philadelphia in April was told that UC San Francisco researchers might have determined how the experimental anti-cancer "drug" ONYX-015 kills cancer cells. More importantly, these insights may provide a deeper understanding of cancer \Jbiology\j. (ONYX-015 is not actually a drug, but a \1virus\c - all the same, it needs to be assessed by the same protocols that are used for drugs.)\p
This understanding may help the interpretation of further phase II trials of ONYX-015, but it could also lead to modified approaches for killing cancer cells with tumor-specific viruses, an active line of research at the moment. ONYX-015 is already in phase II trials against head and neck cancer - these trials are small-scale studies designed to determine the effective dosage of a new drug - and the developers say that further phase II trials will now target patients with colon, pancreatic, and ovarian cancer.\p
The agent is a genetically engineered version of the cold virus, known as an adenovirus, and several recent studies have brought into question the presumed mechanism of action of this treatment. Briefly, the virus is active against one large class of \1tumor\c cells, and might also be able to kill other tumor cells, apparently by an unknown mechanism. And that rings the alarm bells, because if it is killing "bad" cells in an unknown way, it may also kill "good" (healthy) cells.\p
The basis of the agent's action is delightfully simple. Certain classes of cancer, about 60% of all malignant tumors, arise because a gene known as p53 has somehow been knocked out. The p53 gene stops cells from slipping into uncontrolled cell growth, a common feature in cancer. It also has a role in apoptosis, the programmed cell death seen in \1metamorphosis\c.\p
ONYX-015 was designed by Frank McCormick to take advantage of this. The adenovirus contains a gene called E1b which disables p53, allowing it to attack a healthy cell. If an adenovirus has no E1b gene, it cannot invade normal healthy cells because it cannot disable the healthy p53 gene. Tumor cells, on the other hand, if they lack the p53 gene, should be an easy target. In other words, given a disabled adenovirus, we have a magic bullet, a smart bomb, an agent that can tell friend from foe, and wipe out the foe. The viruses would enter the tumor cells, and with no p53 gene to inhibit them, would replicate continuously, disrupting cell behavior, and ultimately causing cell death. \p
That, at least, is the theory behind McCormick's work, but recent laboratory studies have also shown that ONYX-015 is able to replicate in tumor cells in which p53 remains intact. So if that part can go wrong, what else might the virus be doing? Is it a cause for worry?\p
Apparently not. McCormick has taken into account some work reported in \INature\i last September to offer a possible explanation of what is happening. The explanation involves a second gene, p14ARF, which appears to be mutated and defective in some tumor cells - it seems that damage to p14ARF may indirectly disable the p53 function. \p
McCormick reports that when normal cells are infected with ONYX-015, p53 is induced, and the virus is shut down. When tumor cells with their p53 genes intact are missing p14ARF, the p53 induction does not occur, leaving the cells open to adenovirus attack.\p
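The revised explanation boils down to a small piece of logic: ONYX-015, lacking E1b, replicates only when the cell cannot mount a full p53 response, and that response needs both p53 and p14ARF to be intact. Here is a toy rendering in Python (the function and its arguments are our own illustration, not anything from the study):\p
def onyx015_can_replicate(p53_intact, p14arf_intact):
    # ONYX-015 lacks E1b, so it cannot disable p53 itself; the p53
    # defense only works if both p53 and p14ARF are functional.
    p53_response = p53_intact and p14arf_intact
    return not p53_response

assert not onyx015_can_replicate(True, True)   # normal cell: virus shut down
assert onyx015_can_replicate(False, True)      # classic target: p53 knocked out
assert onyx015_can_replicate(True, False)      # the surprise case: p14ARF mutated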
The next step will be to try putting the p14ARF gene back into these tumor cells and seeing if it really does stop the virus from growing. "If the p14ARF gene's function is proven, it could also serve as a new target for cancer therapies in some tumors lacking functional p53 genes," says McCormick.\p
\BKey name:\b Frank McCormick (founder of Onyx Pharmaceuticals).\p
#
"Microgravity and gene transfer in plants",1054,0,0,0
(Apr '99)
Transferring desirable genes into crop plants is a chancy business, with success rates typically coming in at something like 1 in 1000. But like lotteries and other games of chance with low odds, the payoff when you are successful seems to make up for it.\p
Yet while most gamblers dream of finding a way to shade the odds in their favor, and always fail, a recent research project on the Space Shuttle suggests that \1microgravity\c might enhance genetic \Jengineering\j of plants. When you remove gravity from the mix, it seems the odds of success get much better. Researchers had thought a two-fold increase would be nice, but in the event, the success level jumped ten-fold compared with a control experiment carried out on the ground - taking the odds from roughly 1 in 1000 to something like 1 in 100.\p
The method developed at the University of Toledo uses \Jbacteria\j to transfer genes. It begins with deliberate damage to the meristem (growing region) of a plant seedling. Next, \Jbacteria\j with the desired gene are inserted into the damaged cells, so that they are infected. Then with luck, the later plant parts that derive from the infected meristem cells will contain the desired gene.\p
Usually, the \Jbacteria\j die off harmlessly, but the microgravity environment caused such high levels of infection that the vascular systems of the plants were blocked. Future missions will use lower levels of \Jbacteria\j in the application to try to get around this.\p
Now the problem is to explain the microgravity effect. It is likely to be explained by the fact that cell materials do not settle out in microgravity, so there is more freedom of movement. With less restriction on their movements, the \Jbacteria\j probably hit their targets more easily, say researchers.\p
Gene transfer techniques are becoming increasingly important as a complement to traditional plant breeding, developing strains with better yields and higher resistance to pests and disease. The experiment used a gene that merely causes \Jfluorescence\j, allowing it to be tracked, but there are now plans to "transfer a gene that has been shown to relieve certain human autoimmune diseases."\p
Around 30% of the International Space Station, now being assembled in \Jorbit\j, is dedicated to private commercial use, so this sort of discovery also has a straightforward commercial interest, aside from what it tells us about science. All that seems to be needed is for the experiment to be refined and repeated to ensure that the seedlings survive. \p
#
"Controlling apoptosis",1055,0,0,0
(Apr '99)
Apoptosis, or controlled cell death, is essential to the \1metamorphosis\c and survival of complicated life forms. Human fingers are formed when the cells that web the fingers together die off, and cell death can deprive an invading virus of the \Jinfrastructure\j it needs to replicate and infect other cells. But while we know a good deal about what apoptosis does, and a fair amount about how it does it, science has been largely ignorant of the way this orderly, programmed cell death is switched on.\p
A report in \IScience\i in late April reveals that nitric oxide (NO), a well-studied molecule implicated in a host of communication pathways in and between cells, is also a switch that controls whether cells live or die. The 1998 Nobel Prize for \JPhysiology\j or Medicine went to Robert F. Furchgott, Louis J. Ignarro, and Ferid Murad for their discoveries concerning "nitric oxide as a signaling molecule in the cardiovascular system" (see \BNobel Prize for \JPhysiology\j or Medicine,\b October 1998), but this latest discovery surely makes nitric oxide one of the most common signaling molecules in all of \Jbiology\j.\p
It binds to proteins and regulates their activity, and there is now increasing evidence that it regulates cell growth and cell differentiation. It is also implicated in the operation of the drug Viagra, and now it appears to control the death program of cells. The NO molecules are reported to occupy a critical site on the \Jenzyme\j caspase, a molecular "executioner" within human cells. So long as the NO molecule is in place, it blocks the communication path that would activate the caspase molecules and bring about the death of the cell. All that needs to happen is for nitric oxide production to be blocked within the cell, and the cell becomes more susceptible to death. On the other hand, if NO production is restored again, cell death is prevented.\p
There is a biochemical chain of events known as the Fas pathway which triggers apoptosis. When it is set off, the Fas pathway initiates a cascade of signals within the cell that ultimately turns on caspase, so long as the crucial site on the caspase molecule is not blocked - or so long as Fas does not somehow take the NO away.\p
The researchers believe that this may lead to ways of preventing damage caused in cases of heart failure, liver damage or arteriosclerosis by blocking apoptosis. They also suggest that manipulating the NO switch might control the formation or progress of tumors.\p
The next step will involve finding out how the NO is removed from its position on the caspase \Jenzyme\j, but there must be things that remove the nitric oxide molecule, so it will hopefully be just a matter of time before the pathway is found.\p
\BKey name:\b Jonathan S. Stamler.\p
#
"Seeing the core-mantle boundary",1056,0,0,0
(Apr '99)
Geologists may never see deep inside the \JEarth\j in the sense that \BJules Verne\b had in mind when he wrote \IVoyage au centre de la terre,\i or "Journey to the Center of the \JEarth\j," but a seismologist at Washington University in St. Louis has just provided an unprecedented view of the \JEarth\j's core-mantle boundary through analysis of seismic waves from a unique array of eastern US \1seismograph\c stations. \p
Michael E. Wysession is part of a seismological team that installed the Missouri-to-\JMassachusetts\j (MOMA) network of 18 sophisticated seismographs in 1995 and recorded data until 1996. The data have now provided a detailed picture of the bottom of the mantle, where rock meets iron.\p
The base of the mantle is made up of two types of rocks that are distinctly separated. Describing his work in \IScience\i in early April, Wysession details the rock layers. One is made up of cold slabs of recycled oceanic floor that are spreading horizontally at the core-mantle boundary, while the other layer is made up of mantle dregs which are pushed around by the descending slabs.\p
Wysession reached his conclusion by looking at the ratios of the two types of seismic waves that come from an \Jearthquake\j. Seismic P waves shunt their way through, with each piece of rock shoving the next, rather like a sound wave, while S (or shear) waves travel in a sideways movement, rather like a rope which has been given a shake. The speeds of these two wave types vary as they travel through different materials, and so a comparison has long been used to map different types of rock at the \JEarth\j's surface. This is the first time that the core-mantle boundary has been investigated in this way.\p
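The comparison works because the two wave types sample different elastic properties of the rock: P wave speed depends on compressibility as well as rigidity, while S wave speed depends on rigidity alone. A quick check in Python, using round, textbook-style values for the lowermost mantle rather than anything from the MOMA data:\p
from math import sqrt

K = 600e9     # bulk modulus, in pascals (illustrative lowermost-mantle value)
mu = 290e9    # shear (rigidity) modulus, in pascals (illustrative)
rho = 5500.0  # density, in kg per cubic meter (illustrative)

v_p = sqrt((K + 4 * mu / 3) / rho)  # P wave speed
v_s = sqrt(mu / rho)                # S wave speed
print(round(v_p), round(v_s))       # about 13,400 and 7,300 m/s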
Normally, the P and S waves vary in tandem, and this effect is even stronger when the variations are due to changes in \Jtemperature\j. Given that slabs of ancient sea floor sink to the base of the mantle, the researchers expected to see a gradual change as the slabs spread across the top of the core and heated up. \p
Instead, they found rapid change. On the boundary beneath \JAlaska\j, the S waves traveled fast and the P waves were slow. But further south, there was a sudden switch, and the P waves were fast while the S waves were slow. The change was abrupt, and Wysession described it as being ". . . like standing on a shoreline with the continent on one side and the ocean on the other."\p
The rock at the base of the mantle beneath \JAlaska\j was once part of the Pacific Ocean sea floor, but it sank into the mantle more than one hundred million years ago. Since then, this cold rock has sunk all the way down to the top of the core, pushing aside the chemical boundary layer into two large lumps, one beneath the central Pacific, and one beneath western \JAfrica\j. These lumps serve as the birthplace for most of the \JEarth\j's hotspot plumes (see, for example, \BThe African landscape shaped by single \Jmagma\j plume,\b October 1998, and \BFinding a hot spot,\b November 1998).\p
Just as the relatively buoyant continents float on the mantle, these cold slabs "float" on the core, though they probably gain heat and begin to rise again soon after.\p
The evidence came from core-diffracting waves emanating from 50 earthquakes that occurred in the \JEarth\j's major \Jearthquake\j belt, stretching from New Zealand to \JJapan\j. These waves diffract, or bend, around the core in much the same way that sound waves bend around the corner of a building. The P and S wave speeds observed at the core-mantle boundary, the researchers realized, could only be explained if the waves were passing through an anisotropic material.\p
\1Anisotropy\c occurs when the microscopic mineral grains that make up a rock are preferentially aligned in a particular direction. The most common cause is some sort of shearing force: in the upper mantle, anisotropy is commonly seen in olivine which becomes aligned in the same direction that the rock is convecting or flowing. In this case, the anisotropy shows up as a splitting of the S waves according to their polarization. In simple terms, the S waves oscillating in one direction arrive faster than the S waves oscillating at right angles to that direction. This sort of S wave splitting was apparent in the core-diffracted waves beneath \JAlaska\j. \p
The unusually slow P waves under \JAlaska\j can also be explained by anisotropy in the area, which tells us that the ancient sea floor slabs are flowing outward near the top of the core, one of the few examples we can point to of such detailed evidence from so far down.\p
All the same, this may not be the whole story; Wysession believes that we may be on the verge of some major discoveries about mantle dynamics, as far-reaching as the discovery of plate tectonics. It is impossible to predict where such insights may lead, but a better understanding of hot spot plumes alone would make this a major new branch of \Jgeology\j.\p
"We're getting much better glimpses of processes that shape the deep \JEarth\j, and also an understanding of the circulation of rock from the surface to the core and back up again and how that shapes the \Jevolution\j of our continents," Wysession says. \p
#
"Bad news for forests",1057,0,0,0
(Apr '99)
Taking trees from rainforests is bad enough, but the ecological damage is heightened further by the staggering numbers of gorillas, elephants, and other wildlife being killed and sold as "bushmeat." A report by the Bronx Zoo-based Wildlife Conservation Society (WCS), published in the journal \IScience\i in late April, says that markets for wild game have been boosted by new logging roads which have opened up some 6 million hectares (23,000 sq miles) of previously inaccessible land.\p
In the \JCongo\j, the levels of hunting in communities adjacent to logging roads are 3-6 times higher than in roadless areas. This means that even when logging policies seek to protect rainforests through "sustainable \Jforestry\j," the result may still be an ecological disaster.\p
Part of the problem is that many of the larger animals like elephants and tapirs help to regenerate trees through seed dispersal, and the larger animals are the most likely targets for hunters. Then there is the human aspect: local residents who have relied for a long time on subsistence harvesting of bushmeat now find themselves faced by a local shortage.\p
#
"Next, now for the weather next century . . .",1058,0,0,0
(Apr '99)
The latest results from a new climate system model developed at the National Center for Atmospheric Research (NCAR) suggest that CO\D2\d emissions could increase global average temperatures by 2°C (3.6°F) by the end of the next century, while rainfall in the US Southwest and Great Plains could rise by 40%.\p
On the other hand, if the CO\D2\d increase were limited to half the expected level, most of the extra rain and snow would disappear, and the \Jtemperature\j increase would drop by about a third. In either case, the changes will be three to four times greater than those experienced since 1900.\p
The model simulates the \JEarth\j's climate from 1870 to 1990, and then continues the simulation to 2100 under two different scenarios. The first was a "business-as-usual" increase in greenhouse gases in which atmospheric carbon dioxide doubles over the next century. In the second, carbon dioxide increases are stabilized at 50% above today's concentrations.\p
In the first projection, changes in precipitation (as rain or snow) vary markedly by region and by season. The changes are reduced when carbon dioxide emissions are limited, according to the model. \p
The two scenarios show no clear separation until 2060, although the CO\D2\d levels begin to diverge in 2010. The half-century lag in reaction time is put down to the large thermal \Jinertia\j in the \JEarth\j's climate system, especially in the oceans, according to NCAR scientists.\p
NCAR scientists say their model is one of the first which does not need special corrections to keep the simulated climate from drifting to an unrealistic state. They add that it is one of only a few models in the world capable of realistically simulating the chemistry and transport of individual greenhouse gases and sulfur compounds. This is important if it is going to take into account the likely changes in sulfur dioxide. This gas brings about cooling in the \Jatmosphere\j, but because it is also a pollutant, most societies will be taking steps to reduce their sulfur dioxide emissions.\p
#
"The threat to tropical coral reefs",1059,0,0,0
(Apr '99)
The problems of increasing levels of carbon dioxide may go further than global warming effects. A paper in \IScience\i in early April argues that tropical \Jcoral\j reefs may be under threat, and that some reefs may already be declining.\p
The study looks at \Jcoral\j reefs located in surface waters between 35 degrees north and 35 degrees south of the equator, and the authors predict that the reefs in greatest danger are those where the production and destruction of \Jcalcium\j carbonate are closely balanced. These include some higher-latitude reefs, like those off \JBermuda\j; reefs in areas where colder, deeper waters rise to the surface, like those off the Galapagos Islands; and many reefs already stressed by human activity. \p
A \Jcoral\j reef develops as \Jcalcium\j carbonate accumulates from the activities of corals and other \Jcalcium\j-secreting organisms, such as coralline \Jalgae\j. While the seas in such areas remain saturated with \Jcalcium\j, the reef remains strong, but the effect of higher carbon dioxide levels is to raise the \Jcalcium\j levels required for saturation.\p
The effect is simple: \Jcalcium\j carbonate is highly insoluble, but extra carbon dioxide allows the formation of \Jhydrogen\j carbonate ions (previously known as "bicarbonate ions"), and these are very soluble. Putting it another way, the dissolved carbon dioxide makes seawater more acidic, so that the amount of \Jcalcium\j present, even though it remains the same, is no longer enough to saturate the seawater. The extra carbon dioxide causes \Jcalcium\j carbonate production to decline, so that \Jcoral\j and algal skeletons weaken and reef building slows or even stops. At that point, the reef is more vulnerable to erosion effects.\p
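In outline, the chemistry is the standard textbook equilibrium (our summary, not the paper's notation):\p
CO\D2\d + H\D2\dO + CaCO\D3\d = Ca(HCO\D3\d)\D2\d (dissolved; the reaction is reversible)\p
Adding carbon dioxide drives this reaction to the right, converting insoluble \Jcalcium\j carbonate into soluble \Jcalcium\j \Jhydrogen\j carbonate, so reversing it - which is what reef organisms must do to deposit their skeletons - becomes correspondingly harder.\p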
The study used a moderate projection of CO\D2\d levels, which predicts that levels will increase to twice the pre-industrial level by 2065. It is assumed that this production will have an effect on oceanic CO\D2\d levels, but this appears to be the first study to explore the ecological consequences of increased levels of CO\D2\d at the tropical sea surface.\p
The study is speculative, but it makes a confident prediction that the level of \Jcalcium\j carbonate saturation will drop by 30%, compared with pre-industrial levels. The predictions will need to be tested with laboratory and field measurements, but if the speculations prove to be well-founded, many \Jcoral\j reef species could be vulnerable.\p
Recent \Jcoral\j bleaching events have been blamed on warmer sea temperatures, but some workers have suggested that warmer waters might be useful for reefs in chillier waters. If the \Jcalcium\j carbonate saturation rate is as important as \Jwater\j \Jtemperature\j in reef building, then warmer waters will not save higher-latitude reefs. \p
\BKey name:\b Joan Kleypas.\p
#
"Counting sprites",1060,0,0,0
(Apr '99)
\JLightning\j strikes are sometimes associated with a "sprite," a high-altitude luminous red glow. The sprites are electrical phenomena that appear above thunderclouds, reaching the lower ionosphere. A report in \IGeophysical Research Letters\i in early April describes a method of accurately estimating the numbers of sprites in a storm.\p
About one \Jlightning\j strike in 200 is associated with a sprite. The sprite can tower up to 90 km (55 miles) above a thundercloud, occurring simultaneously with a \Jlightning\j strike, and can be seen with the naked eye, sometimes from as far away as 600 km (about 400 miles).\p
Sprites can be difficult to detect, especially when they are masked by clouds, so the new method is interesting - all the more so because the researchers believe it is possible to count every sprite in a whole hemisphere with just four low-cost radio receivers.\p
Sprites are found above all the major landmasses of the \JEarth\j, and they provide "spectacular luminous evidence of electrodynamic coupling between the neutral \Jatmosphere\j in which \Jweather\j processes occur and the higher altitude (60-90 km) ionized regions of the \Jearth\j's \Jatmosphere\j," in the words of a member of the research team. These ionized regions are the levels we call the \Jmesosphere\j and the lower ionosphere.\p
The key to the new detection method lies in the difference between the radio signals produced by \Jlightning\j discharges that lead to sprites and the radio signals due to other \Jlightning\j discharges. After a careful analysis of a \Jthunderstorm\j over Kansas in 1996, a storm with no less than 98 sprites, the researchers have found that it is the \Jlightning\j itself, not the sprite, which makes the difference. A typical \Jlightning\j strike occurs in one-tenth of a millisecond, but the sprite-related \Jlightning\j discharges emit a much longer-lasting electrical current with a lifetime of several milliseconds. This is a relatively long discharge, and the difference is easily detected when the radio signals are examined.\p
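Given that signature, the detection logic is simple enough to sketch in Python. Each discharge is reduced to an estimated current duration in milliseconds; the 1 ms threshold and the sample data are our own illustrative choices, not figures from the paper:\p
def looks_sprite_parent(duration_ms, threshold_ms=1.0):
    # A typical stroke lasts about 0.1 ms; sprite-parent discharges
    # keep a current flowing for several milliseconds.
    return duration_ms >= threshold_ms

durations = [0.1, 0.08, 3.2, 0.12, 5.0]  # invented sample of five discharges
candidates = [d for d in durations if looks_sprite_parent(d)]
print(len(candidates), "of", len(durations), "look sprite-related")  # 2 of 5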
As the researchers have pointed out, a storm in \JBrazil\j could be monitored by stations in \JCalifornia\j and \JAntarctica\j, so while video recording is unreliable and hugely expensive, it will now be possible to monitor all the sprites in the world. While sprites do not cause any serious problems, so far as we know, they may cause chemical changes in the \Jatmosphere\j, but without the sort of study which now seems possible, we would never know.\p
\BKey names:\b Steven Reising, Umran Inan, and Timothy Bell.\p
#
"Antarctic ice shelves breaking up",1061,0,0,0
(Apr '99)
Researchers at the University of \JColorado\j at Boulder's National Snow and Ice Data Center and the British Antarctic Survey reported on the \JInternet\j in April that two ice shelves on the Antarctic Peninsula, known as the Larsen B and Wilkins, are in "full retreat" and have lost nearly 3,000 square kilometers of their total area in the last year.\p
They say the cause is global warming, pointing out that the trend has caused the annual melt season to increase by two to three weeks over the last 20 years. \JSatellite\j photos monitored by NSIDC show that the Larsen B ice shelf has continued to crumble after an initial small retreat in spring 1998 (see \BLarsen B Ice Shelf loses 200 square kilometers,\b April 1998). \p
In a series of events that began in November 1998, an additional 1,714 square kilometers (670 square miles) of shelf area have been lost, according to Research Associate Ted Scambos of CU-Boulder's NSIDC. On top of that, Scambos adds, on the opposite side of the peninsula, the Wilkins Ice Shelf retreated nearly 1,100 square kilometers (400 square miles) in early March 1998.\p
Radar \Jsatellite\j images have confirmed that a breakup is underway. The images show a large area of completely shattered ice, indicating an ice front 35 kilometers (22 miles) back from its previous extent, and thousands of small icebergs. The sudden appearance of these icebergs suggests that the shelves essentially break up in place and are then flushed out by storms or currents afterward.\p
The British Antarctic Survey scientists had predicted one of these retreats, using computer models to demonstrate that the Larsen B was nearing its stability limit. With the small breakup observed last spring, the shelf had already retreated too far to continue to be supported by adjacent islands and shorelines. \p
Scientists at both institutes expected the two shelves to fail soon, but the current disintegration is occurring at an even faster rate than earlier breakups gave reason to anticipate. While the retreat appears to have been going on for about 50 years, previous losses only added up to about 7,000 square kilometers (2,700 square miles). That is about the same size as the remaining Larsen B ice shelf, while the Wilkins ice shelf is about twice that size.\p
So will this affect sea levels? Not at all: the ice shelves are already floating on the sea, and by Archimedes' principle a floating shelf displaces its own mass of seawater, so the melt \Jwater\j - though smaller in volume than the ice - simply fills the space the ice was already occupying below the waterline. Other effects are possible, though: the world's albedo will be different, and the breakup may have a significant effect on long-distance ocean currents.\p
#
"Telomerase broadens its scope",1062,0,0,0
(Apr '99)
The \Jenzyme\j telomerase (see \BA breakthrough with human embryonic stem cells,\b November 1998) still has some surprises for us in its method of operation. UC San Francisco researchers reported in the \IProceedings of the National Academy of Sciences\i during April that it can wield its power in an unexpected way. \p
If you add telomerase to normal cells, it extends the healthy life span of the cells indefinitely. The \Jenzyme\j synthesizes telomeres, snippets of DNA on the ends of chromosomes which work rather like the plastic tips on the ends of shoelaces, preventing the gene-bearing, threadlike strands from unraveling. \p
With each cycle of cell division, a small part of the tips of the telomeres drops off, until the "chemical bookends" become so eroded that the chromosomes become unstable, a condition which signals cells to stop dividing. This natural control on cell division is known as the Hayflick limit. Cancers develop when cells increasingly ignore or override such signals of \Jchromosome\j impairment. \p
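The Hayflick limit lends itself to a toy simulation. A minimal sketch in Python - the telomere length, loss per division, and critical threshold below are illustrative round numbers, not measurements:\p
telomere = 10_000        # telomere length in base pairs (illustrative)
loss_per_division = 100  # base pairs lost per division (illustrative)
critical = 5_000         # below this, the chromosome is unstable (illustrative)

divisions = 0
while telomere > critical:
    telomere -= loss_per_division
    divisions += 1
print(divisions)         # 50 divisions - a Hayflick-style limit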
So how do our cells hold on long enough to grow complex things like us? Simple: our body cells derive from stem cells, whose telomeres are maintained in a complete form. Once cells leave the stem cell stock, however, a "clock" is started - a count-down that limits further replication. \p
The standard understanding has always been that telomerase extended a cell's life by lengthening the telomeres, but the new report suggests that telomerase can extend the life span of human fibroblast cells without lengthening telomeres. Telomerase, the researchers discovered, caps the ends of telomeres and, in so doing, somehow makes the \Jchromosome\j more stable, regardless of telomere length. \p
Some of the cells, according to Elizabeth Blackburn who co-discovered telomerase in 1985, thrived with very short telomeres. In fact, the telomeres were shorter than in another set of cells not given the \Jenzyme\j and which had stopped replicating. Conclusion: the theory may be on the right track, but it is not yet the whole story. There are new opportunities out there, new options for exploitation. The finding does not suggest that telomere length is not important to cell life span. Rather, it reveals a new factor which contributes to \Jchromosome\j stability, and it offers a new direction - or directions - for potentially manipulating the \Jenzyme\j for therapeutic purposes.\p
The discovery of telomerase has opened up a whole field of inquiry into prospects for turning the \Jenzyme\j on or off to prolong life or combat cancers respectively. For example, while the \Jenzyme\j is "turned off" in many normal cells in humans, researchers can activate it by inserting the gene for its protein component, known as TERT, into cells. This usually extends the life span of the cells, apparently forever, and cuts down the occurrence of abnormal chromosomes, but it has always done so by lengthening the telomeres, and that did not happen this time. Instead, the telomerase appears just to have placed a cap on the telomeres.\p
The human fibroblast cells had been set up to imitate cancer cells, to be more likely to begin dividing excessively. They were not full-fledged cancer cells, since some molecular checkpoints remained in place, but once telomerase was added, they went into uncontrolled division, revealing that the \Jenzyme\j is a cancer-promoting factor in cells which are already part way along the road to malignancy. \p
The find is probably a matter of lucky timing, with the chromosomes in the cultured cells just happening to have very short telomeres at the moment when the \Jenzyme\j was added. As a result, the researchers managed to catch the molecular to-and-fro of telomere length at the unusual moment when telomeres were very short and telomerase was needed to kick in and stabilize the \Jchromosome\j tips, thus revealing telomerase's unique contribution. Instead of the telomeres getting longer, they were capped by the telomerase.\p
Because the telomeres were still getting shorter, even in the presence of telomerase, says Blackburn, the \Jenzyme\j could not keep up with the telomere loss. She believes there may be three or four methods that chromosomes have of buttressing themselves, and that these methods co-operate, so if one device is taken away, the \Jchromosome\j remains stable. The multiple levels remain hidden unless one method is completely knocked out, as happened in this case, where the capping effect was unseen until the telomeres got very short.\p
The next problem in this chain will be to work out how the telomerase works in its newly revealed capacity. Researchers expect to be looking at mutants in yeast and human cells to try to figure out this mechanism.\p
Yet even while it is not understood, the effect probably means that it will be easier to replenish cells by telomerase manipulation. Originally, scientists thought it would take many tens of cell-replication rounds for telomerase to build up the length of telomeres to an effective length, but now it looks as though telomerase can provide "a quick fix." As an example, Blackburn's team cites cases where it is necessary to build back a sufficient supply of healthy cells quickly, such as during bone marrow transplants. It may be possible to bring in the telomerase and then turn it back off again, before it can do harm, and start causing a risk of cancer.\p
\BKey names:\b He Wang and Jiyue Zhu, Elizabeth Blackburn, and J. Michael Bishop.\p
#
"Tobacco secrets",1063,0,0,0
(Apr '99)
At the very end of March, the University of \JCalifornia\j, San Francisco released onto the \JInternet\j more than 2,000 pages of previously secret \Jtobacco\j industry documents. These include hundreds of pages of internal industry memos detailing strategies to counter anti-\Jtobacco\j legislation and sentiment in \JCalifornia\j, covering the years 1990 to 1995. The released material is a small fraction of the 33 million pages of \Jtobacco\j industry documents now held at the State of \JMinnesota\j Depository, the result of a successful suit by that state against seven American \Jtobacco\j companies and several \Jtobacco\j trade groups. The suit was settled in 1998.\p
The released documents are located at http://www.library.ucsf.edu/tobacco/ and their presence on the World Wide Web shows one of the more interesting powers of the \JInternet\j.\p
#
"Seabed silt in the Indian Ocean",1064,0,0,0
(Apr '99)
Scientists at the Netherlands Institute for Sea Research (NIOZ) have been studying the flow of marine material settling to the deep ocean floor, using automated sediment traps. They have found that almost 90% of the silt on the floor of the Indian Ocean consists of the remains of plankton that bloomed in the course of the summer monsoon.\p
This means that studies of the oceanic sediments can provide insights into climatic changes in the past, since the blooming is greatly dependent on the monsoon climate, and the monsoon climate depends on large climatic shifts. \p
\BKey names:\b Geert-Jan Brummer, W. Helder.\p
#
"May, 1999 Science Review",1065,0,0,0
\JMay 1999\j
\JFossilized emu egg shells have a story to tell\j
\JYet another gamma-ray burster\j
\JThunderstorms make gamma rays too\j
\JEvidence of plate tectonics on Mars?\j
\JMapping Mars in three dimensions\j
\JWhen was the Big Bang again?\j
\JWhat was that satellite?\j
\JA new volcano\j
\JLiving with arsenic\j
\JEngineered corn and monarch butterflies\j
\JNew uses for the humble flatworm\j
\JDisabling bacteria\j
\JAn experimental staph vaccine\j
\JSlimy bacteria and chronic infections\j
\JIntestine transplant survivors\j
\JOral control of diabetes a step closer\j
\JNear-sighted children and the light\j
\JSafer gene delivery\j
\JTelomeres are big loops\j
\JMutton dressed up as lamb?\j
\JWomen under anesthesia\j
\JA neurological basis for dyslexia?\j
\JThe human capacity for mathematics\j
\JA hormone treatment for autism\j
\JCleaning up video with VISAR\j
\JAre the birds modern dinosaurs?\j
\JExplaining the scale of things\j
\JHigh carbon dioxide boosts Duke forest growth 25%\j
\JThe cost of low cholesterol\j
\JBacteria clean up DDT\j
\JA marker for malignant brain tumors\j
\JMelatonin not a good idea?\j
#
"May 1999",1066,0,0,0
The major news story for May seems to have been NASA's success in mapping the Martian surface. Other highlights include reports that new wonder-crops, engineered with Bt genes, came under attack because they may present a risk to the caterpillars of monarch butterflies. We also have a new age for the universe, and some interesting medical developments, but our award for elegant science this month is found in the first story, with researchers showing how fossilized emu egg shells tell us what the \Jweather\j was like in \JAustralia\j, three times further back than any previous record.\p
#
"Fossilized emu egg shells have a story to tell",1067,0,0,0
(May '99)
The Australian environment of the recent past has always been something of a mystery to researchers. The \Jpollen\j grains that scientists usually rely on for data in other parts of the world simply cannot survive in the harsh and arid Australian environment. Because of that, and because other vegetation records are also sparse, the record extended no further back than 18,000 years.\p
Now all of a sudden, we have information back to 65,000 years, and the results are most interesting. A mid-May report in \IScience\i reveals that there is a detailed account of past climates stored in the fossilized egg shells from \JAustralia\j's emu, an \Jostrich\j-sized, flightless, fast-running bird which eats a wide variety of plants. Some of the carbon taken up in the emus' food ends up in the egg shells, where the ratio of the stable carbon 12 and carbon 13 \Jisotopes\j remains locked as a permanent record.\p
The actual isotope ratio depends on what the emus eat, as studies of modern-\Jday\j emu shells confirm. Australian plants fall into two groups, depending on the biochemistry of their \Jphotosynthesis\j, and the distribution of the two groups depends on climate. One set of plants converts carbon dioxide to a compound with three carbon atoms during \Jphotosynthesis\j, and these C3 plants (mostly trees, shrubs and winter grasses) dominate southern \JAustralia\j, where it rains in the winter and temperatures are cooler.\p
The second group of plants changes the carbon dioxide into a compound with four atoms of carbon, and these summer-loving C4 grasses are dominant in northern \JAustralia\j, where the rainfall is mainly in summer and temperatures are warmer. The first reaction, in which carbon dioxide is fixed, differs between the C3 and C4 plants, and is controlled by different enzymes. Because of these differences in CO\D2\d fixation, the C3 and C4 plants incorporate slightly different amounts of carbon 13 into their organic material.\p
About 1.1% of all carbon on \JEarth\j is carbon 13, and almost all the rest is carbon 12, with just a trace of radioactive carbon 14 - enough to allow carbon dating on recent material. All matter derived from living things has a measurable concentration of carbon 13, and scientists describe this as a deviation from a standard ratio, one based on the \Jcalcium\j carbonate of a \Jfossil\j belemnite shell (the PDB standard). On this standard, the C4 plants are very "heavy," having only a little less carbon 13 than the standard, while the C3 plants have a distinctly smaller amount of carbon 13. The difference is great enough that the ranges found in the two plant types never overlap. (There are also some plants, called the CAM plants, with an intermediate range of carbon 13, but there are very few of these in \JAustralia\j.)\p
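For readers who want the arithmetic, isotope workers express these figures in "delta" notation: the parts-per-thousand deviation of a sample's carbon 13 to carbon 12 ratio from the PDB standard. Below is a minimal sketch of the calculation in Python; the reference ratio is the published PDB value, but the two sample ratios are merely illustrative of typical C4 and C3 plant tissue.\p
  # Delta-C13: per-mil deviation of a sample's 13C/12C ratio from the
  # PDB carbonate standard. The sample ratios below are illustrative only.
  PDB_RATIO = 0.0112372  # 13C/12C ratio of the PDB standard

  def delta_c13(sample_ratio):
      return (sample_ratio / PDB_RATIO - 1.0) * 1000.0

  print(round(delta_c13(0.011091), 1))  # about -13 per mil, typical of C4 grasses
  print(round(delta_c13(0.010934), 1))  # about -27 per mil, typical of C3 plants
Because the two ranges never overlap, the delta value measured in a shell points unambiguously at the class of plants in the diet.\p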
So the carbon which is fixed into the \Jcalcium\j carbonate emu egg shells is going to have a measurable amount of carbon 13, and that amount, once measured, tells us what types of plants the emus must have eaten, around the time they were laying their eggs.\p
The isotopic ratios come from fossilized eggshell samples from the Lake Eyre Basin of \JAustralia\j, an arid region which is about a sixth of the size of the entire continent. Some of the shells date from between 65,000 and 45,000 years ago, and it appears that the C4 grasses were abundant and readily available for the emus to eat during their breeding season at that time. Emus lay their eggs in winter, so it follows that the summer rainy season, which is influenced by the monsoon from the north, must have lasted longer back then. This would make the C4 grasses still present and a significant part of the diet when emus were laying their eggs in winter, a time when the summer-growth C4 grasses might otherwise have been less easy to find.\p
Between 28,000 and 15,000 years ago, the world's last glacial period was at its peak, and the isotope ratios tell us that the C4 grasses were almost entirely missing from the central Australian scene because of the colder climate. When the \Jearth\j warmed again in the Holocene period 10,000 years ago, the \Jplanet\j's monsoon systems were reactivated. In \JAustralia\j the C4 grasses expanded their range somewhat, but the \Jisotopes\j in the egg shells tell us that the central Australian \Jecosystem\j did not fully revert to its former C4 abundances. This suggests that over the last 65,000 years, environmental factors other than climate have significantly influenced the Australian ecology. \p
On current evidence, the first humans arrived in \JAustralia\j about 60,000 years ago - or at least, numbers of humans large enough to affect ecosystems only arrived and became established then. The early Australians seem to have used fire as a farming tool - given the Australian soils, fauna and flora, this was the most appropriate way to farm - and this probably brought about some of the changes. The key is probably the transfer of moisture from plants and soils to the \Jatmosphere\j, which acts as a \Jfeedback\j mechanism, pulling the monsoon rains of summer down into the middle of the large flat island continent.\p
The use of "firestick farming" would have changed the proportions of the plants, weakening this \Jfeedback\j and reducing the rains which reached what is now the arid Australian interior. There is no doubt that, at about the same time as the first Australians arrived, many large Australian animal species died, but whether there is a causal link is still not proven. It is, however, a topic that promotes passionate argument among anthropologists and biologists.\p
The climate changes which allowed humans to travel to \JAustralia\j may also have killed the megafauna, or humans may have hunted the megafauna to \Jextinction\j, or human actions may have caused climate changes which led to the loss of these species. On the face of it, the emu eggs seem to be telling us that it was the third option, since many of the animals appear to have become extinct at just the time when the rainfall levels, as shown in the emu egg shells, dropped.\p
What we do know is that European colonization, beginning in 1788, and really taking off around 1850, further reduced the proportion of C4 grasses. This modern change, though, is more related to over-grazing than to any further shift in climate, with other human activities and introduced species also suspected of playing a role.\p
\BKey names:\b Beverly J. Johnson and Marilyn L. Fogel in the USA, John Magee, Gifford Miller, Michael Gagan, and Allan Chivas in \JAustralia\j.\p
#
"Yet another gamma-ray burster",1068,0,0,0
(May '99)
The Burst and Transient Source Experiment (BATSE) aboard the Compton Gamma Ray Observatory captured a gamma ray burst in southern skies on May 10, giving it the name GRB990510. It was a strong burst, in the top 4% of bursts for peak flux, and it lasted about 100 seconds. It was described as having a lot of structure.\p
Beppo-SAX, an Italian-Dutch \Jsatellite\j, also "saw" the burst, both in gamma rays and x-rays. The \Jsatellite\j then used its wide-field camera to pin down the location of the burster more precisely than BATSE can manage. From the Beppo-SAX data, the peak x-ray brightness was 4.3 times that of the Crab Nebula, while the average brightness was 0.4 Crab.\p
The burster was located in the southern \Jconstellation\j Chamaeleon, and southern astronomers quickly set to work to gain as much information as possible about the GRB's visible counterpart. Astrophysicists are eager to determine if burst sources are associated with galaxies or other objects, and to measure how they fade through optical and radio wavelengths.\p
A number of sightings were made, and a \Jredshift\j of z = 1.619 was calculated, setting the GRB at about 10 billion light years away. Given that distance, the \Jenergy\j output of the blast has been estimated at 1.6 x 10\U46\u joules, or the \Jenergy\j that would be put out by our \Jsun\j if it shone continuously for 1.3 trillion years, about a hundred times this month's estimate for the age of the universe.\p
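That comparison is easy to check with rough numbers. A quick sketch, assuming the standard figure of about 3.85 x 10\U26\u watts for the \Jsun\j's luminosity:\p
  # Rough check: how long would the sun take to radiate 1.6e46 joules?
  E_BURST = 1.6e46             # estimated burst energy, joules
  L_SUN = 3.85e26              # solar luminosity, watts (assumed standard value)
  SECONDS_PER_YEAR = 3.156e7

  years = E_BURST / L_SUN / SECONDS_PER_YEAR
  print(years)                 # about 1.3e12, i.e. 1.3 trillion years
  print(years / 13.4e9)        # about 100 times a 13.4-billion-year universe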
#
"Thunderstorms make gamma rays too",1069,0,0,0
(May '99)
Late in May, and publicizing a June International Conference on Atmospheric Electricity, NASA scientists reported the discovery that powerful thunderstorms emit TGFs (Terrestrial Gamma-ray Flashes), which can be detected in space by gamma-detecting instruments like the Burst and Transient Source Experiment (BATSE) aboard the Compton Gamma-Ray Observatory.\p
TGFs are short blasts of gamma-ray \Jenergy\j which last only a few milliseconds, about as long as the sound from a snap of the fingers, and they can only be detected by satellites orbiting \JEarth\j. BATSE was never designed to look inwards; rather, it was designed to observe the universe at gamma ray wavelengths, much as the Hubble Space \JTelescope\j observes in visible light. But because it was designed to "look" in all directions, \JEarth\j falls within its field of view.\p
TGFs are now known to be associated only with massive thunderstorms, and about 70 of the events have now been recorded. The Compton Observatory is the only \Jspacecraft\j that has detected TGFs, and its \Jorbit\j follows a restricted path over \JEarth\j's surface, never straying more than 28 degrees from the equator, so only a small fraction of the TGFs of the past few years are likely to have been spotted.\p
As well, BATSE counts gamma rays in 64-millisecond intervals, and it only triggers into "gamma-ray burst mode" if the count in an interval rises above the ever-present background level by a prescribed margin. With TGFs taking place in just one or two milliseconds, the count averaged over the full interval may sometimes be too low to trigger a response.\p
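The statistics behind the trigger problem can be mocked up in a few lines. In the sketch below, every number is an illustrative assumption rather than a real BATSE setting; the point is simply that a millisecond flash, averaged over a 64-millisecond window, may not stand out from the Poisson noise of the background.\p
  import math

  # Illustrative only: why a 1-2 ms flash can fail a 64 ms trigger test.
  BACKGROUND_RATE = 2000.0   # background counts per second (assumption)
  WINDOW = 0.064             # trigger window, seconds
  FLASH_COUNTS = 50          # photons arriving in about 1 ms (assumption)

  expected = BACKGROUND_RATE * WINDOW   # mean background counts per window
  sigma = math.sqrt(expected)           # Poisson fluctuation of that mean
  print(FLASH_COUNTS / sigma)           # about 4.4 sigma: below a 5-sigma-style trigger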
The \Jlightning\j probably does not have enough \Jenergy\j by itself to generate a TGF, but "red sprites" and "blue jets," huge colorful emissions associated with upward-moving \Jlightning\j, may be able to trigger the flashes. Until there are satellites with a 1-2 millisecond gamma ray trigger coupled with cameras operating in visible light, we are unlikely to know the whole story.\p
#
"Evidence of plate tectonics on Mars?",1070,0,0,0
(May '99)
At the very end of April, NASA scientists revealed in \IScience\i that the Mars Global Surveyor has discovered surprising new evidence of past movement of the Martian crust. Scientists have detected magnetic stripes, not unlike those seen in the spreading seafloors of \JEarth\j, suggesting that ancient Mars was a more dynamic place than scientists had believed up until now.\p
The \Jspacecraft\j's magnetometer has revealed banded patterns of magnetic fields on the Martian surface. The adjacent magnetic bands point in opposite directions, just like the stripes found in the Atlantic sea floor in the 1960s, stripes which finally pushed the theory of plate tectonics into respectability.\p
On \JEarth\j, however, the stripes are generated when the planet-wide magnetic field reverses, while the Martian magnetic fields are much more localized. The "dynamo," a hot core of molten metal that generated the Martian magnetic fields, is now extinct, so the stripes detected so far are mere fossils of the original magnetic field.\p
Just as the dynamo that generated Mars-wide magnetic fields is long gone, so too, say the scientists, the convection currents that drove Mars' plates must now be extinct, making the whole structure a \Jfossil\j.\p
On the other hand, the banded magnetic structure may have nothing at all to do with plate tectonics. The bands may simply have formed when an ancient and uniformly magnetized crust broke up under volcanic activity, or even under tectonic stresses from the rise and fall of nearby areas.\p
The stripes are most prominent in the southern highlands near the Terra Cimmeria and Terra Sirenum regions, where the bands have been seen running roughly east-west, and are around 150 km (100 miles) wide and 1,000 km (600 miles) long, with the longest band stretching out over twice this distance. Unlike the bands on \JEarth\j, which have a clear mirrored symmetry about a central axis, no symmetry has been found in these bands so far.\p
The Martian bands are wider than the seafloor bands on \JEarth\j, perhaps indicating that the spreading took place at a faster rate, or perhaps indicating that magnetic reversals were less frequent, a question which may be resolved when exploration of the banded area takes place.\p
The whole discovery was a matter of luck, as the Mars Global Surveyor was pushed into a circular \Jorbit\j by repeated aerobraking. This involved dipping into the upper \Jatmosphere\j of Mars so as to gradually shape the probe's \Jorbit\j into a circle. Because of a problem with a solar panel on the \Jspacecraft\j, the lowest point of each pass curved below the \Jplanet\j's ionosphere, allowing the magnetometer to obtain better-than-planned regional measurements of Mars. At its planned height of 320 km (200 miles), the magnetic interference would have been too great, and nothing would have been detected.\p
The new magnetic map offers an interesting insight into another problem: the origin of a striking difference in the appearance of the smooth, sparsely cratered northern lowlands of Mars, compared with the heavily cratered southern highlands. The map reveals little magnetism in the north, suggesting that the northern crust formed after the dynamo died. \p
This was probably within a few hundred million years of Mars' formation, and it seems reasonable to suspect that later asteroid impacts followed by volcanic activity heated and shocked large areas of the northern crust, wiping out any magnetic fields in the area, and also smoothing out the terrain. As the new crust cooled, there would be no global magnetic field to leave its imprint on the newly frozen rocks.\p
#
"Mapping Mars in three dimensions",1071,0,0,0
(May '99)
May saw the release of pictures of Mars, including maps constructed with the Mars Orbiter \JLaser\j Altimeter (MOLA) on the Mars Global Surveyor \Jspacecraft\j. This instrument has been used to measure the elevation at points on a regularly spaced grid. The result is a map described as more detailed than those of many continental regions on \JEarth\j.\p
The first pictures appeared in \IScience\i at the end of May, and among the more exciting features revealed in this publication are an impact basin "deep enough to swallow Mount Everest" and "surprising slopes in Valles Marineris." The huge Valles Marineris \Jcanyon\j, it appears, slopes away from nearby outflow channels, with part of it lying about a kilometer (half a mile) below the level of the outflow channels.\p
The map is based on a grid of some 27 million points spaced 60 km (37 miles) apart at the equator and closer elsewhere, and surveyed during the past two years. The accuracy of the heights is to within 13 meters (42 feet) at worst, with large areas of the flat northern hemisphere known to better than two meters (six feet). As the Global Surveyor passed overhead, MOLA flashed a \Jlaser\j at the surface and timed the reflections to get the height estimates. This level of accuracy will make it much easier to understand the \Jplanet\j's geologic history and the way that \Jwater\j has flowed across its surface during the past four billion years.\p
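The principle is simple time-of-flight: halve the round-trip travel time of the pulse and multiply by the speed of light. A minimal sketch, with the echo delay chosen purely for illustration:\p
  C = 299792458.0  # speed of light, m/s

  def range_from_echo(round_trip_seconds):
      # The pulse travels to the surface and back, so halve the path.
      return C * round_trip_seconds / 2.0

  # An echo arriving after about 2.67 milliseconds means about 400 km of range.
  print(range_from_echo(0.00267))  # roughly 4.0e5 meters
Repeated pass after pass, small changes in that round-trip time are what build the elevation grid.\p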
From the lowest point of the deepest ocean trench to the top of the highest mountain on \JEarth\j, there is a range of about 20 km (13 miles), while on the smaller \Jplanet\j Mars the range is about 30 km, or 19 miles. While the northern hemisphere is low and flat, the southern hemisphere is heavily cratered and about 5 km (3 miles) higher than the north. The MOLA data show that the northern hemisphere depression is distinctly not circular, so it seems likely that the flattening was produced by some sort of internal geologic processes during the earliest stages of the \Jplanet\j's development.\p
The Hellas impact basin in the southern hemisphere is some 9 km (6 miles) deep, and 2,100 km (1,300 miles) across. The basin is surrounded by a ring of material that rises around 2 km (1.25 miles) above the surroundings and which stretches out to 4,000 km (2,500 miles) from the basin center.\p
The basin was probably created by the impact of an asteroid, and the ring of material that surrounds it makes up a major part of the southern high topography. The high southern hemisphere would have been the major influence on the global-scale flow of \Jwater\j early in Martian history, as the northern lowlands would have drained three-quarters of the Martian surface. \p
While the north and south \Jpoles\j of Mars appear quite different, they show similar elevation profiles, leading NASA scientists to believe that the south pole contains quite a lot of \Jwater\j ice as well as carbon dioxide ice. They now put the upper limit of their estimate of total Martian surface \Jwater\j at about 3.2 to 4.8 million cubic kilometers (800,000 to 1.2 million cubic miles), or about 50% more ice than is found in the ice cap of \JGreenland\j. This would be enough to cover the \Jplanet\j to a depth of between 22 and 33 meters (72-108 feet), and would be about a third of the volume of the ancient ocean that scientists believe was once found on Mars.\p
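The depth figures follow from spreading the volume evenly over the \Jplanet\j's surface. A quick check, assuming a mean Martian radius of about 3,390 km:\p
  import math

  R_MARS_KM = 3390.0                        # assumed mean radius of Mars
  area_km2 = 4 * math.pi * R_MARS_KM ** 2   # about 1.44e8 square kilometers

  for volume_km3 in (3.2e6, 4.8e6):         # the quoted ice volumes
      depth_m = volume_km3 / area_km2 * 1000.0
      print(round(depth_m))                 # prints 22 and 33, matching the text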
The map will be refined over time; MOLA is adding a further 900,000 elevation measurements each \Jday\j, and by the time the Mars Polar Lander mission sets down in early December, engineers will have a great deal of information to help them select a suitable landing site.\p
#
"When was the Big Bang again?",1072,0,0,0
(May '99)
The latest estimate for the age of the universe, published in \IScience\i during May, sets the Big Bang at 13.4 billion years ago. According to Sydney astronomer Charles Lineweaver, the secret to calculating the date lies in last year's revolutionary discovery that the cosmos appears destined to expand forever. The reasons for this expansion are as yet unexplained, but they are believed to involve some sort of "antigravity" effect called the "cosmological constant" (see \BTop ten breakthroughs of 1998,\b December 1998).\p
The previous estimate for the Big Bang set it at about 15 billion years ago, so the new estimate makes the universe younger than before, but still a little older than the oldest known stars. Several years ago, when some estimates set the Big Bang at 8 billion years ago, the universe was dated younger than its contents, but this embarrassing discrepancy is not a problem with the new estimate.\p
Lineweaver bases his calculation on the Hubble constant and the cosmological constant. The Hubble constant, of course, describes how fast the universe is expanding right now, while the cosmological constant allows us to estimate the forces pushing the universe apart.\p
The embarrassingly young age for the universe was based on a high estimate for the Hubble constant, published in 1994; Lineweaver's lower value for the Hubble constant was confirmed by NASA during May.\p
Since the realization in 1998 that the universe will not stop expanding, and will not come together again in a "Big Crunch," scientists have been looking for an explanation of this. The most likely explanation is that there is some sort of "vacuum \Jenergy\j" which somehow works against the force of gravity.\p
Lineweaver suggested last October that 70% of the matter in the universe is in this form, and that has allowed him to estimate a value for the cosmological constant. This leads him to his estimate, but it is still subject to error; his published figure is actually 13.4 billion years, plus or minus 1.6 billion years - and however you look at it, that error bar is time for a great deal to happen.\p
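The flavor of such a calculation can be captured in a few lines. The sketch below numerically integrates the standard age formula for a flat universe, taking a Hubble constant of 70 km/sec/Mpc and putting 70% of the density into the cosmological constant term; these parameter values are assumptions chosen for illustration, not Lineweaver's actual inputs.\p
  import math

  H0 = 70.0                       # Hubble constant, km/s per Mpc (assumed)
  KM_PER_MPC = 3.086e19
  SEC_PER_GYR = 3.156e16
  h0_per_sec = H0 / KM_PER_MPC    # convert to 1/seconds

  omega_m, omega_l = 0.3, 0.7     # matter and "vacuum energy" fractions (assumed)

  def integrand(a):
      # Expansion rate of a flat universe with matter plus a constant term.
      return 1.0 / math.sqrt(omega_m / a + omega_l * a * a)

  # Midpoint rule over the scale factor, from the Big Bang (a=0) to today (a=1).
  n = 100000
  total = sum(integrand((i + 0.5) / n) for i in range(n)) / n

  print(total / h0_per_sec / SEC_PER_GYR)   # about 13.5 billion years
With these inputs the answer lands close to the published 13.4 billion years, and nudging the Hubble constant by its quoted 10% moves the age by roughly a billion years, which is where the error bar comes from.\p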
Appropriately, the new NASA estimate for Hubble's constant came from the Hubble Space \JTelescope\j, and represents the end of an eight-year effort to measure precise distances to galaxies, far, far away, as they say in the movies.\p
Once this key measure is in place it becomes much easier to determine the age, size and fate of the universe. In reality, before Hubble, the best age estimate was no better than somewhere between 10 and 20 billion years ago. And with that degree of uncertainty, there was little to be said, according to NASA, about "... most of the basic questions about the origin and eventual fate of the cosmos."\p
Now the universe's origin, \Jevolution\j, and destiny can be argued about with a proper degree of scientific reliability. Hubble's constant was first estimated by Edwin Hubble some 70 years ago, when he realized that the galaxies were rushing away from each other at a rate proportional to their distance. The further away two parts of the universe are, the faster they separate, so Hubble's constant is stated in terms of the relative velocities of two points a megaparsec (3.26 million light years) apart.\p
Up until the launch of the Hubble \Jtelescope\j the best estimates for the constant were in the range from 50 to 100 kilometers per second per megaparsec. Now the estimate is set at 70 km/sec/Mpc, plus or minus 10%, and the race is on to tie the value down even more precisely.\p
Yet even moving from a factor of two down to 10% is a big step forward. It was achieved by using the Hubble Space \JTelescope\j to observe 18 galaxies at distances up to 20 Mpc, in which almost 800 Cepheid variable stars were found.\p
Cepheid variables are very special objects, because they are "standard candles," stars whose rate of variation depends on their absolute brightness. In other words, if an observer measures the variation, then the real brightness of the star is known. When this is compared with the star's apparent brightness, its distance can be estimated very reliably.\p
The next step is to use the Doppler shift of the star's radiation to estimate its velocity relative to us, and then, with the distance and the velocity in our notebooks, it is a simple matter to calculate Hubble's constant.\p
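In code, that final step is almost trivial; the galaxy figures below are invented for illustration, not taken from the survey:\p
  def hubble_constant(velocity_km_s, distance_mpc):
      # One galaxy's recession velocity divided by its Cepheid-based distance.
      return velocity_km_s / distance_mpc

  # e.g. a galaxy about 17 Mpc away, receding at about 1,200 km/s
  print(hubble_constant(1200.0, 17.0))   # about 70.6 km/sec/Mpc
The eight-year program amounted to repeating this step across many galaxies, cross-calibrating several distance methods, and averaging the results.\p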
Even though Cepheid variables are rare, the 18 galaxies gave enough of them for the NASA team to undertake many measurements, and to calibrate many different methods for measuring distances. \p
Coincidentally, another Australian scientist, Jeremy Mould of the Australian National University, is a co-leader of the team, and he also announced an age for the universe - 12 billion years - which is in the range set down by Lineweaver, who is at the University of New South Wales.\p
#
"What was that satellite?",1073,0,0,0
(May '99)
Back in the 1950s when satellites were a rarity, people would stand outdoors at dusk, looking up in the western sky for satellites glinting in the light of a \Jsun\j which had disappeared below the horizon. Now there are somewhat more than 8,000 man-made objects orbiting our \Jplanet\j, and our chances of identifying any one of them are remote indeed.\p
Some satellites, the ones in geostationary orbits, circle \JEarth\j above the equator once every 24 hours, so they appear to remain in one place as the stars turn around the celestial pole, and these are hard to see. Others are too small to reflect a significant amount of light. But many are in polar or other orbits which carry them across the star field, and are large enough to be seen with the naked eye: these are the satellites that NASA's new J-Pass program targets.\p
A typical \Jsatellite\j in low \JEarth\j \Jorbit\j (LEO) circles \JEarth\j once every 90 minutes, traveling at about 7.5 kilometers per second (27,000 km/hour or 17,000 mph). They are best sought out at dawn or dusk, and some of the best trophies, such as the new International Space Station, \JRussia\j's Mir Space Station, and the Space Shuttle, when in flight, are also the brightest objects. Some of them have a visual \Jmagnitude\j of -1.0, on a par with Sirius (Alpha Canis Majoris), the brightest star in the sky. Yet for all their brightness, successful \Jsatellite\j spotting depends on knowing exactly when and where to look.\p
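The speed and period quoted above hang together, as the standard two-body formulas show. A sketch, using \JEarth\j's usual constants and an assumed altitude of 350 km:\p
  import math

  GM_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
  R_EARTH = 6.371e6     # mean radius of Earth, m

  def circular_orbit(altitude_m):
      r = R_EARTH + altitude_m
      speed = math.sqrt(GM_EARTH / r)    # circular orbital speed
      period = 2 * math.pi * r / speed   # time for one revolution
      return speed, period

  v, t = circular_orbit(350e3)           # an assumed typical LEO altitude
  print(v / 1000.0, t / 60.0)            # roughly 7.7 km/s and 91 minutes
The article's 7.5 km/s and 90 minutes are round figures within the same range.\p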
Using NASA's Liftoff to Space Exploration web site, you can find out which satellites will be passing over your home. Liftoff's Java program, called J-Pass, uses information provided by the North American Aerospace Defense Command (NORAD) for more than a hundred bright satellites.\p
NORAD actually keeps track of all of the more than 8,000 objects traveling above \JEarth\j. More than 2,500 of these are actual satellites, either working or not, while other bits and pieces make up the rest. Nosecone shrouds, lenses, hatch covers, rocket bodies, boosters, payloads that have disintegrated or exploded, and even objects that have escaped during manned \Jspacecraft\j missions - all are out there circling \JEarth\j.\p
All you need is a recent version of either Netscape or \JInternet\j Explorer to use J-Pass. This gives you \Jsatellite\j rise and set times for your location, and indicates the part of the \Jsatellite\j pass that will be visible. The chart also includes positions of visible planets and bright stars, and you can print sky charts for use outside as reference guides. \p
Even those lacking an \JInternet\j browser are catered for with a mailing list which alerts subscribers to upcoming \Jsatellite\j passes by e-mail.\p
The Web address for J-Pass is http://liftoff.msfc.nasa.gov/RealTime/Jpass/20/ and the address for the e-mail service is http://liftoff.msfc.nasa.gov/RealTime/Jpass/PassGenerator/\p
#
"A new volcano",1074,0,0,0
(May '99)
News surfaced during May of a previously undetected submarine \Jvolcano\j near \JSamoa\j. The \Jvolcano\j, which has yet to surface itself, still rises an impressive 4,300 meters (some 14,100 feet) from the ocean floor in the area of the \JSamoa\j Islands, an area which would previously have been called well-surveyed.\p
The \Jvolcano\j is more than 35 km (22 miles) across, and reaches to within 600 meters (2,000 feet) of the surface. A Woods Hole Oceanographic Institution group, led by geochemist Stan Hart, made the discovery after a 1995 \Jearthquake\j swarm in the region suggested that the \Jvolcano\j might be present. It has been dubbed Fa'afafine, a Samoan word that very loosely translates to "wolf in sheep's clothing."\p
Maps of the seafloor, made by \Jsatellite\j altimetry a few years ago, give little indication of the actual size of the \Jvolcano\j. They simply show a small hill-like geologic feature, and yet these are generally regarded as the most reliable maps available.\p
It took a detailed survey using the research vessel Melville to reveal the true size of the \Jvolcano\j. Hart and his colleagues wanted to test the idea that the \JSamoa\j Islands are a volcanic hotspot chain, and hoped to gather evidence to show that the islands' formation is not just a result of being near the \JTonga\j Trench, as some \Jearth\j scientists believe. \p
The classic example of a hotspot island chain is the Hawaiian Islands, where Loihi, which will someday be the newest island, is a seamount rising toward the ocean surface on the southeast flank of the island of Hawaii. Hart believes that Fa'afafine represents the location of the hotspot which has produced the \JSamoa\j Island chain.\p
Hart and his colleagues have indicated their interest in a return visit, maybe with a remotely controlled submersible, to survey the caldera which is a prominent feature of the \Jvolcano\j, and also to search for hydrothermal hot springs and associated life forms.\p
#
"Living with arsenic",1075,0,0,0
(May '99)
Fish, shellfish, and \Jcoral\j in the waters of Tutum Bay, off Ambitle Island in Papua New Guinea, live with the highest known marine concentration of naturally occurring \Jarsenic\j in the world. Yet in spite of this, the animals all appear to be flourishing, according to a research report that appeared in May in the journal \IEnvironmental Science & Technology.\i\p
According to the report, the animals are likely to be protected by iron minerals in the sediment of this hostile environment. The answer is worth seeking at a practical level, since it could yield a method for dealing with \Jarsenic\j-tainted waste: \Jarsenic\j is both extremely toxic and one of the most difficult waste contaminants to remove.\p
The \Jarsenic\j comes from hydrothermal vents in the seafloor of Tutum Bay, where the researchers found boiling fluids blasting into the sea with all the force of a fire hose, and carrying "extremely high \Jarsenic\j concentrations," more than 400 times the trace levels usually found in seawater. This puts the \Jarsenic\j at a concentration high enough to cause neurological damage, and the authors call it "the highest \Jarsenic\j concentration reported from any marine setting, including black smoker vents from midocean ridges." \p
In spite of this, the skeletons of corals and the shells of clams do not show any more \Jarsenic\j than other samples taken from outside Tutum Bay. According to the report, two effects seem to act to protect the life in the area. In the first place, the vented \Jwater\j is diluted by seawater, reducing the concentration, but more importantly, iron minerals precipitate out when the vent \Jwater\j mixes with normal seawater, and \Jarsenic\j is captured and held in these iron minerals as they form. The iron minerals form a bright orange layer found all over the seafloor, but the minerals were most common near the vent openings.\p
\BKey name:\b Thomas Pichler.\p
#
"Engineered corn and monarch butterflies",1076,0,0,0
(May '99)
Corn crops, under severe attack by the European corn borer, are now protected by the genetically engineered \IBt\i-corn, which carries toxin-producing genes from the bacterium \IBacillus thuringiensis\i (Bt) spliced into the plant, so that the corn itself makes the insecticidal toxin and is protected against corn pests. The engineered corn is safe for human consumption, and in 1998, more than 3 million hectares (7 million acres) of the hybrid crop were planted, mainly by US farmers, to control the European corn borer.\p
At the same time, there has been an increasing clamor, mainly based on uninformed and emotive cries of "\JFrankenstein\j food" - cries which indicate a poor understanding of both literature and \Jgenetics\j. But if the critics tend to parade their ignorance, this does not mean that there are no risks associated with the genetically-engineered foods, and so a great deal of research is under way, on both sides of the political divide.\p
It is important to stress that the research is motivated by interests of one sort or another, because proofs that the genetically-modified crops are safe, like proofs that they are dangerous, need to be scrutinized very carefully before they are accepted. The tests which need to be applied relate mainly to the qualifications of the researchers, and few of us are able to assess this.\p
Luckily, there is no need for us to do so, because the standard procedures of science do it for us: if the research is reported in a refereed journal such as \INature\i in Britain or \IScience\i in the USA, this should indicate that the best scientific minds in the field have studied the work and declared it to be reasonable and legitimate. While the best minds in the world cannot always rule out \Jfraud\j, they can normally be expected to do so, and any fraudulent work is likely to be exposed soon enough, as researchers from the other side of the "divide" attempt to replicate the study and find flaws in the methodology.\p
So when a report in \INature\i says that the genetically-engineered corn kills monarch butterfly larvae in laboratory tests, it is time to feel at least a little bit worried. The study only relates to the laboratory, but a reasonable judge would expect the same risk to occur in the field, so perhaps we need to be just that little bit more worried.\p
Original studies showed that, unlike many pesticides, the \IBt\i toxin has no effect on many "non-target" organisms. Pollinators like honeybees, and beneficial predators of pests like ladybirds (ladybugs), were unaffected by it, so people felt fairly certain that the corn was safe. The problem comes from the fact that corn is pollinated by the wind, not by insects or birds: corn produces vast amounts of \Jpollen\j so that wind-blown grains from the male flowers can reach the vital parts of the female flowers.\p
And here is the problem: the \IBt\i-modified corn produces \Jpollen\j containing the crystalline endotoxin coded by the bacterial genes. This \Jpollen\j is spread by the wind, and when it lands on other plants, including milkweed, it coats the leaves with a dusting of toxic \Jpollen\j. And milkweed, the exclusive food of monarch caterpillars, is commonly found around cornfields. Worse, as one commentator put it, the monarch butterfly is "the Bambi of the insect world."\p
In reality, the \Jpollen\j only travels about 60 meters, and monarch butterflies are found just about all over the world, so a local drop in populations around cornfields would not represent a lasting problem. But other species are probably also affected by the \Jpollen\j, and there is no telling what will happen if the milkweed is protected in the vicinity of cornfields.\p
As well, the monarch caterpillars are more likely to be found close to corn plantations. The larvae feed exclusively on milkweed because it provides protection against predators - the milkweed plant contains cardenolides, which are toxic, bitter chemicals that the monarch caterpillar incorporates into its body tissues, making it taste horrible to predators. Worst of all, milkweed grows best in "disturbed" habitats, like the edges of cornfields. So, like it or not, there appears to be a problem here.\p
This is especially so in North America, where the monarch butterfly is a migratory species, with the second generation of the year laying its eggs in areas around the Midwest Corn Belt, so that the caterpillars will be feeding on milkweed during the period when corn is shedding \Jpollen\j. In other words, they may be in the right place at the right time to be exposed to \IBt\i-corn \Jpollen\j, and if one generation in the cycle is attacked, that may be enough to cause serious damage to the whole species.\p
The real risk lies in assuming that there will always be monarch caterpillars surviving in other areas. If (and this is untested!) the cornfield milkweeds are a more attractive environment, then as the population dives, the remaining adults will be drawn in to the milkweeds lying near the cornfields. This is similar to the problem with fishing grounds, where trawlers keep working the best grounds, and keep finding stocks of fish because of recruitment from less rewarding areas, until suddenly those remnants plummet as well.\p
In the study, monarch caterpillars were fed milkweed leaves dusted with so-called transformed \Jpollen\j from a \IBt\i-corn hybrid. They ate less, grew more slowly and suffered a higher mortality rate, with nearly half of them dying. In contrast, all of the monarch caterpillars that were fed leaves dusted with non-transformed corn \Jpollen\j or leaves without corn \Jpollen\j survived the study.\p
This of course is only the beginning of a long argument. At least 18 different \IBt\i-engineered crops have been approved for field testing in the United States. As of last year, transformed corn, potatoes, and cotton had been approved by the US Department of Agriculture for commercial use.\p
There is no easy answer here: the use of \IBt\i-engineered crops is reducing \Jpesticide\j use, and doing a great deal for the environment, and some of the arguments and evidence being put forward may be tainted by special interests from one side or the other. All we can really say is that there is a clear scientific case to believe that there is a good chance of a risk to one or more species, and that some of these species may turn out to be important. And because \Jextinction\j is a one-way process, it is better to be safe than sorry.\p
#
"New uses for the humble flatworm",1077,0,0,0
(May '99)
A quiet announcement on the \JInternet\j during May revealed startling insights into \Jbiodiversity\j and where the world's \Jbiodiversity\j hotspots are located, and the insights came from the simple flatworm.\p
As the \Jextinction\j of species accelerates, we are entering a time best labeled a \Jbiodiversity\j crisis, and this crisis needs to be regarded as a serious global problem. One key to retaining as many species as possible is to identify those areas where large numbers of species are found, the world's \Jbiodiversity\j hotspots.\p
Most studies have taken very little notice of the "lower animals," especially those which are lower both by virtue of their apparent evolutionary status, and also because they live in the soil. The "evolutionary status" view of "lower animals" is at best scientifically misguided, while the soil fauna are generally essential to the maintenance of any \Jecosystem\j.\p
Whatever the reason, most \Jbiodiversity\j research is done on the attractive and "interesting" animals such as butterflies, birds, and mammals, but these are only a very small proportion of the total number of species on \JEarth\j. So long as we judge \Jbiodiversity\j on these pretty, cuddly, furry species, the data we get and the decisions we make are likely to be seriously biased.\p
Biologists from the Zoological Museum at \JAmsterdam\j University have taken a different tack, looking at terrestrial flatworms, and they say they have found three new "hotspots" of biological diversity: New Zealand, southeast \JAustralia\j, and \JTasmania\j. \p
There are some 822 different species of terrestrial flatworm on record, and the biologists consider that these can act as a general model for the lower invertebrates because the diversity of this group closely reflects the diversity of soil fauna organisms generally. \p
Other areas already recognized as \Jbiodiversity\j hotspots, such as the coast of \JBrazil\j, Java, and Sri Lanka, are also flatworm hotspots, as are the hotspots for higher plants: New Caledonia, Madagascar, and \JSumatra\j.\p
#
"Disabling bacteria",1078,0,0,0
(May '99)
In early May, researchers at the University of \JCalifornia\j, Santa Barbara, announced in \IScience\i an interesting method for controlling bacterial infections. They claim in the report that pathogens can be disabled by turning "on" a "master switch": when the switch is "on," the \Jbacteria\j are completely unable to cause disease. In a press release just before publication, they suggested that the discovery would be applied toward the development of a new generation of vaccines and \Jantibiotics\j.\p
Curiously, the master switch is the same within many pathogenic \Jbacteria\j, including \IVibrio cholerae\i (the cause of \1cholera\c), \IYersinia\i (\IY. pestis\i causes \1plague\c), \ISalmonella\i (\IS. typhi\i causes \1typhoid fever\c), and \IShigella\i (the toxin from \IS. dysenteriae\i causes \1dysentery\c). Microbial infections cause 17 million deaths each year, and the researchers hope that the switch mechanism may be used against these bacterial infections as well as others that are a major cause of death of AIDS and cancer patients.\p
More importantly, the treatment which now appears possible will sidestep the defenses of the newly emerging, drug-resistant pathogens. In recent times, pathogens have emerged that can no longer be controlled by available \Jantibiotics\j. Medical workers are now faced with drug-resistant tuberculosis, \IStaphylococcus\i and \IStreptococcus\i organisms.\p
The research involved a novel approach, aimed at identifying genes in \Jbacteria\j that come alive when they infect mice, but are "cloaked" in culture in the Petri dish. Like the Trojan horse, says lead author Mahan, the \Jbacteria\j hide the dangerous weapons inside a benign exterior. "You can't fight what you can't see," he says in the press release. Then switching metaphors to poker, he likens the \Jbacteria\j to good poker players: "But no matter how good a poker player you are, if you lay your cards down, you're dead."\p
In short, when the master switch is "on," all of the "cards" - the bacterial tricks for getting past the gut and into our tissues - are laid bare. This disables the \Jbacteria\j, while at the same time causing the immune system to recognize the attackers and mount an immune response. In effect, the crippled \Jbacteria\j act as a vaccine.\p
While Mahan and his colleagues have undertaken successful trials on animals, applications in humans are still a few years away. There may be an indirect impact on humans even sooner, through the food supply. It might be possible, for example, to vaccinate chickens and cows to develop \ISalmonella\i-free poultry and \IE. coli\i-free beef. With more large-scale processing of food, the level of contamination in the food supply is expected to worsen over the next few years.\p
\BKey names:\b Michael J. Mahan, Douglas M. Heithoff, Robert L. Sinsheimer, and David A. Low.\p
#
"An experimental staph vaccine",1079,0,0,0
(May '99)
If the master switch does not stop the staphylococci in their tracks, the National Institute of \JAllergy\j and Infectious Diseases (NIAID) reported a possible solution in \IScience\i at the end of May. According to the report, the researchers have developed a vaccine that protects mice against multiple strains of \IStaphylococcus aureus.\i Over the past two years \IS. aureus\i has become increasingly resistant to \Jantibiotics\j, and this germ is now the most common cause of \Jhospital\j-acquired infections.\p
While world figures are hard to come by, the USA is one of the main centers for resistant strains of \IS. aureus,\i and it is estimated that perhaps half a million patients in American hospitals contract staph infections each year, some involving strains resistant even to vancomycin, the last line of defense against \IS. aureus.\i\p
The illnesses caused by infections range from minor skin infections and abscesses to life-threatening diseases such as severe \Jpneumonia\j, meningitis, bone and joint infections, and infections of the heart and bloodstream.\p
The new vaccine is the first to be made from a bacterial molecule produced primarily during infection, rather than in laboratory culture. In the past, molecules isolated from bacterial cultures have been used as a base for vaccines, but there is a problem: bacterial growth under laboratory conditions may not mimic an actual infection, as we have seen in \BDisabling \Jbacteria\j\b this month. This means the molecules found in culture may be irrelevant in a real infection. As a result, scientists have recently begun searching for bacterial products that are activated specifically during infection.\p
Tissue from humans and mice infected with \IS. aureus\i contains one of the more interesting \1polysaccharides\c, a staphylococcal molecule known as PNSG (poly-N-succinyl Beta-1-6 glucosamine). Curiously, few \IS. aureus\i strains produce PNSG when cultivated in the laboratory. When PNSG was collected, purified, and injected into rabbits, the animals produced large amounts of PNSG \Jantibodies\j which persisted for at least eight months. These \Jantibodies\j, extracted and injected into mice, made the mice immune to eight different strains of \IS. aureus,\i including strains resistant to the antibiotic methicillin and partially resistant to vancomycin.\p
It is reasonable to assume that the vaccine could provide immunity to the multi-drug resistant \IS. aureus\i "superbug" that is now threatening world health, but there is more. Other bacterial species, classified as coagulase-negative \Istaphylococci\i or CoNS, also produce PNSG and may be open to control in the same way. Trials of the vaccine on humans are still about two years away, but the procedure looks to be one with a great deal of potential.\p
\BKey name:\b Gerald B. Pier.\p
#
"Slimy bacteria and chronic infections",1080,0,0,0
(May '99)
A review paper published in \IScience\i in mid-May shines a light on bacterial slime as the culprit behind many nagging infections that plague children and adults. Sticky communities of \Jbacteria\j, usually referred to as biofilms, can cause a wide range of infections, such as ear infections and periodontitis, and the films can also be involved in infections linked to sutures, contact lenses, urinary catheters, intrauterine devices (IUDs), mechanical heart valves, and penile prostheses. In short, biofilms are a problem.\p
Future solutions, according to the article, are likely to come from a better understanding of the genetic and molecular makeup of biofilms.\p
So far, researchers know that \Jbacteria\j attach to surfaces, and if enough of them gather in one place, they can form biofilms. Some aspects of the ways in which the \Jbacteria\j coordinate their activities and build complex structures are known (see \BHow \Jbacteria\j work together,\b April 1998), but there is still a great deal to be explained.\p
One of the puzzles to be solved is why biofilms resist \Jantibiotics\j. It is possible that some of the cells grow so slowly that they do not take up the \Jantibiotics\j, and that this slow growth may be because the cells are starved. It could also be that some of the cells set up barriers to keep the \Jantibiotics\j out.\p
The solution, say the authors, will probably be to develop therapeutic agents (drugs) which attack the biofilm as a whole, and use signals which either stop the film from forming, or make it come apart.\p
\BKey names:\b William Costerton, Philip Stewart and Pete Greenberg. The first two authors are with the Montana State University Center for Biofilm \JEngineering\j (CBE), and Greenberg is at the University of \JIowa\j. Costerton and Greenberg both have children who suffer from cystic fibrosis, a condition in which biofilm infections of the lungs are a serious problem.\p
#
"Intestine transplant survivors",1081,0,0,0
(May '99)
The American Society of Transplantation held its 18th Annual Meeting and Scientific Sessions in mid-May. One of the highlights of the meeting was a report from University of \JPittsburgh\j Medical Center (UPMC) surgeons on the survival rates they have been achieving with intestinal transplants. With a sample of 127 to report on, the largest number of transplants in any center, they say they now have a one-year survival rate of 72%.\p
The 121 adult and child patients, who had a total of 127 transplants between 1990 and 1999, all had irreversible intestinal failure. Of these, 81% needed intestinal transplants because of short-gut syndrome, which involves the loss of more than 70% of the \Jintestine\j due to trauma, surgery, or disease. Adults may develop short-gut syndrome because of trauma, clotting of the \Jintestine\j's vessels or Crohn's disease, while children may develop it as the result of a twisting of the intestines called a volvulus, or from a variety of congenital conditions.\p
The most successful survivors were children (those aged 2 to 18), who had a five-year survival rate of 68%. Of the 58 patients still alive, 55 are now completely off intravenous \Jnutrition\j, are eating normal diets and have an improved quality of life. \p
The small \Jintestine\j can be transplanted in one of three ways: alone, which happened for 48 patients; in combination with the liver (58 patients); or in combination with the liver, \Jpancreas\j and \Jstomach\j (the remaining 15 patients).\p
A number of factors have played their part in improving the survival rates, including the suppression of rejection with a three-drug cocktail of tacrolimus, steroids, and daclizumab, better matching of donors and recipients, and donor bone marrow augmentation: some of the patients received donor bone marrow with their transplants. \p
Overall, 35 patients are alive and were able to maintain good \Jnutrition\j for more than three years, and another 22 have reached the five-year mark, while one has now survived for nearly nine years.\p
The most recent "tweak," the addition of the drug daclizumab to the anti-rejection drug regimen, has lifted the one-year survival rate to 92%, with 14 out of 14 patients surviving.\p
\BKey name:\b Kareem Abu-Elmagd.\p
#
"Oral control of diabetes a step closer",1082,0,0,0
(May '99)
A report in \IScience\i in early May describes an \Jinsulin\j-like compound, a substance which mimics the role of \Jinsulin\j, at least in mice, and which can be swallowed instead of being injected. This offers the prospect of relief for the world's diabetics from the need either to monitor their lifestyle rigorously, or to inject daily \Jinsulin\j doses.\p
The report arises from an international study aimed at screening some 50,000 compounds in the hope of finding an alternative to \Jinsulin\j. Diabetes arises when the cells of the body cannot absorb enough blood sugar to provide the \Jenergy\j that is needed. In Type I diabetes, the problem is that the body does not produce \Jinsulin\j, while in Type II diabetes, the body and its cells have become resistant to the effects of \Jinsulin\j. An oral drug, not itself \Jinsulin\j, has the potential to help in both these cases.\p
Any suitable target compound would need to bind to the cell's \Jinsulin\j receptor, and it would also need to activate an \Jenzyme\j which in turn sets off the sequence of events that provides blood sugar \Jenergy\j in the right places in the cells. The search involved incubating the substances with cultured cells engineered to have large numbers of \Jinsulin\j \Jreceptors\j. Then the researchers looked for evidence that the initial \Jenzyme\j had been activated.\p
The problem with \Jinsulin\j is that it is a \Jpeptide\j, a chain of amino acids like a protein, and equally prone to attack and breakdown by the digestive juices. This makes it useless as a drug for oral delivery, so the researchers were looking at molecules which are not peptides. The most promising product, derived from a \Jfungus\j which once grew on a leaf in the present-\Jday\j Democratic Republic of \JCongo\j, goes by the unexciting name of L-783,281. This tiny compound shows a remarkable ability to fill the role of the much larger \Jinsulin\j molecule.\p
After the successes with cultured cells, the researchers administered the new compound by mouth to two different strains of mice that are standard animal models for human diabetes. They found that the compound was able to control the blood sugar levels of the mice, but any pill for human diabetics will require much more research to maximize its effectiveness while keeping it safe, and also to identify any other compounds which may be out there somewhere - in the slime of a flatworm, the nectar of a flower, or the cells of a mushroom, somewhere on the face of the \Jearth\j.\p
\BKey name:\b Bei Zhang.\p
#
"Near-sighted children and the light",1083,0,0,0
(May '99)
When children under two years old sleep in a room with a light on at night, they are more likely to develop myopia when they are older. A report in \INature\i in mid-May indicates that 10% of children who slept in darkness were myopic when tested between the ages of 2 and 16. This compares with 34% of a similar sample, distinguished by having slept with a night light on before the age of two. In a third group, those who had slept with a room light on, 55% were near-sighted.\p
It seems that a nightly period of full darkness may be needed to prevent future myopia, and while there is no proof that lighting directly causes myopia, the authors believe that " . . . it would seem advisable for infants and young children to sleep at night without artificial lighting in the bedroom until further research can evaluate all the implications of our results." \p
The finding helps to explain an observed increase in myopia in urbanized populations over the past two centuries, as populations around the world shifted from agricultural to urban, industrialized environments. The common explanation in the past has been that near-work, like reading and other close-at-hand occupations, has caused the increase. This could still be a partial cause, but the greater ambient night-time light levels that have come with industrialization may still help to explain the high incidence of myopia in developed nations. \p
The study looked at lighting levels at night before the age of two in a sample of almost 500 children, each child's present night-time lighting conditions, lighting in other parts of the home, in \Jday\j care or school settings, and in the geographical region in which the child lived, and also the use of sunglasses. The study was triggered by research which showed that eye growth and refractive development in chicks were greatly affected by the relative proportions of light and dark during the 24-hour \Jday\j. \p
The only significant factor was lighting before the age of two. This cut-off was chosen because the eye grows particularly rapidly before this time, but that is no guarantee that the danger period ends at that time. \p
\BKey name:\b Richard A. Stone.\p
#
"Safer gene delivery",1084,0,0,0
(May '99)
Gene therapy has until now relied on a virus's ability to inject its DNA into cells, but with viral agents come risks, such as an immune response. This has made a non-viral method of gene transfer highly desirable. Liposomes, which fuse with the cell membrane to gain entry, have been used in clinical trials, but now there is a new solution on offer.\p
Late in April the \IProceedings of the National Academy of Sciences\i carried a story about a new method of delivering DNA to the nucleus of a cell, using a polymer called polyethylenimine or PEI. This polymer has been used for years in common processes such as shampoo manufacturing, paper production and \Jwater\j filtration, but now it looks set for a new future, bringing DNA to a patient's cells without using viruses.\p
Researchers have tracked the path that PEI takes through the cell, where it behaves rather like an artificial virus, as it carries DNA into the cell's nucleus. They have shown that the complex ends up intact in the cell's nucleus, where the new DNA can be read and put to work. The PEI/DNA complexes were labeled with two fluorescent markers, one for the PEI and one for the DNA, which allowed the two halves to be followed.\p
It came as a surprise that the PEI actually delivers the DNA to the nucleus without separating from the DNA, arriving in the nucleus as an intact structure. Nothing is yet known about the effects that PEI may have on the cell, but it belongs to a class of polycationic (positively charged) delivery systems, and if PEI is not a suitable candidate, perhaps one of the other systems may prove more suitable.\p
They say that outside the cell the PEI/DNA complexes attach to cell membranes and then join into clumps which are taken into the cell. Once they are inside, the complexes are not broken down and used by the cell's machinery. Instead, the complexes move into the nucleus where the desired gene is turned on.\p
The researchers think the complex may get into the nucleus because it becomes coated with phospholipids, which help regenerate the cell membrane. If this happens, the coated complexes might fuse with the membrane and then release the PEI/DNA into the nucleus, but this remains a matter of speculation. Lying somewhere between informed speculation and confident prediction is the suggestion by one of the researchers, posted on the Web in early May, that synthetic polymers might allow researchers to modify the polymer and incorporate specific ligands so it can deliver the DNA to the desired cell type for targeted gene therapy.\p
Likely targets for the near future are as diverse as cystic fibrosis and cancer, but the treatment could also be used in cardiovascular applications such as \Jballoon\j angioplasty, where gene therapy might be used to slow down the rapid proliferation of cells which causes scarring and a re-narrowing of the artery. All we have to do, it seems, is to work out what happens to the PEI or other carrier polymer, once the DNA has been stripped off, inside the nucleus.\p
\BKey names:\b Antonios (Tony) Mikos, W. T. (Terry) Godbey, and Kenneth Wu.\p
#
"Telomeres are big loops",1085,0,0,0
(May '99)
One of the puzzles of \Jgenetics\j is that cells are very good at repairing the breaks in DNA, yet they leave the ends of chromosomes alone. The cover story of \ICell\i in mid-May was an explanation of why cells' internal repair machinery does not either "fix" or destroy the ends of chromosomes. The answer is that the end of a \Jchromosome\j, the telomere, is in the form of a big loop.\p
The proof lies in \Jelectron\j micrographs which clearly show the loops, but the initial clue came when the removal of a particular protein from a cell triggered "cell suicide." This is common when a cell is unable to deal with breaks in the chromosomes, and so the "suicide" suggested that the protein was somehow involved in masking the \Jchromosome\j ends - once it was gone, the cell gave signs of massive damage, and the suicide response was started.\p
This led the researchers to try to examine how the protein might arrange DNA molecules containing genetic sequences typical of telomeres. Then they took the \Jelectron\j \Jmicroscope\j images which showed DNA molecules arranged into loops. The telomere DNA appears to be looped back around, and attached to a distant internal site on the DNA and held there by the added protein. The loop serves to disguise the DNA end, so that the repair or suicide sensors cannot "see" it.\p
\BKey names:\b Jack Griffith and Titia de Lange.\p
#
"Mutton dressed up as lamb?",1086,0,0,0
(May '99)
Dolly the cloned sheep, the subject of several previous news stories, has proved to have "old cells," or "DNA damage that might cause premature aging," according to several reports. While the original Dolly stories all carried the heading "Hello Dolly," this time the popular theme was mutton dressed up as lamb - so who are we to buck a trend?\p
In fact, what has been established is that the telomeres in Dolly's cells show every sign that the cells are as old as Dolly's own lifetime plus the lifetime of her "mother" up until the time the donor cells were removed. So while Dolly appeared to be a lamb, it is probable that her cells carried the genetic markers that would label her as mutton.\p
Dolly's telomeres are about 20% shorter than those of similar sheep which were conceived naturally, according to a report in \INature.\i Curiously, Dolly's natural daughter, Bonnie, had normal length telomeres at birth, suggesting that the length of the telomeres is somehow reset during gamete formation, or as a result of the process of fertilization.\p
The finding on Dolly's telomeres was not entirely unexpected, because cells used for \Jcloning\j have to be cultured - allowed to divide many times in the laboratory - and this is a process that is known to shorten telomeres. As well, she was cloned from cells taken from a middle-aged animal, but even so, there are no signs of premature aging in Dolly yet, though these might still show up later. Dolly's telomeres are presumably still getting shorter, and there is the possibility that Dolly may die suddenly, when the (so far unknown) critical lower limit for the telomeres is reached.\p
The discovery may place limits on the potential of exciting new medical technology arising from developments in \Jcloning\j. For example, hopes of taking master cells from cloned human embryos and growing them into perfectly matched organs for transplant surgery could be thwarted if the process of creating the organs left them with a short life, which means that we are likely to see an acceleration of research into telomerase.\p
And the bad news for sheep: any problem caused by the shorter telomeres is unlikely to affect animals cloned for use as livestock, because food animals are usually slaughtered before they have lived more than around a third of their natural lifetime - well within the telomere-imposed "use-by date."\p
#
"Women under anesthesia",1087,0,0,0
(May '99)
A report in the May issue of \IAnesthesiology\i reveals that women appear to wake up almost twice as fast as men when general anesthesia is discontinued after surgery. This suggests that women react differently from men, and may need higher levels of anesthesia. This may also help explain why three times more women than men have complained of being conscious during surgery, according to the authors of the study.\p
Dosage is usually related to body size, so that women are typically given less anesthetic than men, but it is normal practice for dosages to be adjusted during surgery if the patient shows any sign of returning to consciousness. In this study, anesthesiologists used a new standardized technique that delivered the same concentration of an anesthetic drug to all patients, and then observed the patients' level of consciousness to see how quickly they awoke after the drug was stopped. \p
It is unusual for patients to be aware of surgery, mainly because many physicians tend to give more anesthetic than necessary: awareness can be devastating for the patient, especially if pain can be felt at the same time. So this information on sedation levels should help avoid that danger without running the risk of excessive medication.\p
The cause of the difference is unknown. It may arise, say the researchers, from differences in sensitivity, or perhaps a woman's metabolism breaks the anesthetic down faster, or perhaps both effects play a part.\p
The study began as a test of bispectral index monitoring, carried out by a device (a BIS monitor) that interprets brainwave patterns and assigns a numerical score to a patient's degree of unconsciousness. At the zero end of the scale we find a dead brain, while a fully functioning brain scores 100. In this first study on human responses, anesthesiologists have determined that the optimal range of "consciousness" for patients during surgery lies between 45 and 60. At 70 or above, patients are emerging from anesthesia, while below 35 they may be too deeply anesthetized. \p
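For readers who like such things spelled out, the quoted cut-points amount to simple threshold logic. The Python sketch below is purely illustrative - the function name and the wording of the labels are our own invention, and a real BIS monitor is far subtler than this:\p
# The BIS ranges quoted above, expressed as threshold logic.
# Illustrative sketch only; not how a real monitor works internally.
def interpret_bis(score):
    if not 0 <= score <= 100:
        raise ValueError("BIS scores run from 0 (no activity) to 100")
    if score >= 70:
        return "emerging from anesthesia"
    if score > 60:
        return "between the optimal and emergence ranges"
    if score >= 45:
        return "optimal range for surgery"
    if score >= 35:
        return "below the optimal range"
    return "may be too deeply anesthetized"

print(interpret_bis(55))   # prints: optimal range for surgery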
The BIS monitor was approved by the US Food and Drug Administration in 1996 for use on humans, but it is generally too expensive for all but the biggest institutions, so physicians generally rely on checking their patients' blood pressure and heart rate, and on looking for such signs as movement, changes in the size of the eye pupils, and sweating. \p
In this study, researchers re-examined data on 274 patients monitored with BIS, to find out which level gives the fastest recovery from anesthesia. The Duke University study was motivated by the knowledge that faster recovery would not only benefit a patient, but would save health-care dollars, because recovered patients need a lower level of supervision, and that means lower costs.\p
The sample was made up of 96 men and 178 women who were scheduled for general surgery, or for gynecologic, urologic, ENT (ear, nose, or throat), or orthopedic procedures. All were given the same doses of a common treatment, using a hypnotic drug called propofol, which affects consciousness, along with a painkiller called alfentanil and nitrous oxide, which also mediates pain and awareness.\p
Female patients recovered in an average of seven minutes, while the men averaged more than 11 minutes, even though all participants had exactly the same dose of anesthetic. The aim of the study was to keep all of the patients in the BIS range from 45 to 60, and while this occurred, the women generally had a higher BIS score than the men.\p
This is in accordance with an earlier study in which women had higher BIS levels than men with the same blood concentration of anesthetic. Now all we need to do is find out why women are different.\p
\BKey names:\b Tong Gan and Peter Glass.\p
#
"A neurological basis for dyslexia?",1088,0,0,0
(May '99)
A study published in the \IProceedings of the National Academy of Sciences\i in late May points to an interesting explanation for dyslexia. It appears that reading deficits may stem from a deficit in the cortical processing of sound inputs. The authors say that dyslexic adults have a functional abnormality in a part of the brain which we use to process brief, rapidly successive sounds. Simultaneous brain imaging and behavioral tests appear to confirm that there is some sort of lasting disability in processing these sounds.\p
In other words, the basis for dyslexia appears to be associated with a lack in a particular brain function, and the authors believe that this lack may cause difficulties in early speech and language learning, and lead to a weakness in the later mental leap in abstraction to words on a page.\p
They say that the way the brain processes sound in poor readers is very different from its processing and representation of rapidly changing sound inputs in competent readers. The adult dyslexics appear to be representing the sound parts of words in a weaker way, so that a less effective form of representation reaches the regions of the brain involved in speech perception and reading.\p
While this finding does not rule out the possibility that there may be other processing faults which contribute to dyslexia, the difference does appear to be enough to tip the balance in adults, leading to dyslexia. In other words, it would be a sufficient cause, but it may not be the only or necessary cause of dyslexia.\p
\BKey name:\b Michael Merzenich.\p
#
"The human capacity for mathematics",1089,0,0,0
(May '99)
A report in \IScience\i in early May sheds a fascinating light on how human minds approach \Jmathematics\j. The evidence of mathematicians themselves has always suggested that there are at least two ways of thinking about \Jmathematics\j, and this now seems to be confirmed. Albert Einstein reported that numerical ideas came to him in "images, more or less clear, that I can reproduce and recombine at will," while other mathematicians report that they process \Jmathematics\j by way of language-related symbols, or verbal representations of numbers. \p
It now appears that the visual-spatial mode and the linguistic mode of doing \Jmathematics\j work together. The authors, from \JFrance\j and the USA, believe that this finding may help children who struggle with numbers.\p
Studies of brain-damaged patients reveal that some can subtract, through a nonverbal quantity-based operation, but cannot multiply, which involves a rote verbal operation, while others can multiply but not subtract. The new study confirms this two-mode theory and locates the point where such mental activity takes place in the brain.\p
The method used volunteers who were fluent in both Russian and English; they were trained in the \Jmathematics\j needed to solve certain problems in either Russian or English, and then tested in one of the two languages.\p
Where the task involved an \Iexact\i problem, like deciding if the sum of 53 and 68 is 121 or 127, the volunteers were slower when they were tested in the second language, presumably because the problem used the linguistic mathematical ability. When they were given an \Iapproximate\i mathematical problem, like deciding if 53 plus 68 is closer to 120 or 150, the volunteers showed no language-dependent lag in their answers, suggesting that this task does not involve linguistic mathematical abilities.\p
This language-based distinction was also demonstrated in other more complex mathematical tasks such as addition in bases other than 10, and approximations of logarithms and square roots. And functional brain imaging techniques showed that exact calculations lit up the volunteers' left frontal lobe, an area of the brain known to make associations between words. Mathematical estimation, on the other hand, involved the left and right parietal lobes, parts of the brain responsible for visual and spatial representations and also for finger control.\p
Perhaps significantly, finger counting is typically an early stage in a child's learning of exact arithmetic, and the authors point out that both preverbal human infants and monkeys can numerically distinguish among small groups of objects. So this innately grasped nonverbal sense of quantity, an ability that humans share with other primates, may be a key part of the power which only humans have: the symbolic mode of mathematical thought, the ability that allowed Einstein to capture the whole universe in a single equation. \p
The findings do not give us a method of selecting children who are "naturally" better or worse at \Jmathematics\j, but they do suggest that the impact of education may be more important than any inherited ability.\p
\BKey names:\b Stanislas Dehaene and Elizabeth Spelke.\p
#
"A hormone treatment for autism",1090,0,0,0
(May '99)
Autism is a severe developmental disorder which interferes with a person's ability to communicate or relate socially with other people. Afflicted individuals also have a restricted range of activities and interests. Children with autism have high blood levels of \Jserotonin\j, a hormone which is produced in the small \Jintestine\j; the small intestine also produces another hormone, secretin. Both \Jhormones\j are found in the brain as well, and they can affect sleep, appetite, and other brain-regulated functions.\p
Now there is a developing belief, or perhaps suspicion, that injections of secretin reduce the symptoms of autism, with injected children reportedly showing improvements in their eye contact, alertness and use of language. Oddly, given that the secretin stays in the blood for only a brief time, no longer than 20 minutes, its effects have been noted weeks later. \p
The secretin used is very similar to natural human secretin, but it is taken from pigs, and while the extract is 60% secretin, the remaining material is unknown, though it is probably made up of \Jpeptide\j fragments. At this stage, nobody knows if the effects, assuming they are real, are brought about by the unknown parts or by the secretin itself.\p
To unravel these questions, large-scale trials were started during May, using a \Jplacebo\j treatment, artificial and purified pig secretin, and natural secretin. About a hundred children, aged between 3 and 12, will be treated across two trials located in Seattle and \JDenver\j, with other trials due to happen soon. At the very least, this will satisfy the parents of the many children who have already received secretin injections, whether as a treatment for autism or to diagnose the digestive and gastrointestinal problems that frequently accompany the disorder. There is always the risk that parents' observations will turn out to be biased, which is why a \Jplacebo\j is being used, but at least the link with \Jserotonin\j means that there is a reasonable theory to indicate why the injections \Imight\i have an effect.\p
If the tests appear to show that secretin has a positive effect, a larger study will be launched, aiming at identifying proper dosages, and any possible side-effects.\p
\BKey names:\b Alan Unis, Geraldine Dawson, Ed Goldson, and Sally Rogers.\p
#
"Cleaning up video with VISAR",1091,0,0,0
(May '99)
Two NASA scientists, David Hathaway and Paul Meyer, both from NASA's Marshall Space Flight Center, described an interesting way of cleaning up blurred and jittery video images in a release on the \JInternet\j during May. VISAR, or Video Image Stabilization and Registration in full, is a computer \Jalgorithm\j that corrects for zoom, tilt, and jitter. \p
The procedure is simple: a digital picture is made up of pixels, tiny blocks of color, and even if the camera zooms or shakes, the corresponding pictures in several frames can be related to each other. This allows the system to register on an object in the image, lining up the pixels from several video frames and producing a steadier video image.\p
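NASA has not published VISAR's internals here, but the registration-and-averaging idea can be sketched in a few lines. The Python fragment below is a minimal illustration using phase correlation, a standard image-registration technique, not NASA's actual code: it estimates how far each frame has jittered relative to the first, shifts it back, and averages the stack. Correcting zoom and tilt, as VISAR does, would need a more general transform.\p
import numpy as np

def estimate_shift(reference, frame):
    # Phase correlation: the normalized cross-power spectrum of two
    # shifted copies of an image peaks at the relative displacement.
    F1 = np.fft.fft2(reference)
    F2 = np.fft.fft2(frame)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12          # normalize, avoiding /0
    corr = np.fft.ifft2(cross).real
    dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
    if dr > corr.shape[0] // 2:             # peaks past the midpoint
        dr -= corr.shape[0]                 # wrap to negative shifts
    if dc > corr.shape[1] // 2:
        dc -= corr.shape[1]
    return dr, dc

def stabilize_and_average(frames):
    # Register every frame against the first, then average the stack
    # so that random noise ("snow") cancels out over many frames.
    reference = frames[0].astype(float)
    stack = [reference]
    for frame in frames[1:]:
        dr, dc = estimate_shift(reference, frame.astype(float))
        stack.append(np.roll(frame.astype(float), (-dr, -dc), axis=(0, 1)))
    return np.mean(stack, axis=0)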
The development began when the \JFBI\j needed to use a video showing the bombing of the 1996 Olympic Games in \JAtlanta\j. The video was of particularly poor quality, so the \JFBI\j asked if NASA could help improve the video's clarity.\p
Meyer is a meteorologist and computer scientist who processes \Jweather\j \Jsatellite\j images, and Hathaway is a solar physicist who uses video stabilization techniques to enhance pictures of the \JSun\j. As Hathaway points out, telescopes are always shaky, so this was a familiar problem.\p
This was worse, though. The video, taken with a hand-held camcorder, had been shot at night, but at least there were some 400 frames, around 13 seconds' worth of shots to use, and in the end they managed to stabilize, sharpen, and brighten the image.\p
The usual method of "sharpening" an image simply increases the contrast at perceived edges, without taking into account how the blurring occurred, or what it represents. As a result, sharpening usually produces a degraded image. Worse, when the image is brightened, random noise ("snow") is amplified along with the real signal, degrading the image still more. With VISAR technology, the system combines several video images, allowing the noise to be averaged out over several frames.\p
Based on their success with the \JFBI\j video, and encouraged by a warm response from scientific colleagues, Hathaway and Meyer decided to create a new product - VISAR - which is currently covered under a provisional patent and will soon be available for licensing.\p
Hathaway thinks marketing is likely to concentrate on law enforcement, where the system could help clarify grainy, shaky shots taken during car chases, or single out faces in the crowd at a fire - arsonists typically return to the scene of their crime. Another target market is likely to be medical imaging, where the ultrasound images shown to infanticipating parents might finally reveal something more than a fuzzy blur.\p
Meyer sees applications for VISAR in \Jmeteorology\j, where the system might help to track changes in \Jcloud\j formations and storms. It might be particularly useful, he thinks, in tracking the eye of a hurricane, and in analyzing \Jtornado\j images which are typically shot on home video cameras by people unlucky enough to be caught in the path of the storm. He says that if the image is steadied on the \Jtornado\j, researchers could track the whirling objects hurled around by the storm, allowing better estimates of wind speed.\p
The biggest market of all, though, is likely to be the home market, where learner video operators would be delighted to find their amateur errors ironed out. Many camcorders have built-in anti-jitter devices, but there is currently nothing to fix problems with zoom and tilt.\p
The hype does not end there: the developers see VISAR being used for special effects, but some of the best uses will need to wait until real-time VISAR processing and correction can be used, and that may still be a few years off. The original version took five minutes to process a single frame, while the current version has this down to 15 seconds.\p
#
"Are the birds modern dinosaurs?",1092,0,0,0
(May '99)
The world of paleontology includes many people with strong views, but few are as forthright as those who say the dinosaurs never died, that they simply grew feathers and became birds. While there is a lot of circumstantial evidence to suggest that today's birds arose from yesterday's theropods, there are also some difficulties.\p
A study reported in \INature\i during May looks at \Jdinosaur\j footprints preserved in three dimensions. The authors report both similarities and differences between modern-\Jday\j fowl and ancient theropods, but they nonetheless assert their faith that birds evolved from dinosaurs. \p
The meat-eating theropods slopped through Triassic mud in what is now \JGreenland\j, 210 million years ago, and left behind impressions which preserve their three-dimensional foot motions. Examining these, the authors say, reveals significant differences in the position of the big toe, foot posture, and hind limb movement, but they still assert that if today's birds do not move exactly like 210-million-year-old dinosaurs, they are still the closest thing alive today.\p
The theropods were about the size of a human, and strode on their hind legs. Where they left three-toed shallow impressions on firm ground, the sloppier mud areas allowed the feet to sink deeper, leaving unusually long four-toed prints. The deeper tracks captured the movements of the foot below the surface, giving us a three-dimensional record of the theropods' locomotor behavior.\p
The main surprise comes from the "first toe," the equivalent to our big toe. In most birds, this is reversed, allowing the bird to perch on a branch, or to grasp prey. The apparent track of a first toe in the mud seems to suggest that there may have been perching theropods, whose fossils are yet to be found, but the trackways reveal that the imprint is an artifact of the way the theropods walked, rather than an indication of the way the toe pointed.\p
More importantly, the tracks reveal imprints of the sole of the foot, which tell us that the limb moved from the hip, like an alligator, rather than from the knee, like modern birds. When a bird lifts its knee, this raises the sole of the foot before it can leave a print.\p
So given the differences, why do the researchers still believe that birds evolved from dinosaurs? The similarities, they say, are enough, with the toes of the foot converging in the mud - the opposable toe, they say, did not evolve until later.\p
\BKey names:\b Stephen Gatesy and Kevin Middleton.\p
#
"Explaining the scale of things",1093,0,0,0
(May '99)
The pulse rate of a mouse weighing 30 grams (one ounce) may be 600 beats per minute, whereas the heart of an elephant weighing 5,000 kg (5 tons) beats only about 30 times per minute. Many biological processes change in a peculiar but precisely predictable way, depending on the mass of the organism's body. Up until now, nobody has known why, but in mid-May, a group of physicists reported in \INature\i on what they believe is an explanation of allometric quarter-power scaling.\p
This general \Jtheorem\j, they assert, is able to explain the whole scaling phenomenon, with many other biological properties of animals and plants varying as quarter powers of the body mass. But why 4? All living things exist in three dimensions, so any scaling could reasonably be expected to relate to the cube of some dimension or other, not to some fourth power, yet right across the living world, the fourth-power relationship holds.\p
The physicists began by assuming that the natural selection which drives \Jevolution\j would produce highly efficient circulation systems and other kinds of natural transportation networks. If these systems show the same general behavior they would fit the same underlying mathematical description simply because there would only be one solution delivering peak efficiency. This is a little like assuming that if you are on a mountain, so long as you keep going up, you will be moving closer to the highest point.\p
Stretching the analogy, the assumption is riskier on a mountain range, where always heading upwards may only take you to the top of a lesser peak. All the same, it was a reasonable assumption to make in exploring the data, because if they were wrong, there would simply be no answer, or there would turn out to be different answers for different groups of living things.\p
They chose to explore transportation networks, living and non-living, that have been shaped towards efficiency - by natural selection in one case, and by erosion or engineering in the others. In other words, they looked at circulatory and respiratory networks in plants and animals, but they also examined networks of streams in the drainage basin of a river, and networks of pipes for transporting \Jwater\j to homes.\p
Taking this last example, they established that if they are to supply \Jwater\j to L\UD\u houses, the minimum amount of \Jwater\j in the pipes at any given time scales at least as L\UD+1\u. The value L in this case is the circulation length or the spatial extent of the colony of houses, while D is the number of dimensions in which the network functions.\p
To supply the same amount of \Jwater\j to all the homes, the total flow in the pipes must be proportional to the number of homes in the system, or L to the power of D, but an additional factor of L is needed to account for the \Jwater\j filling the pipes along the minimum distance necessary to reach the most remote home. This raises the minimum requirement to L\UD+1\u, which means that three-dimensional organisms should scale in a relationship linked to the fourth power of their size.\p
The interested reader can test this by multiplying the heartbeat rate by the mass raised to the power of 0.25, which yields products close to the 250 mark. (Using metric values: your 70 kg reporter, with a normal pulse of about 70, comes in at just over 200, which is close enough to demonstrate the effect.)\p
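A few lines of Python make the same check for the animals mentioned above, using the figures quoted in this story:\p
# Quarter-power scaling check: pulse rate times the fourth root of
# body mass should come out roughly constant across species.
# Masses in kilograms, pulse rates in beats per minute.
animals = {"mouse": (0.03, 600), "human": (70, 70), "elephant": (5000, 30)}
for name, (mass, pulse) in animals.items():
    print(name, round(pulse * mass ** 0.25))
# prints: mouse 250, human 202, elephant 252 - all near the 250 mark,
# across more than five orders of magnitude in body mass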
The relationship also holds for networks of small streams channeling precipitated \Jwater\j into the main stream of a river. These channels are sculpted by erosion, with the forces of efficiency apparently operating just as effectively on inanimate hills and mountains as they do on life forms.\p
\BKey names:\b Jayanth Banavar, Amos Maritan, and Andrea Rinaldo.\p
#
"High carbon dioxide boosts Duke forest growth 25%",1094,0,0,0
(May '99)
Duke University forest plots bathed in the higher carbon-dioxide \Jatmosphere\j expected by the year 2050 experienced a 25% boost to their growth over the first two years of a continuing experiment. The work, previously described in \BThe effects of high CO\D2\d levels,\b August 1998, was reported in \IScience\i in mid-May. While the potential for forests to sop up increasing human-caused carbon dioxide emissions has until now been only a matter for speculation, there are now some real data, allowing for global extrapolation.\p
If the world's forests were all to be growing 25% faster 50 years from now, these results would suggest that woodland plant life could serve as a "sink" for about half the expected carbon dioxide emissions from \Jfossil\j fuel combustion. Against this, the Duke Forest plots are dominated by 13-year-old loblolly pines, among the world's fastest growing tree species, at their peak growing age, which may lead to an unreasonable expectation. So it is probably safest to treat this figure as indicating the maximum extent of any forest sink effect.\p
Three 100-foot (30 meter) diameter forest parcels, each ringed by 16 towers, are receiving 1.5 times the current atmospheric concentrations of carbon dioxide. The gas is provided around the clock by pipes and valves on the towers. Computerized controls on the towers maintain the raised level of CO\D2\d, regardless of wind direction and speed.\p
\BKey names:\b Evan DeLucia and William Schlesinger.\p
#
"The cost of low cholesterol",1095,0,0,0
(May '99)
The popular view is that high \Jcholesterol\j is bad and low \Jcholesterol\j is good, but this may be open to challenge after a study reported in the June issue of \IPsychosomatic Medicine,\i which was released during May. According to the report, healthy young women with naturally low \Jcholesterol\j levels are more likely to suffer poor psychological health than women with moderate to high \Jcholesterol\j levels.\p
The study looked at 121 American women aged from 18 to 27, with 69 of the subjects having low \Jcholesterol\j levels. The low \Jcholesterol\j women were almost twice as likely to score high on depression and anxiety scales. \p
\BKey name:\b Edward C. Suarez.\p
#
"Bacteria clean up DDT",1096,0,0,0
(May '99)
The journal \INew Scientist\i carried a story in early May about the use of \Jbacteria\j to break down DDT in contaminated soil. All you need to set off the bioremediation process, as it is called, is a supply of chicken and cow manure, old newspapers, straw and wood chips, and soil \Jbacteria\j which convert chlorinated pesticides to less toxic by-products by using enzymes known as dehalogenases to chop out offending \Jchlorine\j groups.\p
The only other solutions are heating the soil in massive kilns, which is expensive, or one of two approaches now progressively being outlawed: trucking the soil somewhere else, or sealing the soil and burying it in landfill. Clearly, it is better to cover and mix the soil with a giant imported compost heap, and deal with the problem on the spot. After that, all that is required is tilling to provide the mix of aerobic and anaerobic conditions that favors the breakdown of the DDT.\p
One reported test carried out by the developers of the patented process, on the site of an old \Jpesticide\j factory, cut DDT levels in the soil by more than 95%. The developers are now working on modifications which can deal with persistent organic pollutants such as PCBs.\p
\BKey name:\b AstraZeneca is the developing company.\p
#
"A marker for malignant brain tumors",1097,0,0,0
(May '99)
A potential marker for the most malignant brain tumors was reported in \IClinical Cancer Research\i during May. Promising to be a valuable diagnostic tool for physicians, the marker points to high-grade gliomas, including glioblastoma multiforme (GBM). These are the most common and most difficult-to-manage brain tumors.\p
It was previously assumed that the variety found in these tumors would make it impossible to find a single marker, but 22 of a sample of 23 GBM patients have a receptor for an immune regulatory factor called interleukin 13, or IL 13.\p
In mouse studies, the researchers have shown that a powerful cytotoxin attached to the IL 13 kills the brain tumor in at least 40% of the cases, without harming normal brain cells. Because of these excellent results in animal testing, the researchers hope to begin clinical trials by the end of the year. \p
\BKey name:\b Waldemar Debinski.\p
#
"Melatonin not a good idea?",1098,0,0,0
(May '99)
Melatonin has become something of a fad "supplement" to the diet of the 1990s, but research published in \IChemical Research in \JToxicology\j\i in mid-June, but released on the Web late in May, suggests that the people taking it may be getting more than they bargained for. The users of melatonin believe they will get antioxidant or sleep benefits, but the report indicates that melatonin reacts with chemicals in the body to form compounds that could alter behavior. \p
Melatonin is a natural product, formed in the body in very low concentrations to control sleep cycles. While it is sold as a sleeping aid and as an antioxidant, the research shows that the antioxidant properties of melatonin are very modest at best. As well, the authors believe that metabolizing excess melatonin may cause more harm than good, because they suspect that some of the secondary products of melatonin have as yet unrecognized health effects.\p
Carbonate and \Jnitrogen\j dioxide radicals are constantly formed in the body from peroxynitrite, and these react with melatonin to form two cyclic metabolites that resemble brain signaling chemicals but whose biological function is unknown. \p
The studies so far have all been undertaken "in the test tube," and will need to be followed up with further studies in animals and humans. For now, though, they are sufficiently confident to sound a note of warning that those future studies may reveal evidence of a serious risk.\p
\BKey names:\b William A. Pryor and Giuseppe L. Squadrito.\p
#
"June, 1999 Science Review",1099,0,0,0
\JA genetic cause for Gulf War syndrome?\j
\JNerve regrowth breakthrough\j
\JTwo new elements\j
\JThe gender gap in mathematics\j
\JA new world record prime number\j
\JThe march of the centenarians\j
\JWhy people die in English hospitals\j
\JOsteoporosis drug reduces the risk of breast cancer\j
\JWHO warns of microbial threats\j
\JA decline in city TB rates\j
\JTaking the scenic route to Io\j
\JRadar provides first 3-D views of moon's poles\j
\JRats operate robotic arm via brain activity\j
\JGetting the Antarctic Treaty online\j
\JGiving a shake to the family tree\j
\JDinosaurs and evolution\j
\JA new bird with the oldest beak\j
\JClearing the water fern menace in South Africa\j
\JWorld's smallest deer species discovered\j
\JAlaska's Columbia glacier getting faster\j
\JContrails and climate\j
\JCassini mission going well\j
\JGray wolf nears recovery in Yellowstone\j
#
"A genetic cause for Gulf War syndrome?",1100,0,0,0
(Jun '99)
An article published in mid-June in \IToxicology and Applied Pharmacology\i suggests that the cause of "Gulf War syndrome" may be linked to possession of a particular gene. The syndrome, variously blamed on anti-nerve gas treatments, smoke from burning oil wells, exposure to nerve gases or other unspecified pesticidal chemicals, and many other possible causes, has always posed one difficult question: why does one person get sick when the next person does not? Because of the patchy nature of the syndrome, many researchers have long suspected that the symptoms were brought about by stress.\p
Now Robert Haley, Bert La Du, and Scott Billecke think the answer may be found in the possession of a gene which controls the levels of an \Jenzyme\j, Q paraoxonase, or PON-Q, an \Jenzyme\j that allows the body to fight off chemical toxins by destroying them. They say that those people with a gene that causes them to produce high amounts of the \Jenzyme\j were protected, and did not get sick after exposure to certain chemicals in Operation Desert Storm, while others who produce low amounts of the same \Jenzyme\j did get sick.\p
The \Jenzyme\j is highly specific for the chemical nerve agents sarin and soman as well as for the common \Jpesticide\j diazinon. Those with high levels of PON-Q can fight off toxins like nerve gas, while others, with low levels of PON-Q cannot fight off even low levels of these toxic chemicals. Significantly, blood levels of a genetically similar \Jenzyme\j, PON-R, which destroys other chemicals, were the same in both sick and well Gulf War veterans.\p
The implication is that there really is a genuine Gulf War syndrome, and that whatever the cause, it is related to one or more biologically active chemicals. There is an outside chance that the depressed PON-Q levels are another symptom of a condition brought about by something else, but the smart money appears to be on a straightforward chemical exposure.\p
#
"Nerve regrowth breakthrough",1101,0,0,0
(Jun '99)
Many nerve-damaging disorders such as multiple sclerosis (MS) and Guillain-Barré syndrome may be a little easier to understand following a report in the \IProceedings of the National Academy of Sciences\i in June. The report describes studies in mice, but the findings should also be applicable to humans, where the molecules involved are essentially identical.\p
Scientists at Johns Hopkins and the National Institutes of Health have verified a previously suspected molecular bridge between nerve cells and their surroundings which, when broken, causes nerves to deteriorate. This deterioration appears to be identical to what happens in a variety of neurodegenerative diseases, and if it could be reversed, might make it easier to stimulate regrowth in the brain and spinal cord.\p
Investigators of MS and other puzzling nerve diseases have focused on the \Jmyelin\j sheath which surrounds nerve cells, acting as an insulator, rather like the plastic on an electrical cord. In these disorders, the \Jmyelin\j comes off the surface of the nerve cell. Beneath it, the elongated part of the cell that relays messages, the axon, disintegrates.\p
The secret seems to lie in a natural \Jlinkage\j, a "molecular handshake" as one researcher calls it, between the \Jmyelin\j and the axon it insulates. The handshake is not structurally important in itself, because it does not hold the \Jmyelin\j in place, but it apparently trips a series of reactions on either side that are necessary for normal nerve behavior.\p
The report looks at molecules called gangliosides which extend from the surface of the nerve cell membrane, and which form one half of the handshake. Gangliosides are a group of complex \Jcarbohydrate\j and lipid-based molecules and a hallmark of nerve cell membranes.\p
The standard way to study the effect of a molecule is with knock-out gene techniques. In this case, the scientists bred mice with simpler-than-normal gangliosides by eliminating a gene that affects molecule length. This was all that was needed to block the attachment of a protein-attractive portion at one end of the ganglioside.\p
In early middle age, these mice showed a deterioration of brain, spinal cord, and peripheral nerves, which was clearly visible under the \Jmicroscope\j. The damage was similar to that seen in chronic human neuropathies, and the mice were weaker, having trouble moving and showing impaired key reflexes.\p
The other half of the handshake, a molecule called \Jmyelin\j-associated glycoprotein (MAG), is also known to influence nerve cell stability, and mice with the gene for MAG knocked out also suffer a loss of \Jmyelin\j and deteriorating nerves. The damage in the MAG-deprived mice is very similar, supporting the view that the gangliosides normally pair with the MAG molecules.\p
The main reason why nerves do not grow or regenerate in the brain and spinal cord is that \Jmyelin\j inhibits nerve regrowth there, and the MAG component is believed to lie behind this inhibiting effect. After an injury to peripheral nerves, the regrowth-blocking, \Jmyelin\j-containing debris gets cleaned up quickly by the immune system, but the cleanup is much slower in the brain and spinal cord.\p
So in the long term, a clear understanding of the "handshake" may allow scientists to design something that could block MAG's inhibitory action. For the moment, though, this remains just another small step in the working out of a giant scientific conundrum. \p
\BKey names:\b Kazim Sheikh, Ronald Schaar, Thomas O. Crawford, and John W. Griffin.\p
\BKey Web sites:\b http://www.bs.jhmi.edu/pharmacology/schnaarlab/welcome.html, http://www.nmss.org \p
#
"Two new elements",1102,0,0,0
(Jun '99)
After element 114 was declared "found" in January, two more "superheavy" elements have now been spotted by scientists at the US Department of \JEnergy\j's Lawrence Berkeley National Laboratory. The elements, numbered 116 and 118, were formed by bombarding targets of lead with an intense beam of high-\Jenergy\j krypton ions.\p
The elements decay almost immediately into other lighter elements, but the way they break down supports the long-held view that there is an "island of stability" for nuclei with approximately 114 protons and 184 neutrons. Elements become increasingly unstable as they approach and pass uranium, but then stability is expected to increase again. The element 118 isotope formed had a mass number of 293, indicating that it had 118 protons and 175 neutrons.\p
Element 116 was discovered as a by-product: less than a millisecond after its creation, the element 118 nucleus decays by emitting an alpha particle, leaving behind an isotope of element 116 with mass number 289, containing 116 protons and 173 neutrons. This then decays by successive alpha decays, at least down to element 106, at which point the trail was lost.\p
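The bookkeeping of an alpha decay chain is simple arithmetic: each emitted alpha particle carries away two protons and two neutrons, so the atomic number drops by 2 and the mass number by 4 at every step. A short Python sketch (the entries below element 116 follow from that rule, not from the published measurements):\p
Z, A = 118, 293              # the isotope formed at Berkeley
while Z >= 106:              # the trail was lost at element 106
    print(f"element {Z}, mass number {A}")
    Z, A = Z - 2, A - 4      # one alpha emission
# element 118, mass number 293
# element 116, mass number 289
# ... and so on, down to element 106, mass number 269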
A paper on the discovery has been submitted to \IPhysical Review Letters.\i The method used was proposed by a visiting theoretician, Robert Smolanczuk from \JPoland\j, who calculated that the reaction used should have particularly favorable production rates - previously, "cold" reactions were considered unlikely to give nuclei with an atomic number higher than 112. Ken Gregorich, a nuclear chemist who led the discovery team, said in a release that "We were able to produce these superheavies using a reaction that, until a few months ago, we had not considered using. Our unexpected success in producing these superheavy elements opens up a whole world of possibilities using similar reactions: new elements and \Jisotopes\j, tests of nuclear stability and mass models, and a new understanding of nuclear reactions for the production of heavy elements."\p
The various reports on the \JInternet\j list the usual giant cast of participants, drawn from many parts of the world, that we have come to expect from a modern breakthrough such as this.\p
If you want to try this at home, you will need a beam of krypton-86 ions accelerated to an \Jenergy\j of 449 million \Jelectron\j volts, and some way of directing the beam into targets of lead-208. And even if you can achieve that, scrap any plans to make ornaments from the new element: so far, just three atoms of it have been formed.\p
The \Jenergy\j of the krypton ions is just greater than the Coulomb barrier, the \Jenergy\j required to fuse the two nuclei together. While it is not possible to demonstrate directly that the decay chain starts with element 118, no other reaction sequence makes sense.\p
One key to the discovery was the newly constructed Berkeley Gas-filled Separator (BGS). Its design gives excellent efficiency and background suppression, allowing scientists to investigate nuclear reactions with production rates smaller than one atom per week. The strong magnetic fields in the BGS allowed it to pick out the element 118 ions and separate them from all of the interfering reaction products, which were produced in much larger quantities.\p
The other key was the unique ability of the 88-Inch Cyclotron to accelerate neutron-rich \Jisotopes\j such as krypton-86 into high-\Jenergy\j, high-intensity beams with an average current of approximately 2 trillion ions per second. The 88-Inch Cyclotron has been in use since 1961, and has been upgraded to handle ions all the way from \Jhydrogen\j at the low end to uranium at the high end.\p
Coming soon: further expeditions towards the island of stability.\p
#
"The gender gap in mathematics",1103,0,0,0
(Jun '99)
In most of the western world, boys and men perform better than girls and women on standardized \Jmathematics\j tests, but the females normally do better in class. A report released on the \JInternet\j in late June anticipated an article by James M. Royer, a \Jpsychology\j professor at the University of \JMassachusetts\j, in the July issue of the journal \IContemporary Educational \JPsychology\j,\i which suggests that the difference is tied in to the ability to perform split-second mental calculations.\p
Royer and his colleagues say that American males score an average of 40 points higher than females on the \Jmathematics\j Scholastic Aptitude Test (SAT) used for college admissions in the USA, but that females get better grades in high school and college. Yet high-school females take advanced \Jmathematics\j courses as often as high-school males, so the difference is not brought about by the girls opting for easier courses. Instead, there is a greater range of scores between males who do well on the tests and those who do poorly, than among females at the ends of the range.\p
At some time in the primary grades, boys begin completing \Jmathematics\j calculations "in their heads," and over time they become better and faster at this task than girls, says Royer. It is possible that this allows males to perform better on the tests, where they spend less time than females solving each problem, giving them more time to spend on the harder problems, boosting their scores. In the classroom, the test advantage is diluted, and other attributes such as preparedness, good study habits, and behavior, where females typically score better, become more important.\p
Royer suggests that the advantage comes from practice and repetition, with males beginning earlier and learning to do the mental calculations faster and in a more automatic fashion. In situations where both males and females are given practice mental calculations, with daily, multiple problems to solve over several weeks, both males and females become faster and the difference begins to disappear, he says.\p
But what of the wider range of scores among males? That, according to Royer, seems to be caused by males being more likely to disengage from academics while females generally pay more attention to their studies. However you look at it, there seems to be a message there for educators.\p
#
"A new world record prime number",1104,0,0,0
(Jun '99)
What is the value of a prime number? School children around the world run into small prime numbers in their arithmetic, but for most people, that is as far as it goes - on the surface. But below the surface is another story, and gigantic prime numbers are essential to things we have come to regard as part of our lives.\p
Large prime numbers are basic to cryptography, privacy, and computer security, which means that your purchases over the \JInternet\j, most banking transactions, the security of your digital phone calls, and many other things depend on them.\p
Around the world more than 12,000 computer users have had their computers slaving away, \Jday\j and night, searching for new Mersenne prime numbers. And while they do it just for fun, they all know there is a practical side to their search. \p
Calling themselves GIMPS (the Great \JInternet\j Mersenne Prime Search), and led by a \JFlorida\j number theorist, George Woltman, these enthusiasts sift through the integers, the whole numbers, looking for those which are Mersenne primes. These are rare but findable, and Nayan Hajratwala, using a 350 MHz Pentium II \JIBM\j Aptiva computer running part-time for 111 days, has just found the 38th Mersenne prime.\p
These unusual prime numbers are not the stuff of everyday conversation, though you will find details of Mersenne primes on our \JCD\j-ROM under \1Marin Mersenne\c and in previous updates: \BMathematics update,\b January 1997; and \BA new world-record prime number,\b October 1997, because this is a growing story that we have been following.\p
Since 1996, the search for new Mersenne primes has been made easier. People who join GIMPS are given a block of numbers to work through. They feed the numbers, together with some special software, to their computer, and the computer then uses spare processing time to search for values which may give a new Mersenne prime.\p
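The test at the heart of that software is the Lucas-Lehmer test, which decides whether 2 to the power p, minus 1, is prime without ever factoring it. Here is a minimal Python sketch; the real GIMPS programs do the same arithmetic with highly optimized FFT-based multiplication, which is what makes million-digit candidates feasible:\p
def is_mersenne_prime(p):
    # Lucas-Lehmer: for an odd prime p, 2**p - 1 is prime exactly when
    # s(p-2) is 0 mod 2**p - 1, where s(0) = 4 and s(i+1) = s(i)**2 - 2.
    m = 2 ** p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

def is_prime(n):
    # naive trial division, good enough for small exponents
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# prints the first few Mersenne prime exponents: [3, 5, 7, 13, 17, 19, 31]
print([p for p in range(3, 32) if is_prime(p) and is_mersenne_prime(p)])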
Other computer owners use their home computers to search for other special numbers with special applications, exotic things called Golomb rulers, Proth primes, Sophie Germain primes, twin primes, and Cunningham chains. The SETI@home members sift through radio-\Jtelescope\j data, looking for signs of extraterrestrial intelligence, and other groups are planning other searches. \p
All of these groups use a form of shared processing which relies on the \JInternet\j. Members log on, get a set of data (SETI@home) or a block of numbers (GIMPS), and then use spare time on their computers to process the data for anything interesting. \p
According to experts, the future of computing will involve more approaches like this, which they describe as a "virtual massively parallel supercomputer," as more and more computers are attached to the \JInternet\j. The technique is also called "distributed computing," and while it provides "free time" for what people see as interesting work, it also allows the development of algorithms and methods that will shape computing in the next century.\p
The number of members involved in GIMPS is small when you compare it with the more than half a million computer owners around the world who have joined the SETI@home organization (http://setiathome.ssl.berkeley.edu/) in the past two months to search for signs of extraterrestrial intelligence in data collected from radio telescopes, but the GIMPS people now have no fewer than four Mersenne primes as trophies to show for their efforts.\p
The most recent discovery is the 38th Mersenne prime, and it is also the largest yet. It was discovered in early June and confirmed on a supercomputer at the end of June, but the actual value of the new "largest prime" was a closely held secret until July 5, when it slipped out in a newsletter to GIMPS supporters.\p
GIMPS has now had four successful "hits" in as many years, yet these are hardly known outside the world of \Jmathematics\j, while the first hint of success from SETI@home will bring a frenzy of press enthusiasm. George Woltman, the leader of GIMPS, says that he has now found out how to make the press sit up and take at least some notice of something that would usually interest only mathematicians.\p
Woltman explained his new principle to GIMPS members in his newsletter: ". . . tell them it's a secret. The \IOregonian\i was doing an article on Richard Crandall and when they found out there was a new prime and we wouldn't tell them what it was, their interest level went way up!"\p
And the actual value of the new record number? Stated in mathematically precise terms, it is 2\U6972593\u-1. But if you wrote this out in full, the number would have 2,098,960 digits in it - too large to fit on a single floppy disc, and enough to fill the pages of a 1000-page paperback from cover to cover. If you were to read it out, for eight hours a \Jday\j, it would take two months to say all the digits.\p
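The digit count is easy to verify without ever writing the number out: the number of decimal digits in 2 to the power p is floor(p x log10(2)) + 1, and subtracting 1 cannot change the count, because no power of 2 is a power of 10. A one-line check in Python:\p
import math
print(math.floor(6972593 * math.log10(2)) + 1)   # prints 2098960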
Even now, around the world, computers are working to find M39, the 39th Mersenne prime. Finding this number should be about 125 times as hard as finding M38 was, so more volunteers are needed. To join in the search, all you need to do is connect to their home page (http://www.mersenne.org/) and follow the instructions. There \Iwill\i be other numbers to find, and there may be smaller Mersenne primes that have been missed out. There is a rather nice cash prize for the finder of M39, just like a lottery, but the tickets are free, say the GIMPS organizers - if you have a computer.\p
#
"The march of the centenarians",1105,0,0,0
(Jun '99)
The US \JCensus\j Bureau revealed in mid-June that the number of centenarians in the US was growing rapidly, almost doubling from about 37,000 counted at the start of the decade, to more than an estimated 70,000 today. And the exponential growth looks set to continue, perhaps reaching 834,000 by the middle of the next century, with a "high-end" estimate of 4.2 million American centenarians, and a low estimate of 265,000.\p
The study, funded by the National Institute on Aging (NIA) at the National Institutes of Health, had some problems with information on the true ages of people 95 and older, though the report indicates that data are becoming more accurate with improvements in birth records.\p
Men are still lagging behind women: even as death rates in the highest age groups slow, men have shown smaller gains, so that four out of every five US centenarians alive today are women. This pattern is common in the rest of the developed world, although the US appears to have the world's highest proportion of centenarians among people aged 85 and older, at 120 per 10,000 of population.\p
The US centenarians will be studied closely over the years to come. In 1990, fewer than half of them had at least a high-school education, and the impact of educational attainment, a major determinant of health status, will be closely observed as the younger group, or cohort, moves toward very advanced age.\p
The report's data, in a document called \ICentenarians in the United States,\i P23-199, is available on the \JInternet\j at http://www.census.gov/prod/99pubs/p23-199.pdf\p
\BKey name:\b Victoria A. Velkoff.\p
#
"Why people die in English hospitals",1106,0,0,0
(Jun '99)
A report in the \IBritish Medical Journal\i in early June offered two predictors of death rates found in hospitals: the number of \Jhospital\j doctors per bed and the number of general practitioners per head of population in the localities from which \Jhospital\j admissions were drawn.\p
In a study of death rates in 183 NHS hospitals in England, Professor Brian Jarman showed that the death rates ranged from 3.4% to 13.6%, and that the most powerful predictor of death rates was the percentage of emergency admissions. (In this study 60% of admissions were considered emergencies.)\p
Once this variable was controlled, and adjustments had been made for age, sex, and primary diagnoses, doctor numbers showed up as the key to lowering death rates. England, they point out, has one of the lowest numbers of physicians per head of population of all the member countries of the Organization for Economic Cooperation and Development (OECD). The ratio of doctors to patients in England in 1994 was only 59% of the OECD average, at 1.6 per 1000 patients, as opposed to 2.7 per 1000 overall.\p
Jarman and his colleagues stress the need for care in interpreting \Jhospital\j mortality data, but in careful scientific language, they calculate from their findings that if the numbers of \Jhospital\j doctors and general practitioners in England were increased, \Jhospital\j mortality rates might fall.\p
And just to play it safe, they add that this is the first such study, and that their findings need to be validated by further investigations. \p
#
"Osteoporosis drug reduces the risk of breast cancer",1107,0,0,0
(Jun '99)
A mid-June report in the \IJournal of the American Medical Association\i (JAMA) says that a novel \Josteoporosis\j prevention drug, called raloxifene, has reduced the risk of invasive breast cancer in postmenopausal women suffering from \Josteoporosis\j by 76% after 40 months of treatment.\p
Researchers from the University of \JCalifornia\j, San Francisco, used raloxifene, a drug approved by the US FDA in 1997 for the prevention of \Josteoporosis\j.\p
The overall result was a combination of a 90% reduction in the risk of estrogen-receptor positive breast cancer (ER+), and a smaller reduction in the risk of estrogen-receptor negative (ER-) breast cancer among study participants who received raloxifene. The ER+ cancers are responsible for about 75% of all cases of breast cancer in women like those involved in this study. The ER+ breast cancer tumor cells contain a protein which acts as a receptor, responding to estrogen. This stimulates tissue growth in a woman's breasts, increasing her risk of breast cancer.\p
The study looked at 7,705 postmenopausal women in 180 centers, with a mean age of 66.5 years. All had \Josteoporosis\j, and were at low to average risk of developing breast cancer. They were randomized to receive either 60 milligrams (mg) or 120 mg of raloxifene per \Jday\j, or a matching \Jplacebo\j. Women with a history of breast cancer or those who were taking estrogen were excluded from the study. Among these women there were 54 cases of breast cancer confirmed, 40 of which were classified as invasive. Of these 40 cases, 27 were found in the 2,576 women receiving \Jplacebo\j treatment, while the larger group of 5,129 women on a raloxifene treatment had just 13 invasive breast cancers.\p
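The headline figure of 76% can be recovered from the raw counts given above - a back-of-envelope check in Python, not the authors' statistical analysis:\p
# Invasive breast cancer counts and group sizes, as quoted above.
placebo_rate = 27 / 2576        # roughly 1.05% on placebo
raloxifene_rate = 13 / 5129     # roughly 0.25% on either raloxifene dose
reduction = 1 - raloxifene_rate / placebo_rate
print(f"relative risk reduction: {reduction:.0%}")   # ~76%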
Raloxifene is one of a class of drugs called selective estrogen receptor modulators (SERMs). These work by mimicking the effects of estrogen in some parts of the body. For example, the SERMs mimic the positive effects of estrogen on a woman's bones and \Jcholesterol\j levels, but they inhibit estrogen's "bad" effects on a woman's risk of breast cancer.\p
Other SERMs with a similar effect include tamoxifen, which also reduces the risk of breast cancer among postmenopausal women, but at the cost of apparently increasing the number of cases of uterine cancer. Raloxifene has its own side effects, mainly related to blood clots (thromboses), but this risk can be reduced by ensuring that women prone to thrombosis do not take the drug.\p
There was also a slight increase in the number of newly reported cases of diabetes, and of existing cases getting worse. The study was mainly set up to find the effect of raloxifene on women's bone mineral density and vertebral fracture risk. The incidence of breast cancer among study participants was evaluated as a secondary endpoint of the trial.\p
\BKey names:\b Steven R. Cummings with a large number of co-workers. The drug is marketed as Evista by Eli Lilly and Co.\p
#
"WHO warns of microbial threats",1108,0,0,0
(Jun '99)
Dr Gro Harlem Brundtland, Director-General of the World Health Organization (WHO), warned in mid-June that the world had dangerously underestimated the threat that \Jbacteria\j and viruses posed to national security and economic growth. The world, she said, may soon miss its opportunity to protect people from this risk. \p
One in every two deaths among children and young working-age adults worldwide is caused by just six infectious diseases: AIDS, \Jmalaria\j, tuberculosis (TB), \Jmeasles\j, diarrheal diseases, and acute respiratory infections such as \Jpneumonia\j. Together, these diseases killed 11 million people in 1998, accounting for almost 90% of the deaths from infectious diseases.\p
Just three of the diseases - \Jmalaria\j, TB, and most recently AIDS - have killed six times as many people in the past 50 years as all wars over the same period, counting both military and civilian casualties.\p
The tools to prevent deaths from these diseases cost as little as $20 for each person at risk, and three of the diseases can be prevented for an expenditure of just 35 cents per person at risk. As drug resistance, the emergence of new diseases, and increased travel make control efforts more difficult, the risk of a world epidemic is increasing at a frightening rate.\p
TB medicines are no longer effective in up to 20% of patients in some parts of the world. Two leading anti-\Jmalaria\j medicines have become ineffective in many Asian countries and a third is effective in only half of \Jmalaria\j patients.\p
The indirect effects are almost as bad, because while good health is often a result of economic development and improved living standards, the opposite is also true. To put it simply, when infectious diseases are controlled, a major barrier to economic growth is removed. And the corollary: if disease is not controlled in the developing world, it will eventually spill over into the "developed" world. Dr Brundtland comments "How can anyone - families or communities - reach their economic potential with this burden? Economic development goes hand in hand with good health."\p
The threat demands greater political support; increased financial support for control, surveillance, and research activities; and the adoption of better health and management policies. Dr Brundtland concluded her plea by asking "How will history refer to us if we fail to control infectious diseases at the beginning of the new millennium?"\p
#
"A decline in city TB rates",1109,0,0,0
(Jun '99)
Mid-June saw a report on tuberculosis (TB) which shows that better control is possible. The \IAnnals of Internal Medicine\i carried an account of a study undertaken by researchers from the University of \JCalifornia\j, San Francisco. According to the report, the rates of tuberculosis cases overall and of cases due to recently acquired tuberculosis infection in San Francisco have declined significantly in recent years. This decline appears to be due to the effects of more intensive control measures.\p
The study used molecular epidemiology, which combines DNA fingerprinting techniques and conventional epidemiology. Disease was tracked in a defined population, and the data collected were used to assess the effect of interventions designed to halt tuberculosis. The study covered the period 1991-7, and examined the rate of TB cases overall within the City and County of San Francisco as well as rates within high-risk groups, such as persons infected with \JHIV\j.\p
Annual TB rates decreased from a high of 51 per 100,000 persons in 1992 to 30 per 100,000 in 1997. The incidence of clustered cases, where victims have identical strains of the TB \Jbacteria\j, decreased from 10 per 100,000 in 1991 to 4 per 100,000 in 1997. \p
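Expressed as percentage falls, the clustered cases - the marker of recent transmission - dropped faster than the overall rate, which is what a transmission-focused campaign would hope to see (a quick Python check on the figures above):\p
overall_drop = (51 - 30) / 51     # overall rate, 1992 to 1997: ~41% fall
clustered_drop = (10 - 4) / 10    # clustered (recent) cases: 60% fall
print(f"overall: {overall_drop:.0%}, clustered: {clustered_drop:.0%}")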
Authorities believe that 10-15 million people in the US are infected with TB \Jbacteria\j and that around 10% of these infected individuals will develop active TB at some time in their lives. Clustered cases indicate the amount of transmission that is occurring within the population, while non-clustered cases provide an index of TB resulting from reactivation of latent infection, in which a person was infected many years earlier.\p
The early 1990s showed high levels of transmission in San Francisco, so control measures were focused on preventing transmission and on the use of effective preventive therapy. The measures included improved communication between TB control investigators and populations at risk, such as the homeless and substance abusers, as well as expanded use of directly observed therapy, in which health care workers supervise the care of TB patients, and development of an \JHIV\j-related TB prevention program. Other measures introduced improved screening among persons in residential care facilities, jails, and homeless shelters, and better \Jhospital\j infection control measures. \p
In the study period, 2051 cases of TB were diagnosed in San Francisco. Complete DNA fingerprinting data were obtained in about 1,500 cases, and two instances of the same strain within a one-year period were taken as establishing a cluster.\p
\BKey name:\b Robert M. Jasmer (who worked with a large team, supported by grants from the National Institutes of Health and the American Lung Association).\p
#
"Taking the scenic route to Io",1110,0,0,0
(Jun '99)
The \JGalileo\j \Jspacecraft\j zoomed by Callisto on June 30 in the second of four encounters designed to bring the \Jspacecraft\j closer to Jupiter's volcanic moon Io. The craft came within 1,047 km of Callisto, modifying its \Jorbit\j to bring it closer to Io, the only body in the solar system, other than \JEarth\j, known to have active volcanoes.\p
Io is literally bursting with volcanoes. Some of them blast sulfur plumes 300 km above the moon's surface, and one, Prometheus, has probably been active for the past 18 years. In late 1999, \JGalileo\j is scheduled to make two daring close approaches to Io, possibly flying through a volcanic plume. The \Jspacecraft\j flew within 127,000 km of Io in early July, and we will report on what was found next month. \JGalileo\j was within just 900 km of Io in December 1995, but the \Jspacecraft\j was not taking pictures at that time, so the early July encounter may provide some of the best pictures yet of Io's volcanoes.\p
That flight also took the \JGalileo\j craft through the outer edge of the Io torus, a gigantic ring of ionized gas circling Jupiter, formed from sulfurous material ejected by Io's volcanoes. Matching the diameter of Io's \Jorbit\j, the torus spans 844,000 km and has an important influence on Jupiter's magnetic environment.\p
As Io moves along its \Jorbit\j and through this magnetized plasma torus, a huge electrical current flows between Io and Jupiter. Carrying about 2 trillion watts of power, it is the biggest DC electrical circuit in the solar system. \JGalileo\j passed through the torus in 1995, but with a better understanding of how the torus operates, scientists were hoping to gather a great deal more useful information from this pass.\p
Callisto, with a diameter of 4,800 km, is nearly the same size as the \Jplanet\j Mercury, and is notable for its icy surface. Callisto's surface is the most heavily cratered place in the solar system. All of the close passes of Callisto will involve careful study of the "disaggregation" of the smaller craters, less than 1 km across. "Disaggregation" involves partial obliteration by an unknown process, and while it is understandable on \JEarth\j or Mars where there is weathering and erosion, the cause on Callisto remains a mystery.\p
Instruments will be used to shed some light on this, but also to study Callisto's thin \Jatmosphere\j and any particles that may have been thrown up by impact events. Perhaps more importantly, \JGalileo\j magnetometer data released in 1998 suggest that Callisto, like Europa, may have an underground ocean. Callisto's magnetic field fluctuates in time with Jupiter's rotation. The best explanation for Callisto's peculiar magnetism seems to be an underground layer of melted ice. If the liquid is salty like \JEarth\j's oceans, it can carry sufficient electrical currents (induced by Jupiter's powerful rotating magnetic field) to produce a fluctuating magnetic field around Callisto.\p
While \Jwater\j and \Jenergy\j seem to be the minimum requirements for life, there may not be enough \Jenergy\j on Callisto, where the ocean is probably heated only by radioactive elements; Europa has tidal \Jenergy\j as well. That makes Europa a better prospect for life, according to NASA scientists.\p
The next two flybys of Callisto are scheduled for August 14 and September 16, 1999.\p
#
"Radar provides first 3-D views of moon's poles",1111,0,0,0
(Jun '99)
In late July, the orbiting Lunar Prospector \Jspacecraft\j may, depending on a NASA decision, crash into the lunar south pole. The controlled, high-speed dive into a massive crater, 50 kilometers (31 miles) across and 2.5 kilometers (1.5 miles) deep, is planned as the most direct test yet of the existence of \Jwater\j on the moon.\p
But will there be any \Jwater\j? Going on images published in \IScience\i in early June, the chances have just increased considerably. A remarkable series of radar images were published, showing those parts of south pole craters which are permanently shaded from the \Jsun\j. One of Jean-Luc Margot's images is reproduced here with permission, but the white areas are not snow - just those parts of several craters where ice might be expected to survive.\p
The images were created by Margot, working with JPL researchers, using the radar antennas of NASA's Deep Space Network at Goldstone, \JCalifornia\j. This south pole image reveals a number of craters that are in permanent shadow from the \Jsun\j and which are potential havens for \Jwater\j ice. The images were produced through a technique called radar interferometry.\p
\BPicture:\b \IJ.L. Margot/Cornell University\i\p
Ice has a definite "signature" when probed with radar, and data published in 1996 from the lunar orbiter Clementine suggested that there was ice at the south pole, though this was later discounted (see \BNo ice on the moon after all,\b June 1997).\p
The problem is that the Clementine radar, like a more recent radar study made from Arecibo, would only detect large slabs of ice, so ice buried in the lunar dust would be completely missed. In 1998, the neutron spectrometer aboard the Lunar Prospector orbiter detected significant deposits of \Jhydrogen\j at the moon's north and south \Jpoles\j (see \BBack to the moon again,\b February 1998), which may, or may not, indicate the presence of \Jwater\j ice.\p
The \Jsun\j's limb only rises 2 degrees above the horizon at the moon's south pole, while Goldstone's radar beam can angle as steeply as 6 or 7 degrees above the horizon, allowing it to peer into the craters which remain permanently in the solar shadow. Now all that is needed is the go-ahead from NASA to crash Lunar Prospector into one of these craters.\p
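The geometry is simple enough to sketch. Treating the crater as a flat-floored bowl of depth d, a beam arriving at elevation angle theta clears the rim but misses the first d/tan(theta) kilometers of floor behind it. A toy Python model, using only the crater dimensions quoted above (not the researchers' actual calculation):\p
import math

# Crater depth and diameter in km, as quoted above (toy flat-floor model).
depth, width = 2.5, 50.0

def hidden(theta_deg):
    """Floor distance behind the rim that a beam at this angle cannot reach."""
    return depth / math.tan(math.radians(theta_deg))

# Sunlight at 2 degrees is blocked for ~72 km - more than the crater's
# 50 km width, so the floor stays permanently dark.
print(f"sun at 2 deg: first {hidden(2):.0f} km of floor unlit")
# Radar at 7 degrees misses only the first ~20 km, so most of the
# floor can be mapped.
print(f"radar at 7 deg: first {hidden(7):.0f} km of floor hidden")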
\BLunar Prospector web site:\b http://observe.ivv.nasa.gov/nasa/space/prospector/lunar1.html\p
#
"Rats operate robotic arm via brain activity",1112,0,0,0
(Jun '99)
Articles in \INature Neuroscience,\i like science articles generally, are not often the basis of news stories on \Jtelevision\j - not unless there is something negative to report. Regiments of scientists and science communicators have been saying this for half a century or more. Late June provided a welcome exception, with the news that the brain signals of rats can be used to control a robotic arm. Once, this would have been the stuff of science fiction, but now it is a reality, and one that had the print and electronic media paying eager attention.\p
The obvious application for such a development is to allow paraplegics and quadriplegics to take control of their equipment and that part of their lives which is now out of reach, but that is still a few years away. At least it is closer than a full-strength human-machine interface, but that aspect was also raised in many of the news reports. Still, at the very least, it indicates a better chance that people may, in the future, be able to control a prosthetic device as they would their own biological limbs.\p
The system records the rats' brain signals through implanted electrode arrays, so that the signals can be used to control a robotic arm without any actual muscle movement. The rats were first trained to move a robotic arm by pressing a lever to receive a reward. While they were pressing levers, the researchers used the electrode arrays, implanted in the rats' brains, to record the simultaneous activity of dozens of neurons in the areas that control muscle movement.\p
Once they had identified which neurons fired as the rats worked the lever, the researchers could, in their words, program the movement into the brain. They then switched control of the reward from the lever to the implanted electrodes, and the rats quickly learned to move the robotic arm to receive the reward solely through brain activity, without actually moving their muscles.\p
\BKey names:\b John Chapin, Miguel Nicolelis, Karen Moxon, and Ronald Markowitz.\p
#
"Getting the Antarctic Treaty online",1113,0,0,0
(Jun '99)
When they signed the Antarctic Treaty in 1959, the twelve original signatory nations - Argentina, \JAustralia\j, \JBelgium\j, Great Britain, \JChile\j, \JFrance\j, \JJapan\j, New Zealand, \JNorway\j, South \JAfrica\j, the United States, and the Union of Soviet Socialist Republics - all agreed to manage the entire Antarctic continent for "peaceful purposes only" based on international cooperation. For the last four decades, the Antarctic Treaty System, which now includes 43 nations, has created innovative strategies for resolving issues from commercial fisheries and conservation, to tourism and environmental protection.\p
Previously, most of the documents covering the treaty have been available only in paper form, but now they can be found on the World Wide Web thanks to the efforts of an Ohio State University researcher. The \Jdatabase\j can be accessed at http://webhost.nvi.net/aspire/ and is now also available on \JCD\j-ROM.\p
Paul Arthur Berkman, an \Jearth\j scientist at Ohio State University's Byrd Polar Research Center, has been using the Antarctic Treaty searchable \Jdatabase\j in an interdisciplinary course on Antarctic science and policy, and arranged for the creation of the electronic form as an alternative to the large and unwieldy paper form. Driven by a fast and powerful search engine, the \Jdatabase\j gives users rapid access to the full range of treaty documents.\p
Search results can be presented chronologically or by topic on a single screen, and are already being used by policy-makers, as well as by students. Future plans may involve adding to the \Jdatabase\j, so that it becomes a learning tool for those wishing to study the Antarctic in a more general way.\p
In fairness, we have to note that when we tested the \Jdatabase\j, just before publication, a number of the parts seemed to be "down."\p
\BKey names:\b Paul Arthur Berkman and George J. Maynard.\p
#
"Giving a shake to the family tree",1114,0,0,0
(Jun '99)
Biologists can always be persuaded to argue about the best way to place animals into a family tree. Some prefer to use body shapes and \Jfossil\j evidence, while others will tell you that most fossils have no descendants anyhow, and that the chemicals you find in blood and other cells are descended from ancestors that survived and reproduced. In the end, all biologists are forced to join either the "morphology" camp, or the "molecular evidence" camp.\p
But even if you feel that present-\Jday\j anatomy is the key issue, which part of the anatomy do you look at? An Australian koala has an arrangement of its teeth very like that of an Australian possum, but the female koala, being a marsupial, has a pouch for its young, and that pouch points backwards, like the pouch of a wombat.\p
A backward-facing pouch may make sense for a wombat, which digs burrows, but it makes little sense in a tree-climbing koala. So, say the molecular evidence camp, is the koala a wombat-up-a-tree, or is it a fat possum? As far as they are concerned, there is no problem: the blood sera say that the koala and the wombat are close relatives, while the possum is but a distant cousin.\p
The problem with morphology is the assumption that a feature is unlikely to evolve independently on more than one line, so that a shared characteristic ought to indicate a relationship. But a feature like the arrangement of the teeth has only a few directions in which it can vary, and so can easily arise more than once; the "dental formula," the pattern of teeth in the skull, is probably less telling than the arrangement of the pouch. So why do biologists worry about teeth so much?\p
The answer lies in the fossils, where we can never see soft parts such as the pouches, but we can often see the teeth, and how they lie in the skull or the jaw. And this is where the molecular evidence camp likes to point out that most of the fossils have no descendants, while all of the chemicals have ancestors . . .\p
Your reporter is very much in the molecular evidence camp, and that may need to be kept in mind during the discussion which now follows. During June, a report in \INature\i suggested that we now have enough evidence to change the shape of the main family tree of the animals, ourselves included. A team from \JFrance\j, Britain, the USA, and \JRussia\j say that the tree of life has just three main limbs.\p
That is to say, the animal kingdom divides into three primary lines of descent that first diverged from a common ancestor at least 540 million years ago, and which gave rise to most animals (other than the \Jjellyfish\j and sponges) which are still living today.\p
The evidence comes from examining the Hox genes in three distinct kinds of animals: a priapulid, an unsegmented marine worm related to insects; an unusual and ancient marine brachiopod (shellfish) called a lamp shell; and a polychaete, a segmented worm related to earthworms and leeches. The Hox genes are a generalized toolbox, operating in the development of animals, helping to organize cells into the different body parts and determine such things as number and placement of legs, wings, and other appendages.\p
The information collected was then compared with earlier data from mice, fruit flies, leeches, and sea urchins, to show that the animal kingdom divides neatly into three evolutionary lines. If the same Hox gene is found in two animals, no matter how different they appear to be, it is likely that they once had a common ancestor, which also possessed that gene.\p
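The underlying logic is close to simple set comparison: animals sharing more of the same Hox genes are grouped together. A toy sketch in Python, with gene sets invented for illustration (not the study's actual data), shows the idea:\p
# Each animal is represented by the set of Hox genes found in it.
# These gene lists are invented placeholders, not the real survey data.
hox = {
    "priapulid":  {"lab", "pb", "Dfd", "Ubx", "AbdB"},
    "fruit fly":  {"lab", "pb", "Dfd", "Ubx", "AbdB"},
    "polychaete": {"lab", "pb", "Dfd", "Lox2", "Post1"},
    "lamp shell": {"lab", "pb", "Dfd", "Lox2", "Post1"},
}

def shared(a, b):
    """Count the Hox genes two animals have in common."""
    return len(hox[a] & hox[b])

print(shared("priapulid", "fruit fly"))    # 5 - both on the molting branch
print(shared("priapulid", "polychaete"))   # 3 - only the ancient common core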
Now, animals with backbones are on the same family tree as the starfish and their relatives - a relationship which has long been argued on the evidence of embryology and development. The second branch is more revolutionary, holding all the animals which molt, such as crustaceans, insects, roundworms, and priapulids. The third branch holds brachiopods, earthworms, polychaetes, mollusks, and flatworms, each of which either has a feathery feeding structure or a special larval stage.\p
This tree can mainly be applied to animals with bilateral symmetry. The data on the "roundish" animals, the ones with radial symmetry like \Jjellyfish\j and sponges, are still incomplete, but it seems a reasonable bet that they will form at least one more branch of the tree (see \BFlatworms, symmetry and us,\b March 1999).\p
The findings tell us that the pre-Cambrian ancestor must have had more Hox genes than we might have expected and that it was a fairly sophisticated animal. We may have no fossils of this animal, but now, at least, we know something about its genes.\p
\BKey names:\b Jennifer K. Grenier and Sean B. Carroll. \p
#
"Dinosaurs and evolution",1115,0,0,0
(Jun '99)
A special issue of \IScience\i in late June was devoted to \Jevolution\j. According to paleontologist Paul Sereno, \Jdinosaur\j fossils offer the answers to some of scientists' biggest questions about \Jevolution\j. Sereno suggests they answer questions such as: How does an upstart group of species beat the dominant group? How do organisms develop nifty new tricks like flying?\p
Throughout the Mesozoic era (248 to 65 million years ago), all of the land animals more than one meter long were dinosaurs. Once they arose, the \Jdinosaur\j species radiated fast, filling all the niches, just as spectacularly as the mammals did in the Cenozoic era which followed, and continues today.\p
So how did the dinosaurs make it to the top? Did they take advantage of some mass \Jextinction\j event to move in on vacant territory, or did they gradually push other, less well-adapted animals out of the picture? It looks as though they were opportunists: the dinosaurs' dominance did not begin until at least 15 million years \Iafter\i the first dinosaurs appeared on \JEarth\j, but it did coincide with a mass \Jextinction\j that killed off many other reptiles. So it now looks like both the dinosaurs and the large mammals that followed the dinosaurs' \Jextinction\j came to dominate their world essentially by accident - not as the inevitable outcome of natural selection.\p
According to standard wisdom, the dinosaurs did not disappear completely, because some of them evolved into birds. Several spectacular finds in recent years have shown scientists just how effectively dinosaurs recycled features that had originally served other purposes. Then, once the first avian dinosaurs came into being, they quickly evolved in ways that helped them fly even better - shrinking several times in body size, for example.\p
The dinosaurs reached the height of their diversity during the breakup of the supercontinent \JPangaea\j in the early Cretaceous period (about 150 to 100 million years ago), and it now appears that this diversity arose as the result of a patchwork of regional extinctions of certain species and replacements by others.\p
#
"A new bird with the oldest beak",1116,0,0,0
(Jun '99)
The family tree of the birds took a bit of a shake during June. A newly discovered fossilized bird from China offers new evidence that early bird \Jevolution\j was considerably more complex than previously believed, according to a \INature\i report. The almost-complete skeleton dates back 130 million years, and has a horny beak. The beak is pointed and turned up at the tip, very much like the \Jcartoon\j bird Woody Woodpecker, according to one of the discoverers, Alan Feduccia.\p
The presence of a beak at this time throws doubt on the popular view that the birds descended from dinosaurs, suggesting that they arose instead from unknown earlier \Jreptile\j ancestors. The oldest known bird, \IArchaeopteryx,\i which dates to 150 million years ago, had no beak, but it did possess a very reptilian jaw with teeth. \p
The researchers have named the new species \IConfuciusornis dui\i in honor of Wenya Du, who collected the specimen near the edge of a lake in northeast China's Liaoning province and donated it to the \JBeijing\j institute. It is a smaller but close relative of \IConfuciusornis sanctus,\i another crow-like bird of the same age the researchers found and reported in 1995.\p
The bird shows signs of being closer to the modern bird form in that it had a beak, yet at the same time it was less advanced than the older \IArchaeopteryx\i because it had two small openings in the rear of its skull, rather like those in the \Jreptile\j ancestors of the birds. In other words, the \Jevolution\j of the birds was no simple linear advance: rather, it was a messy tree of a thing, where most of the branches and twigs died off.\p
It is highly probable that neither \IConfuciusornis\i nor \IArchaeopteryx\i was an ancestor of modern birds. Instead, they were most likely just a couple of the discarded twigs. \p
In 1979 Feduccia made international news by showing that the oldest known bird, \IArchaeopteryx,\i could fly because its wing feathers were asymmetric. Barbs on one side of its wing feather quills clearly grew longer than barbs on the other side, and this is only found in flying birds. The new bird, \IC. dui,\i also grew asymmetric wing feathers. The bird also had highly curved foot claws and reversed big toes, showing clearly that it was a tree-dwelling creature.\p
It appears to have lived in social groups, and the males appear to have had long tail-plumes, indicating that the sexes differed significantly from each other. According to Feduccia, this supports the picture of a tree-dwelling bird, rather than an \Jearth\j-bound, feathered \Jdinosaur\j, as the advocates of a \Jdinosaur\j origin of birds have argued.\p
As well, the half-moon-shaped bone in the wrist that has been used to support a dinosaurian ancestry for birds is the same bone in \IConfuciusornis\i and \IArchaeopteryx\i as in modern birds, but a different bone in dinosaurs, according to another of the authors, Larry Martin.\p
\BKey names:\b Alan Feduccia, Lianhai Hou, Fucheng Zhang, Larry D. Martin, and Zhonghe Zhou.\p
#
"Clearing the water fern menace in South Africa",1117,0,0,0
(Jun '99)
Southern African lakes have five serious \Jwater\j weed problems, including the \Jwater\j hyacinth which has caused "choking" problems on Lake Victoria, near the Equator. Further south, the biggest problem for farmers has been the Red \JWater\j Fern, also known by its scientific name, \IAzolla filiculoides.\i This also causes "choking," by blocking off sunlight from the waters below, while the thick covering of green leaves looks like a rich grass meadow, luring sheep and other stock animals to feed. Instead, they drown when the thick carpet proves to offer no support to hooves.\p
The answer, according to a team of scientists from the University of the Witwatersrand, \JJohannesburg\j, and the Plant Protection Institute in \JPretoria\j, may be found in a voracious weevil, \IStenopelmus rufinasus,\i which is no more than 2 mm long. This tiny member of the beetle family breeds on the \Jwater\j fern, feeds on it, and is capable of completely clearing lakes and dams without hurting any other plant or animal life!\p
The surface weeds all cause a problem by forming a light-proof barrier, leading to a loss of oxygen which in turn spells death to many other living creatures in the infested \Jwater\j. There has been a near \Jextinction\j, for instance, of a fish species, the Eastern Cape Rocky, \ISandelia bainsii.\i\p
The \Jwater\j from Azolla-infested lakes changes color, clogs irrigation pipes, and smells and tastes bad, according to Andrew McConnachie, one of the Witwatersrand scientists involved. The clogging is a serious problem when it affects drinking \Jwater\j, as it causes pumps to burn out.\p
The weevil was imported from \JFlorida\j, USA in 1995, and 17,500 insects have been released onto the \Jwater\j fern since December 1997. The scientists report that 24 of their 66 sites, or 36%, have been cleared so far, and they estimate that it takes about seven months for a site to be cleared.\p
The weevil has shown an ability to thrive under South African conditions, including the ability to survive very cold winters, and so far, it has had no problems with predators - next month, we bring you the story of a mite, introduced to control gorse in the western United States, which has not been so successful.\p
\BKey names:\b Andrew McConnachie, Martin Hill, and Marcus Byrne.\p
#
"World's smallest deer species discovered",1118,0,0,0
(Jun '99)
A tiny adult deer, measuring just 50 cm (20 inches) at the shoulder and weighing 11 kg (25 pounds), has been confirmed through DNA testing as a new species - the world's smallest deer. The find was announced on the \JInternet\j during June by the Bronx Zoo-based Wildlife Conservation Society (WCS), which led the study, and the details were also published in the journal \IAnimal Conservation.\i\p
The "leaf deer" or "leaf muntjac," lives in remote mountain regions of Southeast Asia, and it was first seen by WCS biologist Alan Rabinowitz in 1997 during field surveys in northern \JMyanmar\j (\JBurma\j). Rabinowitz obtained specimens from local hunters and took them back to New York, where DNA sequencing showed that the animal was clearly distinct from any relatives.\p
Several new large mammals have been discovered in Southeast Asia in recent years, particularly in the Annamite Mountains of \JCambodia\j and Laos. \JMyanmar\j, however, remained virtually unstudied by western science for decades, until WCS began surveys in this isolated nation in 1994.\p
With the area's wild habitats fast disappearing, researchers have been quick to underline the need for even more intense studies of the unique wildlife living in some of the world's most remote areas.\p
#
"Alaska's Columbia glacier getting faster",1119,0,0,0
(Jun '99)
The fastest moving \Jglacier\j in the world, the Columbia \JGlacier\j in \JAlaska\j, got even faster during June. In fact, the \Jglacier\j has increased its speed from 25 meters to 35 meters per \Jday\j in recent months, according to Tad Pfeffer, a University of \JColorado\j at Boulder glaciologist.\p
The \Jglacier\j is near \JAnchorage\j, and calves both large and small icebergs into Prince William Sound. It has been under intense study since the 1970s, when the US Geological Survey determined that a marked increase in its calving rate might pose a risk to shipping lanes in the sound, best known to the outside world as the site of the devastating 1989 Exxon Valdez oil spill.\p
Since 1982, the Columbia \JGlacier\j has retreated about 11 kilometers, and it is now 55 kilometers in length, 5 kilometers wide, and more than 900 meters thick in some places. As a tidewater \Jglacier\j, its terminus, or end, continues to rest in several hundred meters of seawater. The Columbia is the last of the 51 tidewater glaciers in \JAlaska\j to make a drastic retreat.\p
Many of the world's glaciers outside of \JGreenland\j and \JAntarctica\j have shrunk to half their former size during this century, an effect often attributed to global warming. Tidewater glaciers seem not to respond to climate change as directly as other glaciers, but when one of them thins to a critical level, it appears to pass a point of no return, according to Pfeffer.\p
Future changes may involve the Columbia \JGlacier\j front retreating up to four kilometers over the next year, or it may begin to flow much faster into Prince William Sound, perhaps as much as 50 meters a \Jday\j. If it follows the pattern of other tidewater glaciers in \JAlaska\j, it will eventually retreat back to where its channel rises above sea level, said Pfeffer. If the speed increases it would mean more icebergs to threaten shipping.\p
#
"Contrails and climate",1120,0,0,0
(Jun '99)
A paper to be published in \IGeophysical Research Letters\i on July 1, but issued in preprint form late in June, suggests that by 2050, increased flights by jet \Jaircraft\j will have a significant impact on global climate through the greater number of contrails they will produce.\p
Contrails are ice clouds with radiative effects similar to those of cirrus clouds. They have a short life when they form in dry air, but may persist for hours in moist air, or even form new cirrus clouds, indistinguishable from the natural clouds.\p
Contrails produce a small warming of the \JEarth\j's \Jatmosphere\j, although their impact is currently small compared with other greenhouse effects. Over the next 50 years, the effect is likely to increase by a factor of six. Contrails in 1992 added an estimated 0.02 watts of warming per square meter globally, about 1% of the total of human-generated greenhouse effects.\p
Contrails are not evenly distributed over the globe. They are concentrated over parts of the United States and Europe, where local warming reaches up to 0.7 watts per square meter, some 35 times the global average. The authors estimate that the resulting \Jtemperature\j increase is likely to reach between 0.01 and 0.1 degrees \JCelsius\j (0.02 and 0.2 degrees Fahrenheit), just on current air traffic. With air passenger-kilometers increasing by 7.1% annually between 1994 and 1997, the effect seems likely only to become greater.\p
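Two of the quoted numbers are easy to sanity-check (a back-of-envelope sketch, not the paper's model):\p
import math

# Local peak forcing versus the global mean, figures as quoted above.
print(f"{0.7 / 0.02:.0f}x the global average")   # 35x

# At 7.1% annual growth, traffic alone would grow sixfold in about
# 26 years; a factor of six by ~2050 therefore implies the authors
# expect efficiency and other factors to slow the effective growth.
print(f"{math.log(6) / math.log(1.071):.0f} years to a sixfold increase")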
The paper is very much an early warning, setting some of the parameters by little better than an order of \Jmagnitude\j, although the authors have tried to make their estimates conservative. Contrails, they say, cover somewhere between 0.5% and 2% of the sky over Europe and North America on average, and they estimate a global average of coverage by linear contrails at around 0.1%. The peak localized maxima are about 3.8% cover over northern Europe, and 5.5% over the eastern United States.\p
While this may be too high an estimate, the effect is likely to be a significant one during the next century, especially if other warming effects bring about an increase in \Jhumidity\j.\p
\BKey names:\b Patrick Minnis, Ulrich Schumann, David R. Doelling, Klaus M. Gierens, and David W. Fahey.\p
#
"Cassini mission going well",1121,0,0,0
(Jun '99)
On June 24, on the 617th \Jday\j (see \BCassini on its way,\b October, 1997) of its voyage to Saturn, the Cassini \Jspacecraft\j successfully completed its second flyby of the \Jplanet\j Venus, once again on time and on target. Cassini came within 600 kilometers (about 370 miles) of the \Jplanet\j, as planned, using Venus's gravity to give the \Jspacecraft\j a boost in speed to help it reach Saturn, more than 1 billion kilometers away. A total of four planetary flybys - two past Venus, and one each past \JEarth\j and Jupiter - will give Cassini the speed it needs to reach Saturn.\p
(At first glance, the laws of physics would seem to rule out any such gain, but when a \Jspacecraft\j comes up on the \Jplanet\j from behind, the \Jplanet\j is slowed down, ever so slightly, while the \Jspacecraft\j is given a significant acceleration. For more on this, see \1gravity assist\c.)\p
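To get a feel for the numbers, here is a toy two-dimensional sketch in Python. The velocities and deflection angle are invented for illustration - they are not Cassini's real trajectory figures - but the principle is the genuine one: in the planet's frame the flyby only rotates the spacecraft's velocity, yet adding the planet's own orbital velocity back in can leave the craft moving faster relative to the \Jsun\j.\p
import math

def rotate(v, angle_deg):
    """Rotate a 2-D velocity vector - all a flyby does in the planet's frame."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

speed = lambda v: math.hypot(v[0], v[1])

planet_v = (35.0, 0.0)    # planet's heliocentric velocity, km/s (invented)
craft_in = (25.0, 10.0)   # spacecraft's heliocentric velocity, km/s (invented)

# In the planet's frame, the encounter leaves the speed unchanged.
rel_in = (craft_in[0] - planet_v[0], craft_in[1] - planet_v[1])
rel_out = rotate(rel_in, -60)   # deflection for a pass behind the planet (assumed)

# Back in the sun's frame, the craft emerges faster: the planet has lost
# an immeasurably small amount of its own orbital momentum in exchange.
craft_out = (rel_out[0] + planet_v[0], rel_out[1] + planet_v[1])
print(f"{speed(craft_in):.1f} km/s in, {speed(craft_out):.1f} km/s out")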
Most of Cassini's scientific instruments were set to make observations during the Venus flyby. Scientific data from the near approach were being transmitted to \JEarth\j in late June and early July. Cassini will be back here in mid-August, passing a mere 1,166 kilometers (724 miles) above our \Jplanet\j, before being flung off to Jupiter for a December 30, 2000 flyby. The giant \Jplanet\j's gravity will swing Cassini around to put it on course for arrival into \Jorbit\j around Saturn on July 1, 2004.\p
\IPicture courtesy of JPL/NASA\i - see http://www.jpl.nasa.gov/cassini/ for more pictures, or see http://www.jpl.nasa.gov/basics/ for more information on planetary exploration.\p
#
"Gray wolf nears recovery in Yellowstone",1122,0,0,0
(Jun '99)
In late June it was announced that gray wolves, reintroduced into \JYellowstone\j National Park from Canada, are close to being taken off the endangered species list, as the 31 wolves introduced in 1995 have now grown to a population of around 120. The news came from Douglas Smith of \JYellowstone\j National Park in Wyoming at the June 1999 Society for Conservation \JBiology\j meeting.\p
Wolves have been savagely attacked for centuries, and were eliminated in the west in the 1920s and 1930s. Many ranchers still bitterly oppose bringing them back, but wolves play an important part in the \Jecosystem\j. Wolves have boosted \Jbiodiversity\j in and around \JYellowstone\j: for instance, there are fewer elk and coyotes, and more eagles, pronghorn, foxes, and wolverines.\p
Ranchers are now allowed to kill wolves which attack stock, but once the species is "delisted," they will be able to kill any wolf which enters their land. Delisting will happen when there are 10 breeding pairs in each of \JYellowstone\j, Idaho, and Montana. Right now, there are 10 breeding pairs in \JYellowstone\j, 11 in Idaho, and 6 in Montana.\p
#
"July, 1999 Science Review",1123,0,0,0
\JThe great genetic engineering battle\j
\JWhat are GM products?\j
\JWhat is 'genetic engineering'?\j
\JAdding new genes\j
\JThe Frankenstein fears - are they justified?\j
\JThe emotional arguments\j
\JFrankenstein revisited?\j
\JThe scientist who cried wolf\j
\JThe labeling situation\j
\JThe activists and the poplars\j
\JMonsanto and Roundup Ready\j
\JThe poisonous plants\j
\JThe real dangers of genetic manipulation\j
\JThe ethical arguments\j
\JIs it too late?\j
\JStirring the pot: the Doomsday weapons and GM\j
\JA new excuse to eat lobster, crab, and crayfish\j
\JPolio still lingers in Africa\j
\JSafety tests for four herbs\j
\JMelanoma 'vaccine'\j
\JSaving children from AIDS\j
\JScientific misconduct?\j
\JAsteroid water\j
\JAsteroid visits\j
\JAsteroid risk\j
\JLunar Prospector\j
\JThe collider that ate the Earth!\j
\JNew aspen for pulp and paper industry\j
\JEarthquake analysis\j
\JMites take a pasting\j
\JA quick cold snap\j
\JTour de France science\j
\JLife, the Universe, and everything else: mathematical truths\j
\JPopulation news\j
\JImplanted defibrillators safe enough around anti-theft systems\j
\JNew tick-borne pathogen\j
\JFirst complete physical map of a higher plant genome\j
#
"The great genetic engineering battle",1124,0,0,0
(Jul '99)
As we approach the end of the century in which genetics became a science, we see the geneticists, and those using the work of geneticists, under severe attack. In Europe, and to a lesser extent in Australia, there is a groundswell of public opinion demanding that GM foods, as the products of genetically manipulated crops are usually called, be banned immediately, or at least that the crops no longer be grown. As part of this emotive campaign, opponents are calling for all foods which contain GM food products to be labeled.\p
The print and electronic media have been less than even-handed, sticking to the principle that Chicken Little would never have got famous crying that the sky was not falling. This review article is intended to outline the scientific background, to explain the scientific principles, and to describe the methods used, both in altering the genomes of plants and animals, and also in arguing against the practice. The issue is a complex one, and needs to run on facts, not emotions.\p
Neither the opponents nor the proponents of genetic manipulation will be entirely happy, for neither side has a monopoly on the good arguments. The real problem with an uninformed debate is that it is not really a debate. Many of the potential dangers of genetic manipulation have not been considered properly by supporters or opponents, while many of the arguments against GM have been emotional and, from a scientific viewpoint, wrong or irrelevant. Slogans like "Frankenstein foods" and plaintive cries of "No genes in my food" are as worthless as the bland reassurances of those who stand to profit from widespread use of GM foods and products.\p
#
"What are GM products?",1125,0,0,0
(Jul '99)
GM foods are foods produced from plants or animals which have been genetically modified or manipulated or engineered. In most cases, this manipulation involves the addition of transgenes, genes taken from another organism, cleaned up, and then inserted into a new organism, where they give the organism receiving the genes some new power or ability. The power may be as simple as allowing a rose to have blue petals (not yet achieved), or as complex as causing a banana to produce a vaccine, or resist a viral disease which attacks its leaves.\p
Future plans include vaccine-producing potatoes, plants which fluoresce when they are stressed, warning farmers of drought or disease, oilseed rape which is able to produce biodegradable plastics, and even cotton which provides us with wrinkle-free fibers. For other planned or potential uses of GM methods, see \BGrowing blind\b June 1997, \BOral vaccine against botulism\b November 1997, \BMuscle from bone\b March 1998, \BEat up your greens\b April 1998, \BGenetic control of malaria sought\b October 1998, \BEight calf clones\b December 1998.\p
Typically, GM is used on the large scale to develop crops which resist disease, produce higher yields, or are easier to grow. In almost every case, the improvement is brought about by the addition of a gene which is owned and patented by a commercial company. A few GM crops have had a gene or genes "knocked out" to change the characteristics of the product, like poplars engineered to produce wood which is easier to turn into paper.\p
#
"What is 'genetic engineering'?",1126,0,0,0
(Jul '99)
Depending on the source you use, you can get remarkably different answers to this question. Most school text books, for example, have not been able to keep up with the rapid progress in this branch of science, and are likely to speak only of bacteria which have been "engineered" to carry and express the genes of other organisms. For example, a simple unicellular organism may be given the gene for human insulin, or growth hormone, or some other vital chemical. The cells can then be cultured in their millions, and processed to extract the valuable chemicals.\p
This is important when the only known source of a drug is biological, like \1Taxol\c, a potent anti-cancer drug extracted from the bark of yew trees. Given the huge number of trees required to make a small amount of the drug, it would be far easier, far cheaper, and far better for the environment if the molecule could either be made in the laboratory, or grown in a bacterium.\p
That simple use of "genetic engineering" was soon overtaken by the creation of transgenic plants and animals, organisms which \Icarry and express\i the genes of other organisms. As these techniques have come to the fore, the term "genetic engineering" has been replaced by "genetic manipulation," or the less emotive "genetic modification." It remains, however, the same set of practices, with regular upgrades in the available methods.\p
Another term which is now used less is "recombinant DNA technology," which has its origins in the discovery of conjugation and recombination in bacteria, where genes were swapped between bacterial cells in nature. Once transferred, the recombinant genes are recombined into the genome, the genetic material of the recipient. A second meaning of "recombinant DNA technology" is now represented by "gene therapy." This involves the insertion of "healthy" genes into the cells of people with a genetic illness, so they can be cured of their disease.\p
Many of the principles used across the whole field are common, but this review article looks mainly at the genetic modification of crop plants and animals. At the same time, it is important to keep in mind the ability of bacteria to transfer genes across the species barrier from one bacterium to another, if we are contemplating the genetic manipulation of bacterial genomes.\p
#
"Adding new genes",1127,0,0,0
(Jul '99)
In the mid-1980s, there was just one way of introducing genes into a plant, and that was to insert them into a Crown Gall bacterium, \IAgrobacterium tumefaciens,\i and use that to carry the genes across the wall of the plant cell. Now there is a whole range of tools available, both biological and technological.\p
The biological solutions depend on finding a living entity, something like the crown gall bacterium, but more efficient in transferring genes. Efficiency in this context has two parts: the capacity to carry genetic material, and the ability to persuade the gene to take up residence inside the target and express itself.\p
The first transgenic plants were created a very long time ago, in nature. We can be certain of this because \IAgrobacterium tumefaciens\i is a natural genetic engineer which lives in the soil. If a plant has been injured or damaged, the bacterium will invade the wound and influence the plant cells around the wound to multiply wildly, forming what gardeners would call a gall, and what, in an animal, we would call a tumor - the term now preferred. (Literally, a tumor is a mass of cells growing in an uncontrolled manner.)\p
The bacterium carries two packets of genetic material: the main chromosome, which has all the information it needs to grow and reproduce in the soil, and a much smaller piece of DNA called a tumor-inducing (Ti) plasmid. Under normal circumstances, the \Iagrobacterium\i does not produce any proteins from the Ti plasmid while it is living in the soil. Once it invades a plant, though, a section of the plasmid known as the T-DNA (transfer DNA) integrates into one of the plant's chromosomes, and only then becomes active.\p
The T-DNA codes for proteins which trigger the production of the plant growth hormones, auxins and cytokinins, setting off the development of a tumor. The tumor cells are then stimulated to produce chemicals called opines, which the bacteria can then use as a source of energy. Cleverly, they break down the opines with an enzyme which is coded for in a part of the Ti plasmid that is not transferred to the plant. And best of all, the transfer of the T-DNA is controlled by genes which are switched on by molecules released by the wounded plant cells.\p
But how does the plasmid know which genes in the T-DNA to transfer? That is the cutest bit of all: the T-DNA is marked out by short, nearly identical sequences of DNA, known as right and left border repeats. Any genetic material placed between these markers is tagged to be transferred. The recipe for the genetic engineer is clear: cut a gene out of another organism using restriction enzymes, tag it with right and left border repeats, and insert it into a plasmid - say it quickly enough, and it almost sounds easy.\p
First, though, you need to tweak the plasmid, change it to make it easier to insert new genes between the right and left border repeats, and take away its power to produce growth hormone, so it can no longer produce tumors. And finally (and most controversially), you add a gene for antibiotic resistance inside the border repeats. This means that, a few steps down the track, researchers can select the successfully transformed cells by culturing the plant embryo or seedling in a medium containing the antibiotic, so that the unchanged plantlets will be killed off.\p
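Reduced to its bare bones, the construct is just a stretch of DNA flanked by the two border repeats. A toy Python sketch makes the recipe concrete - every sequence below is an invented placeholder, not a real border repeat, marker, or gene:\p
# Toy model of a Ti-plasmid insert: anything placed between the left and
# right border repeats is tagged for transfer into the plant genome.
# All sequences here are invented placeholders, not real DNA.
LEFT_BORDER = "TGGCAGGATATATTCAATTG"
RIGHT_BORDER = "CAATTGTTAGGCAGGATATA"
MARKER = "ATGAAA"   # stands in for an antibiotic-resistance marker gene

def build_t_dna(gene_of_interest):
    """Assemble the stretch of DNA the bacterium will carry into the plant."""
    return LEFT_BORDER + MARKER + gene_of_interest + RIGHT_BORDER

construct = build_t_dna("ATGGCC")   # stands in for the new trait gene
print(construct)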
Protoplasts are cells stripped of their cell walls. If these are grown in culture, the agrobacteria can be added to a culture and given free rein. From here, the process is a lottery, but enough cells are usually transformed to pay off. Plants are easier to work with than animals, because a whole plant can be regenerated by tissue culture methods which have been around for many years.\p
At first, the plasmid method only worked on \1dicotyledons\c, while most of the important crop plants are \1monocotyledons\c. The monocots include all of the grains: rice, maize, wheat, and sorghum, as well as the pasture grasses that animal production relies on. Towards the end of the 1980s, researchers managed to infect monocots as well.\p
The key issue to keep in mind is that plasmid transfer is a wholly natural process which has been taken over for purposes humans see as desirable. Genetic material can be transferred by plasmids at any time, and for all we know, may be happening all around us. While it is difficult to get scientists to speak on the record, there are those who harbor a suspicion that "innocent gene transfer," the natural movement of genes from species to species, may be a significant driving force behind evolution - it would certainly explain the rapidity of evolution at certain times.\p
Another method used in the past was to "shoot" genes into plant cells. The DNA is wrapped up in a microprojectile, a small particle of tungsten or gold coated with DNA, which can be accelerated by a "gun," so that it penetrates the cell wall and the cell membrane. The original form of the gun, the "bioblaster," developed at Cornell University in 1983, was powered by a blank .22 caliber cartridge - these days, compressed helium gas or an electric discharge is used as the propellant.\p
In any of these cases, DNA is mixed with tungsten or gold particles, and placed in a plastic cylinder and shot towards the target. A steel plate with a small hole in it stands in the way of the blast, and this stops the plastic container, but the tungsten particles covered with DNA will carry on through the hole and penetrate a few thousand cells of the plant material on the other side.\p
The targets in such a case can be developing pollen cells or plant embryos, offering the opportunity to grow transformed plants directly from seed, without any need to use tissue culture methods. While the use of .22 caliber cartridges, even blanks, probably justifies a "kids, don't try this at home" warning, if biohackers ever appear on the scene, this is likely to be the method they will use, as it is fairly simple to set up.\p
If these guns are used to inject DNA into a culture of plant tissue, for example, the tissue can then be developed into a plant or plants which can be tested for the presence of the target gene. The methods of doing this are described below.\p
In a fully-equipped laboratory, viruses are the most obvious tools to use, as they have evolved to inject genes into cells, and to integrate those genes into the cell's operation. As well, viruses can probably be engineered to attach themselves to cells with certain types of receptors, and the ideal carrier will be so tamed that it will be unable to replicate (reproduce) or cause disease.\p
Retroviruses, like HIV, the virus which causes AIDS, are the most favored tools at the moment, because these viruses usually splice their genes directly into a chromosome. HIV itself would not be used, but there are other, less dangerous retroviruses available.\p
In the longer term, human medical treatment will involve finding harmless retroviruses which target stem cells, the special cells which are used by the body to make new cells for the blood and the immune system, but when plants and animals are being manipulated, a few failed attempts are less of a problem.\p
Adenoviruses like the common cold virus might be used, as these do not normally cause serious disease, while they have a large carrying capacity for "foreign" genes. There are also liposomes - bodies with no viral genes which cannot cause disease but which are less efficient at getting genes into cells, and finally, there is "naked DNA," which can be forced into a cell in a variety of ways.\p
This is the basis of one of the fears of GM's opponents. Often, the method used to improve a yield is to inject two genes together: the gene of interest, and a marker gene, either one providing resistance to an antibiotic, or one capable of making the cells glow in ultraviolet light. The antibiotic resistance gene is particularly useful, because it can be used to cull the cells until only those with the desired genes are left.\p
The problem, say opponents, comes if that gene is then transferred across to other organisms - like other bacteria. This ignores the fact that the resistance genes were created by the blatant misuse of antibiotics in medicine and agriculture, and the key point that these genes are out there in the world's bacteria, transferring across species barriers every day.\p
The most commonly used antibiotic is kanamycin, to which 30% of all the dangerous \IE. coli\i O157:H7 bacteria in Japan are resistant. As well, there is a widespread occurrence of the kanamycin-resistance gene (npt II, short for \Ineomycin phosphotransferase II)\i in soils, manure slurries, and water. So any argument based on the perceived risk that the npt II gene will escape from a plant into a bacterium and make it kanamycin-resistant is ludicrous: the transfer is improbable, and it is irrelevant in any case, because there are so many easier ways for the bacterium to acquire the npt II gene.\p
The use of antibiotics in animal feed is a serious agricultural problem, potentially far more lethal than all the GM trials around the world, but like so much else in conventional agriculture, the practice has sneaked in, and nobody has noticed. The genetic modification experiments, on the other hand, are a major change, and as such, they are more visible, and more open to attack - even when their aim is to reduce some of the environmental damage caused by conventional agriculture.\p
Even if the marker genes themselves are harmless, there remains the possibility that other genes may be carried, either with the marker gene, or with the desired new gene. As the efficiency of adding DNA improves, it becomes possible to test for converted individuals by screening directly for the unique DNA sequence that describes the gene of interest. \p
#
"The Frankenstein fears - are they justified?",1128,0,0,0
(Jul '99)
When a very young Mary Wollstonecraft Shelley wrote her novel \IFrankenstein,\i she was filled with an admiration for science, but she had very little knowledge of the subject. The novel was really about the effects of ill-treatment on a noble savage (the monster), and much less about the misuse of science. Now, Frankenstein, who created the monster, has been equated with the monster, and the theme has been taken to be the creator destroyed by his creation, and the press is full of headlines about \IFrankenstein foods.\i\p
Sadly, the reporters who create these stories know little of literature, and while they match Mary Shelley's scientific knowledge, they lack her admiration for science, so there is now a shrill edge to the stories we are seeing about genetics in general. Stories about genetic manipulation of bacteria for biological warfare carry meaningless asides to the effect that the methods used are " . . . the same methods used to create GM foods which are being sold in supermarkets." This is good fear-mongering but bad science, and the logical equivalent of pointing out that the steel used to make ploughs is "the same steel used to create gun barrels" - or that scalpels are sharpened "in the same way as swords."\p
This is not to suggest that there are not good reasons for fear - some of them generated by the clumsiness of the large corporations which should most wish to see the fears laid to rest. Monsanto did nothing to help its case when it announced in 1997 that GM soybeans (Roundup Ready) and other soybeans, had been mixed together in a single shipment sent to Europe. The European public reaction was that Monsanto was trying to force the GM beans down European throats by presenting them with no choice at all. (See \BMonsanto takes the blame,\b October 1998.)\p
A year earlier, a modified soybean carrying a gene from Brazil nuts was shown to cause a strong and potentially fatal reaction in people sensitive to Brazil nuts. The gene had been added to increase the amount of protein in the soybeans, but the reaction showed up in testing, and that particular product was withdrawn. Two years before that, though, it was the Tiniest Scorpion.\p
The story began with an advertisement informing the public around Oxford of plans to carry out a trial release of a virus (called \IAutographa californica\i nuclear polyhedrosis virus, or AcNPV), equipped with a scorpion venom gene, making the virus deadly to its target, the cabbage looper moth \I(Trichoplusia ni),\i a major pest in cabbages. This was the second year of the trials, but in the first year, nobody had seen the legal notice. Now the panic began.\p
The first trials had shown that the virus killed the caterpillars faster when it carried the scorpion gene, but now the protesters began to voice their concerns. Did the virus attack any other caterpillars, and did the scientists realize there was a nature reserve inhabited by rare moths, just a hundred meters away? There was no evidence, they suggested, to show that the virus attacked only one species. There was no evidence to show that the scorpion gene would not transfer into other viruses, and there was no information on how long the virus would survive in the wild.\p
Research later showed that the virus, imported from America, could infect a number of moth caterpillars which are native to Britain. In fact, a 1986 study had already listed 43 species of moth that are susceptible to AcNPV, and another biologist suggests there may be as many as 250. The researchers put the number at about five species, saying that the earlier study may have been contaminated, and that the estimated 250 does not distinguish laboratory from field conditions. In any case, it appears that the homework was not properly done, and one slipshod case is all that the opponents need to show that their fears are reasonable.\p
And that is before we consider laboratory studies which show the possibility of genes switching to a new virus. It remains just that - a possibility, but that is all the opponents need, because then they can argue "the precautionary principle" - that when you are not certain of safety, as a precaution, you should not proceed.\p
Baculoviruses of this sort have been used for a century without catastrophe, so the risk is slight. Still, given the public fears of "meddling with nature" and the emotive value of scorpion venom, this was probably an unwise experiment to have started - whether you are looking from a public relations, or a scientific, standpoint.\p
(Strictly speaking, the precautionary principle mentioned above derives from the 1992 Rio summit on the environment, and should read "Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation," but it is generally applied in the wider sense.)\p
#
"The emotional arguments",1129,0,0,0
(Jul '99)
The Prince of Wales probably sums these up rather well when he suggests that it is "wrong to tamper with the building blocks of life." One American campaigner, Diane Beeny, put it in stronger terms: "We are on a reckless course," she said. "Entirely new life forms are being created. This is an aberration of nature and a recipe for disaster." \p
Her concern was with technology that allows genes from an Arctic flounder to be inserted into a tomato to keep it from freezing. While the approach is, at the very least, novel, it is hardly the creation of a whole new life form.\p
The risk, of course, is that we may discount her emotive language, and so discount her concerns, when the concerns, stripped of their emotive overload, may have scientific validity. There is a simple trap here: how do you make people sit up and pay attention, without resorting to hyperbole? Perhaps the answer is to talk to the scientists, to bone up on the facts, and to concentrate on making sure that the commercial and scientific decision-makers are aware of any risks.\p
The way is still open for the cannibal argument, which may be emotive, but which appears on the surface to be most definitely valid. If lambs are given a human gene, or two human genes, say activists, at what point do the eaters of lamb chops become cannibals? Two genes good, four genes bad?\p
But is it valid? The answer to this, say the scientists, is to look at the huge number of genes which we have in common with our normal food animals, so that the significance of "a human gene" is placed in perspective.\p
#
"Frankenstein revisited?",1130,0,0,0
(Jul '99)
Of course, some of the fears and worries we are seeing and hearing about GM foods now have a certain \Idéjà vu\i quality, for the very good reason that we have indeed seen them before. In the 1960s, a potato called the American Lenape was launched, featuring an unusual burning taste, which turned out to be the result of dangerous levels of toxins called glycoalkaloids.\p
Or take the mid-1980s celery, bred to be highly resistant to insects and promising a most pleasing boost in yields, which caused skin rashes because the celery was shedding psoralens, natural chemicals which become irritants and mutagens when exposed to sunlight. Like the American Lenape potato, the celery was withdrawn from the market, but there was no outcry.\p
These were not GM foods, but the products of normal plant breeding and selection from plants growing in the breeder's fields. Somehow, they had acquired new genes, either by mutation, or by some accidental and entirely natural transgenic event. Mistakes will happen in the selective breeding of crops, but it is likely that consumers are actually safer eating GM foods, because these will have been far more thoroughly tested before they get to market.\p
This is not the same as saying there are no risks involved in producing GM crops, but under normal conditions, there should be no more problems, and in most cases, there should be fewer problems with GM crops, \Iso long as care is taken.\i The only question is: at what point should scientists sound a note of alarm when something appears to be going wrong?\p
"Mad cow disease," bovine spongiform encephalopathy, or BSE, is a case in point, and certainly a major factor in European concerns about GM foods. Too many cattle had to be slaughtered and burned, a number of people died, and more will die in the future, simply because official reaction times were too slow. Of course, "mad cow disease" was more about profit maximization and greed, but near enough is good enough for the fear merchants.\p
At the same time, the outbreak of CJD-like disease in humans arose because of enthusiastic use of new agricultural methods without proper checking or safeguards, so it is not an unfair parallel. On the precautionary principle, you must sound a warning straight away, but if you do sound the alarm, and you are wrong, you risk paying a hefty price, as Arpad Pusztai learned.\p
#
"The scientist who cried wolf",1131,0,0,0
(Jul '99)
Arpad Pusztai was working at the Rowett Research Institute in Aberdeen when he broke a story that rats, eating genetically modified potatoes, appeared to be harmed by their diet. As a result, Pusztai was roundly denounced for inadequate statistical analysis, and his employers first suspended him, and then refused to renew his annual contract.\p
Pusztai fed young rats on potatoes engineered to produce an insecticidal protein, a lectin, taken from snowdrops. The potatoes themselves were never intended for human consumption, but the gene is being considered for addition to rice plants, to protect them from sap-sucking insects, so the study is both important and relevant. \p
Pusztai fed some rats on potatoes modified to make their own lectin, while others were fed potatoes laced with lectin, and a control group ate normal potatoes. It was only the rats eating the GM potatoes which suffered damage, said Pusztai, inferring that there may be some deep-seated problem in the GM methodology used, rather than any effect caused by the lectin itself.\p
The GM potatoes contain a kanamycin-resistance gene, and also a gene to make a substance which stains blue - both of these are marker genes, and no biologist could make a case for either gene causing problems, but perhaps, \Ijust perhaps,\i something is going wrong. And that means time for the precautionary principle, especially as these marker genes are so widely used. It is better to be safe than to be sorry.\p
Critics (and there are many of them) say that Pusztai only showed that rats hate potato, that all of the rats were malnourished, and that starvation or known toxins in potato were the cause of their illness. The problem is that standard tests often do not work: a recent \INew Scientist\i article cited the work of Dutch scientist Harry Kuiper, who tested GM tomatoes by freeze-drying them, and feeding so much to rats that each got the equivalent of 13 fresh tomatoes a day. If the dosage had been any higher, the rats would have been poisoned by the basic nutrients, such as potassium, in the powdered tomatoes. Even so, toxicologists said that the rats had not been fed enough of the tomatoes.\p
At the same time, there are other scientists who support Pusztai, and the case is not yet closed. What we can say is that right or wrong, Pusztai was treated most unwisely, and the way he was treated will be in the minds of others who obtain worrying results.\p
The protein, starch, and glucose levels in the transgenic potatoes were all unusual, and it seemed likely that the insertion of the lectin gene caused some sort of secondary disruption. Then again, the protein levels in the transgenic potatoes were 20% down, so that Pusztai had to add a protein supplement, which might just possibly be the cause of the problem. Further research should reveal which of these is the cause. Testing on new transgenic crops would reveal any huge disparities and sound alarm bells, and the GM companies say their own routine testing would detect any such problems long before a crop got to the world's tables.\p
Pusztai's major "crime" appears to be that he breached scientific protocol by "going public," releasing his findings to the public, rather than waiting for the normal scientific process of peer review. All the signs are that Pusztai is a reputable scientist with a distinguished record, but distinguished scientists have been known to "go off the rails" before. In the absence of any hard scientific evidence, most reasonable scientists would have to agree that the precautionary principle should be applied - and at last report, Pusztai had submitted a paper to a peer-reviewed journal.\p
However you look at it, the response to Pusztai's report was a massive PR disaster for the GM people, and science did not fare very much better.\p
#
"The labeling situation",1132,0,0,0
(Jul '99)
In the United States, the Food and Drug Administration (FDA) does not require GM foods to be labeled unless they "differ significantly" from non-engineered foods. In other words, if a protein or gene is present which may provoke allergies, labeling is needed in the USA. According to Rolf Zinkernagel, 1996 Nobel laureate for his work on immunology, there are no GM foods causing allergies available today. \p
(Zinkernagel's fellow-laureate, Australian Peter Doherty, commented in mid-August that people should accept that GM crops are safe, and represent a real hope to the Third World, adding that those who wish to worry would do better by worrying about the insecticide levels in conventional crops, where there are no labeling requirements in place whatsoever.)\p
Europe insists on labels, Japan is considering them, Australia and New Zealand are planning them, and in many other countries a bandwagon is rolling, with people demanding "labeling" as though that is a solution. Some foods, such as rapeseed and potatoes, have modifications which cannot be detected, while corn oils and soy sauce, even if made from GM corn or soybeans, offer no trace after the foods have been heated and processed.\p
In the same way, if cattle are fed on Roundup Ready soybeans, there will be no way that anybody can tell whether this has happened or not, and it is unlikely that labeling would be required in such a case. As in the "cannibal argument," where do we stop? Should we label meat from cattle who pastured alongside genetically modified rapeseed, where pollen may have blown into their pastures?\p
The real problem is that "labeling GM foods" is no answer. Normal methods of selective breeding, as used for the past 10,000 years, have produced at least one crop, "smart Canola," which is just as immune to herbicides as any of the GM Frankencrops, but which is completely "natural," meaning that it does not need to be labeled, while being just as potentially dangerous as the GM crops - and similar herbicide tolerance was bred into maize by natural means in the 1980s.\p
#
"The activists and the poplars",1133,0,0,0
(Jul '99)
In mid-July 1999, eco-activists in Britain attacked GM poplar trees at night, breaking smaller trees, and stripping the bark off larger trees in an attempt to kill them. There were 64 normal poplars and 88 GM trees in a mixed stand. The GM trees had been modified to disrupt the enzymes involved in the hardening of lignin - paper makers need to remove the lignin from the cellulose in wood pulp, using a chlorine-based bleach, which has the potential to do environmental damage.\p
Ideally, wood pulp from the GM poplars will require 15% less bleach, reducing environmental impact and saving energy. The pulp also requires the use of smaller amounts of the caustic chemicals typically used for the lignin-cellulose separation, while the gentler treatment required would produce better-quality paper from a given stock.\p
The same method can also be used with \IEucalyptus\i trees: the same gene would be inserted into selected stock to be grown as plantation trees, reducing the commercial value of the old-growth forests which are so important to biodiversity in Australia. (While \IEucalyptus\i trees are highly favored for plantation growth around the world, they are natives of Australia, where old-growth forests are currently being logged.)\p
Half the GM poplars in the damaged stand have a modified gene for cinnamyl alcohol dehydrogenase (CAD), while the other half have an inactive gene for O-methyl transferase. In either case, lignin is still produced, but it is easier to deal with in the manufacturing process. The activists claimed that genes from the poplars might escape into the wild, although how they would do this is hard to see. Poplars have separate sexes, and the test plants were all female, guaranteeing that there would be no pollen escapes.\p
Of course, from a scientific point of view, mutant poplars like this, with one or even both of the lignin-hardening genes missing, are quite likely to exist in the natural population. The simple principle of natural selection may account for the mutant poplars' continued non-discovery, or they may be evolution-neutral, and simply hidden from view.\p
The most fearful scenario that the opponents can offer, based on these monster trees, is that the new genes will make poplar forests that fall over from a lack of stiffening, but this is arrant nonsense: if the genes were that deadly, they would never be able to spread and conquer the poplar world. It is a simple scientific fact that organisms with inadequate genes do not compete with those of their species with genes better suited to the environment.\p
There is a lesson for the technologists here, though. In 1997, American researchers found some mutant pines with a mutation that blocked all production of the CAD enzyme, and yet lignin formation was completely normal. And equally, there is a lesson for the fear-mongers: this mutation is completely natural. There is no difference between the mutation, happening by sheer chance in a natural population, and the same mutation being manufactured in a laboratory.\p
(See also \BNew aspen for pulp and paper industry,\b July 1999.)\p
#
"Monsanto and Roundup Ready",1134,0,0,0
(Jul '99)
In crops, genetic modifications seen on a large scale usually involve either herbicide-resistance or pest-resistance. The classic example of herbicide-resistance is the Roundup Ready soybeans, which allow farmers to use sprays, rather than mechanical tilling, to control weeds. Other Roundup Ready crops already available include canola and cotton.\p
Monsanto markets both the Roundup Ready soybeans and Roundup itself, a formulation of the herbicide glyphosate combined with a wetting agent, or surfactant, called ethoxylated tallowamine, also known as POEA. Because both the seeds and the herbicide come from the same source, people seem to be more willing to harbor suspicions and entertain dark notions about monopolies and conspiracy.\p
As always, there is a balance, at least if the rival claims are to be believed. Mechanical tilling damages the valuable topsoil, say the proponents, but opponents suggest that the wetting agent used with Roundup may be running off into rivers and streams. Certainly there is a world-wide phenomenon of frogs and toads dying off, but the pattern of death and survival does not seem to match up with the use or non-use of herbicides.\p
Certainly Monsanto does not accept the suggestion. It points out that the POEA is used as an ingredient in many products, including soaps, detergents, shampoos, and cosmetics. It claims that it has tested the herbicide against human skin and a variety of amphibians without any significant harm. Monsanto adds that POEA is rapidly and completely degraded in the soil, and that it binds tightly to soil particles, ensuring minimal run-off into the water supply.\p
The main risk to consumers would arise if Roundup turned out later to damage humans - assuming that any significant amounts of the glyphosate remained in the crop by the time it reached market (unlikely), or if some breakdown product of the herbicide accumulated in the food, but once again, there appears to be no evidence for this.\p
#
"The poisonous plants",1135,0,0,0
(Jul '99)
The pest-resistant plants mainly use the Bt gene, taken from \IBacillus thuringiensis,\i a bacterium (see \BNew Toxin to Combat Insect Pests,\b December 1997). When this gene is expressed in a plant, it becomes poisonous to insects and other pests. Those who favor using this approach say that it avoids the use of toxic pesticide sprays, which seem to go unerringly where they will do the most damage to the environment. The opponents argue that the repeated low-level application of the Bt toxin is worse than an occasional spray, with breaks in which the affected animals can recover. Wildlife corridors and refuges might preserve invertebrates (assuming they are provided) when conventional sprays are used, but this is less likely when Bt crops are grown.\p
A number of examples of collateral damage have arisen. The GE corn which expresses the Bt toxin to kill the European corn borer and other lepidopteran (moth and butterfly) pests, also kills lacewings. The lacewings are "good" insects which eat pests, but when a lacewing eats a pest which has fed on the plant, it also dies. Death even follows when the insect eating the plant is not itself affected by the Bt toxin.\p
So the GM crops are not really being grown to benefit the consumer - they are there to maximize the profits of agribusiness, and that makes it more likely that, over time, crops will become ever more uniform and lacking in biodiversity. And that, as every biologist knows, is a recipe for disaster, because sooner or later, a pest will arise which is able to exploit that particular strain of crop.\p
In time, GM crops will be developed with extra vitamins or lower fat levels, and opponents of GM foods will require even these to be labeled, though it is uncertain whether anybody will gain by the process. To be useful, any labeling protocol will need to specify exactly what sort of manipulation has been practiced, so that the potentially risky GM crops are tagged.\p
One of the biggest risks, according to opponents, is that transgenes may escape into wild populations. Yet if the GM action was to make the crop sterile, as in the "Terminator" technology, then this is unlikely, while even a fertile crop is no problem, so long as it is far enough away from unmodified crops to avoid cross-pollination. In the long run, a simple "GM" warning label appears to be no better than simplistic.\p
#
"The real dangers of genetic manipulation",1136,0,0,0
(Jul '99)
The potential for damage can come in a number of ways: as a risk to the farmers, as a risk to the farm environment, or as a risk to consumers. For farmers, the main risk is that pests will become immune to the toxins, because the toxins are always present, rather than being sprayed intermittently. As an argument against GM, this carries no real weight, because it will only be a problem for those who are using the insecticide.\p
A more serious risk was described in \BEngineered corn and monarch butterflies,\b May 1999, where the pollen from Bt-enriched plants was shown to be dusting leaves eaten by monarch butterfly caterpillars, which were dying as a result.\p
Again, there is always the chance that genes will transfer from one species to another. While animals are fairly fastidious about inter-species hybrids, such crosses are far more possible and likely in plants, and weeds growing near crop fields have been known to hybridize with crops before, so the risk of Bt genes "escaping" is far greater (see \BSpreading the genes,\b September 1998).\p
The risks to consumers come in a number of forms. Pesticide-resistant crops might reach the market with residues of the pesticides still inside them. In the case of Roundup Ready crops, there is no evidence of risks from residues or breakdown products of glyphosate, or of any damage to consumers or the environment from wetting agents in Roundup, but it took us hundreds of years to identify the dangers in tobacco, and 30 years to identify the problems caused by 2,4-D and 2,4,5-T, the herbicides which went into Agent Orange (and which your reporter sold to the public as late as 1967 as "perfectly safe plant hormones, which can only affect plants!").\p
Then there is the potential risk from "biohackers," people who decide to create their own home laboratory and construct their own monsters. This is possible, and even feasible, but it is hard to see how the responsible use of the same techniques adds to that danger. This particular genie is well and truly out of the bottle, but we would all be far more likely to die from a naturally mutated influenza virus, or a TB bacterium made immune to all antibiotics, than from a "Frankenstein germ."\p
The most serious risk, and the one which both opponents and supporters of GM need to concentrate on, is the loss of diversity which may arise from relying on GM crops. The problem is a simple one: having produced a transgenic strain, tailored for high yield, and having got that strain past all of the very sensible and stringent tests, there is a temptation to rely on just that strain. In the real world, this is called putting all your eggs in one basket. In agriculture, a better name would be suicide.\p
While amateur naturalists and the general public are now well aware of the need for biological diversity, a species is only as strong as the range of genes it can fall back on. If farmers concentrate on just those seeds sold to them by the GM companies, we will end up with a monoculture in each species, wide open to attack by any pest lucky enough to find a loophole that lets it exploit the feast laid out before it. We can learn now by thinking, or we can learn later by bitter experience, but we will learn.\p
As long as genes are being added to plants and animals, there remains a small but finite risk that the transferred genes will be accompanied by other genes which might conceivably be dangerous. The risk is vanishingly small, but so is the chance of winning a lottery, and people do that every day. In the end the question will probably be whether it is better to let a million die of starvation each year, or to avoid the risk of a few being endangered by a Frankenfood.\p
Finally, opponents need to consider the economic costs, especially when crops are tailored to not produce viable seed, so that the farmer has to return to the large corporation, year by year, to buy fresh seed. As well, they need to question the wisdom of allowing so much power to fall into the hands of a single group who have been clever enough to integrate vertically in such an alarming way.\p
#
"The ethical arguments",1137,0,0,0
(Jul '99)
The ill-feeling towards large combines like Monsanto just rolls along. In 1998, Monsanto moved to acquire the cotton seed producer Delta and Pine Land in a stock swap, giving it access to the Delta and Pine Land "Terminator" technology, which renders seed from proprietary crops sterile and so stops farmers from saving seed for the next year, guaranteeing the firm a continuing market for its enhanced crops.\p
And that brings us to the key issue for many opponents of GM: they believe it is immoral for companies to "own" genes, and transfer them from species to species, still retaining their ownership. For people with that approach, it makes sense to label every GM crop, and to prevent them being used.\p
But this is a stance based on political belief, not on scientific facts, and while the GM companies will enroll all the science they can to make the case for the safety of their crops, the opponents will work just as hard at finding science-based arguments against the use of those crops. Fundamentally, though, neither side in the dispute is there for the sake of the science.\p
#
"Is it too late?",1138,0,0,0
(Jul '99)
Maybe we should not worry: maybe our worst fears are happening right now, and have been happening for millions of years. We don't know how far genes can travel, and do travel, right now, under natural influences. For all we know, all living things may be gathering up new genes all the time: certainly the mechanisms for us to be transgenic are there.\p
John Martin at the Center for Complex Infectious Diseases in California says he has found a herpes-like virus, isolated in a woman suffering from ME (chronic fatigue syndrome) which seems to be able to gather genes from a variety of bacteria. The genes come from \IEscherichia coli\i (10), \IBacillus subtilis\i (6), \IBrucella abortus\i (4), \IMycoplasma pneumoniae\i (3), \ISalmonella typhimurium\i (3), with another 27 "foreign" genes also found in the virus. The genes code for tasks relating to photosynthesis, nitrogen fixation, and the construction of bacterial cell walls.\p
None of these genes is likely to do much inside the virus, but unless Martin made a massive contamination error, it appears that viruses are able to gather up genes from the life forms around them. And from earlier evidence, those genes may even be able to travel on, and reach us, though quite what we would do with a bacterial cell wall gene it is hard to say.\p
As yet, we still know very little about the viruses and bacteria that swarm around us. Unless they cause illness, we may not notice them, and if we cannot culture them, we cannot study them, so there is no real way we can tell what sort of genetic manipulation is going on, day by day, under no control or supervision whatsoever.\p
In 1997, Walter Doerfler reported that when DNA from a bacterial virus was eaten by a mouse, some pieces of the viral DNA got into the mouse's bloodstream and cells, and in a few cases, even linked themselves to mouse DNA. While this provoked an hysterical reaction at the time, Doerfler reports no instances of active ingested genes, even when those genes were ones designed to work in human cells, which are similar to mouse cells.\p
We probably have adequate defenses against gene invasions like this, with the invading genes being chopped into small harmless pieces, but maybe, just maybe, one of the steps that powers evolution is the very rare transmission of just the right genes, at just the right time, allowing organisms to do something they could not do before.\p
#
"Stirring the pot: the Doomsday weapons and GM",1139,0,0,0
(Jul '99)
Worked into the Doomsday scenarios, we find suggestions that the weapons of biological warfare are somehow implicated in the whole GM business. Disease bacteria like anthrax \I(Bacillus anthracis),\i plague \I(Yersinia pestis),\i and the Ebola virus, or the Botulinum toxin of \IClostridium botulinum\i are somehow going to be engineered to attack people carrying certain genetic markers which identify a racial group.\p
From here, it is a quick step to suggesting that the genes from these disease organisms are going to be engineered into crops, even though this would be (a) difficult, and (b) about as dangerous as inserting Doberman genes in a petunia.\p
Whatever the future brings, most of our present fears are likely to be proved groundless, and some of the reassurances we get may be found to be unjustified. There are huge benefits to be gained, but there are also risks that will be taken. The beauty of science is that, if left free from populist pressure and political grand-standing, it ought to take care of the problems - especially if the scientists know that the public understand the processes, and are watching.\p
The real problem is that genetic manipulation, like quantum physics, is little understood by a public whose scientific education contains very little that would surprise a scientist of the 1890s - except for plate tectonics. In such a vacuum, it is very easy for alarmists to make their mark by hinting darkly at conspiracy theories, even though what they are really doing is hindering science and putting lives at risk from starvation.\p
And in such a vacuum, it is conceivable that well-meaning scientists might just blunder - though given the clamor at the moment, that risk seems less and less probable, so perhaps the protesters are playing a useful role after all.\p
#
"A new excuse to eat lobster, crab, and crayfish",1140,0,0,0
(Jul '99)
If there is one thing better than eating delicious food, it is eating delicious food and knowing that it is doing you good. July saw the usual crop of stories saying that red wine and chocolate are good for you, but it also brought us the news that crustaceans in the diet can help prevent disease. Some inherited forms of amyotrophic lateral sclerosis, also known as Lou Gehrig's disease, have been linked to defects in an enzyme called superoxide dismutase (SOD), which sops up dangerous "free radicals" that accumulate inside cells.\p
Copper, in excess, is a poisonous heavy metal, but small amounts of copper are needed as a "helper" to some enzymes, including SOD. And as you may have guessed, lobsters and crabs are a good source of dietary copper.\p
A report in the August issue of the journal \INature Structural Biology,\i released in late July, describes work at the National Institute of General Medical Sciences (NIGMS) which has involved deciphering the three-dimensional structure of a yeast copper "chaperone" protein, a molecule that transports copper to the SOD enzyme. The copper chaperone protein protects copper from unwanted cellular interactions and delivers it safely to its destination, and the yeast chaperone is very similar to its counterpart in humans.\p
The first chaperone was identified in 1997, and a second one was described in \IScience\i in April 1999. This second chaperone is specifically linked to supplying copper to the SOD molecule. So now this second chaperone will be treated as a suspect in triggering Lou Gehrig's disease, though this may be a case of guilt by association, say the researchers. In the meantime, as long as your chaperones are in order, slip another crustacean on the barbie . . .\p
\BKey names:\b Valeria Culotta, Jonathan Gitlin, Peter Preusch and Thomas O'Halloran, among others. A paper with P. J. Schmidt as the senior author, appears in the \IJournal of Biological Chemistry\i in August 1999.\p
#
"Polio still lingers in Africa",1141,0,0,0
(Jul '99)
The World Health Organization (WHO) has signed up its first corporate partner in a decades-long effort to eradicate polio in Africa. The diamond-mining company De Beers will donate $2.7 million over the next two years to immunize children in Angola, where a civil war is still in full swing. \p
\I"I hear some say that infectious disease is becoming yesterday's problem. But is that correct? I don't believe so. There is an unfinished agenda of eradication and rolling back diseases. No one should underestimate childhood infections, HIV/AIDS, TB, malaria, polio, and the other new and emerging diseases. They may hit us all in this small world - but above all they keep ravaging the lives of the poor. \p
WHO must be an enduring advocate in the fight against infectious diseases. And WHO must help governments face the daunting challenge from the new epidemic of non-communicable diseases, now spreading in the low- and middle-income countries."\i\p
Those were the comments of Gro Harlem Brundtland, addressing the World Health Assembly in October 1998. Very little has changed since then, but a Rotary program has been making changes where poliomyelitis is concerned, and with the De Beers contribution, perhaps a bandwagon is beginning to roll. \p
While polio almost completely disappeared from western nations some decades ago, it is still ravaging several developing countries, including India, Pakistan, Nigeria, and Angola. The WHO set out to eradicate the disease in 1988 and, since then, the worldwide incidence has fallen by 80% to 6,000 cases a year.\p
Figures, however, are rubbery at best. Approximately 4,000 polio cases were reported in 1996, but the WHO estimates that, due to underreporting in regions where surveillance is not fully developed, there may actually have been as many as 40,000 cases, so the 6,000 figure is a conservative guesstimate.\p
The WHO hopes to end transmission of the virus by the end of 2000, and it will then need to monitor afflicted areas for another three years, at a total cost of $1.25 billion, but fundraising hasn't matched expectations. The program is still about $500 million short of its financial target, but only three major centers of transmission remain: South Asia (Afghanistan, Pakistan, India), West Africa (mainly Nigeria) and Central Africa (mainly Democratic Republic of Congo). Once these hotspots are dealt with, the disease should be under control.\p
The De Beers contribution will cover the cost of vaccinating about 80% of the 3.3 million Angolan children who must be immunized in the next two years. The only snag now will be avoiding the problems caused by civil war, as the conflict between rebels and the government is getting worse. Still, the money has come in the best form, with no strings attached as to how it is used in the fight against polio.\p
Other support for the campaign against poliomyelitis has come from the Rotary International PolioPlus program, which has provided close to $500 million.\p
#
"Testing herbal remedies",1142,0,0,0
(Jul '99)
United States authorities have indicated that they will be testing four medicinal herbs for toxicity, because they are now in such wide use by the public: aloe vera (used as a dietary supplement as well as a cosmetic), ginseng (said to promote "vigor"), kava kava (also called kava or sakau, and used as a mood elevator), and milk thistle (considered by some to have anti-cancer and liver-protective properties). As well, a substance in cruciferous vegetables such as broccoli, called indole-3-carbinol, which is thought to inhibit cancer, will be included in the study.\p
This plan was announced in July in order to allow the public to comment before the tests go ahead. A number of other compounds are also under review, including ammonium molybdate, one of many soluble molybdenum compounds which workers and the general population may be exposed to; 5,6-benzoflavone, also known as beta-naphthoflavone, which may have some relevance in treating cancer; 1,3-dichloro-2-butene, which has a similar structure to a known carcinogen; and 3-picoline, an industrial chemical produced in large amounts.\p
#
"Melanoma 'vaccine'",1143,0,0,0
(Jul '99)
A new vaccine treatment appears to be effective in prolonging the survival of patients with malignant \1melanoma\c. The vaccine is prepared from patients' own cancer cells, but before they are returned to their owners, the cells are first inactivated to stop growth, and then treated with a chemical called dinitrophenyl, or DNP. \p
While the method was only described in the \IJournal of Clinical Oncology\i in June, by July, plans were already in place for trials to be undertaken in Australia, the world's melanoma hotspot. Australia combines a large population of fair skins, especially Celtic skins, together with a great deal of sunshine, and an out-of-doors lifestyle and culture which has long worshipped "a perfect tan," although Australians are now beginning to realize that only a perfect fool seeks a perfect tan.\p
The work was partially disclosed on May 21, 1996 at the annual meeting of the American Society of Clinical Oncology. Sixty-two patients were treated with a special vaccine made up in part of their own tumor cells. The June report indicates that the vaccine greatly extended the survival of patients with melanoma that had spread to their lymph nodes and increased the percentage of cures as well. The results of the study showed that 58% of the patients survived at least five years - a much higher percentage than would be cured with surgery alone. The results were also better than those obtained by treatment with alpha interferon.\p
\BTechnical details:\b The vaccine is being used post-surgically for treating advanced, but surgically resectable, malignant melanoma. In the study, patients with stage-III tumors in the lymph nodes were treated with the vaccine after standard lymphadenectomy: an operation where the melanoma and surrounding lymph nodes are removed. Of 62 patients who received the vaccine, 47% were free of relapse after four years, and 58% survived four years. This compares well with the 20-25% survival rate in patients treated with surgery alone.\p
The DNP makes the cancer cells appear more foreign so that the body's immune system is more likely to recognize and attack them, and this then sets the immune system off to attack the melanoma itself. While there is now a need for a trial with 300 to 400 patients in several major cities over three to five years, the prospects are looking good, and not only for melanomas: the vaccine is already being tested in ovarian cancer, and a colon cancer study is expected to begin in one to two months. In these cases, the patients will need to have the tumors removed before the treatment is started.\p
\BCurious features:\b Patients over the age of 50 had a higher survival rate after vaccine treatment than younger patients, and patients who developed immunity to their own cancer cells fared better than those who did not, with immune reactions being assessed by standard skin testing.\p
\BKey names and terms:\b David Berd, Henry C. Maguire Jr, Lynn M. Schuchter, Ralph Hamilton, Walter W. Hauck, Takami Sato, and Michael J. Mastrangelo. Thomas Jefferson University, AVAX Technologies Inc., and autologous DNP-modified vaccine.\p
#
"Saving children from AIDS",1144,0,0,0
(Jul '99)
Around 1,800 HIV-infected children are born each day around the world, but according to Anthony Fauci, head of the National Institute of Allergy and Infectious Diseases (NIAID), a single dose of the new drug nevirapine (also known as Viramune) given both to the mother and the child, prevents transmission of the AIDS virus to newborns.\p
A 1994 study showed that a very intensive treatment with AZT reduced the percentage of HIV-positive mothers who infected their babies from 25% without treatment, down to 8.3% with treatment. Similar results can be obtained with shorter courses of AZT by itself, or AZT combined with the drug 3TC, but these treatments are too expensive for use in developing countries, while nevirapine is comparatively cheap (on the retail market, it is less than half the price of AZT, and fewer tablets are needed).\p
AZT and nevirapine both disable HIV's reverse transcriptase enzyme, which the virus needs to copy itself into the host cell's DNA. In the AZT treatment, mothers are given the drug every three hours from the onset of labor until delivery is complete, while infants are given AZT twice daily in their first week. With nevirapine, women were given a single dose at the start of labor, and the babies were given a single dose of nevirapine in their first three days.\p
#
"Scientific misconduct?",1145,0,0,0
(Jul '99)
Do electromagnetic fields cause cancer? The scientific evidence over the past couple of decades has been remarkably equivocal, and many critics have said that the correlations have to be ignored as chance fluctuations, because there is no known mechanism which would allow the fields, usually referred to as EMFs, to influence any part of a living cell.\p
When Robert Liburdy reported in two 1992 papers that EMFs had an influence on calcium signaling, this provided a plausible mechanism for the EMFs to act as they were alleged to do. The big problem has been that nobody else seems to have been able to produce similar results. Now the whole movement has been shaken by an announcement, trumpeted in some quarters as fraud, that biochemist Robert P. Liburdy "engaged in scientific misconduct ... by intentionally falsifying and fabricating data and claims." Liburdy was the sole author of each of the two papers.\p
In scientific terms, this is strong language, especially as it comes from the US federal Office of Research Integrity (ORI), and has resulted in Liburdy asking two journals to retract the results - or so the adversarial reports tell us. Liburdy's supporters say that he has been "nobbled by the US establishment for publishing a paper showing the importance of the ELF electric field." As they see it, he has been accused wrongly, for all he did was to "present it graphically in a more understandable style, as any good scientist would for journal readers."\p
Investigators from the ORI say Liburdy eliminated data which did not support his conclusions. After the investigation, he is said to have resigned quietly from the Lawrence Berkeley Laboratory in March and agreed to withdraw his research findings. Liburdy says he was constrained to do this because he could not afford to spend $1 million on his defense.\p
The federal officials say that his wrongdoing helped him gain grants worth $3.3 million from the National Institutes of Health, the Department of Energy and the Department of Defense to investigate a link between electric power and cancer. The laboratory's counsel, Glenn R. Woods, commented that "Now both the lab and the Office of Research Integrity have found that data on which he based his conclusions were fabricated. He's been asked to withdraw that data, and I think he's doing that right now."\p
As part of the settlement, Liburdy agreed to make no applications for federal grants for three years and not to contest the federal findings in administrative proceedings, but he retains the right to speak out. Liburdy counters: "The ORI charges center on graphic techniques I used in presenting Fura data depicting calcium changes in one figure in a 1992 review paper, and two figures in a 1992 research paper. For example, in one graph I used a computer to process Fura data for graphical presentation including a base-line adjustment and normalization to graphically overlay and compare exposed vs. control traces. Techniques like these are used in the literature, however I did not mention this computer processing in the methods section. Such 'processed' data was then characterized by ORI as being intentionally 'fabricated' data in the charge."\p
Liburdy's supporters see this as an attempt by "big business" to close down an embarrassing find, where the science runs counter to the public good. Liburdy still claims that "The raw data for these figures are not challenged, and are valid. How I graphed them is a matter of disagreement among scientists. Independent scientists have reviewed this for me and concluded that misconduct is not warranted."\p
That, however, is not the view of the ORI. They say that it was not a simple matter of graphing, but straightforward fabrication, though if there was any "fudging," it may not have been conscious. As one critic of Liburdy's work commented, this sort of study attracts crusading types who may be inclined to fool themselves when they look at the data. On the other hand, scientists who are going to color outside the lines, who are going to challenge accepted models and paradigms, will usually be strong characters and open to attack.\p
Overall, the scientific evidence that has been published seems to say that the EMF from power lines is safe. At the same time, Liburdy is able to offer the comments of other scientists who have, he says, "independently reviewed the facts and do not agree with this charge."\p
We will, it seems, have to wait and see.\p
#
"Asteroid water",1146,0,0,0
(Jul '99)
Human beings could go prospecting on a tiny, water-rich asteroid and return selected samples to Earth by 2015, much sooner than any possible human mission to Mars. Forget Y2K, and concentrate on KY26, or 1998 KY26, if you want to be formal. This is a water-rich asteroid, about 30 meters (100 feet) in diameter, and Steven Ostro of NASA's Jet Propulsion Laboratory sees it as a "space station waiting for occupants."\p
Ostro believes there could be an automatic mission to KY26 and back by 2006, and a human mission to the asteroid by 2015.\p
Ostro was addressing a conference on asteroids, comets, and meteors at Cornell University in July. KY26 was first observed as it passed within 800,000 kilometers (500,000 miles) of Earth in 1998, but it poses no threat to humans. The asteroid is comparatively small and highly fragile, so that if it reached Earth, it would break up and explode as it entered the upper atmosphere. While the fireworks would be spectacular, only small bits would reach the lower levels. Referring to the Hollywood view of asteroids, Ostro said "We see them as killer asteroids (but) from this object's point of view, it should fear us, because from this object's point of view, we will assimilate it."\p
But even if Earth does not assimilate KY26, it is still endangered by humans. It is thought to be loaded with water, but the water is bound to organic chemicals, perhaps including the basic building blocks of life such as amino acids and nucleic acids. And that means that the asteroid could be mined as a source of reaction mass for spacecraft, and for food, water, and oxygen. It could even be hollowed out, and become a spacecraft of sorts. Most probably, it would be used as a staging and supply point for trips to Mars.\p
So what is the cost of a trip around the asteroid? Between $50 and $100 million, according to Ostro. And what would be the benefit? Well, aside from the possibility of using KY26 as a resource, the primitive materials that are in that asteroid come from the beginning of things, according to David Meisel, a radio astronomer who was there. While Mars has been churned over by areological (the Martian equivalent of "geological") processes, there would be fresh ancient material on the surface of KY26 - or within easy reach.\p
#
"Asteroid visits",1147,0,0,0
(Jul '99)
NASA's Deep Space 1 spacecraft flew by an asteroid during July, and the asteroid, formerly known just as 1992 KD, now has a name: 9969 Braille. The name honors Louis Braille, the inventor of the language system which allows sight-impaired people to read. The craft flew 26 km (16 miles) above the asteroid, photographing as it went. The target had been 15 km, or 9 miles, but apparently it was a little wider of the mark.\p
The craft flew under the control of the AutoNav autopilot system which is likely to be part of most future space probes. The flight completed the testing and validation of the 12 new technologies onboard, as well as (hopefully) gathering important new data and photographs. At the end of July, spectra were being sent back, and more was still to come. The project has a web site at http://www.jpl.nasa.gov/ds1news/\p
According to a NASA report in early August, "Braille's longest side is now estimated at 2.2 kilometers (1.3 miles), and its shortest side appears to be 1 kilometer (0.6 miles). This elongated asteroid was expected to be irregular, and two photographs taken approximately 15 minutes after closest encounter have helped to confirm this." The report indicates that Braille appears to be similar to Vesta, a rare type of asteroid and one of the largest bodies in the main asteroid belt, lying between Mars and Jupiter.\p
The next mission for Deep Space 1 is scheduled to fly close to Asteroid Wilson-Harrington and Comet Borrelly in January and September, 2001 respectively.\p
#
"Asteroid risk",1148,0,0,0
(Jul '99)
Asteroid 1999 AN\D10\d, a kilometer-wide rock which caused a media sensation when it was thought to be dooming Earth (but not until 2039), has now been carefully observed by amateur astronomers, and the "doomsday asteroid" is harmless.\p
Originally, there appeared to be a "small but finite" chance that the asteroid would pass close to Earth on August 7, 2027, and then maybe slam into it in 2039. During May, further observations by Australian amateur astronomer Frank Zoltowski seemed to point to an increased probability, though it was never greater than a 1 in 500,000 chance for the 21st century impact.\p
Relief came after two German amateur astronomers searched through the Digital Sky Survey, a set of photographic plates taken at Palomar Observatory in California in the 1950s, which were digitized and made publicly accessible through the Internet. From a single streak on a plate taken on 26 January 1955, since confirmed to be 1999 AN\D10\d, we can now be confident that the collision will not happen. Perhaps there would have been less excitement if the threat had been given a rating on the Torino scale.\p
\BLink:\b the Digital Sky Survey includes the northern hemisphere as seen from Mount Palomar, and the southern hemisphere as seen from the UK Schmidt telescope in Australia. It is located at http://stdatu.stsci.edu/dss/\p
Remember the great asteroid scare of 1998? If not, refer back to \BDeath of a death threat,\b March 1998, and try to imagine the hysteria that went on, coincidentally during the buildup for the Hollywood blockbusters "Deep Impact" and "Armageddon," and ask yourself how well science came out of the whole affair.\p
The answer, sadly, is not very well. However, Richard Binzel, a planetary scientist at the Massachusetts Institute of Technology, may have the remedy. In the March 1998 case, and another less publicized case later in the year, the real problem was finding a way of quantifying the risk, in much the same way as earthquakes are rated on the Mercalli or Richter scales, or winds on the Beaufort scale. Enter the Torino scale, named after the Italian city of Turin (Torino in Italian), where a June workshop of the International Astronomical Union was held.\p
The IAU announced its official endorsement of the zero-to-10 scale at a UN space conference in Vienna during July. And the good news: right now, there is no risk greater than zero, but the scale means that people can assess the threat - after all, millions of microasteroids hit the earth each year, most of them burning up in the upper atmosphere.\p
We can rate the risk by the size of the asteroid, or by description - in the listing below, a few key words have been bolded to make the distinctions more obvious, and a small sketch of the scale as a lookup table follows the list.\p
0: The likelihood of a collision is zero, or well below the chance that a random object of the same size will strike the earth within the next few decades. This designation also applies to any small object that, in the event of a collision, is unlikely to reach the earth's surface intact.\p
1: The chance of a collision is extremely unlikely, about the same as a random object striking the earth over the next few decades.\p
2: A somewhat close, but not unusual encounter. Collision is very unlikely.\p
3: A close encounter, with a 1% or greater chance of a collision capable of causing \Blocalized \bdestruction.\p
4: A close encounter, with a 1% or greater chance of a collision capable of causing \Bregional\b devastation.\p
5: A close encounter, with a significant threat of a collision capable of causing \Bregional\b devastation.\p
6: A close encounter, with a significant threat of a collision capable of causing a \Bglobal\b catastrophe.\p
7: A close encounter, with an extremely significant threat of a collision capable of causing a \Bglobal\b catastrophe.\p
8: A collision capable of causing \Blocalized\b destruction. Such events occur somewhere on Earth between once per 50 years and once per 1,000 years.\p
9: A collision capable of causing \Bregional\b devastation. Such events occur somewhere on Earth between once per 1,000 years and once per 100,000 years.\p
10: A collision capable of causing a \Bglobal\b climatic catastrophe. Such events occur once per 100,000 years, or less often.\p
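For those who like their scales in mechanical form, here is a minimal sketch of the Torino scale as a simple lookup table - our own Python illustration, not an official IAU tool, with the descriptions condensed from the listing above:\p
# A minimal sketch, not an official IAU tool: it maps a Torino-scale
# rating (0 to 10) to a condensed form of the hazard bands above.
TORINO_BANDS = {
    0: "No hazard: collision chance zero or negligible",
    1: "Routine close pass: collision extremely unlikely",
    2: "Somewhat close encounter: collision very unlikely",
    3: "Close encounter: 1% or greater chance of localized destruction",
    4: "Close encounter: 1% or greater chance of regional devastation",
    5: "Close encounter: significant threat of regional devastation",
    6: "Close encounter: significant threat of global catastrophe",
    7: "Close encounter: extreme threat of global catastrophe",
    8: "Collision causing localized destruction (once per 50 to 1,000 years)",
    9: "Collision causing regional devastation (once per 1,000 to 100,000 years)",
    10: "Collision causing global climatic catastrophe (once per 100,000 years or less)",
}

def torino_band(rating: int) -> str:
    """Return the hazard description for a whole-number Torino rating."""
    if rating not in TORINO_BANDS:
        raise ValueError("Torino ratings run from 0 to 10")
    return TORINO_BANDS[rating]

# Example: after re-analysis, asteroid 1999 AN10 rates a zero.
print(torino_band(0))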
#
"Lunar Prospector",1149,0,0,0
(Jul '99)
Astronomers were prepared in late July to observe Lunar Prospector's crash into the Moon's south pole, but it was all a bit of a disappointment.\p
While the craft hit its target, there seems to have been no evidence of any plume of water vapor. Either there is no water ice at the south pole, or perhaps there is just no water ice in that particular crater.\p
Still, the ashes of Gene Shoemaker (See \BDeaths,\b July 1997) were delivered to the moon, making the asteroid and comet expert the first human to have a lunar burial.\p
#
"The collider that ate the Earth!",1150,0,0,0
(Jul '99)
An old Isaac Asimov plot-line surfaced during July, with a suggestion in \IScientific American\i that a new collider might create mini-black holes that would slowly destroy the planet. The collider, at Brookhaven National Laboratory, is their soon-to-be-completed Relativistic Heavy Ion Collider (RHIC). But before we begin to tell that story, we need to remind readers that this happened in July, at the height of the journalistic Silly Season in the northern hemisphere.\p
The fuss began when physicist Walter Wagner speculated in a letter to the journal that the collider might create mini-black holes, those of a kind not seen since the Big Bang. Back at the beginning, the mini-black holes disappeared, but if one came into being near a large congregation of mass and if it started absorbing that mass before exploding, he suggested it might grow and absorb the whole of the planet.\p
The method he outlined was to smash a proton into an anti-proton with sufficient energy, to create a black hole which might reach a relatively stable half-life and so continue to grow. To be fair, Wagner also indicated that his calculations showed that the Brookhaven collider does not obtain sufficient energies to produce a mini-black hole, but he expressed concern that his calculations could be wrong.\p
He then suggested that the only way to determine the energy density at which a mini-black hole would be created would be to build a collider and do the experiment, and he concluded by asking, "Is the Brookhaven collider for certain below the threshold?"\p
Now that was fair enough as a piece of speculative science, but next came the journalists, and Brookhaven realized something was not quite right when they had a phone call from a journalist who asked, apparently seriously, whether the new collider might have created a black hole that swallowed the plane which was flown by John F. Kennedy Jr., as it flew by Long Island.\p
The \ISunday Times\i in London then picked up the story, mentioning the possibility of strangelets, a new type of matter made up of subatomic particles called strange quarks. These strangelets might then act as a template to form still more strange matter, until the whole world, and everything in it, had been converted. Here, science fiction readers would once again be on familiar ground: this is the plot line from the Kurt Vonnegut Jr. classic, \ICat's Cradle.\i\p
How could such bizarre stories get around? Well, partly because the collider is meant to briefly re-create, on a very small scale, conditions similar to the superdense state of matter which probably existed just after the Big Bang. The only problem with the Doomsday scenarios is that the energy involved in the Brookhaven machine is about seventeen orders of magnitude too low, but when physicists say "seventeen orders of magnitude lower," people think "one-seventeenth," rather than what the physicists mean: "one hundred thousand million millionth" of the needed energy.\p
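To put the arithmetic on display (a worked illustration of the misunderstanding, using no figures beyond those above): seventeen orders of magnitude is a factor of ten multiplied by itself seventeen times, so\p
\[
\frac{E_{\mathrm{RHIC}}}{E_{\mathrm{required}}} \approx 10^{-17}
= \frac{1}{100\,000\,000\,000\,000\,000},
\qquad \text{not} \qquad \frac{1}{17}.
\]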
So we can rule out any immediate gravitational effects, any black holes, but what about the strangelets? There is some reason to believe that these may exist in the center of neutron stars, but there has been no evidence for them so far, and there is certainly no evidence that they would be in any way aggressive, absorbing their surroundings. For a start, theory says they would not be aggressive, but theory can be wrong. The real evidence comes from experience: over billions of years, energetic cosmic rays have been plunging into the planet at far higher energies than those used at Brookhaven, and no strangelets have eaten the world so far. In short, it won't happen because it hasn't happened.\p
The whole fabricated farrago was denied in a notice posted on Brookhaven's web site, reading as follows:\p
"I am familiar with the issue of possible dire consequences of experiments at the Relativistic Heavy Ion Collider, which Brookhaven Lab is now commissioning. These issues have been raised and examined by responsible scientists who have concluded that there is no chance that any phenomenon produced by RHIC will lead to disaster. \p
"The amount of matter involved in the RHIC collisions is exceedingly small - only a single pair of nuclei is involved in each collision. Our universe would have to be extremely unstable in order for such a small amount of energy to cause a large effect. On the contrary, the universe appears to be quite stable against releases of much larger amounts of energy that occur in astrophysical processes. \p
"RHIC collisions will be within the spectrum of energies encompassed by naturally occurring cosmic radiation. The Earth and its companion objects in our solar system have survived billions of years of cosmic ray collisions with no evidence of the instabilities that have been the subject of speculation in connection with RHIC. \p
"I have asked experts in the relevant fields of physics to reduce to a single comprehensive report the arguments that address the safety of each of the speculative 'disaster scenarios.' I expect the report to be completed well before RHIC produces the high-energy collisions necessary for any of these scenarios. When the report is completed, it will be broadly published and placed on the Laboratory's web site."\p
So far, the world appears to be safe. But if you want to keep up with the issue, the Brookhaven web site is http://www.pubaf.bnl.gov/ \p
This has been a classic example of the way the media "beat up" a story out of nothing. Scientists are typically cautious in their pronouncements, so it is easy to get a physicist to say that the odds are infinitesimally small, while at the same time refusing to entirely rule out the possibility that a strangelet, a giant worm, or a mutated dingo will eat the planet. \p
Then it is child's play to increase the pressure until the authorities form a committee, so as to be seen to be doing something; the media then have new headlines about "Action at Last!" and "We Force Lax Authorities to Act!", and even more headlines ("Cover Up," "Scientists Baffled," or "Official Denial") when the committee finds there is no risk. Then, once the editorials have been written about scientists wasting money on unnecessary committees, the stories are filed, to be pulled out later.\p
No scare is ever entirely wasted for the media, no scare is ever free of the risk of resurrection. In the early days of the Manhattan Project, Fermi and others considered the possibility of a chain reaction that consumed the planet's atmosphere and ruled it out, before the Trinity test took place, and other theorists looked at the risk that the Bikini test might set off a fusion reaction in the sea - after all, it \Iwas\i a "hydrogen bomb," and the sea \Iis\i two thirds hydrogen, ran the musing - and even today, this is trotted out as evidence that the scientists were irresponsible tinkerers, unsure of what they were doing. The life of a journalist can be fun, sometimes.\p
#
"New aspen for pulp and paper industry",1151,0,0,0
(Jul '99)
A new genetically-engineered breed of 2XL-aspen looks likely to revolutionize pulp and paper production, according to a report in the August edition of \INature Biotechnology,\i released in late July. (Sounds familiar? See our special report on \BGenetic manipulation,\b July 1999 for another take on this sort of work.) The group reporting the development say they have introduced a gene into \IPopulus tremuloides,\i commonly known as quaking aspen, that cuts by nearly half the amount of lignin produced by the tree.\p
Given the financial and environmental costs of separating lignin from cellulose, any reduction in the amount of lignin is worthwhile, but the transgenic trees have surprised researchers by producing up to 15% more cellulose as well, and they are remarkably fast-growing, even for a fast-growing tree like aspen.\p
Typically, the lignin:cellulose ratio in regular aspen and other tree species is about 1:2. In the genetically-engineered aspen, the ratio is roughly 1:4, which means something like a 15% higher yield of pulp from a given amount of wood. Another advantage: the transgenic lignin has a structure similar to normal lignin, meaning that standard pulping methods can still be applied.\p
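As a toy check on those ratios - a sketch only, since real pulp yields depend on more than the lignin and cellulose fractions, which is presumably why the reported gain is nearer 15% than the naive 20% this gives:\p
def cellulose_share(lignin, cellulose):
    # Cellulose as a share of the lignin + cellulose total.
    return cellulose / (lignin + cellulose)

normal = cellulose_share(1, 2)       # about 67% cellulose
transgenic = cellulose_share(1, 4)   # 80% cellulose
print(f"normal: {normal:.0%}, transgenic: {transgenic:.0%}, "
      f"relative gain: {transgenic / normal - 1:.0%}")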
\BKey name:\b Vincent Chiang.\p
#
"Earthquake analysis",1152,0,0,0
(Jul '99)
The only way to understand earthquakes is to study how they happen, but we still have less than the full picture. A study in \INature\i at the end of July revealed new details about subduction zones - the places where one tectonic plate is pushed down, deep into the earth, and where the world's largest and most dangerous earthquakes take place.\p
It appears that the key properties of the fault zones change systematically with depth. This means that we get very different types of earthquakes, depending on the depth at which a fault ruptures. Thorne Lay and Susan Bilek analyzed the records of hundreds of earthquakes that occurred along subduction zones in Japan, Alaska, Mexico, Central America, Peru, and Chile, and found that the rigidity of the rock and sediments in the area of contact between the two plates increased steadily with depth in all six subduction zones.\p
Just as rigidity affects the way a bicycle transmits vibrations (see \BTour de France science\b, July 1999), so it affects the way that earthquake vibrations are transmitted. Both the duration of the rupture and the speed of the resulting seismic waves depend on this rigidity in the rock.\p
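The standard elastic relation behind this is that shear-wave speed grows as the square root of rigidity. A minimal sketch, with purely illustrative values for rigidity and density rather than figures from the Lay and Bilek study:\p
from math import sqrt

def shear_wave_speed(rigidity_pa, density_kg_m3):
    # Standard elastic relation: v_s = sqrt(mu / rho).
    return sqrt(rigidity_pa / density_kg_m3)

# Illustrative only: soft shallow sediments vs stiffer rock at depth.
for mu in (1e9, 1e10, 3e10):              # rigidity in pascals
    v = shear_wave_speed(mu, 2700.0)      # density in kg per cubic meter
    print(f"mu = {mu:.0e} Pa -> v_s = {v:,.0f} m/s")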
The shallower events rupture more slowly than those at greater depth, and it has been known for some time that large tsunamis may be generated by shallow earthquakes with abnormally long rupture durations. While some tsunamis are produced by an earthquake starting a submarine landslide, this is not always the cause, and it now looks as though some of the tsunami-causing earthquakes occur in regions of low rigidity at shallow depths.\p
All of the subduction zones have shallow regions of low rigidity, meaning that tsunami earthquakes can occur in many more places than previously expected. So there is nowhere to hide, but seismologists now have an extra tool to use in their earthquake probability calculations.\p
Bilek speculates that as pressure and temperature increase with depth beneath the surface, the sediments become compacted, which means water is squeezed out, and minerals in the sediments undergo major alterations, all of which can increase the rigidity of the subducted materials. Lay thinks that it is also possible the water content may affect the frictional mechanics of an earthquake, but this, he says, is more speculative.\p
#
"Mites take a pasting",1153,0,0,0
(Jul '99)
Biological control, as we all know, is an excellent way to keep a pest in check - most of the time. When an ill-advised Scot introduced \1gorse\c into Oregon, to form hedgerows, he had no idea of the problems he would cause. Since that time, the plant has run wild throughout the Pacific Northwest and Hawaii, forming dense thickets and crowding out native vegetation.\p
The same plant used to be a problem in New Zealand, but back in the 1980s researchers there found a tiny European spider mite which delights in eating gorse, and they released it, with US researchers following suit in 1994, letting swarms of the tiny arachnids loose at several Oregon gorse groves.\p
The only problem, an audience at the 10th International Symposium on Biological Control of Weeds was told during July, was that native predatory insects - and other mites - had learned to eat the newcomers. Worse, another introduced mite, brought in to control crop pests, appeared to have been eating the gorse mite as well.\p
\BKey names:\b Paul Pratt and the Oregon Department of Agriculture.\p
#
"A quick cold snap",1154,0,0,0
(Jul '99)
Around 8,200 years ago, two gigantic glacial lakes in Canada's Hudson Bay region suddenly drained in a catastrophic outpouring that seemed to have caused the most abrupt, widespread cold spell on Earth during the last 10,000 years. A paper in \INature\i in mid-July says that the lakes, Agassiz and Ojibway, contained more water than all of the Great Lakes combined, but when an ice dam from a left-over part of the Laurentide Ice Sheet collapsed, a flow of lake water rushed through the Hudson Strait and into the Labrador Sea. The flow, they believe, was about 15 times greater than the present discharge of the Amazon River.\p
The flow probably ran for about a year, lowering salinity and changing ocean circulation patterns. This would have interfered with heat transport in currents flowing from the tropics to temperate regions, and triggered an intense cold spell. Greenland ice core data show a drop of about 8°C or 15°F, while western Europe at the same time dropped by 3°C or 6°F.\p
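Those Fahrenheit equivalents follow from the usual rule that a temperature change in Celsius degrees is multiplied by 9/5 (with no 32-degree offset, since these are changes rather than absolute readings); a quick check:\p
def delta_c_to_f(delta_c):
    # Convert a temperature *change* from Celsius to Fahrenheit degrees.
    return delta_c * 9 / 5

for place, drop_c in (("Greenland", 8), ("western Europe", 3)):
    print(f"{place}: a {drop_c} C drop is a {delta_c_to_f(drop_c):.1f} F drop")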
The surface currents of the Atlantic act much like conveyor belts (see \BGlobal warming: could we lose the conveyor?,\b November 1997), carrying warm salty water up into higher latitudes from the tropics. This provides heat to areas which would otherwise be miserably cold. About one-third of the heat that warms Western Europe is delivered by the ocean, while the other two-thirds comes from the sun. Even away from the sea, winds warmed by the Atlantic warm parts of Western Europe.\p
The force of the currents can be seen by looking at Greenland, at equivalent latitudes to northern Canada and Sweden: Greenland, missing the warmth of the currents, is almost uninhabitable. So, if a huge volume of fresh water spilt into the Atlantic, the Conveyor would have been disrupted, and Europe would have suffered a cold snap for 200 to 400 years.\p
The evidence for the suspected catastrophe comes from "red bed" sediments underlying the ancient glacial lakes which were carried some 1,300 kilometers (800 miles) through the Hudson Strait by the massive freshwater surge, said Barber. Fossil shellfish from the Labrador seabed corresponding to the freshwater flood were radiocarbon-dated to about 8,200 years ago, and oxygen isotopes from the shells of tiny, plankton-like organisms from the same age of sediments showed the creatures lived in less salty water about 8,200 years ago.\p
\BKey names:\b Don Barber, with a large team of co-authors.\p
#
"Tour de France science",1155,0,0,0
(Jul '99)
Medical scientists were among those pleased by the result of the 1999 Tour de France bicycle race, as the winner, Lance Armstrong, had to overcome cancer to get himself into the winning position, but physicists had their say as well, converting Armstrong's energy requirements (about 2.5 times the normal intake) into jelly donuts. With each jelly donut carrying about 250 kilocalories, or roughly 1000 kilojoules, a normal adult needs about ten jelly donuts worth of energy each day, while a top cyclist competing at that level would need something over 30,000 kilojoules.\p
Of that energy, a biker burns off about 1 or 2 jelly donuts worth in overcoming friction and manipulating his bike; the actual pedaling work requires about 6 jelly donuts of energy, and the rest is wasted, mostly as heat. The heat, of course, means perspiration, and that means drinking about 15 liters (4 gallons) of water each day - the wind streaming over the riders' faces and bodies evaporates the sweat, carrying the heat away and cooling the cyclists.\p
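For the record, here is the donut arithmetic as a rough Python sketch, using the figures above (250 food calories, that is, kilocalories, per donut):\p
KJ_PER_KCAL = 4.184
donut_kj = 250 * KJ_PER_KCAL        # about 1,000 kJ per jelly donut
adult_kj = 10 * donut_kj            # a normal day's intake, about 10,000 kJ
rider_kj = 30_000                   # a Tour rider's day, per the figures above

print(f"one donut: {donut_kj:,.0f} kJ")
print(f"a Tour rider needs about {rider_kj / donut_kj:.0f} donuts a day, "
      f"against {adult_kj / donut_kj:.0f} for the rest of us")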
But while we can explain what energy is needed and where it goes, we can do very little about improving the human performance. Any further gains will have to come from technology, as Armstrong's did. Like other American cyclists, he rode a machine made up of tubes created from carbon fiber, which means a bike that requires less energy to move. The carbon fiber bikes are both lighter than metal and extremely rigid, or stiff.\p
This stiffness is important in allowing a more efficient transfer of energy from the pedals to the frame, but stiffness is a disadvantage when a rider has to go over bumps. So the cycles used by the American team had the "direction" of the stiffness controlled. Where metal is the same in all directions, the carbon fibers can be stiff in one direction, and flexible in another, so Armstrong's bike resists the twisting forces he uses to move forward, but remains flexible in the up-and-down direction. And that, say the makers, is the secret of seriously fast bicycle riding.\p
#
"Life, the Universe, and everything else: mathematical truths",1156,0,0,0
(Jul '99)
This is more of a conundrum than a news story, and it depends on readers understanding the significance of the apparently erroneous sum: 6 x 9 = 42. (This will make sense to people who have read the works of \1Douglas Adams\c, especially \IThe Hitch Hiker's Guide to the Galaxy\i and its successors.)\p
Your reporter posted to an Internet list in early July that he had noticed that the relationship 6 x 9 = 42 is true if the calculations are performed in base-13 notation. A list member, known only as Merlyn, responded by pointing out that this can go further, and that there is a pattern to be observed.\p
If we take the statement "six times x = forty-two," and vary the value of the base, we find a number of values of x which satisfy the statement, and these form a pattern when we examine both x and the base used.\p
six times x equals forty-two is true when:\p
x = 5 in base 7; x = 7 in base 10; x = 9 in base 13; x = 11 in base 16; x = 13 in base 19; x = 15 in base 22; x = 17 in base 25; x = 19 in base 28; x = 21 in base 31; x = 23 in base 34; x = 25 in base 37; x = 27 in base 40; and x = 29 in base 43.\p
The pattern continues beyond this point, and it \Iis\i an elegant pattern. Explaining it will require finding a formula for each of x and the nominated base in terms of its order n, in the pattern.\p
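For readers who would rather check than wait (a spoiler, so look away if you want to derive it yourself): one pair of formulas that fits is x = 2n + 3 with base 3n + 4, which works because 6(2n + 3) = 12n + 18 = 4(3n + 4) + 2, i.e. the digits "42" in that base. A short verification sketch:\p
# One pair of formulas fitting the pattern: x = 2n + 3, base = 3n + 4.
for n in range(1, 11):               # int() only parses bases up to 36
    x, base = 2 * n + 3, 3 * n + 4
    assert 6 * x == 4 * base + 2     # the digits "4" and "2" in that base
    assert 6 * x == int("42", base)  # cross-check with Python's parser
    print(f"six times {x} is written 42 in base {base}")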
#
"Population news",1157,0,0,0
(Jul '99)
According to some estimates, the world population reached 6 billion on 21 July 1999, while other estimates set the date as some time in October. In either case, it is probably now safe to refer to the world's population as 6 billion.\p
#
"Implanted defibrillators safe enough around anti-theft systems",1158,0,0,0
(Jul '99)
If you are one of the 400,000 people around the world who have an implanted defibrillator - a device that shocks your heart to regulate its rhythm - then yes, it is safe to walk through anti-theft systems. That is the thrust of a report in \ICirculation,\i the journal of the American Heart Association, in July.\p
There are twice as many of the anti-theft systems as there are implanted devices in the world, and until now wearers have tended to give the gates as wide a berth as possible, but Douglas P. Zipes, the study's lead author, says, "There is absolutely no danger from a slow stroll through the gates, even if it takes 10 or 15 seconds."\p
He does, however, recommend that people not loiter in the vicinity of one for too long - just in case.\p
#
"New tick-borne pathogen",1159,0,0,0
(Jul '99)
Worried about Lyme disease? If so, prepare to get more worried, because the \INew England Journal of Medicine\i has revealed another potentially fatal disease spread by ticks - one previously thought to affect only dogs.\p
The bacterium is \IEhrlichia ewingii,\i a relative of \IE. chaffeensis,\i which causes a disease called ehrlichiosis, first reported in 1986, and now widely spread in the United States, with the highest incidence in Missouri. It may well occur in other places as well, because the symptoms can be mistaken for influenza and other diseases. White blood cell counts can give a hint, but clinicians use the polymerase chain reaction to search the patient's blood for a ribosomal gene which is typical of several \IEhrlichia\i bacteria.\p
The tick spreading the bacterium is yet to be identified, and the extent of the new disease is as yet unknown. A screening study over a number of years found ehrlichiosis in 60 of 413 people screened, and four of those turned out to be carrying the new bacterium, according to a report released on the Internet.\p
See also: \1Lyme Disease (Travel Information)\c.\p
#
"First complete physical map of a higher plant genome",1160,0,0,0
(Jul '99)
In early July, \INature Genetics\i published the first-ever complete clone-based physical map of a plant genome. The efforts of a combined German-American team have produced a map which covers the entire nuclear genome of the higher plant \IArabidopsis thaliana.\i The team also drew on other work carried out by members of the "\IArabidopsis\i community".\p
This is the first physical map, for any organism, assembled entirely on the basis of BAC (bacterial artificial chromosome) clones, which are the premier system for cloning and maintaining large stretches of genomic DNA.\p
The existing \IArabidopsis\i physical maps were predominantly based on YACs (yeast artificial chromosomes, a system for cloning and maintenance of large DNA fragments). The new map represents the genome as a set of 8,285 overlapping BAC clones.\p
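As a rough sense of scale - and these are our illustrative assumptions, not figures from the paper - a typical BAC insert runs to about 100 kilobases, and the \IArabidopsis\i nuclear genome to roughly 130 megabases, so 8,285 clones imply several-fold redundant coverage, which is what allows overlaps to be found and a tiling map to be assembled:\p
# Back-of-the-envelope coverage estimate. Both sizes are assumptions
# for illustration, not figures from the Nature Genetics paper.
genome_bp = 130e6      # rough Arabidopsis thaliana nuclear genome size
insert_bp = 100e3      # typical BAC insert size
clones = 8285          # clones in the published map

print(f"approximate coverage: {clones * insert_bp / genome_bp:.1f}x")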
#
"August, 1999 Science Review",1161,0,0,0
\JThe first images from Chandra\j
\JMars has changed\j
\JSpace race 1\j
\JSpace race 2\j
\JHIV and breast milk\j
\JWillie Morris lives on\j
\JCinnamon against E. coli O157:H7\j
\JDrug-resistant TB in Russia \j
\JNew rice strains and vitamin A and iron deficiency\j
\JA gene for salt tolerance\j
\JGetting DNA into the nucleus without viruses\j
\JMaking fat mice lose weight shows the way for humans\j
\JThe Shroud of Turin: a botanist's view\j
\JThe value of child labor\j
\JComputers and chemistry: a look backwards and forwards\j
\JNear-critical water as a solvent\j
\JSerious computing\j
\JSerious publishing\j
\JIs there life under the ice?\j
\JA new primate genus\j
\JDid a nearby supernova cause a mini-extinction?\j
\JTurkey earthquake\j
\JHow dioxin kills\j
\JTrouble for the world's turtles \j
\JA new tree of life for plants\j
\JNearly half of Earth's land has been transformed by humans\j
\JAre we in the middle of a mass extinction?\j
\JUS drought may be the century's worst\j
\JCockroaches and catnip\j
\JKindness to animals in India\j
\JA diet of worms?\j
#
"The first images from Chandra",1162,0,0,0
(Aug '99)
The Chandra X-ray observatory, NASA's newest Great Observatory, and named in honor of the late Nobel laureate Subrahmanyan Chandrasekhar, was declared to be in excellent health with its instruments and optics performing up to expectations when the first images were released on August 26. The images included the aftermath of a gigantic stellar explosion in such stunning detail that scientists can see evidence of what may be a neutron star or black hole near the center, and a powerful X-ray jet blasting 200,000 light years into intergalactic space from a distant quasar.\p
One of the first images taken after the telescope's sunshade door was opened last week was of the 320-year-old supernova remnant Cassiopeia A, which astronomers believe was produced by the explosion of a massive star around 9400 years ago. Cas A, as it is known, is 9100 light years away, so the light of the explosion should have reached Earth some 300 years ago, but no sighting was recorded.\p
The Cas A supernova remnant is a puzzle for astronomers. By the time the supernova was first visible here, there should have been plenty of capable astronomers around, so why did observers miss seeing Cas A explode, some 300 years ago? Where is the pulsar or black hole that most supernovae leave behind? The answer to the second question is now available, with a suspected pulsar showing up in the middle of the Chandra images of the remnant.\p
According to Harvey Tananbaum, Director of the Smithsonian Astrophysical Observatory's Chandra X-ray Center, the image shows the " . . . collision of the debris from the exploded star with the matter around it, we see shock waves rushing into interstellar space at millions of miles per hour, and, as a real bonus, we see for the first time a tantalizing bright point near the center of the remnant that could possibly be a collapsed star associated with the outburst." This level of enthusiasm is the astronomical equivalent of shouting from the roof-tops.\p
Material from the cataclysm would have blasted out at 10 million miles an hour - about 4500 km/second - causing violent shock waves and creating a vast and extremely hot bubble of gas which emits the X-rays by which Chandra sees the remnant. The gas also contains heavy elements which produce X-rays of specific energies: silicon, sulfur, argon, calcium, and iron are among the elements detected so far.\p
A degree of mystery remains, because Cas A has too much titanium-44, compared with the amount of nickel-56, according to our understanding of nucleosynthesis in a star. The Compton Telescope aboard the Compton Gamma Ray Observatory has observed emission lines at 1.156 MeV, corresponding to scandium-44 decaying into calcium-44, the last step in titanium-44's decay chain.\p
And when observations go against theory, that is generally a good indication that the theory needs a few fine adjustments: with any sort of luck, the Chandra instruments may be just what we need to unravel this. Chandra's instruments can measure these X-rays precisely, and this tells us how much of each element is present. With this information, astronomers can then investigate how the elements necessary for life are created and later spread throughout the galaxy by exploding stars.\p
The other main image showed a distant and very bright quasar, PKS 0637-752, a single star-like object, with a powerful X-ray jet blasting into space. The quasar radiates as much power as 10 trillion of our suns, energy which scientists believe comes from a supermassive black hole at its center. The Chandra image, combined with radio telescope observations, should help us understand how supermassive black holes can produce such cosmic jets.\p
Chandra combines a large mirror area, accurate alignment, and efficient X-ray detectors to achieve 10 times better resolution and 50 to 100 times more sensitivity than any previous X-ray telescope. With a resolving power of half an arc-second, it could let us "read a newspaper from half a mile [800 meters] away or see the letters of a stop sign from 12 miles [20 km]", according to NASA publicity. This means it is able to provide images of events which happened far away and long ago, extending our appreciation of the universe.\p
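Claims like the newspaper one come straight from the small-angle formula: the smallest feature resolvable at distance d, with angular resolution theta (in radians), is roughly d times theta. A sketch using NASA's figures:\p
from math import radians

ARCSEC_RAD = radians(1 / 3600)   # one arcsecond in radians

def resolvable_size_m(distance_m, resolution_arcsec):
    # Small-angle formula: feature size ~ distance * angle (radians).
    return distance_m * resolution_arcsec * ARCSEC_RAD

for label, d in (("newspaper at 800 m", 800.0), ("stop sign at 20 km", 20_000.0)):
    print(f"{label}: features down to {resolvable_size_m(d, 0.5) * 1000:.0f} mm")
\p
Two millimeters at half a mile is indeed newsprint-sized type.\p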
Launched from a shuttle mission in late July, Chandra's elliptical orbit takes it once around Earth every 64 hours, traveling as much as 200 times further away from the home planet than its visible light-gathering sister, the Hubble Space Telescope.\p
The Cas A remnant is a very faint nebula, so when Charles Messier was making up his famous catalog of blurry objects that should not be mistaken for comets in 1784-6, he missed it. Karl Jansky mapped it in the 1930s, during some of his pioneering work which founded radio astronomy, but Jansky's mapping was only precise enough to locate the object in the constellation Cassiopeia, and so it became Cas A. In the 1950s, the Cambridge University third catalog listed it as 3C 461.\p
The Uhuru Small Astronomy Explorer launched in 1970 "saw" the same source, and it gained the name designation 3U 2321+58 (in the 3rd Uhuru catalog). The numbers give the position in right ascension (23 hours, 21 minutes) and declination (58 degrees up) from Earth's equatorial plane. To complete the list of aliases for this object, Cas A has also been designated G111.7-2.1 in the galactic coordinate system.\p
The remnant is made up of an incomplete shell of expanding gas with compact knots of material at temperatures up to 28 million K (50 million °F). Its outer shell is expanding at 800 km/s (about 1.8 million mph).\p
#
"Mars has changed",1163,0,0,0
(Aug '99)
Fascinating new images from the Mars Global Surveyor spacecraft were released in early August. These show that the red planet is a very different place now, compared with what it was two years ago when the spacecraft arrived, and Mars is now revealed as a constantly changing world where sand dunes shift, monster dust storms swirl, and frosts and polar ice caps grow and retreat with the seasons. We already knew Mars as a cold, dry desert, but we now also know that it is far from being the rather stagnant place the Viking spacecraft saw in the late 1970s.\p
The Mars Global Surveyor's Mars Orbiter Camera (MGS MOC) is now into the mapping phase, taking shots with a resolution of just a few meters. The mapping is planned to extend through a full Martian year, from March 1999 to March 2001, but already the data have scientists at NASA very excited. According to Michael Malin, principal investigator for the Mars Global Surveyor camera at Malin Space Science Systems, the record now shows us "seasonal and meteorological events, which demonstrate that Mars is active and dynamic today." Like Earth, Mars has seasons, but its summers and winters are, of course, tied to the Martian year, which runs nearly twice as long as our own.\p
The spacecraft's wide-angle cameras are able to monitor Martian weather, just as terrestrial satellites monitor Earth's weather, and the last two months have shown much more variable weather as the southern hemisphere of Mars moved into spring, and autumn gathered force in the north. Storm clouds gathered over the north pole of Mars during our July, and this cloud cover will grow as ever-larger portions of the north pole are plunged into darkness. It may even begin to snow in the north.\p
Elsewhere, the weather forecast is for dust devils, whirlwinds that develop when hot air rises off the ground in generally light winds. Terrestrial dust devils are minor, but on Mars, they are probably the main reason why the Martian sky, as seen by the Mars Pathfinder and Viking landers, appears a distinctly unearthly brownish color, with the dust devils dragging large amounts of fine, pinkish dust into the sky. Last May, swirling columns of dust as high as 8000 meters (five miles) were observed in northern Amazonis Planitia. Most dust devils probably only reach an altitude of 2000 meters (about 6500 feet), and a typical Martian dust devil probably carries several tonnes of dust with it.\p
Yet while the dust devils can move dust, they are far less powerful than the tornadoes which cause so much havoc on parts of our own planet. We have evidence of powerful winds in some areas of Mars, in the form of sand dune fields which appear to be shifting across the Martian landscape. Some of the new views also show dark dunes poking up through the carbon dioxide frost near the south pole as the spring thaw progresses, an effect seen last year at the north pole as spring approached there.\p
A variety of new images of Mars is available on the Internet at: http://www.msss.com/ and these include detailed shots of the area near the south pole where the Mars Polar Lander is due to touch down on December 3, 1999.\p
#
"Space race 1",1164,0,0,0
(Aug '99)
With the last cosmonauts to be stationed at Mir leaving on August 28, the Russian Government has announced that it will junk the space station next year, with one last group of visitors altering the orbit to plunge the space station down into the Pacific Ocean.\p
There remains a slim hope that the station may be maintained, but if this does not happen, the future for the Russians in space will be on the International Space Station, bringing to a close the "space race" which began back in the 1950s.\p
The core of Mir was launched in February 1986, and with the financial problems faced by Russia, the more recent cosmonauts have echoed an idea first put forward in Robert Heinlein's "The Man Who Sold the Moon", where the moon was to be turned into a giant Coca-Cola logo. In this case, it was the opposition, Pepsi-Cola, which was featured when they filmed a commercial on board the space station. Later advertisements were for bananas, milk and pretzels, and the cosmonauts even sold space pens over a home shopping network.\p
Perhaps Mir will feature in one final space race, the race for naming rights on the plunging space station?\p
#
"Space race 2",1165,0,0,0
(Aug '99)
If the traditional "space race" of the cold war era is about to end, a new kind of space race may be about to begin, though this will be a race not between ideologies, but between two competing technologies. The competition is between using a "sail" to catch the solar wind with continuing gentle acceleration over a long period, and relying on rocket power for short and furious acceleration followed by long-term coasting as a means of getting somewhere. Up until now, all our spacecraft have used this second technology, but it is just possible that a solar sail craft, launched in the next few years, could be the first human-made object to venture into inter-stellar space.\p
NASA says the sail idea has been around since the 1980s. For many people, the notion of a sail catching the solar wind traces back to another science fiction story, Arthur C. Clarke's "Sunjammer", published in 1966 in \IAmazing Stories,\i though an earlier version of the same idea was published in 1964, under the title "Dashing and Coasting to the Interstellar Finish Line." The idea goes back even further, with a writer identified as G. Marx said to have proposed a sail, driven by an earth-based laser, in \INature\i in July 1966. For the record, the idea of a lightsail was first suggested by Tsiolkovsky and Tsander in 1924.\p
Whichever original source you honor, NASA is now looking seriously at the idea, and Dr. Robert Winglee of the University of Washington has been given a $500,000 grant to continue work on the Mini-Magnetospheric Plasma Propulsion (M2P2) concept, where the craft makes a magnetic "wall," rather like Earth's magnetosphere, and the solar wind pushes on this. The idea is to create a magnetic bubble to deflect the solar wind.\p
Right now, the first four human-made objects to head for the edge of the solar system are the Pioneer 10 and 11 probes (launched March 3, 1972 and April 6, 1973, respectively) and the Voyager 1 and 2 (September 5 and August 20, 1977, respectively). None of these craft has yet reached the heliopause, the tenuous shock wave in deep space where the solar wind encounters the interstellar medium that fills the rest of our galaxy.\p
The Earth's magnetic field traps a large volume of electrified gas, forming the magnetosphere. Then it forces the solar wind to flow around it. As a result, the wind actually pushes on the Earth through the magnetic field lines, but the push is so tiny compared with the Earth's mass that it has no measurable effect.\p
The M2P2 craft would have a magnetic field of 0.1 tesla (about 1000 times stronger than Earth's magnetic field), generated by a conventional solenoid. One bottle of just 3 kg (6.6 pounds) of helium as the plasma fuel would provide the material needed by the helicon plasma source for three months. As the solar wind varies, so the plasma bubble will vary in size, maintaining a steady force of 1 newton on the 100 kg craft. The energy needed to operate the electromagnet and the plasma generator (around 3 kilowatts) would come from solar cells.\p
According to Winglee, there is enough power in the solar wind to accelerate a 136 kg (300 pound) spacecraft to speeds of up to 288,000 km/h (180,000 mph), or 6.9 million km (4.3 million miles) a day. By contrast, he says, the space shuttle travels at about 7.7 km/s (17,300 mph), or some 665,000 km (415,000 miles) a day.\p
It would take around ten years for such a craft to reach the heliopause, where the solar wind runs into the interstellar wind, about 150 Astronomical Units (AU). Voyager 1, launched in 1977, will get there in 2019, so there is a window of opportunity for the would-be racers, which closes in 2009.\p
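The ten-year figure squares with the quoted cruise speed, as a rough sketch shows. The AU length is standard; the other numbers are those quoted above, and acceleration time is ignored, so this is really a lower bound:\p
AU_KM = 149.6e6                  # one Astronomical Unit in kilometers
heliopause_km = 150 * AU_KM      # about 150 AU, per the figures above
km_per_day = 6.9e6               # the quoted top cruise rate

years = heliopause_km / km_per_day / 365.25
print(f"time to the heliopause at cruise speed: {years:.1f} years")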
See also: \BAstronomical Unit, Arthur C(harles) Clarke, Konstantin Eduardovich Tsiolkovsky.\b\p
#
"HIV and breast milk",1166,0,0,0
(Aug '99)
For many years, health authorities have encouraged mothers in developing countries to breast-feed their children, saying that infant formulas tend to be misused, leading to malnutrition and infection. A report in late August in the \IJournal of the American Medical Association\i leaves this strategy open to question.\p
The researchers looked at HIV-infected mothers and their babies in Malawi, and found that an infant's risk of becoming infected with HIV through breast-feeding is highest during the first few months of life. In the United States, where safe alternatives to breast milk are plentiful, HIV-infected women are advised against breast-feeding their infants. In countries like Malawi, either solution is fraught with risk, making this study an important one, since it allows a rational choice of the lesser risk.\p
The study looked at children of HIV-infected mothers, provided that the babies tested negative for HIV at their first visit, six weeks after birth. The researchers' aim was to examine breast-feeding-related HIV infections, and positive HIV tests during the first weeks of life can result from infection which occurred during pregnancy or childbirth as well as through breast-feeding.\p
Over the next two years, 47 of the 672 infants in the study became HIV-infected from breast-feeding. Of these infections, 21 occurred in the first five months, another 15 between months 6 and 11, and a further seven between months 12 and 17. No babies in the study became infected with HIV after they stopped breast-feeding.\p
Younger mothers were more likely than older mothers to transmit HIV, and mothers with four or fewer children were also more likely to transmit the virus. The researchers believe that mothers who are relatively less experienced with breast-feeding are more likely to have subclinical mastitis, an inflammation of the mammary tissue which may be related to the transmission of HIV. Another recent study showed that in a different population of Malawi women, subclinical mastitis was associated with higher HIV levels in breast milk and higher HIV transmission to breast-feeding infants.\p
As always, there is a need for balance: in this case, between the risk of HIV transmission on the one hand, and an increased risk of illness and death from respiratory and diarrheal diseases on the other, since antibodies and other factors in breast milk help protect against these. In the longer term, a vaccine will probably turn out to be the best answer of all.\p
#
"Willie Morris lives on",1167,0,0,0
(Aug '99)
Writer Willie Morris died in early August, but two Mississippi men are likely to be among the first to line up for the movie of Morris' book \IMy Dog Skip,\i which is set to be released in January 2000. The movie will star Kevin Bacon, Diane Lane, and the dog from TV's \IFrasier.\i\p
In an unusual move, American medical authorities have persuaded Morris's family, and the two Mississippi men, to "go public" about the donation and transplantation of the writer's two corneas following his death. The family, the authorities, and the recipients have all agreed to this unusual step because of the shortage of corneas which exists not only in America, but all over the world.\p
The main problem is that the families of possible donors assume that removing the cornea will disfigure the corpse, but this is not the case. The cornea is just a window for the eye, which lets light in, as well as helping to focus it. If a cornea is damaged in some way, its owner has trouble seeing through it.\p
Cornea transplants, say doctors, are the most successful type of organ transplant. They grant the gift of clear vision, whether that is the gifted writer's vision of a Willie Morris (who must surely now be within six degrees of Kevin Bacon), or the everyday vision of any one of us. Donors, they say, do not even need to have particularly good vision themselves - just the vision to wish to bestow a precious gift on somebody else.\p
#
"Cinnamon against E. coli O157:H7",1168,0,0,0
(Aug '99)
When people in the Middle Ages used spices to disguise the taste of "off" meat, perhaps they knew exactly what they were doing. A paper presented to the (US) Institute of Food Technologists' 1999 Annual Meeting in late July claimed that cinnamon is a powerful weapon against the lethal bacterial strain, \IE. coli\i O157:H7, and may be able to help control it in unpasteurized juices.\p
In apple juice samples inoculated with about one million \IE. coli\i O157:H7 bacteria, about one teaspoon (0.3%) of cinnamon killed 99.5% of the bacteria in three days at room temperature (25°C). When the same amount of cinnamon was combined with either 0.1% sodium benzoate or potassium sorbate, preservatives approved by the US Food and Drug Administration, the \IE. coli\i were knocked out to an undetectable level.\p
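Food microbiologists would normally express a kill like that as a log reduction; a quick sketch of the quoted figures:\p
from math import log10

start = 1_000_000                 # inoculated E. coli O157:H7 per sample
remaining = start * (1 - 0.995)   # 99.5% killed by cinnamon alone

print(f"surviving cells: {remaining:,.0f}")
print(f"log reduction:   {log10(start / remaining):.1f}")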
See also \BSpices and health,\b Science Review, for earlier work by the same researchers.\p
#
"Drug-resistant TB in Russia",1169,0,0,0
(Aug '99)
A report published in early August in the \IMorbidity and Mortality Weekly Report\i of the US Centers for Disease Control and Prevention (CDC) says that tuberculosis (TB) is spreading at an alarming rate in Russia, and adds that the strains isolated from sites far apart in Russia show resistance to the main drugs in use against the disease right now.\p
Death rates are high, both near Moscow and in Siberia, and instances of multidrug resistant strains are becoming more common. It appears that traditional TB-fighting techniques are failing in Russia, and if the epidemic is to be stopped, new, expensive, second-line drugs and intensive patient monitoring will be needed. The problem, of course, is that Russia cannot afford these treatments. \p
That should worry the rest of the world, because so long as traditional methods are used, the Russians are just building up a larger and larger stock of TB bacteria, immune to an ever-larger number of antibiotics. Once we feared the time-bomb of Russia in the Cold War - now we need to fear the time-bomb that Russia may become, if there is a massive social breakdown which sends Russian refugees across the world, carrying resistant TB with them.\p
#
"New rice strains and vitamin A and iron deficiency",1170,0,0,0
(Aug '99)
Vitamin deficiency is a major cause of, or contributor to, maternal and childhood death, disease, and blindness in developing countries. New genetically-modified rice strains, able to improve the supply of iron and vitamin A in the human diet, were announced in St. Louis, Missouri in early August at the XVI International Botanical Congress.\p
These genetically-modified rice strains may provide tools which will help to reduce global rates of iron deficiency anemia (IDA) and vitamin A deficiency (VAD), especially in developing countries where the major staple food is rice. Professor Ingo Potrykus, from the Swiss Federal Institute of Technology's Institute for Plant Sciences, was the principal investigator for the two separate research teams conducting the vitamin A and iron research, funded by the Rockefeller Foundation.\p
The new rice varieties are to be made available, free of charge, to national and international agricultural research centers. In the next phase, the International Rice Research Institute (IRRI) in the Philippines will be the researchers' immediate partner for further development of the transgenic material into publicly available rice breeding lines.\p
Around the world, in a population of six billion, an estimated 2 billion people are anemic, while 3.7 billion are iron deficient, the majority of them women. In developing countries, almost half the children under the age of five are iron-deficient, and UNICEF figures indicate that IDA is at least a contributing factor in 20% of all maternal deaths, because anemia increases the risks of hemorrhage and sepsis during childbirth. And each year, more than one million children die from VAD-associated illnesses, while WHO figures indicate that as many as 230 million children are at risk of clinical or subclinical VAD.\p
This is, it should be stressed, a condition which can be simply prevented. If a child's diet is supplemented with Vitamin A, UNICEF figures show that the risk of death in susceptible populations drops by 23%. VAD is also the single most important cause of blindness among children in developing countries.\p
Even where death does not follow, IDA interferes with the immune response and reduces the physical and mental capacities of people of all ages. In infants and young children, even mild anemia can impair intellectual development. The main cause of IDA is an inadequate dietary intake of iron, but while iron supplementation can be used with wheat, the problem with rice is rather greater.\p
Rice produces the carotenoids which can be converted to vitamin A, but only in the green parts of the plant and not in the part of the rice grain which is eaten by humans. The result is that in rural areas where children are weaned on rice gruels, they are particularly prone to VAD since they eat little else. Adding to the effect, children in rural areas are seldom reached by vitamin A supplementation programs.\p
As well, vitamin A can be deadly in excess - there are several known cases of polar explorers killing and eating mammals (variously sled dogs, a polar bear, or seals) and then, after eating the animals' livers, becoming violently ill or dying. So any form of supplementation will need to be in the form of a precursor, such as the new rice strains offer. Nutritionists say that beta-carotene is not converted to retinol (the active form of vitamin A) if there is already a high level of retinol in the blood, so supplementing the normal diet with beta-carotene in foods should avoid any toxicity risk.\p
It is not enough for a chemist to reduce some food to ash and show that the ash contains sufficient iron, because much of that iron may not be available to whoever or whatever eats the food. The amount of bioavailable iron depends on two things: how much iron the food contains, and what proportion of that iron is actually absorbed during digestion.\p
Heme iron is mainly found in foods containing blood and muscle, which are in short supply in developing countries, and especially in rural areas. Much of the dietary iron in developing countries is non-heme iron, obtained from unrefined cereals including rice, from nuts, and from dark leafy vegetables, and this sort of non-heme iron is not absorbed well during digestion. \p
The foods which promote the absorption of non-heme iron, mainly fruits and vegetables rich in ascorbic acid, are often in short supply in developing countries. A further problem is that many legume staples and grains, including rice, are high in phytic acid (also known as phytate), which is a powerful inhibitor of iron absorption in the intestine.\p
Potrykus and his colleagues have added three genes to the rice plants: two from the daffodil and one from the bacterium \IErwinia uredovora,\i getting transgenic rice plants which produce sufficient beta-carotene - converted to vitamin A in humans - in the grain to meet total vitamin A requirements in a typical Asian diet.\p
The researchers then added a ferritin gene derived from the French bean, \IPhaseolus vulgaris.\i Ferritin is an iron storage protein found in many animals, plants, and bacteria. They also added a phytase gene from the fungus \IAspergillus fumigatus,\i which is heat stable and so survives to degrade the phytic acid in the cooked rice. As well, they boosted the grain's content of its own iron absorption-enhancing cysteine-containing proteins, a technique called overexpression.\p
The rice strains are still to be tested for any impact they may have on the environment and human health, but once they are over those tests, the new grain varieties will be distributed free of charge in developing countries. This will then allow local breeders to use traditional breeding techniques to transfer the characters of the beta-carotene and iron-enhanced rice into varieties adapted to local conditions.\p
Once in the possession of the farmer and plant breeder, according to the Rockefeller Foundation, the new varieties become their unrestricted property, to do with as they see fit. The farmers may, if they choose, use a portion of their harvests for further sowing, or they can make further crosses. Conveniently, the beta-carotene-rich rice grains are colored yellow by the beta-carotene, making selection for that factor remarkably easy.\p
Potrykus's research team focuses on rice, wheat, sorghum, and cassava and is using genetic engineering to contribute to the stabilization and increase of yield and to improvements in food quality, so their future progress will be worth watching. For more information on fortifying rice, see http://ccr.ucdavis.edu/biot/html/rice.html; for more on Potrykus and his work, see http://www.rereth.ethz.ch/biol/selb.potrykus/potrykus.proj_overview.html, and for a general account of Potrykus's work, see http://www.rereth.ethz.ch/biol/selb.potrykus/potrykus/pj.01.html; and follow the links from those pages.\p
The interesting thing about this piece of work is the way in which it meets a number of the standard criticisms of GM food technology. Usually, GM food work is done on crops which are found in temperate regions, and crops of the tropics and arid regions have been left alone, presumably because there is no financial reward in doing the work. Here, standard techniques, developed on more profitable crops, are now being used to benefit large slabs of the world population.\p
Next, the new developments do not offer any threat to the genetic diversity of crops, because the new genes will be introduced, using standard traditional methods of breeding. Equally importantly, control over the use of the new and improved varieties will lie with the farmers, not with a multinational combine. But most important of all, these strains have been produced to benefit the consumer, rather than to fill the coffers of a rich plant developer.\p
The standard model of GM foods depicted by its opponents is one based on greed and profit-grabbing, and certainly financial considerations do usually come first: this is why GM work up until now has not touched the crops of arid or tropical areas, where there is no profit to be had. The proponents of GM technology are described as greedy, holding onto control of all the power, and all the proceeds. This example of altruistic use of the standard techniques to benefit the women and children in farming families will hopefully be the model that prevails in the 21st century.\p
See also: \BVitamins in the News,\b Science Review, February 1999.\p
#
"A gene for salt tolerance",1171,0,0,0
(Aug '99)
Large parts of the world's best agricultural lands are under threat from saline irrigation water, or in the case of Australia, rising water tables which bring heavily saline ground water to the surface, where it "burns off" all of the crops and vegetation, leaving a desolate salt pan. Other affected areas include large parts of the Canadian prairies, the southwest United States, South America, Asia, and Europe. In total, affected areas make up 30% of the world's irrigated land, and these have all experienced serious declines in crop productivity in this century due to the effects of salinity.\p
Each year, some 10 million hectares (40,000 square miles) of land go out of production around the world because saline conditions will no longer support agriculture with normal crops. That is an area about the size of the North Island of New Zealand, rather larger than the whole of Ireland, half the size of Kansas in the USA or of Victoria in Australia, or about one fifth the size of France.\p
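The comparisons all hang off one unit chain (1 hectare = 0.01 km², and 1 square mile is about 2.59 km²); a quick sketch:\p
hectares = 10e6             # annual loss quoted above
km2 = hectares * 0.01       # 1 ha = 0.01 square kilometers
sq_miles = km2 / 2.58999    # 1 square mile = 2.58999 km2

print(f"{hectares:,.0f} ha = {km2:,.0f} km^2 = {sq_miles:,.0f} square miles")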
Even when the situation is less complex than that found in Australia, irrigation water usually contains dissolved salts, and traditional solutions to this problem - such as high-quality irrigation water and the installation of drainage systems - are expensive and inaccessible in many areas.\p
Attempts at conventional breeding to produce crops that adapt to saline conditions have also been unsuccessful, with only a few varieties produced, and those with only low salt tolerance. And as we move into an era when wars will be fought over potable water, it is worth noting that many artesian water sources are heavily salted.\p
Under these conditions, a recent Canadian discovery at the University of Toronto looks set to make the world a little safer for agriculture. A report in \IScience\i in mid-August \I(Science\i \B285\b (1999):1256-1258) from Eduardo Blumwald, who led the research group in the University of Toronto's Department of Botany, describes how the team isolated a gene which encodes a transport protein in plant cells. This protein, called the \I\BNa+/H+ antiport,\i\b prevents the sodium ions in salt from harming the cell, and so confers salt tolerance on the plant. The plant can then grow even in highly saline conditions, because the protein creates a balance of ions in the cell which draws water into the plant cell by osmosis.\p
The gene occurs naturally in the genetic workhorse, \IArabidopsis thaliana,\i which is normally salt-sensitive, so the researchers cloned the gene coding for the antiport and modified the plant to overproduce the antiport protein. This form of genetic manipulation should satisfy all but the most fastidious opponents of genetic manipulation, since all the researchers have done is to magnify the effect of a gene already present in the plant.\p
Most commercial crops (for example, corn, soybean, wheat, vegetables, and fruits) experience a significant yield loss in saline soil, and in most crops, the activity of the antiport transport system is very low. In practice, this means that the plants cannot drive water into the cell in the presence of salt, and instead lose water. From there, it is a downhill road, with reduced leaf size, a general decrease in growth, and ultimately, death.\p
As a rule, the environmental stress caused by salinity is one of the most serious factors limiting the productivity of crops, says Blumwald, concluding that " . . . this innovation could have significant implications for the agricultural world." The university has applied for worldwide patents on the discovery.\p
\BKey names:\b Eduardo Blumwald, Gilad Aharon, Maris Apse, and Wayne Snedden.\p
#
"Getting DNA into the nucleus without viruses",1172,0,0,0
(Aug '99)
The standard method in gene therapy involves using a virus, typically an adenovirus or an adeno-associated virus, to introduce new genes into a cell. A new development, reported in the September issue of \INature Biotechnology\i (released in late August), may offer a cheaper alternative to this. The method works on cells which are not dividing, and promises a way of "transfecting" cells, introducing foreign genetic material, without introducing either lipids or viral components into the target cell.\p
The carrier molecule used to transfer the gene combines a short genetic tag from a nuclear protein - the molecular key to the nucleus - with a standard marker gene. The combination is said to be 60 times as effective as any other non-viral agent in introducing the marker gene into cardiovascular cells so that it becomes active.\p
There are several steps in effecting a gene transfer: first, the DNA must be transported to the cell surface, then it must be taken up by the cell and packaged into endosomes, then the DNA must escape from the endosomes and be carried to the nucleus - and then it has to be carried into the nucleus, and it is this step that the new carrier deals with.\p
The idea is to deliver DNA using lipoplexes. A lipoplex is made up of a lipid, a fat-like molecule, complexed to something else - in this case, to plasmid DNA, a simple, circular form of the genetic material. The gene used in this development work is a "reporter gene" which codes for an enzyme called beta galactosidase, together with the nuclear targeting sequence of a ribonucleoprotein normally found in the nucleus, and this is delivered to endothelial cells. In the test, 80% of the cells treated made beta galactosidase. In itself, the beta galactosidase is unimportant, but its appearance indicates that the gene has been successfully transferred and is active - "being expressed", in the terms of the geneticists.\p
The 38-amino-acid-long nuclear targeting sequence binds to another protein called transportin, which ferries the entire plasmid DNA to the nuclear pore, where it then enters the nucleus. But while this has been achieved in a culture dish, there is still a way to go before the method can be used to treat a patient, though the fact that they are getting genes into endothelial cells is promising, as these cells form the lining of blood vessels, and so would be a convenient target for an injected gene therapy, which is what this research is all about.\p
\BKey names:\b Scott Diamond.\p
#
"Making fat mice lose weight shows the way for humans",1173,0,0,0
(Aug '99)
A study reported in September's \INature Medicine\i at the end of August has shown the way to a new approach in treating obesity. At present, most treatments and proposed treatments are directed at controlling appetite, while the new approach centers on the way in which the body stores and burns fat.\p
A gene known as POMC, found in both mice and humans, provides signals that are important both during development and also later in life, and the researchers were looking at the impact of POMC on the development of the brain when they discovered it also played a major role in weight regulation.\p
In simple terms, mice lacking POMC-derived signals are obese, but when they were treated with MSH, a hormone derived from the POMC gene, they returned to almost normal weight in a matter of weeks. The effect goes beyond appetite, and also influences fat storage.\p
The finding may have useful applications in a number of other conditions. People with Down syndrome often have difficulty controlling their weight; again, medications commonly used to treat epilepsy and various forms of depression often lead to unwanted weight gain, so the discovery may have applications well beyond the obvious ones.\p
#
"The Shroud of Turin: a botanist's view",1174,0,0,0
(Aug '99)
Early in August, at the XVI International Botanical Congress, botanist Avinoam Danin of The Hebrew University of Jerusalem described his studies on the \1Holy Shroud,\c or Shroud of Turin, thought by many to be the burial cloth of Jesus of Nazareth. Danin looked at the pollen grains and plant images associated with the Shroud, and has used these to date the cloth at "before the 8th century". A 1988 carbon dating process placed it in the Middle Ages - in the range from 1260 to 1390.\p
The review of plant and pollen evidence is to be published by the Missouri Botanical Garden Press as \IFlora of the Shroud of Turin\i by Danin, Alan Whanger, Mary Whanger, and Uri Baruch. The peer-reviewed publication will be available late in the northern summer.\p
Pollen grains are particularly distinctive, and they are also extremely robust, so their remains can often be used to identify the exact ecology of an area, or to pinpoint the type of location where the grains were gathered. Danin believes that flowers and other plant materials were placed on the Shroud of Turin, leaving pollen grains and imprints of plants and flowers on the linen cloth - as well as the apparent image of a crucified man, the cloth appears to carry images of plants.\p
Using a method called Polarized Image Overlay Technique (PIOT), Alan and Mary Whanger have concluded that the flowers were from the Near East region and that the Shroud originated in the early centuries of our era. Danin's work on the floral images, linked with the pollen analysis by Uri Baruch, leaves us with a combination of species that could be found only in the months of March and April in the region of Jerusalem.\p
There is a high density of pollen from the thistle \IGundelia tournefortii,\i which has bloomed in Israel between March and May for thousands of years. An image of the plant can be seen near the image of the man's shoulder. It has been suggested by the Whangers, who have researched the Shroud for decades, that this may have been the plant used for the "crown of thorns" on Jesus's head.\p
Two pollen grains of the same species were found on the Sudarium of Oviedo, widely accepted as the burial face cloth of Jesus. This object has a sounder provenance than the Shroud, as its location has been documented since the 1st century, and the cloth has been in the Cathedral of Oviedo in Spain since the 8th century. Both cloths carry stains of type AB blood - though ancient blood types can be hard to interpret in any reliable way - and the stains on the two cloths form a similar pattern.\p
Given these coincidences, it seems more likely that the two objects derive from the same time. While this does not make the objects any more genuine, it does counter the negative evidence on the Shroud derived from the carbon dating - a dating which Shroud supporters have rejected in any case, arguing that the Shroud could easily have acquired modern material which would confuse the dating method.\p
The Shroud also carries a clear image of the distinctive plant, \IZygophyllum dumosum,\i a plant which coexists with \IGundelia tournefortii\i in an area bounded by Jerusalem and Hebron in Israel and Madaba and Karak in Jordan. A third plant, \ICistus creticus,\i was also present, both as an image and as pollen, and this plant is also found only in the Jerusalem area.\p
Images of \ICapparis aegyptia\i flowers can also be detected on the Shroud. These show a distinctive pattern of opening during daylight hours, a pattern which stops when the flowers are picked and no water is supplied. The images support the view that the flowers were picked in the Judean Desert or the Dead Sea Valley between 3 and 4 pm on the day they were placed on the cloth.\p
Earlier religious art works often depict the same flowers, making Danin suspect that the images may have been clearer in past times, and known to the artists.\p
The Shroud of Turin is a linen rectangle measuring 4.35 meters by 1.1 meters. After appearing in Turin (Torino) in 1578, the Shroud was placed in a special chapel within the Italian cathedral of St. John the Baptist in 1694. Except for a brief period during World War II when the cloth was moved elsewhere for safety, the Shroud remained in this cathedral until the night of April 11, 1997, when it was removed, unharmed, during a fire. It was then kept elsewhere in the city until it was again placed in the cathedral for public display for two months in 1998.\p
#
"The value of child labor",1175,0,0,0
(Aug '99)
Robert Owen, the visionary who created New Lanark, allowed no child under eleven to work, while in modern times, we see 15 or 16 as the earliest age at which children should leave school. At the same time, most developed nations are comfortable with part-time jobs, both for younger children and for students over that age who are still at school. Yet even traditional student jobs, like mowing the lawn, babysitting, or delivering newspapers, may lower mathematics and science scores, according to a report delivered before the American Sociological Association in Chicago in early August.\p
The effect held up, even after controlling for family background effects, according to Dr. David Post, one of the researchers involved. Post and Dr. Suet-ling Pong looked at the National Educational Longitudinal Study (NELS). This was a 1988 random sample of eighth graders across the US that followed students through high school.\p
The detail in the NELS allowed distinctions to be drawn between light work, such as babysitting and delivering newspapers, and heavy work, such as farm work or construction. For boys, working during eighth grade had detrimental effects on achievement and on learning mathematics and science in the tenth grade, and there was a similar but lesser effect for girls.\p
Earlier studies, say the researchers, looked only at grades, and found no effect, while they concentrated on achievement, and so identified the effect. They suggest that the benefits normally assumed - developing a sense of responsibility and learning new skills from adults - are minimal when those supervisors are themselves only 18, and when the tasks are so de-skilled (for example, with coded keyboard cash registers in fast food outlets) that there are no longer any measurable gains.\p
There was also a suggestion that the effect is world-wide. The Third International Mathematics and Science Study (TIMSS) does not offer longitudinal data, but does look at achievements in mathematics and science for fourth, eighth, and twelfth graders in countries around the world. It also records whether the students worked outside school for pay.\p
The results show that those countries where students more often worked for pay had lower overall scores than those where eighth grade students did not work outside school, according to Post, who also detected the effect in girls, in those countries where girls are allowed to work outside the home.\p
While this is less than absolute proof, the findings do sound a note of alarm for educators if the teenagers in their schools are working as well as studying.\p
#
"Computers and chemistry: a look backwards and forwards",1176,0,0,0
(Aug '99)
Late August saw the 218th American Chemical Society National Meeting in New Orleans, where one of the highlights was a presentation by Raymond E. Dessy, professor of chemistry at Virginia Tech. Dessy won the first ACS National Award for Computers in Chemistry, and his talk covered the use of computers in chemistry, arguing that their time has come.\p
Computers are used in chemistry, not only to record a scientist's notes and data, but also to "mine" the information in institutional databases. As well, computers are used for the large scale testing of compounds during the advanced stages of new drug discovery. "It is a fascinating period," comments Dessy. "In 25 years, we have gone from computers with 4K of memory - which wouldn't hold today's screen savers - to the ability to process huge volumes of data."\p
The move to computers really began in the 1970s, with most research computing centers being distinctly chemist-unfriendly. As well, chemists had no idea back then of how to hook their instruments up to computers. The arrival of the PC started serious change, and now the needs of biochemistry are driving the new advances. As scientists try to identify the sequence of billions of base pairs in genes from the DNA of all manner of organisms, says Dessy, the processes involved have become more and more automated.\p
As well, modern drug discovery is now an automated process which involves sophisticated instrumentation. Once, a biologically active substance would be found by chance, and teams of chemists would then fiddle around the edges, trying to make variations that were better, more active, more directed to some target disease. Typically, each variation would cost $10,000 to create and test - and it might take 10,000 compounds to find one that made it through all the tests and trials to become a marketable drug. It was a process that took years.\p
Computerized drug discovery began with attempts to use computer graphics to visualize how drugs mated with their receptor sites on a cell. The next step was combinatorial chemistry (See \BA saving of time,\b April 1998), when chemists decided to use computers to manage huge numbers of combinations of drug precursors. Now chemists can screen 10,000 compounds a week.\p
Rather than chemists pipetting chemicals into a single test cell, now dozens of pipettes drop samples into test media contained in plates having 384 or 1536 wells. That costs on average around 50 cents per test, rather less than in the past, though the process is still a gamble - in fact, Dessy notes that one company calls its system "The Haystack" - but the tests are founded in fundamental biochemistry, and the search is now much faster. "Speeding up drug discovery is important business," says Dessy.\p
One problem still to be faced is the thorny issue of validation of data. Most of the recent cases of scientific fraud have been uncovered or confirmed by a careful examination of a scientist's notebooks, which must be dated and signed. As scientists "migrate" to electronic data storage, with information in word processor, spreadsheet, and database files, this becomes a worry: there are no easy ways of dating and validating the information, or showing who did (or added or changed) what.\p
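As a purely illustrative aside, one long-standing way to make electronic records at least tamper-evident is to chain each entry to its predecessor with a cryptographic hash, so that altering any earlier entry invalidates everything after it. The minimal Python sketch below shows the idea; the field names and the choice of SHA-256 are our assumptions for illustration, not a description of any actual laboratory product, and the self-reported timestamps would still need a trusted time-stamping service to carry legal weight.\p
import hashlib
import json
from datetime import datetime, timezone

def add_entry(chain, author, text):
    # Each entry records who wrote what, when, and the hash of the entry before it.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "author": author,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "text": text,
        "prev_hash": prev_hash,
    }
    # The entry's own hash covers its content *and* the previous hash,
    # so changing any older entry breaks every hash that follows.
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)

def verify(chain):
    # Walk the chain, recomputing every hash; any edit shows up as a mismatch.
    prev = "0" * 64
    for entry in chain:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev_hash"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

notebook = []
add_entry(notebook, "rdessy", "Ran assay 42; yield 83%.")
add_entry(notebook, "rdessy", "Repeated assay 42; yield 81%.")
print(verify(notebook))   # True - until anyone edits an earlier entry\p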
The advantage is that data can more easily be shared - Dessy maintains that few people could read his handwriting, but the electronic notebook is designed to be shared, and it speeds up the process of developing a discovery. The electronic lab book also makes it easier to prove discovery dates, to protect a patent, and it is easier to search through large amounts of data.\p
But under it all, there remains the problem: "What is original data?", and that problem is not going to go away. This is especially important in the areas of pharmaceuticals and gene technology, both areas of science subject to fine legalistic manipulation - in such a case, millions of dollars could be in the balance, depending on the reliability and validity of a few key pieces of data.\p
As a result, says Dessy, computers with zero-administration ability are being created for lab use. Many functions that home PC users take for granted are removed, to assure the integrity of the data chain.\p
Raymond Dessy is one of the more unusual minds around in science these days. His ideas can be explored on the Web at http://www.chem.vt.edu/chem-dept/dessy/honors/, where his Honors Colloquium, "Internet Impact," dealing with the sociological, political, and economic impacts of the WWW, is described. The links off that page are also worth exploring.\p
#
"Near-critical water as a solvent",1177,0,0,0
(Aug '99)
One of the major pollution problems in the world today comes from the solvents used in the chemical processes we rely on for every aspect of life, from plastics to microchips and computers, to many of the foods we eat, and even the clothes we wear.\p
Some solvents, like water, are comparatively harmless, but other solvents become a serious problem when they escape, or are disposed of. The problem stems from the fact that there are two main types of things needing to be dissolved: compounds like salts, which dissolve in water, and non-polar compounds like oils, which do not dissolve in water at all. Under normal circumstances, the non-polar compounds require organic solvents that are themselves potential future pollution problems.\p
Now it turns out that water can dissolve oils as well, so long as it is at its near-critical point, heated to around 250-300°C (480-570°F), and at a pressure of around 6.7 megapascals (1000 psi). In other words, water comes into its own in a whole new range of manufacturing processes, so long as the conditions are right.\p
The critical point of water is 373.99°C (705.18°F) and a pressure generally given as 218 atmospheres, or more precisely as 22.064 MPa, some 3200 psi. Above this point, water is neither a liquid nor a gas, and its properties become interesting, while below that point, it is still water - but as it approaches the critical point, its properties are very interesting indeed.\p
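For readers who want to check these figures, the conversions are simple arithmetic; here is a minimal Python sketch using the standard conversion factors (1 atm = 101,325 Pa; 1 psi = 6,894.76 Pa).\p
# Cross-checking the near-critical and critical-point figures quoted above.
PA_PER_ATM = 101_325.0   # pascals in one standard atmosphere
PA_PER_PSI = 6_894.76    # pascals in one pound per square inch

def mpa_to_atm(mpa):
    return mpa * 1e6 / PA_PER_ATM

def mpa_to_psi(mpa):
    return mpa * 1e6 / PA_PER_PSI

def c_to_f(c):
    return c * 9 / 5 + 32

print(round(mpa_to_atm(22.064)))   # 218 atm at the critical point
print(round(mpa_to_psi(22.064)))   # about 3200 psi
print(round(mpa_to_psi(6.7)))      # about 970 psi - the "around 1000 psi" above
print(round(c_to_f(373.99), 2))    # 705.18 F\p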
At the 218th American Chemical Society national meeting in New Orleans in late August, Dr Charles Eckert described work that his research team have been doing in this area, with a view to establishing the limits of such applications. \p
Both the gentle nature of water and its strength as a solvent depend on its unique system of hydrogen bonding. When water is heated, its normally strong hydrogen bonds weaken, allowing dissociation that forms acidic hydronium (H\D3\dO\U+\u) ions and basic hydroxide (OH\U-\u) ions. Close to the critical point, the amount of dissociation is three times what it would be at normal temperatures and pressures. The end result is that acid-catalyzed and base-catalyzed reactions are able to run in near-critical water without the addition of any mineral acid.\p
As well, the near-critical water has properties similar to those of polar organic solvents like ethyl alcohol or acetone. The dielectric constant drops from 80 to 20, and the density drops from one gram per cubic centimeter to around 0.7 gram per cubic centimeter. So molecules that would normally not be soluble in the same solvent become soluble together in near-critical water and can be processed together. According to Eckert, almost all organic substances are either soluble or completely miscible in water above about 250°C.\p
One of the cleverest aspects of this approach is getting materials out of solution. This is usually the costliest part of the manufacturing procedure, but cooling the near-critical water is often all that is required to throw out one or more of the products as a precipitate that can be lifted and carried away, while water-soluble catalysts remain in solution.\p
Many acid-catalyzed reactions require the acid to be neutralized later, producing large amounts of various salts, which then need to be disposed of - that cost is also eliminated when near-critical water is the solvent.\p
One drawback, of course, lies in the large industrial investment in conventional processes. Companies cannot afford to abandon an existing manufacturing facility just because a more environmentally-benign process has been developed. That being said, new manufacturing plants could well be cheaper and more effective if they switched to near-critical water as their solvent.\p
\BKey names:\b Charles Liotta, Roger Glaser, James Brown, and Shane Nolen.\p
See also \BClean water,\b Science Review, \BPhases of matter, dielectric.\b\p
#
"Serious computing",1178,0,0,0
(Aug '99)
Is your house full of the sorts of people who read articles with a heading like this? If so, there are several old computers lying around, doing very little, in the attic, in the garage, or tucked away under the house. And best of all, your non-technophile neighbors throw such machines away. New super-high-power computing is now within your grasp!\p
No doubt you have struggled with your conscience as you walked past the forlorn and abandoned 486 and early Pentium machines on the streets, but now you need struggle no more - seize the machines, clasp them to you, and bear them proudly home, to add to your own Beowulf cluster. You, too, can feel the power of 5 GFLOPS for less than $50,000 - or even for much less, if you are prepared to settle for more, but older, machines in your home cluster.\p
A Beowulf cluster is a group of computers running as one, sharing the processing, and producing a single large powerful machine, and they are the flavor of the month. The machines are linked by 100BaseT Fast Ethernet, and the software of choice is Linux, the free Unix-like operating system.\p
A typical cluster may involve 16 nodes, each with a Pentium II or Digital Alpha 21164PC processor, between 128 and 512 Mbytes of DRAM, 3-30 Gbytes of EIDE disk, a PCI bus backplane, and an assortment of other devices. Of course, in a freeware area such as this, there really is no such thing as a typical setup, but if there were, that is how Beowulf clusters would look, right now, in this rapidly evolving culture. To find out what they look like next month, check again.\p
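To give a feel for how work is shared across such a cluster, here is a minimal sketch of message-passing parallelism - the programming model Beowulf machines run - written in Python with the mpi4py bindings. Early clusters would typically have used the MPI or PVM libraries from C or Fortran; the Python version is simply the most compact way to show the idea.\p
# Estimate pi by numerical integration, with the strips shared out
# across every process (node) in the cluster.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's identity, 0..size-1
size = comm.Get_size()   # how many processes share the work

n = 10_000_000           # total integration strips
h = 1.0 / n
# Each process sums every size-th strip, starting at its own rank.
local = h * sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(rank, n, size))

# Combine the partial sums on process 0 - the node with the keyboard and monitor.
pi = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi is approximately {pi:.10f}")\p
Launched across a 16-node cluster with something like mpiexec -n 16 python pi.py, each node computes a sixteenth of the strips, and the partial sums are combined on node 0.\p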
And now for a brief excursion into literature, because the literature of Beowulf clusters is full of names taken from the Old English poem of \IBeowulf.\i \p
The original Beowulf was a Geatish chap (later he would be king of the Geats, and deal summarily with a large dragon, or "worm") who dispatched a monster called Grendel, whose habits were rather less than sociable, and who had been attacking Heorot, the hall of King Hrothgar of the Danes. As Maurice Sagoff put it in his "Shrink Lit" account:\p
Monster Grendel's tastes are plainish. \p
Breakfast? Just a coupla Danish. \p
For more on the original Beowulf, visit this web site:\p
and for more Maurice Sagoff: http://www.rado.sk/old_english/texts/Humorous.html\p
In short, Beowulf represents a new breed of parallel computer, and we can expect to see this sort of thing becoming more common in the next century, as more software is written for this sort of environment, and more people become competent in its use.\p
Because of the nature of the culture, the best and latest details will always be found on the Internet, where FTP sites, newsgroups and Web sites abound, dealing with the basics and how-to's, as well as the bleeding edges, while many of those active in the area are still willing to lend a hand and provide advice. With free Linux operating systems and public domain software, costs are low, but there is no centralized support.\p
Note that we are not talking about a normal network here, with some distributed processing: we are talking about a system where 16 or more computers (1000 Pentium II computers in one planned genetics machine) form a giant machine, where one keyboard and one monitor reveal what is going on. You do not need Beowulf for your daily word-processing or Web-surfing, but if you are doing high-end graphics, this could be the easiest way for you to get a workstation which will do what you want.\p
And if these machines are beginning to sound like overkill, take a look around the Web for the latest on the \IUebermensch\i project, which aims to have twenty thousand processors in parallel, with the basic unit being a box with fifty processors inside. This scheme appears to be one to watch.\p
That wise man of computing, \1Alan Turing,\c wrote in 1950 that by around 2000, computers would have a storage capacity of about 10\U9\u binary digits - something over a hundred megabytes. Such a computer, said Turing, should be able to play his "imitation game" - the test now known as the Turing test - well enough to fool an average interrogator for five minutes.\p
The Beowulf clusters fit Turing's specification; \IUebermensch\i goes well past it.\p
Be afraid. Very afraid.\p
See also \BSupercomputer on the cheap,\b Science Review, June 1998.\p
#
"Serious publishing",1179,0,0,0
(Aug '99)
Several issues about publishing and publishing policies for journals have arisen this month. There was a small scandal when MIT's journal \ITechnology Review\i hit the newsstands with a headline "Biotech Goes WILD" and other tabloid-style sub-heads and blurbs. A typical example: "Genetic engineering will be essential to feed the world's billions. But could it unleash a race of 'superweeds'? No one seems to know. And nobody's in charge of finding out."\p
The article which follows is remarkably even-handed by comparison, but by then the damage may have been done, say the critics. Anybody seeing only the shouting cover headline will be influenced, while only those who buy the journal and read the article will get a balanced account. The problem, of course, is competition for sales in an over-filled market, where nobody can even buy all the specialist journals in their area of expertise, because journals, even technical and scientific ones, now exist to generate advertising revenue, rather than to disseminate scientific information.\p
Physicists have opted out of this treadmill of ever-more journals and ever-poorer library budgets by making major use of the Los Alamos National Laboratory ePrint Server (http://xxx.lanl.gov/ and http://xxx.adelaide.edu.au/ are two of the available sites - if you access the first site, it detects where you are and lets you know about the nearest available mirror site) to spread information and ideas. This is a highly intelligent interface, and well worth a visit.\p
Recently, the US National Institutes of Health proposed an electronic dissemination system, originally called E-biomed, now to be E-biosci, to spread ideas. But while it was originally to be an access source for preprints of unreviewed papers, it will now be a "reprint server" providing access to some of the content of certain existing publications that will join the project.\p
Three journals likely to join the scheme are \IMolecular Biology of the Cell, Plant Physiology,\i and \IProceedings of the National Academy of Sciences,\i but a number of commercial scientific journals, and many scientific society journals, have expressed strong opposition to the concept of a free major electronic journal resource. At the clamorous forefront is the US journal \IScience,\i which has put forward a choice selection of specious arguments to the effect that the government does not disseminate information for barbershops or delicatessens, and so should not do so for the sciences.\p
Libraries, which have to shoulder an ever-increasing burden of journal purchases, are less than impressed. There are even journals in the marketplace which manage to extract a profit from desperate academics willing to pay for publication, so the subscription gouged from a research institution is icing on a cake already paid for. The research institutions, however, are now starting to fight back.\p
SPARC (the Scholarly Publishing and Academic Resources Coalition) is one group saying that enough is enough. Describing themselves as fostering competition in scholarly publishing (the sub-text seems to be "opposing monopolies and cartels and sharks"), the group of libraries has an information sheet available at http://www.arl.org/sparc/factsheet.html, indicating the areas in which they are operating. So far, their partners include the American Chemical Society, which has produced a journal, \IOrganic Letters,\i available at http://pubs.acs.org/journals/orlef7/index.html. This offers tables of contents and abstracts, together with full "Acrobat Reader" PDF-format copies of papers, complete with complex chemical structures, free to employees of institutions which have taken out a subscription to the journal; a set of sample articles is free to everybody.\p
Then there is \IEvolutionary Ecology Research,\i located on the Web at http://www.evolutionary-ecology.com/, and offering a sample issue for viewing. This journal still charges subscription prices, but offers low rates for researchers at an institution whose library has subscribed at the full rate. The sample articles come through in PDF format.\p
The Royal Society of Chemistry is providing the electronic version of \IPhysChemComm\i at http://www.rsc.org/is/journals/current/PhysChemComm/pccpub.htm, with articles in HTML format, but requiring validation - there do not appear to be any sample papers to look at. The use of the HTML format allows the RSC to use a variety of sophisticated plug-ins to help readers manipulate the information which is provided.\p
The future is coming, and it looks good - unless you happen to be a dinosaur. Next month, we explore the world of online publishing, and how it may be the technology that will change the way knowledge is packaged and managed.\p
#
"Is there life under the ice?",1180,0,0,0
(Aug '99)
Deep beneath the ice in Antarctica, thousands of meters down, there is a body of liquid water called Lake Vostok. Right now, scientists are convinced that the best chance of finding life elsewhere in the solar system lies on Jupiter's ice-covered moon, Europa. And if there is anywhere on our planet where the conditions on Europa are simulated, it will be in Lake Vostok.\p
A few years ago, this idea would have been dismissed out of hand, but new studies on extremophiles in all sorts of environments make the idea far less improbable. The microbes there would probably be unknown to science, and could have been trapped in the water for millions of years, surviving sluggishly on the small amounts of energy they can obtain from the resources in the water under the ice.\p
The big difference between Lake Vostok and Europa, of course, is that the area where the lake now is was once far closer to what we regard as suitable conditions for life, while Europa was probably always as it is today - but now we know that all sorts of strange places seem to be able to support life, and we are more prepared to consider what was once impossible as distinctly possible.\p
A report from a workshop funded by the National Science Foundation (NSF), called "Lake Vostok: A Curiosity or a Focus for Interdisciplinary Study," concludes that the lake "may represent a unique region for detailed scientific investigation" for several reasons. The possibility of finding unique life-forms is high on the list, but other reasons include scientific curiosity about water which has been trapped for such a long time and what it may contain, and also the reason why the water remains as water.\p
There is also the interesting technical question of how you access the lake without introducing any contaminants - where in the past we would have worried more about what the life-forms might do to us, the real concern today is what we might do to the life-forms, if we attempt to drill into the lake to gather water samples, and more importantly, sediments from the lake bottom.\p
The workshop was held last November, but its report has only recently become available, just in time for a meeting of the international Scientific Committee on Antarctic Research (SCAR) during September in England. At this meeting, scientists will discuss the scientific objectives of sub-glacial lake exploration and will examine the logistical and engineering requirements for exploring the lake.\p
The main focus of the report is on the joint US, French, and Russian research project, in which Russian teams have drilled down into the ice covering the lake, producing the world's deepest ice core. Drilling was deliberately stopped roughly 120 meters above where the ice and liquid water meet to prevent any possible contamination.\p
Vostok would seem the most unlikely place on earth to find water: the station once recorded the lowest temperature ever seen on the planet, -88.3°C, or -126.9°F, but the under-ice water body is about the size of Lake Ontario, 19,000 km\U2\u, or 7500 square miles. Yet unlikely as a lake existing under the ice may be, it is there, and the lake sediments may contain a record of ancient climate conditions, or might even help us understand the processes which triggered the evolutionary explosion on earth. At the very least, they should help us in deciphering the geologic history of Antarctica.\p
See also \BEuropa fly-by\b in Science Review.\p
#
"A new primate genus",1181,0,0,0
(Aug '99)
Finding a new fossil is only the very first stage in "discovering" that fossil. True discovery involves searching for more fragments in the surrounding rock, establishing the true position of the fossil's formation to date it more accurately, and finally, careful preparation of the remains, stripping away the attached pieces of rock. After all that is complete, interpretation can begin, and the discovery begins to mean something.\p
Which explains why specimen KNM-TH 28860, first spotted in 1993 by Boniface Kimeu, is only now the subject of a major paper, appearing in \IScience\i in late August. The naming conventions for materials coming from Kenya involve the code KNM (Kenya National Museums), followed by a location code such as TH (Tugen Hills) and an accession number (in this case, 28860). While the name of a specimen may change as knowledge increases, the number remains constant. While some specimens gain nicknames like "Mrs. Ples" (Sts 5), "Nutcracker Man" (OH5), "Lucy" (AL-288-1) or "The Black Skull" (KNM-WT 17000), scientific publications will normally use the formal identification, and only one specimen, the \1Taung skull\c of \1Raymond Dart,\c is known by its nickname alone - the "Taung baby."\p
One reason to stick with the standard number is that until a fossil is prepared and studied, the actual species to which the specimen belongs is open to question. In this case, the specimen, described as "an exquisitely preserved 15-million-year-old partial skeleton of an ancient ape" has been placed in a brand-new genus, given the name \IEquatorius.\i As well, some other material previously included under the \IKenyapithecus\i label, has now been moved to \IEquatorius.\i\p
The main point about moving some of the fossils to the new genus is that the material remaining under the \IKenyapithecus\i label - the material previously assigned to the species \IKenyapithecus wickeri\i - is more closely related to the ancestor of the great apes. Before this division, the \IKenyapithecus\i group sent mixed signals to anybody seeking that mysterious ancestral great ape.\p
During the Miocene epoch (23-5.5 million years ago), the ape family tree contained a large number of representatives, all over Africa, Asia, and Europe. The great ape and human lineage originated from one of these animals, but finding the direct ancestor is still rather a challenge, with the preferred candidate changing almost as often as new data arrive. In this case, the new fossil ape reported in \IScience\i prompted its discoverers to take another look at the \IKenyapithecus\i genus, which researchers had once thought to be a key ancestor. The upshot was that the more primitive species of \IKenyapithecus\i have been transferred to the new genus \IEquatorius,\i allowing researchers to draw a clearer connection between the remaining \IKenyapithecus\i species and the living apes and humans.\p
Ward and Hill first found a fossil at the site of Kipsaramon in Kenya. Then they compared features in the Kipsaramon fossil's teeth and jaws to dental and facial patterns in two \IKenyapithecus\i species: \IKenyapithecus africanus\i and \IKenyapithecus wickeri.\i Their analysis revealed two distinct patterns in the sample: a more primitive pattern represented by \IKenyapithecus africanus\i (and also in the Kipsaramon specimen), and a more modern, great ape-like pattern represented by \IKenyapithecus wickeri.\i\p
The differences between the two types had been suspected earlier, but drawing a new division line depends on having enough material to work with, and the researchers believe they are only now in that position. It is quite possible that future finds will change the way we see the distinctions, but the division itself looks well-founded and likely to remain.\p
Support for the division came from the middle Miocene site of Pasalar in Turkey. The overall \IKenyapithecus wickeri\i dental pattern is also found in the teeth of an unnamed ape species found there, and it was this confirmation, with the same anatomy in a group of animals from a completely separate part of the world, which convinced the authors they were right to separate \IKenyapithecus wickeri\i from \IEquatorius.\i Kelley commented, "The connection between the Pasalar material and the \IKenyapithecus wickeri\i specimens gave us confidence that the differences we had found in the \IKenyapithecus\i material were real and of generic level importance."\p
So how does \IEquatorius\i relate to the later apes? This remains uncertain for now, but the genus generally fits the description of a "stem hominoid," one of a cluster of species positioned somewhere on the evolutionary ladder near the origins of the ape group. The stem hominoids are probably too primitive to be the direct ancestors of modern apes and humans, but \IEquatorius\i is more modern in some respects than earlier primitive apes, especially in its skeleton. For starters, it shows evidence of using the ground more frequently than earlier ape species. Future studies of the upper arm and shoulder of \IEquatorius\i should reveal a great deal more about the behavior of this animal.\p
The material remaining under the \IKenyapithecus\i label now reveals the advanced features that link it to more modern apes. At the same time, \IKenyapithecus'\i exact relationship to the origin of these later apes is still somewhat of a mystery, but the way forward is now rather clearer, even if the final destination is still hidden.\p
The Kipsaramon site is in Kenya's Baringo Basin, where geologic sediments cover the period from 15 million years ago to the present, and contain the only complete record in Africa for portions of this crucial time in ape and human history. The specimen was found after searchers worked their way uphill from a large number of isolated teeth that had washed into a gully at the site.\p
According to a release put out by the researchers, the skeleton was moved from the site in one consolidated block, where the fragile bones were extracted from the surrounding rock. The fossil consists of most of a lower jaw with teeth, some upper incisors, and an unusually generous amount of skeletal material, including bones or parts of bones from the arm, shoulder, collarbone, chest, wrists and fingers, and vertebrae, all belonging to a single male individual. \p
See also: \BNew Ancestral Apes?,\b Science Review.\p
\BKey names:\b Steve Ward, Andrew Hill, and Jay Kelley.\p
\BFootnote:\b other abbreviations used in naming African hominid fossils include:\p
AL: Afar Locality, Ethiopia \p
ARA-VP: Aramis, Ethiopia \p
KNM-ER: Kenya National Museum, East Rudolf \p
KNM-WT: Kenya National Museum, West Turkana \p
KP: Kanapoi, Kenya \p
OH: Olduvai Hominid, Tanzania \p
SK: Swartkrans, South Africa \p
Sts, Stw: Sterkfontein, South Africa \p
TM: Transvaal Museum \p
#
"Did a nearby supernova cause a mini-extinction?",1182,0,0,0
(Aug '99)
A paper to be published soon in \INew Astronomy\i features a claim from Brian Fields and John Ellis that there may have been a supernova close to Earth some five million years ago. Their evidence comes from the recent discovery of the rare radioactive isotope iron-60 in deep-sea sediments. This, they say, could be the telltale sign of a killer supernova: if a supernova - the most likely source of the iron isotope - was close enough for its matter to reach Earth, then it was close enough for cosmic-ray bombardment to have affected Earth's biosphere, enhancing the penetration of harmful solar ultraviolet radiation and increasing the global cloud cover, leading to a "cosmic-ray winter" and a mini-extinction.\p
They think the supernova was not \Itoo\i close - about 100 light-years from Earth. The iron-60 data were collected by a German team led by Gunther Korschinek of the Technical University of Munich.\p
They argue that there is a correlation between solar activity as measured by the sunspot cycle, and the extent of Earth's cloud cover, and say that some researchers believe this correlation is due to modulation of the normal cosmic-ray flux observed today, caused by variations in the solar wind. From this, they conclude that the enhanced cosmic-ray bombardment from a nearby supernova could create a large increase in global cloud cover. The result would be a cooling of the Earth, in a cosmic-ray winter which could last for thousands of years.\p
The only problem: there is fossil evidence for a couple of mini-extinctions during the Cenozoic Era, but one occurred about 13 million years ago and the other about 3 million years ago - neither providing a very good match with the dating of the iron-60 deposits as they are being reported right now. Still, this could be a good topic to watch.\p
#
"Turkey earthquake",1183,0,0,0
(Aug '99)
We reported earlier this year \B(Earthquake deaths,\b January 1999) that the death toll from earthquakes in recent years has been well below the expected levels. That pleasing trend came to a nasty end in August with an earthquake on the northernmost strand of the Anatolian fault system in Turkey.\p
The magnitude of the quake was originally estimated at 7.8, based on recordings of seismic waves from a limited number of global stations that rapidly transmit data to the US Geological Survey's National Earthquake Information Center (NEIC) in Colorado. With additional data, this estimate was revised to 7.4.\p
The earthquake occurred at 3:01 am local time (00:01:39.80 UTC, August 17), and the epicenter was about 11 kilometers, or seven miles, southeast of the city of Izmit. The earthquake originated at a depth of 17 kilometers (around 10.5 miles), and caused right-lateral strike-slip movement on the fault. Surface observation revealed that the earthquake produced at least 60 kilometers (37 miles) of surface rupture and right-lateral offsets as large as 2.7 meters (9 feet).\p
By the end of August, estimates of the total deaths exceeded 15,000 with another 25,000 injured.\p
At the end of August, many Turkish families were still living out of doors, either unable or unwilling to return to their destroyed or damaged homes. Much of the outrage after the quake was directed at builders who were alleged to have skimped on concrete standards and reinforcing specifications in multi-storey apartment buildings which should have stood up to such a quake.\p
Most of the damage was near Izmit, a city with half a million inhabitants, but Istanbul, 100 km (65 miles) to the west, was plunged into darkness, as was the capital, Ankara, about 450 km (280 miles) to the east. Older historic sites in Istanbul, like the Blue Mosque, Santa Sophia, and the Topkapi Palace all survived - perhaps a credit to the better building standards of old, or perhaps a reminder that these buildings had to be tough to survive as long as they have done in a seismically active area.\p
A useful web site with information about the Turkish earthquake is http://cindi.usgs.gov/turkey/tquake1.html . CINDI is the acronym for the USGS Center for Integration of Natural Disaster Information. The site also has links to all kinds of natural disasters.\p
#
"How dioxin kills",1184,0,0,0
(Aug '99)
A recent award of a four-year grant of $816,000 from the US National Institutes of Health gave Prakash and Mitzi Nagarkatti the opportunity to describe on the Internet their recent work on dioxin. They have found a step in dioxin toxicity which may enable them to develop diagnostic, treatment, and even prevention methods in the future.\p
Dioxins are highly toxic environmental pollutants, commonly formed as byproducts during the manufacture and bleaching of paper. They are some of the most biologically potent chemicals known, members of the family of compounds called halogenated aromatic hydrocarbons (which includes the PCBs), found in herbicides, pesticides, automobile exhausts, and municipal and industrial waste.\p
Dioxins kill immune cells using a process called apoptosis, triggering the cells to commit suicide by destroying their own DNA. The Nagarkatti lab has identified a protein molecule involved in this killing: dioxin treatment seems to activate the molecule, resulting in the death of the immune cells.\p
The molecule, CD-95 ligand, is a member of a family of proteins that occurs naturally in the body, and its function is to kill cells that are not needed. It seems that dioxins turn on the CD-95 ligand so that it cannot be turned off. Research is continuing on this: it is a story that could prove to be very important.\p
#
"Trouble for the world's turtles",1185,0,0,0
(Aug '99)
About half of the world's turtle species face possible extinction, according to a meeting of some sixty of the world's leading experts on freshwater turtles and tortoises at a special gathering in Nevada in mid-August. A large part of the risk comes from a growing demand for turtles as a popular dining delicacy and a source of traditional medicines.\p
The Powdermill IV conference (named after the site of the group's first gathering in Pennsylvania in 1980) identified what they called a "turtle survival crisis" as the most urgent topic of the conference. Turtles have managed to survive the upheavals of the last 200 million years, including the great extinction episode that eliminated the dinosaurs, but now we are on the brink of losing the entire group.\p
The risk, say the experts, is as great as that faced by the frogs, toads and other amphibians, but it has been largely ignored. People are aware of the problems faced by sea turtles, but their freshwater cousins seem to have been allowed to slip off the back deck of the Ark. Unlike the amphibians, which seem to be suffering mainly from habitat decline or destruction, the turtle problems appear to stem directly from human practices and human consumption of turtles.\p
A group called Partners in Amphibian and Reptile Conservation (PARC) has begun to address this whole class of threatened animals. If turtles are to be saved, they say, it will have to be through cooperative efforts, such as theirs.\p
So far, extinction has hit just one subspecies - a small mud turtle from Mexico - but many freshwater turtles and tortoises have very restricted geographic ranges, which means they face an even more critical situation than the marine species. At present, all sea turtles, most remaining tortoises, and many freshwater turtles are endangered or threatened and require urgent conservation action. About twelve turtle species are considered critically endangered, facing a high risk of imminent extinction unless long-term population trends are reversed.\p
The patterns of human consumption vary around the world: in places like Madagascar and Mexico, turtle meat is eaten by the very poor, while in other places, especially Southeast Asia, the wealthy eat turtles as a luxury food item. This demand is being driven from China, where increased affluence and the recent convertibility of Chinese currency, coupled with age-old traditions of consuming turtles for food and as medicine, have seen the prices of some of the most desired species rise to as much as $1,000.\p
China's own turtle populations have declined seriously, and several Chinese species only discovered in the last two decades are possibly already extinct due to high demand. Imports into China from Vietnam, Bangladesh, and Indonesia are all running at unsustainable levels.\p
Some 55 species of turtles, about 20% of the world's total freshwater turtle species, are in the United States. Of these, 25 species require conservation action, and 21 species are protected, or are candidates for protection. Yet the United States is a guilty party as well, exporting well over 7 million turtles of several species every year, either as pets or as food products. The main problems seem to be with the large, slow-growing river turtles, with large females being the most affected.\p
A range of solutions was put forward by the scientists, including captive breeding programs, better conservation and trade laws and regulations - and the enforcement of these, and of all existing laws - as well as habitat conservation. As always, they tell us there is a need for more careful studies - and as always, we will probably ignore them.\p
#
"A new tree of life for plants",1186,0,0,0
(Aug '99)
The International Botanical Congress in St. Louis, Missouri in early August was the place to be for those interested in how life arose on Earth, as the work of the Green Plant Phylogeny Research Coordination Group was revealed in eight major symposia. A five-year effort from a team of 200 scientists in twelve countries to reconstruct the evolutionary relationships among all of Earth's green plants, the study has come up with a number of major surprises.\p
For starters, the team has rejected the traditional belief that the so-called "land-plant invasion" was led by seawater plants. Instead, they have found that primitive freshwater plants provided the ancestral stock from which all green plants now on land are descended and that this ancestor gave rise to every green plant now alive on earth. Some of the sea plants, they say, moved into fresh water, and then onto the land, while some of the other descendants of these earlier steps moved back into the sea.\p
Some plants, like the kingdom of red plants, mostly seaweeds, never left the ocean. In the brown plant kingdom, most members, like the kelps, also remained as seaweeds living in the ocean, but a few, such as the diatoms, moved into freshwater.\p
The first thing a botanist does on finding a new plant is to look at how it is related to other plants, because the relationships are very important in predicting the traits of the new plant. As the researchers see it, the group traditionally thought of as "plants" is really four separate lineages or "kingdoms," with one group, the fungi, being more closely related to animals than to plants.\p
Green plant species are of particular economic value because they provide most of our food, shelter, and medicines, so any change in our knowledge and understanding of these groups will be important, with one of the speakers going so far as to claim that the new relationships, and an awareness of them, carry profound ethical, intellectual, ecological, and economic implications for science, medicine, industry, and society.\p
Under the new structure, there are five main trunks, or lineages, of complex, "nucleated" organisms on the Earth's genealogical tree, four of which have traditionally been classified as plants. They include the green plants, the brown plants, the red plants, the fungi, and the animals. Of these, the green plants group, with some 500,000 species - including all of Earth's land plants (trees, shrubs, grasses, flowers, ferns, mosses) and some of the aquatic plants, such as green algae - is the largest.\p
One interesting feature is that the land plants now appear to have a single common freshwater ancestor, which lived at least 450 million years ago. Before this, botanists believed that there were several completely separate lineages, deriving from several different "landings" - that mosses, for example, were derived from a different aquatic ancestor than were flowers or ferns. The green plants may well have made their way onto land many times, but only one lineage survived and thrived there, the lineage which gives us all of today's green plants.\p
Coming onto dry land is a challenge for plants which are used to simply dumping their gametes in the water and letting random mixing take care of the rest. The first land plants lived in damp environments where a film of water could be used to provide a path for the male gametes to reach the female gametes. This is still the method that mosses and ferns rely on, while the "higher" plants have found better ways of dealing with this problem.\p
The question of how the earliest flowering plants evolved is still open to some debate, but one surprise from the research is the breakdown of the standard botanical division into dicotyledons and monocotyledons: it appears that some of the dicots (including the magnolias and the water lilies) are on a branch with the monocots (including grasses and orchids), based on the evidence of newly available DNA sequence data.\p
At certain points in the history of science, humans have suddenly been thrown into a whirl by a new way of seeing themselves, pushed further each time from the center of the universe. When Copernicus and Galileo revived the ancient notion of a solar system with the planets revolving around the sun, and made it better known to the general public, our planet lost its role as the focus point of all Creation. When Darwin published his \IDescent of Man,\i humans became just another natural species in the public mind. Later, as we came to see the sun as "just another star" and developed an increasing awareness of the immensities of the universe, humanity was unseated still further from its central role.\p
It may be too soon to agree, but the botanists believe they have made another of these center-shifting breakthroughs, in that their research shows how all the plants and animals together form only a small branch on the tree of life. There is, they suggest, a universe of mostly single-celled, and poorly known organisms which makes up the most substantial parts of the tree.\p
Right now, scientists have identified about 1.4 million species of organisms on Earth. Estimates of the numbers of undiscovered and undescribed species of organisms range from 10 million to more than 100 million. There are probably more surprises waiting to be found, surprises which will almost certainly nudge the humans further from their self-assigned central role. We are just one species among many.\p
For more information on the group and its work, visit the Web page at http://ucjeps.herb.berkeley.edu/bryolab/greenplantpage.html and follow the links from there.\p
#
"Nearly half of Earth's land has been transformed by humans",1187,0,0,0
(Aug '99)
At the XVI International Botanical Congress in St. Louis, Missouri, Jane Lubchenco of Oregon State University, and Harold A. Mooney and Peter M. Vitousek of Stanford University presented a detailed look at how humans have affected the world they live on. The alterations to both land and water, they say, are severe enough to impair the planet's ability to maintain the quality of human life.\p
The data they presented showed that nearly half of the land surface of Earth has been changed, and some 50 "dead zones" (areas with little or no oxygen) have developed in the Earth's coastal waters. Land areas have been transformed by humans filling in wetlands, converting grass prairies and forests into cornfields and grazing lands, or converting forests into urban areas. As well, they have more than doubled the amount of available nitrogen in the environment because of excess fertilizer use and burning of fossil fuel. (In this context, "available nitrogen" should not be confused with atmospheric nitrogen, which is a comparatively unreactive gas - the term refers to nitrogen compounds such as nitrates, nitrites, ammonia, and nitrogen oxides.)\p
Rates of extinction, they say, are running at 100 to 1000 times what they would be without human-induced changes in the planet. On land, this is largely caused by habitat loss and species invasions that are crowding out native species. In water, this is caused by overfishing. The year 1998 was Earth's hottest on record, as human activities continued to increase the concentrations of carbon dioxide and other heat-trapping gases in the atmosphere.\p
The oceanic dead zones have been caused by excess nitrogen and phosphorus flowing down rivers: the largest, in the Gulf of Mexico, is caused by nutrients carried to the sea by the Mississippi River. Lubchenco suggests that we have assumed too long that the seas can take whatever we throw at them, and now we see many signs of the problems which will result from these changes, including toxic algal blooms, coral bleaching and sudden disappearance of fish from key fisheries.\p
In the tropics, half of the mangrove forests have been lost to a combination of coastal development and conversion to aquaculture, which produces more than a quarter of all the fish consumed by humans. In the case of shrimp and salmon - the fastest growing segment of aquaculture - two to three kilograms of other (less valued) fish are needed to grow one kilogram of the raised seafood. This practice is depleting the oceans of food for wild fish, birds, and marine mammals.\p
At the same time, the seas of the world are being affected by the release of organisms carried in the ballast tanks of ships (at any one time, says Lubchenco, there are some 3000 species in the world's ballast tanks), and to a lesser extent, by the release of cultivated plants and animals from aquaria.\p
In the near future, we can expect to see wars fought over resources like food and water (see \BA cause for war,\b Science Review), and the more the environment is damaged, the sooner that day will come. Lubchenco sees reason for hope, as people become more aware of the problems, but will they become aware soon enough, in large enough numbers? The next generation will probably be able to answer that question - after the events that will provide the answer.\p
In the meantime, forests, grasslands, and coral reefs which contribute to flood control and climate regulation are still being damaged. So too are the mangroves, estuaries, coral reefs, and kelp forests which protect shores from erosion and provide nursery areas or spawning habitat for economically important species.\p
#
"Are we in the middle of a mass extinction?",1188,0,0,0
(Aug '99)
The ultimate fate of every species is extinction, but any level of extinction which exceeds the rate at which new species can evolve is likely to cause severe disruption, sooner or later. Right now, the disruption looks set to come much sooner than we would wish, and some people who are alive today will see it before they see their grandchildren.\p
Peter Raven, President of the International Botanical Congress, used a compilation of the latest data on extinction rates of plant and animal life around the world to show that humanity's impact on Earth has increased extinction rates to levels rivaling the five mass extinctions of past geologic history. Raven, who is also Director of the Missouri Botanical Garden, is a highly-respected botanist, so when he suggests that between one-third and two-thirds of all plant and animal species, most in the tropics, will be lost during the second half of the next century, it is time to sit up and pay attention.\p
Raven's paper, released at a press briefing before a symposium on "Plants in Peril: What Should We Do?" at the International Botanical Congress, calls for an eight-point plan to arrest species loss within plant ecosystems. For some centuries, the known and documented extinction rates of a wide range of well-known groups of organisms have been several times higher than the background rate which has applied across the Cenozoic Era, the past 65 million years. At the start of that era, we lost not only the last surviving dinosaurs but two-thirds of all terrestrial organisms that lived then.\p
At a rough estimate, the current extinction rate is now approaching 1000 times the background rate and may climb to 10,000 times the background rate during the next century if present trends continue. Even when the members of many groups of organisms are relatively poorly known, their extinction rates can be estimated because of a simple mathematical rule, the logarithmic relationship between species number and the area in which they live. On average, a tenfold increase in area is correlated with a doubling in species number, and a tenfold decrease in area ties in with a halving of the original number. So given the relationship between species number and area, we can determine the number of species which will survive in a given area as it is fragmented.\p
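As an illustration, the tenfold-area/doubling rule corresponds to a power law S = cA\Uz\u, where the requirement that a tenfold area increase doubles S fixes z = log\D10\d2, about 0.3. The arithmetic can be sketched in a few lines of Python (the 5% forest figure below is taken from the following paragraph):\p
from math import log10

# "A tenfold increase in area doubles the species number" fixes the
# exponent of the species-area power law S = c * A**z:
z = log10(2)   # about 0.301

def surviving_fraction(area_fraction):
    # Fraction of species that persist when habitat shrinks to the
    # given fraction of its original area.
    return area_fraction ** z

# If only 5% of the tropical forest remains, as projected below:
print(f"{surviving_fraction(0.05):.0%}")   # about 41% survive
# ... a loss of roughly 60% - squarely in the one-third to two-thirds range.\p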
As well, the yearly rates can be estimated: fragments will lose half of the species they are going to lose in about 50 years; three-quarters of them in a century. At present rates of destruction, we are likely to have just 5% of our tropical forests left in fifty years. If those remnants are preserved, the overall extinction rates will be three or four orders of magnitude higher than those prevailing between mass extinctions. At this rate, one-third to two-thirds of all species of plants, animals, and other organisms would be lost during the second half of the next century, a loss that would easily equal those of past extinctions.\p
Watch out for the scientific short-hand in the last paragraph: "three or four orders of magnitude higher" means 1000 to 10,000 times higher, \Inot\i "three or four times higher"! Raven argues that time is running out to act: there is a desperate need to monitor the situation, to conserve species - and genetic diversity within species, to increase the available knowledge, and to share that knowledge, so people become aware of the effects of their actions, and of the value of biodiversity.\p
This also means helping poorer countries to develop better scientific awareness. Currently, 80% of the world's scientists live in industrialized countries, which have about 20% of the world's population and only 20% of the world's biodiversity. The choice is simple, according to Raven: make the unpopular and costly choices now, or settle for our great-grandchildren living in a world in which more than half of the plant species that exist now will be known only as specimens. But if you are a young reader, you can change that to read "grandchildren".\p
#
"US drought may be the century's worst",1189,0,0,0
(Aug '99)
The United States has been suffering from severe drought, this northern summer, and as rivers and streams dwindle to mere trickles, scientists at the US Geological Survey say they are monitoring what could become this century's worst drought, beating the US droughts of 1929 and 1966.\p
Drought advisories, warnings, or emergencies have been declared by state authorities in all MidAtlantic states, including Delaware, the District of Columbia, Maryland, New Jersey, New York, Pennsylvania, Virginia, and West Virginia. Some parts of the MidAtlantic states have been experiencing drought conditions for the past three years, and there seems to be no immediate prospect of relief. It is this continued and compounding effect which is set to make this "the worst period of drought this century."\p
The Chesapeake Bay, for example, suffered record lows in fresh water inflows, and this led to increases in salinity and major fish kills. In other areas, there have been dramatic shortages in surface water and ground water, with many wells beginning to run dry. While the major public water suppliers have adequate reserves for now, if winter rains are poor, the situation could become extremely serious, according to the US Geological Survey.\p
Further information on the drought is available on the Web at http://md.usgs.gov/drought/mid_atl.html. This site offers river flow details across the entire United States, apparently in real time.\p
#
"Cockroaches and catnip",1190,0,0,0
(Aug '99)
According to an old wives' tale, placing catnip around the house discourages cockroaches, but now old husbands and young spouses would be wise to heed the tale as well. A report to the national meeting of the American Chemical Society in late August indicates that cockroaches are indeed repelled by catnip. Specifically, they respond to two forms of the chemical called nepetalactone, found in the catnip plant.\p
Chris Peterson and Joel Coats, who made the report, say this could lead to new natural insect repellents that could be used to keep the cockroaches from coming out of the walls. Most of the existing treatments do not repel the insects, perhaps because those who buy the sprays are happy to see the cockroaches end up dead, but there is always an environmental cost from these chemicals.\p
Peterson also tested osage orange, otherwise known as hedgeapple, as a repellent. Credited by folklore with the power to drive away "cockroaches, spiders, mice, flies, crickets, or just about anything people care to repel," as Peterson put it, the fruits do, in fact, repel insects, although more work is needed to identify the active agents.\p
The researchers worked on "German cockroaches," and while they expect the effect to hold with the "American cockroach," this remains to be seen. In the study, male cockroaches seemed to be more affected than females, though the researchers had no idea why.\p
Which allows us room to speculate wildly: could it be that cockroaches are afraid of the cats which are attracted by the catnip?\p
#
"Kindness to animals in India",1191,0,0,0
(Aug '99)
The elections due in early September in India have seen a few changes, and apparently, a scientific first: a careful feasibility study of elephant throttling. With four out of every ten Indians unable to read, symbols are used to identify parties - and the electoral authorities have some spare symbols in store, including a brick, a pressure cooker, and even a baby doll. \p
This year, a number of animals have been withdrawn, as opponents of a party using a rooster symbol have been known to strangle poultry in public as a partisan gesture. Apparently arguing that throttling an elephant is rather more challenging, authorities have ruled that India's largest mammal may still be used as an electoral symbol.\p
#
"A diet of worms?",1192,0,0,0
(Aug '99)
Early March saw an apparently hilarious little yarn published in \INew Scientist\i magazine, but there was a serious side to it as well. In simple terms, it looks as though regular doses of worms might rid people of inflammatory bowel disease, conditions such as ulcerative colitis and Crohn's disease, both of which fall under this umbrella term.\p
Joel Weinstock and colleagues at the University of Iowa believe these conditions arise because of an overactive immune system. The condition, which is increasingly common in the developed world, is caused by the absence of intestinal parasites, they suggest. As in other things, it seems that a little dirt may do us no harm - and probably does us a great deal of good.\p
The researchers fed six people who had inflammatory bowel disease a batch of eggs which hatched and developed into parasitic worms, and got such excellent results that they will begin a larger trial soon. Two to three weeks into the trial, five of the six patients went into complete remission, and a single dose of worms eased their symptoms for about a month.\p
Weinstock argues that our last three million years have seen us carrying populations of intestinal worms, giving our immune systems a chance to manage this sort of challenge. This century, with these worms firmly stamped out, the immune system, he hypothesizes, is more likely to produce powerful inflammatory agents such as gamma-interferon. These then promote the formation of white blood cells called macrophages. In simple engineering terms, there is no damping-down of the immune system's oscillations.\p
The patients were selected because earlier attempts to damp down their immune systems with steroids had not worked. As a precaution, the selected species was one that does not normally infect humans, and which cannot reproduce in the human gut, so the worms are eliminated after a couple of months.\p
According to the same report, another researcher, Balfour Sartor, is experimenting with \ILactobacillus\i and \IBifidobacterium\i gut bacteria, again with the idea that they may have dampening effects on the immune system. With such a small sample, chance effects and even placebo effects cannot be ruled out, although anybody familiar with either condition, even indirectly, would be all in favor of exploiting the placebo effect if it occurs. Still, a real cure would be even better, and so larger controlled trials will soon be under way.\p
#
"September, 1999 Science Review",1193,0,0,0
\JProstate cancer: a special report\j
\JNew test can better uncover hidden breast cancer\j
\JEnteroviruses: a new threat?\j
\JDoes a fungus cause chronic sinusitis?\j
\JGetting proteins past the cell membrane\j
\JPreventing HIV transmission to infants\j
\JThird generation pills and thromboembolism\j
\JHemoglobin started as an enzyme\j
\JDNA sequencing advances\j
\JFirst adult diabetes gene\j
\JGM: the scientists begin to speak out\j
\JThe first clear picture of a ribosome\j
\JNobel awards announced\j
\JSingle mothers and schoolwork\j
\JBreast-feeding and babies' IQs\j
\JMaking a smart mouse\j
\JEarthlike planets?\j
\JMars Climate Orbiter goes missing\j
\JEuropa's cracks created by ocean tides\j
\JPaperless publishing?\j
\JPaperless publishing? (2)\j
\JWhy paintings turn yellow\j
\JTokaimura nuclear incident\j
\JJurassic mammals in Madagascar\j
\JEarthquake news\j
\JHow hot is the Earth's core?\j
\JFire ants and the elderly\j
\JWhat will 2100 bring?\j
\JClimate change and greenhouse gases\j
\JIndian rivers threatened\j
\JMcMurdo Dry Valleys images released\j
\JLead poisoning\j
\JPicking the baby's gender\j
\JNanci Youngblood Brasket (1946 - 99)\j
#
"Prostate cancer: a special report",1194,0,0,0
(Sep '99)
Prostate cancer kills as many men as breast cancer kills women, yet it attracts just 7% of the funding given to breast cancer research, and work on prostate cancer lags more than a decade behind the equivalent work on breast cancer. According to prostate researchers, this is because the world's men are unwilling to talk about their bodily functions, especially those involving urination or virility, two aspects which can be affected by a prostate condition.
Right now, twenty times as many Australian men die each year from prostate cancer as die from AIDS. It is the most common cancer among men, and one of the biggest cancer killers of men. In Australia, much of the research on this killer disease occurs at Sydney's Garvan Institute, and your reporter went there in September for a briefing - and for some good news on promising developments which cannot be reported yet, as the research has not been properly published in a peer-reviewed journal.
There is, say the Garvan researchers, a need to make people more prostate-aware. This process has already started in the USA, but the rest of the world is lagging behind. With delightful statistical bluntness, they say that the disease kills the equivalent of 190 first-grade rugby league teams in Australia each year, polishing off in just three months as many Australian men as died in the whole of the Vietnam War. Across the world, there were some 395,000 new cases in 1990, and probably more in each year since then. More than 40,000 men died of this cancer in the USA in 1998 - quite a lot of baseball teams! About 180,000 new cases are reported in the USA each year.
We still have no clear indication of what the cause or causes of prostate cancer may be. There is a 70-fold difference between the highest rate (African-Americans) and the lowest rate (Vietnamese), suggesting that there is either a gene or some cultural or dietary effect involved.
Japanese men living in the USA show an incidence four times that of Japanese men in Japan, but they are still getting the cancer at only 40% of the rate experienced by other men in the USA. As well, Asian-born Australians show 40% of the level experienced by locally-born Australians, which tends to suggest that there is both a genetic effect \Iand\i a cultural effect.
The problem with dietary theories is that Utah Mormons have a higher rate than the US average, while Seventh Day Adventists, who have a similar diet, are lower than the US average. Married men are more likely to get the disease than unmarried men - which might be explained by assuming that unmarried men tend to die younger, but divorced men and widowers have higher rates than men who are still married.
If there is a dietary effect, it may well relate to selenium, cadmium, zinc, or vitamin A levels in the diet, but the information is still somewhat confused and conflicting, so we will leave it there for now. A past history of venereal disease seems to increase the risk, which made epidemiologists suspect that there may have been some unknown virus, transmitted at the same time as the venereal infection, but once again, conflicting evidence exists: Catholic priests closely shadow the rates seen in the populations they come from.
The risk appears to be about 50% higher if the man had a vasectomy more than twenty years earlier, but as vasectomies tend to be more common in higher socio-economic groups, and those groups tend to have higher rates of prostate cancer, there could be something else going on here. There is sometimes a family link, especially when the cancer appears in men under 50, once again suggesting a genetic effect.
Prostate cancer workers face the same dilemma as breast cancer researchers: should they screen the population or not? In reality, it is an old man's disease, and if nothing else killed them, all men would die of prostate cancer before the age of 120, but the symptoms show up late, and the disease is a slow killer, so many men who develop this condition will die of something else first, meaning that heroic measures are best avoided \Iin some cases,\i since the intervention causes problems, and may well do no good.
Prostate cancer tends to be found after the age of 50, and there are few deaths before the age of 70. When maps of relative incidence have been drawn up in Australia, these show localized pockets of high incidence, but these pockets seem to be an artefact of screening in those areas: while significant rises were seen in the incidence of reports of new cases, the level of deaths has remained constant, which suggests that there were many undetected cases before that time, and in other areas.
Screening would reveal histological evidence of the cancer in 60% of men when they retire, but most of these men do not develop the cancer as such, and even fewer die of it. So there may be a case for not screening the whole population - and while the victims may still have the vote, they pay little tax, and they do not talk about the problem, so a "rational" medical system is likely to take the easy route.
Realistically, if screening for a disease is to make sense, it should be targeting an important health problem with a recognizable early stage (like the lumps in breast cancer), and it should repay the cost of screening because early treatment should be more beneficial. As well, screening should be convenient and tolerable to the patient (talk to a mammogram recipient to assess what is "convenient and tolerable"!), there should be facilities for screening, and the screening should be acceptable to society.
In a population of less than 20 million, screening all of Australia's men over 55 would cost some AUD$400 million, and screening will undoubtedly lead to some people being treated unnecessarily, and suffering the side-effects of treatment.
The prostate gland sits immediately below a man's bladder, just in front of the rectum, and a cancerous gland increases in size. This explains why diagnosis may involve a doctor inserting a gloved finger into the rectum ("digital rectal examination"), in order to feel the gland. It also explains why one of the common early symptoms is the need to get up in the night to pass urine - as the gland gets bigger, it trespasses on space normally occupied by the bladder.
The cancer can be confirmed by a blood test for PSA, Prostate Specific Antigen, and once this is known, the patient needs to make a choice. For some people, watchful waiting is best, because the expected time until death from the cancer exceeds their reasonable life expectancy. The other choices are surgery (radical prostatectomy), hormonal therapy, or radiotherapy, each carrying its own risks and side-effects, which must be balanced against each other.
The area around the prostate is an area with complex plumbing, and in particular, the nerves which control erection of the penis are right there - if these are damaged in surgery, or if they turn out to be cancerous, and need to be removed, then the ability to develop or maintain an erection may be diminished or destroyed. A 50-year-old with a young wife may have entirely different life-goals from a 75-year-old who wants to see his grandchildren become teenagers, or graduate, or marry. So the treatment chosen may depend on what the patient is prepared to trade off.
According to some of the prostate cancer patients at the seminar, it is frustrating trying to get straight advice from medical specialists, at least until the patients come to understand just how complex the problems are. The patients stressed that the best thing a newly diagnosed prostate sufferer can do is to identify one of the support groups which exist, and go and talk to other people who have already been through the process. Internet sources, they say, vary from the helpful to the downright dangerous.
Hormonal therapy ("chemical castration") uses hormones to shrink the prostate, but the side effects include what Australians call "hot flushes", and Americans call "hot flashes", loss of libido and depression, so in the end, you make your choice, and take the consequences.
While research is well behind that on breast cancer, advances are now being made. Patients who have a cancer detected are likely to be asked to contribute tissue and data to an on-going study which may one day link lifestyle, or the way the disease develops, to certain tissue characteristics. There may be genetic markers which identify those who get the cancer but do not die from it.
Other treatments are in development: ultrasound, precisely located radiation, and there are rumors of GM-based therapies (see, for an example, \JWhy a modified cold virus kills cancer cells\j, April 1999) both in Australia and overseas, which could revolutionize the treatment of this cancer in the next few years. If your reporter finds himself in the 1-in-9 basket, he will be watching and waiting for a couple of years, balancing the worry of possible problems against the certainty of side-effects, knowing that the available treatments are only going to get better.
Mid-September saw a report in \IThe Lancet\i of a simple genetic mutation related to prostate cancer. African-American and Latino men in the USA who carry the mutation have a five times greater risk of developing prostate cancer than do men without the mutation, University of Southern California researchers say in the report.
In the USA, the incidence of prostate cancer is substantially higher in African-American men than in Caucasian men, and their death rate is twice that of Caucasians with the same diagnosis. The new discovery helps explain why one drug used in treating this condition, finasteride, or Proscar, sometimes fails. It seems the drug does not work particularly well in men who have the mutation.
Prostate cancer is androgen-dependent: it can feed off male hormones like testosterone in much the same way that some breast cancers rely on the female hormone estrogen, and it seems possible that changes in the way the body processes androgens may play a role in determining a man's risk of developing the disease.
The newly-discovered gene specifies an enzyme found in the prostate called steroid 5-alpha-reductase. This controls the "activation" of testosterone when it is converted into dihydrotestosterone (DHT). DHT is the most powerful androgen in the human body, some 10 to 50 times more powerful than testosterone.
The mutation is just a single nucleotide substitution in the genetic code for this protein, which results in the amino acid alanine being replaced with threonine, but this is enough to change the protein's structure, so that it becomes more efficient, making the prostate grow faster. This does not explain \Ihow\i the mutation causes cancer, but it explains part of the \Iwhy\i. The rest of the cause is likely to be an interaction with other mutations caused by environmental factors, somatic mutations, which turn potential into reality. This theory, of course, fits in well with the observations of apparent racial and cultural differences in cancer frequencies.
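To see how small the change is, consider the codon arithmetic. A hedged Python sketch follows: the report does not say which codon of the gene is involved, so the codons below are illustrative only, but any alanine codon (GCU, GCC, GCA, GCG) becomes the matching threonine codon when its first base mutates from G to A.
# Illustrative only: the report does not identify the exact codon,
# but a single G -> A substitution at the first position of any
# alanine codon yields the corresponding threonine codon.
CODON_TABLE = {
    "GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
    "ACU": "Thr", "ACC": "Thr", "ACA": "Thr", "ACG": "Thr",
}

def point_mutation(codon, position, new_base):
    # Return the codon after a single-base substitution.
    bases = list(codon)
    bases[position] = new_base
    return "".join(bases)

before = "GCC"                          # one of the four alanine codons
after = point_mutation(before, 0, "A")  # the single-nucleotide change
print(before, CODON_TABLE[before], "->", after, CODON_TABLE[after])
# Prints: GCC Ala -> ACC Thr : one base changed, one amino acid changed
One changed base is enough to swap one amino acid, and that is enough to alter the enzyme's behavior.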
The study was based in Hawaii, a good place to sample a variety of races, and the researchers looked at the genomes of 218 African-American and 172 Latino men with prostate cancer, and 261 African-American and 200 Latino controls, who were healthy. The researchers also looked at Asian and Caucasian men, but the link between the mutation and prostate cancer was not as clear.
The mutation was found in less than 1% of normal, healthy men, but among African-American and Latino men with prostate cancer, its frequency rises as high as 10%. This, however, is less than the whole story, as Reichardt pointed out in a release on the Internet: "The one thing I know is that this won't be the only gene discovered: it's sure to be the first of many."
But at least it is a start.
\BKey names for the \ILancet\i study:\b Juergen Reichardt, Nick Makridakis, Ronald Ross, Malcolm Pike, Laura Crocitta, Leigh Pearce, Brian Henderson, and Laurence Kolonel.
#
"New test can better uncover hidden breast cancer",1195,0,0,0
(Sep '99)
The main problem with surgical treatment of cancers such as breast cancers is that they may have metastasized, released cancerous cells which move through the body, and set up secondary cancers, rather like colonies. If surgeons cannot tell who has such secondary cancers, they need to treat their patients with aggressive anti-cancer treatments and chemotherapy.
A September report in \IThe Lancet\i described a test which appeared to discover hidden cancer cells in women with breast cancer more effectively than current detection methods, offering the hope that some women may be spared the harsher treatments when they do not need them. Currently, a general recipe of surgery, radiation and chemotherapy is considered best for everyone with early stage breast cancer.
In most breast cancer surgery, surgeons remove several lymph nodes, called axillary nodes, to see if the cancer has metastasized. Pathologists slice the nodes, stain them with two dyes called haematoxylin and eosin, and examine them for signs of cancer, but around 25% of patients who show no sign of cancer in the nodes end up developing cancer in other places in their bodies. Detection involves a search under the microscope for cells with the shape and clustering that indicate cancer. This is highly reliable when the cancer is typical and established, but some cancers behave abnormally, and a newly established cancer may be hard to spot.
The new test involves antibodies which react to proteins that are present in breast cancer cells, but not present in the normal cells of the lymph node. When the antibodies are applied to the node tissue, they bind to these proteins, producing a color change in the cancer cells which a pathologist can detect under the microscope. Even very small numbers of cancer cells can be detected through this new technique.
The new procedure was validated in a study of 738 breast cancer patients who were considered to have no evidence of metastases in their lymph nodes by conventional analysis. The patients' lymph nodes were then tested with the antibody mixture to look for signs of occult (hidden) cancer. The tests revealed occult nodal metastases in 20% of the patients using the antibody technique.
As a rule, these hidden metastases were more commonly found in lymph nodes from women with large breast cancer tumors, which typically have a higher risk of developing metastases, but some were also detected in patients who had small tumors in the breast.
The researchers noted a curious age effect, where older (post-menopausal) women with occult metastases seem to have a greater chance of developing overt metastases and dying of breast cancer than younger (pre-menopausal) women. They think this may indicate differences in breast cancer spread and behavior between older and younger women, and the finding will no doubt attract research attention in the near future.
That finding aside, the technique appears to be fully validated, and the antibody detection method holds promise for other types of cancer as well, including melanomas, colon carcinoma, non-small-cell lung carcinoma and esophageal carcinoma.
\BKey names:\b Richard Cote.
#
"Enteroviruses: a new threat?",1196,0,0,0
(Sep '99)
The \INew England Journal of Medicine\i looked in detail at enteroviruses during September. These are a group of viruses within the picornavirus family which commonly infect humans across the world, and they cause a number of illnesses such as hemorrhagic conjunctivitis, myocarditis, pericarditis, several central nervous system syndromes (most commonly aseptic meningitis), as well as a variety of fevers, with or without respiratory tract symptoms. Coxsackievirus A16 is associated with hand-foot-and-mouth disease, a "vesicular ulcerating eruption" which affects the hands, feet, and mouth, and lasts about 5 to 10 days.
In the past, the enteroviruses have been divided into subgroups largely on the basis of differences in the range of hosts and pathogenicity: the polioviruses, group A coxsackieviruses, group B coxsackieviruses, and echoviruses. The viruses are further divided by their reactions to serum neutralization tests, explaining names like "Coxsackievirus B1", but there is considerable overlap, making the classification system less than perfect.
Since 1970, enteroviruses have been given identifying numbers, starting with enterovirus 68 and extending to the most recently recognized member, enterovirus 71, first isolated in 1969. This is the virus at the center of the new reports on recent virus disease outbreaks in Taiwan, the main focus of the \INEJM\i report. From March to December 1998, 129,106 cases of hand-foot-and-mouth disease and herpangina were reported by sentinel physicians.
If the sentinel physicians were dealing with a representative sample of the population of Taiwan, then as many as 1.5 million cases may have occurred in Taiwan. The viral isolates that were obtained from inpatients during the outbreak were 61.9% enterovirus 71 and 27.5% Coxsackievirus A16, and the remaining 10.6% were other enteroviruses.
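The 1.5 million figure is a straightforward scaling of the sentinel count. Here is a minimal Python sketch; the sentinel coverage fraction is back-calculated from the article's own numbers, not quoted by the \INEJM\i report, so treat it as an assumption.
# Back-calculating the implied sentinel coverage from the figures above.
reported_cases = 129_106       # seen by sentinel physicians, Mar-Dec 1998
estimated_total = 1_500_000    # the article's upper estimate for Taiwan

coverage = reported_cases / estimated_total
print(f"Implied sentinel coverage: {coverage:.1%}")        # about 8.6%

# Running the estimate forwards, with that coverage as an assumption:
assumed_coverage = 0.086
print(f"Estimated total cases: {reported_cases / assumed_coverage:,.0f}")
# about 1.5 million, recovering the article's figure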
Since it was recognized, enterovirus 71 has been seen in sporadic cases and outbreaks in various parts of the world, including the United States, Brazil, Europe, Australia, and Malaysia. Three other large outbreaks of enterovirus 71 infection "with severe and fatal cases" have occurred in Bulgaria, Hungary, and Malaysia. The reasons behind the 1998 Taiwan epidemic are still unclear, but the enteroviruses are believed to be spread by fecal-oral and, perhaps, respiratory routes.
Two possibilities offer themselves: either there was a new strain of enterovirus 71 that was particularly efficient in transmission throughout the population, or some change in the form of spread occurred - which might relate to a new strain. The shape of the epidemiologic curve of the outbreak is consistent with respiratory spread, say the \INEJM\i writers, but without more information regarding immunity in the population before the outbreak, and the evolution of this epidemic, it is hard to be certain.
Once again, the best we can do is to tag this as another area of scientific interest and importance to us humans, which needs further study.
#
"Does a fungus cause chronic sinusitis?",1197,0,0,0
(Sep '99)
Mayo Clinic researchers in the USA say they have found the cause of most chronic sinus infections. They believe it is an immune system response to fungus. And that, they say, gives them a chance to attack this very common problem, which affects an estimated 37 million people in the United States - about one in seven Americans, and similar proportions in other populations - and the condition has been getting more common over the last decade.
Chronic sinusitis involves an inflammation of the membranes of the nose and sinus cavity, and common symptoms are runny nose, nasal congestion, loss of smell, and headaches. Often the chronic inflammation leads to polyps, small growths in the nasal passages which hinder breathing.
The researchers began with an expectation that about 10% of all cases would be fungus-allergy related, and found that ". . . in fact, fungus is likely the cause of nearly all of these problems. And it is not an allergic reaction, but an immune reaction."
The research involved collecting and testing mucus from the noses of 210 patients with chronic sinusitis. They found fungus in 96% of the patients' mucus. They identified a total of 40 different kinds of fungi in these patients, with an average of 2.7 kinds of fungi per patient.
It appears that, in sensitive individuals, the body's immune system sends eosinophils to attack fungi and the eosinophils irritate the membranes in the nose. As long as fungi remain, so will the irritation. So now the way is clear, for the first time, to treat the cause of the problem, instead of the symptoms.
#
"Getting proteins past the cell membrane",1198,0,0,0
(Sep '99)
It is an accepted fact of biochemistry that most biologically active molecules - enzymes, hormones, and so on - are large molecules. This makes life difficult for those carrying out medical treatments, because most drug chemicals need to be inside a cell to have much effect. Some chemicals can have an effect by attaching to a receptor on the outside of a cell, but the rest need to get inside, and that is a challenge, because cell membranes get in the way.
With this background, it is easy to understand why a report published in \IScience\i early in September created a lot of excitement in pharmaceutical companies. Essentially, the news was that researchers have slipped a protein into the cells of living mice - a protein more than 200 times larger than the typical drug molecule that can just get past the membrane barrier. Better still, the protein has been shown to function within the cell.
There is a major advantage in getting large proteins into cells, mainly because smaller drugs tend to interact with unintended targets, which means they are wasted, and larger doses are required. Larger proteins fit only onto the molecules for which they were designed, so they could be given in lower doses, resulting in fewer side effects.
In simple terms, says Steven F. Dowdy who led the research team, "For the very first time, we've introduced a large, biologically active protein into every cell of the body - including cells in the brain that are normally protected by the blood-brain barrier." Dowdy is an assistant investigator of the Howard Hughes Medical Institute and an assistant professor of pathology and medicine at Washington University School of Medicine in St. Louis.
A group led by Dowdy reported in \INature Medicine\i last December and January that they had managed to smuggle an enzyme into HIV-infected cells. The idea was to introduce a human enzyme that makes cells self-destruct, but their enzyme was modified to include a string of 11 amino acids which served as a passport for crossing a cell's outer membrane. The researchers needed to prove that large proteins could slip into cells in model animals before considering human applications, and this is exactly what they have now done.
They attached a molecular "passport" called a protein transduction domain (PTD) to a compound whose uptake by cells could be monitored. This is a normal experimental practice, and they used a common compound, a dye called fluorescein, which glows green under suitable illumination.
This dye is normally locked out of cells because of its size - the limit for drugs entering cells is around 500 \Jdalton\j, and the molecular weight of fluorescein is around 2000 dalton. This test was passed with flying (green) colors when the researchers injected mice with this combined PTD-fluorescein protein. After 20 minutes, they isolated cells from the animals' blood and spleens, and found that all the cells fluoresced green.
The next step was to go after a serious enzyme: they linked a bacterial enzyme to the PTD and fluorescein. Their fluorescent analysis revealed that the 120,000 dalton enzyme, beta-galactosidase, entered all the cell types tested. But fluorescence by itself is not enough: the important question is whether the enzyme would still be active "on the other side". Enzymes are proteins, and proteins pass through more easily if they are partially unfolded, and this might inactivate the enzyme.
The enzyme is able to turn a clear chemical target into a blue dye, and the target cells in a mouse kidney turned a pleasing shade of blue, showing that the enzyme was active inside the cell. The liver, lung, and other tissues of the injected mice also turned blue when exposed to the enzyme's target, and the animals' entire brains also stained blue within four hours of injection. So within that time, the enzyme had refolded and become active - and the PTD managed this without destroying the blood-brain barrier. More important, the animals had no visible behavioral changes or other differences compared with untreated mice.
Dowdy is reported to be working on modified versions of the PTD that should allow proteins to enter brain cells and other cells even more rapidly, and he is also using his new technology to determine whether a malfunctioning protein helps jump-start cancer.
In a press release, he notes that the laboratory's protein-targeting technology also may enable companies to create drugs that act only in disease-related cells, opening up a completely novel avenue of therapeutic approaches. "We can now do things in normal cells of mice that you could never even dream of doing with any reliability a year ago," Dowdy said.
#
"Preventing HIV transmission to infants",1199,0,0,0
(Sep '99)
Computer modeling suggests that an antiviral drug, nevirapine, may produce the cheapest and fastest way of stopping HIV transmission from mother to child in sub-Saharan Africa. The study, published in \IThe Lancet,\i simulated the costs and effectiveness of treating 20,000 mothers, assuming a 30% incidence of HIV infection. It took into account various factors such as rates of HIV infection, breast-feeding habits, life expectancies, and effectiveness of the treatment, based on recent medical research. The nevirapine treatment was significantly cheaper and more effective than three similar drug treatments that have recently been tested.
Cost-effectiveness can be improved in several ways: finding better drugs, applying drugs more effectively, applying drugs where they will do the most good, and reducing the cost of the drugs. A computer model has the advantage that it provides good guidance without putting people in danger. The results of modeling need to be confirmed in reality, but this work suggests that there are grounds for hope.
The problem is that sub-Saharan Africa is poor, and is unlikely to be saved by expensive treatments: the human will is just not there. HIV/AIDS is a devastating problem, even if people are not prepared to throw the world's resources against it, so if human will cannot be changed, the solution may be to find treatments that can be afforded.
Around 3 million children have been infected with HIV since the pandemic began, 90% of those in Africa. While long-term treatment with the antiviral drug AZT is the most effective way to prevent HIV transmission, it is expensive and must be started early in pregnancy. Most African mothers cannot afford AZT and often do not get prenatal care, thereby missing their chance to begin treatment.
A clinical trial called HIVNET 012, published in the same issue of \IThe Lancet,\i showed that nevirapine treatment resulted in almost 50% fewer HIV cases in infants compared with a short-course AZT treatment. This was followed by the computer modeling, looking at five treatments involving nevirapine, AZT, and lamivudine, in different standard treatments.
The universal nevirapine program would cost approximately $80,000 a year to treat 20,000 women, and would prevent 603 infections during a 12-month period. That gives each prevented case a dollar cost of $138, against amounts as high as $2,809 for the most expensive method.
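The cost-per-case figure is simple division, and worth checking. A quick sketch using the article's rounded numbers follows; the small mismatch it reveals is just rounding.
# Reproducing the cost-effectiveness arithmetic with the article's figures.
women_treated = 20_000
annual_cost = 80_000           # US$, quoted as "approximately"
infections_prevented = 603     # over a 12-month period

print(f"Cost per woman treated: ${annual_cost / women_treated:.2f}")  # $4.00
cost_per_case = annual_cost / infections_prevented
print(f"Cost per prevented infection: ${cost_per_case:,.0f}")  # about $133
# The article quotes $138 per case; 138 * 603 = $83,214, which rounds
# down to the "approximately $80,000" program cost given above.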
#
"Third generation pills and thromboembolism",1200,0,0,0
(Sep '99)
Danish research, reported in the \IBritish Medical Journal\i in September, made a cautious suggestion that there \Imight\i be a link between the level of primary venous thromboembolism in women of fertile age and the increased use of third generation oral contraceptives.
Rates for primary venous thromboembolisms among women fluctuated around 120 per million person years during 1977-88, but increased to about 140 per million person years during 1989-93, while the rates for men remained constant during this period. The use of third generation pills went from 0.2% of all oral contraceptives in 1984 to 17% in 1988, 40% in 1990 and 66% in 1993.
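To put those person-year rates in proportion, here is a one-line check of the size of the rise, a minimal sketch using only the figures quoted above.
# The rates are cases per million person-years of observation.
rate_1977_88 = 120    # women, before third generation pills dominated
rate_1989_93 = 140    # women, as third generation use climbed

ratio = rate_1989_93 / rate_1977_88
print(f"Rate ratio: {ratio:.2f}, a rise of about {ratio - 1:.0%}")  # ~17%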
The increase in women's admissions for this condition, although small, could not be explained by changes in diagnostic procedures or in the threshold for admission, since no increase was seen among men. The problem is that a correlation, such as we see here, does not necessarily mean that one causes the other: a change in female drinking, smoking, dieting or even cosmetic practices could turn out to be the real cause. The information does sound a quiet warning that the situation needs to be watched.
Against that, four previous studies have come up with similar results, and while each has been attacked for flaws in design, there seems to be a pattern emerging. But as a \IBMJ\i editorial commented, ". . . in a $3bn world contraceptive market the stakes are high" - and that applies whether the link is a valid one or just a coincidence. The third generation pills are less androgenic, less likely to produce masculine characteristics, and may cause less acne, which could make them attractive to some women, but the editorial concludes that "It is not that third generation contraceptives are unsafe - it is just that we have something safer."
#
"Hemoglobin started as an enzyme",1201,0,0,0
(Sep '99)
The nematode worm \IAscaris lumbricoides\i is an intestinal parasite which infects one billion people worldwide. It has uncommonly strong hemoglobin that binds oxygen 25,000 times more tightly than does human hemoglobin. In fact, this hemoglobin binds oxygen so strongly that many scientists believed the molecule could not possibly play any role in respiration.
So researchers have been trying to find out why a molecule which functions as an oxygen carrier would bind oxygen so tightly that the oxygen molecules would never come off - and now they have found a link between the way primordial bacteria detoxify themselves of atmospheric gases, and the modern mammalian hemoglobins that manage respiration.
The work was reported in \INature\i at the end of September, and has the potential to explain how a wide variety of organisms evolved the ability to use or guard against atmospheric gases such as oxygen and nitric oxide. It may also become a source of new cancer treatments, offering us ways to starve tumors of the oxygen they need.
In the worm, hemoglobin acts as an enzyme to neutralize oxygen, which in high doses is toxic to the worm, but we use almost the same molecule to deliver oxygen and nitric oxide to tissues. The worm uses nitric oxide to trigger an enzymatic reaction which actually consumes oxygen, suggesting that the main role in the nematode is to get rid of oxygen. This would make sense, because the \IAscaris\i parasite has a low tolerance for oxygen.
The human intestine is a low-oxygen environment, but even there, odd molecules of oxygen can seep in, and that is when the worm's hemoglobin plays its role as a deoxygenase, driving what the researchers think is a ten-step chemical reaction.
The two hemoglobin types, worm and human, differ only in the location of a single sulfur-containing amino acid, cysteine, within the oxygen-binding pocket of the nematode's hemoglobin. Put the cysteine at the back, away from the oxygen, and you have normal mammalian hemoglobin, but if it is at the front, near the oxygen, the hemoglobin uses nitric oxide to destroy oxygen.
In the world's early atmosphere, nitric oxide came before oxygen, and it was probably there before any life form, so the first bacteria needed a way to protect themselves from the nitric oxide. Like oxygen, this chemical can destroy many biologically important molecules if it is let loose inside an organism. On this theory, it is highly likely that the first hemoglobins had nothing to do with oxygen at all.
Recent years have revealed a number of roles for hemoglobin, apart from the "standard" one. For example, when hemoglobin binds to nitric oxide (NO) it causes blood vessels to dilate, but no previous discovery has suggested that any hemoglobin could function as an enzyme to catalyze a series of chemical reactions.
So the picture we have now is that in bacteria, hemoglobin is used as an enzyme to destroy NO, while in mammals, hemoglobin carries both oxygen and NO, using the NO to ensure oxygen delivery by dilating blood vessels. In the nematode, NO is used to remove oxygen, which means that it is using hemoglobin as mammals do in one sense: as a regulator of oxygen, though it is eliminating the oxygen, rather than using it. This puts the \IAscaris\i worm on the evolutionary divide between the bacteria and the higher animals.
Work done over the past three years by Stamler and his colleagues has brought about a major change in the way scientists see the importance of NO. Their publications, say observers, have been laying the groundwork for challenging accepted beliefs about hemoglobin and have revealed entirely new functions.
\BKey names:\b Jonathan Stamler, Dena M. Minning, Andrew J. Gow, Joseph Bonaventura, Rod Braun, and Mark Dewhirst.
#
"DNA sequencing advances",1202,0,0,0
(Sep '99)
The September issue of \IGenome Research\i carried a report of a new miniaturized device that works far more quickly than current machines to sequence DNA. Developed by Dieter Schmalzing, Daniel Ehrlich, and colleagues at the Whitehead Institute in the USA, the machine has already been used in the successful sequencing of "real world" DNA samples.
The workers set out to use a microdevice made from glass wafers to sequence "typical" human DNA samples, as prepared for the Human Genome Project. They chemically etched the glass wafers with long, microscopically thin channels no more than a hundredth of a centimeter wide and about twelve centimeters long. Using short side channels, the researchers injected tiny "plugs" of DNA, labeled with fluorescent dyes, at one end of the channels, then applied an electric field to separate the DNA strands and move them down the length of the channels and over a fluorescence detector which identified the components of the DNA.
According to the report, the device accurately and rapidly sequenced DNA from human chromosome 17, offering a prospect that DNA sequencing could be done almost anywhere.
The October issue of the journal \INature Biotechnology,\i released in late September, promised that gene chips may soon be within easy reach of most biologists. These chips can "deconstruct" long segments of DNA, allowing scientists to search through the pieces for genes that promote disease, or for the genetic switches that govern such biological phenomena as aging, and the DNA codes that permit microorganisms to make antibiotics.
Right now, the chips have to be customized, which explains why the single supplier, Affymetrix, sells the chips for as much as US$12,000, and may take months to prepare a chip if it contains the DNA of a specific organism or tissue. (Affymetrix can also offer off-the-shelf versions for around US$2,500.) A group of scientists from the University of Wisconsin-Madison say they can cheaply and simply manufacture the customized chips, using MAS, short for Maskless Array Synthesizer.
Standard gene chip manufacture relies on photolithography, a process which requires shining ultraviolet light through a series of stencil-like masks onto a glass chip to synthesize tens of thousands of DNA molecules of interest. Sometimes, as many as 100 masks are required to make a single chip which has as many as 500,000 tiny DNA-laden compartments - making the cost and time involved in producing a gene chip a little easier to understand.
The new system uses an off-the-shelf Texas Instruments technology, normally used in overhead projection, and called Digital Light Processors. This is based on an array of 480,000 tiny aluminum mirrors arranged on a computer chip.
The Wisconsin team, a mix of molecular biologists and semiconductor engineers, found they could shine light in very specific patterns by manipulating the mirrors. This entirely eliminates the need for the delicate and expensive masks used in traditional DNA chip technology. One of the team describes it as rather like desktop publishing, reducing manufacturing time from several weeks to about eight hours.
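A sketch of the scheduling idea may help. In each round of chemistry, one nucleotide is coupled, and a spot's mirror is switched on (illuminating, and so activating, that spot) only if the next base of the probe being built there matches the base on offer. The three probes below are invented for illustration; the real instrument drives 480,000 mirrors through many more cycles.
# Hypothetical sketch of how mirror on/off patterns replace physical
# masks in maskless array synthesis. Probe sequences are invented.
probes = {
    "spot1": "ACGT",
    "spot2": "AAGC",
    "spot3": "CGTA",
}
progress = {spot: 0 for spot in probes}   # bases coupled so far per spot

cycle = 0
while any(progress[s] < len(p) for s, p in probes.items()):
    base = "ACGT"[cycle % 4]              # offer A, C, G, T in rotation
    # Mirrors to switch on: spots whose next required base is on offer.
    pattern = [s for s, p in probes.items()
               if progress[s] < len(p) and p[progress[s]] == base]
    for s in pattern:
        progress[s] += 1                  # those spots grow by one base
    print(f"cycle {cycle:2d}: couple {base}, mirrors on: {pattern}")
    cycle += 1
Because the pattern is just data, changing the chip design means changing the probe list, not fabricating a new set of masks - which is the "desktop publishing" analogy above.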
The Wisconsin group has applied for a patent for the new technology through the Wisconsin Alumni Research Foundation, a not-for-profit corporation which manages intellectual property on behalf of UW-Madison scientists. Rights to the technology have been licensed to a Madison-based company known as NimbleGen Systems, founded by three of the paper's authors, which will focus on development and commercialization of the new technology.
#
"First adult diabetes gene",1203,0,0,0
(Sep '99)
The first genetic defect linked to insulin resistance, the precursor to most cases of adult diabetes, was reported in the September issue of the journal \IDiabetes.\i This is a useful breakthrough, because the gene produces an abnormal protein which may now be targeted by new classes of drugs. Around 20% of the population of Europe and the USA are insulin resistant, a condition in which the body's insulin cannot efficiently metabolize glucose, and about a quarter of those, some 5% of the population, develop adult diabetes when they become unable to maintain normal insulin and glucose levels. Adult diabetes, also called type 2, or NIDD (non-insulin-dependent diabetes), is by far the most common form of diabetes.
There is a protein called PC-1, produced in abnormally high concentrations by many insulin-resistant people. This interferes with insulin's ability to stimulate cells' use of glucose, and that has drawn the attention of researchers to the PC-1 gene. The mutation involves just one nucleotide in 2500 (an adenine is replaced by a cytosine, changing one amino acid from lysine to glutamine), but this produces a form, called the Q allele, which is about three times more common among insulin-resistant people and more than twice as frequent among adult diabetics than in people with normal insulin function. This finding was only possible when the entire gene was mapped.
This form of diabetes is a geneticist's nightmare, since the condition is largely lifestyle dependent, and can be limited by exercise (or, as a short report in the \INew England Journal of Medicine\i revealed in August, by spending some time in a hot tub!). So pinning down even one gene is a major advance.
\BKey names:\b Vincenzo Trischitta, Antonio Pizzuti.
#
"GM: the scientists begin to speak out",1204,0,0,0
(Sep '99)
August and September saw a groundswell of scientific response from around the world to the anti-genetically modified (GM) scare campaigns which grabbed most of the headlines in 1999. Scientists are now approaching the GM issue from their own specialist areas, and applying a bit of common logic to some of the more extravagant claims of GM opponents.
It may be a little too late to roll back public opinion, but if the scientists do succeed, it will be largely because of the enthusiastic over-statement of the case against GM, which has so delighted GM's opponents and enthused the tabloid journalists. They have used the monster of "Frankenstein foods" to terrify their public, but now the monster is likely to turn around and devour \Ithem.\i As we noted last May (see \JEngineered corn and monarch butterflies\j), those who invoke the monster's name have a poor understanding of both literature and genetics, and now they may pay the price of that ignorance.
Last May we reported that alarm bells were ringing over the alleged threat from GM corn pollen to monarch butterflies. Now two prominent entomologists, Anthony M. Shelton from New York, and Richard T. Roush from Australia, warn that three recent studies on the effects of genetically engineered crops have distorted the debate about engineered crops and that this could have "profound consequences" for science and public policy.
They write in "False reports and the ears of men," in the September issue of \INature Biotechnology,\i that the public should not be swayed "by laboratory reports that, when looked at with a critical eye, may not have any reality in the field or even in the laboratory."
They say that one study, carried out by Shelton's Cornell associate, John E. Losey and colleagues, and the basis of our May article, "can only be considered a preliminary laboratory study." Shelton and Roush agree that this result was expected under such laboratory test conditions, but they question whether this test was realistic. In simple terms, the doses of pollen fed to caterpillars were entirely unrealistic, and scientists, they say ". . . need to make assessments that are pertinent to the real world."
The second study they criticize came from researchers at Kansas State University who reported in \IScience\i that they had discovered corn borer resistance to Bt toxins. They are unimpressed by the methodology, noting that there are in fact many Bt toxins, and ". . . that the authors did not demonstrate that resistance was actually to the same Bt toxin as in the plant or that the insects could survive on the Bt plant." They are also critical that what they call a "questionable laboratory study" should then become a basis for a lot of serious debate.
The third paper they criticized, a recent University of Arizona study, showed that the pink bollworm's resistance to Bt-cotton was recessive in inheritance, but it also raised the question of whether resistant bollworms develop more slowly than susceptible bollworms.
The problem here is that so long as the resistance gene is present only at low frequencies, most copies of it will be carried as a masked recessive gene and have no selective effect, so with random mating, the recessive gene should never build up in numbers to the point where it can swamp the dominant form of the gene which makes the borers vulnerable. But if the resistant bollworms developed more slowly, they would not mate randomly: they would mate with each other more often than chance would predict, building up their numbers. "We hope that the take-home message won't be converted to another premature claim that Bt crops are doomed," Shelton and Roush said in their commentary.
More importantly, Shelton and Roush asked why a previous and more relevant and realistic study had been largely overlooked. While the Losey et al. laboratory study showed high mortality among monarch larvae that ingested genetically engineered pollen, an Iowa State University study by Laura Hansen and John Obrycki showed low mortality even when monarch larvae were fed milkweed that had the highest levels of Bt pollen that would be encountered in the field.
They commented that it was unlikely these high Bt pollen levels would be encountered by the insects in the field, and said, "few entomologists or weed scientists familiar with the butterflies or corn production give credence to the \INature\i article."
At about the same time, Ben Miflin of the Institute of Arable Crops Research (IACR), was addressing the British Association Festival of Science in Sheffield, UK. His theme: how safe are the "normal" crop plants, the ones without even a hint of GM about them? He warned that the GM food debate is taking place in the absence of much public understanding of conventional crops.
Most plants in the wild, even close relatives of crop plants, produce deadly poisons to discourage animals and insects from eating them. Potatoes, for example, carry levels of a nerve poison that would probably be unacceptable for an artificial product; cassava contains substances which generate fatal levels of cyanide; and even beans can cause intense stomach cramps. And these are all natural crops, which we use happily.
The crops we use are generally modified by centuries of breeding to be less harmful, or we have learned careful and sometimes lengthy preparation to make our foods fit for consumption, as happens with beans and cassava. Natural organically grown crops can still be deadly: a British weed, corncockle \I(Agrostemma githago),\i and a fungal infection called \Jergot\j \I(Claviceps\i sp.) can make wheat highly poisonous - the drug LSD is a chemical derivative of the active component in ergot. Only careful weeding or herbicide use along with rigorous selection of seed can ensure that these sources of poison in the grain are not passed on.
Miflin also made the point that some of our crops show severe signs of natural genetic modification - all without direct human assistance. The analysis of the DNA of domesticated wheat shows that it has six or eight wild ancestors, and it has three sets of genetic material, which happens only rarely in the wild. It is clear that farmers at some time in the past managed to cross wheat with unrelated species to create a species unlike anything in nature.
And by natural breeding, farmers have created a plant which cannot survive in the wild, because the seed's distribution mechanism has been bred away in crops like wheat, so as to ensure that we do not lose valuable seed when it drops from the stalk. But once we have made our crops safe for us, are they safe forever?
In short, no, they aren't safe at all. Conventional breeding techniques may re-introduce genes coding for poisons, and at least one new potato strain had the poison solanine accidentally re-introduced into it. At least when GM techniques are used, the methods are precise enough to ensure that the genes for poisons are not accidentally introduced into new varieties, while desirable qualities are more accurately targeted. On top of that, GM crops are inspected and tested far more closely than any natural crop needs to be.
Our crops began, says Miflin, when our Neolithic ancestors first became farmers, some 8,000 years ago, and they have been developing since then. Cultivated species have become separated from their wild ancestors, often by assimilating genes from other species in a random and uncontrolled manner. GM, says Miflin, is a way of continuing this process in a more controllable and less arbitrary way. According to him, "we should use the best of both GM and conventional technology in a knowledge-based way."
#
"The first clear picture of a ribosome",1205,0,0,0
(Sep '99)
Ribosomes are a cell's molecular machines, translating the genetic code and synthesizing proteins. Without ribosomes, no cell could survive. Inside every living cell, tens of thousands of ribosomes churn out proteins with mind-boggling speed and precision, but up until now, the fine detail has been missing. Images of the structure of ribosomes were published in \IScience\i in late September, bringing us a whole new level of understanding.
Researchers at the University of California, Santa Cruz (UCSC) believe they have taken a huge step toward solving one of the most fundamental and daunting problems of molecular biology by obtaining images of the complete structure of the ribosome, a key component of all living things, and few would be inclined to disagree with them. The images show how different parts of the ribosome interact with one another and how the ribosome interacts with certain molecules involved in protein synthesis.
They do not tell the whole story, but the images show us that the ultimate goal of understanding exactly how the ribosome works is finally within reach. Harry Noller, head of the group which obtained the new images, commented that "What we have at present are a few snapshots, and ultimately what we would like is a movie of the ribosome in action."
The ribosome comes from a bacterium, \IThermus thermophilus\i: most research has focused on bacterial ribosomes, which are a bit smaller than those in higher organisms. The image was achieved through x-ray crystallography, using the crystallography facility at Lawrence Berkeley National Laboratory, one of a handful of sites with a synchrotron capable of producing the high-energy x-rays needed for crystallography of a structure as large as the ribosome.
The complete ribosome is claimed to be the largest molecular structure ever solved by x-ray crystallography: some virus structures of comparable size have also been solved, but their symmetry allowed scientists to extrapolate from results obtained from a small portion of the whole structure.
\BKey names:\b Harry Noller, Jamie Cate, Gloria Culver, Marat Yusupov and Gulnara Yusupova, and Thomas Earnest.
#
"Nobel awards announced",1206,0,0,0
(Sep '99)
While this monthly science news was being prepared, news of the Nobel Awards came through. This month, we give you a quick preview of the awards, with an in-depth look at their significance next month.
The 1999 Nobel Prize for Literature was awarded to Günter Grass, best known for his book "The Tin Drum", published in 1959. The official citation describes Grass as a writer "whose frolicsome black fables portray the forgotten face of history."
Born in Danzig in 1927, Grass has described himself as a "Spätaufklärer", a belated apostle of enlightenment in an era that has grown tired of reason. His other well-known works include "Cat and Mouse" and "Dog Years".
The 1999 Nobel Prize for Physiology or Medicine was awarded to Günter Blobel, for his discovery that proteins have intrinsic signals that govern their transport and localization in the cell. This discovery becomes important when we realize that proteins are continually being made in parts of cells, but that they then need to be transferred past a membrane barrier, either to another organelle in a cell, or even into another cell.
This immediately raises the questions: how are newly made proteins transported across the membrane surrounding the organelles, and how are they directed to their correct location? Günter Blobel, a cell and molecular biologist at the Rockefeller University in New York, discovered in the 1970s that newly synthesized proteins carry a signal which is essential for them to cross the membrane of the endoplasmic reticulum, one of the cell's organelles. Over the next 20 years Blobel worked out the molecular mechanisms underlying these processes, and he showed that similar "address tags" direct proteins to other intracellular organelles.
The 1999 Nobel Prize for Physics was awarded to Gerardus 't Hooft and Martinus J.G. Veltman for elucidating the quantum structure of electroweak interactions in physics. Their work has placed particle physics theory on a firmer mathematical foundation. They have in particular shown how the theory may be used for precise calculations of physical quantities. Experiments at accelerator laboratories in Europe and the USA have recently confirmed many of the calculated results.
With the development of large accelerators, starting in the 1950s, it became possible to study matter at the level of quarks, and to study the creation of new particles and the forces that act between them. All that was needed was a good theory to explain the observations, and to guide the experiments. This came in the middle of the 1950s, and resulted in the standard model of particle physics, which groups all elementary particles into three families of quarks and leptons, interacting with the help of a number of exchange particles for the strong and electroweak forces. The work of 't Hooft and Veltman, says the Nobel Foundation press release, has given researchers a well-functioning "theoretical machinery" which can be used for, among other things, predicting the properties of new particles.
The 1999 Nobel Prize for Chemistry was awarded to Ahmed H. Zewail for his studies of the transition states of chemical reactions using femtosecond spectroscopy. In particular, Zewail showed that it is possible to use a rapid laser technique to see how atoms in a molecule move during a chemical reaction, bringing the reaction into slow-motion view.
Zewail has undertaken pioneering investigations of fundamental chemical reactions, using ultra-short laser flashes, on the time scale on which the reactions actually occur. He has brought about a revolution in chemistry and adjacent sciences, since this type of investigation allows us to understand and predict important reactions.
Zewail's technique uses perhaps the world's fastest camera, relying on laser flashes of femtosecond duration, down at the time scale on which the reactions actually happen. There are 10\U15\u fs in a second, so 1 fs is 10\U-15\u seconds. This time scale has given the field its name: femtochemistry. The use of these methods helps to explain why some chemical reactions take place but not others, and why the speed and yield of reactions depend on temperature.
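To get a feel for that scale, a back-of-the-envelope calculation (ours, not Zewail's) shows that even light barely moves in a femtosecond:

# Back-of-the-envelope femtosecond arithmetic (illustrative figures only).
c = 3.0e8             # speed of light, in meters per second (rounded)
fs_per_second = 1e15  # by definition: 1 fs = 10**-15 s

print(c / fs_per_second)   # 3e-07 m: light covers only ~0.3 micrometers in one femtosecond
# A chemical bond is ~1e-10 m long, so flashes this brief can freeze atoms mid-reaction.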
The 1999 Nobel Peace Prize was awarded to the organization Médecins Sans Frontières (Doctors Without Borders), which was founded in the early 1970s. They take the view that all disaster victims, whether the disaster is natural or human in origin, have a right to professional assistance, given as quickly and efficiently as possible.
They argue that national boundaries and political circumstances or sympathies must have no influence on who is to receive humanitarian help, and they use their perspective to assist in other areas of health as well. The group is uniquely placed to create openings for contacts between the opposed parties in critical situations, marked by violence and brutality.
The 1999 "Nobel" for Economics (strictly, the Prize in Economic Sciences in memory of Alfred Nobel) was awarded to Robert A. Mundell "for his analysis of monetary and fiscal policy under different exchange rate regimes and his analysis of optimum currency areas."
Mundell's work is the basis of the theory which dominates practical policy considerations of monetary and fiscal policy in open economies. His work dates back several decades, but his contributions were described by the Nobel Foundation as so outstanding that they "constitute the core of teaching in international macroeconomics." In his key work he studied how the effects of monetary and fiscal policy are related to the integration of international capital markets. This involves seeking answers to questions like how these effects depend on whether a country fixes the value of its currency or allows it to float freely, and whether a country should even have a currency of its own.
#
"Single mothers and schoolwork",1207,0,0,0
(Sep '99)
Does growing up in a single-parent home put children at risk of poor school performance or social or behavioral problems? In the past, studies have given inconsistent results, but now a large, multiethnic Cornell University study has found that single motherhood does not necessarily compromise how well-prepared six- and seven-year-olds are for school. This question has been worrying US policy-makers because one-parent families with children under 18 years of age soared from 7% to 24% of the nation's families between 1970 and 1990. In that period, the proportion of children under 18 living in one-parent families went from 12% to 27%, and similar patterns are to be seen in a number of other western countries.
One-parent families typically have lower incomes, but a child's readiness for school depends more on the mother's ability and educational level. Interestingly, these were about the same in both of the large samples analyzed of single- and two-parent families, according to Henry Ricciuti, professor emeritus of human development at Cornell, who published his results in the September issue of the \IJournal of Family Psychology\i (http://www.apa.org/journals/fam.html).
Ricciuti reports "strong and consistent links in white, black, and Hispanic families between a mother's education and ability level and her child's math, reading, and vocabulary scores, as measured in the home by survey interviewers."
Single motherhood was defined as the mother having no partner or spouse living at home at the time of the survey. The average mother's age at birth of her child was 20-21 years, and the survey examined the effects of single motherhood on school readiness, achievement, and behavior in about 1,700 six- and seven-year-old children from white, black, and Hispanic families in the National Longitudinal Survey of Labor Market Experience of Youth.
So if there \Iare\i adverse effects to education from single-parent families, these effects must arise later in childhood. Ricciuti takes this to mean that any " . . . support for single-parent families should begin in the early childhood years, when the potential adverse effects of single parenthood have not yet emerged or become apparent as they tend to do in adolescence and young adulthood."
#
"Breast-feeding and babies' IQs",1208,0,0,0
(Sep '99)
A report published in the October issue of the \IAmerican Journal of Clinical Nutrition\i indicates that a breast-fed baby's IQ is three to five points higher than that of formula-fed babies. James Anderson at the University of Kentucky in the USA reports that breast-feeding, compared with formula feeding, is associated with significantly higher levels of cognitive development. The difference increases the longer a baby is breast-fed, and low birth weight babies receive the greatest benefits.
There are two obvious explanations for such an effect: it could derive from nutrients found in human milk and not available in formulas, or it could be a matter of maternal bonding. Anderson's study was a meta-analysis, a critical review and summary of results from 20 clinical studies. "Infants deprived of breast milk are likely to have lower IQs, lower educational achievement, and poorer social adjustment than breast-fed infants," said Anderson in a statement released on the Web.
With a number of studies to rely on, it is possible to use statistical methods to assess whether the cause is breast milk nutrients, such as the long-chain polyunsaturated fatty acids docosahexaenoic acid (DHA) and arachidonic acid (AA), which may support neurological development, or bonding, or a combination of the two.
The results: nutrition seems to be good for 3.2 IQ points if all else remains the same, while maternal bonding seems to be good for 2.1 IQ points when other factors are constant. What is more, the enhanced cognitive development was seen as early as 6 months and was sustained through 15 years of age. The longer a baby was breast-fed, the greater the increase in cognitive developmental benefit.
While IQ is a touchy subject, especially when applied to an individual by somebody unfamiliar with basic psychometrics, it is a perfectly valid and adequate tool as a group measure. Even though the nature of the "intelligence" being measured is open to doubt, there is clearly a benefit to be gained from breast-feeding.
The analysis accounted for such factors as the mother's age and intelligence, birth order, race, birth weight, gestational age, and socioeconomic status. The study was funded in part by Martek Biosciences Corporation, which produces a plant source of DHA and AA for inclusion in formula sold worldwide, and this needs to be kept in mind when examining Anderson's conclusion that these are important elements in the relationship.
There is, however, a healthy literature on these subjects, suggesting that even if the company is arguing its own cause, it may well be doing so in the US public interest. During pregnancy a mother mobilizes DHA and AA to support brain development, and then continues this support through her milk. In more than 60 countries, including most of Europe, infant formulas contain DHA and AA, but they are not found in formulas sold in the United States. Clearly, the hope is that this will lead to the use of these supplements in US formulas - or perhaps to more babies being breast-fed, where that is possible, but the link to DHA and AA is more tenuous: the effect could be related to some other chemical in breast milk.
Meanwhile, an Australian study, published in the \IBritish Medical Journal\i during September, suggests that waiting until a child is at least four months of age before giving it milk other than breast milk may be a good move. The researchers say breast milk for this period of time may protect against asthma and atopy (a predisposition to various allergic reactions) when the child is older.
Australia is a world leader when it comes to suffering from asthma, and so it is also an active center for research into the condition. Wendy Oddy, Senior Research Officer, TVW Telethon Institute for Child Health Research, West Perth, Australia, and her colleagues studied 2,187 children in Perth, following their progress from antenatal clinics to the age of six years. They found that there was a significant reduction in the risk of childhood asthma by the time they reached this age, if they had been exclusively breast-fed for at least four months after they were born.
The important measure is not how long breast-feeding went on for, but the age at which other milk was introduced into an infant's diet. They say, however, that they cannot definitively reject the possibility that it is breast feeding itself that is of prime importance.
#
"Making a smart mouse",1209,0,0,0
(Sep '99)
One day, scientists may be able to boost human intelligence - at least if we are to believe the media reactions to a report in \INature\i early in September, that Princeton University researchers have genetically modified mice to have improved learning and memory. So as long as we regard human intelligence as the equivalent of maze-running, this could be a useful breakthrough, but some of the observed effects would almost certainly influence what we call human intelligence.
The researchers found that adding a single gene to mice significantly boosted the animals' ability to solve maze tasks. The strain of mice, named Doogie, also shows, as adults, certain brain features of juvenile mice, which, like young humans, are widely thought to be better than adults at grasping large amounts of new information.
The real point is that the research shows us a common biochemical mechanism at the root of nearly all learning. The brain uses the same basic tool when it forms associations, and this seems to hold, even though parts of the brain work in apparently different ways, on different types of data such as sight, sound, and taste.
It also confirms ideas about how memory operates, first put forward by Donald Hebb in 1949: certain synapses are strengthened. At the same time, the finding offers a way in which genetic technology could be used to "improve" humans, and this is bound to cause a strong backlash, perhaps explaining why researchers have been quick to suggest that the methods might be useful in treating dementia.
A gene called NR2B is a key switch that controls the brain's ability to associate one event with another, and this is the core feature of learning, at least at a simple level. Mice which lack this gene in a tiny region of their brains have impaired learning and memory, suggesting the gene's importance, but adding the same gene is a more rigorous test of the gene's function.
The NR2B gene exists in humans, but its potential for enhancing learning in humans is not known, so aside from the likely drawn-out debates on the ethics of enhancing human performance, we are still a long way from seeing any practical impact, and it is quite likely that the debate will be diverted to the ethics of using a drug which enhances the activity of the gene itself.
The NR2B gene codes for a protein that sits on the surface of neurons and serves as a docking point, or receptor, for certain chemical signals. This receptor, for \IN\i-methyl-D-aspartate but always called NMDA, is like a double lock on a door; it needs two keys, two signals, before it will respond, and this is the secret of a memory effect, linking two separate events in an association, like that of a lit match and a burning feeling.
In young animals, the NMDA receptor reacts even when the events are far apart, relatively speaking. This means that the young are able to learn more easily by forming associations, but after adolescence, the receptor becomes less responsive, making learning more difficult. This effect has been seen in species as different as songbirds and primates.
The treated mice were not only given extra copies of the NR2B gene, but these were set to increase in activity as the mice age, reversing the normal "age" effect. The experimental mice had a much greater learning response than normal mice. As adults, their brains retained physical features that usually characterize juvenile animals; in particular, they had a high level of plasticity, the ability to form long-term connections between neurons. This may help to explain the strain's name, "Doogie", which commemorates the precocious character on the television show "Doogie Howser, MD."
Some of the research centered on measuring the number and function of NMDA receptors at individual synapses. Once this was done, it was easy to decide whether the altered gene led to increased NMDA receptor activity.
So how do you test the memory of a mouse? In this case, the answer was to put the mice into a space and let them explore two objects for five minutes. After a delay of several days, one object was replaced with a new one and the mice were returned to the space. A normal mouse would give equal time to each of the objects, indicating that the original object was no longer remembered, while the transgenic mice gave their time over to exploring the new object. With repeated tests, it became apparent that the gene-modified mice remembered objects four to five times longer than their normal counterparts.
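Studies of this kind usually quantify the preference with a simple recognition index. A minimal sketch, with invented exploration times rather than the study's actual data:

# Novel-object recognition index (the times below are invented, for illustration).
def recognition_index(time_novel, time_familiar):
    return time_novel / (time_novel + time_familiar)

print(recognition_index(30.0, 30.0))   # 0.5: equal exploration, no memory of the old object
print(recognition_index(45.0, 15.0))   # 0.75: a strong preference for the new object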
The transgenic mice also had better "emotional memory," which was tested by putting the mice in a chamber where the floor gave them mild electric shocks, and after one hour, one day or ten days, the transgenic mice had a much more pronounced fear response than the control mice, when returned to the same chamber. The same result occurred when the shock was associated with a tone, rather than with a particular chamber - this is exciting, because hearing uses a different type of "brain circuitry".
Just as importantly, while the transgenic mice were better at recognizing a situation where shocks were likely, they were also better at recognizing that the situation was not as bad as they recalled. And finally, the transgenic mice scored better at spatial learning, when they were put in a pool where there was a hidden platform that they could use to get out again. The transgenic mice learned to find the platform after three sessions, while the control mice required six.
\BKey names:\b Joe Tsien, Guosong Liu, Min Zhuo, Ya-Ping Tang, Eiji Shimizu, and Claire Rampon.
\BBackground link:\b William Calvin's MIT Press book "The Cerebral Code" is also available on the Web at http://williamcalvin.com/bk9/ and this has a useful glossary at http://williamcalvin.com/bk9gloss.html
#
"Earthlike planets?",1210,0,0,0
(Sep '99)
With the number of planets and brown dwarfs around other stars now exceeding 30, with two planetary objects around pulsars, three protodiscs and some 14 unconfirmed objects orbiting out there, astronomers are starting to wonder if we will ever see Earth-like worlds.
Well, we already have, according to a story in \INew Scientist\i in late September. Independently, two groups have found promising candidates for low-mass planets that could play host to life. The large extrasolar planets found so far have been detected by looking for stars that "wobble" due to the gravity of planets that orbit them. The problem is that it takes a massive planet to do this, and low-mass planets like ours just do not have the mass to make their mark.
One group watched the way the gravity of stars magnifies the light from objects behind them, an effect called gravitational lensing: if two stars line up, the star behind is magnified briefly, and any planet passing behind the near star, to one side of its "sun", may also show up as a magnified blip - and that is exactly what one group think they have seen. They believe it may be as light as a few Earths, and that it orbits its star in the inner zone where rocky planets such as ours are likely to form.
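For the record, the brightening follows the standard point-lens magnification formula of microlensing theory; this is the textbook result, not the group's specific model:

import math

# Standard point-lens microlensing magnification. u is the source-lens
# separation in units of the Einstein radius (textbook formula).
def magnification(u):
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

print(round(magnification(1.0), 2))   # 1.34: the classic value at one Einstein radius
print(round(magnification(0.1), 1))   # 10.0: close alignment gives a bright, brief blip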
The other main method is to look for small changes in the brightness of a star as a planet moves in front of it, and a second group reports that a faint double star system 55 light years away called CM Draconis may be host to a planet 2.5 times the size of Earth. What is more, the planet is right in the middle of the "habitable zone", the not-too-hot, not-too-cold region for planetary orbits.
Jump directly to the \INew Scientist\i story at http://www.newscientist.com/ns/19990925/newsstory8.html
#
"Mars Climate Orbiter goes missing",1211,0,0,0
(Sep '99)
Towards the end of September, NASA's first interplanetary weather satellite was expected to complete an engine procedure which would place it in orbit around Mars. Instead, it disappeared behind Mars, and never reappeared. The probe was behind the planet and out of radio contact during most of the 17-minute engine firing which should have placed it in orbit, after a trip that began on December 11, 1998.
The orbiter and its companion, the Mars Polar Lander, both had instruments onboard designed to discover what happened to the water which may have once formed rivers or lakes on the planet. Mission scientists say water is the key to determining whether life ever existed on Mars. Fortunately, the lander was launched separately, and does not arrive at Mars until December 3.
Significantly, as it turned out, the report of the loss described how the "1,387-pound orbiter" was to slow to "9,840 mph from its interplanetary cruising speed of 12,300 mph". Further, we were told, "the signal takes 11 minutes to cross the 121.9 million miles to Earth". The significant feature is the choice of units in the report.
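As a quick consistency check (our own arithmetic, with standard conversion factors), the quoted distance and the quoted delay do at least agree with each other:

# Check the quoted 11-minute signal delay against the quoted distance.
MILES_TO_KM = 1.609344
distance_km = 121.9e6 * MILES_TO_KM    # ~1.96e8 km to Mars at the time
c_km_per_s = 299792.458                # speed of light, in km/s

delay_minutes = distance_km / c_km_per_s / 60
print(round(delay_minutes, 1))         # 10.9, matching the reported "11 minutes"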
The original orbit was to be highly elliptical, and then, over 45 days, the spacecraft would dip gently into the planet's thin atmosphere and use the slight drag of that atmosphere to make it take up a more circular orbit. By mid-November, the probe would have been in a "262-mile-high" orbit and ready to relay data between Earth and the Mars Polar Lander when it arrived.
In space terms, the whole project is a "cheap" US$327.5 million mission jointly known as Mars Surveyor '98. Recent Martian studies have included Mars Pathfinder and its rover, Sojourner, which explored the geology of the planet and found some evidence in 1997 that water may once have flowed, while Mars Global Surveyor has been mapping the surface and finding further evidence of past water since late in 1997. (See \JA river ran through it\j, February 1998, \JMartian dunes?\j, Feb 1999, and \JEvidence of plate tectonics on Mars?\j, May 1999 for earlier stories.)
By the following day, when none of the fall-back contact options had worked, mission controllers had given up hope of finding the craft again, and had gone into investigative mode, trying to find out why the craft had crashed. Their news releases, again significantly, reported that they now believed the US$125 million NASA spacecraft had come within "37 miles" of the surface, and then crashed. Software problems and human error, said project operations manager Richard Cook, were most probably to blame - and how right he was!
It now appears that a mix-up over metric and "English" measurements caused the problem. These English measurements, according to American sources, are the familiar terms (to Americans) like inches, feet and pounds, while "metric units" (that is, SI units), used exclusively in the rest of the scientific world, are also commonly used in space exploration - but, evidently, not universally.
A spacecraft team at Lockheed Martin Astronautics in Colorado apparently submitted acceleration data in pounds of force, but JPL navigators assumed the numbers were metric newtons. Error checkers failed to notice the discrepancy, thus committing an error that would cause any physics student to be shredded by his lab supervisors - one of the greatest crimes for an undergraduate is not making sure of the units being used.
Pounds of force and newtons are not the same thing, and when you exchange one for the other, while navigating a narrow window of opportunity behind a planet, far, far away, this is not a recipe for success and career enhancement. In reality, though, the problem is not so much why the error happened as why it was not detected.
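The size of the mismatch is easy to quantify. A minimal sketch (our arithmetic, with a hypothetical thrust figure, not the mission's actual numbers):

# Pounds-force read as newtons: the discrepancy the error checkers missed.
LBF_TO_N = 4.4482216    # one pound-force, expressed in newtons

reported_value = 100.0              # hypothetical figure submitted in pounds-force
assumed_newtons = reported_value    # the mistaken reading: the bare number, taken as newtons
actual_newtons = reported_value * LBF_TO_N

print(round(actual_newtons / assumed_newtons, 2))   # 4.45: each burn ~4.45 times stronger than assumed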
Like the problem with the simple O-ring on Challenger, this small error has been hugely magnified at a great distance - recalling to mind what \JArchimedes\j said about levers.
"People sometimes make errors," said Dr. Edward Weiler, NASA's Associate Administrator for Space Science. "The problem here was not the error, it was the failure of NASA's systems engineering, and the checks and balances in our processes to detect the error. That's why we lost the spacecraft."
Mars Polar Lander is still set to arrive on December 3, and now mission controllers will have to rely on direct communications with Earth as well as relaying information via the Mars Global Surveyor. "Our clear short-term goal is to maximize the likelihood of a successful landing of the Mars Polar Lander on December 3," said Weiler. "The lessons from these reviews will be applied across the board in the future."
#
"Europa's cracks created by ocean tides",1212,0,0,0
(Sep '99)
Scientists at the University of Arizona believe they can explain how Europa's chains of scalloped lines are created, according to a report in \IScience\i in mid-September. The lines have been known since the \JVoyager Project\j revealed the details in 1979: chains of scalloped lines, joined arc-to-arc at the cusps, running for hundreds of miles across the frozen, fractured surface. These strange "cycloidal" features, or "flexi," as they were named by the International Astronomical Union, had remained a mystery until now, with no good explanations.
Europa has a 160 km (100 mile) thick layer of water, but the visible top layer is frozen. Gregory V. Hoppa, B. Randall Tufts, Richard Greenberg, and Paul E. Geissler of the University of Arizona Lunar and Planetary Laboratory believe that the cracks are a result of daily tides, with the subsurface ocean rising and falling under the icy surface. They have come up with this notion after painstakingly modeling and scrutinizing images of Europa taken by the Galileo spacecraft between 1996 and 1999. The new images show that cycloidal cracks and ridges are widely distributed over all of the moon.
Europa is about the size of our moon. Tidal stresses on its ice-covered ocean ebb and flow as it orbits Jupiter, which is 300 times as massive as Earth, so Europa's ocean tides rise and fall a distance of 30 meters. On top of this, Europa's orbit is eccentric, so that its distance from Jupiter varies, and it is this variation, they say, which causes the cycloids to form.
Hoppa has posted extra information, pictures and animations on the Web at http://pirlwww.lpl.arizona.edu/~hoppa/science.html
#
"Paperless publishing?",1213,0,0,0
(Sep '99)
A learned journal, once created as a means of sharing information, becomes a prestigious organ, able to draw a huge advertising revenue, and suddenly, there is angst and ill-feeling among those claiming ownership. The world's main science journals for reporting original research include \INature,\i the \INew England Journal of Medicine\i (NEJM), \IScience,\i \IThe Lancet,\i \IThe Journal of the American Medical Association\i (JAMA), and the \IProceedings of the National Academy of Sciences\i (PNAS). Of these, only \INature\i (and its stable of more recent specialist publications) is under the control of a private owner. All the rest belong to public organizations, whose first priority is to gather and share information.
Recently, there was a change of editor at the \INEJM,\i when the Massachusetts Medical Society, which owns and publishes the journal, decided not to renew Jerome P. Kassirer's contract. According to the new editor, in a frank editorial early in September, this was the result of a long-standing struggle between Kassirer and the society's leadership over the leadership's ambitious plans to expand its role as a medical publisher, although the July announcement referred only to "honest differences of opinion between Dr. Kassirer and the Medical Society over administrative and publishing issues."
What Kassirer strongly opposed in his capacity as editor-in-chief was the use of the journal's name to promote products for which he and his staff had no responsibility, with plans in place for launching and acquiring new publications, repackaging the NEJM's content for consumers, and entering into joint arrangements ("cobranding") with various information-based commercial enterprises.
Marcia Angell, the new editor of NEJM, spells out the problem as she sees it: "For many in the academic research community, it is an invaluable and integral part of their professional lives; furthermore, much of the research published in the journal is publicly funded, and its peer reviewers are all volunteers." In other words, while the society might wish to capitalize on a fine resource in an economically rational fashion, she argues that there are other values than economic ones to consider.
This editorial restraint and concern is not necessarily found in the other learned journals, great or small, privately or publicly owned, around the world. Editors of print journals are resisting both the move to online publishing and pressure to reduce the cost of journals to learning institutions around the world. To the extent that research relies on access to other published research, research in the developing world is hamstrung by the sheer cost of gathering even a representative sample of journals.
Instead, scientists in the world's poorer countries must rely on access to specialist Internet lists and willing specialists in the richer countries who can access and transmit the information they need. The result is that the third world scientists gain less and less experience in key elements of research, such as carrying out their own literature searches, limiting their development still further.
As we reported last month, there are strong moves afoot in the academic community to use more online publishing, to publish faster, to make information freely available to the public and scholars alike. At a time when the pressures of economic rationalism seem to demand that scholarly publishing be turned into a "user pays" cash cow, the same pressures are stripping away the financial base which gave scholars the money they needed to access data.
The Internet is undoubtedly making changes that nobody would have anticipated. Critics of the genetic manipulation frenzy, which is now working the world into a lather of fear about GM foods, claim that the reality is one of a few activists using the Internet in a clever way, to spread their fears. The activists claim that without the Internet, the "Frankenstein foods" would already have overtaken us. It certainly appears that the activists have managed to get in first in this particular battle.
Other observers say that the UN intervention in East Timor has arisen in part from the effective use of the Internet to present a case for the East Timorese. Historians recall the way in which Rowland Hill introduced the Penny Post into Britain, providing a way in which campaigners against the Corn Laws could spread their ideas.
The message is clear to scholars: if they are forced to pay exorbitant prices for journal articles, they will look to find some way around the impasse. As a group, academics tend to be at the front of the "information wants to be free" school of thought, but most of them realize that this is about as sensible as saying that petrol (or gasoline) wants to be free - somebody needs to pay for the basic production, quality control and distribution, but academics can see clearly that there are cheaper ways of spreading the word than in the hugely expensive paper journals of the world.
Here is the nub of the problem faced by publishers, whether they are of the book variety, the newspaper variety, or the learned journal variety: the next few years are likely to see a huge shift in the way information is spread. Right now, the limitation is how people can read electronic text - critics are quick to point out that it is hard to curl up with a good computer, but that could change with a single item: the "killer appliance".
Personal computing changed around the time the "killer application," VisiCalc, came on the market. It was the first spreadsheet, it revolutionized the way offices worked, and it validated the pre-existing computer technology. People who already owned computers were delighted; people who did not own computers wanted them.
In the same way, the first successful e-book, an electronic device able to present text (and possibly pictures) in a readable form, will be able to both validate and capitalize on the huge established base of machine-readable literature. What needs to be decided is whether the device will read HTML or some variant of it, or plain ASCII text, or something else instead (our bet is that the device will be fitted with a ROM carrying applications able to handle several different formats, including HTML, Adobe Acrobat's PDF {Portable Document Format}, and ASCII).
Weight is the other real problem: a smallish book weighs around 400 grams (a bit less than a pound), while a book which can be managed in comfort weighs about 600 grams (about a pound and a quarter to a pound and a half). So long as an e-book comes in at under 600 grams, it has potential, but low weight means a smaller battery, which reduces its battery life - perhaps the answer is to have a separate plug-in battery pack which can supplement the on-board batteries, as well as a power pack running off the mains for home, classroom and office use.
The screen will need to be readable in bright sunlight, which probably means using an LCD screen which can be back-lit when needed - these have been tried and tested over the years in all sorts of hand-held devices. Obviously a paper-white display with "black print" would be desirable, and several ideas are around for just such a display, but the first practical e-books are likely to use LCD screens. And because of battery limitations, these devices will almost certainly contain no hard discs or other moving parts.
Over the years, various proposals have been made for devices which clip down over one eye, to provide hands-free reading, and a number of prototypes have been shown at trade shows or on news broadcasts, but so far, no manufacturer seems to have bitten the bullet, and moved into mass production of such readers. This brings us to a critical chicken-and-egg problem: unless the e-texts are available to be read on the e-books, the e-books will not sell, but until the e-books are available, will anybody be willing to provide e-text?
The answer, it seems, is a resounding "yes". People are producing e-text already, for all sorts of other reasons, and this means that there is a sufficient body of material available to justify the production of e-books, readers which will store and display perhaps 32 Mb of text and illustrations, downloading these from a conventional computer which took them from either a CD-ROM or from the Internet.
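How much reading is 32 Mb? A rough estimate, under our own assumptions (plain ASCII text, a typical novel of some 100,000 words, about six bytes per word including spaces):

# Rough capacity estimate for a 32 Mb e-book reader (assumptions ours).
capacity_bytes = 32 * 1024 * 1024   # 32 Mb of on-board storage
words_per_novel = 100_000           # an average-length novel
bytes_per_word = 6                  # ~5 ASCII letters plus a space

bytes_per_novel = words_per_novel * bytes_per_word   # ~600,000 bytes per novel
print(capacity_bytes // bytes_per_novel)             # 55 novels, before any illustrations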
And returning briefly to our starting-point, once e-books are available, scholars and librarians will not wish to pay huge amounts to store paper copies of research reports, just in case they become relevant in the future. Instead, they will wish to store bits rather than atoms, with several layers of fall-back open to them if they should lose their files. Photocopiers will get less use, trees will be less threatened, and classrooms will change their focus.
There is, of course, one drawback: if a future Dark Ages should descend upon us, how easy will it be for later generations to recover the lost information stored on tiny silvery discs, if the plastic starts to break down? CD-ROMs appear to have an excellent shelf-life, far better than magnetic media such as tapes and discs, but will it extend to hundreds of years? A book can be read even when it is falling apart, but will we be able to do the same when all the batteries have gone flat, and the generators have been cannibalized to make spears?
It matters little: the e-publishing revolution is already here, and not just in CD-ROMs like this one. The revolution, it appears, may be unstoppable, but the future of paper seems assured for quite a few years yet.
Clifford Irving illustrates this well. Irving was blacklisted by publishers after he prepared a purported biography of Howard Hughes. Using the methods of a novelist, and detailed research, Irving assembled a life of the notorious recluse, claiming that it was an authorized work, and featuring long quotations from the subject of his book. By January 1972, the McGraw-Hill Book Company had announced the imminent publication of \IThe Autobiography of Howard Hughes: Introduction & Commentary.\i
At that time, Hughes was locked in a battle with Paul Getty and the Sultan of Brunei for the title of "richest man in the world" and he was also world-famous as a moviemaker, a record-breaking test pilot, and as the lover of several dozen Hollywood movie stars. He was also a total recluse.
Irving had reckoned without Hughes, who surfaced long enough to denounce the forthcoming book as a fraud, which left Irving stranded, and blacklisted by his American publishers. The book itself could not be published, and would not be published still, were it not for the action of a company called Terrific Books (closely associated with Irving) who decided to create a Web site (http://www.terrificbooks.com/) to market his book, advertising it as "Blacklisted for 27 years".
In an interview by e-mail, Irving commented to us: "Sales are steady but awfully slow. As a writer, I haven't got the time to devote to this venture; I'm just hanging in there. I want to get back to \Iwriting\i. No, I \Iam\i back to writing." Irving told us he had in mind becoming a multimillionaire as a result of founding a cutting-edge Internet publishing venture, but may end up with just an interesting learning experience. He added that Terrificbooks.com "have a couple of books lined up by best-selling writers but they're on hold."
Aside from the prospect of publishing those books, the Terrificbooks exercise is more like traditional self-publishing of books on paper, but sold through the Internet (or by phone in the US, adds Irving: "505-988-9662. We ship free"). In that, it is closer to Barnes and Noble or Amazon.com (late news: Amazon has just taken on the Hughes autobiography), which sell print on paper rather than bits of information.
#
"Paperless publishing? (2)",1214,0,0,0
(Sep '99)
In a different vein, we find a firm like 24x7, operated in Australia, let it be said, by our parent company. The aim of 24x7 is to give people access to the best computer books in a hurry, by carrying some 600 indexed titles. Publishers include Macmillan (Que and Sams), Osborne/McGraw-Hill, MIT Press, John Wiley and Sons, and Sybex, among others.
The search engine is extremely robust and intelligent. If it is fed a really stupid search string like "computing or compute or computer or Java or Internet or chip", it looks at the content, and returns just the 10 most relevant books, but given "Java" alone, it returns 142 titles, allowing you to then refine your search to a winnowing of just those particular titles.
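We have no access to the engine's internals, but a toy version of the kind of term-weighted relevance ranking such engines typically use might look like this (the titles, word lists and scores are all invented):

import math

# A toy tf-idf relevance ranker (illustrative only; not 24x7's actual code).
books = {
    "Java in 21 Days": "java applet class java object web",
    "Networking Basics": "internet protocol network chip router",
    "Learn HTML": "internet web page html browser",
}

def score(query, title):
    words = books[title].split()
    total = 0.0
    for term in query.lower().split():
        tf = words.count(term) / len(words)                      # term frequency in this book
        df = sum(term in doc.split() for doc in books.values())  # how many books use the term
        total += tf * math.log(len(books) / df) if df else 0.0   # rare terms weigh more
    return total

ranked = sorted(books, key=lambda title: score("java", title), reverse=True)
print(ranked[0])   # "Java in 21 Days": the most relevant title for the query "java"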
A subscriber can choose to read a chapter or chapter portion online, or print it, or order the book online - at the moment, only American suppliers of hard copy are available, but we understand that it is due to change. So 24x7 sells both atoms (hard copy) and bits. (To make contact, go to http://www.24x7.com/ and use the email link found there.)
Moving along to a more radical form of publishing, there is Novels Online, who are, as you might expect, on the Web at http://www.novelsonline.com/novels/novels.htm. Details of their acceptance methods can be found at http://www.novelsonline.com/novels/submissions.htm.
The text is set up in HTML using Front Page, though not without the occasional glitch, and the home page promises another book (their third) "in August". This underlines the point that you do not need a large amount of capital to set up as a publisher of electronic texts, though authors are more likely to be happy dealing with a larger organization, especially if they are from overseas. Novelsonline sells e-texts for US$3.95, and takes the credit card fee plus 40% of the purchase price.
Compared with this backyard operation, another Web book publisher, 1st Books Library (http://www.1stbooks.com) advertises that it has more than 2,000 electronic books, including hundreds of free books, and established authors get a 40% royalty on prices which range from $2.00 to $5.95 for electronic copies, while hard copy, sometimes available, is generally more expensive. (New authors get a 30% royalty, and there are set-up fees as well.) The royalties are a higher percentage because the prices are lower, so that overall, the writer will get about the same payment for a work.
The hard copy in this case uses Print On Demand (POD) technology, said to be available from some 24,000 points in the USA and 1,700 points outside the USA. Think of a device looking like several large photocopiers lined up end to end, and costing anywhere up to $250,000, and you have a modern digital printer. Wherever one of these exists, and is connected to the Internet, you have a potential POD outlet.
Imagine you are the author of \IPond Snails of the Gobi Desert,\i which is a work of excellent scholarship, likely to sell 200 copies a year in a good year. It would be next-to-impossible to get a publisher to look at the work, because the costs of typesetting, editing, proofing, printing, maintaining stock and distributing copies would just be too great. And you can forget about color printing for the pictures . . .
Now imagine that you have prepared your book on a word processor, you have scanned in your pictures of all the snails of the Gobi, and you have stored the text, the pictures, and the formatting information on a Web server somewhere. You can now offer the work for sale either as an electronic file, to be stored on another computer, or as a POD book.
The purchaser, having selected your book as a POD purchase (to take into the field) and as an e-book to refer to at home, clicks on two buttons, provides credit card details, collects the electronic file immediately, and then must travel to the nearest POD outlet, perhaps at a local print shop, perhaps at a local book shop, where the book will be waiting, all neatly bound. There are no freight costs to get the book there, no real inventory costs, and typesetting, editing and proofing have been thrown back on the author.
This is probably the biggest drawback in the POD system, since few authors are capable of producing a good book without the hand of an experienced editor - they are too close to the story they are telling, and take too much for granted. Even text like this has gone through the hands of an editor who checks grammar, sense and phrasing, to ensure that the reader gets an even break. This is easy enough to fix - it is just a matter of getting authors to realize that they cannot get by without an editor.
Alongside these new trends there is more normal "Web publishing". Taking a real-life example, your reporter and his family happen to have an interest in algae, but this is, we would have to admit, unusual. Moreover, even in that specialized household of biologists, none of us would be likely to buy a book on freshwater algae, though there is a suitable work available, \IFreshwater Algae in Australia\i by T. J. Entwisle, J. A. Sonneman, and S. H. Lewis (published by Sainty & Associates). Now this information is also being introduced onto the Web, as part of the Sydney Royal Botanic Gardens' new Plantnet service.
This can be accessed at http://plantnet.rbgsyd.gov.au/index.htm, where first-time users have to register (registration is free, at the moment). Services available include a weed alert, information on plants at risk, and just added, Tim Entwisle's freshwater algae section, still under construction, but already showing considerable signs of promise. While the printed book remains available for true specialists, the basic information and the illustrations from the book will be on the Web.
The information goes beyond the original book - according to Entwisle, "based on Day \Iet al.'s 1995 Bibliographic Checklist of Non-Marine Algae in Australia,\i the census consists of a searchable, nomenclaturally revised database of freshwater algae reported from Australia, including distribution data and preliminary conservation codes," and it also relies on a number of other recent works.
Studies like this census of "non-marine" (i.e., terrestrial, aerial, and freshwater) algae are much easier to search in electronic form, and many of them would be unlikely to gain commercial publication in this day and age. A hint for first-time users: there was no obvious browse facility, but inserting a space or the letter "e" seems to serve to flush out most or all of the records for viewing.
The letters column of the September \INew England Journal of Medicine\i was full of discussions about the E-Biomed proposal (\JSerious publishing\j, August 1999). One writer, supporting the idea, pointed out that Internet dissemination of the results of clinical research is inevitable, and that it would be far better to have this done from a high-quality, peer-reviewed site under the leadership of a publicly funded institution, such as the National Institutes of Health (NIH). This would give credibility, an international reach, and independence from commercial support.
Another writer, also supporting the idea, noted that the start of on-line publishing appeared to be blurring the distinction between journals and data bases. At the same time this might affect journal review and financing, said two other correspondents, arguing that electronic publishing would allow scientists and clinicians access to a more detailed archive of data than is available in print journals. They instance scientific and clinical papers which are associated with extremely large data sets, such as whole-genome sequences or the results of large, multi-center clinical studies, and suggest that the electronic form of presentation would be a great deal more thorough than in a print journal.
(The various genome projects are now beginning to offer their data in searchable form, where scientists can insert a sequence of interest, to see if it turns up in somebody else's data as well. This aspect is more like data storage and retrieval than traditional publishing.)
Another pair of writers, Charlotte Bell and Keith Ruskin, opined that to " . . . advocate the continuation of the current traditional but cumbersome system in the face of the evolving new media is, at best, short-sighted and, at worst, may signal the loss of control by physicians of the information systems on which we depend." They see Internet publication as offering interactivity, multimedia capability, and dynamic response, and they point out that results and conclusions can be presented through interactive spreadsheets or streaming video, rather than being limited to text or still photographs.
They see the opportunity for the reader to download the data to carry out extra analysis (although this could carry its own problems - see \JComputers and chemistry: a look backwards and forwards\j, August 1999), and they suggest that review panels "can disseminate commentary to large, geographically diverse audiences without relying on interpretation by the media, allowing peer review to take place within a virtual community of experts and users."
Moreover, if the learned societies do not take over this task, they suggest, commercial interests will step in: "Pharmaceutical and biotechnology companies with virtually unlimited resources are already using principles of marketing and advertising, rather than those of science and scholarship, to disseminate biomedical information."
This consideration would no doubt alarm the NEJM's last correspondent, who points out that the biotechnology industry, "spawned less than a quarter of a century ago", has changed the nature of the game. Today, says Edmund C. Tramont, the stock exchanges respond to biomedical news, and in the past the announcement of scientific results, sometimes in advance of critical review, has caused stock markets to behave wildly. If there exists a "publication" sponsored by the federal government - which might be inferred from the NIH running "E-Biomed" - and lacking the traditional checks and balances, this could be easily misinterpreted or, at worst, misused.
The letters were in response to an editorial, and the author, Dr Arnold Relman, responded that it really came down to how you share the data - he favors publication first in a peer-reviewed paper journal, followed by on-line release - rather like the "man with a red flag" supposedly required to walk in front of motor vehicles in Britain in the 19th century. This would, he suggested, still allow for the fuller publication of data sets in the electronic format.
So it begins to look as though paperless publishing is on its way - but right now, there are people working to keep paper in there somewhere as a part of the equation. And where would you rather store your gourmet recipes - in a spiral-bound plastic-covered book that will lie flat on the kitchen bench, or on a computer that you will need to lug into the kitchen, hoping against hope that there will be no sun-dried tomato in the mouse, or virgin olive oil in the keyboard, when you are finished? The only alternative, it seems, would be to print out the page you needed - and that seems to make a mockery of the name "paperless publishing".
\BInteresting links\b: William Calvin, mentioned this month in \JMaking a smart mouse\j, has put nine of his books on the Web: see http://williamcalvin.com/ and follow the links to see how Web publishing may look in the future. A database of almost half a billion words can be searched through this front page, http://www.logos.it/literature/literature.html. Located in Italy, there are more than 160 languages listed for this collection, though some of them are represented by just a single work, ranging to more than 700 in French and Italian and nearly 2,500 in English. The major collection in English is Project Gutenberg, located at http://www.gutenberg.net/. Australian literature is presented at http://dargo.vicnet.net.au/ozlit/ and The University of Sydney SETIS collection at http://setis.library.usyd.edu.au/, which includes explanatory notes.
#
"Why paintings turn yellow",1215,0,0,0
(Sep '99)
A recent Dutch report has shed some interesting light on the question of why old paintings take on a yellowish appearance. Researchers at the FOM Institute for Atomic and Molecular Physics in Amsterdam have been looking into the chemistry of varnishes on old masters, and the effects of the cleaning process. They mainly studied particles of varnish taken from more than 50 paintings by such Dutch masters as Rembrandt, Steen, and Van Gogh.
Varnishes of that period were mainly made from tree resins, and they were put on the paintings both to protect the paint layer and give a shiny surface, and also to make the colors of a painting appear more intense. With age, however, the layer of varnish gradually changes from colorless to a yellowing, shrinking crust with the characteristic craquelé of old paintings, a network of fine cracks stretching across the surface. The yellowing with age has the effect of altering the whole painting, changing blue to green, for example.
The chemical composition of the varnish changes with time, some molecules being oxidized or decomposing into smaller molecules, while others polymerize into larger molecules. Restoration usually involves washing with various alcohols or acetone, solvents which easily remove the small molecules such as the oxidation fragments of the original triterpenoids in the resin, but the larger products are much harder to remove, and so they build up on the surface of the paint. The only solvents which remove these large molecules also take off the paint!
The researchers have been trying to reproduce this sort of breakdown in the laboratory, exposing pure triterpenoids or triterpenoid varnish to light in order to artificially induce accelerated ageing. In just a few weeks, the molecules had formed cross-links, creating the characteristic yellow color, but they still have to show that these larger complexes have the same chemical structure as the substances which cause the yellow color in aged varnish.
\BKey names:\b Jaap Boon, Gisela van der Doelen
#
"Tokaimura nuclear incident",1216,0,0,0
(Sep '99)
On September 30, a nuclear accident occurred at a Japanese nuclear fuel reprocessing facility. Three workers were reportedly exposed to serious radiation, and the area was evacuated. Tokaimura is a village of some 34,000 people in the Ibaraki Prefecture, 140 km (90 miles) northeast of Tokyo. There are 15 nuclear-related facilities in the area, and it was the scene of Japan's worst nuclear plant accident in 1997, when 37 workers suffered radiation contamination.
That figure has now been exceeded, with at least 55 people known to have been exposed to radiation, including 45 workers at the plant, three firemen and seven people working at a nearby golf course. The 1997 incident was also in a nuclear reprocessing plant, and involved a fire which caused radiation to escape, and also an explosion. Japan has no useful deposits of fossil fuel, and so has to rely on 51 commercial nuclear power reactors providing one-third of the country's electricity.
The incident began when workers at the plant, run by JCO Co. (owned by the Sumitomo Metal Mining Co.), put nearly eight times the proper amount of uranium into a mixing tank, triggering an uncontrolled nuclear reaction which left three workers seriously ill. The full scientific detail of this incident will be discussed in an in-depth report next month, to be called \JNuclear fuel processing methods\j. The report will look at the nuclear industry, nuclear fuels, and some of the concerns being expressed about the industry.
Hisashi Ouchi, one of the workers at the site, was reported to have suffered a radiation dose worse than anything experienced at Chernobyl. Doctors said he would need luck to survive. He received 17 sieverts of radiation - seven sieverts is considered a lethal dose for half of a population exposed to that level. Ouchi's chances of survival were put in the 5-10% range. Masato Shinohara, 39, received the second highest level of exposure, about eight sieverts, and both men were given stem cell transfusions in a treatment similar to that given to leukemia victims. Ouchi's stem cells came from his brother, while Shinohara, who has no close relatives who would make suitable donors, received his from the umbilical cord of a newly-born baby.
The accident has now been given a grading of level five, the same as the US Three Mile Island incident in 1979, while Chernobyl was rated level seven, the highest rating - at least for now. The main difference is that the effects were localized, unlike Chernobyl, which had global implications. Even so, the incident was a public relations disaster for nuclear power generally, and for Japanese nuclear power in particular.
Twelve days after the incident, it was discovered that a fan was still venting radioactive material from the plant. This was discovered when heightened levels of radio-iodine were detected around the plant.
#
"Jurassic mammals in Madagascar",1217,0,0,0
(Sep '99)
A report in \INature\i in early September revealed that three tiny teeth, embedded in a piece of lower jawbone from a small Jurassic period Tribosphenid mammal, had been found on the island of Madagascar. This single find more than doubles the age of the oldest known mammal from Madagascar, and adds significantly to the range of terrestrial fossils known from the island.
The Tribosphenids are the group which includes marsupials and placentals. This find overturns the belief that the Tribosphenids evolved in the Northern Hemisphere and later moved south, because this fossil is older than anything found in the north. It is still possible that something as old, or older, may be found in the north, as fossils like this are usually tiny and easy to miss, but it now seems more likely that the mammals evolved in the Southern Hemisphere, on the Gondwanan supercontinent.
In the Middle Jurassic period, some 165 million years ago, these small, furry creatures, the size of a house mouse, were scampering around under the feet of allosaurs and brachiosaurs, much earlier than most people think.
These small and inconspicuous mammals managed to avoid the larger dinosaurs, perhaps by being active at night - when the dinosaurs may have been less active, if they were cold-blooded at that time. The find helps to underline something that scientists have known, and the public have largely ignored: mammals and dinosaurs appeared on the evolutionary scene at about the same time, and the two groups lived side by side for more than 100 million years.
All the early mammal fossils are small fragments which are hard to find, but teeth are very good indicators, and these survive very well, though they are unlikely to be spotted in the field by the naked eye, which has helped maintain the popular misconception that mammals came later.
The teeth, three molars, are each smaller than the head of a pin. They were sifted out of bags of sediment and identified during microscopic inventory at the Field Museum of Natural History, where researchers were delving into material collected three years ago.
Previous molecularly based estimates of a Middle Jurassic divergence between the major groups of tribosphenidans, also called therians, had proposed that the divisions happened even earlier than this fossil find seems to imply. While the "molecular clock" group will want to see other fossils offering confirmation, the find reminds us of the importance of using \Iall\i of the evidence to get a consistent picture of past evolutionary history.
#
"Earthquake news",1218,0,0,0
(Sep '99)
By the end of September, accurate estimates for the death toll in August's \JTurkey earthquake\j were still not available, and the figure, stated only as "more than 15,000" could still have been as high as thirty or even forty thousand. Earthquakes of slightly greater magnitude and similar depth occurred in Taiwan (September 20) and Oaxaca, Mexico (September 30), killing about 2,100 and 20 people respectively. A smaller earthquake in Athens (5.9 against 7.4 in Turkey and 7.5 in the other two cases) killed about 200 people.
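Magnitude figures understate how different these events were, since the scale is logarithmic. By the standard energy-magnitude relation (our arithmetic, not the reports'), the Turkish event released vastly more energy than the Athens one:

# Seismic energy released scales as 10**(1.5 * magnitude) (Gutenberg-Richter relation).
def energy_ratio(m_larger, m_smaller):
    return 10 ** (1.5 * (m_larger - m_smaller))

print(round(energy_ratio(7.4, 5.9)))   # 178: the Turkey quake released almost 180 times Athens' energy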
One good point from all that death and destruction was the development of "earthquake diplomacy," with Greeks and Turks realizing that the "other side" was not so bad after all, considering the help which had been forthcoming, without question, on each occasion.
A magnitude 6.5 quake, centered about 20 km (12 miles) southwest of the city of Tehuacán, in the state of Puebla in Mexico, took place on June 15 last, and passed without notice, lacking the extra awareness that people felt after Izmit. All the same, it killed 17 people, damaged more than 1,000 schools, nine hospitals and more than 14,000 homes, and displaced about 20,000 people.
A report on the June damage was released in September, and it stressed the need to enforce special seismic design standards for critical buildings such as schools, hospitals and fire stations. Some of the hospitals, for example, were forced to close just when they were most needed for surgeries and emergency services.
Part of the problem stems from the roofing materials used: in Mexico, these tend to be heavier than the more expensive steel joists and girders preferred by American builders, and it is the movement of these heavy roofing elements, and the loads they impose, which actually does the damage. The quick answer, in some cases, is merely to add a few more walls to increase the structural integrity of key buildings. Julio Ramirez, who led the team of more than 20 researchers, said that in the case of one apartment building which collapsed, it is likely that just a few extra walls would have protected the building.
According to \INew Scientist,\i Eddie Grant, an electrical engineer at North Carolina State University, has designed a robotic caterpillar to listen for survivors trapped deep inside wrecked buildings, and six senior engineering students have built it. The robot is powered by compressed air, to avoid igniting any escaping gas.
The idea is that the robot would crawl through gas, water and sewerage pipes inside buildings damaged by earthquakes and explosions, listening for signs of life. Grant points out that even in a collapsed building, pipes are likely to offer a natural channel into the rubble.
The machine, called Moccasin II, is a meter long and 15 cm across, built in three segments, and carries lights, a video camera and a microphone to detect calls from survivors. It inflates four pneumatic pads to wedge its front end in a pipe, then pulls the back segments forward before wedging the back and extending the front further into the pipe. The 15 cm standard is common for service pipes in the USA, and Moccasin is able to make its way around multiple 90-degree bends.
The \INew Scientist\i article can be found at http://www.newscientist.com and the story appears in the issue dated October 2.
#
"How hot is the Earth's core?",1219,0,0,0
(Sep '99)
The old question has a new answer, according to a paper in \INature\i at the end of September. Scientists at University College London have developed a novel way of taking the core's temperature. Knowing this value is crucial, because the core drives a tremendous outflow of energy: the swirls and turbulence of the liquid iron in the core create the Earth's magnetic field, but without a knowledge of the temperature, it is impossible to understand how the mechanism works.
While it is possible to make an informed guess at the temperature by finding the melting point of iron under the enormous pressures at the Earth's center, the new answer comes from exploiting massively parallel computers - in this case, a Cray T3E.
Their result is a value of 6500° C, about the temperature of the surface of the Sun, but they suggest that the core is probably not pure iron, so a figure of 5500° C is probably more accurate. And that is the last word on the subject - for now.
#
"Fire ants and the elderly",1220,0,0,0
(Sep '99)
Tales of ants attacking and killing people sound like discarded plot lines from the horror movie industry, but fire ants in the United States are turning bad fiction into worrying fact. Fire ants arrived in the US from South America through the port of Mobile, Alabama, in the 1930s. They are now found all across the south-east of the US, and also in Puerto Rico, infesting more than 310 million acres (124 million hectares). They have eaten most of the native black ants, and have also been reported from some western states, including Arizona, California, and New Mexico.
The ants will attack people who disturb their mounds, and they have killed about 80 people in this way over the years, but two case reports in a September issue of \IAnnals of Internal Medicine\i describe worrying incidents of fire ant attacks in two Mississippi nursing homes. In each case, an elderly patient was found covered with ants, with ant trails leading from the floor to their beds. The ants thrive in disturbed habitats, such as those found around construction sites - both of the nursing homes were newly built on former farming land.
Fire ants usually get their food by stinging and killing invertebrates, but they have been known to kill farm animals if other food is unavailable. They also scavenge dead and dying animals, and must have taken the bed-ridden victims as "dying". There have been about ten previous attacks on humans indoors, mainly on the elderly and infants, and two previous deaths. The author, Richard deShazo, warns: "any sighting of a swarm of ants indoors is a warning. Residents and caregivers of infants, children and bedridden people, such as patients in health care facilities, should be closely watched until the ants are eliminated." He recommends assuming that if one fire ant is seen, an active infestation is present.
Fire ants are attracted to electrical fields, so deShazo recommends checking the interiors of all electrical equipment, including computers, air conditioners and circuit breakers, anything with wires, fuses and switches. As well, professional baiting and spraying may be called for, to kill the ant colony's queen.
The ants have not only wiped out native ants, but threaten some species of ground-nesting birds and other wildlife. The imported species, \ISolenopsis invicta,\i (there is also a major native US fire ant, \ISolenopsis xyloni,\i and a number of lesser native fire ants) usually nests in mounds formed in the ground. In one case report, the bed-ridden patient was checked at 1 am, when no ants were present, but at 4 am, many ants were present, and more than 500 bites were identified. This patient died five days later.
The second victim was also found covered in ants at a 4 am check, in a separate nursing home. His condition deteriorated slowly, and he died of apparent sepsis 13 months after the fire ant attack. The families of both victims are reported to have launched legal action against the homes, the builders, and the homes' pest inspection contractors.
The ants are fast-moving (1.6 cm/sec), and communicate by pheromones and also through visual and vibrational stimuli. They can accumulate on the body in large numbers before detection, and will then sting almost simultaneously, probably on some signal, with each sting bringing a sense of intense burning. The fire ant venom is unusual for an insect venom, as it is almost 1% protein, as well as "aliphatic substituted alkaloids that are cytotoxic" - organic chemicals which kill cells. The protein portion can cause severe allergic reactions in some victims.
#
"World population",1221,0,0,0
(Sep '99)
The experts who did not agree that the world population had already topped six billion were satisfied that the magic number was reached in October, but that is nothing compared with what some experts are predicting for the year 2100. In one scenario, democratically determined population-control practices and sound resource-management policies could see a sustainable population of two billion, while the alternative is a sprawling, brawling misery of 12 billion humans.
David Pimentel is the lead author of the report titled "Will Limits of the Earth's Resources Control Human Numbers?" in the first issue of the journal \IEnvironment, Development and Sustainability,\i and while he recognizes that it will be difficult, he argues that it should be possible. In fact, it might be the easy way out: "It will be much more difficult," Pimentel says, "to survive in a world without voluntary controls on population growth and ever diminishing supplies of the Earth's resources."
A lifestyle less wasteful of resources - not as luxurious as many Americans enjoy today, but about equal to that of the average European or Australian in the 1990s - could see stability, without the suffering we are now beginning to see as disease and malnutrition become the limiting factors on human population.
According to the report, the world's population is set to double again by about 2050. Even if a worldwide limit of 2.1 children per couple were adopted tomorrow, Earth's human population would continue to increase for more than 60 years before stabilizing at around 12 billion, because of "population momentum", driven by the predominantly young age structure of the world population.
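A toy cohort model makes the momentum effect easy to see (a sketch with invented round numbers, not the report's projection): even with fertility fixed at exact replacement, the total keeps climbing while the oversized young cohorts move through their childbearing years.

    # Three equal-length age groups, in billions, with a deliberately
    # "young" age structure. Each new adult cohort bears exactly enough
    # children to replace itself, yet the total still grows for two
    # generations before the age structure evens out.
    young, adult, old = 3.0, 2.0, 1.0
    for generation in range(5):
        print(f"generation {generation}: total = {young + adult + old:.0f} billion")
        # Everyone moves up one age group; births match the new adult cohort.
        young, adult, old = young, young, adult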
The report also argues that increasing population will place restrictions on certain freedoms. People will no longer be able to assume they have the freedom to travel and commute to work quickly and efficiently, the freedom to visit and enjoy natural areas, the freedom to select desired foods or the freedom to be effectively represented by government.
Right now, half of the world's humans suffer from malnutrition, which increases susceptibility to diseases such as diarrhea and malaria. Yet since 1983, the production of grain per capita has been declining, due largely to a 20% decline in cropland per capita, a 15% decrease in water for irrigation and a 23% drop in the use of fertilizers. Emerging biotechnologies have not been applied fast enough to keep up with the rapid growth in population.
#
"Climate change and greenhouse gases",1222,0,0,0
(Sep '99)
The nature of science and the way science is done mean that nothing is ever absolutely proved to be true. Even so, there are many things that virtually all scientists believe and accept, because to believe otherwise seems ludicrous.
This leaves the problem of what to believe when two competing viewpoints are both feasible. This is the position at the moment with global warming, the greenhouse effect and rising sea levels, because the small changes we have seen \Imight\i be caused by something other than the greenhouse effect. Perhaps the world would have warmed up like this anyhow, maybe the warming is part of a natural cycle, or perhaps we are seeing what we expect to see.
At this stage, the end of 1999 has seen a change. Where once it was the greenhouse supporters who were seen as eccentric mavericks, it is now the greenhouse deniers who are seen in this light - but that still does not mean they are wrong: in science, truth is not determined democratically, it is determined by the facts. But right now, the facts appear to support the views of the majority.
The American Geophysical Union published a position statement on Climate Change and Greenhouse Gases in \IEos\i last February, but this was intended for lay audiences, and did not include all of the references to the published scientific literature upon which it was based. In late September, this gap was filled with a thorough, documented analysis of the peer reviewed literature, written by the same authors and itself rigorously peer-reviewed. The document included a bibliography of 189 authoritative research studies, and was based on those studies.
The new document means we now have available to us a meticulous analysis of the data, and the nearest possible thing to an authoritative statement on the current state of global warming. Its conclusion is that atmospheric concentrations of the main greenhouse gases produced by humans (especially carbon dioxide, but also methane, nitrous oxide, and chlorofluorocarbons) have significantly increased during the industrial period.
The paper reminds us of one important point: the "greenhouse effect" is a natural phenomenon that we need to survive. Infrared (IR) active gases, mainly water vapor (H\D2\dO), carbon dioxide (CO\D2\d), and ozone (O\D3\d), are naturally present in the Earth's atmosphere. These gases absorb thermal IR radiation emitted by the Earth's surface and atmosphere. Our atmosphere is warmed by this mechanism and, in turn, emits IR radiation, with a significant portion of this energy acting to warm the planet's surface and the lower atmosphere.
As a result, the average surface air temperature of the Earth is about 30° C higher than it would be without atmospheric absorption and reradiation of IR energy. Without that warming, there would be permanent ice almost everywhere outside the tropics, and dry conditions in the tropics themselves, because most of the water would be tied up as ice.
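That figure can be checked with the classic back-of-the-envelope energy balance: set the sunlight the planet absorbs against what a bare black body would radiate. This sketch uses standard textbook values, not numbers from the AGU report:

    # Effective temperature of an Earth with no IR-absorbing atmosphere.
    SOLAR_CONSTANT = 1361.0   # W per square meter at Earth's distance
    ALBEDO = 0.3              # fraction of sunlight reflected back to space
    SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

    absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4   # averaged over the sphere
    t_bare = (absorbed / SIGMA) ** 0.25            # black-body balance, kelvin
    print(round(t_bare - 273.15))  # about -19 C, against roughly +15 C observed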
So the "greenhouse effect" is not bad or worrying on its own - the worrying thing is the enhanced effect which seems to be following from our addition of extra greenhouse gases, generated by our industries.
They predict that these raised concentrations will last for up to thousands of years, and that by increasing the amount of infrared radiation absorbed into the atmosphere, these gases produce a warming influence at the Earth's surface. They found that in the past 150 years, global temperatures have increased an average of 0.3-0.6 degrees Celsius (0.5 to 1.0 degrees Fahrenheit).
They predict that further increases in greenhouse gases will cause more changes in the climate system. These changes include increases in average surface temperature, increases in rates of rainfall and evaporation, rising sea levels, and changes in the biosphere. The size, location, and rates of these changes remain uncertain.
The text of the report, and the detailed bibliography, can be obtained on the Web at http://www.agu.org/eos_elec/99148e.html
\BKey names:\b Tamara S. Ledley, Eric T. Sundquist, Steven J. Schwartz, Dorothy K. Hall, Jack D. Fellows, and Timothy L. Killeen.
#
"Indian rivers threatened",1223,0,0,0
(Sep '99)
One of the less obvious aspects of global warming is the threat leveled at up to 600 million people who depend on rivers which are fed by Himalayan glaciers. A number of rivers crossing the north Indian plains could run dry within 50 years, given the speed at which Himalayan glaciers are retreating. A number of smaller Himalayan and Karakoram glaciers have already disappeared entirely. The Khatling glacier, one of those which used to feed the Ganges, is still marked on trekking maps, but now there is just an empty valley, free of ice.
Glaciers which melt slowly during the summer provide a steady flow of water to the rivers, with winter snows replenishing the stock of ice and snow for the next summer. In some parts of the world, sphagnum bogs provide large-scale storage which drip-feeds water to the lowlands in summer, but in these high mountains, the steep valleys only have one form of storage: glaciers.
Gangotri, the biggest glacier in the Garhwal Himalayas, the mountains which are the main water source for the Ganges, is receding up into the mountains at almost a kilometer a year. Already, experts are predicting wars over water resources in the next century (see \JA cause for war\j, January 1999), and this scenario would see some 10% of the human population under direct threat, in an area where nuclear arms are available.
In Nepal, the situation is even worse. With massive deforestation and an enhanced melting rate of glaciers, downstream areas are threatened with flooding. On top of this, irregular melting causes instability in the glaciers, and this results in the formation of boulder moraines which are likely to breach in a cloudburst, or after movement in this area of active tectonic uplift. The Indian plate is pushing up into Asia, and forcing the Himalayas to soar seven or eight kilometers into the sky. There are currently seven lakes in Nepal which have been classified as dangerous.
This is more than a potential risk waiting to happen: it is already happening. Nepalese deforestation, together with a bursting glacier lake, caused severe floods in the downstream Indian state of Bihar in 1998, and in 1999, smaller floods have already killed 250 people.
#
"McMurdo Dry Valleys images released",1224,0,0,0
(Sep '99)
The McMurdo Dry Valleys form the largest relatively ice-free area in Antarctica (approximately 4,800 square kilometers within a 75 x 60 kilometer region), and during September, the US government released satellite imagery of the area which has been classified since the 1970s. The area is a designated site in the US National Science Foundation's Long-Term Ecological Research (LTER) study (more information: http://lternet.edu). In fact, it is the only Polar desert site in the LTER program, which consists of a network of 21 ecosystem research sites extending from Alaska to the continental United States to Puerto Rico to Antarctica.
The Dry Valleys are ecologically significant, according to the NSF ". . . because they are a region where life approaches its environmental limits and they stand in stark contrast to most of the world's other ecosystems, which exist under far more moderate environmental conditions." The first image, at a scale of 1:300,000, shows the McMurdo Dry Valleys, in the Transantarctic Mountains. The second image, at 1:50,000, shows the eastern end of Lake Bonney and vicinity in the Taylor Valley, and it covers the small area indicated in the inset on the first picture.
The permanently ice-covered lakes, ephemeral streams and extensive areas of exposed soil within the Dry Valleys are subject to low temperatures, limited precipitation and salt accumulation. The life forms found there are restricted to a few sparsely distributed microorganisms, mosses, and lichens, and higher forms of life are virtually non-existent.
The main value of the pictures, aside from their 8-meter resolution - ten times better than anything available before - lies in the historical record they offer at a time when glaciers all over the world are retreating fast. Better still, the images will be available in stereo pairs, and with recently developed digital methods they will improve the topographic record by 2 to 4 times, providing the first high-resolution, digital terrain-elevation model of the region, with accurate elevation data every 20 meters.
#
"Lead poisoning",1225,0,0,0
(Sep '99)
Correspondence to the \INew England Journal of Medicine\i during September revealed an unusual case of lead poisoning in America, in four adolescent girls aged 14 to 16 who, within a three-week period, were found to have lead levels of 18, 19, 20, and 28 µg per deciliter. Usually, lead poisoning is found in young children, adults who are exposed to lead in their working life, or people renovating old homes and stripping back old, lead-based paint.
All four girls were competitive markswomen at a single indoor firing range, and all had reportedly followed standard safety measures recommended by the range, including hand-washing and clothing changes. The author of the letter noted that the lead is likely to accumulate in bones, especially in a large trabecular-bone pool that is readily mobilized in times of need. He concludes that women poisoned with lead during their childbearing years are at risk for exaggerated mobilization of lead during pregnancy, with increased exposure to the fetus and then to the baby through lead in the breast milk.
The sport, he notes, is becoming increasingly popular, and should be managed with greater concern and education.
\BKey name:\b Michael Shannon.
#
"Picking the baby's gender",1226,0,0,0
(Sep '99)
Researchers at the Johns Hopkins School of Public Health decided to test a few bits of folklore, according to an article in the September issue of the journal \IBirth\i. Can a pregnant woman find out the sex of her baby by dangling her wedding ring by a string over her abdomen and noting which way it swings? And can you tell that a baby is a boy because the woman is carrying it high?
These days, ultrasound answers questions like that, far more satisfactorily, but in the past, mothers have had to rely on mythology if they wanted any hint about what sex the baby was. The short answer is no, but with a surprise in the tail. The traditional methods used to predict the gender of unborn babies are no more accurate than mere guessing, but strangely, mothers-to-be with more than 12 years of education were far more accurate than less educated women at predicting their baby's sex.
What makes the finding really odd is that the educated women said they were basing their gender predictions on dreams or (may we say it?) gut feelings, and these apparently unlikely methods were the most accurate. The investigators asked 104 pregnant women to use whatever method they liked to guess the sex of their babies. The overall guess rate was just 55%, well within the range that could be achieved by chance, but women with more than 12 years of education got it right 71% of the time, while the group with less education hit the mark just 43% of the time - again, within the range of variability that could be achieved by chance guessing.
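The claim that 55% is within the range achievable by chance is easy to confirm with a binomial calculation (a sketch which takes 55% of the 104 guesses to mean 57 correct - an inferred count, since the article gives only the percentage):

    from math import comb

    n, k = 104, 57   # 104 guesses, 57 of them right (about 55%)
    # Probability of doing at least this well if every guess is a coin flip:
    p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    print(round(p, 2))  # about 0.19 - entirely consistent with guessing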
There seems to be no ready explanation for the 71% hit rate, which is well outside the normal levels of probability, though in statistics, even the most improbable event will happen, sooner or later. Without replication in other studies, and an assurance that a few of the educated mothers-to-be had not been peeking at the ultrasound scans, individual women who feel sure of their baby's sex before birth should not be painting the nursery blue or pink just yet - no matter how well-educated they are.
#
"Nanci Youngblood Brasket (1946 - 99)",1227,0,0,0
(Sep '99)
Boeing and NASA staff were saddened to hear in September of the death of one of their administrative and research staff, Nanci Brasket. She will long be remembered for her sterling efforts in testing equipment designed for women astronauts, an experience she described in a hilarious research report which appears on her memorial Web site, located at http://www.io.com/~stargazr/stories.html.
She was renowned for her knowledge of recycled militaria, motorcycles and some of the more unusual aspects of chemistry. Her Internet friends have arranged for her ashes to be scattered at points across the world on October 31.
#
"October, 1999 Science Review",1228,0,0,0
\JLicorice reduces testosterone levels\j
\JA new implant therapy for prostate cancer\j
\JFamine can cause obesity\j
\JIs heart attack caused by Helicobacter pylori?\j
\JA gene for leukemia\j
\JThe evolution of sex chromosomes\j
\JThe 1999 Nobel Prize in Physics\j
\JThe 1999 Nobel Prize in Chemistry\j
\JThe 1999 Nobel Prize in Physiology or Medicine\j
\JThe 1999 Volvo Environmental Prize\j
\JAn asteroid with a moon\j
\JNew images of Neptune\j
\JNew astronomical Java program\j
\JNuclear fuel processing methods\j
\JTokaimura nuclear incident: the background\j
\JINES\j
\JNuclear waste\j
\JJapan and nuclear energy\j
\JUranium mining and safety\j
\JFast breeder reactors\j
\JA more recent date for the last Neandertals\j
\JThe oldest dinosaurs?\j
\JThe strange tale of the homosexual beetles\j
\JThe earthquake dangers beneath the Pacific Northwest\j
\JThe 1999 earthquakes\j
\JLemur news for 1999\j
\JIce-age sediment cores record extreme climate change\j
\JIs the West Antarctic ice sheet in its death throes?\j
\JSnowball Earth?\j
#
"Licorice reduces testosterone levels",1229,0,0,0
(Oct '99)
Licorice (or liquorice) root extract is commonly used in many countries as a flavoring agent, in breath fresheners, and also in a variety of confectioneries. The active component of licorice is glycyrrhizic acid, which is hydrolyzed in the body to glycyrrhetinic acid. A recent Italian study, reported in the \INew England Journal of Medicine\i during October, reveals that licorice reduces the serum testosterone levels in men aged 22-24, at dosages commonly encountered in a normal diet.
The report outlines a clear line of biochemistry which could account for this effect, since test tube studies have shown that licorice products can block 17(beta)-hydroxysteroid dehydrogenase, which catalyzes the conversion of androstenedione to testosterone. During the study, the men's serum testosterone concentrations decreased and their serum 17-hydroxyprogesterone concentrations increased, as might be expected. The report ends by suggesting that men with decreased libido or other sexual dysfunction, as well as those with hypertension, should be questioned about how much licorice they eat.
\BKey names:\b Decio Armanini, Guglielmo Bonanni, and Mario Palermo.
#
"A new implant therapy for prostate cancer",1230,0,0,0
(Oct '99)
A report published in \IRadiation Oncology Investigations: Clinical and Basic Research\i in late October, describes a new and more effective radiation implant therapy, using the isotope palladium-103, rather than the more usual iodine-125 to wipe out prostate cancers. Radio-isotopes, introduced into the cancer in special needles, are an effective way of dealing with prostate cancers, but the palladium treatment was 13% better in preventing moderate long-term complications, although the treatments were equally effective in preventing major long-term complications.
The study was headed by Richard Peschel at Yale University, and it also showed that the minimum dose for palladium-103 could be increased without extra side effects, which could offer an even better chance of achieving a cure. The side effects of treatment with isotopes include inflammation of the urethra and inflammation of the rectum or anus.
The treatment usually involves placing between 75 and 100 small radioactive seeds throughout the prostate gland, using a one-time, minimally invasive procedure. Once the cancer has been successfully imaged (see \JUltrasound imaging of prostate cancer\j), the seeds are placed in carefully calculated positions under computer control in a process called brachytherapy. For many patients, this is the preferred treatment because it is a cheaper, simpler out-patient procedure that involves far less recovery time, a lower complication rate and cure rates that are equivalent to radical surgery.
#
"Famine can cause obesity",1231,0,0,0
(Oct '99)
A recent report in the \IAmerican Journal of Clinical Nutrition\i describes the effects on a group of 741 men and women, now in their 50s, whose mothers were starved while pregnant during the Dutch famine of 1944-45 under the Nazi occupation. The women in particular had an increased rate of obesity, as measured by Body Mass Index (or BMI: body weight in kilograms divided by the square of height in meters) and by their waist circumference.
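As a quick illustration of the index (a minimal sketch; the example figures are invented, not drawn from the study):

    def bmi(weight_kg, height_m):
        # Body Mass Index: weight in kilograms divided by height in meters, squared.
        return weight_kg / height_m ** 2

    print(round(bmi(70.0, 1.75), 1))  # 22.9 for a 70 kg person 1.75 m tall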
The women typically had deposits of body fat around the abdomen. This was associated with the so-called "metabolic syndrome", which involves a tendency to high cholesterol levels, hypertension, diabetes and cardiovascular disease, and it was more common among women whose mothers were exposed to the famine early in their pregnancies. Their obesity seems not to be explained by increases in food intake or other lifestyle factors such as socioeconomic status, smoking or alcohol. Instead, it seems to be a result of changes in the way their bodies carry out the metabolic regulation of accumulated body fat later in life.
Early pregnancy is a time when the unborn child is remarkably vulnerable, and the Dutch famine started abruptly, lasted five months, and ended just as abruptly, making it unlike other famines. During the famine, rations varied between 400 and 880 calories (1700 to 3700 kilojoules) per day, and rose to about 1000 calories (4200 kilojoules) after the May 12, 1945 liberation by Allied forces. However, this short exposure left an indelible mark on a generation - or at least on the female half of it, because the men born just after that period show relatively mild effects of the famine.
#
"Is heart attack caused by Helicobacter pylori?",1232,0,0,0
(Oct '99)
\IHelicobacter pylori\i, best-known as the cause of ulcers, and implicated in some cancers, is now being linked, once again, (see \JWill the heart ever be safe again?\j) with heart disease. A paper by Dr John Danesh in the \IBritish Medical Journal\i in late October describes case-control and sibling studies which suggest that \IHelicobacter pylori\i infection is linked to a heart condition called early onset \Jmyocardial infarction\j.
The study involved a case-control comparison of patients aged 30-49 who had survived a myocardial infarction with controls who had no history of coronary heart disease, together with a parallel study of sibling pairs in the same age range. The researchers found a moderate association between coronary heart disease and testing positive for \IHelicobacter pylori\i infection: in the case-control study, myocardial infarction was twice as common in infected people, and among sibling pairs it was a third more common. This is not regarded as enough to say for sure that \IHelicobacter pylori\i causes heart disease, but it certainly suggests the need for a larger follow-up study.
#
"A gene for leukemia",1233,0,0,0
(Oct '99)
A report in the \IProceedings of the National Academy of Sciences\i in late October describes the discovery of a clue about how chromosomal translocations cause acute leukemias, and points the way to possible new treatments for cancers of the bone marrow and blood. The leukemia in question, acute myeloid leukemia (AML), is related to a \Jtranslocation\j, a rearrangement of the chromosomes - most commonly one called the inv(16) translocation.
This translocation results in production of a protein which collaborates to turn genes off, even when they should be turned on, leading to an acute leukemia, a rapidly progressing disease that results in the accumulation of immature, functionless cells in the blood which interfere with the body's production of healthy blood cells.
For a translocation to happen, chromosomes have to break and rejoin, producing new combinations of base pairs at the point where the new joins form. The inv(16) is the most frequent of several translocations associated with AML, and it happens on chromosome 16 when the middle of the chromosome breaks and flips around, creating an inversion translocation.
The new gene produced by the inversion changes the activity of a protein called AML-1. This is a transcription factor, a protein which acts by turning other genes on and off, but AML-1 is described by Scott Hiebert, the team leader, as a "master regulator" because it controls major cellular pathways that take part in the proliferation, survival, and differentiation of blood cells.
The gene which codes for AML-1 is one of the most commonly mutated in human leukemia, and translocations interfering with the AML-1 gene are associated with both acute myeloid and acute lymphocytic leukemias. But where researchers previously thought that inv(16) simply got in the way of AML-1 doing its job of turning genes on and off, Hiebert and his team have shown that inv(16) joins with AML-1 to actively turn genes off, even those that AML-1 usually turns on. The exact way that it does this is unclear, but they think that inv(16) "recruits" other proteins to help it force AML-1 to always turn genes off, and this could be the basis for new treatments.
One of these recruited proteins, based on evidence from other studies, could be an enzyme called histone deacetylase (HDAC). Hiebert and colleagues will now be trying to find out if HDAC is part of the group working with inv(16). In a release on the Internet, Hiebert commented: "If HDAC is involved, then drugs that block its activity - HDAC inhibitors - might have an impact. We have these HDAC inhibitors, and they're being tried for other diseases. Our data suggest that they might provide effective treatment for AML."
The other side of future work will involve identifying the genes that are being turned off by the group. Hiebert says his preliminary data suggest that the complex is turning off tumor suppressor genes, which usually protect against malignancy by causing cancerous cells to self-destruct, to "commit suicide". If these suppressors are turned off, it only takes one or two further mutations to trigger a cancer in a single cell, which can then increase without control, forming a tumor.
\BKey names\b: Scott Hiebert, Bart Lutterbach, Yue Hou, and Kristie L. Durst.
#
"The evolution of sex chromosomes",1234,0,0,0
(Oct '99)
Our gender is determined by a single \Jsex chromosome\j. Human beings have 46 chromosomes: 22 pairs of autosomes, identical pairs which carry most of our genes, and the two sex chromosomes which are the same in women, who have two X chromosomes, and different in men, who have an X chromosome and a Y chromosome. As a rough approximation, the standard body plan is female (which explains why men have nipples), and genes on the Y chromosome alter the basic body plan to the male form. (This is over-simplified, but acceptable as a starting point.)
Crocodiles and turtles have a different way of determining sex, based on the temperature at which the eggs incubate, and some species of fish can even change sex from female to male if there are no males about, so it seems that somewhere along the path of evolution, a new model for gender determination must have evolved. Over time, what were once autosomes have changed, until the Y chromosome retains just a few dozen genes - 19 of them also found on the X chromosome. For the most part, genes on sex chromosomes do not recombine, although the process of "crossing over" can be seen in the tips of the human sex chromosomes during meiosis in a male.
A late October report in \IScience\i by David Page and Bruce Lahn described how they have reconstructed this evolutionary development. The 19 genes are all concentrated on the tip of the short arm of the X, but on the Y chromosome, they are scattered across the length of the whole chromosome. In general, the order of the genes is not consistent between the two chromosomes.
Because DNA can code for the same thing in different ways, the researchers were able to look at "synonymous nucleotide divergence", changes in the base pairs, the nucleotide sequence of the chromosome, which have no effect on the operations of the genes, but which work as a molecular clock, giving us a measure of the evolutionary time which has elapsed since the gene pairs began to separate and change independently. In simple terms, the more differences there are between two forms of the same gene, the longer they have been separated from each other.
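In outline, the measure works like this (a toy sketch only - it is not the authors' method, which restricts the comparison to the synonymous, silent sites using codon-based models):

    def divergence(seq_a, seq_b):
        # Fraction of aligned bases that differ between two copies of a gene;
        # under a molecular clock, more differences mean a longer separation.
        assert len(seq_a) == len(seq_b)
        return sum(a != b for a, b in zip(seq_a, seq_b)) / len(seq_a)

    # Hypothetical X-linked and Y-linked copies of the same gene:
    print(divergence("ATGGCTTCAGGA", "ATGGCATCCGGG"))  # 0.25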
The genes proved to cluster into four age groups, each group with a different level of sequence similarity, suggesting that they diverged at the same time. And while these genes are scattered all over the Y chromosome, on the X chromosome, the groups appear in order, like four geological strata of different age. Page and Lahn interpret this to mean that the X-Y differentiation was probably initiated one stratum at a time. In other words, there were four separate stages in the evolution of the sex chromosomes.
Genes cannot diverge when there is exchange of material between the two chromosomes, so the changes we see in the DNA today could only have happened after X-Y recombination was suppressed in each stratum. This suppression was probably the result of a series of chromosomal inversions on the Y chromosome. This would explain why the genes are scrambled on the Y chromosome, when they seem to be in order on the X chromosome.
They say that the differentiation events, dated by the molecular clocks, happened in four stages. The first was around 240 to 320 million years ago, after the ancestors of mammals parted company with the ancestors of birds; the second was around 130 to 170 million years ago, shortly after our ancestors parted company with the ancestors of the monotremes - the egg-laying mammals, the platypus and the echidna, still found in Australia and New Guinea.
The third differentiation was at about 80 to 130 million years ago, shortly after our ancestors parted company with the ancestors of the marsupials, and the fourth and last stage was 30 to 50 million years ago, shortly after our ancestors parted company with the ancestors of lemurs.
Previous thinking set the origin of the sex chromosomes at about 170 million years ago, so this pushes the date back by about 100 million years, but neither the greater age nor the four strata were being looked for when the two set out to take a fairly straightforward measure. "We didn't anticipate this - the numbers just fell out," said Page. "It's stunningly beautiful."
But what drove the change from a situation where the environment determined gender to one where, across a large group of animals, the chromosomes do the job? In mammals, the defining event was probably the alteration of an existing gene to create the male-promoting SRY (Sex-determining Region Y) gene on the Y chromosome. This would have been followed by a rearrangement on the Y chromosome which would keep the Y from lining up with the X, and so keep the mutated male-favoring gene from mixing with the female-favoring or neutral version on the X chromosome. SRY, says Lahn, is the master-switch for creating a male.
Then it would only be a matter of time before other male-favoring and male-specific genes found their way to the Y chromosome, where natural selection would ensure that they were favorably delivered to the generations that followed. It happens in fish, for example: male guppies have a number of genes for making flashy fins to attract females, and those genes are clustered on the male sex chromosome. The flashy colors look attractive to females, but they also draw the attention of predators, so the male guppies have inherited "risky behavior" because it helps the males in each generation to pass on their genes, even if they do get eaten - while the females, who do not carry the display genes, can remain hidden and uneaten.
#
"The 1999 Nobel Prize in Physics",1235,0,0,0
(Oct '99)
As reported last month, this prize was awarded to Professor Gerardus 't Hooft of the University of Utrecht, the Netherlands, and Professor Emeritus Martinus J. G. Veltman of Bilthoven, the Netherlands, for having placed the standard particle theory on a firmer mathematical foundation. The Academy's citation put the award in these terms: "for elucidating the quantum structure of electroweak interactions in physics."
Gerardus 't Hooft was born in Den Helder, the Netherlands in 1946. He gained his Doctoral degree in physics in 1972 at the University of Utrecht. He has previously been awarded the 1979 Dannie Heineman Prize from the American Physical Society and the 1982 Wolf Prize for his work on renormalizing gauge theories. He has been a member of the Dutch Academy of Sciences since 1982.
Martinus J. G. Veltman was born in 1931 in the Netherlands, and like 't Hooft he is a Dutch citizen. He gained his Doctoral degree in physics in 1963 at the University of Utrecht, was Professor of Physics at the University of Utrecht from 1966-1981 and at University of Michigan, Ann Arbor, from 1981, although he is now retired. Among other awards Veltman has received is the 1993 High Energy and Particle Physics Prize from the European Physical Society for his work on renormalizing gauge theories. He has been a member of the Dutch Academy of Sciences since 1981.
Their work has shown how the standard particle theory may be used for precise calculations of physical quantities. Their theoretical work has been tested by recent experiments in Europe and the USA, confirming many of the calculated results. Without their mathematics, we would either have no theory at all, or at the very best, a theory which could not be experimentally confirmed.
Today, we believe that the everyday objects around us are made up of atoms, which are in turn made of electrons and atomic nuclei. Inside the nuclei, there are protons and neutrons, and these, like electrons, are made up of still smaller particles, the quarks. The only problem is that we need large accelerators to study matter at the quark level - and some seriously complicated mathematics.
Planning an accelerator experiment works best when we know that there are likely to be just a few possible results, results that we can anticipate, and which will have clear implications. Science does not really involve very much trial-and-error work, especially at this level, and even if it did, the results of any fiddling would need a sound mathematical theory before they could be interpreted. These serious atom-smashers were first designed in the 1950s, but the standard model took some three decades to develop, and much of it makes little sense at first sight, so we shall approach the theory in stages.
First, the overview, and remember that this is the simple version: modern theory, which gave rise to the "standard model", groups all elementary particles into three families (also called generations) of quarks and leptons, which interact with the help of a number of exchange particles which mediate the strong and the electro-weak forces. The two physicists gained their Nobel Prize for having placed this theory on a firmer mathematical foundation, providing robust "theoretical machinery" which can be used for, among other things, predicting the properties of new particles.
Now let us examine the theory in a little more detail. For most of this century, we have understood that the mass of an atom depends on the masses of the particles which make it up, and also on the energy which binds the whole thing together: this is why we can calculate exactly how much energy is to be gained from various radioactive processes. The standard model is built from these components: the particles of matter that we know, together with the forces that act on them. Then comes the hard part: we are talking probably six quarks, and six leptons, since experiments have shown that there are no more than six varieties of lepton, and that has been taken for some years to mean that there are no more than six quarks.
The leptons include the neutral leptons, the neutrinos (electron-neutrino, muon-neutrino and tau-neutrino), and the charged leptons: the tau, the muon, and the electron, which is about 1/3600 of the mass of the tau, or 1/210 the mass of a muon. The quarks vary from a fraction of the mass of a proton to the "top" quark, which has a mass more than 100 times as great as the proton. The other quarks in the model are the bottom, up, down, strange and charm quarks.
But we are not finished yet: there are also the "messenger particles" or "exchange particles". These are the photon, which has no mass or charge, and which is described as mediating the electromagnetic force, while three other particles, with masses about 100 times that of the proton, the W+, the W-, and the neutral Z\U0\u particle, mediate the weak force. There are also eight particles known as "gluons".
There are several forces involved in physics at this level, called the electromagnetic and weak forces, the gravitational force, and the strong force. We are reasonably familiar with the effects on our scale of the electromagnetic and gravitational forces, but the strong and weak forces require a little more explanation.
The strong force works only inside the nucleus, at distances of about 10\U-15\u meters, and this force is needed in theory to explain the way protons and neutrons are held together inside the nucleus. In particular, the strong force operates to stop the protons in the nucleus from blowing apart, although the neutral neutrons seem to "dilute" the electromagnetic repulsion forces in the nucleus, as the positive protons repel each other. Experiments in particle accelerators suggest that the strong force is about 100 times the strength of the electrical repulsion forces operating between the protons.
The weak force is about 1/10,000 the strength of the strong force, and it is the interaction behind radioactive beta decay: an isolated neutron survives, on average, for just under 15 minutes before it decays to a proton, an electron and an antineutrino, while neutrons bound inside a stable nucleus do not decay at all. In simple terms, the strong interaction holds the atomic nucleus together while the weak interaction allows certain nuclei to decay radioactively.
To an outside observer, this collection of different forces looks like a mess, and to a physicist, it is most definitely a mess - the sort of mess which inspires physicists to clean it up. The aim is to reduce the number of forces by finding a theory which can explain some or all of the forces in one framework, but progress is slow. It began in the 1960s, with a suggestion by Peter Higgs of Edinburgh that there could be a mechanism, now called a Higgs field, which explained for the first time how certain particles could have mass.
We are now almost back to our 1999 Nobel winners, after a quick stop-off at two earlier winners. Steven Weinberg and Abdus Salam realized that the Higgs mechanism would help them to create a unified theory for the electromagnetic and weak forces, called the "electroweak force". This led to the assumption that there is a particle called the "Higgs boson", and to the construction of a "Higgs field" with interesting properties. (The Higgs boson is also called a "Higgs particle" on some occasions.)
Then in 1971, Gerardus 't Hooft found that the Higgs mechanism could be used to cancel out some meaningless infinite values that otherwise made rather a mockery of the electroweak force, and since that time, the electroweak force has been well integrated into the standard model.
This is not to suggest that everything was suddenly perfect, far from it. Writing in \IScientific American\i in 1986, Veltman said " . . The only legitimate reason for introducing the Higgs boson is to make the Standard Model mathematically consistent . . . The biggest drawback of the Higgs boson is that so far, no evidence of its existence has been found. Instead, a fair amount of indirect evidence already suggests that the elusive particle does not exist. Indeed, modern theoretical physics is constantly filling the vacuum with so many contraptions that it is amazing a person can even see the stars on a clear night."
To make matters worse, at the end of 1995, Stephen Hawking expressed doubts about whether the existence of the Higgs boson could be proven, although other physicists had suggested, even before that, that the Higgs field and Higgs mechanism could exist, even without the Higgs boson. As well, they pointed out that what Hawking actually said was that the equations he derived from quantum mechanics and general relativity made the Higgs boson impossible to detect.
Theoretical physicists are now working on GUTs (Grand Unified Theories), which would tie the strong force in with the electroweak force, so far without success, and finding some way of explaining or including the gravitational force in the same theory - a true "theory of everything" - is still well beyond us.
The mathematics of the modern theory is a little daunting, so let us begin in a simple way. Touch an overhead high voltage line, and you will be killed, yet a sparrow can sit on the same line, and a tree snake can crawl along the line to hunt the sparrow, without ill effect. There is no relative difference between the electrical potential at different parts of the line, so we can regard all parts of the power line as having zero potential, as the snake and the sparrow see them - but if the snake dangled its tail and touched a steel post, it would suddenly discover that things were not quite that neutral!
In field theories, electric and magnetic fields have gauge symmetry: they can be expressed using potential functions which can be exchanged (gauge-transformed) according to a certain rule without changing the fields. The 1860s theory of electromagnetism of James Clerk Maxwell is a \Jgauge theory\j in today's terminology: this was a small-scale "theory of everything" which united electricity with magnetism and predicted, among other things, the existence of radio waves.
The very simplest transformation we can perform is to add or subtract a constant to the electrical potential. For example, electrical potential can be calculated from an arbitrary zero point, since only the differences in potential are of significance. This is why a snake and sparrow can wander along a high-voltage cable without being injured. The fact that the zero point can be moved in this way gives us a symmetry in the theory, called gauge symmetry.
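The point is easy to demonstrate numerically: derive the field from a potential, shift the potential by any constant, and the field - the only thing the physics cares about - is unchanged. A one-dimensional sketch, using numpy:

    import numpy as np

    x = np.linspace(0.0, 1.0, 101)
    V = 100.0 * x ** 2                        # an electric potential, in volts
    E = -np.gradient(V, x)                    # field = minus the potential gradient
    E_shifted = -np.gradient(V + 5000.0, x)   # move the "zero point" by 5000 V
    print(np.allclose(E, E_shifted))          # True: same field, same physics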
The order in which one performs two gauge transformations makes no difference in what we call an abelian gauge theory, named after the Norwegian mathematician Niels Henrik Abel, whose work helped found modern group theory before he died at the age of 26. There are also non-abelian transformations, where the order matters.
You can see an abelian pair of transformations by turning 180° clockwise and then turning a further 90° in the same direction: performing the two transformations in either order leaves you facing the same way, because both rotations take place in the horizontal plane. Switch to three dimensions, though, and the result is different.
For reasons of safety, if you are doing this at home, you may prefer to try it out on a Smurf or a teddy bear, for reasons which will soon become obvious. In the non-abelian case, the two transformations are a rotation of 180° which turns you (or the figure) upside down, followed by a 90° turn to the right. Begin facing north, turn upside down, then turn right, and you are on your head, facing east. Begin facing north, turn right, then turn upside down, and you are facing west! So in this case, the order of the two transformations is all-important.
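The same head-stand experiment can be run with rotation matrices (a sketch using numpy; the particular axes chosen are arbitrary):

    import numpy as np

    def rot_z(deg):
        # Rotation about the vertical axis: a turn to the left or right.
        t = np.radians(deg)
        return np.array([[np.cos(t), -np.sin(t), 0],
                         [np.sin(t),  np.cos(t), 0],
                         [0, 0, 1]])

    def rot_x(deg):
        # Rotation about a horizontal axis: turning upside down.
        t = np.radians(deg)
        return np.array([[1, 0, 0],
                         [0, np.cos(t), -np.sin(t)],
                         [0, np.sin(t),  np.cos(t)]])

    # Two turns about the same vertical axis commute - the abelian case:
    print(np.allclose(rot_z(180) @ rot_z(90), rot_z(90) @ rot_z(180)))  # True
    # A flip and a quarter turn about different axes do not - non-abelian:
    print(np.allclose(rot_x(180) @ rot_z(90), rot_z(90) @ rot_x(180)))  # False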
The mid-1920s saw physicists struggling to combine the wave functions of quantum mechanics and the fields of electromagnetism into a quantum field theory that would bring the two together. The attempts were less than satisfactory, yielding unreasonable results. The main cause was that quantum theory predicts that the electromagnetic fields close to an electron or a proton can spontaneously generate quantities of very short-lived particles and anti-particles, virtual particles, making a system of a single electron suddenly become a multi-particle problem!
This way lies madness, so a simple answer was needed, and this was the contribution of Shin'ichiro Tomonaga, Julian Schwinger and Richard P. Feynman (who shared the 1965 Nobel Prize in Physics for their contributions). They developed a method called renormalization, which allows individual particles to be viewed "somewhat at a distance". Now the virtual particle pairs can be ignored, and the "cloud" of virtual particles can be allowed to obscure the central, original particle.
There is no such thing as a free lunch, of course, so the method required that the original particle gain a new charge and a new mass, among other things - but it worked. In today's terms, Tomonaga, Schwinger and Feynman renormalized an abelian gauge theory. Even before that, in the 1930s, some physicists were trying to formulate a quantum field theory for the weak interaction, but this theory suffered from problems even worse than those of quantum electrodynamics, and not even the renormalization method could solve them.
By the mid-1950s, we had the first example of a quantum field theory with new features, a non-abelian gauge theory. In this form of theory, as we have seen, the order of the transformations is everything. The mathematical structure is more complicated, but all sorts of new possibilities become available. In the 1960s, this was the key to the work which led to Sheldon L. Glashow, Abdus Salam, and Steven Weinberg gaining the 1979 Nobel in Physics for their part in the development of a non-abelian gauge theory which unites electromagnetism and weak interaction in an electro-weak interaction.
As a result of this theory, the W and Z particles were predicted, pursued, and detected in 1983 at the European CERN accelerator laboratory in Geneva, winning the 1984 Physics Nobel for Carlo Rubbia and Simon van der Meer. But as had happened in the 1920s, with quantum field theory, when physicists tried in the 1960s to use the existing theory for calculating in more detail the properties of the new W and Z particles (and many other physical quantities), they kept getting unreasonable results. Without some way of renormalizing the non-abelian gauge theories, they appeared destined to go nowhere.
Martinus J. G. Veltman had not given up on the renormalization of non-abelian gauge theories. By the end of the 1960s, he was a newly appointed professor at the University of Utrecht, when he was joined by a 22-year-old pupil, Gerardus 't Hooft, who had expressed the desire to study high-energy physics. By then, Veltman had developed a computer program which, using symbols, performed algebraic simplifications of the complicated expressions that all quantum field theories result in when quantitative calculations are performed. We see this program now as a parallel with the \JFeynman diagrams\j which, in a pre-computer age, had been rapidly accepted by particle physicists as a useful tool.
Late in 1969, 't Hooft was accepted as a doctoral student, charged with the task of helping in the search for a method of renormalizing non-abelian gauge theories. He succeeded magnificently, and in 1971 published two articles that represented an important breakthrough in the research program. Veltman's computer program helped verify 't Hooft's partial results and together they worked out a calculation method in detail. The non-abelian gauge theory of electro-weak interaction was now a functioning theoretic machinery and it was possible to start performing precise calculations, just as it had become for quantum electrodynamics 20 years previously.
The rest, as they say, is history. 't Hooft's and Veltman's work allowed more precise prediction of physical quantities involving the properties of W and Z, and large quantities of these particles have recently been produced under controlled conditions at the LEP accelerator at CERN, where measurements have shown strong agreement with the theory. That matters, since any theory is only as good as the confirmed predictions it makes - and their theory even succeeded in predicting, several years before it was detected, the mass of the top quark, which was observed directly for the first time in 1995 with an instrument called the Collider Detector at Fermilab (CDF) in the USA.
The last frontier of the present round of physics (there will no doubt be more) is the discovery of the as-yet undemonstrated Higgs boson. Originally, it was expected to be found by the Positron-Electron Tandem Ring Accelerator (PETRA) in Hamburg in the 1980s; then it was to have been found at the Large Electron-Positron (LEP) machine in Geneva in the early 1990s. Now it appears that we must wait for an even newer, even more powerful device, the Large Hadron Collider (LHC) at CERN, which will not be complete until 2005.
\BSee also\b: \JAbel, Niels Henrik\j, \JFeynman, Richard (Phillips)\j, \JGlashow, Sheldon (Lee)\j, \Jgroup (mathematics)\j, \JMaxwell, James C(lerk)\j, \JRubbia, Carlo\j, \JSalam, Abdus\j, \JSchwinger, Julian (Seymour)\j, \JTomonaga, Shin'ichiro\j, and \JWeinberg, Steven\j.
#
"The 1999 Nobel Prize in Chemistry",1236,0,0,0
(Oct '99)
Professor Ahmed H. Zewail of the California Institute of Technology, Pasadena, USA was awarded the 1999 Nobel Prize in Chemistry for showing that it is possible to use a rapid laser technique to see how the atoms in a molecule move during a chemical reaction. Professor Zewail holds dual Egyptian and US citizenship, and has been featured in the past on Egyptian postage stamps. Zewail was born in Egypt in 1946, where he grew up, and he studied first at the University of Alexandria. He completed a PhD in 1974 at the University of Pennsylvania, and after two years at the University of California at Berkeley, he was employed at Caltech where he has held the Linus Pauling Chair of Chemical Physics since 1990.
Zewail's work involves watching atoms and molecules in "slow motion" during a reaction to see what actually happens when chemical bonds break and new ones are created. In a sense, he uses the world's fastest camera, with flashes so short that their duration is on the same scale as that on which the reactions actually happen - the scale of femtoseconds (fs).
One femtosecond is 10\U-15\u seconds, or 0.000000000000001 of a second, which has the same relationship to a single second as a second has to 32 million years. The naming of such small numbers is not a precise art, but a femtosecond is a thousand-million-millionth, a million-billionth, or a quadrillionth of a second, depending on how you prefer to say it.
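The comparison with 32 million years is simple arithmetic (a sketch, counting 365.25-day years):

    SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25
    femtoseconds_per_second = 1.0 / 1e-15
    seconds_in_32_million_years = 32e6 * SECONDS_PER_YEAR
    print(f"{femtoseconds_per_second:.2e}")      # 1.00e+15
    print(f"{seconds_in_32_million_years:.2e}")  # 1.01e+15 - the same ratio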
Zewail's technique, femtochemistry, is a branch of physical chemistry, now used all over the world by scientists who are studying processes with femtosecond spectroscopy in gases, in fluids and in solids, on surfaces and in polymers. The technique tells us why certain chemical reactions take place but not other reactions, and why the speeds and yields of reactions depend on temperature, but it also shows us the answer to problems as different as how catalysts function and how molecular electronic components must be designed, and lays open to us the inner workings of the most delicate mechanisms in life processes.
Chemical reactions take place at different speeds. Alfred Nobel became seriously rich and able to fund the Nobel prizes because he invented dynamite, which explodes very fast, while a candle burns slowly, a nail rusts even more slowly, and the chemical changes which make a fossil can take thousands of years, or even more. Most chemical reactions are comparatively fast, but they go faster as the temperature rises, when molecular motion in the reactants becomes more violent.
When two molecules collide, nothing normally happens; they just bounce apart once more. At a high enough temperature, though, the collision is so violent that they react with one another and new molecules form. A slow reaction like a nail rusting is only slow because the "temperature kicks" which get the reactants over the threshold occur so rarely.
There seems to be a barrier to reaction, a barrier which is related to energy. Today, we consider this barrier to be determined by the nature of the chemical bonds, the forces that hold atoms together in the molecule, in much the same way that a rocket to the moon has to get over the barrier of the earth's gravity before it falls into the moon's gravitational field. The problem was: what does the molecule really look like when it is exactly at the top, its "transition state"?
The study of reaction speeds began with Svante Arrhenius, the Nobel laureate in Chemistry for 1903, who was in turn inspired by van't Hoff, winner of the first Nobel Prize in Chemistry in 1901. Yet it was only in the 1930s that Henry Eyring and Michael Polanyi developed a theory for this area, based on reactions in microscopic systems of individual molecules. They, and many chemists after them, assumed (correctly) that the transition state was crossed very rapidly, and they also assumed (incorrectly) that it would never be possible to perform experiments over such short times, on the same scale as the vibrations of individual atoms.
The transition state was originally just a convenient mental picture to give chemists extra insight into chemical reactions. As a molecule of sodium iodide breaks up into separate atoms, there is a point at which the sodium and iodine atoms are pulling apart, but not yet separated, and they are half-way between ions and atoms - in a state of transition. And \Ithat\i is the Eyring-Polanyi transition state - but it was just a neat idea, something that chemists never expected to see.
Zewail began the work which proved them wrong at the end of the 1980s, using a camera based on new laser technology with light flashes of some tens of femtoseconds (fs). Atoms vibrate on a scale of between 10 fs and 100 fs, so if the chemical reactions took place on the same time scale as that on which the atoms oscillate in the molecules, the reactions should be "capturable" - and they were.
If we are going to understand how a molecule breaks up, we need to understand how it absorbs energy. In a simple molecule with two atoms, the extra energy will make the bond between the atoms stretch or compress, meaning that the atomic nuclei move further apart or closer together. But to make any predictions, we need to make a few assumptions, first suggested by the physicists Max Born and J. Robert Oppenheimer in 1927. First, we assume that the electrons readjust instantaneously to the positions of the nuclei, and we assume also that the heavier nuclei move comparatively slowly, directing the electrons into whatever happens to be the lowest energy arrangement.
Given the Born-Oppenheimer approximation, we can draw a map of the "energy surface", and this is the key to practical studies in femtochemistry. The remaining problem is to inject extra energy into a system, and that is where the laser comes into the picture. Femtosecond spectroscopy involves mixing the reactants or starting molecules as beams of molecules in a vacuum chamber, then an ultrafast laser fires off two pulses. The first is a powerful \Bpump pulse,\b which strikes the molecule and raises its energy ("excites it" in chemist-speak), and this is followed by a second weaker \Bprobe pulse,\b at a wavelength chosen to detect the original molecule or an altered form of this. The idea is that the pump pulse sets the reaction off, while the probe pulse checks for a reaction - in particular, for a product of a reaction.
If researchers vary the time between the two pulses, they can measure how quickly the original molecule is transformed. As the molecule changes shape, either because it has been excited, or because it changes to a new product, so its spectrum changes in a very characteristic way. The time interval is altered by sending the laser's probe pulse on a side track through a mirror, though the distances are rather small - in 100 femtoseconds, the light travels just 0.03 mm, about 1/800 of an inch.
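The delay-line arithmetic is simple enough to check for yourself: the extra mirror path is just the speed of light multiplied by the desired delay. A minimal Python sketch:

c = 2.998e8   # speed of light, m/s

def extra_path_mm(delay_fs):
    # Extra mirror path, in millimeters, for a pump-probe delay given in femtoseconds.
    return c * delay_fs * 1e-15 * 1e3

print(f"{extra_path_mm(100):.3f} mm")   # 0.030 mm for a 100 fs delay, as quoted above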
When Zewail began his work, no ordinary laser would do, because the laser used has to be tunable to different frequencies. The laser used at first was a \Jdye laser\j, which uses an organic dye, dissolved in a viscous solvent that is continuously formed into a thin jet. The focused beam of an argon laser is used to "pump" the dye, exciting a large number of the dye molecules, which then spontaneously emit light, triggering further emissions from other dye molecules. All of this goes on in a region surrounded by mirrors, which reflect the light back into the beam, triggering more stimulated emissions.
As in other lasers, one of the mirrors is "partially transmitting", and eventually allows light to escape, forming the laser beam. A given dye will emit light over a range of frequencies (the dye "rhodamine 6G", for example, emits light from 640 nanometers (red) down to 560 nanometers (green-yellow)).
This is a very simple account, which does not deal with techniques such as passive mode-locking or colliding-pulse mode-locking (CPM), or the use of optical cavities, developed by Charles Shank and his colleagues: the main point is that complex but standard laser methods like this were available to Zewail and his colleagues. It also leaves out more recent work, which has seen femtosecond solid-state lasers made from titanium-doped sapphire - where the dye lasers needed highly-trained operators, the same cannot be said of the solid-state lasers, so the technique is now much more available to researchers.
From the chemist's viewpoint, the only important thing is that molecules can be broken by a first laser pulse, and the pieces (or breaking molecules) can be examined by the light emitted from the pieces after a second laser pulse, as recorded by a charge-coupled device (CCD) camera. But because the researchers are using tunable lasers at different frequencies, each frequency responds to a particular energy level, and that means we are getting a slightly different freeze-frame with each color of laser that we use.
The work done by Zewail and his colleagues means that in one sense, chemists have come to the end of the road, for no chemical reaction can take place faster than those observed with Zewail's technique. Arrhenius offered us a formula for temperature dependence, and van't Hoff gave us more formulae for reaction rates, but now we can study the mechanism that underlies these reactions. Yet if Zewail's work is an end of the road, chemists have found an excellent playground at the end of the road!
#
"The 1999 Nobel Prize in Physiology or Medicine",1237,0,0,0
(Oct '99)
This prize went to Dr Günter Blobel, a cell and molecular biologist at the Rockefeller University, New York, for his work over many years explaining how proteins travel across cellular membranes. In the words of his citation, the award was for the discovery that "proteins have intrinsic signals that govern their transport and localization in the cell". The life of a cell depends on making proteins, some of them building blocks, some of them enzymes controlling reactions to form other compounds, most of them needed somewhere other than where they are formed. In some cases, the proteins are needed in a different cell from the one in which they were formed, but even when they remain within the same cell, the proteins will usually have to cross at least one membrane.
An adult human being is made up of about 10\U14\u cells, 100,000 billion cells, each one with a membrane around the outside, and each filled with smaller organelles, surrounded by membranes of their own. (According to \Jserial endosymbiosis theory\j, these were once separate cells which came together to form the cells that make up all of the higher organisms.) Organelles carry out specialized functions, with the endoplasmic reticulum, together with the ribosomes, being responsible for synthesizing proteins.
Each cell in our body contains about a billion protein molecules, made up of somewhere between fifty and several thousand amino acid building blocks - and somehow, our cells and our bodies have to manage all of these proteins, and must get the right molecules to the right places at the right time.
Early in the 1970s, Blobel realized that newly synthesized proteins had to have an "intrinsic signal" that is essential for steering them to and across the membrane of the endoplasmic reticulum. It took him 20 years to identify the detail of the molecular mechanisms underlying these processes, but in the process, he showed that similar "address tags" direct proteins to other intracellular organelles.
The principle Blobel discovered is universal to life, working in much the same way in yeast, plant, and animal cells. There is also a practical use: a number of human hereditary diseases are caused by errors in these signals and transport mechanisms. The hereditary disease primary hyperoxaluria, which causes kidney stones from an early age, is now thought to arise in this way, as are some forms of familial hypercholesterolemia, where deficient transport signals cause a very high level of cholesterol in the blood; cystic fibrosis, too, is caused when proteins do not reach their proper destination. There is also an economic side, since Blobel's work has helped other researchers to develop more effective ways of using cells as "protein factories" for the production of important drugs.
The membrane of an organelle is made of phospholipids and some proteins, but it remained a puzzle how large proteins could cross the apparently uncrossable membrane, or how they found their targets, and Blobel made an excellent move when he joined the famous cell biology laboratory of George Palade at the Rockefeller Institute in New York at the end of the 1960s. Here, there was a long tradition of studying protein synthesis and transport, and Palade himself won the Nobel Prize in Physiology or Medicine in 1974 (which he shared with the Belgian scientists Albert Claude and Christian de Duve) for work in this area.
The cell membrane is usually imagined as something like a cellophane bag, a simple passive barrier, marking the boundary between one cell and the next. Nothing could be further from the truth, because the membrane is a very precise regulator, controlling the passage of molecular cargo in both directions, somehow sensing what needs to be passed across, and in what direction, and also recognizing what must be kept out.
The membrane's structure is a double layer of phospholipids, molecules with a water-attracting phosphate head and a water-repelling lipid (fat) tail. The phospholipids line up in two sheets, with the lipids in the center, and the phosphates on the outside (or inside, if we are talking about the inner surface of the cell). This means that the phosphate ends are always near water, and the lipid ends are kept away from water.
There are also cholesterol molecules across the surface of the membrane, giving the cell rigidity, and making the membrane water-resistant. The membrane is no barrier at all to some molecules, like oxygen and carbon dioxide, which simply diffuse across the membrane, from areas of high concentration to areas of low concentration, but other chemicals rely on protein "shuttles" which move them across, even against a concentration gradient - but the shuttles can only play their role if they know which molecule is which.
Blobel developed his first version of the "signal hypothesis" in 1971, when he suggested that secreted proteins must carry some sort of signal. By 1975, he was able to describe the various steps in these processes of recognition. The signal consisted of a peptide, a group of amino acids in a particular sequence, he suggested, and the protein passed through a channel in the membrane.
By 1980, Blobel, working with other research groups, was able to show that similar intrinsic signals target the transport of proteins also to other intracellular organelles, and he suggested that each protein carries in its structure the information needed to specify its proper location in the cell. On this model, specific strings of amino acids, called topogenic signals, are needed if a protein is to pass through a membrane into a particular organelle, become integrated into the membrane, or be exported out of the cell.
When the entire human genome is mapped, we can expect to be able to deduce the structure and topogenic signals of the proteins, and this will lead to further advances in disease control - it may be possible, for example, to construct new drugs which are then carried to precisely that part of the cell where they are needed.
#
"The 1999 Volvo Environmental Prize",1238,0,0,0
(Oct '99)
The 10th annual Volvo Environmental Prize was awarded on October 26 to Indian economic ecologist Dr. Monkombu Sambasivan Swaminathan at a ceremony at Columbia University. This prize, regarded as the "Nobel of environmental awards", goes to individuals who have demonstrated "outstanding scientific innovation in pursuit of environmental protection", and includes a cash award of 1.5 million Swedish kronor (around $US250,000).
The prize committee's citation for Dr. Swaminathan states: "The 1999 Volvo Environment Prize is awarded to Dr. M. S. Swaminathan because of his achievements as a plant breeder and administrator which led to dramatic increases in crop yields, his international leadership in agriculture and resource conservation, his deep concern for the poor and disadvantaged, and his continuing research and leadership to ensure that they get the opportunities needed to develop in ways that enhance the natural environment on which they depend."
Swaminathan joins a remarkable group of laureates which includes Peter Raven, Paul Ehrlich, Norman Myers and Gilbert White. Trained as a plant geneticist, Swaminathan is widely considered the scientific force behind the "green revolution" in Indian farming that has led to the country's agricultural renaissance. In July 1988, he founded the M. S. Swaminathan Research Foundation as a non-political trust dedicated to integrating the principles of ecological sustainability with economic efficiency and social equity.
#
"An asteroid with a moon",1239,0,0,0
(Oct '99)
How many moons are there in the solar system? Whatever your answer, increase the number by one, to include the moon orbiting the asteroid (45) Eugenia, announced in \INature\i in early October, although the actual pictures revealing the tiny moon were taken in 1998. The pictures published in the journal were taken with the Canada-France-Hawaii Telescope (CFHT) on Mauna Kea, Hawaii, and they are the first images of an asteroidal satellite taken from Earth.
The find is even more of a first, as previous attempts to photograph such satellites, using both ground-based telescopes and the Hubble Space Telescope, have found no satellites - and the only previous asteroidal moon was located by the interplanetary spacecraft, Galileo, when it discovered the small moon, now known as Dactyl, around asteroid (243) Ida in 1993. The new discovery was made using a new technique, called adaptive optics, which reduces the blurring caused by the Earth's atmosphere.
Adaptive optics has now proved its worth, helping us get past the distortions caused by unevenness in the atmosphere, the sorts of effects which make stars appear to twinkle. The system analyzes the distortions and corrects the light beam, by means of what is essentially a "fun-house mirror", back into its original, undistorted form.
From the observations, astronomers have been able to calculate the relative density of the asteroid, which seems to be about 1.2 - only 20% greater than water. Most asteroids are dark, suggesting that they are rocky, which should mean a density about three times that of water. In July 1997 we learned that another asteroid, (253) Mathilde, also has a low density - about 1.3 times that of water, which means we may need to rethink asteroids. Either these objects are highly porous rubble-piles of rock, or they are mostly water ice, it seems.
The moon lets scientists calculate the mass of an asteroid because of the effect of the primary asteroid's gravity on its small moon. Since we know the sizes of most asteroids from standard astronomical studies, we can estimate the asteroid's density. The density then gives a clue to the asteroid's makeup, either in terms of composition or structure. That in turn influences how we see the asteroids, either as a rubble pile, resulting from rocks smashing together, or as something else. If the objects are largely ice, covered with a dark coating, then they may be remnants of burned-out comets, and this will help our understanding of the connection between comets and asteroids.
Eugenia orbits the sun in the main asteroid belt, along with thousands of other asteroids, between the orbits of Mars and Jupiter. Once seen as the remains of a planet destroyed in a cataclysm, they have long been regarded rather as bodies which never formed a planet in the first place. Why? Nobody knows, but one theory is that the gravity of Jupiter interfered, so they collided with each other at high speeds, perhaps either fragmenting or forming satellites, rather than colliding gently, sticking together, and gradually building up to make a new planet.
The satellite has an estimated diameter of about 13 kilometers, while Eugenia's diameter is about 215 kilometers. The satellite travels in a circular orbit about 1,190 km away from Eugenia, orbiting the asteroid about once every five Earth days. Until it gets a more impressive name, the International Astronomical Union has given the satellite the provisional designation of S/1998(45)1. This translates as the first satellite of asteroid (45) which was discovered during 1998.
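Putting these rounded figures into Kepler's third law reproduces the density estimate described above; with these inputs the answer comes out near 1, in the same low range as the reported 1.2, the difference reflecting the rounding. A Python sketch:

import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
a = 1.19e6       # moon's orbital radius in meters (1,190 km)
T = 5 * 86400    # orbital period in seconds (about five Earth days)
R = 215e3 / 2    # Eugenia's radius in meters (215 km diameter)

mass = 4 * math.pi ** 2 * a ** 3 / (G * T ** 2)   # Kepler's third law
volume = (4 / 3) * math.pi * R ** 3               # assumes a spherical asteroid
relative_density = mass / volume / 1000           # density relative to water

print(f"mass ~ {mass:.1e} kg, density ~ {relative_density:.1f} times water")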
For more information on the uses of adaptive optics, see \JNew images of Neptune\j.
#
"New images of Neptune",1240,0,0,0
(Oct '99)
Some of the best Earth-bound images ever taken of the planet Neptune have been captured by an adaptive optics (AO) camera. The near-infrared images of Neptune were taken by the Palomar High Angular Resolution Observer (PHARO) camera on the 200-inch Hale telescope at Palomar Observatory. The difference between images taken with and without the AO camera can be simply explained: with AO, we can see clouds on the planet; without the AO system, we see nothing at all.
Neptune is the eighth planet from the sun, a mostly gaseous planet, and only the upper layers of its atmosphere are visible. The new images show the planet embellished with a massive cloud, the size of the European continent, and numerous smaller clouds.
Data gathered by PHARO's spectrometer will allow a detailed analysis of the planet's clouds and their altitude. Astronomers will also be able to measure the abundance of methane in the atmosphere, and they expect to be able to determine the physical properties of individual cloud features and eventually learn something about the planet's atmospheric circulation.
Neptune is a suitable target for the AO system, which requires a small, bright object to lock onto so that it can correct the effects of the atmosphere; Neptune's disk fits the bill.
The new AO system, which the Jet Propulsion Laboratory (JPL) began to develop in 1995, is basically a mirror placed between the telescope and the camera. This mirror is adjusted up to 500 times a second to correct atmospheric distortions, and the result is to give planet-bound telescopes the same sort of clear vision that could once be found only on the orbiting Hubble Space Telescope, far beyond the Earth's atmosphere. Now that the system has been proven, the next target will be Titan, the largest of Saturn's 18 known satellites.
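The principle is a feedback loop: measure the residual distortion, command the mirror to cancel part of it, and repeat hundreds of times a second. The toy one-dimensional Python sketch below gives the flavor; the gain, the random "atmosphere" and the cycle count are purely illustrative:

import random

mirror = 0.0   # current correction applied by the deformable mirror
gain = 0.5     # fraction of the measured error removed each cycle

for cycle in range(10):                   # real systems run ~500 cycles per second
    distortion = random.gauss(1.0, 0.1)   # toy stand-in for the atmosphere
    residual = distortion - mirror        # what the wavefront sensor sees
    mirror += gain * residual             # push the mirror toward the distortion
    print(f"cycle {cycle}: residual {residual:+.3f}")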
See also \JAn asteroid with a moon\j, for another use of AO.
#
"New astronomical Java program",1241,0,0,0
(Oct '99)
The November issue of \ISky and Telescope,\i a leading magazine for amateur astronomers, has hailed a new Java-based applet which will allow people to display, process and analyze astronomical images anywhere they have access to a computer. The application, called Sky Image Processor, or SIP, was developed by John Simonetti, associate professor of physics at Virginia Tech. All users need is a Java-enabled Web browser such as Netscape, and the applet which is available over the Internet from http://www.phys.vt.edu/~jhs/SIP
Simonetti has set up the system so any updates to the SIP program will be added to the SIP web page and received by users automatically when they access the page. He says he made the program simple because he wanted students in astronomy laboratories or observational astrophysics classes to be able to "do the kind of work that real astronomers do" without having to grapple with the complexities of professional astronomical image-processing programs. But while Simonetti developed the application for his own students, it is now available to students of all ages, everywhere, no matter how tenuous their student status may be.
Users can combine images, much like laying one picture over another, but they can also use SIP to analyze digital images. Because the images are digital, this allows easy manipulation and enhancement, and users are also able to save their work. SIP can access images from either the user's hard disk or the Internet, reminding us of a security issue which has been dealt with in this case. SIP is a "signed" applet, meaning that the author has certified it and provided an electronic signature of approval - something which users should always seek before they let any downloaded program access their hard drives.
While the Java language makes any applet potentially more secure in any case, Simonetti's digital signature serves to reassure any person accessing the program that it has not been tampered with by someone else, because tampering would destroy the signature in much the same way as opening an envelope breaks the seal. As well, the program indicates when it is going to read from or write to the disk, and any wary user has only to click on the "No" choice to prevent that happening.
#
"Nuclear fuel processing methods",1242,0,0,0
(Oct '99)
To understand September's nuclear accident at Tokaimura in Japan, it is necessary to consider what the nuclear industry is about, and we will begin with the fuel used in nuclear reactors. While coal and oil can be fed straight into a power station, uranium oxide is useless in most reactors as a fuel until it has been extensively processed. The uranium needs to be concentrated (usually), purified, and formed into fuel rods. These rods then act as a fuel for a considerable period of time, far longer than a load of coal or a barrel of oil.
The most common process today begins with uranium oxide, U\D3\dO\D8\d, which is about 0.7% U-235, with most of the rest being U-238, and just a trace of U-234. Now it is important to keep in mind that these forms of uranium have exactly the same number of protons and electrons, arranged the same way, so their chemical properties are effectively the same. Their nuclear properties are seriously affected by the number of neutrons: the difference between U-234 and U-235 is one neutron in the nucleus, while U-238 has three more neutrons than the otherwise identical U-235.
Except in the Canadian CANDU reactors, natural uranium is not used as fuel material, and the uranium needs to be treated to increase the amount of U-235 present if the uranium is to power the common light water (LWR) type of reactor. This increase in the proportion of U-235 is called enrichment. It is not an easy process.
For a start, the uranium has to be converted into a gas so physical methods can be used to separate the different atoms. Uranium hexafluoride, UF\D6\d or "hex", is a gas at around 60°C, so making this gas is the first step in the process. If the uranium hexafluoride gas is then allowed to travel long distances through a large number of membranes, the slightly lighter molecules containing U-235 atoms travel slightly faster, which allows a richer mix to be collected at the far end by taking just the portion of material which arrives first and repeating the process.
This is called gaseous diffusion, and it was the main method used in collecting U-235 in the \JManhattan Project\j to make the first uranium-based atomic bomb. These days, the much more efficient gas centrifuge technology is used more, and the future may see advanced laser technology used, but in each case, the separation of the two isotopes relies on a mass difference of less than 1% between the two types of gas molecule.
The result is a yield of enriched uranium equal to about 15% of the mass of the original uranium, containing about 3.5% U-235. This leaves the remaining 85% of the uranium stock as "depleted uranium", mainly U-238 with about 0.3% U-235. The depleted uranium is stockpiled as "hex", with some of it being used to make uranium metal for armor-piercing artillery shells, although it can also be used in \Jfast breeder reactors\j.
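The 15% figure can be checked roughly with a mass balance, since both total uranium and U-235 are conserved when the feed splits into enriched product and depleted tails. A minimal Python sketch, using the percentages quoted above, lands in the same range:

x_feed, x_product, x_tails = 0.007, 0.035, 0.003   # U-235 fractions

# Conservation of mass and of U-235 fixes how the feed splits:
product_share = (x_feed - x_tails) / (x_product - x_tails)
print(f"enriched product ~ {product_share:.1%} of the feed")   # ~12.5%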
Depleted uranium projectiles are used because of the density of the metal, not because of its radioactivity. People commonly assume, incorrectly, that the "uranium shells" are some form of tactical nuclear weapon, when they are really just very big and sophisticated rocks, thrown very fast at a tank, where they make holes because the dense metal punches its way through conventional armor.
Some older reactors use uranium metal as the fuel, but this is no longer common. Now, the enriched "hex" is normally converted to uranium dioxide, a ceramic material which is formed into small cylindrical pellets about 2 cm long and 1.5 cm in diameter. These pellets are then loaded into tubes about 4 meters long, made of zirconium alloy or stainless steel, to create "fuel rods". The rods are then bundled into batches about 30 cm square to form reactor fuel assemblies which can then be used in an LWR, currently the most popular reactor design. This type of reactor, rated to generate 1000 megawatts, will have about 75 tonnes of fuel in it, derived from about 500 tonnes of yellowcake, or the contents of about 300 standard 200 liter (44 gallon) drums of the original unprocessed oxide.
The fuel rods remain in the reactor for about three years, generating heat from fission of both the U-235 and the fissile plutonium (such as Pu-239) which is formed there. The reaction is controlled by adding a moderator - water, heavy water, or graphite. The moderator slows down the neutrons which are formed by fission, so the neutrons can cause more fissions in a controlled chain reaction. More control comes from the presence of neutron-absorbing control rods which are pushed in or pulled out to regulate the speed of the reaction.
Fast neutrons tend to leave the pile before they have a chance to trigger new fission events, while neutrons slowed by the moderator are far more likely to cause new fissions, so control is a matter of balance: slowing down enough neutrons to keep the reaction going, while avoiding a runaway reaction. Each fission event generates heat, and this is used to drive electrical generators. Nuclear reactors are designed with fail-safes in place to ensure that unusual situations, such as an increase in temperature or neutron flux, lead to the moderators being withdrawn, or neutron absorbers being inserted, or both.
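The balance can be illustrated with a toy calculation: call k the average number of neutrons from each fission generation that survive to cause the next. The starting population and generation count in this Python sketch are arbitrary:

def neutrons_after(k, generations=10, start=1000.0):
    # Each generation of fissions yields k times the neutrons of the last.
    population = start
    for _ in range(generations):
        population *= k
    return population

for k in (0.98, 1.00, 1.02):   # dying away, steady, running away
    print(f"k = {k:.2f}: {neutrons_after(k):7.0f} neutrons after 10 generations")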
At the end of their three-year life, the rods contain fission products and other neutron-absorbers which interfere with the normal control processes, so the spent fuel assemblies are removed. In a typical reactor, this is a rolling process, with one third of the fuel changed each year.
The fuel rods are hot, both in terms of their temperature and "hot" in that they are radioactive, so they are stored under water to cool them and to shield people from the radiation. In countries such as the USA, Canada, and Sweden, which use an "open fuel cycle", the rods will then become \Jnuclear waste\j, while other countries, such as the UK, France, and Japan, which have chosen to "close" the fuel cycle, will reprocess the fuel rods to make new ones.
Reprocessing uses straightforward chemistry to recover the plutonium and unused uranium, which make up about 97% of the fuel rod, leaving the remaining 3% of the fuel as a liquid high-level waste, to be solidified and disposed of in some way.
The recovered uranium is about 96% of the original uranium, and now contains about 1% of U-235, with plutonium making up the other 1% (a small amount of mass has been converted to energy, but at this level, the mass loss disappears between the rounded whole-number percentages). The plutonium which is recovered is an excellent fuel, requiring no enrichment, and it can be mixed with natural uranium, made into fuel rods in a mixed oxide (MOX) fuel fabrication plant and put back into the reactor as fresh fuel, or it can be used in breeder reactors.
The uranium can be purified by standard chemistry, and returned to an enrichment plant to become fuel for a reactor: this provides the most efficient use of uranium, but low prices for uranium have tended to discourage the establishment of expensive processing plants.
#
"Tokaimura nuclear incident: the background",1243,0,0,0
(Oct '99)
Japan is committed to using nuclear energy in the future, for reasons outlined in \JJapan and nuclear energy\j, so most of the scientific effort surrounding the Tokaimura incident is now being directed to identifying the causes, most of which seem to come down to a rich mix of negligence, ignorance, and plain stupidity. The incident involved serious radiation doses to three workers in a small plant in Japan, and in two of these cases, the result is likely to prove fatal. Another 66 workers are reported to have received some form of measurable radiation dose, but all of these were within "permissible limits".
The event occurred, according to industry sources, because of poor or negligent management. This, they suggest, is the only way to explain how workers were able to bring together too much uranium enriched to a relatively high level: it was this which led to a "criticality", a limited but uncontrolled nuclear chain reaction, which continued, on and off, for 17 hours.
Most of the concern in the past for nuclear safety has concentrated on the potential large dangers. \JNuclear reactor\j operations, where huge amounts of energy are involved, are seen as dangerous, as is the management and containment of high level wastes. In each of these parts of the nuclear industry, there is a finite risk that large amounts of radioactive materials might be released into the biosphere, causing extreme effects, as was seen in the 1986 Chernobyl disaster, when radionuclides swept right across Europe.
The standard view has been that other parts of the nuclear fuel cycle are comparatively safe, because they have much less potential for widespread harm to people or the environment. While opponents of nuclear energy can offer scenarios where spent fuel and reprocessed fuel may be hijacked in transit, the actual processing ought to be a straightforward process, and this has meant that working with fuel is generally less regulated than the power plants where that fuel is used.
In hindsight, this may not have been a sensible option at Tokaimura. There are 15 nuclear facilities of various kinds at Tokaimura, and there had already been a fire and explosion in a bituminizing facility for wastes in March 1997 in the village of Tokai (part of Tokai-mura, as the area is sometimes called), the causes of which were largely hushed up.
It is clear now that no lesson was learned from this, and the Japanese authorities are determined that this new event will not go the same way. There have been highly publicized "raids" on the company involved, and stern words have been spoken. Nobody has actually been charged, and there seem to have been no real changes so far, but the matter is being treated very seriously.
This year's event happened in a very small fuel preparation plant operated by Japan Nuclear Fuel Conversion Co. (JCO), a subsidiary of Sumitomo Metal Mining Co. Industry sources have stressed that the plant was unconnected to the electricity production fuel cycle. Instead, they say, the process was almost a "boutique" operation, rather than a routine manufacturing operation where operators might be assumed to know their jobs reasonably well.
It was this "one-off" status of the plant which probably caused the incident. The JCO plant was commissioned in 1988 and processes small quantities of uranium of higher enrichment than for ordinary power reactors, and in this case was working with a uranium stock which was 18.8% U-235 supplied from France, for the experimental Joyo fast reactor (a 100 MW thermal reactor, which has been operating since 1978). None of the three workers involved had performed this particular task before.
This was JCO's first batch of fuel for the Joyo reactor in three years, and used a wet process, which required the workers to put uranyl nitrate solution into a precipitation tank. This meant they had to dissolve uranium oxide in nitric acid, and to do this fast, they were using stainless steel buckets rather than a purpose-built piece of equipment.
Next, the three operators poured the uranyl nitrate from the buckets into a precipitation tank in order to mix it evenly and quickly. Reports say that they used buckets because they were falling behind schedule. So one lesson from this incident is that safety-critical operations must never be subjected to time pressure which can lead to short-cuts.
In fact, they were supposed to use the installed plumbing which had a criticality control function. Because they were operating with buckets, too much of the solution ended up in a single tank, and this triggered the flash criticality and the neutron radiation which followed. While anti-nuclear campaigners have claimed that the whole assembly could have blown up like a bomb, this was in fact not possible at any stage.
The key elements here are the heavily enriched uranium, and the water. When a uranium atom fissions naturally, neutrons are produced, and a few of these will hit a second uranium atom, making it fission as well, splitting into two lighter atoms and releasing energy and more neutrons. Usually, most of the neutrons escape harmlessly, but in a critical mass, each fission produces neutrons which trigger at least one further fission, and a chain reaction starts. When a chain reaction is set off, we have a criticality incident.
Usually, the neutrons from a U-235 fission are traveling too fast to have any noticeable effect, but water is a moderator and has the effect of slowing the neutrons down to a speed where they do more, not less, damage. The same amount of U-235 in dry form would have been no problem, but in solution, it was deadly. There was no explosion, no huge release of energy, no fire that spewed radionuclides into the sky, no China Syndrome, where the whole mess boiled furiously and melted its way down into the ground. Instead, there was a burst of neutrons and gamma radiation, both of which traveled a short distance before being absorbed, some of the neutrons and radiation hitting the operators' bodies. There was a brief and apparently minor release of radiation to the atmosphere, but not a great deal.
From a detached scientific view, what happened next is fascinating. The solution got very hot when it went critical, and steam pockets formed, separating the U-235 enough to take the material below criticality, so it would cool, go critical, and boil once more, a process that went on for 17 hours, until the water surrounding the precipitation tank was drained away, since this water provided a neutron reflector. Later, a neutron absorber, boric acid solution, was tipped into the tank to stabilize the remaining contents.
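A toy simulation captures the flavor of this cycle. Every constant in the Python sketch below is invented for illustration, but the feedback is the one just described: criticality heats the solution, steam voids shut the reaction down, and cooling restarts it:

k, temp = 1.02, 20.0   # multiplication factor and temperature (arbitrary units)

for step in range(12):
    if temp >= 100.0:           # boiling: steam voids separate the U-235
        k, temp = 0.95, temp - 30.0
    elif k < 1.0 and temp < 60.0:
        k = 1.02                # cooled and re-mixed: critical again
    if k >= 1.0:
        temp += 25.0            # fission heats the solution
    else:
        temp -= 10.0            # subcritical: cooling dominates
    print(f"t={step:2d}: {'critical' if k >= 1.0 else 'subcritical'}, temp ~ {temp:.0f}")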
This on-again off-again situation is what typically happens in a criticality situation. It is no longer classified information that the early fission weapons used conventional explosives to force the two halves of a critical mass together long enough to create an "atomic explosion", but left to itself, critical material will tend to blow itself apart before any large-scale chain reaction can occur.
This blowing-apart can still hurl radioactive material around over a wide area, and injure or kill a number of people, but scenarios involving a nuclear holocaust are somewhat exaggerated. In the worst case, an incident like this is likely to kill tens or hundreds of people, rather than tens of thousands, but the worst case, or any case, should never have arisen.
Almost all the radiation produced came from the tank itself, and only traces of volatile fission products were released from the factory, including iodine, krypton, and xenon. Later, traces of cesium were also reported. People living within 350 meters of the plant were evacuated about five hours after the accident started, while people living within 10 km (6 miles) were advised to stay indoors, which was appropriate given what we now know. The 150 people evacuated were allowed to return to their homes when their safety could be assured, which was 27 hours later.
The five-hour reaction time, and the advice to stay indoors for those further away seem now to show an unsatisfactory level of preparedness. Emergency services arrived, unaware that a nuclear accident had taken place, and there seemed to have been no preparation or contingency plan to deal with an event like this.
The energy produced was equivalent to the fission of 2 mg of U-235, about 160 megajoules, roughly the energy released by burning five liters of petrol (about 1.3 US gallons of gasoline). The event was rated at 4 on the \JINES\j scale by the Japanese Government. This puts it behind Three Mile Island, although in that case, the accident was less significant in its actual radiation effects, because the problem area was shielded from staff and others.
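The equivalence is easy to check in Python, assuming the standard round figures of about 200 MeV released per fission and about 34 MJ per liter of petrol:

avogadro = 6.022e23
mev_to_j = 1.602e-13   # joules per MeV

atoms = 2e-3 / 235 * avogadro              # atoms in 2 mg of U-235
energy_mj = atoms * 200 * mev_to_j / 1e6   # ~200 MeV per fission

print(f"~{energy_mj:.0f} MJ, or ~{energy_mj / 34:.1f} liters of petrol")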
A week later the Japanese Science and Technology Agency suggested raising the event to \JINES\j level 5, because the level of off-site risk had been made worse by the operator's "established violation of the law and organizational neglect of safety management". In all, 39 workers, three firemen and seven members of the public were exposed to "elevated doses" during the accident, with a further 21 workers receiving elevated doses in the process of shutting down and clean-up.
The permissible limits for dosage are 50 millisieverts (mSv) in one year for workers in a nuclear plant, or 100 mSv for emergency situations, and 1 mSv/yr for members of the public; a dose of 8,000 mSv is likely to be fatal. Apart from the three serious cases (17,000, 8,000 and 3,000 mSv), all the other doses were within the permissible levels.
Some iodine-131 was detected in a ventilation stack, which was then closed. Radio-iodine is a problem because human bodies concentrate iodine in the thyroid gland, but the level, even inside the stack, was only twice the permissible level for air outside the factory: for the most part, the incident seems to have had no lasting effect on the environment - thanks to good luck, not good management.
The company admits that it violated both normal safety standards and legal requirements, and criminal charges are reported to be under way. At least one of the three operators apparently did not know the meaning of "criticality". None had performed this particular operation before, none was wearing a film badge and there was no neutron monitoring anywhere in the facility.
Aside from the bizarre practices, where JCO ignored the official operations manual, and even violated its own unauthorized procedures, this is very much a "one-off", according to people in the industry, who are predictably keen to point out the differences between this peculiar event and industry reality. The plant, being a "boutique operation", is outside the mainstream nuclear fuel cycle, and this apparently kept it relatively free from scrutiny, even for fuel processing plants.
But this raises questions about the scrutiny process itself: no major civil plant uses uranium enriched beyond 5% U-235, so the much higher level of enrichment should have raised alarms somewhere, and most plants use a dry process in any case, which is intrinsically safer. Once again, the use of a wet process should have required a greater level of control and regulation, and so should "boutique operations", where workers do not have training and experience in following established safe procedures. A fictional plot combining all of these danger signs, with no official reaction, would be criticized as too improbable for words, but it happened at Tokaimura.
This is not the first criticality incident in the world: the first happened in Chicago in 1942, under controlled conditions, and a number of similar criticality accidents have occurred, especially in US and Russian military plants and laboratories, most of them before the early 1980s, with incidents in 1958 and 1964 being described as "very similar to this accident". Once again, there was good reason to expect, anticipate, and prevent problems under these conditions if the workers and management were properly educated and trained.
A total of 37 previous accidents occurred in connection with research reactors or laboratory work for military projects, and these accidents killed ten people. Another 22 occurred in fuel cycle facilities, all but one of them military-related; these resulted in seven deaths, with energy releases ranging from 0.03 MJ to 3 GJ - from the amount of energy in 10 mL (about a third of a fluid ounce) of petrol/gasoline, up to the energy in 100 liters (22 gallons).
#
"INES",1244,0,0,0
(Oct '99)
The International Nuclear Event Scale, which runs from Level 0 to Level 7, was developed to allow prompt communication of the safety significance of nuclear events.
\BLevel 0:\b Below scale: events with no safety significance.
\BLevel 1:\b An anomaly involving "deviations from authorized functional domains".
\BLevel 2:\b An incident with potential safety consequences.
\BLevel 3:\b A serious incident involving any of a very small release with public exposure at a fraction of prescribed limits, major contamination, overexposure of workers, or a near accident with loss of defence-in-depth provisions, such as the incident at Vandellos, Spain in 1989, which was a turbine fire, with no radioactive contamination.
\BLevel 4:\b An accident mainly in installation with either a minor release involving public exposure of the order of prescribed limits, or partial core damage with acute health effects to workers, such as in Saint-Laurent, France, in 1980 (fuel rupture in reactor), or Tokaimura, Japan, in 1999.
\BLevel 5:\b An accident with off-site risks, with limited release and partial implementation of local emergency plans, on-site severe core damage, such as Windscale, UK (now \JSellafield\j), in 1957 and Three Mile Island, USA, in 1979.
\BLevel 6:\b A serious accident with a significant release, requiring full implementation of local emergency plans.
\BLevel 7:\b A major accident with major release of material and widespread health and environmental effects, such as Chernobyl, Ukraine, in 1986.
So far, there is no definition for a Level 8 event.
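For quick reference, the scale fits naturally into a small lookup table. The one-line glosses in this Python sketch are abbreviations of the descriptions above, not official wording:

INES = {
    0: "below scale: no safety significance",
    1: "anomaly",
    2: "incident with potential safety consequences",
    3: "serious incident (e.g. Vandellos, 1989)",
    4: "accident mainly in installation (e.g. Tokaimura, 1999)",
    5: "accident with off-site risks (e.g. Three Mile Island, 1979)",
    6: "serious accident with a significant release",
    7: "major accident (e.g. Chernobyl, 1986)",
}

print(INES[4])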
#
"Nuclear waste",1245,0,0,0
(Oct '99)
\JRadioactive waste\j comes in several forms. At the low end, there are contaminated gloves and protective suits, as well as tools and laboratory equipment used in processing radioactive materials, and worn-out parts of reactors which are only slightly radioactive. Under normal circumstances, such waste can be stored in drums until the radioactivity has died away, though in some cases, this may take 50,000 years or more. The problem is that while the material is harmless enough, if it happened to catch fire, the smoke and ash would carry radionuclides away, and if the storage area is flooded with water, some of the materials may be carried away and incorporated into the food chain.
One of the basic rules of science is that entropy, the disorder of things, always increases, but life can act as a local anti-entropy agent, making things more ordered in one place even as entropy increases on a larger scale. Iodine-131 is concentrated in the human thyroid, \Jstrontium\j-90 is so similar to calcium that we store it in our bones, and other organs store other radioactive materials as well. This means that wastes, even low-level wastes, need to be stored in something they cannot escape from until the danger is no more than we would find in the natural environment. One example of a product with the potential to do this is \JSynroc\j, with synthetic glasses as a less viable alternative.
Then there are higher levels of waste. The mops used to clean up at Tokaimura, for example, would be remarkably "hot", and need a different level of treatment, as intermediate-level waste, and other contaminated material needs similar treatment.
And at the top, we have the leftover fuel material, which is likely to remain dangerous for perhaps half a million years. This is of particular interest to the nuclear industry, which prefers to look at the spent fuel as the starting point for a new round of energy generation, rather than as a useless and problematical end-product.
The alternative is some form of geological disposal, perhaps as Synroc, or perhaps by dumping the waste in a subduction zone, where one tectonic plate is sinking beneath another plate, but this still remains a theoretical solution, with many worries associated with it.
#
"Japan and nuclear energy",1246,0,0,0
(Oct '99)
Japan is one of the world's richest nations, but it suffers a severe lack of natural energy resources, so the civilian use of nuclear power is central to the nation's economy. Japan is the largest energy importer in the world, but it aims to establish a complete Japanese fuel cycle and thus achieve virtual national independence in all aspects of nuclear power generation.
This has led to the Japanese pro-nuclear forces working to establish the acceptance of nuclear power overseas, as well as in Japan, because they have recognized the political importance of "winning hearts and minds". In that context, there will be those pro-nuclear officials in Japan who will be happy to see some other officials hung out to dry over Tokaimura, and equally, there will be those who will be working very hard to make sure that such abysmal ignorance is prevented hereafter.
To get a measure of the importance of nuclear power in Japan: in 1973, Japan imported 89.4% of its energy, while by 2010, in spite of huge increases in use, energy imports are expected to be down to 80%. In the same period, oil will have declined as an energy source from 77.4% to 50%, with a massive increase in the country's dependence on Middle East oil. The present aim is for the electricity generated by nuclear energy to reach 45.6 million kilowatts by the year 2000.
Japan also poses huge economic risks for the rest of the world. The most likely scenarios for massive worldwide economic disruption stem from a major earthquake in either California or Japan, followed by economic chaos as Japan rebuilds, or as Japanese insurers attempt to recoup their losses by selling off holdings outside Japan; but any major disruption of Middle East oil before about 2050 would also have the potential to turn Japan into an economic basket-case. The "big one" quakes in Tokyo or California can only be partly engineered against, but nuclear power offers the option of evading the worst effects of any disruption in the Middle East - and it is worth noting that most of the oil fields of the Middle East are in earthquake-prone zones as well.
Between 1970 and 1995, Japanese nuclear power went from 0.3% to 12% of energy production, so there is still a long way to go, and further increases are targeted. The realities of shoring-up a whole economy will almost certainly mean that a few sacrifices will be made beyond the hurried explanations put out by the nuclear industry to isolate Tokaimura as a one-off public relations disaster.
True Japanese energy independence requires a nuclear fuel cycle based on \Jfast breeder reactors\j, or FBRs, which generate more fuel as they operate. This has raised the fear that other countries might take advantage of the Japanese precedent, and begin reprocessing spent fuel. Such countries might launch the development of breeder reactors without taking adequate safeguards, and some of the plutonium produced might also be diverted to a weapons program.
In fact, Japanese insistence on pursuing a relatively costly plutonium program, in spite of the long-term drop in uranium prices, has raised the question of whether Japan truly intends never to use nuclear weapons. Indeed, a small scandal arose in Japan at the end of October when a right-wing official proposed that Japan develop such weapons.
As well, the program involves the regular shipping of mixed oxide or MOX fuel and vitrified high-level nuclear waste between Europe and Japan on a long-term basis, and this has allowed all sorts of terrorist scenarios to be served up. Even the "Monju accident" of 1995, when sodium leaked from the cooling system of Japan's prototype fast breeder reactor, Monju, has been used to call into doubt the safety and wisdom of all forms of nuclear power.
According to the Japanese government, even though oil and uranium prices are stable at the moment, both resources will approach depletion by the middle of the 21st century, and since Japan is poor in energy resources, it has no choice but to rely on nuclear power sustained by a fuel recycling program. The government also cites the reduced greenhouse emissions from using nuclear power, at the same time pointing to the limited potential of energy conservation and alternative energy sources.
In political terms, Japan sees China, Indonesia, Thailand, and Vietnam all aiming to increase their dependence on nuclear power, and sees an advantage in cooperating with a region which bears a striking resemblance to the areas Japan sought to dominate during World War II, in what they then called a Co-Prosperity Sphere. (At the same time, the rapid development of these same countries makes it more likely that there could be strong variations in the supply of, and demand for, available oil stocks.)
Opponents of the Japanese national policy cite safety concerns, and the need for more openness and greater control over the PAC, the body which oversees the recycling of nuclear fuel in Japan. They demand greater local control over the location of reactors, which would allow the NIMBY (Not In My Back Yard) principle to prevail, and more use of energy conservation and alternative energy sources. In other words, the competition is between idealism on the one hand, and political reality on the other. It is not hard to guess which "side" will win.
The NIMBY principle is already in action. People living near the plants ask why power plants are not built near Osaka or Tokyo, where most of the energy is consumed. In August 1996, the residents of the town of Maki, in Niigata prefecture, voted down the government-planned construction of a nuclear power plant by referendum, the first such case in Japan. Some 70% of people acknowledge the need for nuclear power, but more than half of them have concerns about its safety.
Right now, Japan needs to send its spent fuel to Europe for re-processing, but it aims to close the fuel cycle, moving away from light water reactors to breeder reactors, and set up its own plutonium recovery systems. After the Monju incident, Japan expected to start using plutonium in commercial breeder reactors in about 2030. Meanwhile, the country is stockpiling plutonium, and can be expected to have stocks of 5-10 tons of plutonium at any given time. The Japanese government says it will not accumulate excess plutonium and will seek to absorb much of its surplus plutonium by having it converted to MOX nuclear fuel for its light water reactors and its existing breeder reactors.
While Japanese nuclear specialists like Ryukichi Imai claim that reactor grade plutonium is not suited for weapons production, international nuclear experts say the reactor grade plutonium can be converted with little difficulty to weapons grade material.
#
"Uranium mining and safety",1247,0,0,0
(Oct '99)
Uranium ore is found as uranium oxide, U\D3\dO\D8\d, in rocks. While this is barely radioactive, the uranium produces radium and radon from radioactive decay chains, and these make the ore, and working with it, potentially hazardous, especially if it is high-grade ore. The ore is crushed and ground, then treated, usually with sulfuric acid, to dissolve the uranium and other metals. The leftover solids are pumped as a slurry to a tailings dam, which is designed and built to retain them securely. Most of the radium remains in the tailings, and the leach liquor, the dissolved material, is sent for processing.
The leach liquor is then chemically treated to precipitate out the uranium oxide as a bright yellow solid, called "yellowcake". This is dried at high temperature, producing a khaki powder which is packed into 200 liter (44 gallon) drums for shipping. According to industry figures, the radiation dose at one meter from a full drum is about half the dose received from cosmic rays by a person on a commercial jet flight.
Radiation illness in miners has been known from underground mines in the mountains near the present border between Germany and the Czech Republic, and in the late 1800s, this illness was diagnosed as lung cancer. By 1921, radon was suggested as a possible cause of the disease, and this was confirmed by 1939. Nonetheless, it was only in about 1960 that the problem was addressed properly.
Workers in the uranium mining and processing industry today are monitored carefully, and dosage limits are set at 20 millisieverts/yr, averaged over five years as the maximum allowable radiation dose rate for mine workers. This includes any dose received from radon (and radon daughters), and is in addition to natural background levels, but excludes medical exposure.
Radon is a special problem because it is a gas which can be breathed into the lungs, where it can decay to solid atoms which are energetic alpha-emitters. At high concentrations it is a health hazard because its short half-life means that disintegrations giving off alpha particles occur relatively frequently. Alpha particles discharged in the lung can eventually give rise to lung cancer - this became apparent in the 1960s, when miners who smoked were found to have very high levels of lung cancer, because the radon daughters were more likely to stay in their lungs.
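The link between a short half-life and frequent disintegrations is simple arithmetic: the decay constant is ln(2) divided by the half-life, and activity is the decay constant times the number of atoms. A Python sketch, using radon-222's half-life of about 3.8 days and an arbitrary sample size:

import math

half_life_s = 3.8 * 86400                    # radon-222: about 3.8 days
decay_constant = math.log(2) / half_life_s   # per second

atoms = 1e9                                  # an arbitrary sample
activity = decay_constant * atoms            # decays per second (becquerels)
print(f"~{activity:.0f} alpha decays per second from {atoms:.0e} atoms")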
Controls at uranium mines include ventilation. Most Australian uranium mines are open-cut, which provides good ventilation for the ores, which are generally less than 0.5% U\D3\dO\D8\d. The Olympic Dam underground mine has an ore grade less than 0.1% U\D3\dO\D8\d, but this has to be ventilated with powerful fans. This keeps the Olympic Dam radiation doses below 10 mSv/yr, with an average of about 5 mSv/yr.
As well, the mine site in open-cut mines is sprayed with water to keep dust down. At Australia's Ranger mine, dust typically contributes 4 mSv/yr to a worker's annual dose. In the processing plant, the uranium oxide is regarded as the toxic equivalent of lead oxide. Both have similar effects on the kidneys, so air is carefully monitored, and respirators are used where necessary, but this exposure is a biochemical hazard, rather than a radiological hazard.
When mining is complete, the tailings, holding virtually all of the radioactive radium, thorium, and actinium, still contain most of the radioactive material from the original orebody. Camping or walking on them would be about as dangerous as doing the same thing on the exposed orebody itself, so the tailings need to be covered over with enough rock, clay, and soil to reduce both gamma radiation and radon emanation rates to levels near those naturally occurring in the area. After that, the area can be seeded and planted to stabilize the surface. With no uranium left to keep feeding the decay sequences, the tailings quickly drop to about 70% of the original orebody's radiation level, and then remain fairly constant.
Process water left over after the yellowcake has been precipitated out contains radium and other metals, so this is retained on the site and evaporated, with the heavy metals being precipitated out and added to the tailings.
Under ideal circumstances, the actual mining of uranium should not cause any problems, but this leaves open the questions: what are the likely or possible un-ideal circumstances, such as floods, and also, what un-ideal uses might the uranium ore be put to outside of the mine area? To study these, we probably need to look more carefully at the nuclear fuel cycle and \Jnuclear fuel processing methods\j.
#
"Fast breeder reactors",1248,0,0,0
(Oct '99)
These are fast neutron reactors, which use a different technology from conventional reactors, generating power from plutonium bred from uranium-238, rather than relying on U-235. The FBRs are designed to produce more plutonium than they consume, unlike the "burners", fast neutron reactors that are net consumers of plutonium. Burners are useful as a way of disposing of plutonium from military weapons stockpiles, while generating useful energy.
The fast neutron reactors use plutonium from the closed fuel cycle as their basic fuel while also converting depleted (or natural) uranium, located in a "fertile blanket" around the core, into more fissile plutonium. Both U-238 and Pu-240 are "fertile", because they can capture a neutron and become (indirectly or directly) Pu-239 and Pu-241 respectively.
In a conventional ("thermal") reactor, six fissions typically produce 15 neutrons, but in a fast breeder, neutron production averages 17 neutrons from six fissions, which allows the system to "breed" more fissile material than is consumed - if this is desired.
The fast neutron reactors operate at a high temperature, using liquid sodium (melting point about 98°C) as the coolant, and this high temperature makes them very efficient. As well, they are potentially able to consume 100% of the uranium dug from the ground, yielding about 60 times the energy a gram of uranium gives in the standard light water reactors of today. The only problem is that cheap uranium and high capital costs make this an unattractive prospect this side of 2050, explaining the demise of the French Superphenix FBR (see \JNuclear news\j, June 1997).
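As a rough illustration of the neutron arithmetic (a sketch built on the round figures quoted above, not a reactor-physics model): on average, each fission must supply one neutron to keep the chain reaction going and one to convert a fertile atom into new fissile fuel just to break even; anything beyond two per fission is margin for losses and net breeding.

  # Neutron economy, using the article's round numbers.
  thermal_neutrons_per_fission = 15 / 6   # conventional ("thermal") reactor: 2.50
  fast_neutrons_per_fission = 17 / 6      # fast reactor: about 2.83

  SUSTAIN_CHAIN = 1.0   # one neutron must trigger the next fission
  REPLACE_FUEL = 1.0    # one neutron must breed one new fissile atom to break even

  for name, n in [("thermal", thermal_neutrons_per_fission),
                  ("fast", fast_neutrons_per_fission)]:
      spare = n - SUSTAIN_CHAIN - REPLACE_FUEL
      print(f"{name:7s}: {n:.2f} neutrons per fission, "
            f"{spare:+.2f} spare for losses and net breeding")

The fast reactor's extra third of a neutron per fission is what makes a breeding ratio above 1 achievable once parasitic losses are paid for.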
#
"A more recent date for the last Neandertals",1249,0,0,0
(Oct '99)
The Neandertal, or Neanderthal, people are generally thought to have died out about 30,000 years ago, somewhere in Spain. A new set of radiocarbon dates, taken from Neandertal fossils from the Vindija cave site in Croatia, was reported in the \IProceedings of the National Academy of Sciences\i in late October, suggesting that Neandertals roamed central Europe as recently as 28,000 years ago. This is the latest date ever recorded for Neandertal fossils worldwide.
Perhaps more importantly, the old picture that we have of the last Neandertals huddling in a corner of Spain will need to be revised if these dates are correct. It now appears that the disappearance of the Neandertals, whether by displacement or population absorption, was a slow and geographically mosaic process. For that to happen, the differences between the Neandertals and "modern humans" in basic behavior and abilities must have been small, minor enough for the two groups to have coexisted for several thousand years.
The dates were derived from pieces of Neandertal skulls from the Vindija cave site, and put their ages at between 28,000 and 29,000 years, while the previous most recent date for Neandertals in central Europe was about 34,000 years. On that evidence, people had assumed that the last remnants of the dying race were pushed into the Iberian Peninsula, where their last members died off some 30,000 years ago. The new dates certainly call that theory into question, while earlier work by Fred H. Smith, one of the authors, has identified some modern human anatomical characteristics in late Neandertal fossils from the cave site.
Some earlier research, described in \JNeandertal man partly cloned\j, July 1997, claimed to show that while Neandertals and early modern humans may have coexisted in Europe, they probably did not interbreed, but see \JDid Neandertals and modern humans interbreed?\j, April 1999, for a different view on this. Smith argues that "When you look at the anatomy of early modern Europeans, you also find a number of features that are hard to explain unless you allow the Neandertals some ancestral status. And actually, the Neandertal mitochondrial DNA is not completely out of the modern human range, just on its extreme periphery."
The cave is also the source of a large number of tools, some of them the relatively crude stone tools commonly associated with the Neandertals, while others are more sophisticated stone and bone tools, of the sort usually assumed to have come from early modern humans. So why the combination of tools? Did both groups inhabit the cave, perhaps at close but different times, or did the two groups trade tools with each other?
#
"The oldest dinosaurs?",1250,0,0,0
(Oct '99)
Madagascar, that excellent hunting ground for paleontologists, has delivered the goods once again, it seems. A report in \IScience\i in October details the jaws of two of the oldest dinosaurs ever discovered, as well as the remains of eight other prehistoric animals. The fossils come from the middle to late Triassic (225 to 230 million years ago), a period that has long puzzled paleontologists: it begins with a range of reptiles, amphibians, and other vertebrates on the land, and finishes with the early dinosaurs and mammals present, but leaves few fossil clues to what went on in between.
The two new dinosaurs appear to be older than \IHerrerasaurus\i and \IEoraptor\i, the current seniors of the dinosaur world. They were prosauropods, herbivores with small heads and long necks, about the size of a newborn calf, which could walk on two or four legs, relatives of, or ancestors to, the great sauropod dinosaurs which evolved later, such as \IApatosaurus.\i However, naming will take a while, as the discoverers want to get more fossil material excavated or cleaned before they commit themselves.
It is worth recalling here that the name \IBrontosaurus\i was given from an almost complete specimen, but it was later discovered that a fragment of a brontosaur had previously been named \IApatosaurus\i, and on the rules for zoological naming, this earlier name had to be used. They do know that "Mena" will appear in at least one of the names, commemorating the young Madagascan who led them to the site of the fossils in 1996.
The \IScience\i report covers only part of what has been discovered so far, because apart from the prosauropods, the find includes three members of the branch of animals that includes modern-day reptiles and five members of the branch that includes mammals. The report describes in detail only one of the so-called "mammal-like reptiles", together with the two prosauropods.
The other finds are the key to dating the dinosaurs, as several of the other animals are more ancient, and come from lines that have never before been found in the same strata as dinosaurs. So while the Madagascar sites have not yet yielded the right minerals for radioisotope dating, the rest of the fossil record suggests that this is a very special find indeed. The anatomical details of two of the fossils, one a parrot-beaked reptile, the other an early relative of mammals, point to the fossils as more primitive cousins to similar animals already known to be about 228 million years old.
More important, though, is what is missing: the site is suspiciously lacking in fossils of aetosaurs, small, armored reptilian herbivores that were abundant about 228 million years ago. If there are no aetosaurs, perhaps that is because this site dates from a time before the aetosaurs evolved.
The fossils are at present held at the Field Museum of Chicago, but once the study of them is complete, many will be returned to Madagascar.
#
"The strange tale of the homosexual beetles",1251,0,0,0
(Oct '99)
\IDiaprepes abbreviatus\i is an inch-long black beetle commonly known as the sugar cane rootstalk borer weevil. The beetle is common in southern Florida, in the United States, where it feeds on citrus crops. In the past, the beetles have been seen engaging in "homosexual behavior", where beetles mount other beetles of the same sex in apparent copulation. It turns out that this lurid slant is rather less than reliable as a description of what is actually going on.
Ally R. Harari, together with Jane Brockmann, has studied the effect, and a report which they published in \INature\i during October seems to offer an explanation for this behavior. Both male and female beetles mount each other, and the researchers have discovered that the sight of a pair of mounted females attracts large males, who are then equally likely to mate with either of the two females. Smaller males, on the other hand, seem to be put off by the size of the top female, so they stay away.
In other words, it appears that by mounting each other, the females are able to attract more attention from larger males than if they were seeking males alone. The bottom females are able to push the top ones off, but do not do so, again suggesting that there is an advantage in the behavior for that female as well.
The males and females are almost indistinguishable, so when passing males see the mounted females they mistake them for a male mounting a female. Studies have shown that males are attracted by the sight of normal male-female mating couples, and while some insects use pheromones, chemical signals to attract a mate, it seems that the rootstalk borer responds to visual cues, to the sight of potential mates.
What makes the study memorable is a problem the researchers faced: female beetles mount each other for an average of 17 minutes, which makes it difficult to examine the phenomenon experimentally. So the intrepid pair got around this by gluing dead females on the backs of live females, so they could observe the reactions of males. But however you look at it, the citrus farmers of south Florida need have no homophobic fear of what the beetles are doing down in the orchard.
#
"The earthquake dangers beneath the Pacific Northwest",1252,0,0,0
The worst earthquake in the US Pacific Northwest this century was nothing like the long-anticipated earthquake called "The Big One". When "The Big One" comes, it will be a great thrust earthquake caused by the rupture of a huge offshore fault beneath the ocean. Yet the most damaging earthquake in that part of the USA this century was a different type of shock, called an "intraslab" earthquake: a magnitude 7.1 event which occurred in 1949 beneath Olympia, Washington state, and caused over $100 million in damage.
A paper by Simon M. Peacock and Kelin Wang on intraslab earthquakes was published in \IScience\i in late October. It confirms the way in which these quakes are generated and warns of the hazards intraslab quakes pose to certain geographic areas, such as Japan and the northwest of the USA. According to Peacock, the intraslab earthquakes are probably not given sufficient consideration when scientists carry out seismic hazard analysis. He says this is especially important, since these less easily understood earthquakes occur closer to major population centers than the larger, offshore, great thrust earthquakes.
The better-known and better-studied great thrust earthquakes, like the intraslab earthquakes, occur in "subduction zones", where oceanic crust dives beneath the edge of a continent, but the great thrust earthquakes have focal depths between 0 and 50 kilometers, somewhere along the sloping boundary between the descending plate and the continental margin. The intraslab earthquakes happen within the descending oceanic crust, at depths of 50 to 300 kilometers beneath the surface. As well, these quakes appear to be caused by different processes, by changes in the rocks themselves.
In 1996, Stephen H. Kirby at the USGS proposed a mechanism for intraslab earthquakes, which is supported by the current study. Kirby's view is that the intraslab earthquakes happen because the intense heat and pressure in the subduction zones change the descending oceanic crust into denser rocks, in a process called metamorphism. These changes "wring" water out of the rocks as the new minerals form, and the water which is driven out is able to lubricate and reactivate pre-existing faults. Without the water, the weight of the overlying rocks would hold everything in place, but the pressurized water allows the faults to slip, setting off earthquakes.
While Kirby's theory is well-founded in mineralogical theory, Peacock and Wang have now provided sound observational support for it. They did this by examining two subduction zones in Japan, knowing that the key element in the mineral changes is temperature-dependent. In a warm zone, where the temperature of the subducting oceanic crust is higher, the triggering temperature will be reached at a shallower depth, setting off shallow intraslab earthquakes and causing less volcanic activity than would be seen where the subducting crust is colder.
The oceanic crust diving down beneath southwest Japan forms a "warm" subduction zone, while the subduction zone beneath northeast Japan is "cold" - and as the theory predicted, things happen much deeper beneath northeast Japan, while intraslab earthquakes occur at relatively shallow depths beneath southwest Japan.
The Pacific Northwest (northern California, Oregon, Washington, and southern British Columbia) is underlain by warm subduction zones. So, too, is southern Mexico, where the Oaxaca earthquake in late September was an intraslab earthquake. According to Peacock, "We're starting to realize that we have to worry about a magnitude 7-7.5 intraslab earthquake located 50 km beneath Seattle or Vancouver, as well as a magnitude 8 or 9 out on the coast."
An interesting side issue: Peacock's expertise is metamorphic petrology, a largely academic study, but he points out that the study is an example of how specialized scientific research can sometimes yield information with real significance to everyday life. All of a sudden, ". . . there is a solid connection between metamorphic processes and earthquakes that have killed tens of thousands of people," he says. As so often happens, a piece of pure science turns out to provide the basis for some applied science.
#
"The 1999 earthquakes",1253,0,0,0
(Oct '99)
The major earthquakes of 1999 have shown a great degree of variability, as earlier reports have indicated. Here are the latest figures on deaths from those earthquakes:
August 17, 1999 Turkey - a magnitude 7.4 earthquake which killed 17,000.
September 7, 1999 Greece - a magnitude 5.9 earthquake which killed 143.
September 20, 1999 Taiwan - a magnitude 7.2 earthquake which killed 2,300.
September 30, 1999 Mexico - a magnitude 7.5 earthquake which killed 33.
And, most recently, on October 16, California - a magnitude 7.1 earthquake which woke up everyone, killed no one and "broke a few bottles of ketchup". (As this report was being finalized, news arrived of a November 12 earthquake beneath Duzce in Turkey, with a magnitude of 7.2. This earthquake collapsed more buildings, and within several days had a reported death toll of 450, a figure which seems likely to rise.)
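Because the magnitude scale is logarithmic, these events differ far more in energy than the numbers suggest. A standard rule of thumb (the Gutenberg-Richter energy relation, not part of the original reports) puts the radiated seismic energy at log10 E = 1.5M + 4.8, with E in joules:

  def quake_energy_joules(magnitude):
      """Approximate radiated seismic energy, Gutenberg-Richter relation."""
      return 10 ** (1.5 * magnitude + 4.8)

  quakes = [("Turkey", 7.4), ("Greece", 5.9), ("Taiwan", 7.2),
            ("Mexico", 7.5), ("California", 7.1)]
  for place, m in quakes:
      print(f"{place:10s} M{m}: {quake_energy_joules(m):.1e} J")

  # The Turkey event released roughly 180 times the energy of the Greece event:
  print(quake_energy_joules(7.4) / quake_energy_joules(5.9))  # about 178

On this reckoning, the magnitude 5.9 Greek earthquake that killed 143 people released less than 1% of the energy of the Turkish event; death tolls clearly depend at least as much on where and how people live as on the raw energy release.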
#
"Lemur news for 1999",1254,0,0,0
(Oct '99)
Lemurs have been restricted to Madagascar, off Africa's east coast, for more than 50 million years, where they have evolved into almost 50 species, including about 16 species of giant lemurs which are now extinct. Many of the remaining species are threatened. As we have indicated in the past (\JLemur news for 1998\j), Duke University is continuing its work on lemur breeding and release. They report that they have just captured two of the animals, a male and a female, both "diademed sifakas" - the largest living lemur and considered among the most beautiful of primates, with lush fur of yellow, orange, gray, white and black.
The female has been named "Juliet", because after a six-to-12-month period of acclimatization in Madagascar, she is expected to join another of her species, Romeo, in residence at the Primate Center in Durham, North Carolina, where they will hopefully help establish a new population, which can later be released back into the forests of Madagascar.
The sifakas were taken from a patch of forest which is fast disappearing, being destroyed by wood-gathering and slash-and-burn agriculture. The expedition originally hoped to capture three young adult females and two males, as part of an urgent effort by the Primate Center scientists to establish a captive breeding colony of the animals before they become extinct from hunting and habitat destruction. The catch was disappointing, but at least it gives the Durham team something to work on.
Romeo arrived at the Durham Primate Center in late 1993, along with his mother Titania and another unrelated male named Oberon. Oberon died soon after of an infection, and Titania died later, probably from too much calcium in her diet, an unexpected and atypical reaction for a lemur. As an infant, Romeo needed more calcium in any case, and so did not suffer from hypercalcemia. He survived, but calcium-rich foods are now excluded from his diet, and the Duke primatologists believe they have developed enough information about the sifaka's low-calcium diet of leaves to bring more animals into captivity without any risk of loss.
Romeo now weighs 14 pounds (6.5 kg), well on his way to his adult weight of 18 to 20 pounds (8-9 kg), so it is time to get him into a breeding program. Because captured animals are well fed and protected from disease and natural enemies, they can produce from five to 10 times more offspring that survive to adulthood than wild animals normally can, making the early losses worth bearing, in order to save the whole species.
#
"Ice-age sediment cores record extreme climate change",1255,0,0,0
(Oct '99)
A report in a mid-October issue of \IScience\i describes the analysis of sediment cores taken from the subtropical Atlantic Ocean, from sediments deposited during Earth's last glacial period. The study of these cores indicates that the temperature of the Sargasso Sea, between the West Indies and the Azores, fluctuated repeatedly by up to 5°C (9°F) from 60,000 to 30,000 years ago.
This shows that the warm Atlantic, like the polar Atlantic, was undergoing very large and very rapid temperature changes during the last glacial period. Previous studies and models have all indicated that changes in warm ocean temperatures are likely to produce widespread, global climate impacts, if only because of the effects they have on atmospheric humidity, as we see every few years with the El Niño effect. Warm oceans put much more water vapor into the atmosphere, which traps more heat and drives temperatures higher still.
Scott Lehman and Julian Sachs reached their conclusions after studying 50 meters of sediment cores hauled up from several miles down in the Sargasso Sea near Bermuda by French scientists as part of an international project. They analyzed the saturation state of organic molecules from planktonic algae over the past 100,000 years, providing a measure of the sea-surface temperatures during that period.
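The organic molecules in question are alkenones, long-chain compounds whose degree of saturation reflects the temperature of the water in which the algae grew. A minimal sketch of how such a proxy is read (the calibration constants are one standard published calibration, the Prahl-Wakeham values, and the core abundances below are invented for illustration):

  def alkenone_sst_celsius(c37_2, c37_3):
      """Sea-surface temperature from the alkenone unsaturation index.

      UK'37 = C37:2 / (C37:2 + C37:3), and a widely used calibration is
      UK'37 = 0.034 * T + 0.039, with T in degrees Celsius.
      """
      uk37 = c37_2 / (c37_2 + c37_3)
      return (uk37 - 0.039) / 0.034

  # Hypothetical alkenone abundances from two depths in a sediment core:
  warm_layer = alkenone_sst_celsius(70, 30)   # about 19.4 C
  cold_layer = alkenone_sst_celsius(55, 45)   # about 15.0 C
  print(f"swing: {warm_layer - cold_layer:.1f} C")  # a swing of the size reported

Because the sediment builds up layer by layer, each depth in the core is a date, and the chemistry at that depth is a thermometer reading for that date.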
At the end of the last ice age, about 10,000 years ago, there was a warming, accompanied by the disappearance of enormous ice sheets. At the same time, global atmospheric CO\D2\d levels rose by a third, and there were changes in the seasonal distribution of the sun's energy. The addition of large amounts of fresh water to the oceans can trigger abrupt and long-lasting cooling events, including ice ages, by interfering with the "conveyor belt" of water (including the Gulf Stream) which carries heat from the tropics to temperate regions. (See \JGlobal warming: could we lose the conveyor?\j, November 1997.)
This sort of effect has been recognized before - see \JA quick cold snap\j, July 1999, for an earlier take on the same issue. If more heat is trapped in the atmosphere, this has the potential to kill major parts of the ocean circulation, and then echo around the world. What is new are the worrying reports that the whole global system shows signs of being on a hair trigger, where the world climate we know and survive under could collapse without warning. What needs repeating is the reminder that there are no safe refuges. Any human-induced changes to the ocean's plumbing are likely to affect everyone on Earth, not just Greenlanders and Northern Europeans.
#
"Is the West Antarctic ice sheet in its death throes?",1256,0,0,0
(Oct '99)
The West Antarctic Ice Sheet is about 930,000 square kilometers (360,000 square miles) of ice which has been receding steadily for 10,000 years. If the next few years are going to see any serious rise in sea levels, it will probably be because of the sheet's potential instability.
The ice sheet is about the size of Texas and Colorado combined, or France and Italy combined, or New South Wales, and its complete collapse would raise global sea levels some 4 to 6 meters (15 to 20 feet), flooding many low-lying coastal regions, including all of the world's port facilities, and many of the world's airports. Unlike ice floating on the surface of the sea, which has \Iabsolutely no effect\i on the sea level as it melts, an ice sheet which sits on hard rock like the West Antarctic sheet can make a huge difference to sea levels.
A paper describing recent research on the ice sheet, published in \IScience\i in early October, suggests that the sheet's complete disintegration in the next 7,000 years is inevitable. Human-caused climate change might hasten the ice sheet's demise, but it looks as if there is nothing humans can do to slow or reverse the trend.
Howard Conway, lead author on the paper, says that "Collapse appears to be part of an ongoing natural cycle, probably caused by rising sea levels initiated by the melting of the Northern Hemisphere ice sheets at the end of the last ice age, but the process could easily speed up if we continue to contribute to warming the atmosphere and oceans." Co-authors on the study include Edwin Waddington, Anthony Gades, George Denton, and Brenda Hall.
A variety of evidence from raised beaches and radar imaging of subsurface ice structures indicates that the ice sheet has both thinned and decreased in area since the last glacial maximum 20,000 years ago. The rate of withdrawal of the grounding line (the boundary between floating ice and grounded ice) has been constant for the last 7,600 years. The grounding line has receded about 1,300 km (800 miles) since the ice age, and the withdrawal continues at the same rate today, giving the estimate of another 7,000 years, all other things being equal.
This will not be the first time the area has been clear of ice. Other scientists have found fragments of diatoms from cores drilled through the ice and into the land beneath. As diatoms usually need free water to build their colonies, this suggests the region once was free of ice, perhaps as recently as 130,000 years ago, between the last two ice ages.
\BFootnote\b: It may seem illogical that a floating ice block can melt into the sea without affecting sea levels, but this is a simple application of the principle proposed by \JArchimedes\j, that a floating body displaces a mass of fluid equal to its own mass, so the ice block is already displacing the same mass of water that it will form when it melts. A small iceberg floating in a lake will melt without raising the water level, and ice blocks in a brimming glass will also melt without spillage - so long as the glass is covered by a jar to prevent too much condensation from the atmosphere onto the melting ice blocks.
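A back-of-envelope check on the quoted 4 to 6 meter figure (a sketch only: the ice volume used is an assumed round number, and a real calculation must also allow for the part of the sheet already below sea level, which displaces some ocean water now):

  RHO_ICE = 917.0        # kg per cubic meter, glacial ice
  RHO_WATER = 1000.0     # kg per cubic meter, meltwater
  OCEAN_AREA = 3.61e14   # square meters, the global ocean surface

  def grounded_ice_rise_m(ice_volume_km3):
      """Sea-level rise from melting ice that rests on rock rather than floating.

      The meltwater volume is the ice mass divided by the water density;
      spreading that volume over the ocean surface gives the rise. Floating
      ice, by the Archimedes argument above, already displaces its own mass
      of water, so melting it leaves the level essentially unchanged.
      """
      melt_volume_m3 = ice_volume_km3 * 1e9 * RHO_ICE / RHO_WATER
      return melt_volume_m3 / OCEAN_AREA

  print(f"{grounded_ice_rise_m(2.2e6):.1f} m")  # about 5.6 m

An assumed 2.2 million cubic kilometers of grounded ice, spread over the oceans, lands comfortably inside the 4 to 6 meter range quoted above.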
See also \JWest Antarctic icesheet still stable\j, December 1998.
#
"Snowball Earth?",1257,0,0,0
(Oct '99)
Has the Earth been a giant snowball, covered in ice, glaciers and frozen seas, from pole to pole, three times in the past? That is the claim put forward by James F. Kasting, addressing the annual meeting of the Geological Society of America in late October. He puts two of the events at 600 and 750 million years ago, and the earliest at 2.3 billion years ago.
Kasting claims that there is convincing evidence that at least six of the seven continents were once glaciated - and that some of them were near the equator at the time. If the evidence for glaciation at the equator is correct, then there are two possible explanations for equatorial glaciation. It is possible that the Earth's tilt, now 23.5 degrees from vertical, was greater than about 54 degrees from vertical. This would have positioned Earth so the poles received the most solar energy and the equator would receive the least, creating a glacier around the middle but still leaving the poles unfrozen.
As an alternative, greenhouse gases in the atmosphere may have fallen low enough so that over millions of years, glaciers gradually encroached from the poles to 30 degrees from the equator. Once they got this close, the greater reflectivity of the ice would cool the rest of the Earth, so that in about 1,000 years, the ice would fill the remaining gap.
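The runaway step in that account is the ice-albedo feedback, and a zero-dimensional energy balance (standard textbook physics, not Kasting's own model) shows why creeping ice is so dangerous: raising the planet's reflectivity directly lowers the temperature at which absorbed sunlight balances emitted heat.

  S = 1367.0        # watts per square meter, solar constant at Earth
  SIGMA = 5.67e-8   # Stefan-Boltzmann constant

  def equilibrium_temp_k(albedo):
      """Effective temperature where absorbed sunlight balances blackbody
      emission: S * (1 - albedo) / 4 = SIGMA * T**4 (greenhouse warming ignored)."""
      return (S * (1 - albedo) / (4 * SIGMA)) ** 0.25

  print(f"present-day albedo (0.3):  {equilibrium_temp_k(0.3):.0f} K")  # about 255 K
  print(f"largely ice-covered (0.6): {equilibrium_temp_k(0.6):.0f} K")  # about 221 K

Colder temperatures make more ice, more ice raises the albedo, and the loop only stops when the planet is white from pole to pole.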
Kasting favors the greenhouse gases scenario, and suggests that the first glaciation was triggered by changes in methane, while the two later events were triggered by changes in carbon dioxide levels. Small increases in the amounts of oxygen in the early planet's methane atmosphere would send methane levels plummeting without producing enough carbon dioxide to compensate, but over time, carbon dioxide from volcanoes would raise the levels to a point where the Earth could start to thaw once more.
Kasting can even explain the drops in carbon dioxide for the two later events. As calcium and magnesium silicate rocks weather, the magnesium and calcium carbonate deposits which form take large amounts of carbon dioxide out of the atmosphere. The big problem, say other scientists, lies in explaining where the biological evidence for the glaciations has gone. Quite a few different life forms would have had to survive the glaciation, which is difficult to imagine on an ice-covered world, though it is possible that our remote ancestors survived in refuges like hot springs or near undersea thermal vents.
#
"November, 1999 Science Review",1258,0,0,0
\JHigh-velocity clouds explained\j
\JA Hawaiian-style volcano on Io\j
\JA quasar, far, far away\j
\JSix more planets\j
\JA planet confirmed\j
\JMercury transit\j
\JMars Polar Lander\j
\JCancer cures\j
\JMelanoma vaccine trials succeed\j
\JSetting a disease to kill a disease\j
\JHow a virus triggers asthma\j
\JHepatitis-related carcinoma may increase in the US\j
\JBeating the TB bacillus\j
\JChasing the dragon causes problems\j
\JHerpes and heart disease link explained\j
\JHuman genome carries a virus related to HIV\j
\JDeinococcus radiodurans genome sequenced\j
\JFirst detailed map of malaria parasite\j
\JModified soya beans strike problems\j
\JHow flowers recognize pollen\j
\JMajor journals join to offer better electronic publishing\j
\JBoys, girls, dating and vulnerability\j
\JThe homemade supercomputer\j
\JChiba City gets real\j
\JGetting more bandwidth\j
\JSmall, smaller, smallest\j
\JKilling landmines\j
\JChimpanzee subspecies more diverse than humans\j
\JAlphabets get older\j
\JThe oldest vertebrates so far\j
\JA new giant plant-eating dinosaur found\j
\JThe Latest Paleocene Thermal Maximum\j
\JTracking the missing minerals\j
\JUranium-helium thermochronology\j
\JWhy does "Lonesome George" refuse to mate?\j
\JLizards, forests, and speciation\j
\JSeeds in rainforest fragments\j
\JDeath of a lemur\j
\JChina and sulfur emissions\j
\JArctic sea ice disappearing\j
\JWhat drives the African drought\j
\JPlague cases increase in the USA\j
\JJan van Paradijs, 1946-99\j
#
"High-velocity clouds explained",1259,0,0,0
(Nov '99)
In the mid-1960s, astronomers detected mysterious clouds of gas, which might or might not have been in our galaxy. The clouds showed every sign of moving through space at high speed, but they were not rotating neatly along with the rest of the galaxy. Their position, as well as anybody could assess it, was anywhere between a few hundred light years and 10 million light years away - the upper end of that range would place them well outside the Milky Way.
It now appears, from a report in \INature\i during November, that the clouds are not only part of the Milky Way, but they play a key part in the galaxy's ability to keep forming new stars.
Observations from new large ground-based telescopes and the orbiting Hubble Space Telescope have recently allowed astronomers to establish that one such cloud lies about 20,000 light years from Earth in the halo of the Milky Way, a region high above the star-rich plane of the galaxy.
Now Hubble data have been used to reveal the heavy elements in another high-velocity cloud which appears to lie between 10,000 and 40,000 light years above the plane of the galaxy. The authors of the report believe that the high-velocity clouds are involved in the chemical evolution of the galaxy by showering it with metal-poor gas that counteracts a buildup of heavy elements within the stars and gas found in the disk of the Milky Way.
Every star begins as gas, mainly hydrogen, and over time, nuclear reactions turn the hydrogen to helium, and then into heavier elements, including metals. The stars then shed these metals back into interstellar space, so recently-formed stars should be richer in metals than old stars - but in reality, stars seem to have much the same heavy element concentrations, whatever their age.
Previous explanations of the constant level of the heavy elements have ranged from the idea that, in the past, stars may have been more efficient at producing heavy elements to the notion of unknown processes at work, but these rather clumsy ideas can now be discounted, says Bart Wakker.
This finding would also help to explain how the galaxy can keep generating new stars at the rate of about one a year: the cloud observed by Wakker is estimated to contribute about one-fifth of a solar mass per year, and there are other clouds, also making a contribution. Now the only problem seems to be explaining where the gas comes from.
It may be gas left over from the formation of the so-called Local Group of galaxies that includes the Andromeda Nebula, or it may be that the Milky Way is still forming, continuously gathering gas from near the edge of its sphere of influence. There is at least one other possibility, and that is that the clouds might have been stripped away from passing dwarf galaxies.
#
"A Hawaiian-style volcano on Io",1260,0,0,0
(Nov '99)
New images from the Galileo spacecraft released in early November have revealed unexpected details of the Prometheus volcano on Io, including a caldera and lava flowing through fields of sulfur dioxide snow. The volcanic crater photographed on Jupiter's moon Io during a close flyby is several times larger than the one found at Hawaii's Kilauea. But while Prometheus is much larger, it has characteristics remarkably similar to those of the Kilauea shield volcano.
According to Laszlo Keszthelyi, a Galileo research associate at the University of Arizona, both are long-lived eruptions, with flows that apparently travel through lava tubes and produce plumes when they interact with cooler materials. In another parallel, Prometheus has been active during every observation over the past 20 years.
The images actually came from a flyby on October 10, and are part of a large batch of data currently being transmitted to Earth. They reveal two distinct hot spots at Prometheus - a large one to the west and a fainter, cooler one to the east. As well, they show numerous lava flows near the western hot spot and enable scientists to identify a crater, or caldera, 28 kilometers (17 miles) long and 14 kilometers (9 miles) wide near the hot spot to the east.
Prometheus has a 50-100 kilometer (30-60 mile) tall plume, which was thought to form where the lava erupts onto the surface. From the new images, it now appears that the plume forms at the far end of the lava flows. The caldera and eastern hot spot are thought to be associated with the vent where the molten rock rises to the surface.
After the lava reaches the surface, it appears to be transported westward through lava tubes for about 100 kilometers (60 miles) before breaking out onto the surface again. At this point, numerous lava flows wander across a plain covered with sulfur dioxide-rich snow, and the plume is created by the interaction of the hot lava with the snow.
Kilauea, by comparison, has a small lava lake about 100 meters (330 feet) across that produces a relatively small thermal hot spot, with lava tubes running from there to the Pacific Ocean 10 km (6 miles) away, where the lava comes in contact with the sea and forms a plume of steam.
One difference though: the volcanoes of Io seem to have temperatures of more than 1700 kelvin, and perhaps as high as 2000 kelvin, that is, around 1430-1730°C, or 2600-3140°F, making these the hottest spots on the surface of any orbiting body in our solar system. By way of comparison, the volcanoes on Earth are typically no more than about 1500 kelvin (1230°C or 2240°F).
Galileo made another flyby of Io in late November. These missions take the spacecraft through intense radiation, and controllers at first believed that this risky pass had damaged the spacecraft, but they later reported that it had survived unscathed. As with most space missions, every part carries its risks, and the NASA controllers had delayed these flybys until the end of the mission, when all of the 'safer' goals had already been met.
#
"A quasar, far, far away",1261,0,0,0
(Nov '99)
Some time about 11 billion years ago, something flared, right across the electromagnetic spectrum. The radiation spread out in all directions, and just a few years ago the radiation from that event passed through our solar system and was recorded by a variety of instruments, including the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. The big drawback with BATSE is that it was designed to detect flashes of radiation from gamma-ray bursts, so it observes the entire sky and cannot be pointed, in the conventional sense, at just one star or galaxy.
On the surface of the Earth, if the radiation reached us (it doesn't), you could identify the location of a source by moving a shadow across the detector: when the signal cut out, you would know the source lay on the line running from the detector through the shading object, and beyond. BATSE sits on a satellite, out of reach, but it does get "shaded" by the Earth itself, and once people realized this, they could turn to a supply of BATSE data stretching back several years, looking for sources that "rise" and "set" behind the Earth, from the satellite's point of view.
This is how Angela Malizia, a doctoral candidate at the University of Southampton in England, was able to report that a distant quasar shines regularly in gamma rays and emits the occasional burst. Others involved include Mike McCollough, Loredana Bassani, J. B. Stephen, Bill Paciesas, and Nan Zhang.
Malizia's work combined thousands of observations of the region, so that the random noise in the separate signals averaged away, allowing her to detect faint signals that normally escape BATSE's notice. So now we know that we can use gamma rays, the most energetic form of electromagnetic radiation, to detect quasars out to 11 billion light years - at least in some directions. The importance of this is that we are now able to find and study more examples of these distant objects, which formed when the universe was much younger.
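The statistics behind this are the familiar square-root law: averaging N independent measurements shrinks random noise by a factor of the square root of N, so a signal far below the single-observation noise floor emerges once enough observations are stacked. A toy demonstration with invented numbers (not BATSE data):

  import random

  random.seed(1)
  TRUE_FLUX = 0.05   # source brightness, in units where single-shot noise = 1.0

  def stacked_estimate(n_obs):
      """Average n_obs noisy looks at the same weak, steady source."""
      total = sum(TRUE_FLUX + random.gauss(0.0, 1.0) for _ in range(n_obs))
      return total / n_obs

  for n in (1, 100, 10000):
      print(f"N={n:>5}: estimate {stacked_estimate(n):+.3f}, "
            f"expected noise {1 / n ** 0.5:.3f}")

  # By N = 10,000 the noise (~0.01) has fallen well below the 0.05 signal.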
The source is known either as 4C 71.07, its designation in the 4th Cambridge University catalog of radio sources, or as QSO 0836+710, a quasar or quasi-stellar object that emits baffling amounts of radio energy. The numbers designate the same place in the sky: 71.07 is its declination, while 0836+710 gives right ascension and declination. Either way, the value tells us that this object lies many degrees from the galactic plane, at a high galactic latitude, well above (or below) the plane, in a part of the sky which offers us a better view into deepest space. This is because there are fewer strong gamma-ray sources in these directions, compared with those we find all over the crowded Milky Way.
This is the faintest and most distant object to be observed in soft gamma rays, although it has already been seen in gamma rays by the Energetic Gamma Ray Telescope (EGRET) which is also aboard the Compton Gamma Ray Observatory.
Quasars like 4C 71.07 are also known as active galactic nuclei or AGNs. This particular one has a red shift of z=2.17, putting it about 11 billion light years away in a 12 to 15-billion-year-old universe (using z=1 as 5 billion light years). According to researchers, it appears to be the nucleus of a galaxy that is showing extraordinary activity, probably a supermassive black hole at the center of a galaxy that is still forming.
Our own galaxy and the famous M31 spiral galaxy in Andromeda may have started out as AGNs, before they settled down to what we regard as "normal". The key is the black hole in the middle, and here we need to recall that black holes come in two forms. First, we have the "ordinary" ones with as much mass as a few suns compressed into a region just 10 to 20 kilometers across (though any measurement of the diameter is meaningless since even the fabric of space is stretched around a black hole). The second sort is a supermassive black hole with a mass equivalent to millions of suns crammed into a volume about as wide as our solar system.
This sort of black hole consumes everything nearby, but eventually is left with stars orbiting around it, too far away to be dragged in. But by definition, black holes emit no radiation, nothing except gravitational pull, so where does the radiation come from? The answer is simple: as stray bits of material are accelerated up to speeds close to the speed of light, they radiate energy, and it is this radiation that we detect. Electrons spiral along magnetic fields, and it is one of the givens of physics that when charged particles are accelerated, they radiate. As well, the electrons interact with visible light emitted by the disk around the black hole, boosting that light up into the X-ray and gamma-ray range.
The source "bursts", although not all at once across the entire spectrum. In late 1995, it reached its record optical brightness, while 55 days later, its gamma-ray emissions peaked, fading back to the average output three months later. According to Angela Malizia, this indicates that the source is about a third of a light year - roughly 3 trillion km, or 2 trillion miles - across. Because the signal to erupt must be carried by radiation, or by particles traveling at almost the speed of light, the duration of an event translates into a maximum size for the source.
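The size estimate rests on a light-travel-time argument: whatever tells a source to flare cannot cross it faster than light, so the flare's timescale caps the size of the emitting region. A sketch using the figures quoted above:

  C_KM_PER_S = 299_792.458   # speed of light
  LIGHT_YEAR_KM = 9.461e12

  def max_source_size_ly(duration_days):
      """Upper limit on an emitting region's size: R <= c * (variation timescale)."""
      return C_KM_PER_S * duration_days * 86_400 / LIGHT_YEAR_KM

  print(f"{max_source_size_ly(55):.2f} light years")   # ~0.15 ly from the 55-day lag
  print(f"{max_source_size_ly(120):.2f} light years")  # ~0.33 ly over the ~4-month flare

The 55-day lag alone caps the region at about 0.15 light years; the full rise-and-fade of roughly four months gives the one-third of a light year quoted.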
It is fairly safe to assume that 4C 71.07 will be under closer scrutiny for some time to come.
#
"Six more planets",1262,0,0,0
(Nov '99)
Six more planets have been added to the list of known extra-solar planets, and two stars, already known to have planets orbiting them, now appear to have at least one extra object orbiting around them. The findings, announced in November, will be fully reported in the \IAstrophysical Journal\i in the near future, based on observations with the High Resolution Echelle Spectrograph (HIRES), designed and built by Steven Vogt, on the Keck I Telescope in Hawaii.
All of the stars are sun-like, mainly because the survey, carried out over the past three years, has only looked at 500 nearby sun-like stars in search of planets. This 25% increase in the number of extra-solar planets hardly raised a ripple of interest in the world's media, even though one of the planets, around HD 192263, was also recently detected by Nuno Santos and collaborators in Geneva. The Geneva group reported the find while Vogt and his colleagues were preparing their paper.
Because of the bias in the survey, all of the stars are similar in size, age, and brightness to the Sun. They lie between 65 and 192 light-years from Earth. The planets range from 0.8 times the mass of Jupiter up to 6.5 times the mass of Jupiter. They are assumed to be like Jupiter, gas giants composed mainly of hydrogen and helium gas.
Like most of the previous discoveries, the new planets have orbits which are quite eccentric. Rather than the almost circular orbits of our own solar system's planets, these travel in oval paths. One of the planets, around star HD 222582, has the most wildly eccentric orbit yet known: at its closest point, the planet is just 0.39 of an \Jastronomical unit (AU)\j from its star, ranging out to 2.31 AU, all in the course of a year lasting 576 of our days. Vogt comments that it is beginning to look as though the orbits we see locally may be the exception rather than the rule.
Five of the six new planets are located within the so-called "habitable zones" of their stars. This puts them neatly in the region where temperatures would allow water to exist in liquid form. Most of the previous discoveries have been outside the habitable zone, either too far away and too cold, or too close and too hot. Vogt commented that "These planets are at just the right distance, with temperatures in one case around 108 degrees Fahrenheit - like a hot day in Sacramento."
But any life forms found there are unlikely to resemble anything you might hope to see, even in California, because Jupiter-sized planets in oval-shaped orbits are not likely to offer a home to life forms like us, and any Earth-sized planet orbiting one of these stars would soon be ejected by the gravitational influence of the Jupiter-mass planet. Still, if the gas giants are like those in our own solar system, they probably have numerous moons associated with them, and these could well have liquid water and so support life. As this was being written, evidence was coming in of more support for the theory that life might exist on such moons.
The two planets which appear to have companions are around stars HD 217107 and HD 187123. The companions may be planets or larger objects such as brown dwarfs, and they appear to be orbiting their host stars in a long period, something like two to three of our years. Previously, only one star, Upsilon Andromedae, was known to have a system of multiple planets. Getting a detailed definition of the masses and orbits of these companions will take years, but now it seems reasonable to assume that multiple-planet systems are not uncommon.
Here are the details of the new planets and their stars, as provided by the discoverers:
HD 10697 is a G5IV star, slightly cooler and a bit larger than the Sun. It lies 106 light-years away in the constellation Pisces. Its planet has a minimum mass of 6.35 Jupiter masses and a 1,072-day orbit. The radius of this orbit is about 2.13 AU, but the orbit is somewhat eccentric, so the planet's distance from its star ranges from 1.87 AU to 2.39 AU. At its average orbital distance, it lies just at the outside edge of the habitable zone of its star, and is expected to have an equilibrium temperature (due to energy received from its parent star) of about -10°C (15°F).
HD 37124 is a G4V star, slightly cooler than the Sun. It lies 108 light-years away in the constellation Taurus. Its planet has a minimum mass of 1.04 Jupiter masses and a 155.7-day orbit. This orbit is also quite eccentric. At its average orbital distance of 0.55 AU, it sits just within the inner edge of the habitable zone of its star, and is expected to have an equilibrium temperature of about 55°C (130°F). This is the lowest metallicity star known to have a planet.
HD 134987 is a G5V star, 83 light-years away in the constellation Libra. Its planet orbits in a 260-day eccentric orbit. This planet has a minimum mass of 1.58 Jupiter masses. At its average orbital distance of 0.81 AU, its expected equilibrium temperature is a balmy 42°C (108°F). It lies well within the habitable zone of its star.
HD 177830 is a K2IV star, about 1,000 degrees Kelvin cooler than the Sun, lying about 192 light-years away in the constellation Vulpecula. It harbors a 1.22 Jupiter mass planet in a 392-day, highly eccentric orbit. This orbit carries the planet from as close as 0.63 AU from its star to as far as 1.57 AU. At its mean orbital distance of 1.10 AU, its expected temperature is about 89°C (192°F). The planet is probably within the habitable zone of its star.
HD 192263 is a K2V star lying 65 light-years away in the constellation Aquila. A planet around this star was first reported by Nuno Santos, a Portuguese graduate student at the University of Geneva. Vogt's team has obtained essentially the same results as Santos: a 0.78 Jupiter mass planet orbiting in a 24.36-day orbit. This orbit has a radius of only 0.15 AU, with little or no eccentricity. It orbits well outside the habitable zone of its star.
HD 222582, a G3V star, is a near solar twin, 137 light-years away in the constellation Aquarius. Its planet orbits in a wildly eccentric 576-day orbit, which carries the planet from 0.39 AU to 2.31 AU from the parent star in the course of its oval orbit. This is the most eccentric extrasolar planet orbit yet known. The planet's expected temperature is about -39°C (or -38°F - note, this is not a misprint: -40° is the same on both scales). Its mean orbital distance places it squarely in the habitable zone of its star.
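The figures given for HD 222582 are internally consistent, as a quick application of Kepler's third law shows. A sketch assuming the star has one solar mass (so that the period squared equals the semi-major axis cubed, in years and AU):

  def orbit_from_extremes(r_min_au, r_max_au):
      """Semi-major axis, eccentricity, and period from the closest and
      farthest points of an orbit around a 1-solar-mass star."""
      a = (r_min_au + r_max_au) / 2                       # semi-major axis
      e = (r_max_au - r_min_au) / (r_max_au + r_min_au)   # eccentricity
      period_days = a ** 1.5 * 365.25                     # Kepler's third law
      return a, e, period_days

  a, e, p = orbit_from_extremes(0.39, 2.31)
  print(f"a = {a:.2f} AU, e = {e:.2f}, P = {p:.0f} days")
  # -> a = 1.35 AU, e = 0.71, P = 573 days, close to the reported 576-day year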
As usual, the team has posted a mass of data on the Web at their site, which is worth a regular check to catch updates: http://www.physics.sfsu.edu/~gmarcy/planetsearch/planetsearch.html
\BKey names:\b Steven Vogt, Geoffrey Marcy, Paul Butler, and Kevin Apps.
#
"A planet confirmed",1263,0,0,0
(Nov '99)
At the start of November, there were believed to be 22 planets around other stars. By the end of November, the number had increased to 28 (see \JSix more planets\j), and most importantly, one of the previous 22 had been confirmed in a most spectacular fashion.
To understand the importance of the confirmation, you need to remember that the planets have never been seen. All we can see is the wobble that a star shows when it is orbited by a massive dark object, and while the physics are straightforward, it is always possible that there \Imight\i be another cause, so scientists have been casting around for some way of proving that there really was an object circling each of the stars. One way might be to look for a slight darkening of the star as a planet travels between us and the star, but this is a fairly remote chance.
If you were sitting above the Sun's north pole, you would never see any planet in our solar system crossing the face of the Sun, but if you are in the same plane as the orbit of a planet, you may occasionally see it transit the face of the Sun. In fact, that happened here in November, as described in \JMercury transit\j. Of course, the transit of Mercury across the face of our Sun has about as much effect as a sunspot, so far as the measured brightness of the Sun is concerned, but the planets in this hunt are much larger, and could be expected to have a significant and measurable effect.
Good science involves making a prediction, and then testing to see if the prediction comes true. In this case, the test was to analyze the wobbles, predict a time when a planet would cross its star, and then check to see whether the star grew dimmer at the right time. Of course, it had to be the right star, one that had a planet traveling in the right direction for us to see it, but one of the candidate stars, HD 209458, a Sun-like star, 47 parsecs (153 light years or 1.4 million billion kilometers or 895,000 billion miles) away from us, looked good.
HD 209458 is in the constellation of Pegasus. It is about the same age, color and size as our own Sun. It is very near the star 51 Pegasi, around which the first extrasolar planet was discovered in 1995. The results were excellent, with enough data to estimate that the planet has about two-thirds the mass of Jupiter but a radius about 60% larger, giving it a density of 0.21 g/cc, just a fifth of the density of water. This fits with theories that predict a bloated planet when, as here, the planet is very close to its star.
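The density figure follows directly from the quoted mass and radius, as a short sketch with round textbook values for Jupiter shows:

  import math

  M_JUPITER_KG = 1.898e27
  R_JUPITER_M = 7.149e7

  def bulk_density_g_per_cc(mass_in_jupiters, radius_in_jupiters):
      """Bulk density of a planet from its mass and radius."""
      mass_kg = mass_in_jupiters * M_JUPITER_KG
      radius_m = radius_in_jupiters * R_JUPITER_M
      volume_m3 = 4.0 / 3.0 * math.pi * radius_m ** 3
      return mass_kg / volume_m3 / 1000.0   # convert kg/m3 to g/cc

  print(f"{bulk_density_g_per_cc(2 / 3, 1.6):.2f} g/cc")  # ~0.20, matching the quoted 0.21

Two-thirds of Jupiter's mass spread through a ball 60% wider than Jupiter gives a planet far less dense than water - puffed up, as the theories predict, by the heat of its nearby star.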
The result required a collaborative effort. First, Geoffrey Marcy, with Paul Butler and Steve Vogt, needed to detect a wobble in the star, and use this to predict a likely crossing time. Then Greg Henry had to try to spot the transit with an automatic telescope. It all happened very quickly from there: the wobble was detected on November 5, and because the planet orbits its star once every 3.523 days, Henry was able to detect a 1.7% dip in the star's brightness on November 7, at exactly the time predicted by Marcy.
With the orbital plane of the planet known, the astronomers for the first time could determine precisely the mass of the planet and, from the size of the planet measured during transit, its density, which tells us that this planet is a gas giant like Jupiter. Gas giants like this could not have formed at the distance this planet is from its star, so it must have formed further out and migrated inwards.
In the astronomical sense, Greg Henry was the first person to "see" the planet, and while the discoverer usually gets naming rights, there is no official convention for naming extrasolar planets. As a temporary measure, the planets are getting the star's name or catalog number, and a lower case b, so that HD 209458b is the planet associated with the 209,458th star in the Henry Draper star catalog. As a group, astronomers have taken to referring to the planets as "little b's".
Henry's 12-year-old son Daniel has already suggested to his father that he call it Namek, in honor of a planet from the Cartoon Network show "Dragonball Z." And while even Daniel realizes that Namek has a poor chance of winning the race, it might as well do, until such time as the International Astronomical Union comes up with an agreed convention for naming such planets.
Henry planned another attempt to observe the planet's transit on November 14, two days after the team released their result in a November 12 circular of the International Astronomical Union. He was blocked by cloud, and if he does not get confirmation soon, the star will be obscured by the Sun, which will cause a delay of some months before it can be observed again. Another group has since reported that they observed the same dimming on September 8 and September 15. David Charbonneau and Timothy M. Brown made the observation, after David W. Latham spotted the wobble in August, but they are saying no more until their results are accepted by a journal.
The next step will be for astronomers to zero in on this star, to try to gather spectroscopic data. If the planet moves in front of the star, this means that some of the light passed through the planet's atmosphere on its way here, and that should give us even more information. So while this is already a good story, it is a story which has not yet ended.
#
"Mercury transit",1264,0,0,0
(Nov '99)
Right now, as we approach a solar maximum, there are many more sunspots than usual on the Sun's surface, but these are cool areas created by twisted magnetic field lines poking through the surface, and they travel slowly, usually staying in view for about two weeks, moving from east to west as the Sun rotates.
On November 15, observers in northeastern Australia, New Zealand, Antarctica, Papua New Guinea, the islands of the Pacific Ocean, western South America and most of North America, with the right sort of filters on their telescopes, were able to see a small black dot which passed across the face of the Sun rather faster than any sunspot. In just an hour - give or take a bit, depending on where you were observing from - the Sun underwent a very slight \Jeclipse\j of a special kind, the sort we call a transit of Mercury.
The only transits we can ever see from Earth are the 13 or so transits of Mercury which occur each century, and the relatively close pairs of transits of Venus which occur a few years apart, every century or so. This is because a transit is the result of the Sun and the Earth lying on the same straight line as another planet, which means that only planets closer to the Sun than we are can be seen transiting the Sun.
Previous transits have happened on 9 May 1970, 10 November 1973, 13 November 1986, and 6 November 1993. After 15 November 1999, future transits will occur on 7 May 2003, 8 November 2006, 9 May 2016, 11 November 2019, 13 November 2032, 7 November 2039, and 7 May 2049. There will be a transit of Venus in 2004.
The planet was visible in this year's transit as a tiny black disk, 9.9 arcseconds across. To put this in perspective, the Sun is about half a degree wide, filling a space 30 arcminutes, or 1800 arcseconds, across. For some observers, this month's crossing was a grazing transit, where only part of the planet crossed the Sun, while others saw it follow a short path near the Sun's northeastern limb. Grazing transits are rare, and the next one is expected in the year 2314.
The timing of the transits of Mercury is controlled by the 7-degree difference between the plane of our orbit and Mercury's orbit, which means that we only pass through the plane of Mercury's orbit twice a year, in May and November. If Mercury happens to be between the Earth and the Sun (astronomers call this "inferior conjunction") at that time, then a transit will occur. Because of the eccentricity of Mercury's orbit, it appears to be almost 10 arcseconds across in November transits, but 12 arcseconds across in May transits.
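That 10-versus-12-arcsecond difference is simple geometry: in November Mercury is near perihelion, so it sits farther from Earth; in May it is near aphelion and closer to us. A sketch with textbook values for Mercury's orbit:

  MERCURY_DIAMETER_KM = 4879
  AU_KM = 1.496e8
  RAD_TO_ARCSEC = 206_265

  def disk_size_arcsec(earth_mercury_distance_au):
      """Apparent size of Mercury's disk, by the small-angle approximation."""
      distance_km = earth_mercury_distance_au * AU_KM
      return MERCURY_DIAMETER_KM / distance_km * RAD_TO_ARCSEC

  # Earth is near 1 AU; Mercury ranges from 0.31 AU (perihelion, November
  # transits) to 0.47 AU (aphelion, May transits) from the Sun.
  print(f"November: {disk_size_arcsec(1 - 0.31):.1f} arcsec")  # about 9.8
  print(f"May:      {disk_size_arcsec(1 - 0.47):.1f} arcsec")  # about 12.7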
Ever since Edmond Halley (see \JHalley, Edmond\j) realized that closely observed transits could be used to measure the Sun's distance, transits have been interesting to astronomers. They already knew the relative sizes of the planets' orbits, by Kepler's third law (see \JKepler, Johannes\j), and once you can calculate the actual size of the Earth's orbit, the size of the whole solar system falls into place, as the relative sizes are converted to absolute sizes.
The distance to the Sun was first calculated reliably from the data gathered during the 1761 and 1769 expeditions to observe the transits of Venus, and a fallout from this was that James Cook mapped the coast of New Zealand and the east coast of Australia (see \JCook, James\j for more information).
These days, of course, we can measure the distance to Mercury far more easily, but there is growing evidence that the Sun's radius varies by a fraction of an arcsecond during the solar cycle. So careful observations and timings, especially from a grazing transit, can be used to calculate the radius of the Sun.
#
"Mars Polar Lander",1265,0,0,0
(Nov '99)
As this was being written in early December, the Mars Polar Lander appeared to have failed to separate as it approached Mars. This was the second major loss this year (see \JMars Climate Orbiter goes missing\j, September 1999), and a great disappointment to NASA, who had great hopes for what they are still calling an "11-month, 137-million-mile space trip to Mars".
In spite of NASA's continued use of non-metric units, which was the cause of the loss of the Climate Orbiter craft, the problem this time appears not to have been in the mathematics. What it was, nobody has yet discovered, but Friday December 3 was not the day of celebration it was expected to be.
#
"Cancer cures",1266,0,0,0
(Nov '99)
In 1997, there were more than 6 million deaths from cancer around the world, 2.5 million (21% of all deaths) in developed countries and 3.6 million (9% of all deaths) in the developing world. As well, there were some 10 million cases of cancer worldwide. This looks likely to rise to 15 million deaths a year by 2020, according to the WHO. This is based on an expected 40% increase in industrialized countries, and a doubling of deaths in developing countries. This will follow from changes in diet and environment, coupled with greater survival into adulthood in these countries - as people survive longer, so they have an increased "opportunity" to develop cancers.
This makes cancer the second most common cause of death, after cardiovascular disease, in most western countries. One person in four in the world today is likely to experience cancer at some stage in their lives, and as a result, cancer research is a major target for researchers in the developed world.
While there are about a hundred different cancers, eight cause most of the deaths, about 60% in all. These cancers affect the lung, stomach, breast, colon/rectum, mouth, liver, cervix and oesophagus, and in all cases, early detection is the best chance people have of surviving. The question of whether or not screening is desirable is an open one, but the usual rule is to apply it to those who show family histories of particular cancers, or who have had exposure to a known \Jcarcinogen\j.
All of the conditions we call cancer involve the loss of control of cell growth. At the same time, the immune system fails to recognize the cancerous growth as "foreign", and does nothing to control its growth. In time, most cancers will spread to other parts of the body. The key factor is loss of control: under normal circumstances, our body systems act to keep the number of cells in an organ "just right", with cells dying and new cells being formed, in a complicated balance. In a cancer, the controlled cell suicide that we call apoptosis is switched off.
Cancers mostly happen because of damage to genes, but sometimes they arise from a genetic abnormality - in either case, the effect is the same, with control being lost. In general, one change will not be enough, but over time, as damage builds up, so the scene is set for a future cancer, which means that older people are more likely to develop a cancer.
There are two types of cancers: solid tumors, and what the specialists call hematological malignancies, in effect, cancers of the blood. The solid tumors are all hypoxic in parts, having areas where oxygen levels are very low, and this is a major problem for many treatments. As well, the solid tumors produce an outwards osmotic pressure which actually forces drugs and anything else away from the center of the tumor.
There are a number of strategies, but they all come down to finding some form of "magic bullet", some way of delivering a treatment that will only affect the unwanted cells of the cancer, or some way of ensuring that the treatment gathers or acts preferentially in the cells of the cancer.
The three standard treatments for cancer are to cut the growth out (surgery), to kill the cells with radiation (radiotherapy), or to kill the cancerous cells with poisons (chemotherapy). Often these cancer treatments are combined to try to mop up the whole of the cancer.
Surgery is more effective in the early stages of a cancer, especially a surface cancer. Radiotherapy can be highly effective against certain types of tumors, but it has side effects, and so too does chemotherapy: the substances used (cytotoxic drugs) are poisonous to normal cells as well, but they are used because they are more toxic to cancerous cells.
These compounds can be injected or swallowed, but either way, they interfere with cell replication. This means the drugs interfere with hair follicles (so the hair falls out) and also with cells in the intestinal lining, so that nausea and vomiting are common. These drugs have the advantage that they spread through the body, and so can reduce secondary growths.
The taxoids such as \JTaxol\j interfere with mitotic cell division. They promote the assembly of the microtubules of the spindle apparatus, but then stop the microtubules from being broken down, which eventually kills the cancer cells.
One problem with chemotherapy is that many of the drugs cause cancer cells to develop resistance not only to the drug being used, but to other unrelated drugs as well. The first drug stimulates the cells to produce more of a drug efflux pump protein called Pgp, which serves to clear the cells of the drug chemicals.
One interesting new treatment for surface cancers (where instruments can get within 3 cm, including the intestinal tract) uses photodynamic therapy, where the drug used is absorbed by healthy cells and cancerous cells, but is removed from the healthy cells faster. Laser light is then applied, and this activates the drug, which kills the cells.
Viruses and liposomes are sometimes used to deliver drugs to some cancers, and some brain cancers can be treated by inserting a polymer into the tumor which slowly breaks down, releasing more of the drug at the site of the tumor.
Hormones can be used in some hormone-dependent cancers such as breast, prostate and endometrial cancer. The main aim is to reduce the levels of the hormone that the cancer needs, causing it to regress.
In late 1999, almost 40 cancer vaccines were at different stages of trials, mainly directed at stimulating the immune response in some way. (See, for example, \JMelanoma vaccine trials succeed\j.)
Cytokines, perhaps delivered by a bacterium (see \JSetting a disease to kill a disease\j) are also a good prospect for the future. These are secreted proteins with a wide variety of functions. Among other things, they manage the activities and interactions of cells of the immune system. Cytokines include interleukins (ILs), interferons (IFNs), colony-stimulating factors (CSFs) and tumor necrosis factor (TNF). Of these, the CSFs do not affect tumor cells directly, but they are able to restore bone marrow function following radiotherapy or chemotherapy.
Monoclonal antibodies (MAbs for short) can be developed for any identified cancer cell antigen, and two MAbs for the treatment of non-Hodgkin's lymphoma are now known. It is hard to guess which of the competing technologies will emerge as the front runner in the new century, but other candidates include gene therapy, antisense therapy, apoptosis stimulators, fusion toxins, signal transduction inhibitors and telomerase inhibitors.
#
"Melanoma vaccine trials succeed",1267,0,0,0
(Nov '99)
Late in November, the 26th Annual Scientific Meeting of the Clinical Oncological Society of Australia in Melbourne was given details of Phase 2 studies of the M-Vax melanoma vaccine. Ernest W. Yankee, PhD, the Executive Vice President of Avax Technologies (AVAX), reported exciting results contained in nine-year follow-up data of stage III \Jmelanoma\j patients.
Dr. Yankee reported results from 71 patients who were free of melanoma for at least two years following treatment with M-Vax and then monitored for up to 9.8 years, with a median follow-up time of 5.3 years. He said that the nine-year overall survival rate was approximately 85% in this group of patients, demonstrating that the clinical response to M-Vax therapy is long lasting. The data also demonstrate a nine-year relapse-free survival rate of 80% in these patients.
While the vaccine was developed in 1989 by David Berd, MD, Professor of Medicine, Thomas Jefferson University, Australia is the world's center for melanoma, and AVAX plans to begin marketing M-Vax in Australia early next year. Australia has the highest per capita incidence of melanoma in the world, three times the US rate, a combination of having a large proportion of people of Celtic extraction, a continent partly in the tropics, and an outdoor lifestyle: melanoma is generally the result of damage caused to sensitive skins by sunlight.
The M-Vax vaccine is an individualized cell-based vaccine for cancer, and it has shown no signs of serious or long-term toxic effects in any of the trial patients, all of whom were in advanced stages of the disease.
Melanoma starts out as a "spot", a small discoloration of the skin. If this is left untreated, the cancer develops, and then starts releasing cancerous cells in a process called \Jmetastasis\j. These cells travel and lodge in other parts of the body, where they form secondary cancers. In the case of melanoma, these secondaries are typically in the lymph nodes. Once metastasis has happened, cures become very much more difficult.
\BTechnical details\b: The 71 patients all received M-Vax after surgical debulking of bulky regional lymph node metastases. More than half entered the trial with "one or more widely recognized indicators of poor prognosis" - including spread of the cancer between the initial lesion and the lymph node group. In other words, the M-Vax trial has been applied to a group with poor prospects, patients who were given very little hope of a cure, and it looks to have come through remarkably well.
Malignant melanoma is an accelerating problem in the USA, with an estimated 40,300 new cases and 7,300 deaths from malignant melanoma in 1997, but Australia is to be the first country where M-Vax will be made available, probably in early 2000. There is already a vaccine-building facility in the USA, and the methodology is now proven. According to Dr. Yankee, "Once we complete and receive approval for a manufacturing facility in Australia, we believe we will be able to quickly replicate the same delivery paradigm and begin commercialization activities there."
"M-Vax" is a registered trademark, and it is the first of several autologous cell vaccines based on AVAX's proprietary AC Vaccine™ technology. Another product, O-Vax, is already being tested on ovarian cancers. The company specializes in the development and commercialization of novel biotechnologies, immunotherapies and pharmaceuticals for cancer and other life-threatening diseases using three core technologies: autologous cell (AC) vaccines, topoisomerase inhibitors, and anti-estrogens.
So what is an AC vaccine? In simple terms, it is a vaccine made using a process developed at (and patented by) Thomas Jefferson University. The process allows the patient's own cancer cells to be turned into a cancer vaccine. This vaccine stimulates the patient's immune system to seek and eliminate the cancer. Since the treatment uses the patient's own cells in the custom-manufacture of the vaccine, many of the severe side effects typically associated with other cancer treatments can be avoided.
The basic problem is that our cancerous cells are still \Iour\i cells, and that means that our immune systems are inhibited from attacking them, even though the cancer cells are in fact dangerously different. The answer is to train our immune systems to single out these cells from our normal cells.
The first step in making a vaccine is to remove a patient's cancerous tumor, and then the tumor cells are treated with dinitrophenyl (DNP), a chemical compound known as a hapten. This chemical binds to molecules on the surface of cells and helps trigger immune responses. Then the DNP-treated cancer cells are combined with an adjuvant, a substance that improves their effectiveness, and they are then injected back into the patient.
Around the body, there are likely to be remnant cancer cells which have either metastasized to other areas of the body, or in some cancers, been left behind as surgeons gingerly try to remove a cancer without damaging nerves or other vital organs (this is a special problem with prostate cancer - see \JProstate cancer: a special report\j, September 1999). These remaining cancer cells, if they are left undetected and untreated, have the potential to form additional cancerous tumors and even to kill the patient. The immune system, after treatment with the vaccine, is then better able to recognize, locate, and combat any remaining cancer cells.
The technology is not entirely new. The ability of DNP to modify proteins and render them more easily identified as foreign to the immune system has been well-documented over the past 30 years, and David Berd began work on AC vaccines some 10 years ago. Now, though, the work has withstood the test of time, and it can only be a matter of time before this disease is brought under control.
See also two earlier related reports: \JA vaccine against melanoma\j, April 1999, and \JMelanoma 'vaccine'\j, July 1999, as well as \JSetting a disease to kill a disease\j and \JCancer cures\j this month. AVAX Technologies is hard to locate on the Web: the company's home page is at http://www.avax-tech.com/
#
"Setting a disease to kill a disease",1268,0,0,0
(Nov '99)
The genus \I\JSalmonella\j\i is better known as a cause of disease than as a cure, although we reported in October 1997 (\JEngineered bacteria to fight tumours\j) on an interesting new application of this bacterium in stunting the growth of melanomas in mice. Now, engineered bacteria are ready to be used in a Phase I human safety trial.
The method, developed at Vion Pharmaceuticals, is known by the registered name given to it by that company: TAPET, standing for Tumor Amplified Protein Expression Therapy. In a report delivered in mid-November at the International Conference on Molecular Targets and Cancer Therapeutics in Washington, D.C., the company announced a successful use of one version, "armed TAPET", to bring about complete tumor regression in animal trials with mice.
Vion reports that VNP20009 (unarmed TAPET organisms), administered systemically to tumor bearing mice, accumulates preferentially in tumors over livers at a ratio of 1000:1 and inhibits tumor growth up to 95% in mouse models. In simple terms, the modified bacterium gathers in the tumors rather than in inviting territory like the liver, and the bacteria in the tumor are present at a thousand times the concentration found in the liver.
Further preclinical toxicology studies conducted on non-human primates (cynomolgus monkeys) demonstrated that VNP20009 was safe even at significantly high doses. The company says that toxicology studies conducted on mice showed the modified bacterium to be unable, even at very high doses, to cause mortality, and that it was completely cleared from the blood in 24 hours and from all organs in less than two months.
The TAPET method uses \ISalmonella\i bacteria to colonize and multiply preferentially within the confines of a tumor, where they inhibit tumor growth. The bacteria can be in their normal ("unarmed") form, growing inside a tumor, or they can be "loaded for bear", with the power to do extra damage to the tumor, in the form known as armed TAPET. The human safety trial will use the unarmed version of the vector, and is aimed at showing that the former food poisoning organism can no longer cause illness in humans. As explained below, even this unarmed form is capable of giving the tumor something to think about.
VNP20009 began as a normal "wild-type" \ISalmonella typhimurium\i culture. Bacteria from this culture were treated to increase their rate of mutation, and the strains were screened for an improved ability to specifically invade tumor cells. Eventually, one isolate, called "clone 72", was selected for testing.
Mice with tumors were then injected with 2 million cells of either the wild-type bacterium, or of clone 72. After 21 hours, the clone 72 bacteria had selectively multiplied to a level 4,000 times higher in the tumor than in the livers of the mice. The same preference for tumors over normal tissues has been shown to apply in a wide range of other tumors.
But once you have a bacterium which dives in and breeds merrily in a tumor, what do you do with it? The first thing the experimenters noted was that the bacteria distributed themselves right across the tumor they had invaded, both inside the cells, and also between the cells, and this means they are ideal for delivering cytotoxins, cell poisons, directly to the tumor.
As the effects were studied, the Vion researchers also found that tumor growth was inhibited by treatment with the unarmed TAPET vector alone. In mice, TAPET by itself gave tumor-bearing animals twice the survival rate of non-TAPET (control) mice, and markedly slowed the rate at which the tumors grew.
So now we have a mystery: how are the bacteria damaging the tumor? Part of the answer comes from the observation that killed bacteria have no effect on solid tumor models, meaning that the bacteria have to be alive to damage the tumor, and the researchers think the effect may be related to either nutrients or oxygen levels in the tumor.
Safety is a key factor in this sort of work, and one common way of restricting new forms is to make sure that they are auxotrophs, organisms with a mutation that means they are dependent on some nutritional item that they would normally make themselves. Clone 72 is a purine auxotroph, which means that it has only limited powers of reproduction unless there is available purine.
This auxotrophy was artificially selected for, and since it is a deletion mutation, with the gene in question stripped from the bacterium entirely, there is no way the bacterium can regain the power of making its own purines. These compounds, however, are plentiful inside a tumor.
So the bacterium is unable to escape and cause sepsis somewhere else in the body. On top of that, the selected strains are up to a million times less toxic to tissues than the normal wild-type bacteria, and they have also been engineered to produce less of a cytokine called Tumor Necrosis Factor alpha (TNF-alpha or TNF\Fa\f), which often causes septic shock when it is produced by gram-negative bacteria like \ISalmonella typhimurium\i. The bacteria are also fully vulnerable to antibiotics, leaving them open to control if an unexpected infection were to break out.
Without the TNF\Fa\f to attack the tumors, how does the armed TAPET work? Simple: it produces cytokines and enzymes which convert prodrugs, chemical building blocks for drugs, into the final drug molecules. This means that the anti-cancer drugs, usually rather unpleasant substances, are produced inside the tumor where the bacteria are, and only there, where they are needed.
The success reported with mice used a TAPET vector which released TNF\Fa\f inside the tumor - as usual, animal trials are more advanced than those with humans. The company's Phase I human safety trial of VNP20009, an unarmed TAPET, is open at the end of 1999 for eligible patients. TAPET will be administered by injection directly into a surface tumor. This is a strategy which the company describes as having "demonstrated significant anti-tumor effects against the injected lesion and distal metastases as observed in preclinical animal tumor models".
To be eligible, patients would have to be able to walk, have an advanced cancer that is no longer responsive to standard treatment and must have one or more surface tumors that can be injected with TAPET. The company also intends to conduct intravenous studies. In future developments, they are looking at delivering other payloads, including anti-tumor peptides, RNA viruses and DNA viruses, and single chain antibodies. The TAPET vectors are available in multiple serotypes, to avoid any risk of an immune response in the host.
See also \JCancer cures\j and \JMelanoma vaccine trials succeed\j.
#
"How a virus triggers asthma",1269,0,0,0
(Nov '99)
If you are likely to get asthma, a viral infection may constrict the airways in your lungs, leading to the wheezing and shortness of breath which is characteristic of an asthma attack. If you are not asthmatic, you will not face these extra problems as a result of an attack of "the flu". But while the effect has been known for a long while, researchers at the Johns Hopkins University School of Public Health and School of Medicine have only just discovered how viral infections trigger asthma attacks in susceptible people, and they have reported their findings in the November issue of \IThe Journal of Experimental Medicine.\i
The eosinophil is a type of white blood cell involved in allergy. In asthmatics, it also triggers asthma, yet the viral infections, which also trigger asthma, do not normally bring eosinophils into the lungs. It occurred to the researchers to wonder if, since many asthmatic people are allergic, it might be the allergies that change the response to a virus.
They created an animal model of an allergic person by injecting ovalbumin (a protein from chicken egg white) into guinea pigs, making them allergic to this substance. The sensitized animals were then infected with parainfluenza virus, one of the viruses known to trigger asthma attacks, and a control group of unsensitized animals was also exposed to the virus.
Eosinophils cluster along the nerves in the lungs of patients with asthma, and the same pattern of clustering appears in the allergic animals. The parasympathetic nerves in the lungs secrete acetylcholine (ACh), a neurotransmitter which binds to receptors on the smooth muscle in the airways, causing them to constrict.
Other receptors called M2 muscarinic receptors are also found on the same nerves, and these receptors act to inhibit the release of ACh. This reduction in ACh levels then stops the airways from getting smaller. This is a common pattern in living organisms, with two chemicals operating against each other, and the overall balance keeping body systems in a more-or-less steady state. If something blocks or disables the M2 receptors, then the acetylcholine levels will increase. After that, the ACh attaches to receptors on the smooth muscles, increasing airway constriction and leading to asthma symptoms.
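The push and pull of this arrangement can be caricatured in a few lines of Python. This is a toy model with invented numbers, not physiology: at each step, some ACh decays and the nerve releases more, less whatever the working M2 receptors suppress.

def steady_ach(m2_working, steps=60):
    ach = 1.0
    for _ in range(steps):
        release = 1.0
        if m2_working:
            release -= 0.8 * min(ach, 1.0)   # M2 feedback throttles release
        ach = 0.5 * ach + release            # half decays, new ACh arrives
    return round(ach, 2)

print(steady_ach(m2_working=True))    # ~0.77: the feedback holds ACh down
print(steady_ach(m2_working=False))   # 2.0: with M2 blocked, ACh builds up

The point of the toy is the comparison: disable the inhibitory arm, and the level of the constricting transmitter settles much higher.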
Previous research by the same group has shown that the M2 receptors in virus-infected animals were disabled. The viral infection activates the eosinophils, which then release a protein, called major basic protein (MBP). The MBP blocks the action of the M2 receptors and keeps them from turning off ACh release, so the increased acetylcholine level causes increased constriction of the airways.
The future could run in any number of ways. It may be possible to block the entire parasympathetic nervous system, or to clear the body of eosinophils or MBP, thus heading off bronchoconstriction in people with asthma who come down with a viral infection. As an alternative, the group say they have identified a group of substances that neutralize MBP, restore the M2 receptor, and reverse the increased constriction of the airways in these virus-infected allergic animals. This research, and this research group, should be worth following.
\BKey names:\b David Jacoby and Allison Fryer.
#
"Hepatitis-related carcinoma may increase in the US",1270,0,0,0
(Nov '99)
A Japanese scientist warned the American Association for the Study of Liver Diseases (AASLD) 1999 Annual Meeting in Dallas in early November that there may soon be an increase in the number of carcinoma cases related to hepatitis C virus (HCV) infection.
Masashi Mizokami said that while the incidence of HCV infection is about the same in Japan as in the US, the cancer known as HCV-related hepatocellular carcinoma (HCC) is eight times more common in Japan. He believes that HCV began to diverge from its source in Japan as early as 1945, but not until around 1965 in the US.
Mizokami says the incidence of HCC in Japan increased in 1975, around three decades after the spread of the HCV infection. On that basis, the US should soon be seeing an increase in HCC levels as well, as we are now three decades past the period, from 1966 to 1970, when intravenous drug use and the sharing of needles was a common practice.
#
"Beating the TB bacillus",1271,0,0,0
(Nov '99)
A report in \INature\i in early November revealed a possible weakness in the deadly bacterium, \IMycobacterium tuberculosis\i, the organism which causes tuberculosis, and which has now infected more than a third of all human beings on our planet. Each year, tuberculosis kills more humans than either malaria or HIV/AIDS.
Researchers from the Howard Hughes Medical Institute (HHMI) have identified a lipid molecule that must be produced if the bacterium is to infect the lungs of mice. These lipids are unique to this bacterium, and it is likely that they play key roles in making the bacterium the world's most successful pathogen.
The lipids are exported from the bacterial cells, and are necessary for the bacteria to grow in the lungs. The bacteria are passed by aerosol infection, when droplets coughed or breathed out by an infected person are breathed in by a healthy person, but infection can only happen when the bacteria manage to invade the cells of the lungs - getting into the lungs is only half the problem. According to HHMI researchers, the key to understanding how to combat the infection lies in a better knowledge of how TB gets into the living cells of the lungs and manages to multiply.
The emergence of drug-resistant strains of \IM. tuberculosis\i has made it even more important to work out the pathways that lead to an infection. The lungs ought to be a hostile environment for bacteria, and if we understand how this bacterium gets around the defenses in the lung, this could open new opportunities for vaccines and drugs to prevent TB.
The researchers added pieces of gene-disrupting DNA called transposons at random locations in the genome of the bacterium. Each transposon created a mutation that carried a signature sequence of DNA that could be easily identified later, and this explains why the method is called "signature-tagged mutagenesis". Then mice were infected with the mutant strains of bacteria, each one carrying a unique tag, and three weeks later, the researchers checked the lungs of the mice for strains of bacteria that failed to thrive.
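The logic of the screen itself is simple enough to show in miniature. In this hypothetical Python sketch (the tag names are invented), each mutant carries a unique tag, and any tag present in the input pool but missing from the bacteria recovered from the lungs marks a mutant that failed to thrive there:

# Every mutant strain carries its own signature tag.
input_pool = {"tag01", "tag02", "tag03", "tag04", "tag05"}

# Tags recovered from the lungs three weeks after infection.
recovered_from_lungs = {"tag01", "tag03", "tag05"}

# Tags that went in but never came out flag the attenuated mutants,
# and so point to the disrupted genes worth a closer look.
failed_to_thrive = input_pool - recovered_from_lungs
print(sorted(failed_to_thrive))   # ['tag02', 'tag04']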
Three mutant strains were found, which could survive in the liver and spleen, but not in the lungs. These three strains all carried mutations in a part of the tuberculosis genome where genes responsible for making the lipid phthiocerol dimycocerosate (PDIM) are located. This lipid makes up part of the bacterial cell wall.
Then when the mutant strains of bacteria were grown in the normal way on agar plates, the colonies looked abnormal. Usually, TB bacilli form flat and corded colonies on agar. The mutants formed colonies which "looked like the Pompidou Center in Paris, with the pipes running all over the outside of the building".
Two of the mutant forms could not produce PDIM, while the third one could make the lipid, but it contained a defective gene called \ImmpL7,\i and so it could not transport PDIM outside of the bacterial cell, where it seems that the lipid assists in the infection process. So PDIM is necessary for a bacterium to be virulent, but the real excitement is in the \ImmpL7\i mutant, because the gene is a member of a family of 12 highly homologous genes.
The researchers believe that these genes are involved in exporting lipids that are known to be unique to this bacillus. So the next step will be to try to unravel the functions of all these lipids, which they suspect may play an important role in the bacillus's ability to overcome macrophages, immune system cells which prowl the lungs engulfing and shredding invading bacteria. The lipids may either be protecting the bacillus from some killing function of the macrophage, or perhaps they mount some form of more direct attack on the macrophages, neutralizing them.
#
"Chasing the dragon causes problems",1272,0,0,0
(Nov '99)
Heroin users around the world have reacted to the risks, including HIV and hepatitis infection, associated with direct injection of the drug, by "chasing the dragon", which involves heating the drug and inhaling the vapor. According to an early November report in the American Academy of Neurology's scientific journal, \INeurology,\i this can lead to other dangers, such as a progressive and permanent brain disorder and even death.
The brain disorder is called spongiform leukoencephalopathy, and it involves the brain's white matter becoming covered with microscopic fluid-filled spaces, creating a sponge-like appearance. The disease targets specific cells, mainly in the cerebellum and motor pathways, and it blocks nerve impulses in the brain. As a result, patients become uncoordinated and have difficulty moving and talking.
According to neurologist Arnold Kriegstein, author of the report, about 20% of all cases result in death. Most of those who survive have "permanent deficits", and will never return to normal. The symptoms usually progress rapidly over days to weeks, even after the drug is no longer present in the body. While the condition is common in Europe, Kriegstein's three cases are the first reported in the USA, and he questions whether many more cases are being misdiagnosed.
The cause remains unknown. One theory is that since the users heat the heroin on foil, there might have been tin poisoning, but researchers could find no evidence of this. Some users seem not to be affected at all, suggesting that the cause may be a foreign substance used to "cut" the heroin. Certainly the same heroin seems not to be toxic when taken in by other means - other than the normal toxicity associated with the drug itself.
Some patients have been successfully treated with the antioxidants coenzyme Q, vitamin E and vitamin C, but others did not seem to respond to these chemicals. The treatment was experimental, having been proposed for other similar conditions in the past, and Kriegstein comments that " . . . it may be worthwhile to try this treatment with future patients since it was well tolerated and might have contributed to recovery in our patients".
Kriegstein also notes that one of the patients learned how to "chase the dragon" from observing a scene in a movie.
#
"Herpes and heart disease link explained",1273,0,0,0
(Nov '99)
Herpes is linked to some forms of heart disease (see \JWill the heart ever be safe again?\j, May 1998), but the reason behind this has long been a mystery. A report in \ICell\i in late November seems to offer a mechanism.
Mechanisms lie at the heart of what we call a scientific proof. It is not enough to say that people with herpes, or a certain kind of herpes, get heart disease. This simply means that we have identified a \Jcorrelation\j, but this does not tell us that the herpes causes heart disease.
The correlation may simply alert us to the fact that a tendency to get herpes and a tendency to get heart disease both are driven by the same factor. On the other hand, if we can identify a mechanism, a way in which the herpes virus \Icould\i lead to heart disease, we are much closer to a cause. In the same way, if a mechanism is later called into question, as happened in the "Liburdy case" (which was about cancers and power lines - see \JScientific misconduct?\j, July 1999), then the causal relationship falls apart.
In this case, there is no suggestion of a problem, but we do appear to have a clear mechanism, and that pushes the association from mere correlation to probable cause. The link is between a common herpes virus and vascular problems that occur in patients who have undergone an organ transplant or a balloon angioplasty procedure to clear clogged arteries. These problems all involve \Jatherosclerosis\j, with the over-accumulation of smooth muscle cells in the artery wall, which can block blood flow in a vessel.
The virus in question is the human cytomegalovirus (CMV), a member of the herpes family. It has infected somewhere between 50% and 85% of the adult population across the United States. Once a person is infected, the virus normally remains dormant and does not cause major health problems unless a person has a suppressed immune system. As a result, many hosts remain unaware they are carriers.
When problems arise, CMV is often observed in the smooth muscle cells. The new report indicated that a CMV gene called US28 stimulates the smooth muscle cells to migrate. Two proteins known as chemokines bind with US28 and direct the smooth muscle cells to areas of inflammation that may occur in atherosclerosis or vascular disease in transplant patients.
A pharmaceutical company, Portland-based Activated Cell Systems, L.L.C., which funded part of this project, believes that in the future, this work could result in pharmaceuticals which block this process before it even starts.
\BKey names:\b Jay Nelson, Dan Streblow, Cecilia Soderberg, Patsy Smith, and Jeffery Vieira.
#
"Human genome carries a virus related to HIV",1274,0,0,0
(Nov '99)
A report in the \IProceedings of the National Academy of Sciences\i reveals that there is a snippet of DNA which resembles a gene sequence from the human immunodeficiency virus (HIV) in the genome of every human being. It looks as though we and our ancestors have been carrying this unwanted genetic baggage around for more than 30 million years.
Bryan Cullen and his colleagues at Duke University say that an ancient family of viruses, known as HERV-K (for human endogenous retrovirus K), was taken into the genetic material of the \JOld World monkey\j group shortly after it diverged from the \JNew World monkey\j group. Then the viruses were passed along through the simian and pre-human hosts which became humans and the other modern apes and monkeys of Africa and Asia.
Retroviruses like HIV and HERV-K have an RNA genome which is reverse transcribed, converted into a DNA "copy" which is then integrated as a DNA provirus into the chromosomal DNA of the target cell. This DNA can then become active, directing the cell machinery to make the proteins needed to assemble more viruses, or it may remain dormant, but it stays in that cell, and any other cells which are derived from it. If the cell in question is part of the germ line, the cells which later become eggs and sperm cells, then it will be passed on to the next generation.
Genetic material from inactive viruses accounts for roughly 3% of the human genome, and Cullen says that between 30 and 50 copies of HERV-K exist in the human genome. Some of the copies appear to be active at a low level in normal testicular and placental tissue. As Cullen points out, this finding offers further confirmation of something that biologists have long accepted from other evidence: that humans and Old World monkeys share the same ancestry.
More importantly, it shows that certain disease-causing tools used by HIV may have been around much longer than we had previously thought, because the HERV-K viral protein, K-Rev, functions in a manner similar to the HIV Rev protein.
The Rev protein is also produced by human T-cell leukemia viruses. It ushers viral messenger RNAs from the nucleus of a host cell into the cytoplasm, where they direct the cell's machinery to make the building blocks for more viruses.
Rev does this by controlling a human protein known as Crm1, and without this Rev-Crm1 pair, the viral messenger RNA would remain trapped inside the host's nucleus, and the virus would be unable to reproduce. Up until now, this form of attack was considered to be a special characteristic of the HIV and human T-cell leukemia viruses. But while K-Rev appears quite different structurally from HIV's Rev, Cullen's group have shown that K-Rev also hijacks Crm1 to transport mRNA from a cell's nucleus to its cytoplasm. Even though it has been there in our genome for millions of years, says Cullen, it is still in perfect working order.
It is unlikely that HIV descended from HERV-K; it is rather more likely that the two viruses had a common ancestor somewhere in the past which had Rev-like activity, or that the two viruses exchanged genetic material somewhere in their evolutionary history to create Rev activity. If this is the case, then plans to attempt xenografts, transplanting animal organs, such as kidneys, into humans, could be very dangerous. When you transplant an organ, you also transplant any viruses that are in the cells of that organ, and that brings genetic material from two different species of virus into close contact. "You now give these viruses an opportunity for genetic exchange, an opportunity not too different from what may have created the REV activity in the first place", Cullen says.
#
"Radiation-resistant bacterium sequenced",1275,0,0,0
(Nov '99)
\IDeinococcus radiodurans\i is a most unusual bacterium. It is pink, and it can survive a radiation dose of 1.5 million rads of gamma irradiation. This dose is 3,000 times the amount that would kill a human, making it the most radiation-resistant organism in the world. So in a nuclear holocaust that killed even the cockroaches, this bacterium would be left to flourish.
\ID. radiodurans\i was originally isolated in 1956 from samples of canned meat which were thought to be sterilized by gamma radiation, but which had bacteria growing on them. Colonies of non-pathogenic bacteria growing on the spoiled meat turned out to be the radiation-resistant organism. The microbe also withstands extreme desiccation and UV-irradiation.
Since its original discovery, \ID. radiodurans\i has been found all around the world. Typically, it is found in locations where most other bacteria have died from extreme conditions, ranging from the shielding pond of a radioactive cesium source to the surfaces of Arctic rocks. Its name, due to its berry shape, means "strange or terrible berry \I(Deinococcus)\i that withstands radiation \I(radiodurans)."\i
The bacterium performs its survival trick by having a repair mechanism which can recover, even when radiation shreds the bacterial genome into hundreds of pieces. This amazing ability could turn out to be important to cancer researchers, because cancer is often caused by unrepaired DNA damage, but a better knowledge of the bacterium may also lead to improved ways to clean up pollution and to new industrial processes.
The same defense against damage also allows the bacterium to grow in what would otherwise be deadly poisons. This means a genetically engineered form of this bacterium could be released into a toxic waste dump where it could thrive on the most unlikely "foods", turning them to harmless chemicals. The bacterium is easily manipulated, and other researchers have already developed modified \ID. radiodurans\i strains which are able to degrade toluene and "fix" or immobilize mercury while converting it to a more benign form.
So it is important to know how the genes in this unique organism work, and that is precisely what we now have, according to a report in \IScience\i in mid-November. A team from The Institute for Genomic Research (TIGR), led by Owen White, has worked out the order of all of the nearly 3.3 million individual chemical base units making up the \ID. radiodurans\i DNA. The genome is composed of two circular chromosomes which are about 2.6 million and 400,000 base pairs in length. The genome also contains two smaller circular molecules, a megaplasmid of 177,000 base pairs and a plasmid of 45,000 base pairs. While other bacteria with multiple chromosomes or megaplasmids are known, \ID. radiodurans\i represents the first completely sequenced bacterium with these features.
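The "nearly 3.3 million" figure is simply the four replicons added together, using the sizes quoted above:

# Replicon sizes in base pairs, as reported for D. radiodurans.
replicons_bp = {
    "large chromosome": 2_600_000,
    "small chromosome": 400_000,
    "megaplasmid": 177_000,
    "plasmid": 45_000,
}
print(sum(replicons_bp.values()))   # 3,222,000 base pairs in all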
\ID. radiodurans\i contains the usual complement of repair genes found in other radiation-sensitive bacteria, but it has an unusually large redundancy of repair functions. It appears that the smaller chromosome, the plasmid, and the megaplasmid may all have been acquired some time after the origin of the \IDeinococcus\i line, but more analysis will be needed to decide if these smaller structures are directly responsible for its ability to survive extreme environmental conditions.
This work is part of a microbial genome program, started in 1994, which has already seen 11 microbes sequenced completely, with another 15 being worked on at the moment. Perhaps the most interesting of these was a 1996 determination of the genome of the microbe \IMethanococcus jannaschii\i. It was this sequence which established beyond doubt the validity and uniqueness of a third major branch of life on Earth, the Archaea (see \JArchaeans fix DNA better\j, May 1998).
Further information on the Department of Energy's microbial genome program is available at http://www.er.doe.gov/production/ober/microbial.html and information about TIGR is available at http://www.tigr.org
\BKey names:\b TIGR, Owen White, Kenneth W. Minton, and Michael J. Daly.
#
"First detailed map of malaria parasite",1276,0,0,0
(Nov '99)
Mid-November brought a report in \IScience\i from scientists at the US National Institute of Allergy and Infectious Diseases (NIAID) who have completed the first high-resolution genetic map of \IPlasmodium falciparum,\i the deadliest of the malaria parasites. Complete sequences for two of the parasite's 14 chromosomes have recently been reported. This is not the complete genome, but a map which NIAID director, Anthony S. Fauci, calls "the scaffolding to accelerate efforts to sequence the entire genome of one of our greatest infectious foes, the malaria parasite".
Each year, \IP. falciparum\i malaria affects up to 500 million people worldwide, and more than 2 million of those, mostly young children in sub-Saharan Africa, die. As well, many more are left weakened and debilitated by the disease. The new map has been constructed from a classical genetic cross, and it will serve as a bridge between the genomic information and the biology of the parasite. In plain language, it will help to locate genes important to drug resistance and disease severity.
The main researchers were Thomas E. Wellems and Xin-zhuan Su, and the work proceeded using data from both the United States and the People's Republic of China. The mapping was based on examining the offspring from crosses between a drug-resistant type of \IP. falciparum\i and a drug-sensitive type.
A total of 901 markers were identified, and these sorted naturally into 14 groups, matching the 14 chromosomes, an effect known as \Jlinkage\j. These linkages allow maps of the chromosomes to be drawn, and this is one of the most detailed maps ever drawn for a eukaryotic organism. This map can be combined with another one produced at the University of Wisconsin, a map that orders snipped DNA fragments of the chromosomes, and this will help genome sequencers as they continue to build the truly big picture.
The new genetic map has been posted on the National Center for Biotechnology Information Web site, located at http://www.ncbi.nlm.nih.gov and further information on NIAID can be found on the Web at http://www.niaid.nih.gov
#
"Modified soya beans strike problems",1277,0,0,0
(Nov '99)
A report in \INew Scientist\i in mid-November suggested that Monsanto's Roundup Ready soya beans (see \JMonsanto takes the blame\j, October 1998, and \JMonsanto and Roundup Ready\j, July 1999) may be in trouble again. Researchers in the US have found that hot climates do not agree with Monsanto's herbicide-resistant soya beans. The heat causes stems to split, and can cause crop losses of up to 40%.
This research follows problems encountered in the US state of Georgia, where farmers suffered unexpected crop losses in two very hot spring growing seasons, when soil temperatures reached as high as 50C. This is a potential problem for the company which sees Brazil and other Latin American countries as major markets for its soya beans.
In a set of controlled laboratory growth chamber studies, Bill Vencill found that at 25C the modified beans grew just as well, but stunting appeared as the temperature was increased, and in soils reaching 45C the differences were marked. Plant heights were reduced, as were the yields, in beans modified to resist glyphosate, the herbicide marketed by Monsanto as Roundup. Curiously, the plants' resistance to a different herbicide, gluphosinate, was not affected by the heat.
Vencill suggests that the effect may be caused by a side-effect of glyphosate resistance, which sees the beans producing more lignin, which may make them more brittle. According to \INew Scientist\i (http://www.newscientist.com), Monsanto are awaiting a published and peer-reviewed article before they comment, but they have said that farmers might avoid the problem by choosing a variety of engineered soya bean that is better suited to hot conditions.
#
"How flowers recognize pollen",1278,0,0,0
(Nov '99)
Many plants exist in a number of strains, and the female part of the plant, the stigma, is somehow able to recognize the pollen of its own strain, and slow down any fertilization by that strain. This effect is commonly interpreted as preventing fertilization, but this is only half the story.
Usually, when pollen of the same strain falls on a stigma, it is from the same flower, or from another flower on the same plant, so it is genetically identical, and of reduced evolutionary value. Like any other pollen grain, the grain "germinates", and a tube starts to grow down towards the ovule, carrying the pollen nuclei that will be involved in actual fertilization. But where a "foreign" grain produces a fast-growing pollen tube, the same-strain pollen grain has a pollen tube which only grows slowly.
In evolutionary terms, this all makes sense. There is a real advantage for a species if genes are thoroughly mixed, and when there are a hundred or more strains, as happens with some clovers, most ovules are fertilized by a pollen grain of another strain. All the same, there is a fall-back for the isolated plant, growing from a seed that has been carried large distances by wind, water or an animal: if no other pollen grains arrive, then the plant is still able to fertilize itself.
We have known of the effect since the 19th century. That leaves the more interesting questions: how does the stigma recognize "self", and how does it slow down the same-strain pollen grains? Now, according to a late November report in \IScience,\i Cornell University researchers may have the answer to this basic, long-standing botanical mystery. The answer lies in a gene that tells the stigma-based receptors which pollen to accept or reject - at least in the kale, where the plant cannot fertilize itself at all, and is called "self-incompatible."
What we see is the opposite of the rejection effects that happen in an organ transplant, where a transplanted organ will be rejected if its genetic makeup is different from that of the host but is more likely to be accepted if the genetic makeup is similar. Here, genetic relatedness between pollen and stigma results in rejection of pollen, and genetic unrelatedness results in acceptance.
Almost a decade ago, June and Mikhail Nasrallah identified a receptor on the surface of the stigma that allowed it to distinguish between self and non-self pollen, but the label on the pollen that identifies it remained elusive until recently. Now, working with Christel Schopfer, they have found a gene located at the so-called S locus, the genomic region that is known to control the recognition phenomenon. They have called the gene "S locus cysteine-rich protein," or SCR for short. The gene is expressed in the plant's anther, the upper part of the plant's stamen, where pollen is produced.
\BKey names:\b June and Mikhail Nasrallah, and Christel Schopfer.
#
"Major journals join to offer better electronic publishing",1279,0,0,0
(Nov '99)
In mid-November, a consortium of 12 leading scientific and scholarly publishers announced that they are collaborating on a new reference-linking initiative that will change the way scientists use the Internet to conduct online research.
The publishers are Academic Press, American Association for the Advancement of Science (the publisher of \IScience),\i the American Institute of Physics (AIP), the Association for Computing Machinery (ACM), Blackwell Science, Elsevier Science, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), Kluwer Academic Publishers, \INature,\i Oxford University Press, Springer-Verlag, and John Wiley & Sons, Inc.
Most research reports include references from journals other than that in which they are published, and with a published journal, this means going to the shelves, pulling down another journal, and consulting the references where necessary. Even if the reference is in the same title, it is likely to be in another volume, meaning inconvenience for the reader. As more and more research is published electronically, this means researchers will be free to read the work wherever they are, so long as they have access to a computer. But if they still need to go searching for references, the advantages are largely lost.
Enter reference-linking, where access to a given article published by one of the consortium will give you access to all of the articles published by other members of the consortium. So when you come to a hyperlink in the research report you are reading, this will immediately take you to any reference - so long as it is in a journal published by a member of the consortium. And with the combined power of the group involved, other journals will need to fall into line rather fast. The service is expected to be up and running in the first quarter of 2000.
It will not matter that the reference is located on a different server, owned by a different journal: the links will allow a reader to jump to the reference with just one or two clicks of the mouse. At the start, some three million articles across thousands of journals will be linked through this service, and more than half a million more articles will be linked each year thereafter.
The linking will operate from a central facility which will hold a limited set of metadata, allowing the journal content and links to remain distributed at publishers' sites. Publishers will be able to set standards, determining whether a user has access to the abstract or to the full text of an article, by subscription, document delivery, or pay-per-view, or some other arrangement.
The system is based on a prototype developed by Wiley and Academic Press, and was developed in cooperation with the International DOI Foundation. It builds on work by the Association of American Publishers and the Corporation for National Research Initiatives. (DOI is short for Digital Object Identifier, and it is this which uniquely identifies each reference, just as a book's ISBN identifies a book, or a bar code identifies a product.)
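In outline, the arrangement behaves like one shared lookup table. The Python sketch below is purely illustrative - the DOIs and addresses are invented, and the real service involves far more machinery - but it shows the division of labor: the central facility holds only identifier-to-location metadata, while each publisher keeps serving its own full text under its own access rules.

# A toy central registry: DOI -> where the publisher keeps the article.
doi_registry = {
    "10.1000/example.1": "https://publisher-a.example/articles/1",
    "10.1000/example.2": "https://publisher-b.example/articles/2",
}

def resolve(doi):
    # One lookup at the central facility, then straight to the
    # publisher's own server, which decides what the reader may see.
    return doi_registry.get(doi, "unknown DOI")

print(resolve("10.1000/example.1"))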
#
"Boys, girls, dating and vulnerability",1280,0,0,0
(Nov '99)
Anybody who has been through it knows that starting dating is a difficult time for young people, but a group of researchers have just questioned the assumption that girls' self-esteem suffers the most in the earliest boy/girl romantic relationships. In a report in the \IJournal of Youth and Adolescence,\i they claim that it is the boys who are more vulnerable - especially if they feel pressured into dating, at least in the United States.
The study involved sixth, seventh and eighth graders completing questionnaires, taking part in interviews, and keeping telephone diaries. Most of the boys and girls found involvement in mixed-sex settings enjoyable and challenging, rather than stressful. However, boys who were dating but who were less than enthusiastic about having a girlfriend had lower self-esteem than their non-dating peers or boys who were very interested in dating.
The study is mainly memorable for the researchers, who appear to be attempting a world record on nominative determinism. Aside from Linda Caldwell, who was obviously not part of the attempt, the study was conducted by Nancy Darling, Bonnie B. Dowdy and Lee Van Horn.
#
"The homemade supercomputer",1281,0,0,0
(Nov '99)
It may only be of interest to a few people, but to those in the business, a place in the list of the world's 500 fastest supercomputers is pretty important. Yet in a list dominated by familiar names like Intel, IBM, Cray-SGI, Fujitsu, Hitachi, and so on, there is a single interloper at position 44: the Cplant Cluster, an assembly of 600 computers at the Department of Energy's Sandia National Laboratories.
None of these units is particularly outstanding: they are like high-end systems in a retail store, operating at about 600 million operations a second (600 megaflops) on standard benchmark problems. But when 580 nodes get stuck into the same problem, the benchmark speed is more like 232.6 gigaflops (232.6 billion operations a second).
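A quick back-of-the-envelope check in Python shows how much of the aggregate power survived the overheads of splitting one problem across hundreds of nodes:

nodes = 580
per_node_flops = 600e6              # 600 megaflops per node, as quoted
aggregate_peak = nodes * per_node_flops
measured = 232.6e9                  # the benchmark figure quoted above

print(aggregate_peak / 1e9)                  # 348.0 gigaflops in theory
print(round(measured / aggregate_peak, 2))   # ~0.67 of peak actually achieved

Holding on to about two-thirds of the theoretical peak, with communication between the nodes taking the rest, is a creditable result for off-the-shelf hardware.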
The system uses off-the-shelf hardware, with home-grown drivers to get fast communication between nodes, and then the system uses a set of utilities modeled on those for Sandia's ASCI Red machine, more familiarly known as Teraflops, the fastest computer in the world. Many of the same computer researchers worked on both projects.
It is very easy to add extra units to Cplant, an effect called scalability. Last year, Cplant with 400 operating nodes was ranked 92nd in the world in a widely accepted test called LINPACK, which tests the speed and accuracy of machines processing very similar series of operations. After the performance that gained it 44th place, Cplant ran at 247 gigaflops on 572 nodes, which would have gained it 40th place, except that the result came too late for inclusion in the 1999 results.
#
"Chiba City gets real",1282,0,0,0
(Nov '99)
The futuristic "Chiba City" in William Gibson's science fiction novel, "Neuromancer" has provided the inspiration for a new project. The US Department of Energy's Argonne National Laboratory is working with IBM and VA Linux Systems to build the largest supercomputing cluster dedicated to highly scalable open source software development, and they have named it "Chiba City".
Announced just two weeks before IBM outlined its plans for a five-year project aimed at producing "Blue Gene", the first petaflops computer, capable of 10\U15\u floating point operations per second, this 512-CPU Linux cluster will be Argonne's most powerful supercomputer - for now.
To put that measure in perspective, IBM plans to deliver a supercomputer next year which is capable of up to 10 teraflop (that is, 0.01 petaflop) performance, more than four times faster than "Teraflops", the fastest computer on the Top 500 list today. The Apple Power Mac G4, described as "twice as fast as the fastest Pentium P3-based computers", is credited as the first personal computer to break the gigaflop (one billion floating point instructions per second) barrier.
The project, according to an Argonne release on the Internet, will help advance the use of state-of-the-art Linux clusters based on affordable industry standard components in high-performance computing. The cluster comprises 256 2-CPU computational servers from VA Linux Systems, and IBM Netfinity servers for cluster management, file storage and visualization.
The cluster installation was accomplished in a two-day "barn-raising" event, complete with banjo player. Over 50 Argonne scientists pitched in to help build the cluster, which links high-performance servers from VA Linux with advanced hardware from IBM and the latest in network interconnect hardware.
The construction was planned and managed by engineers from Argonne and VA Linux Professional Services, with support from VA and IBM's cluster hardware and software experts. VA Linux also provided cluster management technology and certified new high-performance Linux drivers for the gigabit ethernet cards and graphics cards used in the scalable cluster.
The actual speed of the cluster was not stated, but this is seen more as a test bed for methods, while providing "a flexible development environment for scalable open source software in four key categories: cluster management, high-performance systems software (file systems, schedulers and libraries), scientific visualization, and distributed computing."
Now back to Blue Gene: the machine is expected to cost $100 million over the next five years. It is intended to simulate the natural biological process by which amino acids fold themselves into proteins, and the whole setup "will stand six feet tall, occupy 1,600 square feet of floor space" (about 1.8 meters high, 150 square meters) and it will include about 1 million microprocessors. It will need roughly a year to perform the protein-folding simulation, which the human body manages to complete in less than a second. IBM says that Blue Gene will demand "new standards of computer architecture to enable super high-speed calculations and highly advanced software to run at high speeds without causing bottlenecks."
#
"Getting more bandwidth",1283,0,0,0
(Nov '99)
Reports appeared on the Internet during November on two rather exciting breakthroughs that may directly influence the future of the Internet. Both came from Bell Labs, the research and development arm of Lucent Technologies.
The first report relates to a world record of transmitting information at the rate of 160 gigabits (billion bits) a second over 300 kilometers of optical fiber using a single wavelength, or color, of light, four times the normal performance of commercial single-wavelength systems today.
The researchers believe they can expand the system by using a technique called dense wavelength division multiplexing (DWDM), which increases the capacity of optical fiber by transmitting data over additional wavelengths of light. There is a great deal of potential here: today's commercial optical systems combine up to 100 wavelengths on an optical fiber.
Even without this, the result represents the world's first practical 160-gigabit system. It uses a semiconductor-based transmitter and demultiplexer, and according to the appropriately-named Alastair Glass, director of the Bell Labs Photonics Research Lab, "Multiplying 160 gigabits over additional wavelengths, we expect to be able to scale up to many trillions of bits a second in the foreseeable future."
The researchers achieved the amazing result with Lucent's TrueWave RS™ fiber, the long-distance fiber which they claim has the lowest dispersion slope in the industry. This fiber was developed in 1993, and since 1998, Lucent has been marketing an 80-channel DWDM system, which can transmit up to 400 gigabits per second of information over a single fiber.
Then to show that there is more than one way of skinning a cat, Bell Labs claimed another record, transmitting data over 1,022 wavelengths, or colors, of light, through a single optical fiber, with each wavelength carrying a distinct stream of information. To do this, they used an experimental transmitter which uses a single ultra-high-speed laser to generate signals over all 1,022 wavelengths, instead of using a separate laser for each "color", as is done in conventional multi-wavelength systems. The previous record, from the same laboratory, involved transmitting 206 separate signals on the one fiber.
The normal practice is to send DWDM signals with a frequency separation of 50 gigahertz (GHz), but the 1,022-channel system operates at a record high density of 10 GHz channel spacing. With the closer packing of the frequencies, Bell Labs has labeled this UDWDM, meaning ultra-dense wavelength division multiplexing.
The 1,022-channel transmitter carries information at the rate of 37 megabits (million bits) of information per second, for a total system capacity of more than 37 gigabits (billion bits) per second, roughly comparable with present commercial systems, but this is only the start. The researchers believe the system can be scaled up to OC-48 data rates, for a capacity of several terabits (trillion bits) per second.
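Both records come down to multiplying a per-wavelength data rate by a channel count. Here is a minimal Python sketch using the figures quoted in this story (the OC-48 rate of roughly 2.5 gigabits per second is a standard telecommunications figure, not one given above):
# Record 1: 160 Gb/s on a single wavelength, scaled by DWDM channel counts.
per_wavelength = 160e9              # bits per second on one color of light
print(per_wavelength * 100 / 1e12)  # 100 wavelengths -> 16 Tb/s, "many trillions of bits"
# Record 2: 1,022 wavelengths at 37 Mb/s each.
print(1022 * 37e6 / 1e9)            # -> about 37.8 Gb/s, "more than 37 gigabits"
# The projected upgrade to OC-48 rates (about 2.488 Gb/s per channel):
print(1022 * 2.488e9 / 1e12)        # -> about 2.5 Tb/s, "several terabits"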
\BKey names:\b Brandon Collings, Wayne Knox, and Matt Mitchell.
#
"Small, smaller, smallest",1284,0,0,0
(Nov '99)
When you press a 10 mm diameter key on the number pad of an ordinary mobile cellular telephone, the signal travels to features on an interior chip that are about half a micrometer across, some four to five orders of magnitude smaller than the number button. That is, the chip feature is about 1/20,000 the size of the key you pressed, and as microelectronics get ever smaller, the problem of making the connection, called electronic packaging, becomes a greater and greater challenge.
At one end, the limitation is human dimensions. The size of a keyboard is limited by the size of a human hand, and the size of a display is limited by what the eye can see. In time, we hope that nanotechnology will be able to reduce devices on chips to nanodots, tiny circuit islands a few atoms across. If this happens, the size difference between the outside and the inside of a device would be seven to eight orders of magnitude.
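The scale gap is easy to verify; a quick Python sketch using the sizes mentioned (the nanodot size is a round, illustrative figure):
key = 10e-3       # a 10 mm keypad button, in meters
feature = 0.5e-6  # a half-micrometer chip feature
print(key / feature)  # -> 20,000, i.e. four to five orders of magnitude
nanodot = 1e-9    # a circuit island a few atoms across, roughly a nanometer
print(key / nanodot)  # -> 10,000,000, i.e. seven to eight orders of magnitude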
The main problem is that the electronic packaging is now rising in cost, and that means people need to rethink the way that we connect to devices and interface with them (see \JWorld's smallest Web server\j, February 1999, for some examples). If keyboards are limited by hand size, perhaps we need to do away with keyboards and use voice-recognition software, or some other solution. Other problems include making connections to ever-smaller chips, ensuring that the connections do not interfere with each other, and reducing the size of physical components such as capacitors.
These charge-holding devices are mainly used as filters that clean up electrical interference and also to store electrical energy to prevent information loss. In the future, these may be made from composite films containing nanocrystals, that could be placed as close as possible to the connections.
We will be maintaining a watching brief in this area. Names to watch at Cornell University include J. Peter Krusius, Che-Yu Li and Emmanuel Giannelis, but others are working in this area as well, and a team from Rice University and Yale University reported in \IScience\i in mid-November that they have just created a molecular computer switch with the ability to turn on and off repeatedly. They achieved this by using chemical processes rather than the silicon-based photolithography usually needed to make a standard chip.
By itself, a switch may not sound very exciting, but in the world of computing, a switch becomes a logic gate, the basic component used to represent ones and zeros, the binary language of digital computing. Without switches, there can be no computer, but if you can make smaller switches, that means a smaller computer, and better still, the new switches will cost several thousand times less per unit.
So with the new switch an established success, say the developers, all they need now is memory units built on the same scale. The word is that Rice and Yale researchers planned to announce a molecular memory device at the International Electronic Device Meeting in Washington, DC on 6 December.
The following technical explanation is based closely on information provided by the researchers. The switch works by applying a voltage to a 30 nanometer wide self-assembled array of the molecules, allowing current to flow in only one direction within the device. The current only flows at a particular voltage, and if that voltage is increased or decreased, it turns off again, making the switch reversible. In other previous demonstrations of a molecular logic gate, there was no reversibility.
In addition, the difference in the amount of current that flows in the on/off state, known as the peak to valley ratio, is 1,000 to 1. The typical silicon device response is, at best, 50 to 1. The dramatic response from off to on when the voltage is applied indicates the increased reliability of the signal. These measurements of the amount of current passing through a single molecule were taken at Yale at a temperature of approximately 60 kelvin (-213°C or -350°F).
The researchers reported that since submission of their findings to \IScience,\i they have observed the reversible switch behavior in a similar molecule at room temperature, with a current peak to valley ratio of 1.5 to 1, which is still sufficient for numerous electronic applications, and more efficient systems are now being synthesized. In addition to logic gates, potential applications include a variety of other computing components, such as high frequency oscillators, mixers, and multipliers.
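The on/off behavior described above - current flowing only near one particular voltage - is the signature of negative differential resistance. The Python toy model below illustrates what a 1,000 to 1 peak-to-valley ratio means; the Gaussian shape and all the numbers are our own illustration, not the published device physics:
import math
def current(volts, v_peak=2.0, width=0.2, peak=1e-6, valley=1e-9):
    # Illustrative only: a resonant current peak at v_peak riding on a
    # tiny leakage background; the real device parameters were not published here.
    return peak * math.exp(-((volts - v_peak) / width) ** 2) + valley
on = current(2.0)       # at the resonant voltage the switch is "on"
off = current(3.0)      # raise (or lower) the voltage and it turns off
print(round(on / off))  # -> about 1,000, comparable to the reported ratio at 60 K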
The active electronic compound, 2'-amino-4-ethynylphenyl-4'-ethynylphenyl-5'-nitro-1-benzenethiol, was designed and synthesized at Rice. The molecules are one million times smaller in area than typical silicon-based transistors. Looking ahead, the molecule's designer, James Tour, Chao Professor of Chemistry at Rice, sees a rosy future. "It really looks like we're going to have hybrid molecular- and silicon-based computers within five to 10 years," he said.
#
"Killing landmines",1285,0,0,0
(Nov '99)
The usual methods for landmine disposal involve either blowing a landmine up with an explosive charge which also detonates the explosive in the mine, or by opening and disarming the mine. Neither of these is a particularly safe procedure. All the same, something needs to be done, as there are an estimated 80 million or more active land mines scattered around the world in at least 70 countries. Each year, landmines kill or maim 26,000 people, most of them women or children, and usually after military hostilities have ended. That means a fresh victim every 20 minutes, year in, year out.
Now a new and safer solution has been found, using surplus rocket fuel to construct flares which can burn through the landmine cases, so that the explosive burns, or if it explodes, does so with less force, since the only explosive going off is that inside the mine.
The flare was developed by Thiokol Propulsion in Brigham City, Utah, the NASA contractor which designs and builds rocket motors for the Space Shuttle. At each shuttle launch, NASA allows for a small percentage of extra propellant in each batch to make sure that there will be enough on hand, and once mixed, the leftover fuel solidifies and cannot be saved for use in another launch. But in solid form, the solidified fuel is an ideal ingredient for Thiokol's new flare.
So all (!) mine clearers have to do now is locate and uncover a landmine, place a flare alongside it, and set off the flare from a safe distance using a battery-triggered electric match.
#
"Chimpanzee subspecies more diverse than humans",1286,0,0,0
(Nov '99)
A study reported in \IScience\i in early November suggests that chimpanzee sub-species are more genetically variable than humans, and that the various sub-species may be more closely related than previously believed. This has a number of implications for theories about matters ranging from the origin of modern humans to great ape conservation. The findings also suggest that the observable cultural differences between chimpanzee populations are probably not the result of genetic variation between these groups.
While information about the human genome has been gathering at a frantic pace, there is far less information about the diversity of humans' nearest relatives. We have some fragmented information about mitochondrial DNA (mtDNA) in the chimps, but this DNA is separate from the DNA of the nucleus, and always inherited from the mother, which makes it useful for some purposes, but less interesting for others.
With this in mind, a German team, led by Svante Pääbo and including Henrik Kaessmann and Victor Wiebe from the Max-Planck-Institute for Evolutionary Anthropology in Leipzig, set out to examine nuclear DNA variation, in order to get a better picture of chimpanzee diversity.
They chose as their target a region on the X chromosome, known as Xq13.3, a section which does not code for any proteins, has a low mutation and recombination rate, and is already well studied in humans. Using blood samples from the three geographical subspecies of chimpanzees (eastern, central, and western African) and from the bonobo, a chimpanzee cousin, they sequenced and thoroughly analyzed this bit of DNA.
It appears that the Xq13.3 sequence is almost four times as variable and three times as old in chimpanzees as it is in humans. This contradicts other evidence gathered from isolated chunks of DNA analyzed in earlier research, but supports a view, based on other evidence, which suggests that humans are less diverse.
In all probability, the human population has at least once passed through a bottleneck or crash, with only a few survivors - though not quite as few as are on the passenger list of Noah's ark, and not as recently as Noah's epic voyage either, but some massive catastrophe like this would explain the lack of diversity in humans. Says Pääbo: "The simplest explanation for this is that at some rather recent point in the past, humans have been few in numbers. From a genetic point of view, this point could be the origin of modern humans." That would set the date back to at least 50,000 years ago, and maybe much more.
The central African chimpanzees are the most diverse, while the western chimpanzees are the least diverse. The central chimpanzee sequences also appear to be the oldest of the chimpanzee lineages, based on nuclear DNA, although mtDNA shows the western chimps as the most diverse. Another contradiction is that the mtDNA analysis shows the three chimpanzee subspecies as three distinct groups, while the Xq13.3 evolutionary tree shows that the geographical subspecies are in fact highly intermixed.
The differences may be a matter of timing. Since mtDNA changes only have to spread through the maternal side of the genome, it appears to "evolve" more rapidly. So it is quite possible that mitochondrial and nuclear DNA may be suited to capturing evolutionary events such as the development of discrete subspecies, on different time scales. On this theory, the subspecies are old enough for the mitochondrial DNA to have become different between the subspecies, but not old enough for this to have happened to the nuclear genes.
Another surprise came when the Xq13.3 sequences for the bonobos were compared with the chimpanzees, and found to be closer together than has been assumed. In fact, some common chimpanzees are more genetically distant from each other than they were from bonobos, even though the bonobos are a different species. The conclusion would appear to be that the chimpanzee and the bonobo may have split off in separate evolutionary directions quite recently.
#
"Alphabets get older",1287,0,0,0
(Nov '99)
Up until now, scholars have believed that the first \Jalphabet\j was invented close to 1700 BC by the Semitic-speaking people of the Levant - modern Syria, Lebanon and Israel. This alphabet, they considered, gave rise to the written forms of Hebrew, Arabic and Greek - and to virtually all other alphabets as well.
Now, it appears that the alphabet was in use in Egypt two centuries earlier, probably by some of the many Semitic-speaking people who lived or worked in Egypt, and who were inspired by the Egyptian hieroglyphics. The evidence comes from some inscriptions found on a rock in Egypt, which were described to an American Oriental Society conference on November 22.
Somewhere at the start of the second millennium BC, long before ancient Biblical times, a traveler who was passing through a desert valley in what is now southern Egypt, stopped at a rock and inscribed on it his name, his title and probably a short prayer for safe passage. This ancient calling card, as it has been dubbed, and another similar one, both point to an Egyptian origin for the alphabet.
The desert valley, now known in Arabic as "The Valley of Horrors", was visited in the northern summer of 1998 by Egyptologist John Darnell, an assistant professor at Yale. He stumbled across the rock while surveying the area, but was unfamiliar with the writings. When he returned to the United States, he brought photos of them to Chip Dobbs-Allsopp at Princeton Theological Seminary, who studies the writings of the Iron Age, around ancient Biblical times.
Dobbs-Allsopp immediately suspected that these inscriptions were very ancient, and he contacted Kyle McCarter, one of the few people in the world who can decipher archaic alphabetic inscriptions, who has spent much of his career tracking down the origins of the alphabet. In the northern summer of 1999, a team of scientists including Darnell, Dobbs-Allsopp and Bruce Zuckerman and Marilyn Lundberg of the West Semitic Research Project of the University of Southern California, visited the desert valley site to record the inscriptions, accompanied by Egyptian soldiers for protection.
The area can be dangerous; it is an inhospitable, sparsely populated region in southern Egypt. The group worked for several days in a snake- and scorpion-infested area where temperatures reach 50°C (122°F), taking high resolution photographs and documenting the inscriptions, but with only these examples to go on, according to McCarter, translation is not easy.
"The earliest examples of a writing system can never quite be read; it isn't until later when the system becomes conventionalized that the chances of a clear reading become more likely," he said. "However, it does bear some clear elements of Semitic writing, like the words ôgodö and ôchiefö and a few others. With our limited understanding of the words, there is a fear of forcing an interpretation of the inscription. But I think we can safely say that it is an inscription of the two men's personal names, their titles and possibly a prayer to a local god."
But even if they can only be partially translated, McCarter is excited about the find. "These inscriptions are for epigraphers what Lucy was for paleontologists," he said.
See also \JAlphabet origins\j.
#
"The oldest vertebrates so far",1288,0,0,0
(Nov '99)
Scientists have unearthed two half-billion-year-old fossils of fish-like creatures that could be the earliest known vertebrates. The animals are only 5 cm (2 inches) long, but they are undoubtedly vertebrates. To be precise, they are a kind of jawless fish called agnathans, one resembling a \Jhagfish\j, the other a \Jlamprey\j, examples of which have survived into modern times. The fossils were located in China's Lower \JCambrian period\j Chengjiang deposits, and they are estimated to be about 530 million years old.
The discovery suggests that the agnathans had already undergone considerable evolution by the Cambrian period. Up until now, the earliest definite examples of agnathan fish were classified as Lower Ordovician, dating from about 475 million years ago. The new finds mean that we must now assume that the first agnathans may have evolved in the earliest Cambrian, with the chordates arising from more primitive deuterostomes in Ediacaran times, some time before 555 million years ago.
The discovery, announced in \INature\i in early November, lists a number of important observations: there are clear signs of zigzag muscle patterns and gill structures, both characteristics of modern fish, and one fossil shows marks of an early spine.
They have fins, and would probably have been active swimmers, which may explain why no other such animals have been found this early in history, because they would have been less likely to have been buried alive when storms stirred up sediment from the bottom of the sea. Without burial, scavengers would spoil any chance of a dead animal being preserved in one piece.
#
"A new giant plant-eating dinosaur found",1289,0,0,0
(Nov '99)
In mid-November, paleontologist Paul Sereno (see \JLargest dinosaur unearthed\j, May 1997, and \JDinosaurs and evolution\j, June 1999) unveiled two new reconstructions of a plant-eating dinosaur which weighed about 20 tons, and grew to about 20 meters (70 feet) long. Called \IJobaria tiguidensis,\i the larger specimen is about 17 meters (60 feet) long. The specimen was collected in a 1997 expedition to the Republic of Niger in the African Sahara, and the animal is fully described in the November 12 issue of \IScience.\i
The specimens, the large one already mentioned, which is male, and a juvenile posed in mid stride, are both casts, taken from the bones which have been in preparation since 1997. Around 95% of the bones were retrieved, making these the most complete long-necked dinosaurs ever discovered from the Cretaceous period.
These dinosaurs flourished around 135 million years ago, when open forests and broad rivers characterized the region that is desert today. The genus name comes from "Jobar", a creature in the legends of the local Tuareg nomads that is linked to the exposed bones. The species name, \Itiguidensis,\i refers to a cliff near the excavation sites.
The site actually contains bones of several adults and juveniles, suggesting that \IJobaria\i once roamed in herds of mixed age, according to Sereno. They appear to have been buried in a flash flood, but tooth marks on the ribs of one of the juvenile skeletons suggest that some of them may have died at the hands of the chief meat-eating dinosaur of the time, \IAfrovenator,\i a predator 8 meters (27 feet) long, previously discovered in the same area by Sereno's team.
\IJobaria\i does not fit any of the known families of long-necked dinosaurs, or sauropods. It appears to be an ancient sauropod lineage that survived and flourished only in Africa during the Cretaceous. The distinctive features are its spoon-shaped teeth and a relatively short neck composed of only 12 vertebrae. As well, its backbone and tail are simple, compared with the complex vertebrae and whiplash tail of the older North American sauropods, \IDiplodocus\i and \IApatosaurus\i.
Sereno believes that \IJobaria\i moved gracefully, with its feet set close to each other under the body, rather like a modern elephant, and he suggests that the flexible neck and spoon-shaped teeth were well adapted for nipping the smaller branches of trees. For more information about the dinosaur, there is a Web site located at http://www.jobaria.org while the search was sponsored by the National Geographic Society, who have more dinosaur information available at http://www.nationalgeographic.com/dinorama/
#
"The Latest Paleocene Thermal Maximum",1290,0,0,0
(Nov '99)
It has been apparent for some time that a massive global warming took place about 55.5 million years ago (see \JAncient eruptions caused global warming?\j, November 1997) and it now appears that this was tied in with a massive release of methane, an event which resulted in an extensive die-off of deep sea dwelling organisms. A report published in \IScience\i in mid-November sets out the evidence which links the LPTM and the site of the methane event.
There are huge amounts of methane on the ocean floors right now (see \JNew worms\j, August 1997 and \JSeeing a new energy source on the shores\j, March 1998), and this has probably always been the case. Since \Jmethane\j is a powerful greenhouse gas, anything that released large amounts of the gas would send the world's temperatures soaring.
The LPTM lasted over a period of 10,000 to 20,000 years, and was originally discovered by James P. Kennett and Lowell Stott. At this time, numerous mammals (including primates) first appeared, and many deep-sea species became extinct. The researchers believe that vast quantities of methane were stored as frozen gas hydrate in the upper few hundred meters of continental slope sediments before the LPTM, when a longer-term global warming during the late Paleocene pushed the ocean-atmosphere system past a critical threshold.
At this point, the warm surface waters started to sink, and intermediate to deep ocean temperatures rose by approximately 4° to 8°C, according to the geochemical record and also from faunal changes, variations in the species present in a group of microfossils, the \Jforaminiferans\j. While these need to be studied under a microscope, the Foraminifera are a remarkably useful source of information.
This temperature increase would have triggered the frozen methane hydrate to melt, and the methane, released as bubbles, would then react with oxygen dissolved in the deep waters, lowering the overall oxygen available for deep-sea life and giving a huge boost to the amount of available carbon - and also to levels of atmospheric carbon, both as methane and as carbon dioxide. At the same time, the surface water productivity which fuels all deep-sea life would have changed as well.
If that was not enough, the increased carbon dioxide would make the deep waters more corrosive, which may also have contributed to deep-sea extinctions, while higher temperatures would have opened up new migration paths. Animals will always spread into new territory if they can reach it, so the effect would be to disperse certain mammals widely, and that would set the scene for new species to arise. Then, when the global carbon and oxygen cycles stabilized after perhaps several hundred thousand years, marine and terrestrial ecosystems were changed forever.
So where is the evidence for all this? In simple terms, it lies in a sediment core, removed by the Ocean Drilling Program's vessel, the JOIDES (Joint Oceanographic Institutions for Deep-Earth Sampling) \IResolution\i, from an area known as the Blake Nose, a promontory on the continental shelf, 400 kilometers (250 miles) east of Tallahassee, Florida.
This core shows disturbed sediment which is interpreted as evidence of a submarine landslide layer. This fits with the theory of the buried methane melting, going from methane clathrates, an ice-like solid, into a gas. Methane, say the researchers, appears to have escaped from a pressure zone created by an underlying ancient reef.
Even so, the mass of methane released from this region would not have been enough, by itself, to have driven the whole of the LPTM, so there needs to be more research, but the picture has just got a little clearer. But could it happen again? At this stage, nobody can say, because the triggering mechanism may have been different, but we know that the world's oceans are estimated to carry a reservoir of 14,000 gigatonnes of marine gas hydrate - and that represents a lot of greenhouse gas.
Now the question will arise: did the asteroid that we think killed the dinosaurs also release large amounts of methane? Already, people are talking of methane fireballs roasting the dinosaurs to death, but this would appear to be far-fetched, to say the very least.
\BKey names:\b Dorothy Pak, Miriam E. Katz, Kenneth G. Miller, and Gerald R. Dickens.
#
"Tracking the missing minerals",1291,0,0,0
(Nov '99)
Mineral veins have a nasty habit of not being where they ought to be. Most commonly, the problem is caused by faulting and movement of the rocks which carry the minerals, some time after the minerals were laid down in the rocks. So when miners come to the fault plane, the surface where the slippage occurred, the ore suddenly runs out. This is more of a problem with mineral ores than with a sediment like coal, but even coal seams can disappear quite frustratingly.
As a result, undergraduates in geology have always spent a great deal of time working over surface maps, trying to build up a three-dimensional picture of the hidden rocks and minerals beneath the surface. Most of the time, the maps have been drawn from surface observation, using all the techniques available to match up rocks in different parts of the survey area, but one granite can be much like another, and so most other rocks are of little help.
Enter a new Australian technique, developed by Brent McInnes from the Australian government research organization, the CSIRO. Dr. McInnes, who works in CSIRO Exploration and Mining, has already used his technique, a dating method called \Juranium-helium thermochronology\j, to show that billions of tons of copper ore are missing from the world's largest copper mine, the giant Chuquicamata copper mine in Chile.
There is a fault which cuts through the deposit, and it now looks as though, millions of years ago, tectonic movement along the fault displaced part of the rich deposit, pushing it half a kilometer into the sky, where it would later be eroded away. According to McInnes, "We determined the direction and amount of the vertical motion along the fault using this new technique," but perhaps more importantly, he and his colleagues showed that there may be another copper deposit buried on the eastern side of the fault, and this brings us to some interesting geology.
Copper deposits associated with igneous intrusions can be vertically zoned. That is, you will typically find a deep zone, the "porphyry style" regime, \Iinside\i an igneous intrusion, and a shallow zone, which is the "high sulfidation" regime, located \Iabove\i an igneous intrusion. In the Chuquicamata area, there are two deposits, 10 km apart, on opposite sides of the West Fault. The first, Chuquicamata, is on the east side and is a "porphyry type", meaning it is associated with an igneous stock. The second, the nearby Mansa Mina deposit, is a "high-sulfidation type" deposit, meaning it formed above an igneous stock.
The two deposits are now at the same elevation, but in the past, geologists have argued that the Chuquicamata porphyry deposit was the intrusion underlying the Mansa Mina deposit, and therefore the displacement along the West Fault was mostly lateral, with a minor "east side up" component. (We need to remember that faults can move in any direction.)
The new study reveals that there has been 600 m of "west side up" movement. That means Chuquicamata cannot be the root intrusion of the Mansa Mina deposit. Rather, there must be another intrusion on the east side of the fault that was the root of the Mansa Mina deposit, some 600 meters below the surface. The only way to test this would be to drill a hole 600 meters deep - and usually more than one hole - at a cost of about $100,000 to $150,000 per hole to drill and process. Holes like this are not drilled lightly, and it helps to know in advance that it would be silly to stop at 500 meters; that knowledge came from some clever science.
The trick was to use the new dating method on rocks from both sides of the fault. This showed that they had different ages, and therefore that the rocks had cooled at different rates. The slower cooling rocks would have been at a greater depth, so the "younger" rocks were formed deeper down, and later rose to the surface after the whole block had been moved upwards and then cut back by erosion. By dating the rocks accurately, geologists are able to form a better picture of what is happening, and has happened, underneath the ground.
McInnes expects the technique to have a wider application, and tells us that the new technology will be used in the search for ore bodies buried by fault displacement in the vicinity of known mineral deposits, such as the Cadia mine near Orange in New South Wales. In fact, the CSIRO is so impressed that it built its own uranium-helium dating facility, which started operation in June 1999.
Brent McInnes was also the lead author in a paper published in October 1999 in \IScience,\i which explained how gold gets where it does. As we explained in \JThe earthquake dangers beneath the Pacific Northwest\j, October 1999, in a subduction zone, minerals are dragged downwards as one plate is forced beneath another, but at about 100 km (60 miles), the pressure and heat changes the chemical nature of the minerals, forcing water out.
The Ladolam deposit, containing about 42 million ounces of gold, lies on and beneath Lihir Island in Papua New Guinea, to the north of Australia. The island of New Guinea is Australia's crumple zone, the area at the business end as the Australian tectonic plate moves north. This is why Irian Jaya and Papua New Guinea have high mountains, active volcanoes, earthquakes and lots of fabulous mineral wealth. But while the Ladolam deposit has been known since 1982, and while people have assumed that the deposit was somehow tied in with the local geology, the exact mechanism which brought gold and copper to the surface has been a mystery until now.
The water which is squeezed and boiled out of the minerals in the subduction zone is not like the water that comes out of your tap. It is extremely hot, but still a liquid because of the high pressure, and very reactive (see \JNear-critical water as a solvent\j, August 1999). As well, it contains oxygen, which makes it even more reactive. The plume of water begins to rise, and as it does, it gathers gold and copper, which is deposited in veins, where it is later melted into magma, molten rock, and carried up towards the surface, until the gold is deposited in the crust.
Part of the mystery about this and other gold deposits is whether they come from the crust, or from deeper down in the mantle, and this is what the researchers have now uncovered. They studied the osmium isotope composition in the gold ore, and compared this with a number of xenoliths which had been dredged from the ocean floor near the Tubaf volcano, a submarine cinder cone sitting over the subduction zone.
Xenoliths (literally "foreign rocks") are usually pieces of rock of one sort included in another rock, like chunks of granite washed into a sandstone bed, or pieces of sandstone taken into a lava flow, but in this case, the xenoliths were samples carried by the volcano from as far as an estimated 70 km (40 miles) below the surface and deposited in the cinder cone. As a group, the xenoliths appear to provide a set of samples of all the rocks, ranging from the sea floor, all the way down to 70 km, with the peridotite pieces coming from near the bottom of the range.
Some of the xenoliths were veined peridotite, with between two and 800 times as much gold, copper and palladium as the surrounding mantle, and the gold ores at the surface have osmium-187 / osmium-188 ratios like those observed in the peridotite. And that, the researchers believe, is enough to confirm that the mantle is the source of the gold.
\BOther researchers on the gold study\b: Jannene S. McBride, Noreen J. Evans, David D. Lambert, and Anita S. Andrew.
#
"Uranium-helium thermochronology",1292,0,0,0
(Nov '99)
This dating method (see \JTracking the missing minerals\j) looks at the way in which helium accumulates in the mineral \Japatite\j, as uranium and thorium undergo alpha decay. The idea is not new - in fact, Lord Rutherford proposed the method as far back as 1905, but there was one snag. In many cases, the age given by (U-Th)/He methods was younger than the known crystallization age of the rock.
The problem arose because helium levels in these rocks are too low, and this happens because at around 100°C, helium starts to diffuse out of the rock. The rate of diffusion of an element or compound through a mineral is related to the ability of that mineral to deform. A mineral lattice is made up of chemical bonds, and these bonds deform and become flexible at higher temperatures.
Dr Brent McInnes, who has been working on this method, explains it like this: "Think of a mineral as a cage made out of rubber bands, and the helium atoms as tennis balls. Tennis balls will 'magically' appear in the cage at a constant rate (because of radiogenic decay of U and Th). At the Earth's surface (T=25°C), the apatite cage is relatively rigid and the tennis balls can't get through the web of rubber bands. You can calculate how long the apatite cage has been at the surface by counting the tennis balls. At 3 km depth (T=85°C), the rubber bands making up the webbing of the apatite cage become flexible and the cage expands. The tennis balls bouncing around the cage have no problem escaping through the webbing, such that every tennis ball produced immediately escapes."
He adds that every mineral which contains uranium and thorium (apatite, zircon, titanite, baddeleyite, monazite, xenotime) can be thought of as a cage, but they will release the helium they have gathered at different temperatures, called the "closure temperatures". In the case of apatite, where the helium diffusion is now well understood, the closure temperature is taken as 75°C. The other minerals are still to be investigated fully, but their closure temperatures are expected to all be between 250°C and 350°C.
To calculate how long a mineral has lain at something less than its closure temperature, we need to measure the amounts of uranium, thorium and helium within a single mineral grain. The first step is to release the helium from the "cages" where it is held. This is done by heating the mineral in a furnace at 1200°C, well above the closure temperature, for 30 minutes, and then measuring the helium with a \Jmass spectrometer\j. Then it is a simple matter to remove the cooled mineral from the furnace and dissolve it in 5% nitric acid, before determining the amounts of uranium and thorium by mass spectrometry.
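In outline, the age then comes from solving the standard helium-ingrowth equation for time: each 238U decay chain yields eight alpha particles (helium nuclei), each 235U chain seven, and each 232Th chain six. Below is a minimal Python sketch using standard decay constants and a simple bisection search; it is an illustration, not the CSIRO facility's actual software, and the sample numbers in the final comment are hypothetical:
import math
# Standard decay constants, per year.
L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11
U238_TO_U235 = 137.88   # natural abundance ratio of the two uranium isotopes
def helium_produced(t, u238, th232):
    # Atoms of 4He generated in t years from u238 and th232 atoms:
    # 8, 7 and 6 alphas per 238U, 235U and 232Th decay chain respectively.
    u235 = u238 / U238_TO_U235
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1))
def he_age(he, u238, th232, lo=0.0, hi=4.6e9):
    # Bisection search: helium_produced() rises monotonically with time,
    # so the age is bracketed between zero and the age of the Earth.
    for _ in range(100):
        mid = (lo + hi) / 2
        if helium_produced(mid, u238, th232) < he:
            lo = mid
        else:
            hi = mid
    return mid
# Hypothetical measurements, in atoms:
# print(he_age(he=3.4e8, u238=5.0e9, th232=1.0e9) / 1e6)  # -> roughly 50 million years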
#
"Why does",Lonesome George" refuse to mate?",0,0,0
0
(Nov æ99)
A study reported in the \IProceedings of the National Academy of Sciences\i in November sheds some new light on "Lonesome George", the last remaining male of a subspecies of the \JGalapagos giant tortoise\j, living on the small Galapagos island of Pinta. While the Darwin Research Station has been trying to mate him with females from islands closest to his, he has not shown any interest in the potential mates offered to him.
There were once 15 different subspecies of tortoises in the Galapagos, but four have died out through heavy trapping in the 19th century and the destruction of their habitats by feral animals abandoned on the islands, and when George dies, another sub-species will be lost.
From DNA studies, his closest relatives are on the islands of San Cristóbal and Española, some 300 km (200 miles) away. In fact, some people suspect that George may have been carried from one of those islands to Pinta. DNA sequences taken from skins of stuffed tortoises collected on Pinta in 1906 are identical to George's, lending support to the belief that George was born on Pinta and is truly the last survivor of his lineage. But if George has close genetic affinities to the Española and San Cristóbal subspecies, perhaps they would be a more appropriate source of a mate for this sole survivor.
The same study also revealed that the closest living relative of the Galapagos tortoise is the Chaco tortoise, the smallest of the three remaining species of tortoises on the mainland of South America.
#
"Lizards, forests, and speciation",1294,0,0,0
(Nov '99)
There are two main theories about where new species originate. The standard view is that geographic isolation is the key source of species diversity in tropical rainforests. New data published in the \IProceedings of the National Academy of Sciences\i in November have provided support for the alternative idea that the edges of forests are the key centers for the development of new species.
Researchers from Boston University, San Francisco State University, and the University of Queensland argue that natural selection in forest peripheries, or "ecotones," may play an equally important role in the evolution of new species. This follows on from research on birds (see \JOrigins of new species\j, June 1997) which revealed that West African ecotones are hotbeds of evolution. This extends the ecotone theory to Australia, and from birds to reptiles.
The studies concentrated on \ICarlia rubrigularis,\i a \Jskink\j which is common throughout Australia's wet tropical rainforests and dry open forests. Even though there was evidence of genetic exchange between the populations, skink populations living within the ecotone, the ecological gradient between the two forests, exhibited significant differences in their physical appearance compared with their rainforest counterparts. At the same time, rainforest skink populations that have been geographically isolated by a mountain barrier for millions of years were uniformly similar, despite ancient genetic divergence. Conclusion: conditions at the edge of the forest favor diversity.
One alarming aspect of this finding is that modern approaches to preserving diversity in the rainforest tend to target the untouched core areas of these forests, when it may be the margins which are most important, because that is where the genetic diversity of a population is found. The researchers now have a National Science Foundation (NSF) grant of $2.6 million, and Thomas Smith will lead an international team of scientists, students and policy makers on a three-continent study to test alternative hypotheses of speciation, with the goal of defining better conservation policy. The institutions collaborating in this study include NASA, the World Resources Institute, Boston University, UCLA, and the University of Queensland.
The study showed that open-forest lizards were smaller, had shorter limbs and a bigger head, and became sexually mature earlier than their rainforest counterparts. They suspected that this might be a form of natural selection driven by lizard-eating birds hunting in open forests. Theoretical studies suggest that natural selection caused by predation favors the evolution of smaller bodies and earlier reproduction.
They tested this by placing 480 plastic lizard decoys, painted to match the striped, reddish skink, throughout the dense rainforest and the open dry forest. After a period of time, they checked the models for the telltale bite marks created by bird bills, and used this to work out how many were attacked: 21 in the open, transitional forest, and just four in the closed rainforest habitat. As they have evidence of high levels of gene flow, the researchers believe that the differences between the two populations can only be attributed to rapid adaptive evolution in response to natural selection.
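With counts this lopsided, even a simple significance test is convincing. The Python sketch below assumes the 480 decoys were split evenly between the two habitats - the report does not give the split, so the contingency table is illustrative:
from scipy.stats import fisher_exact
# Attacked vs not attacked, assuming 240 decoys per habitat (our assumption).
table = [[21, 240 - 21],  # open, transitional forest
         [4, 240 - 4]]    # closed rainforest
odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)  # a very small p-value supports a real difference in predation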
The new program will extend to more habitats in three continents - Africa, Australia, and South America - and to a variety of taxa, including birds, mammals, reptiles and amphibians in a variety of landscapes. The aim, says Smith, is to put science and policy on the same track: "At present, conservation programs tend to emphasize preserving areas of high species richness, with little attention to the evolutionary processes that generate biodiversity. Our research model is designed to provide the data necessary to define more effective conservation policy."
\BKey names:\b Chris Schneider, Thomas Smith, Brenda J. Larison, and Craig Moritz (University of Queensland).
See also \JSeeds in Rainforest Fragments\j.
#
"Seeds in rainforest fragments",1295,0,0,0
(Nov '99)
A report published as a "Brief Communication" in \INature\i in mid-November suggests that seeds survive better in larger, continuous tropical rainforests than they do when they fall to the ground in small fragments of rainforest. In fact, those falling to the ground in small fragments are three to seven times less likely to sprout than those that fall in continuous forest.
The fragments suffer from edge effects, according to researcher Emilio Bruna, and this makes them less hospitable to germinating seeds. The fragments are hotter and drier, and they have more light penetrating the canopy to the forest floor than continuous forests do. Since rainforest plants are not adapted to these conditions, the result is hardly surprising.
Worse, Bruna mentions that other research has suggested that plants in fragments could become inbred, which might make their seeds less likely to germinate, and might even stop plants from setting seed in the first place (see \JHow flowers recognize pollen\j). If this goes on long enough, the result could be that some of the species in forest fragments might die out altogether.
See also \JLizards, forests, and speciation\j.
#
"Death of a lemur",1296,0,0,0
(Nov '99)
Juliet, a sifaka lemur destined to play a major role in a breeding program described last month (\JLemur news for 1999\j, October 1999), has died of unknown causes during November. This is something of a disaster for the program, which aims to undo some of the damage being caused in the sifakas' home territory in Madagascar, both from habitat destruction and also from active hunting.
After they were captured, the two lemurs were transported to the Ivoloina Zoological Park in Madagascar, where they were being acclimatized to captivity over the next six months to a year, but now both are dead, which has led the breeding group to bring forward their plans. They say now that while they at first believed the captured lemurs were diademed sifakas, closer examination of the animals' markings and body characteristics suggested that the animals might be a previously unknown subspecies of lemur. DNA tests are now under way to confirm the possibility.
#
"China and sulfur emissions",1297,0,0,0
(Nov '99)
In the last 20 years, as scientists became more concerned about carbon dioxide levels in the atmosphere, sulfur emissions ceased to be a major focus of attention. After all, over the last two decades, the United States, Europe and the former Soviet Union have all stabilized their emissions of sulfur, but mainland China's sulfur emissions have soared. In fact, China now emits more sulfur than any other single country, according to a recent survey published in the journal \IAtmospheric Environment.\i
In 1850, global sulfur emissions were 1.2 million metric tons, just 1.7% of the 1990 estimate of 71.5 million metric tons. While production leveled off and then dropped during World War I and the Great Depression, the climb started again with World War II, and only fell away slightly in 1981-83, corresponding to declining oil demand during the global recession.
There are a number of sources for sulfur: some oil contains traces of the element, but most of it comes from low grades of coal, with metal smelting also making a contribution. The survey makes an interesting case study in the reconstruction of data, since much of the historical information is entirely missing. Instead, the researchers worked on fuel consumption information, which is easy to get, and on a wide range of sources to deal with metal smelting.
Most 20th century data came from League of Nations and United Nations publications, while 19th century data came from a wide variety of obscure publications. More recently, data on fuel consumption from 1950 to 1990 had already been compiled by the US Department of Energy, Oak Ridge National Laboratory. Overall, the researchers gathered data from 234 countries, noting details of production, import and export of different fuels in each country to reach the final figures.
In the countries which have stabilized emissions, a variety of factors have come into play. The former Soviet Union has shifted to use more natural gas, which is typically lower in sulfur content, while in the United States, there was a switch from high-sulfur to low-sulfur coals, together with tighter environmental controls. In the US, alternative fuels such as natural gas and nuclear power have made an impact as well, and in both countries, the shift from a "smokestack economy" to a service-oriented economy has also made a difference. Another factor has been the use of scrubbers and desulfurization techniques in coal-driven power plants.
China is large, with a growing population, and to make matters worse, their abundant coal reserves are predominantly soft coal, which is the dirtier kind. They are rushing into industrialization, and in those conditions, it makes sense for them to burn coal because it is so abundant. At the same time, Indochina, Japan and Korea are all becoming more aware of acid rain problems, and these come mostly from Chinese sulfur emissions. In this way, say the researchers, the big acid rain problem of the '70s and '80s has now shifted more toward the East.
In the later part of the study period, the US, the Soviet Union and China led the world in sulfur production, accounting for 53% of global sulfur emissions. As the 1990s began, the United States and Canada were emitting a combined estimated 15 million metric tons of sulfur, compared with approximately 22 million metric tons by China, which means that these three countries are still providing more than half the world's total production of atmospheric sulfur.
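A quick check of the proportions, using only the tonnages quoted in this story (all figures in millions of metric tons of sulfur):
total_1990 = 71.5
print(1.2 / total_1990)  # 1850 output as a share of 1990: about 0.017, or 1.7%
us_canada, china = 15.0, 22.0
print((us_canada + china) / total_1990)  # -> about 0.52, "more than half"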
#
"Arctic sea ice disappearing",1298,0,0,0
(Nov '99)
Every so often, cold war data, no longer needed by the military, becomes available for science (see, for example, \JBut at least the Arctic sea floor is uncovered\j, August 1997, \JA satellite with a view\j, March 1998 and \JMcMurdo Dry Valleys images released\j, September 1999).
Data on sea ice thickness, gathered from 1958 onwards, as British and US nuclear submarines traveled under the ice through the Arctic Ocean, have now been used to establish that the average draft of the sea ice (that is, its thickness from the ocean surface to the bottom of the ice pack) has declined by 1.3 meters (4.3 feet).
Writing in the issue of \IGeophysical Research Letters\i dated December 1, but released in November, Dr. D. Andrew Rothrock of the University of Washington, Seattle, and colleagues say this represents a reduction of about 40% compared with the earlier period when submarines first "sailed" under the ice.
The most recent data came from three cruises: by USS \IPargo\i in 1993, USS \IPogy\i in 1996, and USS \IArcherfish\i in 1997. The earlier cruises included the first nuclear submarine, USS \INautilus,\i in 1958 and a cruise of HMS \ISovereign\i in 1976.
The data do not make a neat time series, because they are drawn from cruises which were not part of a planned experiment, and so the cruises took place at different times of the year. All the same, 29 sites showed a thinning, in some cases by as much as 1.7 meters (5 feet), and after corrections were made to allow for different times in the season, the researchers believe that their maximum error would be of the order of 0.3 meters (1 foot), when the smallest thinning appeared to be a meter (3 feet).
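The report quotes the decline and the percentage rather than the before-and-after drafts, but the baseline follows directly; a quick back-calculation in Python (our arithmetic, not figures from the paper):
decline = 1.3    # meters of mean draft lost
fraction = 0.40  # the stated 40% reduction
print(decline / fraction)            # implied earlier mean draft: about 3.3 m
print(decline / fraction - decline)  # implied recent mean draft: about 2.0 m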
They say they can rule out surface wind pattern effects, because this would cause thinning in some places and a thickening in others. There are several possible hypotheses to consider other than global warming: the flow of heat from the ocean itself, from the atmosphere, and from shortwave radiation, or it may be a question of the amount of precipitation and snow cover in the region and ice movement.
This may be a normal stage in a multi-decadal cycle, or it may be part of a continuing trend, and the researchers have called for the release of other ice thickness data gathered by submarines over the past 40 years, which they believe would be "of immense help" in refining this climatic signal.
\BKey names:\b D. Andrew Rothrock, Yanling Yu, and Gary A. Maykut.
#
"What drives the African drought",1299,0,0,0
(Nov '99)
Since the 1970s, the process of \Jdesertification\j, the formation of deserts, has been going on in Africa, and scientists have been unable to explain why. A paper in \IScience\i in mid-November suggests that the devastating drought that plagued North Africa for decades may be a natural phenomenon fueled by the land's naturally-changing vegetation cover.
The research comes from scientists at NASA's Goddard Space Flight Center and the University of California, Los Angeles, who derived their results from computer models. While some studies suggested that ocean surface temperatures drove the drought, others suggested that human use of the land for farming and livestock grazing was the cause, making humans responsible for bringing about the drought and keeping the land from recovering. The new study questions this. In the past, drought conditions in African grasslands caused the following years to remain drier, over-riding meteorological conditions which otherwise would have caused more rain to fall.
The researchers put the natural vegetation into a computer climate model, and made it fully interactive with land surface processes and atmospheric processes, and found that the addition of vegetation to climate computer models proved to be the missing link in what was driving the drought.
The model indicates that the changes in ocean surface temperatures could only account for a small part of the drought effect. Soil moisture could also be seen as a relevant factor, but it still accounted for only half of the intensity of the drought of the 1970s and 1980s. Finally, they established that vegetation is a key factor.
As the area dries due to changes in sea surface temperature, less vegetation grows. And less vegetation leads to higher surface albedo, they say. The albedo, a measure of how much solar radiation will be reflected, leads to a drier and cooler climate. Then this cooling effect weakens the monsoon circulation, and less moisture comes in from the south and the west, and we get a long-lasting drought. As well, plants transpire by losing water through their leaves. With less vegetation, there is less humidity in the air, and with less direct moisture, rainfall is reduced even more.
\BKey names:\b William Lau, Ning Zeng, and David Neelin.
#
"Plague cases increase in the USA",1300,0,0,0
(Nov '99)
An outbreak of bubonic plague in China in the last days of the 19th century is still having an effect in the southwest of the United States. The disease entered a number of Pacific ports in 1899-1900, including Sydney and San Francisco, when rats carrying the \IYersinia pestis\i bacterium were first able to travel rapidly across broad reaches of ocean in steamships. Before that time, rats carrying the disease on sailing ships usually died, or passed infection to the crew before the ship arrived in the next port, ensuring that the plague-carrying ship would be quarantined.
One major difference between Sydney and San Francisco is that although a number of Sydney's marsupials are capable of carrying the disease, it did not spread to local animals there, while the San Francisco rats were able to infect native mammal populations, especially ground squirrels, and plague spread throughout western North America. Plague is now most commonly found in the southwestern United States - in New Mexico, Colorado, Arizona, and California. The main risks of infection for humans are from contact with diseased wild mammals or their infected fleas, and exposure to infected fleas carried by pets like dogs and cats.
Now it appears that the number of cases in that corner of the United States is on the increase, with human plague cases in New Mexico occurring more frequently after wetter than average winter-spring time periods (October to May). During wetter than normal years, there is a 60% increase in the number of cases, according to a report in the November issue of the \IAmerican Journal of Tropical Medicine and Hygiene.\i
Plague typically rises and falls in an "episodic pattern" in many parts of the world. This is commonly a result of rises and falls in the abundances of mammals carrying fleas and \IYersinia pestis,\i and the American case seems to fit that pattern. Ecologists say that a higher winter-spring rainfall builds up the small mammal food resources (plants and insects), leading to an increase in the abundance of plague hosts. It is possible that the moister conditions also boost flea survival and reproduction, also enhancing plague transmission.
Some recent changes in disease frequency have been blamed on global climatic changes, and the El Niño events of the 1990s have been associated with large increases in rodent populations in both North and South America. These increases have already been shown to correlate with a number of diseases, including Lyme disease and hantavirus pulmonary syndrome. Now it appears that plague must be added to the list.
#
"Jan van Paradijs, 1946-99",1301,0,0,0
(Nov '99)
Dr. Jan van Paradijs, one of the world's leading astrophysicists, died from cancer on November 2. Born in 1946, Dr van Paradijs received the 1998 Bruno Rossi Prize (see \JThe Bruno Rossi Prize\j, January 1998). This is the highest award presented by the High Energy Astrophysics Division of the American Astronomical Society, and van Paradijs was given the award for his role in finding the first optical counterpart to a gamma ray burst.
He leaves a widow, the eminent astronomer, Chryssa Kouveliotou, and many in the world of astrophysics will mourn the passing of a man who was both an excellent scientist, and a friend to many. He was both professor of astronomy at the University of Amsterdam and the Pei-Ling Chan eminent scholar in astrophysics at The University of Alabama in Huntsville. His main work was with the Burst and Transient Source Experiment Team (BATSE) at NASA's Marshall Space Flight Center.
Even when he was gravely ill, van Paradijs continued to work with his colleagues on the relation of gamma-ray bursts and their afterglows. In all, he published more than 300 scientific papers, edited at least eight books, and continued working and writing through his nearly year-long battle with cancer, and in his final weeks, he wrote a "Perspectives" article in October in the journal \IScience\i on possible links between gamma ray bursts and supernovae.
See also \JMagnetic quakes shake neutron stars\j, May 1998 and \JMore magnetar news\j, September 1998.
#
"December, 1999 Science Review",1302,0,0,0
\JPromising drug for leishmaniasis\j
\JCoffee and cancer\j
\JHope for fen-phen users\j
\J13 million AIDS orphans expected by end 2000\j
\JAIDS in Africa\j
\JAIDS in Africa 2\j
\JCondoms and HIV\j
\JHow well do women know their contraception?\j
\JNeedle injuries and health workers\j
\JCorneal rings for nearsighted patients\j
\JGenetic 'extremism' overstates the risks\j
\JA new target antigen for prostate cancer\j
\JHuman chromosome 22 sequenced\j
\JThe first complete DNA sequence of plant chromosomes\j
\JThe minimal gene set\j
\JDraft guidelines for stem cell research\j
\JConan the bacterium\j
\JPatent on key PCR enzyme ruled invalid\j
\JInhibiting telomerase to kill cancer cells\j
\JStuttering may be genetic\j
\JMale brains really are different\j
\JAn ancient ocean on Mars\j
\JPlanet light\j
\JA record magnet\j
\JLessons for teachers\j
\JDoctors' peculiar names\j
\JTop tens\j
\JBest and worst innovations of the 20th century\j
\JTokaimura in retrospect\j
\JAn Australopithecine hand\j
\JMayfly fossil barometers\j
\JLife at Vostok\j
\JRethinking the San Andreas fault\j
\JEl Niño triggers tropical forest reproduction\j
\JChina's crop production affected by haze\j
\JThe world's stressed freshwater supply\j
\JThe Arctic Oscillation and northern climate\j
\JEl Niño's dramatic impact on ocean biology\j
#
"Promising drug for leishmaniasis",1303,0,0,0
(Dec '99)
A report in the \INew England Journal of Medicine\i in December identified an interesting new treatment for a most unpleasant parasite-caused disease, \Jleishmaniasis\j. This tropical disease strikes roughly half a million people each year, and attacks bone marrow, which makes blood and immune cells, causing symptoms such as fever, liver and spleen swelling, killing about one in nine of its victims in developing countries. There are 21 known species of the parasite, found in 88 countries around the world.
The disease can be treated with a derivative of the heavy metal antimony (pentavalent antimony), but the parasite is now becoming resistant to this drug. Another drug, amphotericin, is effective against the parasite, but it has serious side effects, and while there is a less toxic version, it is too expensive for widespread use in developing countries. To add to the problems, neither drug can stand up to our digestive juices, so both must be injected.
A team led by T. K. Jha of the Kala-azar Research Center in Muzaffarpur, India, has been testing miltefosine in a phase 2 study - an evaluation to establish the most effective levels of the drug for treatment. Miltefosine is currently used to treat some breast cancers, and it had been shown to have some effect against the \ILeishmania\i parasite. According to the report, miltefosine has passed its first major clinical test with flying colors - and it can be taken as a pill. (Kala-azar is Hindi for "black sickness" or "black fever", and the disease is known by this name in India.)
The drug was given to 120 kala-azar patients for 4 to 6 weeks, in various doses, while doctors compared the parasite counts in the patients' spleens before and after treatment. Every patient was completely cleared of the parasite, and 95% remained parasite-free six months later. The side effects included vomiting, diarrhea, and at the highest doses, elevated liver enzyme levels. There will need to be further tests in other parts of the world, such as Brazil, where there are different strains of the parasite (Brazil has \ILeishmania chagasi\i instead of \ILeishmania donovani),\i but this is a most pleasing result. Epidemics are common in the Indian sub-continent and the Sudan, so even if the drug treats only these strains, it will still be a major advance.
The latest of these epidemics broke out in the 1970s and centered on north-eastern India. It was probably caused when insecticide spraying for malaria was stopped - the spraying must have been killing the phlebotomine sandflies that transmit leishmaniasis as well as a variety of mosquitoes. Humans are the reservoir hosts for this disease, which means there might even be a chance to wipe the disease out, once and for all. (In Brazil, the local parasite can also be carried by domestic dogs, making elimination more of a challenge.)
In southern Europe, a different parasite, \ILeishmania infantum,\i is the main problem, and this has shown up recently as an AIDS-associated opportunistic infection, where it causes visceral leishmaniasis - between 25% and 70% of the adults with this disease are also infected with the human immunodeficiency virus (HIV). It will be interesting to see if miltefosine works there.
#
"Coffee and cancer",1304,0,0,0
(Dec '99)
Ras genes are proto-oncogenes, which in medical language means genes with the power to promote tumors. The first oncogenes were identified in viruses which cause tumors, but they have now been joined by similar genes found in all animal cells, with more than 50 discovered so far. Unlike the viral oncogenes, cellular proto-oncogenes are not normally a problem. They code for a variety of growth factors, growth-factor receptors, and signal transduction proteins which regulate normal cell growth, and they only become a source of problems when something goes wrong.
The ras proteins act as complex switches on the cell. For most purposes, the Ras protein seems to promote extra cell division when it is "on", but if the right sorts of cells are treated with a nerve growth factor, the cells stop dividing, and turn into nerve cells. On the other hand, if cancer cells are treated with antibodies against Ras, the cancer cells seem to become temporarily normal, but other command systems in the cell break down, so one view of the Ras protein is that it acts as a turnstile in the membrane, passing command molecules through to the other side.
The ras genes, termed K-, N-, and H-ras, can be found in mutated form in a variety of human tumors, and as far back as 1982, researchers knew that ras genes isolated from human bladder carcinoma cells were subtly altered from the normal form. The change was a single-point mutation which caused a change in the twelfth amino acid of the Ras protein, and in mice, this was enough to transform cells so that they were tumorigenic: they began to form cancerous lumps.
Mutations in K-ras occur in around three-quarters of cancerous pancreas tumors, and the presence of mutations is often used to diagnose the disease, but there are other carcinomas where the K-ras gene is normal. This suggests that there may be two types of tumor, arising for different reasons. So far, no link has been found between K-ras mutations and any environmental or lifestyle factor, but that may be about to change.
A peculiar link between coffee drinking and cancer was reported in \INature\i in mid-December. It appears that some component in coffee, or something associated with drinking coffee, is linked to abnormalities of the gene K-ras in tumors from pancreatic cancer sufferers. The linkage is dose-dependent, but there is no sign of an overall association between the incidence of this type of cancer and coffee consumption - that is, heavy coffee drinkers do not seem to be more likely to develop cancers.
All the same, a report from a team led by Miquel Porta and published in the \IJournal of Epidemiology and Community Health\i points out that in people suffering from pancreatic cancer, there was a difference between the coffee-drinkers and the others, and between lower and higher levels of coffee drinking.
Those who also had mutations in K-ras drank around 14 and a half cups of coffee a week, which was significantly more coffee than those without K-ras mutations, who tended to consume nearer 9 cups. Of the 121 patients in the study, nine drank more than 21 cups of coffee a week, and all of those had mutated tumors. So while the number of tumors does not seem to be affected by coffee drinking, the nature of the tumor is changed.
This has been tagged as a preliminary finding, especially since one person's "cup" is another person's thimble-full. All the same, it does appear to point to some ingredient in the coffee, or some behavior connected to being a coffee drinker, as an influence on "the progression of precursor lesions", a term which might be roughly translated as "helping to cause cancer".
Coffee and caffeine are widely used all around the world, so the effects of caffeine as a carcinogen would need to be minor, and the standard "rule of thumb" is that a cup of coffee contains some 900 different organic compounds, any of which might be the 'villain'. Still, it is known that caffeine can affect DNA repair and interfere with other basic cellular processes such as apoptosis, or cell suicide. (See \JControlling apoptosis\j, April 1999.)
#
"Hope for fen-phen users",1305,0,0,0
(Dec '99)
The combination of diet drugs fenfluramine and phentermine was identified as a danger in July 1997, and the drug fenfluramine was withdrawn from the market in September 1997. It was the subject of public warnings about a year later (see \JDouble danger from Fen-phen\j, August 1998). Identified then as a cause of heart valve disease, fen-phen has remained banned, but the patients who used it can now rest a little easier. A report in the December issue of \IMayo Clinic Proceedings\i suggests that people with mild heart valve disease who took fen-phen may improve after they stop taking the drugs.
Data provided to the FDA in 1997 showed that up to 30% of diet-drug users may have had heart valve abnormalities even though they had no symptoms. Later studies reported smaller and widely varying estimates of how much heart disease was involved, but what was detected was primarily mild valvular disease. Usually, as the name implies, valves allow blood to flow only in one direction, but diseased valves may fail to close completely, and this can cause blood to leak backwards, making the heart work harder to maintain a standard level of flow around the body.
The Mayo study used 30 men and women who had been in fen-phen trials when the problem with the combination was discovered. The trials stopped at that point, but the volunteers, who had taken either fen-phen or a placebo for an average of 41 weeks, were asked to take part in a new study to evaluate potential valve disease associated with the diet drugs.
Most study participants underwent echocardiography, an ultrasound method of "looking" at the heart, within six weeks of stopping the trials, and in some cases, this was repeated six months later. Then three Mayo Clinic cardiologists, who knew neither the sequence of the echocardiograms nor which patients in the study took fen-phen or a placebo, reviewed the data to judge whether valve disease was present and whether one of the echoes clearly showed less evidence of the disease.
The result was agreement among the three reviewers in identifying drug-related valve disease, but the best news was in what they found: 26% of the fen-phen group had mild aortic valve disease, which appeared to improve after the medication was stopped. The study's cautious conclusion was that this represented good news for patients, but it recommended that "any patient exposed to fenfluramine or dexfenfluramine receive a complete physical exam by their local physician and an echocardiogram, if there are signs or symptoms of cardiovascular disease or if the physical examination is inconclusive."
#
"13 million AIDS orphans expected by end 2000",1306,0,0,0
(Dec '99)
At the end of November, the United Nations Information Center called on the world's citizens to think about the orphans left behind as people die from HIV infections. The focus was to be World AIDS Day, marked on December 1, and there was a discussion on "The Children Left Behind" as leaders in the world's fight against AIDS gathered at the UN in New York.
The UN AIDS Program warns that there will be 13 million AIDS orphans - children who have lost a mother or both parents to AIDS - by the end of 2000. More than 10 million of these orphans are under 15 years of age, and 90% are now in sub-Saharan Africa. In the next few decades, the number of AIDS orphans in Asia, Central and South America, and Eastern Europe can be expected to rise exponentially if there is no effective medical cure. (Exponential growth is the sort of growth where the number of orphans doubles with each equal period of time, the sort of growth that gives a graph that leaps upward.)
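To make the doubling arithmetic concrete, here is a minimal sketch in Python. The starting count and the number of periods are invented purely for illustration - they are not UN projections.

# A quantity that doubles once per equal period of time.
# The starting value below is a made-up example, not a UN figure.
def doubled(start, periods):
    """Size after a given number of doubling periods."""
    return start * 2 ** periods

start = 1_000_000                      # hypothetical starting number, for illustration
for period in range(6):
    print(f"after {period} periods: {doubled(start, period):,}")
# Prints 1,000,000; 2,000,000; 4,000,000; 8,000,000; 16,000,000; 32,000,000 -
# the steeply rising curve that makes an exponential graph "leap upward".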
The figures are grim: in 1998, over 2 million people in sub-Saharan Africa died of AIDS, and in some countries more than 20% of the working-age adult population was infected. In South America, there are now 1.4 million cases of AIDS, and given the rising infection rate worldwide, the number of deaths related to AIDS should increase in each of the next several years.
These facts, released by the UN for World AIDS Day, 1999, sum the situation up: 5.6 million people were infected by HIV in 1999, and 2.6 million people died of AIDS, bringing the total killed to 16.3 million, of whom 3.6 million were under 15. Around the world, there are now 33.6 million people infected, and in Africa, more women than men are currently infected. This is perhaps the main reason why more than 500,000 African babies became infected by HIV-infected mothers in 1999. (See \JSaving children from AIDS\j, July 1999, and \JHIV and breast milk\j, August 1999.)
There is some hope: prevention programs in Thailand and the Philippines have reduced infection rates, but HIV infections doubled in the former Soviet Union in the last two years, and life expectancy in Southern Africa will drop from 59 years to 45 years between 2005 and 2010 because of AIDS.
#
"AIDS in Africa",1307,0,0,0
(Dec '99)
One of the problems with fighting HIV/AIDS in Africa has been a lack of awareness of the disease, arising out of an unwillingness to discuss the issues openly. This all changed during November, when a series of soccer games in Nairobi was harnessed to serve the needs of public health. The games, part of the Confederation of East and Central African Football Associations (CECAFA) Youth Championship soccer tournament, ended when Kenya beat Uganda, two goals to one, in the final, but the real winners were the fans, who were exposed to a high-intensity awareness campaign.
Once mentioned only in whispers, AIDS is now becoming a common topic of conversation, according to a December news story released by the Johns Hopkins Center for Communication Programs, in time for World AIDS Day on December 1. When the best defense against a disease is preventing infection, public awareness becomes an essential part of the medical campaign, and chants of "Ukimwi", the previously unspoken Swahili word for AIDS, were heard throughout the stadium and in the streets of Nairobi. In the same way, the rarely mentioned condom became commonplace as trained counselors handed out 120,000 of them, stressing that condoms are the best prevention against acquiring AIDS during sex (see \JCondoms and HIV\j, December 1999).
Throughout the soccer games, fans observed one-minute silences for AIDS victims, audiences were subjected to a barrage of posters, calendars, soccer tickets with AIDS messages, information brochures and giant banners in and around the stadium, and telephone hotlines were available for people wanting more information.
The HIV cycle can be broken if young people are educated about the virus, and the organizers argued that the best way to do this would be at a major sporting event with messages that appeal to young people. Using the campaign slogan "Break the Silence: Let's Talk About AIDS", the event involved 15 games between players under 20 from Kenya, Ethiopia, Sudan, Uganda, Rwanda, and Eritrea.
\BKey names:\b Dr. David Awasum; Web site: http://www.jhuccp.org
#
"AIDS in Africa 2",1308,0,0,0
(Dec '99)
The view of AIDS/HIV in the western world is changing as people begin to see the condition as a chronic disease that can be treated for years with new drug combinations, rather than as the death sentence it once was, even in the developed world. In Africa, though, there is no such calm, perhaps because 67% of the world's HIV/AIDS cases are in Sub-Saharan Africa. And as we leave the 1900s behind, AIDS and HIV are raging across the continent, in part because of the failure of educational efforts to help halt the spread of the disease.
That, at least, is the view of a group of researchers who recently published a report in the \IInternational Quarterly of Community Health Education.\i They chose for their study three Kenyan towns along the Trans-Africa Highway, where truck stops on the highway have long been the location of commercial sex workers, who have numerous partners among those who work in the long-distance hauling industry.
They point out that their use of the term "commercial sex workers" as opposed to "prostitutes" is quite deliberate, since there is a cultural distinction involved. Husbands of the sex workers consider their wives' work as just another job, without the moral stigmas attached in most western societies. But stigma or not, the effects are just as deadly, and a combination of multiple sex partners and improper and infrequent use of condoms has transformed these truck stops into high-risk areas where the chances of contracting HIV and other sexually transmitted diseases are high.
Despite numerous HIV/AIDS prevention campaigns in Kenya, the disease continues to spread rapidly. In 1993, some 760,000 Kenyans were estimated to be infected with HIV. Just two years later, nearly one million were believed to be infected, even though large-scale campaigns to fight the spread of the deadly disease were under way by then.
Using discussion groups which included commercial sex workers, truck drivers and young men who live and work at the truck stops, all aged between 17 and 57, the researchers began by asking about the most significant health threats in the area. While some people nominated HIV/AIDS, many were more concerned with malaria, skin problems or even Ebola disease. The researchers found a potent mix of knowledge, misinformation and suspicion that was seriously harming public health efforts to slow the spread of the disease, and they also learned that 83% of those in the groups thought it possible or probable that they themselves had the virus, but few of them had sought tests to find out.
While people generally knew how HIV is transmitted, a few thought that sharing utensils or toothbrushes, or even mosquito bites, could cause the disease, and some even thought that condoms were in some way responsible. A further problem is the developing belief that "AIDS" is an acronym for "American Idea of Discouraging Sex," which some used to dismiss AIDS as a threat.
So while awareness is important, reliable information is even more important, and practical solutions more important still. Motivational campaigns to change behaviors will do little good in the area, the researchers say, unless they are backed by practical measures: condoms need to be made available at a variety of locations, 24 hours a day, at affordable prices, and in a tropical country, condom storage must be improved to prevent breakage due to heat and moisture damage, a serious problem in the area's climate.
\BKey names:\b Kenzie Cameron, Kim Witte, Maria Knight Lapinski, and Solomon Nzyuko.
#
"Condoms and HIV",1309,0,0,0
(Dec '99)
Are condoms really the answer to HIV? Not according to a report in the November/December 1999 issue of \IFamily Planning Perspectives,\i which falls well short of supporting the standard claims that condoms may be 99% effective in preventing HIV transmission. The study, carried out by the University of Texas Medical Branch, shows that condoms, if used properly and consistently, reduce the chance of contracting HIV during vaginal intercourse by 87%.
The study is based on a comprehensive analysis of 12 previously published studies of couples in which one person was HIV-positive and the other person was HIV-negative. Those studies followed a total of 504 couples who never used condoms and 306 couples who always used condoms to see if and when the HIV-negative partner became HIV-positive.
In their analysis, they omitted the "sometimes users" group, who can introduce confusing results, and concluded that consistent condom use reduced the rate of HIV transmission from about 6.7 cases to 0.9 cases per 100 person-years, which translates to an 87% improvement in protection, a close parallel to the 90% effectiveness of condoms as a means of contraception.
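The 87% figure follows directly from the two transmission rates just quoted; a minimal Python check of the arithmetic:

# Transmission rates per 100 person-years, from the studies described above.
never_users = 6.7     # couples who never used condoms
always_users = 0.9    # couples who always used condoms

reduction = 1 - always_users / never_users
print(f"risk reduction: {reduction:.0%}")   # prints "risk reduction: 87%"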
\BKey names:\b Susan Weller and Karen Davis.
#
"How well do women know their contraception?",1310,0,0,0
(Dec '99)
As a medium like the Internet develops, and as ideas about public health awareness become more sophisticated, we can expect to see more developments like the interactive questionnaire \IBirth Control and You: Test Your Contraceptive IQ,\i online at http://www.arhp.org/iq
This site, prepared by the Association of Reproductive Health Professionals (ARHP), provides an excellent way of targeting those points at which users have fallen for various urban myths relating to contraception. The information also helps people to decide which form of contraception is right for them.
While the site provides "quiz" questions in standard formats, the main feature is the instant feedback that respondents get, whether they are right or wrong, plus a continuing tally of right answers. The information appears to be appropriate, understandable, honest and complete, a good model for other sites in the future, dealing with other important health and social issues, or almost any other form of public education and awareness.
#
"Needle injuries and health workers",1311,0,0,0
(Dec '99)
One of the major problems with blood-borne diseases, such as HIV and hepatitis B and C, is that healthcare workers can be injured by needles and other sharp medical implements (known generally as sharps). In the United States, there are between 800,000 and 1 million needle-stick events each year, according to a report in the December 1999 issue of \IInfection Control and Hospital Epidemiology.\i
In any major health catastrophe, one of the most important objectives must be to preserve the lives and health of workers who represent a huge community investment of time, money and talent. Ongoing small losses of healthcare workers or their time are just as costly, both to the workers and to the whole community, which is denied their services when they are injured. For healthcare workers, the risk of contracting HIV through a contaminated needle has been estimated at 0.3 to 0.4%; hepatitis B at 10 to 35%; and hepatitis C at 1.2 to 10%.
The report indicates that introducing a comprehensive safety and education program at a community hospital led to a significant and sustained decrease in the overall injury rate of the staff involved. Training is important, but just as important was a visible emphasis on safety, because "... healthcare workers are more likely to adopt safe work practices if they perceive a strong organizational commitment to safety," according to the lead author, Robyn R. M. Gershon.
The injury data were collected from 1990 to 1998 at a mid-sized, acute-care community hospital. An Anti-Needlestick and Sharps Injuries Task Force had been set up in 1991 to develop and implement "a comprehensive intervention program". This included the introduction of a needleless intravenous catheter and a new comprehensive sharps disposal system, but it also involved extensive safety training for all hospital employees, and staff suffering sharps injuries had to fill out a detailed exposure questionnaire as soon as possible following an incident.
Overall, the rate of sharps injuries dropped 70% from 1990 to 1998. There were 633 injuries for an average population of 2,300 employees, and injuries were most frequently reported by registered nurses and licensed practical nurses, followed by technicians and support staff.
#
"Corneal rings for nearsighted patients",1312,0,0,0
(Dec '99)
If you have trouble seeing clearly, you can wear spectacles or monocles, you can put a contact lens on your eye, or you can undergo the now-popular LASIK (laser assisted in situ keratomileusis) procedure to improve your sight. LASIK, though, is recommended for people with more severe nearsightedness, and is a bit extreme for mild cases. Now there is a new solution for people with mild nearsightedness: a small plastic implant called a corneal ring.
The ring can be inserted during a half-hour outpatient procedure, in which the surgeon makes a diamond-shaped incision in the periphery of the cornea - the clear front part of the eye which refracts light - pushes aside the tissue, and implants the ring two-thirds deep. The position of the implant flattens the central curvature of the cornea, and this corrects the nearsightedness. (People usually assume that the lens of the eye is what focuses light, but most of the refraction actually comes from the cornea, with the lens managing the fine-tuning of our vision.)
Where LASIK is irreversible, the corneal rings are removable, so if the patient's sight changes as they get older, the implants can be taken out, and the corneal tissue is elastic enough that it returns to its original shape.
#
"Genetic æextremismÆ overstates the risks",1313,0,0,0
(Dec '99)
The level of vandalism and protests against the use of genetic engineering in forestry is increasing fast. One of the features of genetics in 1999 has been the way in which leading scientists have begun to see the need to speak out against the scare tactics used by environmental extremists. Medical scientists have been leading the drive to bring the debate around to a more careful analysis of the issues based upon science, but now forestry researchers have also spoken out against the methods being used by the scare-mongers.
In the past, there has been an expectation that scientists will stay quietly in their laboratories, get on with their work, stay out of the political arena and leave others to decide on questions of policy. While this has been little better than a polite fiction, it has been accepted by most scientists, so the protesters have been able to seize the initiative in depicting genetic engineering as monster-making. Now, however, the protesters are confronted with a monster of their own: reputable scientists who see the protests as a threat to human welfare, and who are willing to speak out about what they see as right.
The medical researchers are concerned that any limits placed on genetic manipulation will deny the sick and the dying the cures they need, but the forestry researchers are more concerned at getting the same level of production from forests with less ecological damage. The protesters talk of people being made ill or dying from the effects of genetically manipulated organisms, and claim that the GM organisms might cause ecological disasters. They make these claims, however, without offering even a scrap of evidence to suggest that their fears are based on fact.
It is important to recognize the possibility of risks, say the scientists, and it is sensible to take precautions to eliminate those risks, or to reduce them as far as possible, but when the risks become vanishingly small, it is immoral to avoid applying those methods which have so much to offer humanity.
The safe and careful application of biotechnology to forestry, say the forestry researchers, holds the potential to produce trees that grow faster, reduce the burdens placed on native forests, lessen the use of some chemicals, and help meet the increasing demands for wood pulp, building materials and renewable energy. Their commentary, a statement that was ratified by 99% of the voters at a recent meeting of the International Union of Forestry Research Organizations, was published in the December issue of \INature Biotechnology.\i
They recognize that the new technology raises some concerns that should be dealt with, both scientific and ethical, but there is overwhelming consensus among scientists that biotechnology can be used safely and should move forward, according to Steven Strauss, a professor of forest science at Oregon State University and co-author of the paper. He adds that the concerns are not major problems, and both the scientific community and industries have been working to address them for a number of years.
Strauss contrasts this with the claims of some environmental groups that genetic engineering, in forestry and elsewhere, is rife with unknowable risks and looming ecological catastrophe. He refers to the World Wide Fund for Nature which, in November, called for governments around the world to enact a moratorium on the commercial use of genetic engineering in forestry, and to an attack on laboratories in Washington state doing gene research on trees, dubbed "Frankentrees" by the vandals.
"Some opponents of this science, based on little or no scientific evidence or knowledge, have elevated gene research to Frankenstein proportions," Strauss said. "They suggest the sky is falling and do their best to scare and upset people. What we have to do in this debate is separate real risks from pseudo-risks and do the scientific research which can clearly move us forward in a safe, progressive manner."
Strauss notes that the comparatively minor genetic tinkering done with existing trees does not even approach the concerns raised by invading species of whole organisms, such as the Dutch Elm Disease that ravaged forests across America when brought over from Europe. "The movement of single genes, or small groups of genes, is vastly less risky than the movement of exotic organisms, where tens of thousands of highly co-adapted genes are placed into new environments where they can further evolve and take over major ecological niches," he said.
The view of the scientists is that field trials are a critical part of the research needed to establish the value and safety of the technology, and can be conducted responsibly. This is in marked contrast to the suggestion from the World Wide Fund for Nature that field trials of genetically engineered trees are inherently dangerous and should be banned. Strauss argues that the opponents of GM "do not document a single release into the environment or a problem that has resulted anywhere in the world." He also makes the telling point that just about all field trials of trees are destroyed before they ever flower and removed from the test site once the study is completed.
The full text of the position statement by Strauss and other scientists can be found on the Web at www.fsl.orst.edu/tgerc/iufro_pos-statm.htm
The key points of their statement include the fact that world fiber consumption went up 50% in the developed world and 300% in developing nations between 1970 and 1994. One of the best ways to meet this demand is plantation forestry, where trees are grown much like agricultural crops in intensive, short-rotation plantations, and genetic engineering is used to create trees that optimize the yield and viability of those plantations. In the foreseeable future, it is only in these areas of intensive growth that genetic engineering will be used. There is no plan to scatter modified genes across the wilderness.
The changes would be directed at faster growth, resistance to insects or benign herbicides, and controlled flowering to avoid releases of the new genes into the environment via pollen and seeds. Because manipulation involves inserting specific, intensively studied genes, it means that most of the risks can be carefully studied, isolated and controlled so they can be anticipated in advance and reasonable decisions can be made about commercial application. If overly precautionary restrictions are thrust upon the industry, this would preclude most of the research needed to assess safety and benefits.
The statement also points out that the risks from genetically modified trees should be compared to known risks from conventional forestry techniques. Genetic engineering just uses a different branch of science, and is not inherently any more risky than the work that has been done in conventional plant breeding for hundreds of years, they say, adding that when breeders attempt to introduce new traits via breeding, or attempt to develop a crop for a new environment, there are many bumps along the road as the varieties and technology are refined. This, they assert, is very similar to the early stages of genetic engineering.
#
"A new target antigen for prostate cancer",1314,0,0,0
(Dec '99)
A report in the \IProceedings of the National Academy of Sciences\i in early December describes a new gene that may lead to better ways of attacking prostate cancer. The report describes a novel cell-surface antigen called STEAP (short for Six-Transmembrane Epithelial Antigen of the Prostate), which is expressed at high levels in prostate cancers but is rarely produced in non-prostate tissues. Therapies targeting the STEAP molecule would thus target cells in a tumor of the prostate - and more importantly, target the cells in metastatic tumors, the small colonies that break away from the area of the prostate and settle in other areas, while still retaining the giveaway biochemistry of the original tumor.
This sort of opportunity is the dream of every cancer researcher, since this molecule has the potential to make the tumor cells specific targets, with minimal risk of collateral damage in other tissues. In a sense, having a molecule like this associated with the cancerous cells gives the sort of advantage police would have if all burglars had to wear masks, berets and striped shirts, and carry a bag marked 'swag'.
STEAP is also expressed in androgen-independent tumors, which means it offers an avenue for treatment of cancers that have become resistant to conventional hormone ablation therapies, where the cancers are treated with hormones to reduce their activity. And because the molecule is on the cell's surface, this makes the molecule suitable for a wide variety of therapeutic approaches, including antibodies, small molecules and vaccines.
The research took place in an in-house discovery program run by the pharmaceutical company UroGenesys. They describe their work as a "disease-focused genomics program" which identified the STEAP gene by comparing RNA sequences isolated from patient-derived specimens of prostate cancer with sequences isolated from benign prostatic tissues. Then, once they had successfully cloned STEAP, the UroGenesys team looked for its presence in a wide variety of normal and diseased tissues. Studies showed that "expression of the protein was restricted predominantly to the prostate among normal body tissues". STEAP appears to act as an ion channel or transport protein, moving material into and/or out of cells.
UroGenesys uses proprietary xenograft mouse models that mimic the progression of advanced prostate cancer in humans, including the development of metastatic and hormone refractory disease. When they implant patient-derived tumor samples into immune-deficient mice, the researchers can also generate a renewable supply of primary and metastatic tissues that are otherwise difficult to obtain. So far, the method has revealed some 30 additional new antigens implicated in prostate cancer.
#
"Human chromosome 22 sequenced",1315,0,0,0
(Dec '99)
A landmark in the Human Genome Project was reached in early December, when scientists announced that they had sequenced the euchromatic portion of a human chromosome. The story broke first in \INature\i on December 2, and shortly after, appeared in the mainstream news channels, with the result that the \INature\i Web site was impossible to access for the best part of a week.
It is a small start, as the chromosome in question is the second-tiniest in the human genome, and technical difficulties have prevented the mapping of small portions of the chromosome, but they still claim the sequence as "... the largest contiguous segment of DNA sequence." The chromosome carries genes which, when defective, seem to be related to schizophrenia, and a number of other disorders.
Even the tiny chromosome 22 contains an awesome 33.4 million base pairs, out of something like three billion base pairs in the entire human genome. Some of these base pairs make up genes: instructions for the assembly of the proteins that make us and control all of our operations. The problem then is to work out which parts are instructions, and which parts are just packaging, the so-called "junk DNA". (This term should be treated with caution, as some geneticists are prepared to speculate that there may be some important role for "junk DNA", a role we still have to discover, but nobody is prepared even to speculate about what that role may be.)
There are a number of ways scientists can identify genes. They can search for sequences of genes already known from other organisms, or they can use computers to find stretches of bases that might potentially encode proteins. Right now, the programs are not particularly good, and the protein candidates have to be tested in the laboratory, but the approach can at least help to narrow down the search. Ian Dunham and his team at the Sanger Center believe they have spotted 545 genes, and Dunham believes that these represent most of the genes present in the chromosome. One of the identified genes is almost exactly the same in humans as it is in the yeast genome, suggesting that it must be an important gene indeed, if it has been maintained so closely through the time since the yeast line separated from the human line of descent.
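Gene-hunting programs of the sort mentioned above generally begin by scanning for open reading frames: runs of bases that begin with a start codon (ATG) and reach a stop codon in the same reading frame. The Python sketch below illustrates only that basic idea - it is not the Sanger Center's software, and it ignores real complications such as introns and the reverse DNA strand.

# Bare-bones open-reading-frame scan: report runs that begin with ATG and
# end at a stop codon (TAA, TAG or TGA) in the same reading frame.
STOP = {"TAA", "TAG", "TGA"}

def find_orfs(dna, min_codons=10):
    orfs = []
    for frame in range(3):                      # the three forward reading frames
        start = None
        for i in range(frame, len(dna) - 2, 3):
            codon = dna[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                       # a possible gene start
            elif codon in STOP and start is not None:
                if (i - start) // 3 >= min_codons:
                    orfs.append(dna[start:i + 3])
                start = None
    return orfs

# A short made-up sequence with one ORF hidden in the third reading frame.
print(find_orfs("CCATGAAACCCGGGTTTAAACCCGGGAAACCCGGGTTTTAGCC", min_codons=3))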
On current trends, the whole of the human genome appears likely to be complete by the end of 2002, after the completion of a draft form of the complete human genomic sequence, some time in 2000. Once that is ready, the real challenge will begin, working out just what all those proteins really do in the human body. The 21st century will see biology transformed by these developments - everybody is agreed on that. But \Ihow\i it will be transformed, and \Iwhat\i it will lead to, we cannot even begin to guess.
In fact, right now, we do not even know how the huge masses of information will be handled. We look at today's trickle of data and may feel comfortable, but the raging torrents will be upon us within just a few years.
One fascinating area for study should be the single nucleotide polymorphisms or SNPs which uniquely identify us. These are the small variations between genes that may occur once every thousand base pairs or so, but even a low rate like that would still mean there are about 3 million SNPs. These are now being listed in a related project involving 10 of the world's leading pharmaceutical companies, which aims to produce a freely available register of the most common SNPs within a couple of years.
The current theory is that there may be, say, five SNPs which cause a disease like diabetes, which tends to be inherited. In the language of genetics, we say that people can have a genetic predisposition to develop the disease, and what this means is that if you have the right genes, and if you encounter the right trigger, you stand a very good chance of developing such a disease.
Somebody with four of the five "diabetic" SNPs would be fine, but somebody who inherited all five would be at serious risk. In the future, we may be able to identify such people long before they develop the disease, acting early to restrict damage. In other words, medicine will go from treatment to prevention by helping people to avoid the triggers in the first place.
Two groups are working on the human genome. One is the publicly funded Human Genome Project (HGP), which contributed the chromosome 22 solution, combining teams from all over the world. The second group, Celera Genomics Systems, is a private company in America which only began work on the human genome in September 1999, the same month it completed the fruit-fly genome. Celera is a joint venture between The Institute for Genomic Research (TIGR) and the Connecticut-based Applied Biosciences Division of the biotechnology giant, Perkin-Elmer.
The HGP makes its results available on the Internet, updating the information every 24 hours, and uses a bank of 200 sequencing machines which solve small blocks of 500 bases. (A good starting point for Web-based HGP information is http://www.nhgri.nih.gov/Other_resources/science.html, or more simply, from http://www.nhgri.nih.gov/)
The feed material for the sequencing machines is tiny fragments of human DNA which have been mass-produced by bacteria, engineered to carry 'bacterial artificial chromosomes' or BACs. Once the HGP has a set of some 30,000 BACs, these are arranged into a map, often by looking for overlaps at the ends of different fragments.
Perkin-Elmer holds the license to the basic robotic sequencing technology, and Celera uses a different method to produce information which it then sells. Celera's "whole genome shotgun" method was pioneered by Craig Venter, a scientist turned entrepreneur, who was in the news in December 1999 when he announced his intention of "making a bacterium" (see \JThe minimal gene set\j). Venter has already used the shotgun method to sequence the genomes of the bacterium \IHaemophilus influenzae\i and the fruit fly.
The shotgun method costs 90% less, according to Venter, but it is a "quick and dirty" method, relying on shattering the genome, reading all of the fragments, and reassembling them simultaneously in one of the world's largest supercomputers. The method works better on small genomes, but by piggy-backing on HGP data, Venter can reassemble his scraps into a complete structure. The shotgun method also gets into trouble when it encounters repeated DNA, and the human genome, unlike those of smaller organisms, is packed with repeats.
Even though the repeats are usually "uninteresting", they need to be handled correctly to avoid ambiguities in the final solution to the genome. And while the advocates of the shotgun method see the HGP "clone by clone" method as plodding, the chromosome 22 solution shows that it is capable of working well over long distances.
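A toy version of the reassembly step may make the trade-off clearer. The greedy merge below, a drastic simplification in Python rather than Celera's actual assembler, repeatedly joins the two fragments with the longest end-overlap; on a genome full of identical repeats, several different merges can score equally well, which is exactly where the shotgun method gets into trouble.

# Toy "shotgun" reassembly: greedily merge the pair of fragments whose
# suffix-prefix overlap is longest, until a single sequence remains.
def overlap(a, b):
    """Length of the longest suffix of a that is also a prefix of b."""
    for size in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:size]):
            return size
    return 0

def greedy_assemble(fragments):
    frags = list(fragments)
    while len(frags) > 1:
        size, i, j = max((overlap(a, b), i, j)
                         for i, a in enumerate(frags)
                         for j, b in enumerate(frags) if i != j)
        merged = frags[i] + frags[j][size:]     # join the best-overlapping pair
        frags = [f for k, f in enumerate(frags) if k not in (i, j)]
        frags.append(merged)
    return frags[0]

# Three invented overlapping "reads" reassemble into one sequence.
print(greedy_assemble(["ATGGCGT", "GCGTACG", "TACGTTA"]))  # ATGGCGTACGTTA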
#
"The first complete DNA sequence of plant chromosomes",1316,0,0,0
(Dec '99)
The first complete genome to be published for any organism more complicated than a virus was that of the bacterium \IHaemophilus influenzae,\i published as recently as 1995. Since that time, the flood gates have opened - at the moment, there are rather more than 60 \BScience review\b articles which mention the term "genome". One of these, \JFirst complete physical map of a higher plant genome\j, July 1999, reported on the mapping of the genome of a small plant, \IArabidopsis thaliana,\i but now December has brought us the sequences of chromosomes 2 and 4 of \IArabidopsis,\i which were published in \INature\i in the middle of the month.
Chromosome 2 was reported by a team led by Craig Venter of TIGR, while chromosome 4 comes from Michael Bevan and colleagues from the John Innes Institute in Britain. In all, the researchers have sequenced 37 million base pairs (37 megabases, or Mb) in the plant, out of an estimated total of 120 Mb - for comparison, humans have a genome of about 3,000 Mb. Chromosomes 2 and 4 house 7,781 genes that contain instructions for making proteins, suggesting that overall, \IArabidopsis\i has around 25,000 genes on its five paired chromosomes.
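The figure of around 25,000 genes is a simple extrapolation, assuming genes occur at roughly the same density across the whole genome; a minimal Python check of that scaling:

# Scaling the gene count on chromosomes 2 and 4 up to the whole genome.
genes_found = 7781     # protein-coding genes on chromosomes 2 and 4
mb_sequenced = 37      # megabases sequenced so far
genome_mb = 120        # estimated total genome size in megabases

estimate = genes_found * genome_mb / mb_sequenced
print(f"estimated total genes: {estimate:,.0f}")   # about 25,000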
By way of comparison, the completely sequenced \ICaenorhabditis elegans\i (a roundworm) and \IDrosophila melanogaster\i (a fruit fly) each have about 20,000 genes. What is more, the genes are likely to contain a few surprises: of the genes identified so far, about 60% have been "solved" by comparing them with genes of known function from other animals and plants, but that still leaves 40% of the plant's genes which have no known correspondence with anything so far discovered - and that means there are some interesting discoveries waiting to be made by the end of 2000, when the sequencing is likely to be complete.
One direction will certainly be worth following. As the abstract in \INature\i notes, "More unexpected is what appears to be a recent insertion of a continuous stretch of 75% of the mitochondrial genome into chromosome 2". On top of that, a very large stretch of DNA (4.6 million base pairs) is duplicated on chromosomes 2 and 4. This duplication represents approximately one-quarter of the total length of each of these chromosomes.
A \IHuman Genome\i newsletter has been available on the Web since 1989, which covers a wide range of genomic issues, both for the human genome and for the genomes of other species, although in somewhat technical language. It is located at http://www.ornl.gov/hgmis/publicat/hgn/hgn.html and includes an archive of all previous issues.
#
"The minimal gene set",1317,0,0,0
(Dec '99)
How many protein-producing genes does a single-celled organism need in order to survive and reproduce? The answer, offered in \IScience\i in early December, is somewhere between 265 and 350. That is the estimate of Clyde A. Hutchison III, who used a technique known as global transposon mutagenesis to reach his answer. Hutchison and colleagues at The Institute for Genomic Research (TIGR) found that about a third of the genes in \IMycoplasma genitalium\i were unnecessary for the bacterium's survival.
\IMycoplasma genitalium\i causes gonorrhea-like symptoms in humans. It was an excellent choice for this sort of study, because it contains only 517 cellular genes, the smallest number known in any single-celled organism.
The experimental technique uses a form of elimination, by randomly inserting bits of unrelated DNA into the middle of genes. This disrupts their normal function, so if the organism continues to thrive, the disrupted gene was not an essential one. TIGR's Craig Venter took the world stage to announce his plans to "build a bacterium", but that is still quite a long way off. More importantly, this sort of research may one day be a significant step forward in creating minimal, tailor-made life forms that could be further altered for such purposes as making biologically active agents for treating illness.
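The elimination logic described above can be mimicked in a few lines of Python. This is an idealized toy model rather than the TIGR protocol: the gene names are invented, and the set of essential genes is chosen at random purely for illustration.

import random

# Toy model of global transposon mutagenesis: disrupt each gene in turn
# and record which disruptions the organism survives.
genes = [f"gene_{n}" for n in range(517)]     # M. genitalium has 517 cellular genes
essential = set(random.sample(genes, 300))    # pretend ~300 are essential (invented)

def survives_disruption(gene):
    """The cell dies only if the transposon lands in an essential gene."""
    return gene not in essential

dispensable = [g for g in genes if survives_disruption(g)]
print(f"{len(dispensable)} genes tolerated disruption; "
      f"{len(genes) - len(dispensable)} appear essential")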
But while Venter (a co-author of Hutchison's paper) got the main headlines for his plans, most scientists around the world were inclined to scoff. So why should we not hold our breath, waiting for the first laboratory-made bacterium? While this sort of thing can be done for a virus, which is just nucleic acid wrapped in a simple protein coat, "making" a bacterium will be a much greater challenge.
Basically, we still do not know all that we need to know, and creating a bacterium "from scratch" will require quite a lot more understanding of, for example, how the cell wall of a bacterium is constructed. Even within the target organism, the minimal set of genes includes about a hundred whose function is not yet understood - hence the scoffing at Venter's claims. But the scoffers may yet be embarrassed, as Venter has a fairly good record of getting things right.
If he does succeed, or if scientists merely succeed in creating stripped-down organisms, these will have major commercial, social and ethical implications. An editorial in the same issue of \IScience\i by Dr. Mildred K. Cho of the Stanford University Center for Biomedical Ethics commented that "The prospect of constructing minimal and new genomes does not violate any fundamental moral precepts or boundaries, but does raise questions that are essential to consider before the technology advances further."
Cho went on to comment that "The temptation to demonize this fundamental research may be irresistible", and after Venter's announcement of his plan for an artificial bacterium, there was a brief flurry of alarm in the media, but it died away, at least for now. Informal estimates on the Internet for any likely success in this area range from 2010 to 2025, but as one observer commented, both of these dates are well within Venter's life expectancy.
#
"Draft guidelines for stem cell research",1318,0,0,0
(Dec '99)
Human pluripotent stem cells have come a long way since they were first successfully isolated and cultured just over a year ago (see \JA breakthrough with human embryonic stem cells\j, November 1998). Now the US National Institutes of Health (NIH) has published its draft guidelines for research involving human pluripotent stem cells.
These cells have the ability to develop into most of the specialized cells or tissues in the human body, and they can divide for indefinite periods in culture. This means that a single successful culture can supply the needs of many researchers. There are a number of ethical issues related to this research, and the NIH guidelines are directed at ensuring equity and justice for all - so far as this can be planned in advance of new developments.
Future uses of these cells include generating cells and tissues for transplantation, improving our understanding of the complex events that occur during human and other embryonic development, and testing new drugs before they are tried on laboratory animals and human subjects.
There are two ways of getting human pluripotent stem cells. In one method, they can be derived from early-stage human embryos in excess of clinical need and donated by people who were undergoing infertility treatment in an in vitro fertilization (IVF) clinic - the ethical problems here are fairly obvious. The second method derives the pluripotent stem cells from human fetal tissue obtained from pregnancies that had been terminated, and once again, the ethical issues that will arise are fairly straightforward.
In both cases, the donation is made with the informed consent of an adult involved, but in the United States right now, federal law prohibits the Department of Health and Human Services (DHHS) from funding research in which human embryos are created for research purposes or are destroyed, discarded or subjected to greater than minimal risk, and the work so far has been funded by private sources. It was this consideration which led the NIH to seek a legal opinion from the DHHS Office of the General Counsel on whether NIH funds may be used for research utilizing human pluripotent stem cells.
The DHHS has concluded that human pluripotent stem cells derived from human embryos are not embryos, and so are not affected by the Congressional prohibition. The human pluripotent stem cells derived from fetal tissue, however, would fall within the legal definition of human fetal tissue and are, therefore, subject to federal restrictions on the use of such tissue, and so the NIH could not fund research based on such cells. The legal opinion also clarified that NIH funding for research to derive or utilize human pluripotent stem cells from fetal tissue is permissible, subject to applicable law and regulation.
With that in mind, the NIH has now posted a set of draft guidelines for research in this area, and these were available on the Internet at http://www.nih.gov/news/stemcell/draftguidelines.htm during December 1999 and January 2000, while in the longer term, future developments may be expected to be located somewhere at http://www.nih.gov
The draft guidelines identify areas of research involving human pluripotent stem cells that are ineligible for NIH funding, including studies in which stem cells are used to create or contribute to a human embryo; are combined with an animal embryo; are used for reproductive cloning of a human; are derived using somatic cell nuclear transfer into a human or animal egg; or are derived from human embryos created for research purposes.
There are no easy answers in work like this, where stem cell research could easily lead to cures for conditions such as Parkinson's disease, diabetes, heart disease, multiple sclerosis, burns, and spinal cord injuries. Much of the argument will be emotional, whether it is religious argument against the research, or from the families of the ill, the suffering and the dying, arguing for research to go ahead without delay.
In the end, there seems to be only a small place for science in such a debate, and yet the funding restriction really amounts to no more than a splitting of hairs, for such research can still be carried out by privately funded workers - and if there are to be any problematical uses of such cells, they are most likely to be those where there is a profit to be made, exactly the sort of work that \Iwould\i be funded by private sources.
#
"Conan the bacterium",1319,0,0,0
(Dec '99)
The strange pink bacterium \IDeinococcus radiodurans,\i in the news last month (see \JDeinococcus radiodurans genome sequenced\j, November 1999), was a popular topic again in December. For starters, NASA scientists see this bacterium, which smells of rotten cabbage, as an ideal way to provide important drugs to isolated astronauts traveling to Mars or, in the longer term, to help in restructuring Mars for human habitation. Unlike most other known life forms, our diminutive Conan would do rather well under Martian conditions.
Robert Richmond and his colleagues, R. Sridhar and Michael J. Daly, have dubbed the bacterium a polyextremophile because of its recovery powers. Extremophiles like the alleged nanobacteria (or nanobes, smaller than microbes) in the Martian meteorite ALH84001 brought these unusual life forms into the spotlight, and they have been there ever since. Since the enthusiasm for "Martian microbes" (see \JLife on Martian meteorite?\j, March 1997, or search for ALH84001 in \BScience Review\b for more recent stories), there have been discoveries of probable nanobes living in such odd places as human kidney stones and in limestone 4 kilometers under the surface of the Earth, but the status of ALH84001 remains open to debate.
Most extremophiles settle in to withstand one or two extremes, like the odd ecological niches offered by the hot springs of Yellowstone in the United States. \ID. radiodurans\i can endure many extremes, including the most dangerous space hazard, radiation. "Radiation-induced DNA damage is an oxidizing type of damage," Richmond says, pointing out that chemical experiments done by the labs aboard the two Viking landers indicate that the chemistry of Mars may be highly oxidative.
Just as the common gut bacterium \IEscherichia coli\i has been engineered to produce large quantities of human insulin, so \IDeinococcus\i variants could be engineered to produce important biochemicals, they say. Most of the development work so far has been aimed at producing strains that can clean up mercury, a deadly heavy metal, and toluene, a dangerous solvent, but in the future, strains might be genetically manipulated to produce various drugs that humans might need while exploring Mars, then put on ice during the mission. About two thirds of the drugs we use are natural products, and that means that somewhere, some organism possesses the genes needed to make that drug. If somebody became ill, treatment would start with drugs from a small supply kept on hand, while the appropriately engineered bacteria are brought from storage and cultured to produce a regular supply.
One of the scientific "human interest" stories which excited the mass media during 1999 was the need to airdrop tamoxifen, a breast cancer chemotherapy agent, at the South Pole for a medical doctor who had diagnosed herself with breast cancer. This was an expensive and risky operation that could have been avoided by having suitable bacteria on hand to make the tamoxifen, or other drugs, on demand, but air drops to Mars would simply be impossible. And more importantly, if the amounts of essential drugs can be reduced, this would also reduce the weight that a spaceship would have to haul to Mars and back.
If Mars is ever to be settled, the scientists believe that engineered strains might be used to produce clean water and oxygen - and perhaps even food supplements, relying on resources the bacteria find on Mars. And if the bacteria can "live off the land", then perhaps they could be used in terraforming, the idea much beloved by science fiction writers over many years: reshaping the environment of Mars to make it more hospitable to humans. The first terraforming happened when ancient life forms converted Earth's environment from a carbon dioxide atmosphere and calcium-rich seas to the more hospitable world we have today. Now, having spoiled their home, they survive in what we consider to be extreme environments.
Mars is considered to be an extreme environment for humans right now, but with a little help, it could be made more accessible and, eventually, even attractive. But why look to this particular extremophile for help? The main reason is dormancy: the introduced life forms may need to survive for thousands or even millions of years, until conditions on Mars become hospitable for growth, like the plant seeds waiting in the desert for a rare fall of rain. And across that time, the waiting cells will need to stand up to doses of radiation that would kill fragile human cells. Even if the cells hide under a rock, rocks themselves give off radiation - over time, enough to kill normal cells.
And here, we seem to spot the secret of the resistance \ID. radiodurans\i shows to radiation. Most bacteria form spores which are quite good at preventing radiation damage to their genomes, but this bacterium does not. Some years ago, Robert G. Murray suggested that its DNA-repair system evolved to combat desiccation, and more recent experiments carried out by John Battista, among others, tend to support this notion. But there seem to be no members of this group in the Chilean desert, while others live in hot springs, where drying is unlikely to be a major risk, so the solution remains incomplete.
Another interesting feature in the news during December was the high level of redundancy in the bacterial genome, with 4 to 10 copies of the \ID. radiodurans\i genome existing in each bacterium. These backup copies increase the odds that a mutated gene will have an undamaged counterpart, and it appears that the bacterium aligns copies of its genome so that identical DNA sequences are near each other. Since bacterial chromosomes usually come in circles, this theory conjures up pictures of stacked loops of DNA, resembling a roll of hard candies (for Americans - lollies or sweets for other English speakers), and so has earned the name of the Life Saver hypothesis. (For British readers, Life Savers are shaped like Polo Mints.)
While pictures of the bacterial chromosome appear to back this notion, many scientists remain skeptical, in part because this idea does not explain why the genetic material comes in four separate circular "chromosomes", whereas most bacteria have just one thread of double-stranded DNA.
The big challenge lies in finding the genes to insert into "Conan". The toluene-degrading gene, for example, comes from \IPseudomonas.\i
\BKey names:\b Michael J. Daly, Lawrence T. Wackett, and James K. Fredrickson.
\BRelated Internet sites:\b http://www.er.doe.gov/production/ober/ober_top.html, http://www.lbl.gov/NABIR/, http://www.ornl.gov/hgmis/publicat/97santa/sequence.html and http://www.tigr.org/tdb/mdb/mdb.html
#
"Patent on key PCR enzyme ruled invalid",1320,0,0,0
(Dec '99)
A key legal decision in a federal district court in California in early December led to the Wisconsin-based laboratory supply company Promega Corporation claiming victory over the pharmaceutical giant Hoffmann-La Roche. The case related to the patent rights to a naturally occurring form of the widely used laboratory enzyme "Taq DNA polymerase". US District Judge Vaughn Walker ruled that a 1990 patent involving Taq polymerase was issued, in part, on misleading information and false claims by scientists associated with the Cetus Corporation. This is just the latest round in a protracted legal struggle which has been running, one way or another, since 1992.
The polymerase chain reaction or PCR technique (which gained Kary Mullis the 1993 Nobel Prize in Chemistry) allows researchers to replicate stretches of DNA to produce large amounts of particular genes, without having to use cloning in bacteria or other microorganisms. Promega has maintained successfully that the claims to the patent were "unenforceable", on the grounds that the patent holder had intentionally withheld material information and distorted important facts in obtaining the patent.
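To see why control of Taq matters, it helps to remember that PCR amplification is exponential: each thermal cycle can at best double the number of copies of the target sequence. The short Python sketch below is a toy illustration only - the 90% per-cycle efficiency figure is an assumption for demonstration, and real reactions eventually plateau.

    # Toy model of PCR amplification. The efficiency figure is illustrative;
    # real reactions fall short of perfect doubling and eventually plateau.
    def pcr_copies(initial_copies, cycles, efficiency=0.9):
        # Each cycle multiplies the copy count by (1 + efficiency);
        # efficiency = 1.0 would be perfect doubling.
        return initial_copies * (1 + efficiency) ** cycles

    print(f"{pcr_copies(100, 30):.2e}")  # ~2.3e+10 copies from 100 templates

Thirty cycles turning a hundred template molecules into tens of billions of copies is what makes the enzyme, and hence the patent, so commercially valuable.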
Hoffmann-La Roche had purchased the rights to Taq DNA polymerase for $300 million from the California company Cetus in 1991, and later filed suit against Promega for continuing to sell Taq without the appropriate license, issued under the patent. Promega countersued, claiming that the Taq polymerase extracted by Cetus scientists was identical to enzymes that had already been described in the scientific literature. If true, this alone would be enough to deprive anybody of the right to patent it.
One of the cited examples was a September 1976 paper in the \IJournal of Bacteriology,\i co-authored by John Trela and graduate students Alice Chien and David Edgar, with the title "Deoxyribonucleic Acid Polymerase from the Extreme Thermophile \IThermus aquaticus."\i
As a side issue, many university researchers in the United States believe that, as researchers, they are not liable to pay the patent fees if they use Taq in their research, and they continue to buy Taq from unlicensed suppliers, mainly Promega. Perkin-Elmer, which handles the marketing of the product for Roche, maintains that there is no "experimental-use exception".
A key point of the court case is that Promega also alleged that the scientists had been aware of the public domain status of the information at the time of making their patent application, in which they claimed that the enzyme they described was entirely novel. While the judgement is restricted to naturally-occurring Taq, Promega say that it sets the stage for a finding of unenforceability for all related Taq and PCR patents.
Roche intends to appeal the case to a higher court, rather unsurprisingly, since Promega is now suggesting that in addition to a loss of sales and royalties, Roche may also have to contend with "monetary damages and other sanctions".
Promega have posted some interesting expert opinions on the Web, and these can be accessed through http://www.euro.promega.com/taqlegal/experts/experts.htm. There is also useful background information on the Web at http://www.euro.promega.com/taqlegal/summary.htm. Information from Hoffmann-La Roche on the case was a little harder to locate on the Web, but they provide general background information on PCR licenses at http://www.pebio.com/ab/pcrlicensefaq/. For Perkin-Elmer technical information, go to http://www.pebio.com/search/ and enter the search string "taq polymerase", using the default settings on the search engine.
See also \JLawsuit targets Yellowstone bioprospectors\j, March 1998, or search the Web using the search string \I<"taq polymerase" and patent and Promega and Cetus and Roche>\i.
#
"Inhibiting telomerase to kill cancer cells",1321,0,0,0
(Dec '99)
An early December report in the \IProceedings of the National Academy of Sciences\i indicated that scientists have succeeded in causing the death of human cancer cells by inhibiting telomerase. This is the enzyme, much in the news in the past couple of years, which is capable of immortalizing human cells by preserving the telomeres, the "caps" on the ends of chromosomes, which normally wear away slowly, providing a sort of count-down clock which identifies the biological "age" of a cell. In cancers, this shortening of the telomeres is often blocked, making the tumor cells effectively immortal.
David Corey, Jerry Shay and collaborators at UT Southwestern Medical Center at Dallas developed small synthetic inhibitors of telomerase that, when introduced into human cancer cells, caused progressive telomere shortening and eventually cell death in human breast and prostate cancer-cell lines grown in the laboratory. The researchers also showed that if the inhibitor was withdrawn, the telomeres regained their initial lengths.
"The use of anti-telomerase synthetic inhibitors should open up a new class of therapeutic cancer drugs that will have a powerful role in cancer therapy," said Shay, professor of cell biology, in a press release. "They should prevent the recovery of residual cancer cells following conventional therapy and thus make them more susceptible to attack by the immune system or killing by existing therapeutic agents."
Telomerase has an essential ribonucleic acid (RNA) component that acts as a template for adding back telomeres to the ends of chromosomes, so Corey and colleagues designed short pieces of RNA and DNA to bind to the telomerase RNA template and block the enzyme's activity.
One of the interesting features is that the treatment seemed not to leave any cells behind. "In the experiment, which was carried out for over 100 days, no cells survived the anti-telomerase treatment. In most current cancer therapies, there are often some tumor cells that survive the initial treatment, and this can often lead to cancer relapses," Shay said. "Since no cells survived in the study, this indicates that alternative mechanisms to maintain telomere length were not easily activated."
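As a way of picturing the count-down clock idea, the toy simulation below tracks telomere length across cell divisions, with telomerase either active or blocked. Every number in it (starting length, loss per division, crisis threshold) is invented for illustration; none is a measured value from the Dallas study.

    # Illustrative count-down clock: telomere length across cell divisions.
    # All figures are invented round numbers, not measured data.
    def divisions_until_crisis(start_bp=8000, loss_bp=100, gain_bp=100,
                               inhibited=False, crisis_bp=3000):
        length, divisions = start_bp, 0
        while length > crisis_bp:
            length -= loss_bp              # telomeres erode at each division
            if not inhibited:
                length += gain_bp          # active telomerase restores the caps
            divisions += 1
            if divisions > 10_000:         # no net shortening: immortalized
                return None
        return divisions

    print(divisions_until_crisis(inhibited=False))  # None - effectively immortal
    print(divisions_until_crisis(inhibited=True))   # 50 divisions, then crisis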
Search \BScience Review\b with the key word "telomerase" for other information on this area of science.
#
"Stuttering may be genetic",1322,0,0,0
(Dec '99)
In late November, Dr. Susan Felsenfeld told the 74th annual meeting of the American Speech-Language-Hearing Association in San Francisco of her collaborative research with US and Australian colleagues on stuttering in a group of 25- to 30-year-old Australian twins. Starting with a sample of more than 1,500 sets of twins who answered questionnaires which included questions about stuttering in the individual and in the family, they identified a subset of 197 pairs of twins in whom at least one twin either stuttered or had a history of stuttering.
After follow-up telephone interviews, they found that the monozygotic (identical) twins were much more likely to both stutter than the dizygotic (non-identical) twins. Both the stuttering pattern and even the level of anxiety over stuttering seemed to be under some genetic control. It did not always follow that both monozygotic twins stuttered, but where one did and the other did not, the second twin often reported having other speech-related problems, such as articulation disorders. In short, genetics appears to play a part in determining whether somebody will ever have a stuttering problem.
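The usual arithmetic for turning such twin comparisons into a number is Falconer's rough estimate, which puts heritability at twice the gap between identical-twin and fraternal-twin concordance. The figures in the sketch below are hypothetical, since the study's actual concordance rates are not quoted here; only the method is real.

    # Falconer's rough heritability estimate from twin concordance rates:
    # h2 = 2 * (r_identical - r_fraternal). Input figures are hypothetical.
    def falconer_h2(r_mz, r_dz):
        return 2 * (r_mz - r_dz)

    print(falconer_h2(r_mz=0.70, r_dz=0.30))  # ~0.8 - a strong genetic signal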
#
"Male brains really are different",1323,0,0,0
(Dec '99)
Phrenology would have to be one of the most dubious "sciences" of the 19th century, but while there is no evidence for "brain bumps" revealing what lies beneath the cranial bones, it now looks as though the layout of the brain may be important in controlling what we can do.
A report in the journal \ICerebral Cortex\i during December revealed that there are "striking" differences - about 6% - between men and women in a part of the brain linked with ability to estimate time, judge speed, visualize things three-dimensionally and solve mathematical problems. These differences may, according to the researchers, lie behind some well-known gender trends, where more men than women are architects, mathematicians and race-car drivers.
The brain region involved is the inferior parietal lobule (IPL), which is significantly larger in men than in women. The IPL is part of the cerebral cortex and can be found on both sides of the brain, just above ear-level. As well, men have a larger left IPL than right, while women have a larger right IPL, and the difference is more pronounced in women than in men.
The researchers remind us of a study published in \IThe Lancet\i in June 1999, which revealed that the IPL is the same part of Albert Einstein's brain that was particularly large, compared with controls. Scientists, they say, have noticed this region is also larger in postmortem studies of the brains of other physicists and mathematicians.
The lobule is also better developed in humans than in other animals, and has evolved relatively recently. It allows the brain to process information from our senses such as vision and touch, and is needed for the sort of thinking involved in perception and selective attention - just the sorts of things that you would need for mathematics. The data come from magnetic resonance imaging (MRI) scans of the brains of 15 closely matched men and women, which were used to compare people's overall IPL volume by gender and from one side of the brain to the other.
The right IPL has been linked with a working memory of spatial relationships, and with the ability to sense relationships between the body parts and an awareness of a person's own feelings. The left IPL is more involved in perception, playing a role in judging how fast something is moving, estimating time and having the ability to mentally rotate 3-D figures.
Godfrey Pearlson, the lead researcher, cautions against saying that men are automatically better than women because of this difference. Earlier research carried out by Pearlson showed that two crucial language areas were significantly larger in women, perhaps explaining their advantage in language-associated thought. These areas were in the frontal and temporal lobes of the brain, but the differences show an overlap, and he says: "It's easy to find women who are fantastic at math and physics and men who excel in language skills. Only when we look at very large populations and look for slight but significant trends do we see the generalizations. There are plenty of exceptions, but there's also a grain of truth, revealed through the brain structure, that we think underlies some of the ways people characterize the sexes."
We have to wonder, though, what would happen if male linguists and female mathematicians were to have their IPL volumes studied - would they turn out to be atypical of their gender, and if not, how important are the brain differences after all?
\BKey names:\b Godfrey Pearlson, Patrick Barta, Melissa Frederikse, Angela Lu, and Elizabeth Aylward.
#
"An ancient ocean on Mars",1324,0,0,0
(Dec '99)
Even with the recent catastrophic losses of Mars missions, there is still a great deal of useful areography being done. (Areography is like geography on our planet, but carried out on Mars, and it derives from Ares, the Greek name for the god of war whom the Romans called Mars.)
An article in \IScience\i in early December pointed to measurements taken by the Mars Orbiter Laser Altimeter (MOLA), an instrument aboard the unmanned spacecraft Mars Global Surveyor which is circling the planet, and uses these measurements to infer the former existence of an ocean which dried up hundreds of millions of years ago.
The possibility had been previously considered in 1989 and 1991. The MOLA beamed a pulsing laser to Mars' surface, and then scientists measured the time it took for the laser to return to the satellite. The laser traveled a shorter time to return from mountain peaks and a longer time to come back from craters, and over time, this has built up a detailed map of the Martian surface.
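The altimetry arithmetic itself is simple to sketch: the pulse travels at the speed of light, so half the round-trip time gives the spacecraft-to-surface range, and subtracting that range from the known orbital altitude gives the terrain height. The round-trip time below is an invented example, not an actual MOLA measurement.

    # Laser altimetry: surface range from the round trip of a light pulse.
    # Terrain height then follows as (orbital altitude - range).
    C = 299_792_458.0                       # speed of light, m/s

    def range_from_echo(round_trip_s):
        return C * round_trip_s / 2.0       # one-way distance, meters

    print(range_from_echo(2.67e-3))         # ~400 km for a 2.67 ms echo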
There are clear channels on the surface of Mars which would have carried water into the northern lowlands, but the question remained as to whether the water ever collected in large standing bodies - lakes, seas and oceans. The report claims four types of quantitative evidence which suggest that such bodies did exist.
First, there is a detectable level contact line which appears to be an ancient shoreline. Second, the surface below this level is smoother, consistent with smoothing by sedimentation at some time in the past. Third, the volume of the area below the suspected shoreline is within the range of previous estimates of water on Mars. Finally, a series of terraces exists, parallel to the possible shoreline, consistent with what you would get from receding shorelines as the ocean dried up.
\BKey name:\b James Head.
#
"Planet light",1325,0,0,0
(Dec '99)
Planets have been previously detected around distant stars by "wobbles" in the stars as large planets swing around them, and most recently by a shading effect (see \JA planet confirmed\j, November 1999) as a planet passed in front of its star, as we see it from Earth. Now an even better technique has been reported in mid-December. The planet in question was already detected, but this matters little: the point is the development of a new technique, if it is confirmed.
Andrew Collier Cameron and colleagues of the University of St Andrews in Scotland studied a planet detected in 1997, orbiting the star Tau Boötis, just 50 light years away, with the aim of dissecting out the starlight reflected from the planet, distinguishing it from the star's own light, reaching us directly. The secret lies in the \JDoppler effect or shift\j, which we encounter most commonly when a siren passes close by.
The speed of light and sound remain constant, no matter how fast the source is moving, but the frequency of light or sound rises when the source is approaching us, and falls when the source is going away. A planet orbiting a star alternates from rushing towards us as it swings around from behind the star, to rushing away from us as it begins to move to the back of the star again.
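The size of the shift the St Andrews group had to dig out is easy to estimate: the fractional change in wavelength is the planet's line-of-sight speed divided by the speed of light. Taking rough published figures for the Tau Boötis planet - an orbit of about 0.046 AU covered in about 3.3 days - gives the sketch below; treat the numbers as approximations.

    import math

    AU, DAY, C = 1.496e11, 86_400.0, 3.0e8  # meters, seconds, m/s

    # Approximate orbital figures for the Tau Bootis planet.
    a, period = 0.046 * AU, 3.3 * DAY

    v = 2 * math.pi * a / period            # circular orbital speed, m/s
    print(v / 1000)                         # ~150 km/s
    print(v / C)                            # fractional shift ~5e-4

A shift of one part in 2,000 is tiny, but it swings back and forth with the orbit, which is what lets the planet's reflected light be teased apart from the steady light of the star.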
The planet they chose to seek out is close to the star, and so it shines very brightly, but the light of the planet is outshone 10,000-20,000-fold by the star itself. All the same, the researchers believe they have separated the light reflected by the planet from the glare of its star. They report that the planet is about eight times the mass and about 1.8 times the size of Jupiter, and that it has a bluish-green color. As another group had previously failed to find any signs of reflected starlight, other astronomers are expected to regard the results with caution until they are independently confirmed.
#
"A record magnet",1326,0,0,0
(Dec '99)
The National High Magnetic Field Laboratory in the USA has set a record in creating a powerful magnetic field. In testing during December, a hybrid magnet which has been in development since the early 1990s reached a peak magnetic field of 44.2 tesla. This is the strongest continuous magnetic field created to date in a scientific laboratory, and represents a remarkable engineering achievement. Ultimately, the large hybrid magnet is expected to achieve 45 tesla or more, almost a million times as strong as the Earth's magnetic field.
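The "almost a million times" comparison is simple arithmetic, assuming the textbook figure of roughly 50 microtesla for the Earth's field at mid-latitudes:

    # Earth's surface field is roughly 5e-5 tesla (about 50 microtesla).
    print(45.0 / 5e-5)   # 900,000: the hybrid magnet is ~a million times stronger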
The laboratory conducts magnet-related research and provides high magnetic fields as a user facility for US and international researchers. Research in high magnetic fields is critical to modern technologies and scientific research because it provides an additional means for scientists to study matter at the molecular level. The hybrid magnet will be used for research in condensed matter physics, materials science, chemistry, and ultimately, the biological sciences.
#
"Lessons for teachers",1327,0,0,0
(Dec '99)
As 2000 dawns, the Internet has existed for 30 years, and it has been in most people's awareness for five to 10 years. It is still a very young medium, and we cannot yet see where it is going. All we can say is that it appears to be going in some interesting directions. Even matters as simple as sorting into alphabetical order and searching alphabetically will go, replaced by other, more sophisticated searches. But how will people search "the Net", 10 years from now, or 20, or 50 years from now?
Once, Internet users relied on knowing where information was stored by being told about it. Next, they began to rely on links to get from one site to another, or on people who placed their bookmark files on the Web to help others. Then people learned to rely on search engines which offered clever searches filtered by Boolean logic, date, language or nation of origin.
That was also the time when the information flooding onto the Internet began to run ahead of the search engines, and the time when clever users learned to place meta tags in their Web pages to attract "hits". Most search engines now return a large number of pages which no longer exist, and even when a search scores a hit, several studies have shown that many of the hits are of very dubious merit. There is no guarantee of reliability in privately posted information on the Web, and no certainty that the information is reasonably free of bias.
Now the answer appears to be a matter of reliable sources providing catalogs of "good" material, taking us back a couple of steps to the days when enthusiasts offered their bookmark files for others to use. Authors of amateur sites which are of a reasonable standard will e-mail others of the same caliber, suggesting mutual links. Awards pages list good sites, and the sites gaining awards place a link to the awards page, allowing users to find other similar sites in a hurry, while large organizations set up hierarchical structures of links to recommended sites. As well, there are Web rings, where related sites offer links to other members of the ring in one of several ways, but more and more, people are learning to use high-powered sites to locate useful material located somewhere on the Web.
As a medium matures, so new methods, new practices and new products arise, though not necessarily in that order. What writers would once have called a second draft is now a beta version, but where second drafts might have been shown to one or two close friends, beta versions are placed on the Web, but not widely publicized until the formats have been tested, and enough content has been added to give the product "critical mass". A good example of a super-site which has been through a long beta phase is the recently announced Gateway to Educational Materials (GEM) Web site, created by the US Department of Education to provide teachers with lesson plans at no charge. This site is located on the Web at http://www.thegateway.org/ and has been there since 1998, but it was only publicized widely in mid-December.
The site contains lesson plans culled from online sites for federal and state governments, nonprofit and commercial entities, and universities, and it offers more than 7,000 items, classified by content and level, with a neat search engine. For both teachers and students (and maybe even frazzled parents), this site offers a faster way of accessing information than traditional search engines, which seem, more and more, to be losing the battle to keep up.
For another example, see also \JDoctors' peculiar names\j, December 1999.
#
"Doctors' peculiar names",1328,0,0,0
(Dec '99)
"Nominative determinism" is the fancy name given to a situation where a person's name seems to be linked in some way to their work, like Mr. Waters, the brewery manager, Ms. Drain the plumber, that sort of thing. Because of the wide range of their activities, medical workers have many more chances than most of finding their name classified as apt in some way, as Mari Stoddard, a staff member with the Arizona Health Sciences Library educational services group, has discovered.
As a result, she now has a Web site, http://educ.ahsl.arizona.edu/mla/doctor.htm, where you will find Dr. Skinner, the dermatologist, and Dr. Foote, the podiatrist. The more "unfortunate" include surgeons Drs. Pain and Slaughter. According to reports, Stoddard began the list after an exchange on a medical librarians' Internet list, and soon started to have offers of new names, each of which she has had to verify. Even the medicos have been joining in, and so far, nobody has complained at being listed, not even Dr. Smellsey or Dr. Kutteroff.
#
"Top tens",1329,0,0,0
(Dec '99)
While purists continue to argue that the new millennium starts at the end of 2000, most people have celebrated their new century, and their new millennium, and got on with life, after pausing briefly to assemble a few lists of the top 10 something or others. There are probably few better ways of assessing where our society is at, and what our group attitudes are, than by examining a few of these top and bottom tens.
We will begin our account with Slashdot's "Top 10 Hacks", using "hack" in the sense of a neat way around a problem, rather than the debased popular image of a "hacker" as a sociopath breaking into computers and provoking disasters. As all good hackers know, a real hacker is only interested in solutions.
The "Top 10 Hacks of All Time" include Orson Welles' 1938 "War of the Worlds" broadcast; the Mars Pathfinder, especially the landing, when the lander emulated a giant beach ball; Ken Thompson's "cc hack"; the AK-47 rifle; the Bletchley Park computers (the ôBombeö and Colossus decoding machines of World War II); the language or tool (make your own choice) called Perl ("probably responsible for more one-off hacks than any other tool in the programmer's arsenal"); Second Reality by Future Crew; the Apple II; the SR-71 "Blackbird" which flies so high that SR-71 pilots need to wear spacesuits which is also the fastest plane that has ever flown; and the Apollo 13 Mission Rescue.
Honorable mentions were accorded to emulators (in general), the demoscene, the Trojan Horse, the Great Pyramids in Egypt, and Duct Tape (Gaffer Tape in some parts of the world) - not a "Hack" in and of itself, but almost certainly responsible for more hacks than any other substance on the planet.
There was a lively correspondence after the event, for which readers were referred to the archives of slashdot.org (which bills itself as "news for nerds", located at http://www.slashdot.org/).
#
"Best and worst innovations of the 20th century",1330,0,0,0
(Dec '99)
MBA faculty and students in Tennessee picked the five best and five worst innovations of the 20th century. Not surprisingly, telemarketing was rated as one of the worst business innovations of the 20th century, while the computer was rated as the best invention of the last 100 years.
Faculty and students, drawn from 30 countries around the world, were in remarkable agreement. The staff listed neckties, the Betamax-format VCR and carbon paper as some of the worst innovations of the last century, while students listed thermal fax machines, and automobiles such as the Gremlin, Pinto and Yugo (while curiously failing to give due credit to the Trabbi - is this evidence of a cultural bias?). Both groups listed New Coke as one of the worst business innovations of the last century (the opinion of experts is that readers outside of the area devastated by this product should ask no questions and feel grateful).
Staff and students agreed that computers, the Internet and assembly lines were among the best innovations of the last 100 years. Curiously, voice-mail, e-mail and fax machines made both the faculty's best and worst lists. Faculty also counted the financial markets on their best list, while students nominated the telephone. For the worst list, faculty added taxes and business conglomerates, while students nominated office cubicles as among the worst innovations in the last century. Dilbert, it seems, is one of the touchstones for the \Ifin de siècle\i generation.
#
"Tokaimura in retrospect",1331,0,0,0
(Dec '99)
On December 22, the death of one of the Tokaimura nuclear disaster workers was announced. While his name was not released, the radiation dosage he suffered was mentioned, making it likely that the victim was Hisashi Ouchi, who was subjected to 17 sieverts (see \JTokaimura nuclear incident\j, September 1999).
Since the initial incident, enquiries have been continuing, and more detail is now available on what really happened in the fuel reprocessing plant. According to a December \IPhysics Today\i article, available online at http://www.aip.org/pt/toka2.htm, there is more to the story than people originally thought. There were in fact three separate errors involved: the first came when the plant's operating company, JCO, modified the government-approved procedure without authorization, while the second and third were created by the workers themselves, possibly with the concurrence of their immediate management.
While it was originally believed that the workers were adding fuel to the tank that was supposed to contain it, we now learn that they were using a container not intended for the purpose. It was the wrong shape, and more seriously, it had a water jacket around it for cooling. In the end, both the shape of the tank and the presence of the water jacket (which reflected neutrons back into the tank) contributed to the start of a nuclear chain reaction.
Because the uranium was for the experimental Joyo reactor, and was enriched to contain 18.8% U-235, there should have been a limit of 2.4 kg on the amount of fuel handled in one place at one time, but they put no less than 16 kg into the tank, causing a self-sustaining chain reaction.
The purification process should have involved feeding uranium oxide as a powder into a dissolving tank, where it is mixed with nitric acid to produce soluble uranyl nitrate, which is moved into a buffer tank and then to a precipitation tank, where ammonia is added, converting the uranium back into a solid product, with the contaminants remaining in solution.
This process is repeated until the uranium oxide is sufficiently pure. At that stage, the uranyl nitrate in the buffer tank is shipped to another facility where uranium dioxide is prepared and made into Joyo fuel. The accident happened when purified solids were being converted to uranyl nitrate for shipping. At this point, and in accordance with the JCO manual, but without approval from the regulatory authority, they mixed the uranium oxide and nitric acid in 10-liter buckets rather than in the dissolving tank. Then they poured the solution into the precipitation tank, rather than into the buffer tank.
Criticality depends on quite a few things, but as a general rule, the more compact a mass of fuel, the easier it is for it to "go critical", because more of the neutrons formed in a fission have a chance to hit other atoms and set off a chain reaction - this is why an atomic bomb forces two hemispheres of fissile material together to make a sphere. With that in mind, the buffer tank was designed to be tall and narrow, which makes criticality impossible, but the precipitation tank lacks this shape. Even so, there were no problems until the workers added the seventh bucketload, and that passed the limit, causing the mixture to go critical.
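The shape argument can be made concrete with a toy comparison: for the same volume of solution, a tall, narrow cylinder exposes far more surface per liter than a squat one, and more surface means more neutrons escaping before they can trigger further fissions. The sketch below compares surface-to-volume ratios only - it is a geometric illustration with invented dimensions, not a criticality calculation.

    import math

    def surface_to_volume(volume_m3, radius_m):
        # Cylinder height follows from volume; area includes both end caps.
        height = volume_m3 / (math.pi * radius_m**2)
        area = 2 * math.pi * radius_m * (radius_m + height)
        return area / volume_m3

    V = 0.045    # 45 liters, an assumed tank-scale volume
    print(surface_to_volume(V, radius_m=0.09))  # tall, narrow: ~23 per meter
    print(surface_to_volume(V, radius_m=0.19))  # squat: ~16 per meter, so less
                                                # neutron leakage per liter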
For the Joyo reactor's \Bsolid\b 18.8% uranium fuel, the minimum critical mass is about 46 kg, but in solution, light atoms such as hydrogen slow neutrons between fissions, making it more likely that they will be absorbed by other nuclei. Added to this, the water jacket reflected more slow neutrons back in, and it probably also kept the critical solution cooler, so that the solution did not expand as fast as it otherwise would have; expansion would have made it non-critical. In fact, criticality continued for some 20 hours, until the cooling water was drained from the jacket and boric acid (boron is a neutron-absorber) was applied.
In all, the Tokaimura incident was locally dangerous, but unlike events such as the \JChernobyl\j event, it was not dangerous to people at any great distance. Nonetheless, it should never have occurred, and it is likely to influence Japanese energy policy for some time to come.
#
"An Australopithecine hand",1332,0,0,0
(Dec '99)
South African paleoanthropologist Ron Clarke reported in December in the \ISouth African Journal of Science\i that he and his colleagues had found a complete hand and arm of an \IAustralopithecus,\i the first find of its kind. A find like this is important, as it will give us a detailed insight into what sort of manipulation the individual could perform. In fact, we not only get the anatomy, we also learn about the behavior of this distant cousin to modern humans, and it should also tell us something about how our own hands and arms evolved.
The remains came from the same cave as the find, about a year ago, of a complete \IAustralopithecus\i skull and associated remains (see \JA new Australopithecus fossil\j, December 1998). This latest discovery probably comes from the same individual, but the researchers are having to recover the hominid bit by bit, because the different sections of the specimen have become separated in rock movements that have occurred over time.
At this stage, the hand and arm are still partially encased in rock, and need to be readied for removal to a laboratory for closer investigation. The arrangement of bones appears to show a left arm that is stretched above the head with the fingers clenched. The hand bones of the skeleton are of similar length to those of modern humans, but the thumb is much more powerfully constructed, and the finger bones are curved like those of apes. While the elbow joint resembles that of the orang-utan, the bones are shorter than those of an ape, which brachiates, using its long arms to swing through the trees.
The picture we now have of \IAustralopithecus\i is that the various species walked upright and ate plant foods and small animals - when they could catch them. At the same time, Clarke points to the fossil remains of several species of big cat and hunting hyenas that inhabited the Sterkfontein region at the same time as \IAustralopithecus,\i and concludes that it would have been unsafe for the hominids to spend the nights on the ground. From this, he has argued that the hominids spent much of their time in the trees, and these bones appear to tell the same story, he says.
Clarke believes that the individual fell into the cave, or somehow got trapped and died there, and he hopes to be able to locate all parts of the skeleton eventually, free of the damage that scavengers do to bodies left exposed on the surface. He will be particularly keen to get the rest of the skeleton, particularly the upper part of the femurs (thigh bones), as this will allow a comparison with the modern apes, which have long arms relative to leg length.
#
"Mayfly fossil barometers",1333,0,0,0
(Dec '99)
One of the sweetest pieces of science comes when somebody works out how to obtain a measure of what the weather was like at some point in the distant past: see, for example, \JFossilized emu egg shells have a story to tell\j, May 1999, or \JAfrica hot\j, August 1998. Nothing in this field, though, has ever been as sweet as the method described by John L. Cisne at the annual meeting of the American Geophysical Union (AGU) in December, for measuring barometric pressures millions of years ago. The Cornell University geologist has discovered that long before humans developed the \Jbarometer\j, a primitive winged insect was experimentally measuring air's density and leaving barometer readings in the fossil record for us to read.
The insect in question is a common mayfly, which has changed little over the past 300 million years, so we can use it to estimate the mass and composition of ancient atmospheres. Mayflies have a larval form which lives in water, and when the adults emerge into the air to mate, anglers know that the trout will be biting. The members of the mating swarm fly up, then "parachute" down, resting their wings before beating their two pairs of wings at 20-30 beats per second to climb again.
As a mayfly moves upward in the mating swarm, its main flight muscle fills most of the pterothorax, the body's wing-bearing segments. The force that the muscle delivers to the wings is effectively recorded in the length of the pterothorax, while the corresponding force that the wings exert on the air is shown in the length of the forewing, the longer of the mayfly's paired wings. Then it is a piece of comparatively simple mathematics to calculate the density of the air, once we know how large the forewing must be in relation to the pterothorax for a mayfly to be able to dance. Mayflies from the Permian and Cretaceous periods have the same relative wing size as in modern forms, suggesting rather plainly that the mass and density of the atmosphere must have been practically the same for the last quarter-billion years.
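Cisne's actual calculation is not reproduced here, but its flavor can be captured by dimensional analysis: the aerodynamic force a wing generates scales with air density times wing area, while the available muscle force is recorded in the pterothorax. On that crude reading, inferred air density varies inversely with the square of the forewing-to-pterothorax ratio, as in the sketch below; the scaling law and the sample ratios are illustrative assumptions, not the published method.

    # Crude scaling: lift ~ density * wing_length**2 for a fixed muscle force,
    # so density_then / density_now ~ (ratio_now / ratio_then)**2, where
    # ratio = forewing length / pterothorax length. Sample ratios are invented.
    def relative_density(ratio_fossil, ratio_modern):
        return (ratio_modern / ratio_fossil) ** 2

    print(relative_density(3.0, 3.0))   # 1.0: same ratio, same air density
    print(relative_density(3.3, 3.0))   # ~0.83: bigger fossil wings, thinner air

The unchanged wing-to-thorax proportions in Permian and Cretaceous mayflies are what let Cisne conclude that the atmosphere's density has stayed roughly constant.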
Today's adult mayflies (and presumably those of the past) live only a few days as adults, and many are eaten by predators, but then, as now, a few must sometimes land in soft sediment with a chance of becoming a well-preserved fossil. From the nature of older rocks, we know that the Earth's atmosphere was, at certain times, quite different from today's. At the beginning of our world, for instance, the atmosphere contained almost no oxygen, but until now, the only way to even guess at atmospheric pressure in the distant past was to measure the size of gas bubbles frozen into lava as it solidified, but this is a poor measure at best.
The new method of assessing past pressures may allow us to estimate how much gas was blown off into space by the impact and explosion of asteroids like the one that hit what is now Mexico 65 million years ago, and to tackle a number of other questions of interest in unraveling the Earth's history. Any catastrophic degassing from an impact, even if it caused only a relatively small loss, could be enough to contribute to mass extinctions, such as those 65 million years ago among the flying reptiles (pterosaurs) and the flightless dinosaur relatives of the birds, and also to cause any knock-on effects that might come from such a major loss in the food chain.
Related Web sites can be found at http://www.geo.cornell.edu/geology/faculty/Cisne.html and http://www.entm.purdue.edu/entomology/research/mayfly/mayfly.html
#
"Life at Vostok",1334,0,0,0
(Dec '99)
In a report which may hold implications for the search for life in the solar system, readers of \IScience\i in early December learned that bacteria may live thousands of meters below the ice sheet in Lake Vostok, a suspected body of subglacial water deep in the Antarctic interior (see \JIs there life under the ice?\j, August 1999). Two separate investigations of ice fragments from a bore hole have revealed traces of bacteria in ice taken from a piece of clear ice core, described as 18 inches long and 4 inches wide (about 45 cm long and 10 cm wide).
The core was taken from 3,590 meters (about 11,800 feet) below the surface of the ice sheet and about 150 meters (490 feet) above the suspected water surface of Lake Vostok. Drilling has been halted roughly 120 meters (about 390 feet) above where the ice and liquid water meet, to prevent possible introduction of material that would contaminate the water, while scientists debate how to proceed.
Discovered in 1974, the lake is one of the world's 10 deepest bodies of water and one of about 70 lakes underneath the glaciers of central Antarctica. The bacteria, commonly associated with soils, are related to microbes called proteobacteria and actinomycetes. They could have been blown on bits of soil from the Patagonian deserts onto the East Antarctic ice sheet and then buried. If so, the microbes could be more than half a million years old.
Another possibility is that the microbes originated in the lake and became trapped as lake water refroze or accreted to the bottom of the overlying glacier. In either case, the study suggests the lake may support a microbial population, despite a million years of isolation from the atmosphere. The researchers believe the core is accreted ice. If this is so, then their findings suggest that there could well be a large and diverse population of bacteria present in the lake. Confirmation of this would answer an intriguing scientific question about whether an extremely cold, dark environment which is cut off from a ready supply of nutrients can support life.
Evidence for the accretion theory comes from the ice's unique crystal structure and mineral composition, as seen using a specialized scanning electron microscope. The researchers saw parts of only two large crystals in the sample. The crystals were not oriented vertically as you would expect in a quiet lake, and this could be due to shear stresses, which may indicate that the ice crystals initially nucleate in the lake water and attach to the overlying ice at a random orientation.
One team (under Priscu) used DNA analysis to demonstrate that, although the bacteria have been isolated for millions of years, they are biologically similar to known organisms. Both groups conclude that microbes could thrive in other, similarly hostile places in the solar system, although there is a small logical problem here: the Vostok bacteria may well have evolved in milder conditions and then adapted to the Vostok conditions, and this course would not be open to life forms that might or might not exist on Europa, a frozen moon of Jupiter.
Against that, life seems to have evolved in some rather unfriendly conditions on Earth, and if it had ever once become established on Europa, the Vostok data suggest that it would be likely to survive afterwards. And while most scientists reject the Hoyle and Wickramasinghe theory of bacterial spores in space, if there is anything at all in this idea, then bacteria are probably well-established on Europa. All we can say for sure is that if life forms arrived or arose on Europa, they could probably survive there.
We have evidence from the Galileo spacecraft that liquid water exists under an icy crust on the Jovian moon, and indications from radar mapping and other information that under several thousand meters of ice, liquid water may exist in Lake Vostok, possibly warmed by the pressure of the ice above or by some sort of geothermal features below.
A conference in England last September failed to come up with any satisfactory way of exploring the body of water without contaminating it. Without some assurance that contamination can be prevented, Lake Vostok must remain off-limits - and so must other similar formations that we may find on Europa or in other parts of the solar system.
\BKey names:\b David M. Karl from the University of Hawaii and John C. Priscu of Montana State University, each with a large group of co-workers.
#
"Rethinking the San Andreas fault",1335,0,0,0
(Dec '99)
A Lamont-Doherty Earth Observatory geophysicist, Chris Scholz, told the fall 1999 meeting of the American Geophysical Union in San Francisco that it is wrong to regard the San Andreas fault as a unique case among the world's major fault zones. He claimed that this was, and is, a mistake that continues to hold back the field of earthquake mechanics.
Perhaps because of its location in and around the world's seventh largest economy, in the past 30 years an entire geological sub-specialty has emerged and become established, based on studying the San Andreas fault. Scholz argued that the behavior of the San Andreas, like all other earthquake-producing faults, can be described by a general law of earthquake mechanics based on the physics of friction. So if the fault is not a special case, much of the research of the past generation has just been invalidated.
All of regular science depends on a set of models, or paradigms, as Thomas Kuhn named them, sets of rules and assumptions which underlie the way that scientists interpret what they see, but every now and then, data appear which cannot be explained, and so we see a paradigm shift. Events like the reading of Scholz's paper can sometimes indicate the start of a paradigm shift to a newer and more practical interpretation, but at other times, they will be seen later as a mere twitch in the ongoing and valid study of the long-running orthodoxy: right now, it is impossible to be sure which future is the correct one for Scholz's ideas. All we can do is look at the evidence, and see if the Scholz version rings truer than the orthodox interpretation.
The San Andreas fault appears to be relatively cool, as faults go. An active fault of its size, with a large section of California's southern coast grinding north-west relative to the rest of the state, should produce a substantial amount of frictional heat that warms the surrounding ground. Since there appears to be no heating like this in the area of the fault, the San Andreas has long been interpreted as unusually weak, so that there is little friction, and little heat.
In addition to this argument, orthodoxy points to the fact that the direction of maximum crustal compression is at nearly right angles to the fault in some locations. Because of this, the orthodox view is that the San Andreas must be unusually weak and slippery - if it were not, there would be no movement on the fault at all.
To explain this apparent uniqueness, the "weak-fault" theory supporters have suggested that the San Andreas is lubricated by a plastic-like core that allows it to slide without heating up. Rather than the normal style of fault, which may be thought of as two dry bricks grinding past each other, they see the San Andreas as two bricks with a thick layer of axle-grease between them.
There appears to be no evidence for this hypothesis beyond the fact that if it were true, it would help explain the observed facts. This is typical of a paradigm which is ready to be abandoned, when unsupported hypotheses are brought forward to sustain a point of view, and Scholz must have been well aware of this as he expressed skepticism for any complicated scenario that depends on special circumstances while flying in the face of basic physics. "As scientists we're just not free to construct theories that aren't physically plausible, whatever the apparent evidence," he said.
Based on all of the available data, Scholz claimed that the weak fault hypothesis does not hold up, and that the fault's behavior falls well within that predicted by what is known as the Constitutive Law of Rock Friction, a law which explains the full range of observed earthquake phenomena very well, from his point of view. The key, though, is the question being asked: ". . . the question everyone should be asking is not 'why is no heat being produced,' but rather 'where is all the heat going?'" he said.
He suggests that the answer may lie in the assumption that all heat produced by a fault dissipates by solid conduction, just as a brick heated on one side becomes warm on the other side. If a fluid is involved in some way, then the heat could be carried away much more quickly and efficiently by convection, leaving the mistaken impression that the fault is unusually cool. Like the hypothesis of a lubricant between the two sides of the fault, the mystery fluid is still to be located and identified, but at least it can be searched out, and the future of Scholz's theory may well depend on whether or not the fluid is found.
At the same time, if the fault works in accordance with the law of rock friction, it may produce a set of related geological phenomena that could be used to predict better how and when earthquakes will occur, Scholz said. Nobody has looked for any of these related and potentially predictive phenomena yet, because the assumption was that since the fault was weak, they would not be there.
At the same time, as the underdog, Scholz has needed to engage in a certain amount of polemic, podium-thumping argument and strong-minded presentation, but this is a normal part of the cut and thrust that happens when a paradigm is being changed - or when somebody is trying to change a paradigm, and it happens on both sides of the debate, even if it is never referred to in polite scientific society in quite those terms.
So it is not surprising that Scholz claimed that the erroneous assumptions are now holding back the progress of the science. "It's a story about a bandwagon with the wheels finally coming off," he said in a release on the Internet. All the same, science does not run on debating points, or opinions or votes, but on evidence. So the next step will be to look for some evidence which supports the non-Special Theory of the San Andreas fault. According to Scholz, "The coming decade should be a very interesting one for research along the San Andreas." We will keep a watch on this one.
\BSee also:\b \JKuhn, Thomas (Samuel)\j.
#
"El Ni±o triggers tropical forest reproduction",1336,0,0,0
(Dec '99)
A combination of logging and fire in the rainforests of Kalimantan (Borneo) is placing a unique ecosystem at risk, according to a report in \IScience\i in mid-December. Worse, the damage being inflicted outside of reserves is spilling over into the reserves as well, a problem that stems from two things: the operation of El Niño, and current land exploitation practices, including logging practices.
While there are a few dipterocarps in the savannah of tropical Africa and the rainforests of South America, most of the 750 species in 17 genera are found in South East Asia, where these trees dominate the canopy of the rainforest. There, hundreds of dipterocarp species synchronize their seed production at irregular intervals, a phenomenon known as mast-fruiting. While this is seen in other plant groups, the dipterocarps are unique because the fruiting time is closely limited, and it happens at approximately four-year intervals.
More than 50 different species of Kalimantan dipterocarp trees synchronize their reproduction, so that there are short periods of intensive fruit and seed production. In the report, Lisa M. Curran explains that these bursts of reproduction are set off by the arrival of the El Niño Southern Oscillation or ENSO, a periodic shift in tropical Pacific circulation patterns that brings drought to Indonesia, Australia, and the western Pacific.
She found four masting episodes from 1986 to 1999, with an average interval of 3.7 years, and says that with the possible exception of one very minor event in 1994, all of the events took place during ENSO years. From this, she concludes that the climatic conditions of an El Niño year trigger simultaneous fruiting in dipterocarps, and are essential for regional seed production.
This gives the canopy trees an incredible advantage, since a typical six-week masting period sees some 200 kilograms per hectare (180 pounds per acre) of seeds, each somewhere between a pistachio nut and a chestnut in size, fall to the ground. This drop sets off a giant forest feast, with wild boar, orang-utans, parakeets, jungle fowl, partridges, and other animals all gathering to stuff themselves. Humans also get into the act, gathering the winged seeds of the various dipterocarp species, and selling them as "illipe nuts".
Yet even with this intensive feeding and gathering pressure, so much seed is produced simultaneously over such a large area that some always remains to become a carpet of new seedlings on the forest floor. Most of these will remain there, as small and strangely elongated plants, twisting skyward whenever there is a glimmer of sunlight between the leaves, until a forest giant falls, and then the race is on, to be the first tree to reach the gap in the canopy and plug it, excluding all of the other plants from the sun's energy.
Curran's work took place in the Gunung (Mount) Palung reserve, where there has been a decade of intensive dipterocarp logging in huge timber concessions surrounding the park. Between 1991 and 1998, the production of mature, viable dipterocarp seed fell from 196 kg/hectare (175 pounds per acre) to 18.5 kg/hectare (16.5 pounds per acre). And even though there was a major fruiting event during the 1998 El Niño year, no new dipterocarp seedlings were found in the survey area, inside the reserve.
"Even though the park is supposedly off-limits to logging, the forest is losing the ability to regenerate itself," Curran says. Because seed predators can't find food outside the park, they move inside to eat the dipterocarp seeds before they germinate, according to Curran. As well, massive forest fires on nearby logging plantations, which destroyed an area the size of Denmark or Costa Rica in 1997-98, brought pollution and intensified El Ni±o's drought, killing the few remaining dipterocarp seedlings.
The lessons to be learned: reserves will not, by themselves, be enough to preserve rainforests and rainforest species. They need support from sustainable forestry practices, financial incentives to harvest responsibly, and the prevention of clearing and burning for industrial plantations.
Other researchers on this study: Gary Paoli, Izefri Caniago, Dwi Astianti, Monika Kusneti, Mark Leighton, C. Endah Nirarita, and Herman Haeruman.
#
"China's crop production affected by haze",1337,0,0,0
(Dec '99)
China's air pollution has been causing increasing concern (see \JChina and sulfur emissions\j, November 1999), but the problem may be even more worrying, according to a report in the \IProceedings of the National Academy of Sciences\i in late November. It appears that heavy regional haze in China's most important agricultural areas may be cutting food production there by as much as one-third.
The haze covers a million square kilometers or more, and scatters and absorbs solar radiation, reducing the amount of sunlight reaching key rice and winter wheat crops. Since plants require light to survive, this shading effect has the potential to decrease plant growth and food production, but in China's case, the effect is even more devastating, because of a principle known as limiting factors.
In any given situation, one or another of a range of factors acts as the limit to plant growth. In a desert, water is usually the limiting factor, while in sandy coastal soil, the limiting factor is more likely to be the nutrients the plant needs. If a farmer can increase the supply of whatever is the limiting factor, growth is improved, at least until some other component is fully used, when it becomes the limiting factor instead. In the case of Chinese agriculture, where crops are irrigated and fertilized, the available light will normally be the limiting factor, so any reduction in light translates directly into lost production.
The haze cuts out between 5% and 30% of the available light, and so is assumed to cut food production by the same amount. The estimates of losses are based on detailed long-term measurements at Nanjing, 300 km (200 miles) south-west of Shanghai, and these have been extrapolated to the rest of China, but are considered reliable. The estimates are based only on the direct effects of haze on sunlight, and do not include the indirect effects on sunlight potentially caused by haze interacting with clouds or the toxic effects of air pollutants that also reduce crop growth. About 70% of China's croplands are affected, with the worst effects occurring in the highly productive eastern regions.
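The limiting-factor logic is naturally written as taking the minimum of the scaled inputs, which is why a 30% cut in light passes straight through to a 30% cut in yield once water and nutrients are ample. A minimal sketch, with invented relative-supply figures:

    # Liebig-style limiting-factor model: growth tracks the scarcest input.
    # Inputs are supplies relative to the crop's full requirement (1.0 = ample).
    def relative_growth(light, water, nutrients):
        return min(light, water, nutrients)

    print(relative_growth(light=1.0, water=1.2, nutrients=1.5))  # 1.0: full yield
    print(relative_growth(light=0.7, water=1.2, nutrients=1.5))  # 0.7: haze cost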
The haze affecting China is made up of aerosols composed of solid and liquid particles of varying sizes. The aerosols probably come mainly from the burning of coal, dead vegetation and other fuels, though scientists lack detailed information on their origins. Similar effects may also be taking place in India and in some African nations that are also struggling to feed their people, say the researchers. The conclusion: since the technology exists to reduce air pollution, there would be a remarkable payoff if this technology were put in place.
Problems like this will arise in any economically developing or developed country, and the same effect has been measured on the east coast of the USA, but China's haze levels are roughly twice as bad, and records suggest that China's haze problem has worsened over the past 20 years, a time of massive industrialization.
There are many potential sources for the haze. The coal China uses to fire much of its industry produces soot and fine particles. Fuel for cooking and home heating also tends to be dirty, and the normal practice of burning crop debris at the end of the growing season adds to the haze, while westerly winds each spring carry dust from desert regions in the western part of the country. There are also indirect effects, such as aerosol-induced cloud cover and increased reflectivity, which further reduce the sunlight reaching plants, while harm from growth-stunting ozone, acid deposition and other air pollutants also worsen the impact of poor air quality.
The ozone levels in China are now of concern in the United States. A plume of pollution that crossed the Pacific Ocean from Asia in April 1999 contained ozone at 85 parts per billion (or 0.085 parts per million) at 20,000 feet (6,000 meters), a level high enough to violate a new federal ozone standard in the USA. The new standard, formulated in 1997 but under legal challenge and so not enforced, reduces the acceptable level from 120 parts per billion down to 80 parts per billion.
While it is uncertain if the ozone would have mixed with lower levels and caused a health threat, levels of 72 parts per billion were found at 10,000 feet (3,000 meters), lower than the tops of some mountains in the USA, and this level is known to cause damage to vegetation.
A meteorological analysis of the plume shows it came from East Asia, and while it was not possible to be more precise than that, elevated levels of other pollutants, including hydrocarbons, carbon monoxide and a key smog ingredient called PAN (peroxyacetylnitrate), prove that the ozone-rich air mass did not come from the upper atmosphere, since those pollutants do not exist at high concentrations in the upper atmosphere.
#
"The world's stressed freshwater supply",1338,0,0,0
(Dec '99)
The fall meeting of the American Geophysical Union in San Francisco was reminded during December that the supply of clean fresh water for use by humans and natural ecosystems is shrinking by the year. The invited speaker, Associate Professor Kenneth Strzepek, has been working with colleagues, using sophisticated computer models and geographical information systems, and their aim has been to identify those river basins around the world that are the most stressed.
Strzepek and his co-workers have been doing their research as part of a background analysis for the World Water Commission's "World Water Vision for the 21st Century" report. The commission is an organization with government and private funding which seeks global solutions to water problems. A third of the world's population is currently living in regions that are classified as water stressed. The accepted estimate now is that by the year 2025, almost one-half of the population will be living in water-stressed regions, as the demands for water for irrigation, livestock and industry, and to sustain natural ecosystems, grow. The only way of easing this will be if the "business as usual" scenario is dropped, said Strzepek. And as we have previously reported (\JA cause for war\j, January 1999), battles in the 21st century are likely to be fought over water supplies.
The projected "hot spots" include China's Yellow River basin, Africa's Zambezi River basin, the Syr Darya and Amu Darya River basins which feed the shrinking Aral Sea in Central Asia, and the Colorado River basin in North America. Their models have studied the vegetation, soils and climate from the headwaters to the mouths of the world's major river systems in 40 km (25 mile) square chunks to model run-off and stream flow, taking into account past temperature and precipitation (rainfall and snowfall) data to reconstruct run-off and stream flow data for major river basins going back 100 years.
Evaporation can draw off 25% of a river's annual flow in bad years. As we face the new century, 70% of the fresh water drawn off by humans goes to irrigate crops, and this is where the trouble really begins. Much of the irrigation water takes up pesticides, herbicides and salts from the croplands it passes over and through, and these are carried into the rivers, polluting them.
According to Strzepek, "In the Nile Delta in Egypt, water quality is a major problem for human and agricultural use due to upstream pollution from agricultural, industrial and municipal uses," and he adds "Similar situations are found in other river systems like the Indus River in Pakistan and the Yellow River in China."
As soon as humans begin drawing more than a sustainable supply from a river, the problems get worse. But while the phrase "sustainable supply" tells us what the limit means, how do we work out what it is? According to Strzepek, it is a mix of the nature of a river basin's hydrology and its storage capacity; and while dams provide more water for humans, they also cause environmental impacts.
As a rule of thumb, researchers agree that no more than 40% of a river basin's water should be diverted for human use if the environment is to be adequately protected. Yet in the Colorado basin, 100% of the water is used. The average flow in the Colorado River is 15 million acre-feet per year, of which only 1.5 million acre-feet are delivered annually to Mexico, and barely a trickle makes it into the Gulf of California.
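The arithmetic behind that rule of thumb is easy to lay out; this short Python fragment simply restates the figures quoted above.
  # Figures from the text: 15 million acre-feet average annual flow,
  # a 40% rule-of-thumb diversion limit, 1.5 million acre-feet to Mexico.
  average_flow = 15.0e6                      # acre-feet per year
  sustainable_take = 0.40 * average_flow     # the most the rule of thumb allows
  delivered_to_mexico = 1.5e6                # acre-feet per year
  print(sustainable_take)                    # 6000000.0 acre-feet
  print(delivered_to_mexico / average_flow)  # 0.1 -- a tenth of the flow reaches Mexico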
\BKey names:\b Strzepek, Alyssa Holt, and Jeff Bandy.
#
"The Arctic Oscillation and northern climate",1339,0,0,0
(Dec '99)
The Arctic Oscillation is a climate pattern which appears to have wide-ranging effects in the Northern Hemisphere, and it operates differently from other known climate cycles, the fall meeting of the American Geophysical Union was told during December. A counterclockwise-spinning ring of air around the polar region has been accelerating, and this could be responsible for warmer winters in Scandinavia and Siberia, a thinning of the stratospheric ozone layer, and significant changes in surface winds that might have contributed to Arctic ice thinning (see \JArctic sea ice disappearing\j, November 1999).
John M. Wallace, a University of Washington atmospheric sciences professor, told the conference that the changes at high latitudes could be part of a human-induced climate change. In a news conference later, Wallace, with David Thompson and Mark Baldwin, reflected on their research, and said that the Arctic Oscillation is a seesaw pattern in which atmospheric pressure at polar and middle latitudes fluctuates between positive and negative phases.
During the negative phase, there is higher-than-normal pressure over the polar region and lower-than-normal pressure at about 45 degrees north latitude. In the positive phase, the conditions reverse, pushing ocean storms further north and bringing wetter weather to Alaska, Scotland and Scandinavia, and drier conditions to areas such as California, Spain and the Middle East.
The Arctic Oscillation has been mostly in its positive phase for the past few years, and this means frigid winter air doesn't plunge as far south into North America, and this in turn leads to warmer winters for much of the United States east of the Rocky Mountains, but areas such as Greenland and Newfoundland pay for the milder US weather by tending to be colder than normal.
The Arctic Oscillation has a counterpart in the Southern Hemisphere, the Antarctic Oscillation, and this has helped scientists understand the Arctic Oscillation. The Antarctic Oscillation involves the even faster-spinning ring of air that encircles the South Pole; the presence of large land masses in the north probably prevents the ring of air flowing around the Arctic from becoming as strong as that in the Antarctic.
In winter, the Arctic Oscillation reaches up through the stratosphere, some 10 to 50 km (6 to 30 miles) above the Earth's surface. When the oscillation changes phases, the strengthening or weakening of the circulation around the pole tends to begin in the stratosphere and work its way down through lower levels of the atmosphere, and this has scientists interested, because it is the opposite of what they see in other systems. In phenomena such as El Niño in the equatorial Pacific Ocean, the changes begin in the ocean and work their way up through the atmosphere.
In the last few decades, the stratosphere has been cooler, and this has strengthened the circulation around the North Pole in winter. The effect of this has been to shift the belt of westerly winds at the surface along 45 degrees north latitude further to the north, and this has delivered larger amounts of mild ocean air to Scandinavia and Russia and brought balmier winters to much of the United States as well.
#
"El Ni±o's dramatic impact on ocean biology",1340,0,0,0
(Dec '99)
The 1997-98 El Niño/La Niña had a marked effect on the oceanic food chain of the Pacific, driving chlorophyll levels to the lowest ever recorded in December 1997, before they bounced back with the largest bloom of microscopic algae ever seen in the region during the northern summer of 1998, according to a paper published in \IScience\i in mid-December.
The El Niño event also drove a dramatic reduction in the amount of carbon dioxide normally released into the atmosphere by the equatorial Pacific Ocean, according to data gathered by an array of instruments on buoys, on ships, and in space, including NASA's Sea-viewing Wide Field-of-View Sensor (SeaWiFS). SeaWiFS provides daily views of the world's oceans and land masses.
There were two surprises in the data: the size and intensity of the ocean's biological rebound from El Niño, and also the speed with which it recovered. Measurements from buoys fitted with chemical and biological sensors, along with the SeaWiFS data, showed surprisingly low and then high levels of chlorophyll which coincided with El Niño's strongest phase and the recovery period and transition to La Niña cooling.
The warm water layer produced during the El Niño phase gets deeper, so that the upwelling cold currents, packed with mineral nutrients, are choked off, and when that happens, chlorophyll values plummet as phytoplankton (microscopic algae which form the base of the oceanic food chain) are unable to survive. It appears that the intense bloom of algae in mid-1998 was due to raised iron concentrations in the increased upwelling associated with La Niña. Iron is an essential nutrient for phytoplankton growth and it is thought to be the limiting nutrient for productivity in the equatorial Pacific under normal conditions.
Unlike most parts of the world's oceans, the equatorial Pacific is normally a major contributor to atmospheric carbon dioxide, because carbon-dioxide-rich deep ocean waters are brought to the surface here and biological activity is relatively low. When El Niño is active, however, this carbon dioxide production takes a nosedive, with an output estimated at some 700 million metric tons of carbon less than the previous year - about as much as the whole USA generates from burning fossil fuels in a year.
So even El Niño clouds have a silver lining of sorts, it seems.
SeaWiFS images of the 1997-98 El Ni±o are available on the Web at http://svs.gsfc.nasa.gov/imagewall/elnino/elninoimpact.html, where the sequence of images is of special interest.
#
"2000 Science in Review",1341,0,0,0
\JJanuary, 2000 Science Review\j
\JFebruary, 2000 Science Review\j
\JMarch, 2000 Science Review\j
\JApril, 2000 Science Review\j
\JMay, 2000 Science Review\j
\JJune, 2000 Science Review\j
\JJuly, 2000 Science Review\j
#
"January, 2000 Science Review",1342,0,0,0
\JZoonoses on the increase\j
\JMortician catches TB from cadaver\j
\JHuman breast cancer and mice\j
\JGene therapy shrinks tumors in mice\j
\JFetal anemia detection\j
\JWhy women get more lung cancers\j
\JUpdated HIV treatment guidelines on the Web\j
\JResearchers identify liver toxicity risk of AIDS drugs\j
\JHow viruses enter cells\j
\JCloning identical monkeys\j
\JTransgenic fish could threaten wild populations\j
\JPrions are modular\j
\JThe Cuba embargo and public health 'catastrophes'\j
\JCounting monkeys\j
\JAre solar systems rare?\j
\JNear-Earth asteroids risk lower\j
\JHubble opens for business\j
\JMars Polar Lander news\j
\JFUSE spacecraft provides its first data\j
\JDNA computing progress\j
\JQuantum phase memory\j
\JInternet traffic density rises\j
\JWearing your computer on your sleeve\j
\JDeath row, Internet style\j
\JThe robots are coming\j
\JRobots on ice\j
\JThe invading, evolving fruit-fly\j
\JA new order of marine fungi\j
\JThe big bang theory of human evolution\j
\JShrinking iguanas\j
\JEmerald fields\j
\JFertilizers and amphibian deaths\j
\JImpact from rainforest greenhouse gases\j
\JSatellite evidence for warming\j
\JNew, major source of atmospheric methyl bromide\j
\JA coral thermometer\j
\JCoffee is good for you after all\j
\JG. Ledyard Stebbins\j
#
"Zoonoses on the increase",1343,0,0,0
(Jan. '00)
Zoonoses are diseases spread from animals to humans, and there are more of them than we thought. While this is alarming for humans, we are not the only victims of diseases carried in wild animals, and biodiversity seems to be as much under threat as we are. While the big headlines go to diseases like Ebola, chicken-borne flu (see \JChicken flu virus scare\j, December 1997) and HIV, believed to have developed from a similar virus in monkeys (see \JNew HIV strain identified\j, August 1998), the real problems may be found elsewhere, particularly among wildlife, according to a report in \IScience\i in mid-January. For example, avian malaria in Hawaii is now thought to have caused the extinction of a number of native species on the islands after it was originally introduced along with exotic, alien birds.
Much of human history has been shaped by disease. In particular, Chinese populations appear to have risen a number of times over the past two millennia, falling again when the extra population pushed people into the rain forests and jungles of southern China, presumably exposing them to new diseases. Today, African populations are in decline because HIV escaped from monkeys in jungle areas to humans, and Ebola is believed to have an animal host, still to be identified, somewhere in the African rain forest - though bats and some small forest-dwelling mammals have been tentatively implicated.
And now we are beginning to realize that wildlife is at risk: a newly discovered fungal disease has recently been identified (\JVirtual dissection\j, July 1998) as the cause of amphibian mortality in the Central American and Australian rain forests, areas scientists thought were beyond the reach of human environmental change. History is filled with the disasters that disease has brought to human populations, like the \JBlack Death\j, or the introduction of smallpox, typhus, and measles to the Americas in the \Jconquistador\j invasion, killing an estimated 50 million people. What history glosses over is the damage caused to the ecosystems that support the surviving human populations.
The problem is that the risks remain potential, rather in the way that walking along the edge of a cliff remains only a potential risk, until a section of the cliff edge gives way. As the world becomes more globalized, as foodstuffs, tourists, workers and migrants travel ever faster from place to place, so the risk of disease traveling with them becomes greater. Even deadly diseases like Ebola could be passed on to air travelers in a European airport, who could reach the other side of the world and clear customs to mingle with the normal population before the first symptoms showed.
Perhaps the biggest problem arises when there is a "spill-over" of pathogens from domestic animals to wildlife populations. This can come about through the translocation of hosts or parasites by human intervention, but also through events that have no human or domestic animal involvement, such as global warming or floods. Just as human diseases have spread and will spread again, so will animal diseases: just as surely as the conquistadors introduced human diseases to the New World, their stock and domestic animals carried animal diseases.
The cattle disease, \Jbrucellosis\j, reached both the Americas and Australia when cattle were taken there, but in the United States, the disease is now found in the \Jbison\j of Yellowstone National Park, which makes the animals a potential threat to domesticated cattle grazing at the park's boundaries. This problem has led to considerable tension between conservationists and cattlemen, and to farmers shooting bison that graze near domesticated herds, even though there is little evidence of cattle becoming infected, according to one of the authors, Peter Daszak.
Recent analyses of nucleic acid sequences have shown that avian influenza can be transmitted directly from birds to humans - in the past, pigs were thought to have been a necessary intermediate host. As well, potential non-human primate reservoirs for HIV-1 and HIV-2 have also been found, say the authors, while Australia has seen small outbreaks of morbillivirus and lyssavirus, probably carried by fruit bats, which are quite common - there is even a large colony of "flying foxes" in the center of Sydney, in the Royal Botanic Gardens.
Even areas like the Galapagos Islands and Antarctica are not exempt, as food crops, timber, agricultural materials and domesticated animals are moved from place to place, with even larger risks coming from the international movement of landfill wastes and ship ballast water. The costs to be faced are enormous: the authors say that post-exposure treatment given to 655 people who had potential contact with a single rabid kitten in a New Hampshire pet store in 1994 cost $1.1 million. They estimate that the costs of Lyme disease treatments of all kinds in the US may be as much as $500 million a year.
When we are faced with the potential danger of a crumbling cliff edge, the solution is to place barriers and fences, but the solution to this potential danger is less obvious. Hopefully, it will not remain a case of people nodding sagely and agreeing that something ought to be done, without ever actually doing anything. Even a potential danger will eventually inflict damage if nothing is done.
\BKey names:\b Peter Daszak (USA), Andrew Cunningham (UK), and Alex Hyatt (Australia).
#
"Mortician catches TB from cadaver",1344,0,0,0
(Jan. '00)
Tuberculosis (TB) is spread through the air by infectious aerosols, when infected people cough, dislodging some of the bacteria in their lungs. In theory, it should not be possible to develop a tubercular infection from a corpse, but in one case reported in the \INew England Journal of Medicine\i in late January, this appears to have happened.
According to Timothy Sterling, lead author of the study, "Previous studies had shown that funeral home workers had unexpectedly high rates of TB infection and disease, but it was not known if this was due to exposure in the workplace." So there was a suspicion that TB might be contracted this way, but no real evidence of it.
The difference in this instance is that the bacteria in every case of tuberculosis in Baltimore are now routinely tested and DNA fingerprinted. When two or more TB cases have similar DNA fingerprints, researchers set out to identify the point of contact, to see whether recent transmission of TB has occurred.
The dead TB victim was not even known to have the disease until it was diagnosed after death, but the medical researchers noticed that the mortician had signed the dead patient's death certificate. The only known exposure the mortician had to the TB bacterium was through the embalming of the infected cadaver.
Embalming is normally regarded as a subject best left undiscussed, but to understand what happened, we need to look at a few of the technicalities. The process involves removing blood from the cadaver and injecting preserving fluids. Aerosols that may carry the bacteria into the air can be generated by the injection of fluids, or by the frothing and gurgling of fluids through the mouth and nose. As well, the cadaver may spasm during the embalming process, which can cause the release of respiratory secretions. After the work is complete, embalming fluids are often dumped into a drain and this also could release infectious aerosols.
Other diseases known to be spread by funerary practices include Ebola, which rarely extends beyond those areas of Africa where certain funeral rites are observed, and \Jkuru\j, a disease of the highlands of Papua New Guinea, but there are no public health rules in place to cover funeral homes and tuberculosis. A number of other updates have indicated that TB is on the rise. It is the second leading cause of death from an infectious disease, with 7 million to 8 million new TB cases diagnosed each year.
For the foreseeable future, the news is only going to get worse.
#
"Human breast cancer and mice",1345,0,0,0
(Jan '00)
In mid-January, the \IBritish Journal of Cancer\i published a paper which presents interesting arguments in support of the notion that breast cancers in some cases may be caused by a virus which is transmitted by mice. The evidence is circumstantial but persuasive, based on two lines of reasoning.
First, there is a similarity between viral sequences found in human breast cancers and the mouse mammary tumor virus. Second, the authors found that the highest incidence of human breast cancer worldwide occurs in lands where the house mouse, \IMus domesticus\i, is common, either as a native or an introduced species.
\IMus domesticus\i is native from Western Europe to Iran, and has been introduced into North and South America, Australia, New Zealand, and Hawaii. Laboratory mice have mostly \IM. domesticus\i genes. Breast cancer is more common in these areas, and less common in Eastern Europe, Japan and China, where this particular mouse is absent, though people moving from low-risk to high-risk areas show an increase in risk, in line with their destination. Soviet Jews moving to Israel, Japanese moving to the US, and people from the Indian sub-continent moving to the United Kingdom all show increased incidence of breast cancer.
This tends to rule out most genetic and dietary arguments, and points to some factor in the environment. As well, a study of lab personnel shows that people working with infected mice developed a blood serum response to mouse mammary tumor virus, compared with age- and gender-matched controls.
In one case study, a female lab worker was sero-negative to the mouse tumor virus for 28 months of the study, and then became positive in the 32nd month. Nine months later she discovered a mass in her right breast which was soon diagnosed as cancerous. As a side issue, the authors point to the need for more stringent guidelines for laboratory containment of mouse mammary tumor virus.
If this suspicion is supported by further studies, it would point to a major pathway for prevention, rather than cure. Aside from that, and insights into disease generally, it would also point the way to a vaccine which might act to block this killer disease.
This suspicion is not only predictable when we consider that \IM. domesticus\i has lived with humans, as an intimate commensal, since the beginnings of agricultural societies in the Near East, say the authors, but it is not even new: they point out that the same conclusion was drawn in another scientific paper in 1981!
As another worrying aside, they comment that "The existence of regulatory food standards allowing up to two pellets of rodent excreta per pint of wheat confirm the presence of mice in the modern human food chain." It will be interesting to see what happens to this particular regulation over the next 15 or 20 years.
#
"Gene therapy shrinks tumors in mice",1346,0,0,0
(Jan. '00)
A late January paper in the journal \ICancer Research\i reports a first: the use of gene therapy to replace a damaged Rb2 gene (also called Rb2/p130) with a healthy version. The result has been a dramatic shrinkage of lung cancer tumors in mice. Almost as exciting as the hope of a cure of human lung cancer by gene therapy is the first direct evidence that Rb2 is a tumor suppressor. The tumor suppressor genes include p53, which may be involved in as many as half of all human cancers, and they are normal growth control genes which malfunction in some way, allowing a tumor to develop.
Antonio Giordano, an associate professor of pathology, anatomy and cell biology at Jefferson Medical College of Thomas Jefferson University in Philadelphia led the research group, and he has commented that "Rb2 is an important master switch in controlling the equilibrium of the cell. It's the first time that this tumor suppressor gene has been shown to be heavily mutated in primary tumors."
Giordano has been working with colleagues in the USA and Italy. Together, they developed a gene therapy model using a retrovirus as the delivery system: they transplanted non-small cell lung tumor lines into the backs of mice, put the healthy Rb2 gene into a retrovirus, and used it to suppress the tumors' growth. (Non-small cell lung cancer is the most dangerous and common type of the disease, making up about 75% of cases.)
Rb2 is a normal growth-control gene expressed in cells in all tissues, and when it malfunctions, it may contribute to several types of cancer, but how and why remains a mystery for now. Giordano may have an interesting clue: "We have found three different mechanisms for how the gene can cause cancer, which suggests Rb2 is a major guardian of the genome and controller of cellular processes," he says.
In another paper in early January in the same journal, Giordano's group reported that in some types of cancer in which Rb2 is heavily mutated, such as nasopharyngeal cancer, a head and neck cancer common in southern China, the mutation caused a cell signaling problem, which contributed to the cancer. In yet another January article, Dr. Giordano's research group showed how an Rb2 mutation involved in Burkitt's lymphoma resulted from the Rb2 protein being misplaced within the cell.
There are also some cancers where the Rb2 may not be mutated at all, so there is still some distance to go before gene therapy becomes an everyday treatment, but the day is getting closer.
#
"Fetal anemia detection",1347,0,0,0
(Jan. '00)
Between one and two pregnancies in a thousand risk having the fetus develop anemia severe enough to require a transfusion to prevent death, but only 10-20% of fetuses at risk for anemia will develop the deficiency. It is possible to take samples using invasive procedures such as \Jamniocentesis\j and cordocentesis (taking a fetal blood sample from the umbilical cord).
These, however, carry a risk of death to the fetus. Now researchers at Yale have developed a non-invasive technique, based on Doppler ultrasound, to detect whether or not a fetus is anemic. This does away with the need for risky needle insertion, according to a report in the \INew England Journal of Medicine\i in early January.
One possible drawback was successfully dismissed in another paper, appearing in the \IBritish Medical Journal\i in late January. Based on a case-control study in Sweden, the paper finds no association between prenatal exposure to ultrasound and childhood leukemias. In the past, there have been concerns over a possible association between exposure to ultrasound in utero and an increased risk of childhood malignancies.
The authors, Dr. Estelle Naumburg and colleagues from Uppsala University and the Karolinska Institute, set out to establish whether there was any evidence to suggest a link. They found similar rates of leukemia in children who had and in those who had not been exposed to ultrasound scanning. They conclude that there is no evidence to suggest that single or repeated exposure to ultrasound, at any stage in pregnancy, influences the risk of subsequent development of childhood leukemia.
The anemia detection system relies on the fact that anemic fetuses have a higher blood flow velocity than non-anemic fetuses in their arteries and veins. Doppler ultrasound studies can identify and distinguish the speeds, detecting all the moderately and severely anemic fetuses.
As often happens with tests, it is possible to get a "false positive", where a non-anemic individual is identified as anemic. This could be avoided by raising the limit used to identify cases as anemic, but that raises the chance of getting "false negatives", cases of anemia which are classified as normal. At the current threshold, about 15% of results were false positives, enough to suggest that a diagnosis of anemia may need to be confirmed by invasive methods. At the same time, the non-invasive procedure is risk-free, cheaper, and faster, and serves to screen out most of the fetuses which are not anemic.
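The trade-off can be made concrete with a small sketch in Python. The velocity threshold and sample values here are hypothetical, invented for illustration; only the logic - raising the cut-off trades false positives for false negatives - comes from the study.
  # Flag fetuses whose blood-flow velocity exceeds a chosen threshold as anemic.
  def classify(velocities, threshold):
      return [v > threshold for v in velocities]

  velocities = [0.9, 1.1, 1.3, 1.6]   # hypothetical multiples of the median velocity
  print(classify(velocities, 1.0))    # low cut-off: more fetuses flagged, some falsely
  print(classify(velocities, 1.5))    # high cut-off: fewer flagged, risking missed cases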
The data used in the study were compiled at eight medical centers in the United States, Europe, South America and Asia.
\BSee also:\b \JDoppler effect or shift\j, \Jultrasound\j, \Janaemia/anemia\j.
#
"Why women get more lung cancers",1348,0,0,0
(Jan. '00)
A protein called gastrin-releasing peptide receptor (GRPR) is found on the surface of cells lining the lung. It fuels lung cancer growth, and the gene for this protein is more active in women than in men. That is the main message of a paper published in the \IJournal of the National Cancer Institute\i in early January, and it gives us the first biological explanation for the greatly increased risk women face, compared with men, of developing lung cancer. The gene is also of interest because the same research group at the University of Pittsburgh discovered that the nicotine found in cigarettes induces its activity.
If future studies support this finding, the research may yield a valuable marker for women who are most likely to develop the disease or dangerous pre-cancerous changes. GRPR responds to a hormone, GRP, or gastrin-releasing peptide, and when it does, the receptor then sets off the sort of cell proliferation typically seen in lung cancer. The team has also shown that the nicotine found in cigarettes stimulates expression of the GRPR gene in lung cells.
It seems that women are likely to develop lung cancer after much less smoking exposure than men, and much earlier in life, whatever their smoking history, said the principal investigator, Sharon Shriver, discussing the work. She added, "The take-home message, especially for teenage girls, is that they should stop smoking or, better yet, never start." Sadly, it is a message that more and more girls are ignoring.
Previous population studies have suggested that women are at a much greater risk of smoking-related lung cancer, compared with men, and now, for the first time, there is a clear biological explanation of the observed facts. If ever there was a smoking gun, this is it. So with more women than men smoking now, we can already see the potential for a huge public health cost in the next 20 to 30 years.
Using normal lung tissue samples from 38 women and 40 men, including 58 patients with lung cancer, the scientists found that 55% of the non-smoking women, and 75% of the women who had smoked less than 25 pack-years, expressed the messenger RNA which is a precursor of the GRPR protein. Not one of the male non-smokers, and only 25% of the men with a 25 pack-year or less smoking history expressed the GRPR mRNA. (A pack-year is a dosage measure, and is the equivalent of smoking a pack of cigarettes a day for a year.)
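Since the pack-year crops up throughout smoking research, here is the definition from the parenthesis above as a one-line Python function; the function name and example figures are ours, not the study's.
  # One pack a day for one year = 1 pack-year, so dose = packs per day x years.
  def pack_years(packs_per_day, years_smoked):
      return packs_per_day * years_smoked

  print(pack_years(1.0, 25))   # 25.0 -- the study's 25 pack-year cut-off
  print(pack_years(0.5, 20))   # 10.0 -- half a pack a day for 20 years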
Why the difference? The answer is that the GRPR gene is on the X chromosome, and women have two X chromosomes, while men have one copy. Usually, one of these two X chromosomes is inactivated in each cell (the "Barr body"), but some genes are still active on the inactivated chromosome, and the GRPR gene is one of these.
Dr. Shriver is unsure at this stage if the problem is simply that women have two copies of the GRPR gene, available to be switched on, while men have just one, or if the GRPR gene is already active before women start to smoke, or whether both effects are in operation. The study did not take into account the effects of passive smoke, but future larger-scale studies will obviously address this question, and also seek to find if GRPR gene expression can provide a reliable marker of lung cancer risk. About the only good news seems to be that the lung levels of GRPR expression are mirrored in circulating blood cells, allowing a minimally invasive way to measure lung cancer risk.
\BKey names:\b Sharon Shriver and Jill Siegfried.
#
"Updated HIV treatment guidelines on the Web",1349,0,0,0
(Jan. '00)
In less than a decade, we have gone from a time when it was both amazing and exciting that Indian doctors could use electronic communications to coordinate their attack on an outbreak of plague, to a point where the world can access new information and standards as soon as they are set. This has now happened with the new \IGuidelines for the Use of Antiretroviral Agents in HIV-Infected Adults and Adolescents\i, just posted at the HIV/AIDS Treatment Information Service (ATIS) World Wide Web site, www.hivatis.org. The guidelines are available both as a Hypertext Markup Language (HTML) file for reading online, and as a Portable Document Format (PDF) file for printing.
The guidelines were developed by the Panel on Clinical Practices for the Treatment of HIV Infection, a joint effort of the US Department of Health and Human Services and the Henry J. Kaiser Family Foundation. Initially published in 1998, the guidelines were seen as a "living document" from the start, and are updated frequently by the Panel as new data emerge. This change of approach shows us how a new medium can subtly alter the way our society responds to a situation.
The "living document" approach is needed because the number of treatment options for HIV-infected individuals has increased dramatically, making decisions regarding therapy more and more complex, according to Anthony S. Fauci, director of the National Institute of Allergy and Infectious Diseases (NIAID) and co-chair of the Panel.
As new medications become available, and as some patients present signs that their HIV has developed resistance to one or more antiretroviral drugs, the evolving guidelines will ensure that the most recent solutions are available, across the world. They also deal with the tests to be used to identify developing resistance: since the highest priority goal of treatment is to reduce the patient's viral load to undetectable levels, this is a key aspect. The likelihood of success is far greater when the results of resistance testing are available to guide the selection of a new drug regimen for patients.
A key aspect of the guidelines comes in their list of other primary goals of antiretroviral therapy, set down to help medical workers and policy makers think strategically about antiretroviral therapy. The other goals include:
• restoration and/or preservation of the patient's immunologic function;
• improvement of their quality of life; and
• reduction of HIV-related illness and death.
The tools that may help achieve these goals include:
• maximizing patient adherence to a regimen;
• selecting "user-friendly" regimens when possible;
• prescribing drugs in a rational sequence in order to preserve future treatment options; and
• using drug resistance assays when treatment fails.
A number of other sections have been reconstructed in the light of these points, and a new hypertext link to detailed information on the use of antiretroviral drugs in pregnant women has been added. This information will help physicians select the most appropriate antiretroviral regimen for their HIV-infected patients who are pregnant.
#
"Researchers identify liver toxicity risk of AIDS drugs",1350,0,0,0
(Jan. '00)
Around 10% of HIV-infected individuals taking antiretroviral drugs experience liver toxicity at a level high enough to warrant stopping treatment. That is the conclusion of research reported in \IJAMA,\i the \IJournal of the American Medical Association,\i in early January. The results show that one particular protease inhibitor, ritonavir, carries five times the liver toxicity of the other drugs studied, accounting for half of all the cases in the study.
Previous anecdotal evidence suggested that all protease inhibitors were equally toxic to the liver, but this was not supported by the evidence. The researchers' advice: protease inhibitors can be used safely, but doctors should monitor liver enzyme levels for signs of trouble, and ritonavir should be used with caution in persons with underlying liver disease. So long as liver enzyme levels are monitored closely, says Sulkowski, doctors who avoid prescribing protease inhibitors in general, or ritonavir in particular, are likely to be doing more harm than good.
The finding is important because protease inhibitors have been the key to lengthening survival for people with HIV and delaying full-blown AIDS, yet some medical practitioners have hesitated to prescribe them after several case reports of liver toxicity, especially in people co-infected with the hepatitis C virus, even though the mechanism for the damage remains unclear.
The researchers analyzed 211 people who were undergoing treatment with four different protease inhibitors: ritonavir, saquinavir, indinavir, and nelfinavir, as well as 87 who were undergoing treatment with another category of anti-HIV drugs called nucleoside analogs. They collected information on patients' sex, age, race, social practices, drug doses and clinical variables such as new illnesses, and they also monitored liver enzyme levels using blood tests.
The 10% risk of severe liver toxicity for those taking protease inhibitors rose only to 12% for those with hepatitis C, but in assessing this, it is important to note that among patients not taking ritonavir, those infected with hepatitis C were three times as likely to develop severe liver toxicity, suggesting that, as a general rule, patients with hepatitis C co-infection may be at greater risk of medication-related liver damage.
\BKey names:\b Mark Sulkowski, David L. Thomas, Richard E. Chaisson, and Richard Moore.
#
"How viruses enter cells",1351,0,0,0
(Jan. '00)
Early January saw a fascinating report in the \IProceedings of the National Academy of Sciences,\i revealing how two related viruses use similar but distinct strategies to enter cells. Researchers compared the tactics of the poliovirus to those of human rhinoviruses, once they had analyzed in molecular detail how the poliovirus interacts with a cell to gain entry. The rhinoviruses are the common cold viruses, similar in size and structure to polioviruses, and both viruses are grouped in a family called the picornaviruses (a term which makes more sense when it is broken down as pico-RNA-viruses).
These two sorts of viruses use different receptors to enter a cell, but the receptors have a similar "footprint" and interact with the virus at similar sites on the virus shell. Each virus has a shell with deep crevices in it, and this is the site at which binding occurs. According to Michael Rossmann, " . . . this site may be a trigger for initiation of the subsequent uncoating step required for viral infection."
The importance of this discovery is that researchers will be able to compare the processes used by the two types of virus, and this will allow them, for the first time, to describe in molecular detail the process by which a virus selectively attaches to its particular receptor. They should also gain some insights into what makes viruses different from one another, and down the track, they could find ways to develop new drugs to prevent illnesses caused by viral pathogens.
The key to a virus entering a cell is specificity. A virus and its receptor have to complement each other, fit together like a lock and key, if infection is to occur.
The researchers used high-resolution cryo-electron microscopy and three-dimensional image reconstruction to get the first three-dimensional image of how poliovirus 1, one of the three types of polioviruses, binds to a receptor, a molecule called CD155, on the cell. This is just one of hundreds of types of receptors found on a cell, and each cell may contain thousands of these receptors on its membrane. While these receptors are there to perform specific chemical processes for the cell, viruses have hijacked the system, developing ways to use receptors to gain entry to cells.
The CD155 receptor is a single protein and is shaped somewhat like a leg divided into three sections, called domains, which extend from a "hip" that penetrates the cellular membrane. Rossmann's group has determined that the virus attaches at a site on the receptor located at the "foot" end of the molecule. Rossmann's group became the first to construct a three-dimensional image of a human cold virus in 1986, and has since analyzed the structures of several cold viruses attached to ICAM molecules such as the ICAM-1 receptor, which is used by numerous rhinoviruses to infect cells, causing colds in humans.
The receptors show both similarities and differences in their behavior. The long, slender CD155 receptor binds by jutting its end into the canyon, with part of the receptor lying roughly on the surface of the virus, while the ICAM receptor binds and radiates outward from the virus.
The two receptors contain different "residues", carbohydrate units which are stuck on the outside of the receptor, and these presumably allow the viruses to recognize their matching receptors. According to Rossmann, "These viruses have evolved to recognize molecules on the cell receptor that just fit them, and we can now see the reason for that". He adds that as the receptors lock into their specific binding sites, the viruses bind with the receptor to form a single complex. This step may trigger the process that causes structural changes in the virus necessary for cell entry, Rossmann believes.
\BKey names:\b Yongning He, Valorie Bowman, Steffen Mueller, Carol Bator, Jordi Bella, Xiaozhong Peng, Timothy S. Baker, Eckard Wimmer, Richard J. Kuhn, and Michael G. Rossmann.
\BScientist-speak\b:
Here, for those with a better command of scientific language, or some curiosity about how scientists communicate, is an abstract of the paper. An abstract is a quick summary, directed at others working in the same field of research, and so it pulls no punches, explains no technical details, and contains no excess words.
The structure of the extracellular, three-domain, poliovirus receptor (CD155) complexed with poliovirus 1 (Mahoney strain) has been determined to 22 angstrom resolution by means of cryo-electron microscopy and three-dimensional image reconstruction techniques. Density corresponding to the receptor was isolated in a difference electron density map and fitted with known structures, homologous to those of the three individual CD155 immunoglobulin-like domains. The fit was confirmed by the location of carbohydrate moieties in the CD155 glycoprotein, by the conserved properties of elbow angles in the structures of cell surface molecules with immunoglobulin-like folds, and by the concordance with prior results of CD155 and poliovirus mutagenesis. CD155 binds in the poliovirus "canyon" and has a footprint similar to that of the intercellular adhesion molecule-1 (ICAM-1) receptor on human rhinoviruses. However, the orientation of the long, slender CD155 molecule relative to the poliovirus surface is quite different from the orientation of ICAM-1 on rhinovirus. In addition, the residues that provide specificity of recognition differ for the two receptors. The principal feature of receptor binding common to these two picornaviruses is the site in the canyon at which binding occurs. This site may be a trigger for initiation of the subsequent uncoating step required for viral infection.
#
"Cloning identical monkeys",1352,0,0,0
(Jan. '00)
A report in mid-January in the journal \IScience,\i of a "cloned monkey" led to a flurry of media excitement and foolish references to Dolly the Sheep and human cloning. The simple reality is that a healthy female monkey named Tetra was produced by a technique called embryo splitting; she is the first monkey born by this method, which is, technically, a form of cloning.
The monkey was produced by separating an eight-cell embryo into four sets of two-cell embryos, which were then implanted in female surrogate mothers, but only one of the embryos survived to maturity. While a method like this, with more successful incubation, could produce genetically identical monkeys for research purposes, it is nothing like the tabloid media image of tycoons and dictators cloning themselves, and only distantly related to clones like Dolly.
The real importance of this work lies squarely in the area of medical research, where more complex diseases like Alzheimer's, AIDS, and cancer are proving difficult to study, using mouse models. As well, some human diseases do not surface when genetically introduced into mice. A monkey, while it is not human, is at least a primate, and so genetically much closer to humans, but for many purposes, genetic variability needs to be taken out of the equation. Mice can provide this genetic identity in a number of laboratory strains, but until now, the lack of identical, non-human primate models has posed a hurdle to researchers.
Once the technique is perfected and shared, Gerald Schatten says it could " . . . accelerate the work of thousands of scientists looking for cures to hundreds of diseases." Mice will still have a role, he says, but " . . . most admit another model for human disease is needed to bridge the gap between mice and sick people. We believe identical monkeys are the next logical step in finding these life-saving answers."
According to Schatten, this method, tied to findings from the Human Genome Project, looks set to make major advances, suggesting that efforts to break the genetic codes for diseases may lead to cures in the very near future. "Gene and cell therapies, including stem cell treatments, promise remedies to rebuild hearts damaged by attacks, spinal cord injuries and diabetes," commented Schatten. "Identical monkeys are the most reliable and appropriate models in which to perfect these cures before clinical trials. Also, since these monkeys are identical, the environmental influences in causing disease can be discovered."
And even the monkeys are set to gain from this. If the experimental subjects' genetics are identical, defined and invariable, then data can be gathered more accurately, which means that researchers can reduce the number of animals needed for a study. They will also find it much easier to identify any immune mechanisms of disease.
\BKey names:\b Anthony Chan and Gerald Schatten.
#
"Transgenic fish could threaten wild populations",1353,0,0,0
(Jan. '00)
Transgenic organisms have been a source of worry and fear, generally due to misinformation or lack of information (see, for example, \JCloning identical monkeys\j). A recent laboratory test shows that at least one kind of transgenic organism, a fish with a growth hormone from another fish, has the potential to replace native populations, if it is released into the wild. This is not evidence that all transgenic organisms are dangerous, but it shows the sort of assessment work that needs to be carried out before transgenic species are released.
The work also leaves aside the possibility that the change in populations \Imight\i be an improvement: for now, all changes are assumed to be bad, especially if they are irreversible. Even so, the situation that was set up is a fairly extreme case, where the laboratory "release" could reasonably be expected to cause "problems". The transgenic fish, while having an advantage in selecting mates, has poorer survival prospects, which makes it sound like a poor prospect for any development program.
Purdue University researcher Bill Muir and biologist Rick Howard published an account of their work in the \IProceedings of the National Academy of Sciences\i in November 1999, but it has only recently been discussed in enough detail. They used tiny Japanese fish called medaka \I(Oryzias latipes)\i to examine what would happen if male medakas carrying an inserted growth-hormone gene were introduced to a population of unmodified fish. For safety, the work was carried out in banks of aquariums in a laboratory setting.
Medaka females choose the larger of two competing males, and since transgenic fish are typically larger than the native stock, this confers an advantage in attracting mates: the females prefer the genetically modified but otherwise inferior medaka, inviting a hidden risk of extinction. (The genetic modification involves inserting into medaka a gene construct consisting of the human growth hormone gene driven by a salmon growth promoter.) Unfortunately, the change also reduces the offspring's ability to survive, so the release of a single transgenic animal could bring a wild population to extinction in 40 generations.
In a viability study, the transgenic fish were shown to have a 30% reduction in survival to three days of age. Assuming that large males had a four-fold mating advantage, based on observations of wild-type medaka, the researchers then used computer models to predict the consequences of the transgenic mating advantage combined with the reduced viability of the young, and came up with their conclusion.
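A minimal sketch of this "Trojan gene" logic can be run in a few lines of Python. This is our toy illustration, not Muir and Howard's actual model: it tracks only the transgene's frequency, using the four-fold mating advantage and 30% viability reduction quoted above, with a hypothetical starting population of 60,000 fish. In the full model, the falling viability then drags the whole population toward extinction.
  MATING_ADVANTAGE = 4.0    # large transgenic males are four times as likely to mate
  VIABILITY = 0.70          # transgenic offspring survival relative to wild type

  def next_gene_frequency(p):
      # One generation: selection upward through mating, downward through survival.
      mated = (p * MATING_ADVANTAGE) / (p * MATING_ADVANTAGE + (1.0 - p))
      return (mated * VIABILITY) / (mated * VIABILITY + (1.0 - mated))

  p = 1.0 / 60000.0         # one transgenic fish released into 60,000 (hypothetical)
  for generation in range(40):
      p = next_gene_frequency(p)
  print(round(p, 3))        # close to 1.0: the harmful transgene takes over anyway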
So we now know that, given certain circumstances, the release of transgenic organisms into the wild has the potential to inflict considerable harm; the moral would appear to be that we should seek to avoid such circumstances. Muir cautions that the results of his laboratory study should be interpreted conservatively, saying that his model is a simple one, and that in the complexities of the real world, the simple doom scenario may not prevail.
All the same, it is a fascinating case study from the evolutionary point of view, with a deleterious gene being given a selective advantage, contrary to what we would assume, based on standard theory. And like it or not, the finding sounds a clear warning, and the researchers' next goal is to replicate the study with larger fish of economic importance in a bigger environment, to see if a similar effect applies there.
#
"Prions are modular",1354,0,0,0
(Jan. '00)
Two complementary papers in \IScience\i and \IMolecular Cell\i in January describe how prions (proteins which can fold in two different ways) are able to pass their particular conformation from one generation to the next without any change in the DNA of the changed organism. It appears that prions are modular, and this simple fact is likely to change these peculiar and mysterious proteins into one of the most valuable tools in modern molecular biology.
In simple terms, a prion is a normal protein that has folded into an unusual shape. A protein is a long chain of amino acids, and once the chain has formed, it settles down into a predictable folded and coiled form. It is the shape of the protein, the size of the protein, and the distribution of charges over its surface which makes it chemically active inside a living thing.
When a protein flips into its "prion state" the protein can then direct other, healthy proteins of the same kind to adopt the misfolded prion form, so there is a sort of biochemical chain reaction. Once formed, the prions are both "infectious" and heritable, because they can be passed on from generation to generation, even though there is no change in the nucleic acid sequence of the protein.
Previous research had shown that there were two parts to a prion: the prion-determining region, and a functional domain which carries out the protein's normal job. According to the report, the prion-determining region of prion proteins can be transplanted onto other proteins, and this then makes them take up the prion form. As the newly-enrolled prions misfold, they clump together, and even though the functional part may still be active, it has no effect, because it is unable to get to where it is needed.
Lindquist and Li took the prion-determining part of Sup35, a known yeast prion, and linked it to a mammalian hormone response factor (a rat protein called a glucocorticoid receptor to be precise). This created a novel prion, and in the cells containing the misfolded protein, the hormone response factor was unable to function properly. Then by tampering with the levels of another protein in the cell, a protein called Hsp104, they were able to make the new protein switch between its functional state and its prion state.
Normally, Hsp104 keeps normal proteins in their properly folded states, but around prions, it helps keep them in their misfolded state, so when the Hsp104 is removed, this lets the prion clumps untangle and go back into solution in the cell.
The second paper describes a newly discovered yeast prion, found when Lindquist and Sondheimer wondered how many of the 6200 known yeast proteins could be prions, since there were known to be at least two. The search targeted a handful of suspect proteins with regions that looked like the prion-determining regions of the known yeast prions Sup35 and Ure2.
Sondheimer labeled the suspected prion-determining domains of these proteins with a fluorescent marker called GFP (Green Fluorescent Protein). Four of the proteins formed clumps in cells which appeared as large green fluorescent dots. When treated with Hsp104, one of the proteins, Rnq1, behaved like a prion in its response to different concentrations of Hsp104. Each time the Hsp104 was removed from the cell, the Rnq1 aggregates disappeared, and when Hsp104 was added, the Rnq1 formed clumps again.
After demonstrating that Rnq1 could be made to switch, he exposed a cell that had soluble, or non-aggregated Rnq1 to Rnq1 in its misfolded prion state, and showed that this cell now became infected with the prion form.
The prospects are quite dazzling: it may be possible to knock out single proteins in a cell, to find out what they do, and because the change is produced by a change in protein conformation rather than a change in DNA, it can readily be reversed by experimental manipulation. In other words, experimenters can switch cultures back and forth at will, between the active and inactive states. Something which began as an obscure disease of Scottish sheep called \Jscrapie\j, and then turned into \JBSE\j, now begins to look like an important biological tool.
In another study, reported in mid-January in the journal \ICell,\i a group of researchers, also working on yeast, found evidence that prions might be far more common than had been previously suspected. By identifying the prion-forming trait in distantly related yeast species spanning 300 million years of evolution, they seem to have evidence that prions may be advantageous in yeasts, even if they are unpleasant in mad cows, scrapied sheep and CJD humans.
In fact, other researchers have shown that yeast with prions known as PSI+ are more resistant to certain "environmental insults" than those yeasts which lack the protein, so we may even have a hint of the sort of function played by prions in yeasts.
The group found that across a wide range of yeasts, the ability of the Sup35 protein to form a prion state was strongly conserved. They suggest that the ability to form prions allows a cell to restrict the activity of a specific protein indefinitely, and yet revive the protein later, because a change in conformation is far less drastic than a change in DNA, which is much more permanent. Normally a DNA change is needed for a protein to change, say, to adapt the cell better for hot conditions.
Later, when conditions cool again, another DNA change back to the original form may easily prove impossible, while prions can "flip" much more easily. In some ways, this is rather like the Lamarckian form of inheritance of \Jacquired characteristics\j, which is currently being re-examined by some scientists. While there is not a lot of good evidence for the idea as yet, they will no doubt seize upon this as a possible mechanism for bypassing Mendelian genetics. Who said we knew all there was to know about science?
For some background information on prions, see \JNobel Prize in Physiology or Medicine (1997)\j.
\BKey names:\b \IScience\i and \IMolecular Cell\i papers: Susan Lindquist, Liming Li, and Neal Sondheimer, \ICell\i paper: Jonathan Weissman, Alex Santoso, Peter Chien, and Lev Z. Osherovich.
#
"The Cuba embargo and public health æcatastrophesÆ",1355,0,0,0
(Jan. '00)
The \IAnnals of Internal Medicine\i is hardly the place where you would expect foreign policy to be debated, but it happened in mid-January, when the continuing United States embargo against Cuba was accused of contributing to several public health catastrophes, among them an epidemic of blindness, more than 50,000 cases, caused by a dramatic decrease in the supply of nutrients.
Michèle Barry also pointed to an outbreak of Guillain-Barré syndrome, a form of paralysis associated with water contamination due to lack of chlorination chemicals as a direct result of the trade embargoes, and described cases of infants swallowing lye, which is used when soap is not available. The embargo covers both food and medicine, and Barry treats it as a war against public health with high human costs. She concedes that there is a problem with the curtailments of individual liberties and privacy by the Cuban government, but says that " . . . we as health care professionals have a moral duty to protest an embargo which engenders human suffering in Cuba to achieve political objectives."
The US trade and aid embargo against Cuba began in 1961, but became far more damaging when the Soviet bloc crumbled and stopped supporting the communist government in Havana, and it was stepped up with the passage of the Cuban Democracy Act of 1992, which prohibits foreign subsidiaries of US companies from trading with Cuba.
In a guest editorial in the same journal, US Secretary of State, Madeleine Albright, concedes that the overall record of sanctions as an instrument of policy has been mixed. She points to "wins" in South Africa, Rhodesia, and in persuading Libya to make available for trial two men suspected of the sabotage of Pan American flight 103, the plane brought down over Lockerbie. She also suggests that the pressure on Slobodan Milosevic is working, and that Aung San Suu Kyi supports the use of sanctions against the military regime in Burma.
In April 1999, she says, President Clinton announced that the USA would exclude food, medicines and medical equipment from future sanctions " . . .and that we would extend that principle to existing sanctions where we have the discretion under US law to do so. The change does not affect Iraq, Cuba, or North Korea, where food and medicine have always been exempt from sanctions, but it has enabled us to liberalize regulations that govern exports to Iran, Sudan, and Libya."
A long way down through a rather detailed analysis, Albright indicates that "The sale of medicines, medical supplies, and medical equipment to Cuba is governed by the 1992 Cuban Democracy Act. Within the limits imposed by that statute, the Department of Commerce licensed approximately $45 million in medical sales in 1998 and the first half of 1999 and more than $100 million in humanitarian donations of medicine and medical equipment."
The rest of her case may be found on the Web at http://www.acponline.org/journals/annals/18jan00/albright.htm, while the original article is at http://www.acponline.org/journals/annals/18jan00/barry.htm
It is quite obvious that, given a choice between carpet bombing or invasion on the one hand, and a trade embargo on the other, the trade embargo is likely to do less harm, if it is intelligently applied for good and legitimate reasons. The risk of course is that those in a position to place embargoes may develop a "bully" mentality. But even if this does not happen, the regimes placed under pressure by embargoes can be expected to provide televisual evidence that the effect of the embargo is to harm those least able to defend themselves. The boundary between propaganda and reality has never been more blurred.
#
"Counting monkeys",1356,0,0,0
(Jan. '00)
The idea of animals counting is not new: pigeons, rats, raccoons, ferrets and dolphins have all been shown to have some numerical abilities. The rural mythology of Australia argues that crows (Australian ravens) are excellent at counting. The crows also stand accused of killing young lambs, which is probably not the case, but farmers accept the accusation, and are both ready and willing to eliminate crows by shooting them. Depending on who is doing the telling, crows are able to count up to 5 or 9.
But while this is folk wisdom from the Australian bush, two Columbia University psychologists have reported in the \IJournal of Experimental Psychology: Animal Behavior Processes,\i that monkeys taught the numbers 1 to 4 seem to be able to count as high as 9. In fact, the rhesus monkeys that were used not only counted that high, but gave every sign of actually understanding the concept of numbers.
In the report, Elizabeth M. Brannon and Herbert S. Terrace describe how they designed experiments to test whether monkeys could learn rules for putting objects into categories and then apply those rules to a new set of objects.
To do this, they created computer displays with one, two, three, or four abstract elements such as circles, ellipses, squares, or diamonds of varying size and color. Three monkeys were then given training until they learned to touch the sets in numerical order, two of them in ascending order, and the other in descending order. After training the monkeys on 35 separate displays, the researchers tested them on 150 new displays and their performance did not falter.
They wanted to see if the monkeys understood the size relationship of numbers (four is greater than three, and so on), so they tested the monkeys on pairs of numbers not previously used in the tests: five, six, seven, eight, and nine. The monkeys trained on ascending order extrapolated easily, and ordered the numbers correctly 75% of the time, but the monkey trained on descending order was unable to solve the problem with the larger numbers.
In the world of numbers, there are different sorts of scales. The difference between a size 2 shoe and a size 3 shoe is one barleycorn, but a size 4 shoe is not twice as large as a size 2 shoe, any more than water at 70 degrees is twice as hot as water at 35 degrees (whatever scale we use), but 2 feet or 2 meters \Iis\i twice as long as 1 foot or 1 meter. Temperatures and shoe sizes are on an interval scale, while lengths and most other physical measures are on a ratio scale, which is also an absolute scale, because it has a definite zero. There are also ordinal scales, where items are merely ranked, and there is no regularity in the gaps between items on the scale.
So what sort of scale are the monkeys using with their counting? The researchers are only prepared to say that they are working on an ordinal scale, and that they have compelling evidence that number is a meaningful dimension for rhesus monkeys. They add that " . . . neither language nor numerical symbols are necessary for discriminating and ordering stimuli on the basis of their numerosity". They believe that their experiments did not fully extend the monkeys' latent numerical skills, and appear to be flagging that there will be more to come in this fascinating story.
Other points of interest: the monkeys were trained on number ordering skills for approximately six months, while it takes children learning numbers thousands of repetitions to master similar concepts. Further, there are many similarities in performance between monkeys and people when they are given similar tasks: both monkeys and humans are faster when asked to order three and nine than they are when asked to order three and four.
#
"Are solar systems rare?",1357,0,0,0
(Jan. '00)
We tend to assume that recent discoveries of stars with orbiting planets imply that there must be other stars with sets of planets like ours. There is just one problem: the huge planets that we can detect with today's technology are so large that they would sweep away the smaller planets that could support life forms as we understand them.
Behind the question "are there other solar systems?" our main concern is other solar systems that may support life like us, and in spite of the recent discoveries of extrasolar planets, we are no closer to answering the main question. A graduate student astronomer, B. Scott Gaudi, has now suggested that the solar system we know and value may indeed be a rare item in the universe. Gaudi has developed a way of using data gathered by the Probing Lensing Anomalies NETwork (PLANET) collaboration, in order to estimate the likelihood that extra-solar planets exist.
The answer appears to be that whatever happened when our solar system formed was not the norm, and that less than 45% of stars could harbor planets in a configuration similar to our solar system, though that still leaves a \Ivery\i large number of candidate stars. At the biannual meeting of the American Astronomical Society in mid-January, Gaudi presented his analysis of two years' worth of PLANET data and described his conclusions.
The eight institutions that make up the PLANET collaboration watch the skies for signs of gravitational lensing, an effect which is observed when a massive dark object in space, such as a dim star, crosses in front of a luminous source star in the background. Under these conditions, relativity takes over, and the light rays from the luminous object are bent around the massive object in front, rather as light rays are bent when they pass through an optical lens.
Looking from Earth, we see the star get brighter as the lens crosses in front of it, then as the lens passes on, the star fades again. Astronomers call the whole sequence a "lensing event."
If there are any planets orbiting a star which has been "lensed," they should show up as a "blip" of extra brightness during the lensing event. This effect should reveal planets down to Jupiter size, located at Jupiter-like distances, while "wobble" methods can only spot planets of that size if they are as close to their star as Mercury is to the sun.
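To see the shape of such a lensing event, the standard point-lens model is enough. The sketch below is purely illustrative: it assumes the textbook point-lens magnification formula, and the event parameters fed into it are invented rather than taken from PLANET data.

    import math

    def magnification(u):
        """Point-lens magnification at separation u (in Einstein radii)."""
        return (u * u + 2) / (u * math.sqrt(u * u + 4))

    def separation(t, t0, u0, tE):
        """Lens-source separation at time t (days), for a closest approach
        of u0 Einstein radii at time t0, with crossing time tE."""
        return math.sqrt(u0 * u0 + ((t - t0) / tE) ** 2)

    # The background star brightens as the lens crosses in front, then fades:
    for day in range(-30, 31, 10):
        u = separation(day, t0=0.0, u0=0.3, tE=20.0)
        print(day, round(magnification(u), 2))

A planet orbiting the lens star would superimpose a brief extra blip on this smooth rise and fall, and that blip is exactly what the PLANET observers are watching for.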
Gaudi argues that the choice is simple: either PLANET is not looking hard enough, or the planets simply are not there, and he believes that the PLANET search \Iis\i looking hard enough, yet it has still to detect a single planet. The collaboration has now sampled 23 events across 1998 and 1999. Gaudi says radial velocity surveys indicate that some 10% of stars have planets close in, and that based on the lensing events, something less than 30% of stars could have a Jupiter-like planet at distances between Earth's orbit and Jupiter's orbit.
"You might expect that if we have a Jupiter, then other solar systems probably have Jupiters three or even five times bigger," he said. "But the fact that we're not detecting any planets bigger than Jupiter probably indicates that there aren't many Jupiters out there."
#
"Near-Earth asteroids risk lower",1358,0,0,0
(Jan. '00)
What more exciting way of frightening people than to threaten them with big rocks, a kilometer in diameter, coming soon to a city or ocean near you, unleashing shock waves, tidal waves, and waves of panic? The movie makers have certainly cottoned on, and a few astronomers have been spreading alarmist messages about the risks. Now one astronomer is suggesting otherwise. According to David Rabinowitz, the number of "near-Earth asteroids" that have a chance, though minuscule, of colliding with Earth this century is half what was originally estimated.
That still leaves a risk, of course, but we have just doubled the odds that our grandchildren will survive - at least against threats from the sky. In fact, not one of the known asteroids is in imminent danger of falling to Earth, and that means that no impacts are predicted in the near future.
Previously, the number of asteroids in chaotic orbits was set at 1000 to 2000; now it is seen as more like 500 to 1000, each of them with about a 0.5% chance of colliding with the Earth in the next million years. That means 500 to 1000 objects, each with a 1 in 200 chance of hitting the Earth over the next million years. We may not be safe in the long run, but as the saying goes, in the long run, we are all dead. At least, says Rabinowitz, writing in \INature,\i we have a reasonable chance of finding the asteroids hundreds to thousands of years before they even come close.
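To put "minuscule" into figures, a rough expected-value calculation can be run on the numbers quoted above. This is only a sketch, and it assumes the impact chances are spread evenly across the million years:

    # Expected impacts this century, using the article's own figures.
    n_asteroids = 1000            # upper end of the revised estimate
    p_per_million_years = 0.005   # about a 1 in 200 chance per object
    years = 100                   # one century

    expected = n_asteroids * p_per_million_years * (years / 1_000_000)
    print(expected)               # 0.0005, about 1 chance in 2,000 per century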
The work, a collaboration with Eleanor Helin, Kenneth Lawrence, and Steven Pravdo, all of the Jet Propulsion Laboratory, used a US Air Force one-meter aperture telescope, linked to a large-format charge-coupled device, on the summit of Haleakala on the island of Maui, Hawaii. The telescope was designed to look for artificial satellites, which makes it sensitive enough for the task: any asteroid the size of a large house or smaller will generally burn up or blow up before hitting the ground. The larger ones, the size of a city block or more, which could punch through the atmosphere and raise a lot of dust, affecting the world's climate, are easier to see.
The researchers expect 90% of the chaotic asteroids to be identified over the next 20 years, but they hope to reduce this to just 10 years, which is an understandable aim. It would be a shame to get this close, and then miss out and be mashed by the asteroid we failed to spot.
#
"Hubble opens for business",1359,0,0,0
(Jan. '00)
In some rare good news for NASA, which has had a run of bad luck recently, the Hubble Space Telescope (HST) is back in business and working better than ever, following the successful Space Shuttle servicing mission in December. During that mission, astronauts refitted Hubble with improved electronics, a new computer, and critically needed replacement gyroscopes.
By the middle of January, Hubble was again delivering exquisite images, in particular of an intricate structure of shells and streamers of gas around a dying sun-like star 5,000 light-years away. Formally called NGC 2392, it is known loosely as the "Eskimo Nebula" because, as seen through ground-based telescopes, it resembles a face inside a furry parka.
In Hubble's sharp view, the "furry" features resemble giant comets all pointing away from the central star, like the spokes of a wheel. As yet, nobody has been able to explain the structure, but according to planetary nebula expert J. Patrick Harrington, "Of all the planetary nebulae imaged by the Hubble Space Telescope, this new image is unsurpassed in subtle beauty." He says that the clumps that form the comet heads all seem to be located at a similar distance from the star, and believes that this will be important in developing a theory on why the clumps formed in the first place.
#
"Mars Polar Lander news",1360,0,0,0
(Jan. '00)
There was a flurry of hope in December and January when signals were picked up which might have come from the lost Mars Polar Lander. While there seemed to be no chance of operating the lander as planned, there seemed to be some hope of getting a hint about what had gone wrong, but the results were inconclusive. The original suspected signal was so weak that it took several weeks for the Stanford team to process their data and reach the conclusion that they may have detected a signal that could have come from Mars.
In late January, Mars Polar Lander mission managers decided to send another set of commands to Mars to investigate the possibility that a signal detected by a radio dish at California's Stanford University had indeed come from the missing spacecraft. While the Polar Lander was commanded to send a radio signal to the 45-meter (150-foot) antenna at Stanford, several other dishes also listened in, but so far, there has been no signal.
The NASA scientists were not overly hopeful. According to Richard Cook, Polar Lander project manager at NASA's Jet Propulsion Laboratory, Pasadena, "The signal that the Stanford team detected is definitely artificial, but there are any one of a number of places it could have originated from on or near Earth. Still, we need to conduct this test to rule out the possibility that the signal could be coming from Polar Lander."
It now appears that they were wise not to raise anybody's hopes, and it looks as though the Polar Lander is lost forever - or until some later expedition reaches the area.
#
"FUSE spacecraft provides its first data",1361,0,0,0
(Jan. '00)
NASA's Far Ultraviolet Spectroscopic Explorer (FUSE) spacecraft has nearly completed its shakedown phase, and has been declared "open for business" according to Warren Moos, FUSE principal investigator at Johns Hopkins University, with the first scientific results being reported at the 195th meeting of the American Astronomical Society (AAS) in January.
The results confirm the nature of the Milky Way halo, an extended envelope of half-million-degree gas that surrounds the Milky Way. According to the FUSE data, the halo was generated by thousands of exploding stars, or supernovae, as our galaxy evolved. Shaped rather like a football, the halo extends about 5,000 to 10,000 light-years above and below the galactic plane, and it gets thinner as it gets more distant from the galactic center.
The hot gas halo has been known for some time, but that left us guessing about how it got there, and how it stays hot. The new FUSE observations reveal a large amount of oxygen in the halo where the atoms have most of their surrounding electrons stripped away. This form of oxygen, oxygen VI, could only be produced by collision with the blast waves from exploding stars, called supernovae, so this single discovery rules out the alternative view that ultraviolet radiation from hot stars could produce the halo.
Within the next few months, researchers expect to begin a comprehensive study of the abundance of deuterium, a fossil atom left over from the Big Bang, and while fine tuning will continue for some time, the amount of time given to scientific observations will go up.
FUSE is able to detect interstellar gas and determine its composition, velocity and distance by viewing bright celestial objects further away. The gas in between selectively absorbs the light from these objects in a unique pattern of colors, depending on the composition of the gas.
The FUSE spectrograph is at least 100 times more powerful than previous instruments, helping it to reveal a large number of new atomic and molecular features in interstellar gas that could only be guessed at before. The spectrograph separates the light into its component colors, similar to the way a prism separates white light into a rainbow. The resulting patterns identify the gas like optical fingerprints, and when the patterns are shifted to different colors by the Doppler effect, velocities and distances can be inferred.
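The velocity step rests on the classical Doppler relation: for shifts that are small compared with the speed of light, the velocity is the fractional wavelength shift multiplied by c. The snippet below is a minimal sketch of that relation, not FUSE's actual reduction software, and the example line values are chosen only for illustration:

    C = 299_792.458  # speed of light in km/s

    def radial_velocity(observed_nm, rest_nm):
        """Line-of-sight velocity (km/s) from a spectral line shift;
        positive means the gas is receding. Valid only for v << c."""
        return C * (observed_nm - rest_nm) / rest_nm

    # An O VI line with rest wavelength 103.19 nm observed at 103.22 nm:
    print(round(radial_velocity(103.22, 103.19), 1))  # about 87.2 km/s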
#
"DNA computing progress",1362,0,0,0
(Jan. '00)
DNA computing is not new: as far back as 1994, Leonard Adleman used DNA to solve a version of the traveling salesman problem. The idea is that words written in the letters of DNA, referred to as A, T, C and G, can represent the ones and zeroes used in the binary logic of computers. Instead of performing arithmetic, the DNA computer eliminates molecules whose sequences appear to be poor solutions and retains ones that seem more promising. At the end of the "run," the output from the remaining molecules can be read like the Baudot code on a paper tape.
But while DNA computing is no longer new, it passed two major milestones in January, as reported in papers in \INature\i in mid-January, and in the \IProceedings of the National Academy of Sciences.\i
The \INature\i paper deals with taking DNA computing out of the test tube and setting it on a solid base, made of glass and gold. This is not a huge step, but it is a key one, because it shows that computing can be simplified and scaled up to tackle complex problems.
The storage capacity of DNA molecules is immense: one estimate is that a single gram of dried DNA can hold as much information as a trillion CDs, far more than any existing conventional computer chip. Even more importantly, one biochemical reaction on a tiny surface can coopt hundreds of trillions of DNA molecules to work together, creating a parallel processing system that mimics the ability of the most powerful supercomputer. As a bonus, just as silicon computing approaches the unavoidable limits of miniaturization, DNA computing offers a way around that barrier.
The DNA molecules were applied to a small glass plate overlaid with gold, with the DNA being tailored to provide representations of all the possible answers to a computationally difficult problem. Then by exposing the molecules to certain enzymes, the molecules representing the wrong answers were deleted, leaving only the DNA molecules with the "right" answers.
It is early days yet, so don't make plans to set up culture dishes in the laundry to grow your next computer. All the same, the \INature\i paper represents the establishment of a testbed for working out an improved and simpler chemistry for DNA computing, and it shows a great deal more promise than fluid logic computing, which looked quite promising in the 1960s, or more recent efforts at making adaptive computers from Tinkertoy® components.
The \IPNAS\i paper deals with a computer that uses the biological molecule RNA to solve complex problems, in this case, the knight problem, where chess knights are to be placed on a board in such a way that no knight threatens any other. On a chess board with n squares, each square can either have a knight on it or not, so the number of possible configurations is 2\Un\u, a number which rapidly grows so huge as to be rather daunting.
In this test example, the board used was just three squares by three squares, a total of nine squares, which means 2\U9\u or 512 possible configurations. The RNA computer used enzymes to slash away the strands which did not meet the requirements for a correct solution, and it produced 43 correct solutions, but it also returned one incorrect response, reminding us that there is a marked need to develop error-checking techniques in chemical computing.
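For comparison, a conventional computer can brute-force the same toy problem in a few lines. The sketch below simply enumerates all 512 placements and counts the ones where no knight threatens another; it mirrors the logic of the RNA computer, not its chemistry:

    from itertools import product

    def attacks(a, b):
        """True if squares a and b (numbered 0-8 on a 3x3 board)
        are a knight's move apart."""
        ra, ca, rb, cb = a // 3, a % 3, b // 3, b % 3
        return {abs(ra - rb), abs(ca - cb)} == {1, 2}

    valid = 0
    for placement in product([0, 1], repeat=9):  # 1 = knight on that square
        knights = [i for i, k in enumerate(placement) if k]
        if all(not attacks(a, b) for a in knights for b in knights if a < b):
            valid += 1
    print(valid)  # placements, including the empty board, with no threats

Unlike the RNA version, this enumeration never returns an incorrect response, which is a reminder of how much error-checking the chemistry still has to catch up on.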
One of the advantages of using RNA for "DNA computing" is that with DNA, there is a limited set of restriction enzymes, which means that you cannot always cut the DNA molecule in the appropriate place. With RNA, say the researchers, they were able to use just one universal enzyme that targets any part of the molecule. This produced considerable streamlining, and offers the prospect of scalability.
#
"Quantum phase memory",1363,0,0,0
(Jan. '00)
University of Michigan physicists have created a database that stores and retrieves data in atomic quantum phase, according to a \IScience\i report in mid-January. They did this using ultrafast lasers and a beam of cesium atoms to produce a result equivalent to the standard bits and bytes used by today's computers.
The work brings to reality a theoretical approach to quantum phase for data storage and retrieval, which was proposed by L.K. Grover in a 1997 paper published in \IPhysical Review Letters.\i The work relies heavily on quantum mechanics, where electrons, instead of being particles whizzing around an atom, become either smeared clouds or waves, depending on your viewpoint, and they can exist simultaneously in an infinite number of locations or quantum states within the wave.
The main advantage of using quantum data registers according to Grover would be that the rules of quantum mechanics allow you to search many locations simultaneously, and he developed some algorithms to carry this out. The researchers explored one of Grover's algorithms, and found that, as he had predicted, the system offers a faster, more efficient way to store and retrieve data than the binary system we use today.
In the experiment, a computer randomly assigned data to one quantum state in a single cesium atom. Then using a pulse of ultrashort, intense laser light, the experimenters stored the information in the assigned quantum state by flipping the quantum phase or literally inverting the quantum wave for that state. Less than one nanosecond or billionth of a second later, the same atom was hit by a second laser pulse, which located the stored data by amplifying the flipped quantum state and suppressing all other states in the wave packet.
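The amplification step can be mimicked classically. The toy simulation below is an assumption-laden sketch of the Grover-style "flip one phase, then invert about the mean" move on a small register; it is not a model of the Michigan apparatus itself:

    N = 8                      # number of states in the register
    marked = 5                 # the state whose phase stores the datum

    amps = [1 / N ** 0.5] * N  # equal superposition across all states

    # "Oracle" step: invert the phase of the marked state.
    amps[marked] = -amps[marked]

    # Inversion about the mean amplifies the flipped state, suppresses the rest.
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]

    print([round(a * a, 3) for a in amps])  # the marked state now dominates

After a single pass, the marked state carries about 78% of the probability, which is the sense in which a flipped phase can be located so quickly.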
Bucksbaum was cautious about his work in a statement on the Internet, saying that "Quantum phase data storage is a new concept. Most researchers are using the spin of a quantum particle as a storage medium. Our work may turn out to be a step on the pathway to a viable quantum computer system or it could be a complete dead-end. The field is still too new to know which approach will succeed."
\BKey names:\b Philip H. Bucksbaum, Jaewook Ahn and Thomas C. Weinacht.
#
"Internet traffic density rises",1364,0,0,0
(Jan. '00)
The Internet traffic in North America, by far the largest Internet sector, has reached a staggering 350,000 terabytes per month according to a new study by a telecom market research firm, RHK. The level of data traffic is now, for the first time, greater than the level of voice traffic carried on the telephone systems ("the incumbent voice network infrastructure") of the continent.
This volume is still climbing, driven by more users, complex graphics, new applications and an increase in sophisticated Web business presence. As a result, say RHK, there is a growing need for "new, denser equipment solutions in a still evolving service provider network architecture."
Some of the contributions to their analysis are worthy of note: streaming media (see below) account for 10% of all Internet traffic, while the relatively new techniques carrying voice and fax over IP (Internet Protocol) represent between 2% and 3% of the traffic mix.
Streaming media are those systems which provide audio or video in "real time," with a small amount of buffering to cover problems when packets of data are lost in transit and need to be re-sent. With them, it is possible to see and hear material from the other side of the world almost as soon as it is broadcast.
Australians can listen to the BBC on the Internet, and Greenlanders can listen to Radio Australia in the same way, so long as their modems are fast enough. One interesting development in late January: Judge Donald Ziegler of the US District Court in Pittsburgh granted an order shutting down a Canadian Web site called iCraveTV.com, which has been accused of stealing programming from American TV networks and offering it as content on its site. Under the terms of the order, iCraveTV cannot continue operating until it shows that it can stop US Internet users from accessing the site.
The indications are that demands for ever-greater bandwidth are going to be part of the Internet picture for some time to come. RHK argues that an increase in data-intensive applications as well as faster access technologies for both home and business will continue to fuel the phenomenal growth of the Internet. While this growth may be flattened slightly by saturation of the North American market, this will be offset by increased growth in other parts of the developed world, and then perhaps in the developing world.
Only 1.5 million cable modems are in use in homes in North America, less than 1% of homes in the US and Canada, so growth in this area, and also in wireless communications, even in less developed countries, should see the market booming.
#
"Wearing your computer on you sleeve",1365,0,0,0
(Jan. '00)
As the Internet traffic soars, wearable computers are coming closer (See \JWorld's smallest Web server\j, February 1999), bringing the prospect of permanent connection, wherever you are, even closer. Andy Fagg of the computer science faculty at the University of Massachusetts has been seen recently, strolling through the frozen foods aisle of grocery stores, wearing an apparatus that positions a tiny computer screen in front of his face.
Right now, that apparatus tends to make people stop and stare, says Fagg, but the work he and other researchers are doing may make such sights as common as the personal music players that so many people use to play cassettes, CDs, or MP3 music. The idea of a wearable computer is that it should be as easy to put on as a baseball cap or a pair of shoes, and it should provide access to information and communication resources at any time during our waking hours.
The idea is less a matter of writing an e-mail as you stroll past the frozen foods, and more to do with having digital assistance as you go about your life. One of the standard far-fetched examples cited by Internet futurologists in the past few years has been "putting your toaster on the Internet". While this may be improbable, attaching your refrigerator, microwave, dishwasher, washing machine and other appliances to the Internet makes quite a lot of sense, if they are Internet-ready, and if wearable computers are a reality.
For example, a refrigerator with a bar-code reader could keep track of what goods have been placed in, and taken from, the refrigerator, so that this information could be pulled into the wearable computer, or transferred there automatically by infra-red. If the computer knows your dinner plans involve a certain recipe, he says, "If I drive near the grocery store, it wakes up and whispers, 'Don't forget to stop at the grocery store, and by the way, you need these three items for the dinner you want to cook tonight.'"
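As a thought experiment, the bookkeeping involved is simple enough to sketch. Everything in the snippet below is invented for illustration, from the item names to the idea that the fridge's bar-code log arrives as a simple dictionary:

    fridge_contents = {"milk": 1, "eggs": 6, "basil": 0}  # from the bar-code log

    recipe = {"milk": 1, "eggs": 2, "basil": 1, "parmesan": 1}

    def shopping_list(recipe, on_hand):
        """The items a wearable might whisper about near the store."""
        return [item for item, needed in recipe.items()
                if on_hand.get(item, 0) < needed]

    print(shopping_list(recipe, fridge_contents))  # ['basil', 'parmesan']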
Fagg sees more promise in having a wearable computer that learns your habits, recognizes that you are entering a meeting-room at the time set down for a meeting, and automatically assembles the documents, such as the minutes from the last meeting.
Right now, the computer weighs in at just under 3 kg, about 6 pounds, half of that made up of batteries. The components are off-the-shelf, soldered inelegantly, he admits, but this is just a prototype, with the whole unit crammed into a blue canvas camera bag, with an odd assortment of wires projecting from it.
The computer features an assortment of serial ports and video ports, a headset with earphones and a video monitor, and a Global Positioning System (GPS) receiver, which only works out-of-doors. When the screen is positioned correctly, it looks as though it's floating a few feet in front of the viewer. One of Fagg's current targets is to train the computer to recognize when it can interrupt, and when best to provide information; a visual display while he is driving could be a problem, and shopping information is of no use when he is working in his office.
For more information on the Web (where else?), see http://www-anw.cs.umass.edu/~fagg/projects/wearables
#
"Death row, Internet style",1366,0,0,0
(Jan. '00)
Spam, that infuriating characteristic of the Internet, just will not go away. Anybody with an Internet address, as soon as they join a list or start a Web page, finds their e-mail address stolen away and sold to spammers, people whose aim in life is to send advertising to as many people as possible, as cheaply as possible.
Spammers are regarded by the Internet community as bottom feeders from the shallow end of the gene pool, and this assessment seems justified when you look at their targets. Spammers' efforts are directed at those who will fall for the blandishments of get-rich-quick schemes, semi-confidence tricks, pyramid schemes and chain letters. There are others, of course, who try to sell pornography and the like, but all spammers have one thing in common.
A spammer does not care if you are interested in receiving the e-mail or not: even if 99.99% are grossly offended by their message, if they send out a million messages, the remaining 0.01% still means 100 new customers for the "service" on offer. When paper and print and postage are involved, advertisers need to tailor their mailings to target reasonable prospects, but sending the same e-mail to a million addresses is easy, once you have bought a CD-ROM with addresses on it.
It does not matter to the spammer that most receivers are offended and annoyed, since the e-mail address on the message is usually a fake, or has been closed down by the Internet Service Provider (ISP), once they realize that the account is being used for spamming.
Many users have learned to create filters that recognize certain hallmarks of spam mail, such as six successive dollar signs, certain domains, or key words and phrases which identify some of the sad missives that appear in our mail boxes. As well, some ISPs bar certain sorts of mail, acting as a barrier between spammer and recipient: if a large ISP receives 20,000 identical messages from the same source, it does not require rocket scientist status to work out that this is unlikely to be normal mail.
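A minimal sketch of such a filter might look like the snippet below. The patterns are only examples of the hallmarks mentioned above, the domains are invented, and real filters are considerably more elaborate:

    import re

    HALLMARKS = [
        re.compile(r"\$\$\$\$\$\$"),           # six successive dollar signs
        re.compile(r"make money fast", re.I),  # a classic spam phrase
        re.compile(r"@(spamhaven|bulkmail)\.example", re.I),  # invented domains
    ]

    def looks_like_spam(message):
        """True if any hallmark pattern appears in the message text."""
        return any(pattern.search(message) for pattern in HALLMARKS)

    print(looks_like_spam("MAKE MONEY FAST!!! $$$$$$"))               # True
    print(looks_like_spam("Minutes of the March meeting attached."))  # False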
One part of the Internet, usenet, has a better way of dealing with ISPs which act as safe havens for spammers, and during January, one particular provider, the @Home Network, was judged to have gone too far, and a Usenet Death Penalty (UDP) was threatened. Spammers had long used its servers to flood usenet with spam, and the company had not addressed the problem, the admins said, so it was necessary to take action. Sending spam to lists and newsgroups is a quick and easy way of ensuring that huge numbers of people see your message, without having to generate large transmission levels.
The site admins gave @Home one week to clean up its act before all usenet posts from @Home accounts were blocked. The previous inaction from @Home was replaced by a flurry of activity, and by the end of the week @Home had cracked down on spam. The admins called off the UDP and placed @Home on a 30-day probation. In other words, they need to keep cracking down on spam, or the UDP could be revived.
Usenet is described as a cooperative anarchy, where cooperation is assumed and expected, if only because misbehavior affects all of the sites which participate in usenet. A UDP involves site administrators around the world who act together to enforce cooperation. Under an active UDP, every message posted to usenet by the offending site is canceled or simply not propagated. Even legitimate messages are blocked, which tends to encourage legitimate account holders to go elsewhere. The UDP can be fine-tuned to cover only selected parts of usenet, or selected parts of a transmission domain, but the effect is one of silence. There is, they say, no censorship of content: a particular source of uncooperation is simply shunned by the rest of the community.
In the anarchy of the Internet, an act like this is not considered lightly, because it is contrary to the culture of the Internet as users understand it. So a UDP is only considered when the ISP in question has consistently failed to address the problem, whether it is spam, rogue cancels, or something else, coming from that ISP. There is also a system set up which assigns a pseudosite to certain classes of mail, like "Make Money Fast", which can be blocked by sites if they choose to do so.
The site being UDPed has no say in the matter; they have been asked to act many times, warned about the intended UDP, and in most cases, have denied the problem, but once the UDP was invoked, they have cleaned up the problem they did not have. No usenet site is forced to honor the UDP: if they do so, it is because they have decided to cooperate in enforcing cooperation. One company, UUNet, threatened legal action when they were UDPed in 1997, but nothing ever eventuated, except that UUNet stopped being a problem, and the UDP was lifted, according to those involved.
The effect of spam on usenet is to push legitimate traffic out of storage sooner than would otherwise be the case, so far from the UDP depriving spammers of their right of free speech, the spammers are depriving ordinary users and posters of their freedom to communicate.
When the @Home network was warned of the intended UDP, after huge numbers of complaints, with a full active Usenet Death Penalty to go into effect at the close of business, 17:00 PST, on Tuesday 18 January, 2000 (19 Jan 2000 01:00:00 GMT), they claimed that the problem came from subscribers who had installed proxy software incorrectly. This, said @Home, resulted in the subscribers becoming spam relays as spammers took advantage of this mis-configuration and sent thousands of newsgroup messages through @Home's news machines.
Their stated plan was to perform frequent network wide scans of their customer base to target proxy servers, and when these customers were identified, to suspend their news service immediately, and until the machines were secure. As a consequence of this, they asked that the deadline be extended. They were later placed on a 30-day probation, to see if spam levels went down and stayed down.
Some commentators were moved to question the continuing value of usenet, with @Home stressing that the usenet remains a "techie" location, and arguing that it is increasingly irrelevant as more users who are less familiar with the technology go on-line. Yet despite these brave words, they seem to have caved in very rapidly to the requirements of the Netizens who first asked nicely, then complained kindly, before baring their teeth.
#
"The robots are coming",1367,0,0,0
(Jan. '00)
All over the world, the number of robots is increasing, not only in the laboratories, but also in hostile terrains (see \JRobots on ice\j). According to the new second edition of the "Handbook of Industrial Robotics," complete with a foreword by the late science fiction writer Isaac Asimov and contributions from 120 experts, the population of industrial robots nearly doubled during the 1990s, and they are becoming increasingly important in applications ranging from quality control to space exploration, surgery to the service industry.
In this sense, "industrial robots" means all robots manufactured by industry, not simply robots used in industry. The first edition of the handbook was published in 1985, but while the principles remain the same, the number of robots is sky-rocketing, and our understanding is growing all the time. The new edition deals with matters like coordination and collaboration among machines and multi-robot systems, and there is even a section on group behavior of robots. When the first edition came out, people would not have even considered the idea that different kinds of robots can help each other perform certain jobs.
The main new trends involve the emergence of new types of devices, including tiny micro- and nano-robots and robots with multiple arms or legs, and also the innovations in technologies dealing with electronic controls and sensors, computer vision systems, virtual reality, artificial intelligence and nanotechnology in general.
Published in mid-1999 by John Wiley & Sons, and edited by Shimon Nof, the book pinpoints the following trends.
• In industry, the number of robots per 10,000 manufacturing employees went from 8.3 to 265 in Japan, 2 to 79 in Germany, 3 to 38 in the United States and zero to 98 in Singapore.
• There were about 35,000 robots in 1982, 677,000 in 1996, and an estimated 950,000 in the year 2000.
• Between 1992 and 1997, the robot population in North America grew 78%, from 46,000 to 82,000.
Two useful Web links:
Shimon Nof's home page is at http://gilbreth.ecn.purdue.edu/~nof/, and the Robotic Industries Association can be found at http://www.robotics.org
#
"Robots on ice",1368,0,0,0
(Jan. '00)
For some time now, researchers from Carnegie Mellon University's Robotics Institute have been working on remote-controlled robots, mainly for space exploration, and as a part of this, they have tried a variety of extreme environments (see \JRobots on Mars\j, June 1997 and \JRobots for the 21st century\j, August 1997). As an extension of this work, they have now deployed a four-wheeled robot named Nomad, to Elephant Moraine, a remote area in eastern Antarctica, 269 km (160 miles) northwest of the United States base at McMurdo Station.
Nomad's task will be "to search autonomously for meteorites and classify them in the field with scientific instruments contained in a newly developed manipulator arm". This is the first robotic search for extraterrestrial material that has fallen to Earth, and could serve as a prototype for future scientific missions to Mars and the Moon.
The US National Science Foundation's (NSF) Antarctic Search for Meteorites (ANSMET) was set up in 1976, and since then, ANSMET researchers have collected more than 10,000 meteorites during their annual expeditions to Antarctica, and these have later been made available to the world's scientists for study.
Elephant Moraine is named for its shape, seen as being like a small elephant with a very long trunk. This is considered to be one of the more important sites for meteorite discovery, with nearly 2,000 specimens recovered during seven previous visits, including the first meteorite identified as definitely being from Mars. The area to be searched, near the end of the elephant's trunk, was last searched in 1979.
After transport by light aircraft and helicopter, Nomad will spend three weeks of the southern summer (depending on the weather) driving, looking, choosing and testing to select meteorite candidates it encounters in the area. Among other things, the robot will use high-resolution imagery and spectroscopy to gather scientific data about the rocks it finds. An ANSMET member is on the field team, and will both serve as a guide to the region and collect any meteorites that Nomad successfully locates.
The Nomad program involves new skills for robots, because the robot will need to classify what it finds, where previous explorative robots have simply taken pictures or gathered data, and returned what they viewed to scientists who made all of the judgments and decisions. Instead, Nomad will make its own judgments and inferences about the rocks that it encounters, say the researchers.
The programmed search pattern is like that of a person mowing a lawn, with machine vision spotting the dark rocks against the white background of snow. Then it will use a zoom camera, relying on size and color, to identify those pieces of stone which are meteorites.
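The lawnmower pattern itself is straightforward to express. The sketch below generates such a sweep over an abstract grid; the grid size is arbitrary, and it says nothing about Nomad's actual navigation code:

    def lawnmower(rows, cols):
        """Yield grid cells in alternating left-to-right and
        right-to-left rows, like mowing a lawn."""
        for r in range(rows):
            cells = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
            for c in cells:
                yield (r, c)

    # A real robot would run its rock detector at each stop on the sweep.
    print(list(lawnmower(3, 4)))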
#
"The invading, evolving fruit-fly",1369,0,0,0
(Jan. '00)
The world has many examples of introduced exotic species such as African walking fish in Florida, European rabbits in Australia, "Australian pine" (she-oak, \ICasuarina equisetifolia)\i in Florida, and Japanese kudzu \I(Pueraria lobata)\i throughout the southeastern United States, decimating native species and upsetting ecosystems. But how do these invaders flourish so well?
Part of the answer to this question may be found in a study of the accidental introduction into North and South America of an Old World fruit fly which has exhibited one of the fastest evolutionary changes ever recorded. Fruit flies have several advantages in getting into the record books: they breed fast, and they breed in large numbers, as well as being small enough to be blown large distances.
A report in \IScience\i during January revealed that the fly, \IDrosophila subobscura,\i which was introduced into the Americas about two decades ago, has already evolved a north-south wing size pattern that mimics the pattern seen in established populations in Europe. At the same time, it seems to be replacing native fruit flies \I(D. pseudo-obscura)\i in the Pacific Northwest of the USA.
\ID. subobscura\i is a temperate-zone fly found naturally in a region stretching from Spain up to southern Scandinavia and from North Africa to the Middle East. It first appeared in South America near the Chilean port city of Puerto Montt in 1978. These flies quickly colonized much of coastal Chile, though scientists could see no change in wing length a decade later. They were initially discovered in North America in the early 1990s in Port Townsend, Washington state.
The two populations are remarkably similar in their genetics, suggesting a common stock that hitchhiked on a single ship which probably stopped in both Chile and North America sometime around 1978. Now the flies occupy a zone of 16 degrees of latitude in Chile, and have hopped the Andes into Argentina. In North America, they cover the coast from Santa Barbara in California to the tip of Vancouver Island in Canada, and reach as far east as Salt Lake City.
\ID. subobscura\i has been in temperate Europe for about 10,000 years, ever since the end of the last Ice Age, and the European population has bigger individuals at higher latitude. In the study just reported, individuals were trapped at 11 North American sites in 1997 and 10 in Spain, France, the Netherlands and Denmark in 1998. The separate populations were then bred in captivity over five or six generations of flies in similar environments, and then the offspring were measured for wing length. This is easy to measure, and it yields a highly repeatable index of body size.
In European populations, there is a 4% difference between the larger flies of Denmark, and the smaller flies of Spain. When the American flies, presumed to come from a common stock, are measured, they turn out to show a similar pattern, which must have developed since 1978.
The patterns are similar, but not the same: females show a 4% difference across the latitudinal range, but in males, this is down to only 1 or 2%, and the part of the wing which lengthens is different. If the European flies have longer "biceps", North American flies have longer "forearms."
The researchers have now obtained stocks of the southern flies from a number of zones, but have yet to report on the differences, if any, found in those stocks. Only one previous example of such rapid evolution has been seen, when Galapagos Island finches were subjected to a severe drought in 1978, and only finches with big, massive bills survived, necessarily passing that trait to the next generation.
If invading species are also able to evolve this fast, it may help to explain why they move so fast. According to Raymond Huey, "Previously scientists studying invasions have largely assumed that evolution can be ignored because it was thought to occur so slowly relative to the dynamics of invasion. This study shows that an invader can in fact evolve very quickly, in just a few years, and potentially have a big impact."
\BKey names:\b Raymond Huey, George Gilchrist, Margen Carlson, David Berrigan, and Luis Serra.
#
"A new order of marine fungi",1370,0,0,0
(Jan. '00)
Husband-and-wife research team, Drs. Jan Kohlmeyer and Brigette Volkmann-Kohlmeyer, have just posted advance notice that they have distinguished an entirely new order of marine fungi, to be called the Lulworthiales. Using DNA analysis and working with Joseph Spatafora, they have assigned 11 species in the genus \ILulworthia\i and six species of the genus \ILindra\i to the new order, and the work will be reported in the journal \IMycologia\i ("mycology" is the formal name for the study of fungi).
The species are all marine decomposers rather than parasites that attack living things, and they break down dead seaweed, marsh plants and wood. This last food source makes them economically interesting, since the wood they "eat" includes wharf pilings and boats made of wood. They are also ecologically important, because they serve to recycle nutrients in oceans and especially estuaries. Without them and their cousins, say the researchers, life would cease to exist in its current form on Earth.
The two have also found more than 100 new species of marine fungi on needlerush, a common grass species of North Carolina coastal marshes, which may be some sort of record, as the normal number of fungi on a single host is usually no more than eight.
#
"The big bang theory of human evolution",1371,0,0,0
(Jan. '00)
One of the standard triggers for evolution has long been considered to be the "bottleneck," where a small group of individuals is cut off from a larger population and, in isolation, begins to speciate, to change, as a group, much faster than a larger population could change. There are several reasons behind this.
First, there is the founder effect, where the isolated population is likely to be a small and related sample of the larger population, with far less variability. As a part of this, the fact that the group got themselves into isolation may indicate some unusual characteristic, like the intelligence to build boats, or the drive to cross a desert or a mountain range.
Second, there is genetic drift. In a small population, random events in mating may lead to genes dying out, and once gone, they are unlikely to return. As well, in a small and closely-related population, recessive genes which are bad news, deleterious in the language of geneticists, tend to be eliminated much faster, and any neutral or beneficial mutations are more likely to spread across the entire population, either at random, or by natural selection.
It is likely that most bottleneck events lead nowhere, because the populations die off, but if enough bottleneck events take place, some must succeed. In particular, around two million years ago, says a group of eminent paleoanthropologists, a small group of individuals became separated from other australopithecines. In this group, chance selection effects led to a series of sudden, interrelated changes in body size, brain size, skeletal proportions, and behavior that jump-started the evolution of our species.
Writing in the January 2000 issue of \IMolecular Biology and Evolution,\i the group have looked at the full spectrum of paleontological, archeological, and genetic evidence available, each telling a slightly different story about human origins. They have tried to estimate the error ranges in the gleanable hints, allowing them to spot the common, overlapping areas of agreement. In doing so, they claim to have disproved some high-profile recent theories while supporting one of the oldest modern versions of the origin of \IHomo sapiens.\i
In the polemic of paleoanthropology, that means that they have found another way of demonstrating that their preferred model fits the known facts better than any other model. In short, this is yet another round in the ongoing battle between the "out-of-Africa" group, who believe that modern humans all came from a bottleneck in Africa about 200,000 years (or maybe up to 400,000 years) ago, and those who believe that humanity evolved from earlier hominid forms in many parts of the world. To the "out-of-Africa" group, the Neandertals were an entirely separate species, unrelated to the later group who swarmed across the world from their origins in Africa, while the regional development school see the Neandertals as much closer, as quite possibly just another race of the same species.
What is common to both groups is that humans started in Africa, and nobody disagrees that there was at least one bottleneck - in fact, there is nothing in the "out-of-Africa" hypothesis that would be harmed by the acceptance of the two million year bottleneck. But to John Hawks, first author of the paper, that bottleneck was a key issue. "This original population lived before humans colonized regions outside of Africa. In fact, it was the act of becoming human that made these colonizations possible," he says.
Based on anatomical evidence, the authors conclude that a "genetic revolution" took place in a small group isolated from other australopithecines. They say that the earliest \IH. sapiens\i remains differ significantly from australopithecines in both size and anatomical details, and argue that these changes seem to have been sudden, rather than gradual. This is slippery ground, since their inferences are based on a patchy fossil record, and could be overthrown by a single fossil find, but in the absence of such a fossil, it seems to be a reasonable claim, especially when we consider the other evidence.
The evidence from archeology, also notoriously patchy, tells much the same story, suggesting a series of behavioral changes suggestive of a new adaptive pattern of hunting, gathering and scavenging. The authors say that body size is a key element in these behavioral changes, because a larger body requires different ways of moving, and both body size and locomotor needs push up the metabolic demands, the need for food. In short, these changes feed on, and fuel, each other. The behavioral changes shown in the archeological record are far more massive and sudden than any earlier changes known for hominids, they argue.
Where the logic gets tenuous is in their argument that this was the only bottleneck. They claim that the available genetic data do not disprove a simple model of exponential population growth following a bottleneck two million years ago and extending through the Pleistocene Epoch, the Ice Ages of the Northern Hemisphere. On their interpretation, they claim that the data are incompatible with any more recent population-size bottleneck, but this seems to be an argument based more in hopeful interpretation than in substance.
If they are correct, though, and if there was no later time when the size of the human species became small again, then the "Eve theory" of modern human origins, which sees modern human populations arising recently and replacing all of the world's other humans, will have to fall. But that is unlikely to happen this year, given the tenacity of the other side in this debate.
\BKey names:\b John Hawks, Milford Wolpoff, Keith Hunley, and Sang-Hee Lee.
#
"Shrinking iguanas",1372,0,0,0
(Jan. '00)
Over long periods of time, it is not unusual for species to evolve into larger forms, or even occasionally, into smaller forms. But when vertebrate animals appear to be getting smaller during the course of a study, this is usually dismissed as measurement error, or as simply impossible. Now, for the first time, a data set collected on the Galapagos Islands reveals that among marine iguanas, such shrinkage is both happening and reversible.
A report in \INature\i reveals that marine iguanas from two island populations shrank as much as 6.8 cm (2.7 inches) - up to 20% of body length - in two-year time spans. The plant-eating reptiles \I(Amblyrhynchus cristatus)\i were shrinking, the scientists say, to boost their survival during a change in the weather. In each of the periods of shrinkage, 1982-83, 1987-88, 1992-93 and 1997-98, the shrinking happened in El Niño years.
The most recent event, 1997-98, showed shrinkage that was just too much to ignore, and when scientists plotted the data, they found an interesting effect. It was individual iguanas that were changing, rather than some sort of selection effect that favored either larger or smaller individuals.
The islands of the Galapagos archipelago normally experience cold, nutrient-rich currents from both the west and south; during El Niño years, however, warm currents and heavy rains raise the sea temperatures. The iguanas eat algae along the tidal basins of the islands' rocky shores, but as the temperature rises, less digestible brown algae replace the iguanas' preferred green and red algae.
As they shrink, the iguanas also get slimmer, and their mouths get smaller. As a result, they are more efficient at harvesting the tiny amounts of available algae. Then in the years immediately after El Niño events, surviving iguanas ate well and got fat, then started growing longer again. In a careful study, 600 iguanas were measured and marked in 1992. Then after the next El Niño, they were measured again, and it was found that the larger iguanas, measuring more than 300 mm (12 inches) from snout to anus, were the ones which shrank the most and survived the longest.
The researchers argue that the shrinkage increases their chances of survival. A length change of 1 cm, they argue, can increase their survival rate by 10%, and by shrinking more, they can boost their survival chances by 35%. They believe that bone absorption accounts for much of the shrinkage, saying that a reduction of connective tissues between the bones cannot account for it, and that high levels of corticosterone may be involved.
One feature that may prove attractive to other scientists is the iguanas' ability to enlarge the bones again when conditions get better. Humans suffering from osteoporosis as a result of aging or space flight are unable to recover losses in bone density and length, especially when the losses are in the body's long bones. The control is presumably a hormone or a combination of hormones, or perhaps some other physiological mechanism that tells bone to regrow and recalcify: whatever it is, the mechanism may apply in all vertebrates.
\BKey names:\b Martin Wikelski and Corinna Thom.
#
"Emerald fields",1373,0,0,0
(Jan. '00)
Emeralds have long been regarded as a treasure. The Emperor Nero peered at the gladiators through an emerald monocle, and the Egyptians, the Moguls of India, the Aztecs, and the crowned heads of Europe have been among those who valued these gems. The \Jemerald\j is merely a \Jberyl\j with a coloring of chromium oxide, but the green color has led people to attribute to the emerald the power and strength of spring and immortality.
Emeralds are found in many parts of the world, but a report in \IScience\i in late January indicates that, for the first time, we may be able to "fingerprint" individual emeralds by their oxygen isotopes, to identify their origins. The results already available shed new light on the surprising extent of the Spanish trade, soon after the Spanish conquistadors discovered the New World mines, but they also point to "lost" Asian emerald sources in antiquity.
The story begins with the mystery of nine historic emeralds spanning a time period from the Roman occupation of France until the 18th century. They included a part of an earring from the Gallo-Roman site of Miribel in France; an emerald from the Holy Crown of France which was placed there by the crusading Louis IX (St. Louis) in the 1200s; 18th century emeralds from the treasury of the Nizam, princely rulers of the former state of Hyderabad in India; emeralds studied by French founding mineralogist Abbé Haüy (see \JHaüy, René Just\j) for his definitive description of the gemstone in 1806; and an emerald recovered from the famous wreck of the Spanish treasure galleon the \INuestra Señora de Atocha\i that sank in a hurricane off the coast of Florida in 1622.
The impetus came when work in mines in Colombia and Brazil showed that deposits from different regions, and often from individual mines, have very specific oxygen isotope levels. These levels in gems such as emeralds are influenced by the composition and temperature of the fluids that eventually crystallized to form the emerald, but the final values also depend on the composition and temperature of the rocks that the fluids passed through before they formed the gemstone. In any given site, the emeralds all fall in a cluster, with the oxygen isotope values falling in a narrow range. When this information is combined with the more usual gemological aspects, such as optical properties and the inclusion of other materials, it is possible to say where an emerald was formed.
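In software terms, the matching step amounts to comparing one measured value with a set of published ranges. The sketch below is purely illustrative: the mine names and the per-mil isotope ranges are invented placeholders, not the values for any real deposit:

    MINE_RANGES = {
        "Mine A": (6.0, 8.0),     # hypothetical oxygen isotope ranges
        "Mine B": (12.0, 14.5),
        "Mine C": (16.5, 18.0),
    }

    def likely_origin(delta_o18):
        """Mines whose published range contains the measured value."""
        return [mine for mine, (lo, hi) in MINE_RANGES.items()
                if lo <= delta_o18 <= hi]

    print(likely_origin(13.1))  # ['Mine B']

In practice, of course, the isotope value is combined with optical properties and inclusions before an origin is assigned.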
The isotope values were gathered using ion microprobe oxygen isotopic analysis. This involves bombarding the emerald with a focused ion beam, to dislodge oxygen ions from the crystal lattice of the gem. In the process, tiny pits, invisible to the naked eye, are formed, but the sample is still large enough to allow analysis. The method is described as "virtually non-destructive", close enough to non-destructive to persuade the curators and holders of these gems to allow them to be studied.
While the Holy Crown and Haüy emeralds were traced to known areas, the Miribel emerald turned out to have come all the way from Pakistan, while one of the Hyderabad emeralds came from Afghanistan. This seems to suggest that the old Silk Route of Roman times may also have involved the transport and trade of emeralds. Three other Hyderabad emeralds, always assumed to have been ancient, dating back to \JAlexander the Great\j, were shown to have been from Colombia. Since they reached the Nizam's collections in the 1800s, this suggests that the trade in emeralds from the New World spread further and faster than anybody had expected.
As well, the emerald from the wreck of the \INuestra Señora de Atocha\i turns out, unsurprisingly, to be of Colombian origin, but again, this shows an early Spanish role in bringing New World emeralds into Old World trade. The next target, say the researchers, will be rubies, just as soon as they have finished the necessary survey of the geology and geochemistry of ruby deposits around the world.
#
"Fertilizers and amphibian deaths",1374,0,0,0
(Jan. '00)
All over the world, frogs, toads and newts seem to be under threat, some of them in polluted areas, some of them in pristine environments. Pesticides, acid rain, predators, habitat destruction, pollutants, detergents and wetting agents, bacteria and fungi, and increased ultraviolet radiation from the sun are just some of the causes that have been suggested, but a report in the journal \IEnvironmental Toxicology and Chemistry\i points to yet another possible cause. The authors say they have discovered that levels of nitrogen-based compounds low enough to meet EPA standards for safe human drinking water, levels often found in agricultural areas as a result of using crop fertilizers, are enough to kill some species of amphibians.
The study showed that five species of amphibians, including the Oregon spotted frog, red-legged frog, western toad, Pacific tree frog and northwestern salamander, can be highly susceptible to fairly low levels of nitrate and nitrite exposure, especially in their more vulnerable tadpole stages. When exposed to only moderate amounts of nitrates and nitrites, some of the tadpoles, and also young frogs, swam less vigorously, showed signs of disequilibrium, developed physical abnormalities, suffered paralysis and eventually died. Others, in control tanks with normal water, survived quite happily.
In other words, nitrate and nitrite exposure at levels considered safe for humans or fish is able to inflict considerable damage on amphibians. Agriculture depends heavily on the use of artificial fertilizers rich in nitrogen to produce the world's food supply, and has done so for more than a century, drawing its nitrogen first from deposits in the \JAtacama Desert\j, and later from the Haber process (see \JHaber, Fritz\j). Much of the nitrate added to soil is leached down to the water table, and ends up in waterways, flowing down to the sea. More nitrate may come from natural sources: see \JNitrates in streams may be coming from bedrock\j, October 1998 for more on this.
The Oregon spotted frog has largely disappeared from most of its known historical range, which happens to be in an area of lowlands with intensive agricultural use. In the study, three environmental levels of nitrates and nitrites were used, and the Oregon spotted frog was the most sensitive. Overall, it was three to four times more vulnerable to nitrates and nitrites than red-legged frogs and Pacific tree frogs, and the researchers suggest that this could well account for its disappearance.
In just 15 days, nitrite levels considered safe for humans were enough to kill over half of the exposed Oregon spotted frog tadpoles. All five species were affected by nitrite levels which were within the EPA "safe" limits for warm water fish. This is probably the key finding: nitrates themselves are relatively harmless, but reduced to nitrites, they cause health problems. Shore sites with high contents of organic matter are usually high in nitrites, as are other areas high in animal manure, while nitrate can also be reduced to nitrite in the gut, especially in younger animals.
In all probability, the decline of amphibians has happened for a number of reasons in different places, so that there is no one single cause. In some places, there may be a synergy operating, where two or three different effects, each relatively harmless, work together to take out a population.
The increased numbers of frogs with extra legs, for example, have been blamed on a trematode parasite (see \JFrogs deformed by chemicals?\j, November 1997), but this parasite has been around for a long time. Perhaps the nitrate and nitrite levels are interacting with the trematode in some way. After all, the trematode lives part of its life cycle in a snail, snails eat algae, and higher levels of nitrogen-based fertilizers can cause increased algal growth, increasing the snail populations.
So there is a logical chain to be studied here, to be teased out, and tested for flaws. But while scientists work to identify the cause or causes, the amphibians continue to drop away. We will have to wait to see who wins this potentially deadly race.
#
"Impact from rainforest greenhouse gases",1375,0,0,0
(Jan. '00)
One of the standard wisdoms accepted by all conservationists is that clearing rainforests can only accelerate the rush into global warming. The logic is simple: trees use up carbon dioxide to make more wood and other solid carbon compounds, so if you chop down the forest and burn the wood, you increase the amount of carbon dioxide in the atmosphere, while taking away one of the tools that would absorb the gas. Now it appears that this may be too simple a picture.
Carbon is the building block of life and the basis of our food, and it weaves through all of life on Earth in a delicate balance. Plants take carbon dioxide, and fix it in the process of photosynthesis, making the solid compounds that all other life forms rely on. Plants are called "sinks", and so, too, are a variety of geochemical processes which mop up the carbon dioxide in the atmosphere. Carbon dioxide is released into the atmosphere from the burning of fossil fuels and wood, and from the decay of plant and animal material. These are known as "sources."
For a balance in nature, the output of the sources needs to match the uptake of the sinks, but the world's rainforests have long been seen as a sink that was under attack, being cleared, burned, and converted into farmland.
A report in \INature\i during January indicates that the carbon released into the atmosphere by deforestation offsets that absorbed by new forests growing, at least in the Amazon. Rather than acting as major sources of greenhouse gases, the forests under attack may still be neutral, in spite of all that has been done to them. If that is the case, then this may change the way we look at the significance of fossil fuel emissions, which may have played an even larger role than we thought in raising atmospheric carbon dioxide levels.
The authors based their conclusions on satellite data that charts deforestation each year, calculating carbon emissions caused by land-use changes, such as conversion to farmland, and estimating the biomass of forests and farmlands in Brazil from 1988 to 1998. They conclude that the forests of the Amazon basin are doing better than people have believed.
While the effect cannot be seen from looking at a single tree, the global view from a satellite can pick out the changes far better. From a distance, we can see past the local and seasonal fluctuations in the balance between source and sink. The satellite data plus computer models can indicate changes that are small on a single tree, but significant in an entire forest, and that is what we need to know about.
The finding will not be seen as good news by conservationists who have used global warming and the need to maintain biodiversity as the main legs of their case for conserving the forests. Luckily for them, the biodiversity argument maintains its validity.
\BKey names:\b David Skole, Walter Chomentowski, Richard Houghton, J. L. Jackler, K. T. Lawrence, and Carlos Nobre.
#
"Satellite evidence for warming",1376,0,0,0
(Jan. '00)
Late January saw the publication of a report in \INature,\i which confirmed that the atmosphere has gotten warmer and wetter over the last decade. The evidence comes from satellite data gathered as part of NASA's Pathfinder Activity. Frank Wentz compared the Advanced Very High-Resolution Radiometer (AVHRR, which measures sea surface temperatures), the Microwave Sounding Unit (MSU, which measures air temperature), and a Special Sensor Microwave Imager (SSMI, which measures humidity) for their accuracy and consistency in monitoring global climate.
Wentz and Matthias Schabel looked for matches between the data sets, and while 11 years is a short period to seek trends in the climate, the combination of these three instruments can produce a much better definition of climate trends than any of the instruments could do on its own.
Wentz said in a press release that "The three satellites combined provide some of the strongest evidence so far of a climate trend of increasing air temperature and humidity." Overall, they found that the amount of water vapor in the atmosphere had increased by 2% between 1987 and 1998. Water vapor is really \Ithe\i primary greenhouse gas in the atmosphere and has a greater influence on global warming than carbon dioxide, but it could also give our planet more cloud, increasing its \Jalbedo\j, and so cooling the planet.
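As a rough check on what that figure implies, a 2% rise over the 11 years from 1987 to 1998 works out to a little under 2% per decade. The trivial Python calculation below makes the conversion explicit; no new data are involved, only the numbers quoted above.

  total_rise = 0.02              # the 2% rise reported from the satellite record
  years = 1998 - 1987            # 11 years of data
  per_decade = (1 + total_rise) ** (10 / years) - 1
  print(f"{per_decade:.1%} per decade")   # roughly 1.8% per decade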
The increase in water vapor makes sense: as air temperature increases, the atmosphere is able to hold more water. With clear evidence that the atmosphere has significantly warmed and moistened over the last decade, the only question is whether this is due to natural climate variability or to human-induced climate change.
#
"New, major source of atmospheric methyl bromide",1377,0,0,0
(Jan. '00)
Methyl bromide (see \JOzone hole gets larger\j and \JLooking for alternatives to methyl bromide\j) was in the news again in January, with three reports in the same issue of \INature\i which identified salt marshes in southern California as a major natural source of the environmentally and economically important compound. The reports, one each from the USA, Japan, and Germany, also point to the marshes as a source of methyl chloride.
The ozone layer which stands between us and the dangerous ultraviolet radiation from the sun is at risk, mainly from chlorine-containing gases released into the air by human activities, but also from methyl chloride, which comes mostly from natural sources. Around the world, the production of chlorine-containing compounds such as the chlorofluorocarbons (CFCs) used as aerosol propellants, solvents, and refrigerants, has been brought under tight control, but there are still large gaps in our understanding of the natural processes involved.
There is more methyl chloride in the atmosphere than even the most abundantly used CFC, and most of it derives from natural environmental processes. Methyl chloride supplies chlorine atoms to the stratosphere, where they can deplete ozone. This was not a problem until CFCs came along, because natural ozone destruction was balanced by constant ozone formation.
Some 72,000 tonnes of methyl bromide is used around the world each year, and about half of this escapes into the atmosphere. Methyl bromide has multiple roles, serving as a fumigant in museums and as a pesticide against insects, nematodes, weeds, pathogens, and rodents. It is also produced naturally by oceans and by plants on land, and is generated as a by-product of leaded fuel combustion and vegetation burning.
The USA is the world's major producer, with about 21,000 tons used each year in agriculture, commodity and quarantine treatment, and structural fumigation, but because of its significant role as an ozone-depleting substance, governments have developed controls that limit methyl bromide production. Of the methyl bromide that reaches the atmosphere, about 10% is believed to come from burning vegetation, 20% from fumigation, and 30% is produced naturally in the oceans, but this leaves around 40% of the methyl bromide unexplained.
The American study accounts for a quarter of that production, but still leaves 30% unexplained, as much as the entire amount caused by human activity. Salt marshes add up to only 0.1% of the Earth's surface, but we now know that they emit methyl bromide at rates greater than any other natural environment, on a per area basis. The search for the missing 30% goes on, with more salt marshes in other parts of the world and mangrove forests to be tested to nail down even more of the methyl bromide "budget". The study by Japanese researchers suggests that forested subtropical islands seem to be particularly productive as methyl chloride sources.
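The budget bookkeeping above is easy to lose track of, so here is the arithmetic laid out as a small Python sketch, using only the percentages quoted in the text.

  # Sources of atmospheric methyl bromide, as percentages of the total.
  budget = {"burning vegetation": 10, "fumigation": 20, "oceans": 30}
  unexplained = 100 - sum(budget.values())       # 40% before the new study
  # The American study accounts for a quarter of the missing production:
  budget["salt marshes"] = unexplained / 4       # 10 percentage points
  still_missing = 100 - sum(budget.values())     # 30% still to be found
  print(unexplained, budget["salt marshes"], still_missing)  # 40 10.0 30.0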
The German study has shown that when organic matter is degraded in soils and sediments, chloride and bromide ions can be converted to methyl chloride and bromide (and other related compounds) by iron-assisted reactions. Curiously, these reactions take place even when soil samples are freeze-dried, which would destroy any microbes and enzymes in the soil, so the reactions must be purely chemical, rather than under the control of microorganisms.
#
"A coral thermometer",1378,0,0,0
(Jan. '00)
The continuing argument about global warming brings a regular trickle of claims that the trends we see may be natural cycles, or short-term changes that will soon be reversed. That trickle continues to provoke ingenious scientists to find new ways of getting a proxy measure of past climate: see \JGlobal warming confirmed\j, December 1997; \JTwenty-year temperature record revealed\j, January 1999; and \JFossilized emu egg shells have a story to tell\j, May 1999 for just a few examples.
Other evidence, like that described in \JA few hot years\j, April 1998, relies on a variety of information, carefully calibrated against what we can see and observe, allowing us to infer what the climate was like at times when there were no accurate records kept. The newest of these measures, coaxing climate signals out of coral reefs, was revealed in the pages of \IScience\i late in January. University of Arizona geoscientist Julia Cole, and her colleagues from Kenya and California, have used the annual growth rings in a coral reef in Kenya to construct a 194-year proxy record of sea surface temperatures for the region. When the temperature is warmer, the corals take up less of the heavier natural isotopes of oxygen, while in cooler conditions, they take up more.
They say that the western Indian Ocean warms and cools on a decadal cycle linked to El Niño. At its height, the El Niño/Southern Oscillation (ENSO) causes drought in the western Pacific, but brings more rain to East Africa, just as it does to the western USA, and it also warms the oceans off East Africa. According to Cole, "Our study indicates that in addition to being responsible for some of the year-to-year changes in East African climate, ENSO may also set the pace for slower decadal changes."
The changes begin in the Pacific Ocean, as a warming in the western Pacific, and then a chain of events is set off. One of these changes is detected by the corals of the Indian Ocean, and enters the record as the increased sea temperature leaves its mark on the coral growth, where it stays, almost forever. Given that reliable records of sea temperatures go back only a few decades, and given also that the observed global warming since about 1976 may be obscuring what is happening in the ocean, it is essential to get a longer time scale to study, and that is exactly what the corals have to offer. If El Niño and its knock-on effects are sensitive to greenhouse warming, the new data set should tell us that this is so.
They find, based on the coral at Malindi, Kenya, that the regional ocean warmed by more than "2 degrees Fahrenheit" (more than a Celsius degree) overall since 1801, and that the years since 1980 are the warmest on record. The ratio of heavy to light oxygen in the coral skeletons can be measured with an accuracy better than one part in 10,000, yielding a precise estimate of the water temperature.
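For readers who want to see how an isotope ratio becomes a temperature, here is a hedged sketch. The calibration slope below is an assumption on our part: values of roughly -0.2 parts per thousand of skeletal delta-18O per Celsius degree are commonly quoted for corals, but the Malindi calibration itself is not given in the report.

  SLOPE = -0.2  # per mil of delta-18O per degree C; assumed, not from the study

  def sst_anomaly(delta_18o_anomaly):
      """Convert a skeletal delta-18O anomaly (per mil) to degrees C."""
      return delta_18o_anomaly / SLOPE

  # A shift of -0.22 per mil would read as about +1.1 degrees C of warming,
  # roughly the change reported at Malindi since 1801.
  print(round(sst_anomaly(-0.22), 2))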
Most importantly, though, the coral they use \I(Porites lutea),\i which grows at 6 meters (20 feet), is ". . . the most tree-ring-like coral I've ever seen," according to Cole, and that means that it can serve as a calendar as well as a thermometer. Cole reports that it is possible to see the annual growth rings even with the unaided eye - usually, scientists must X-ray their specimens to observe the denser layers of growth that mark the end of another year by the coral's internal clock.
Further research is to involve examining some other coral cores to construct a more regional and seasonal picture of sea temperature fluctuations in the area.
#
"Coffee is good for you after all",1379,0,0,0
(Jan. '00)
Good science can come from almost anywhere, but it comes best when fertilized with novel information, provided to attentive ears. The "bean counters" who maintain a watch on research budgets often call into question the cost of flying scientists around the world, but without such meetings, this story might never have been told.
When Dr. Mike McLaughlin of the Australian CSIRO's Land and Water Division was in Delaware for a meeting, he heard something that made him prick up his ears. Dr. Gustavo Lagos, from the Catholic University of Chile, was presenting a paper about the heavy metal contamination in Santiago's drinking water. After the bad news for coffee drinkers last month (\JCoffee and cancer\j, December 1999), now there is some good news, at least if your drinking water is a bit of a problem.
Toxic metals like copper and lead can get into water from a variety of sources. Some of it comes from storage tanks and pipes, more comes from solder used in joining the lengths of pipe, and some comes from natural sources. In office buildings, water can remain in the plumbing for several days at weekends, dissolving traces of these metals, and water left in metal urns can acquire significant amounts of cadmium and zinc. (This risk is increased even more in large works, where the urns may be used to make tea in large quantities, but that is a side issue here.)
The metals are absorbed more easily by "soft" water, so the very best water is also potentially the very worst. Santiago's water is of the very best, but there were concerns about how much heavy metal contamination in the water was reaching the local population, and it was on hearing this that McLaughlin wondered: how do the people of Santiago drink the water? He suspected that they drank a lot of coffee, and it occurred to him to wonder if the coffee might, in some way, absorb and hold some of the heavy metals.
Both copper and lead have long-term toxic effects in humans, and lead has now been shown to be closely linked to intellectual impairment, a drop in intelligence, in infants and young children (see \JLead in the environment\j, September 1998 for more detail). We need small amounts of copper in our diet (see \JA new excuse to eat lobster, crab, and crayfish\j, July 1999), but not an excess.
The result was a piece of research which will appear soon in the journal \IHuman and Ecological Risk Assessment.\i It was performed by Chris Impelliteri and co-authored by Herb Allen, Lagos and McLaughlin. And the conclusion: drip-filter coffee takes out between 78% and 90% of the dissolved heavy metals from tap water, apparently by adsorbing the metals onto the surface of the coffee grounds.
The researchers set out to study this because, if the effect is real, it has important implications for public health: the daily human exposure to heavy metals in some parts of the world may be rather less than was assumed, and inaccuracy in public health risk assessment can lead to the loss of life, if it goes unquestioned.
Allen and his colleagues decided to find out how much of the contamination is removed in the brewing process, and how much is retained by the coffee grounds, and they found consistent results, using a drip filter, and three different commercial brands of coffee.
The science behind the adsorption is simple: the positively charged ions of the metals bind strongly to the uncharged or negatively charged molecules in the coffee grounds. As the water slips between the grains, every contact offers more chances for the ions to be adsorbed, and the research found that a deeper bed of coffee grounds did indeed produce more effective removal.
The team equates this to contact time, rather than distance traveled, and says that while plunger coffee may adsorb some of the heavy metals, they would expect the effect to be reduced because the water rushes through the grounds under slightly greater pressure than in a drip filter. Equally, instant coffee serves no useful purpose, since the entire brew is consumed: it is only when the grounds are discarded that they carry away the load of heavy metals.
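The contact-time argument can be put in simple mathematical form. If each moment of contact removes a fixed fraction of the metal ions still in solution, removal follows a first-order curve, and a longer soak gives diminishing but always-improving returns. The rate constant in this Python sketch is invented purely to illustrate the shape of the curve; the study itself reports only the overall 78% to 90% removal figures.

  import math

  def fraction_removed(contact_seconds, k=0.05):
      """First-order model: fraction of metal adsorbed after a given contact time."""
      return 1.0 - math.exp(-k * contact_seconds)

  print(round(fraction_removed(40), 2))  # ~0.86, inside the reported range
  print(round(fraction_removed(10), 2))  # ~0.39, a plunger-style quick pass

On this picture, the deeper bed of grounds and the slower drip both work the same way: they increase the effective contact time.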
So far, the system has not been tested to see if it deals equally well with cadmium, zinc and mercury. The researchers also found that tea bags bind lead, but not copper.
The ability to extract heavy metals seems to be undiminished when the coffee has been made, so there is obviously a great deal of extra absorption capacity left behind. We recorded a problem with mercury in Brazil last year (\JThe return of Minamata\j, February 1999) for which this discovery, or an extension of it, might be highly relevant.
If supplies of used coffee grounds, or other suitable waste plant materials were available and performed the same task, this could turn out to be far more important than its effect on risk assessment models for drinking water standards in the world's cities, and far more important than reassuring the world's coffee drinkers. And it all happened because scientists were funded to travel from various parts of the world to take part in a meeting.
Equally, students looking for an idea for a science fair project would do well to consider some of the enquiries they could engage in, based on the findings described here. They could investigate the effectiveness of different forms of plant material, test the toxicity of the water (it is no good removing the heavy metals if nasty organics are being added to the water at the same time), and enquire into the total load of heavy metals that a given filter material might carry.
#
"G. Ledyard Stebbins",1380,0,0,0
(Jan. '00)
A man regarded in his time as the world's leading expert on plant evolution, Stebbins died of the effects of cancer during January at his home in Davis, California. In a press release, the University of California reminded us that Stebbins was so brilliant that his theories on plant evolution established the discipline, yet so chronically absorbed in his thoughts that he once drove 120 miles without noticing a dead rattlesnake on the hood of his car.
That human aspect, however, is less the reason Stebbins will be remembered by biologists. He was long recognized internationally as a major leader and pioneer in the biological sciences, a reputation that was established by his 1950 text, \IVariation and Evolution in Plants.\i
His career got its start in walks with his mother, who taught her children the names and songs of birds, and with his father, who took the children prowling through Atlantic tide pools. When he studied at Harvard, he was exposed for the first time to the notion of Darwinian evolution, and in 1926 he decided to become a botanist. As a botanist, he brought a new slant to the study of evolution, which seems, most of the time, to be the province of zoologists.
Plants, he said, were different, and needed to be studied differently. "I pointed out, and still point out," he said in 1989, "certain differences among higher plants and higher animals that make it necessary to understand species in a different way. Certain things happen in plants that don't happen, or happen to a lesser degree, in animals." Over 70 years, from 1929 to 1999, Stebbins contributed ideas and knowledge to the biological community, so it is appropriate that he is commemorated in the name of the Stebbins Cold Canyon Reserve.
#
"February, 2000 Science Review",1381,0,0,0
\JReport card 1999\j
\JThe Internet and education\j
\JThe Web's potential for people with disabilities\j
\JPig intestines will cure it\j
\JA gene that incites colon cancer\j
\JNew drugs for prion disease in mice\j
\JGreen tobacco sickness\j
\JHIV and overland heroin routes\j
\JSalmonella and arthritis\j
\JThe structure of a cellular switch\j
\JPotato resistant to late blight and other diseases\j
\JGenetically engineered food and the developing world\j
\JWhy Johnny can't do science\j
\JMapping the world from the shuttle\j
\JThe nucleus of NGC 5548\j
\JThe end of the solar system\j
\JStardust begins work\j
\JThe most distant quasar in the universe\j
\JCommon algae can make hydrogen\j
\JThe end of the oceans\j
\JBird parents, predators, and family planning\j
\JThe origins of the first Americans\j
\JMustard plants mop up lead\j
\JSewage, urban runoff and algal blooms\j
\JThe secret of ecosystem health\j
\JA 16th century epic drought\j
\JAnother dry summer in north America\j
\JWater shortages in the US Southwest\j
\JBorehole temperature readings say it's getting hotter\j
\JThe anomalous lower atmosphere\j
\JDeath by global warming?\j
\JA faster rate of global warming\j
\JThe approaching ice age\j
#
"Report card 1999",1382,0,0,0
(Feb. '00)
Each year in December, \IScience\i journal publishes a list of the year's top ten stories, and soon after, we list the places where we covered these stories as they happened - or where we provided background before the story happened, or we note our lapse, and make it good, as soon as possible. At the end of 1999, we missed doing this, so here, belatedly, is our scorecard for 1999.
\BBreakthrough of the Year\b: this was the stem cells story, and we have 49 references to the term 'stem cell' in our updates. See \JA breakthrough with human embryonic stem cells\j, November 1998, for a good introduction, and search on 'stem AND cell' for other hits.
\BThe runners-up\b
\BGenomics\b: The terms 'genome' and 'genomics' appear 194 times in our science updates. See \JReport card\j, December 1997 for some of our past coverage. See \JThe first complete DNA sequence of plant chromosomes\j, December 1999 for a more recent account with a list of earlier articles.
\BLaser Cooling\b: see \JNobel Prize in Physics (1997)\j for a general background on this.
\BRibosome structures\b: see \JThe first clear picture of a ribosome\j, September 1999.
\BPlentiful Planets\b: see \JPlanet light\j, December 1999, for our most recent story; see also \JPlanets galore\j and \JTwo New Planets\j, both September 1998; \JNew planet\j, June 1998; and \JA new solar system\j, April 1999.
\BMaking Memories\b: See \JMaking a Smart Mouse\j, September 1999 for this story.
\BFlat and Happy\b: Is the universe flat? Cosmologists think so, but we trod lightly on this story of speculation and surmise. You will find parts of it under \JThe top ten breakthroughs of 1998\j, December 1998, and again in \JWhen was the Big Bang, again?\j in May, 1999. Count this one a near-failure.
\BNew Light on Photonics\b: See \JGetting More Bandwidth\j, November 1999 for our most recent coverage of this topic.
\BTracking Distant Ancestors\b: Well, we more or less missed this one, a story about chemical residues from very old shales that suggested that there may have been eukaryotic cells around 2.7 billion years ago. Actually, we ruled it out as a bit too speculative, but now that the work is widely accepted, we have a small amount of egg on our face, and we will bring you a full account later this year.
\BMystery Flashes Unveiled\b: The gamma ray burst story: there are no fewer than 11 updates: search for 'gamma AND ray AND burst' to find them.
We elected to ignore the \IScience\i 'Breakdown of the Year', the Creationist win in Kansas, as being only of local American interest. We covered their 'Blunder of the Year' in \JMars Climate Orbiter goes missing\j, September 1999, on the metric/Imperial units mix-up that lost a spacecraft, and we have provided a detailed analysis of their 'Controversy of the Year', the question of GM foods - see \JThe great genetic engineering battle\j, July 1999, and the stories which follow it.
What a pity we decided the chemicals in the shale weren't interesting enough. Oh well, there's always next year . . .
#
"The Internet and education",1383,0,0,0
(Feb. '00)
There are a number of legends about scientists being asked about the worth of their discoveries (two of the most famous will be found under \Jelectrical measurement\j) but the more common approach to a new technology is to look at it in the form that Marshall McLuhan called rear-view mirror driving. In this approach, we look backwards, and consider, say, the telephone as a form of telegraph, or television as radio with pictures, or we compare the functions of the railway train with those of the stage coach.
The comparison is usually critical of the new technology, so the question becomes 'what educational use is this Internet thing when there is no place to use an overhead projector on it?' As we consider new technology, we fail to recognise that it probably has many new functions to offer.
At the moment, people are only just beginning to look at the Internet as a tool for education - or as any other sort of tool for that matter, but this is hardly surprising, given that the Internet is still in its infancy, even though it has existed since 1969. Many new technologies seem to follow a pattern of twenty years of development before the technology is properly worked out, followed by thirty years of growing acceptance and expansion as new functions are recognised. During the thirty years, people begin to see what the technology is really about, and how they can best use it, and it changes its shape, while changing the society that uses it.
This curious half-century pattern probably relates to the time it takes to replace an adult human population with people who have had time to grow used to the new technology, and then get into positions of power where they can apply it, but give or take a few years, it is remarkable how many technologies have kept to this timetable.
The first half-century of printed books saw 35,000 or perhaps 40,000 titles and 20 million volumes which are known collectively as incunabula, from the Latin word for "cradle". At the end of two human generations, the technology of the \Jbook\j had settled into a stable pattern, with accepted sizes, binding standards and typefaces, and most developments since that time have followed a similar path. New forms like the novel and the coffee table book may have emerged since then, but there is nothing printed in books today that would not be immediately understandable to somebody who learned to read in the year 1500, at the end of the incunabula phase.
The first real \Jrailway\j was the Liverpool and Manchester line, built in 1830, with many lines being built by 1850, and continents being spanned by 1880. After that time, the railway shaped the nature of warfare by allowing troops to be transported rapidly into a war zone. For a long while, though, the locomotive was known as the Iron Horse, showing how people looked backward for an analogy that explained the new technology.
"Wireless telegraphy" (a most revelaing name!), later to be called radio, is another fine example. Radio began when Hertz discovered radio waves in 1888, by 1908, it was in effective use, and by 1938, most homes in the developed world had a radio receiver. Television began with the first successful transmissions in 1925, and after World War II, it took off in Britain and America in 1946 - by 1975, most homes in the developed world had a television set.
The telegraph itself began with the first workable telegraphs in Britain and Germany around 1835, large-scale wiring of individual countries by 1855, and the world linked by submarine cables by 1885, so reporters in Australia and the USA no longer rushed to board arriving ships, eager to get copies of overseas papers first, to tie up the latest news. The telephone was invented in 1876; it was beginning to be accepted in some offices around 1900, by 1914 people in New York could talk to people in San Francisco, and the telephone became common in homes around 1930.
The first projected motion pictures were shown to the public in 1895, and by 1915, movies had progressed from short scenes to complex features, cinematography had become an art, and California had become a centre for film-making - by 1945, the Hollywood system was in full swing, and many of the classic films that we know and revere, even today, had already been made. Later, advances in computerised special effects would make most of these films seem amateurish by modern standards. (Looking at it another way, feature movies started around 1909, \IThe Jazz Singer\i, the first "talkie", was filmed in 1927, and 1959 saw that standard "biggest movie", \IBen Hur,\i released in cinemas.)
Computers were first seen only as glorified adding machines, which explains their name - the original "computers" were people, charged with the task of carrying out arduous and repetitive computations and calculations. While early versions were seen in the 1940s, modern computing really began with Fortran, invented in 1956. In 1975, the first floppy discs were being sold, the Altair 8800 "computer" was on sale, and the software company Microsoft was founded. By 2005, when the half-century is up, most homes in the developed world are likely to have a computer - but few of them will be used for old-fashioned calculation of the sort that the original computers did.
Then we have the Internet, which started as ARPANET in 1969, and came into wider use around the world in 1989, the year in which Tim Berners-Lee created the World Wide Web. As we begin the new century, if the pattern holds, we are one-third of the way into the adoption phase, and we are beginning to have some ideas about what the Internet is, how it should work, and the functions it may fulfil. A few visionaries could see Internet-type structures as far back as 1945, but could they have predicted the World Wide Web?
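The pattern is easy to check against the dates given above: take the starting year, add twenty for the end of the development phase, and fifty for full adoption. A few lines of Python are enough; the dates are the ones quoted in this article, and the Internet's maturity date is, of course, an extrapolation.

  technologies = {"radio": 1888, "television": 1925,
                  "computing (Fortran)": 1956, "Internet (ARPANET)": 1969}

  for name, start in technologies.items():
      # a development phase of ~20 years, then ~30 years of adoption
      print(f"{name}: worked out by {start + 20}, mature by {start + 50}")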
In the late 1980s, administrators in museums of science and technology found it a challenge to get even one computer for a staff of fifty - now we would take it for granted that each work space in an office would have a computer, and that these will be networked. Equally, in the early 1900s, a telephone on each desk was a ridiculous idea, and the notion of a teacher in a classroom having phone and Internet connections is still seen as remarkably wasteful - in most schools, anyhow. And the schools where each student has a computer are rare, while students are required to complete examinations with pen and paper, even where the main part of the examination is essay-writing.
Clearly, technologies continue to be used and modified after their development and adoption phases, but the changes they make to society are muted after that time, as the technology simply becomes a part of the background. Jet airliners began in 1952, and matured with the flights of the first Boeing 747 and the first Concorde, both in 1969 - and changed the way people in the developed world take their holidays and vacations by the end of the century.
And who could have predicted in 1952 that holiday-makers would use Internet cafés to maintain a link with the folks at home, or that the vacationers would quite probably have booked their flight over the Internet, while they most certainly would have found out about their destination on the Web, learning about what they could see and do 'on the Net'? Which brings us back once more to the role the Internet will play in the future in helping people to learn, to find information, and to gain an education.
The matter seemed important enough to the AAAS for them to offer a session on the educational potential of the Web. Right now, said Robby Robson, the education reform coordinator at Oregon State University and a main speaker, the Internet offers a jumble of information that is poorly organized and frustrating to use. Robson believes that, in the end, the system could become so effective and sophisticated that it will literally challenge the need for centralized schools and universities.
At the moment, the Internet offers remote access to raw information, but as Robson pointed out, this is not the same as real education. "I would suggest that no one should judge the usefulness of the Internet in education by what they see available today," Robson said. "That would be like equating our modern transportation system to the earliest automobiles travelling on dirt roads."
The key to the future, according to Robson, will be the use of metadata, data about data, information that will vastly improve our ability to find useful and trustworthy information. This metadata concept is similar to the information available from a library's card catalog, but would contain additional information, such as technical requirements and educational level, and it could be extended to meet many other purposes.
This would allow the information to be matched to the user's known profile or stated requirements, taking into account things like age, skill level, interests and educational goals. This is not the way that search engines work right now, where they generally search for phrases or words in the entire content of the pages they can see, a bit like ferreting through a warehouse, opening every carton, looking for useful items, more of a rummage than a skilled and honed enquiry.
"In the next few years, however, we should have trusted repositories for education that can use metadata to make the Internet far more useful in education," Robson said. "This infrastructure doesn't yet exist, but we're building it."
This would also probably lead to every learned or professional society in the world putting on line, in a standardized format, such things as course offerings, faculty databases, conference information or professional publications. With that in place, a user seeking a course in forest ecology could quickly be steered towards institutions worldwide that might have such courses, some of them available online.
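To see the difference between a metadata search and a full-content rummage, consider this small Python sketch of the forest ecology example. The record fields here are hypothetical, chosen for illustration; real repositories would settle on agreed standards of the kind Robson describes.

  catalog = [
      {"title": "Forest Ecology 101", "level": "undergraduate",
       "format": "online", "subject": "forest ecology"},
      {"title": "Advanced Silviculture", "level": "graduate",
       "format": "on-campus", "subject": "forestry"},
  ]

  def find_courses(subject, level=None, online_only=False):
      """Filter on descriptive metadata instead of raw page text."""
      hits = [c for c in catalog if subject in c["subject"]]
      if level:
          hits = [c for c in hits if c["level"] == level]
      if online_only:
          hits = [c for c in hits if c["format"] == "online"]
      return hits

  print(find_courses("forest ecology", online_only=True))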
Robson also addressed the problem of shovel-ware, a problem which dates back at least to the start of modern printing, when every printer's earliest job seems to have been the Bible, simply because it was available. In the same way, radio featured the stars of vaudeville, and television used up the stars of radio or cinema in its early years. "Right now, most people who are offering education on the Internet are just dressing up the same old materials in an electronic format," Robson said. "But there's much more that can be done."
On Robson's prescription, the future educational Internet will involve new digital libraries, "interoperability" so all types of data can be routinely blended and easily managed, interactive courses, lessons that are individually tailored for the needs of specific students, materials offered in the format that different students find most useful, and a pace that is specifically suited to an individual. He thinks online teaching could be as effective as, or more effective than, any other kind.
The effectiveness of that process, in Robson's vision, could ultimately challenge at least some of the need for large centralized schools and universities. We might see anything from "college towns" enjoying elevated status as the nerve centers of a new economy to their suffering the same fate as mill towns did in the twentieth century, he said.
But of course, none of this will ever happen, because the instructor cannot write on people's terminals with a stick of chalk. If you have not heard that objection yet, listen carefully, for negative statements with precisely that degree of logical consistency can be found in every newspaper, and every staff common room, across the world, because people are looking at a new medium in the rear view mirror.
#
"The Web's potential for people with disabilities",1384,0,0,0
(Feb. '00)
How do people who are blind, deaf, or disabled in other ways access and use the Internet effectively? The answer is that they can do so quite well, provided the available steps are taken to improve their access, but according to what the AAAS was told by John Gardner, a physicist who is an expert on the needs of the blind, these steps are often ignored. Industry and government, he says, have been slow to address this issue and make minor changes that could have a major impact.
Gardner developed his expertise the hard way, losing his eyesight to glaucoma in mid-career, and he has developed a research initiative called the Science Access Project. Expertise develops fast when you miss out on a grant because, being blind, you cannot see that the application is not double-spaced, as required by the grant agency. It is that sort of (dare we say it?) blind foolishness that Gardner would prefer not to see happening on the Internet.
Much of the time, the better way of doing things helps other Web users as well, simple matters like adding an ALT statement identifying the image in a Web page when an image is added, but most page builders do not bother. There are many people who learn better in a variety of ways, by seeing, by hearing, touching, repeating, doing, and if these options are available, the result is more effective communication for all, not just for the disabled.
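The ALT statement Gardner mentions is a one-line fix, and checking for it can even be automated. Here is a minimal sketch using only the Python standard library; the sample page is made up, and a real checker would need to handle many more cases.

  from html.parser import HTMLParser

  class AltChecker(HTMLParser):
      """Collect the images on a page that carry no ALT text."""
      def __init__(self):
          super().__init__()
          self.missing = []

      def handle_starttag(self, tag, attrs):
          attrs = dict(attrs)
          if tag == "img" and "alt" not in attrs:
              self.missing.append(attrs.get("src", "(no src)"))

  checker = AltChecker()
  checker.feed('<p><img src="map.gif"><img src="key.gif" alt="Map key"></p>')
  print(checker.missing)  # -> ['map.gif']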
Gardner has developed a new form of Braille called "Dots Plus" that can make advanced mathematics available for the first time to blind people, and has also helped to create a start-up company called ViewPlus Technologies in Oregon that can directly market some of the new technology his research has invented.
Recent advances include a free "triangle" computer software program that helps blind students read and write mathematics and science. A graphing calculator plays an audible tone that can help blind people conceptually understand the image. New systems have been created to display graphic images from the web. And an instrument called the "Tiger Advantage" has for the first time allowed easy printing of graphics in a tactile, raised-dot form that blind people can use.
"Tiger Advantage" prints directly from standard Windows applications, allowing easy one-step production of text and tactile graphics.
gardner2.jpg
\BCaption\b: This map of the United States, with state borders, was created using the "Tiger Advantage" printer.
Gardner's main point, though, was that virtually all information on computers and the web should have multiple levels of access that anyone - normal or disabled - can tap into, to use in whatever way works best for them. For instance, he argued that some new graphics now showing up on the web are using a technology called "scalable vector graphics" that significantly improves their quality. But if people could click on "accessibility links," a different format might appear and labels could be attached to certain points.
The information could then be made far more informative and useful both for normal and disabled users. A voice synthesizer might describe aspects of the graphic image, a "force-feedback" mouse could provide physical feedback so the user could help sense the graphic shape, or a tactile image could be printed. Other innovations would also be possible, so that on a map, a user could move down a row of buildings, feel their outlines and hear descriptions of the building names or the streets in front of them.
"These things are possible, they are not that technologically difficult and I think it's time for them to happen," he said in a Web news release just after the AAAS meeting. "Now."
#
"Pig intestines will cure it",1385,0,0,0
(Feb. '00)
One of the more unusual releases circulating on the Internet in February describes how a material derived from pig intestines is finding a role in medicine. This one product is being used to cure chronic sores, treat incontinence in women, and repair internal organs and hernias. Overall, the US Food and Drug Administration has cleared five applications for its use so far, with the most recent being the treatment of "full thickness" wounds so severe that they sometimes result in amputation, if they do not heal properly.
The information comes from a promotional release, not from a refereed journal, but the source of the release is the highly reputable Purdue University, which means it can be treated as reliable. The material was isolated from layers of tissue called small-intestine submucosa (SIS), after researchers noted, back in the 1980s, that the small intestine had unusual healing properties.
At the time, they were looking for ways to make substitute blood vessels, and while the researchers are unsure how SIS works, they believe it serves as a natural framework for repairing and growing tissue. Once applied to skin wounds or sutured internally, the material prompts the body to build new tissue, inexplicably replacing the intestine-derived material with new human or animal tissue. They have concluded that the substance contains a number of proteins and "growth factors" that signal the cell to start the healing process.
They describe the SIS-assisted healing as being more like the healing seen in a young child, as compared with the slower healing of adults. The efficient healing almost eliminates the formation of post-surgical scars, which deform patients and interfere with the healthy functioning of tissues. According to Neal Fearnot, president of Cook Biotech and a Purdue adjunct professor of biomedical engineering, the body " . . . sees the implanted material and says, 'That needs to be repaired,' so it dismantles the material as it grows new cells of its own type". So far, he says, there have been no cases of rejection in either humans or animals, and another 80 applications are currently under consideration.
While it is not mentioned in the release, there remains a small concern that material from pigs may possibly carry porcine retroviruses which could transfer across to humans. Since SIS is a complex matrix of collagens, other proteins and growth factors, with no cellular material, this risk is greatly reduced, and screening can reduce it still further, but the ideal long-term goal ought to be to develop a synthetic form of SIS, or, with a clear picture of how SIS operates, to develop an artificial material which achieves the same effects.
There has been no sign that viruses \Ican\i be transferred this way, and until that happens, the minimal risk needs to be kept in mind, but it is unlikely to outweigh the major advantages to be gained from SIS. The risk is far greater if pig organs are used to provide transplant materials: a story which broke in mid-March on the cloning of five piglets (see \JFive Little Pigs\j, next month) will make the retrovirus problem a serious one.
\BKey names:\b Stephen F. Badylak, Neal Fearnot, Matthew Parmenter and Leslie Geddes.
Web site: http://www.cookgroup.com/cook_biotech/ is the site of Cook Biotech, who are the commercial producers of SIS, and they offer a fact sheet at http://www.cookgroup.com/cook_biotech/fact.html
#
"A gene that incites colon cancer",1386,0,0,0
(Feb. '00)
A gene that triggers the development and growth of pituitary tumors is also expressed in colorectal tumors, pre-cancerous colorectal polyps, and, most abundantly, in invasive colorectal cancer, according to a report in \IThe Lancet\i during February. That trio of 'hits' makes the gene's product an important target for researchers on colon cancer.
The novel transforming gene is called Pituitary Tumor Transforming Gene or PTTG1, and it " . . . may prove to be a powerful tool for identifying colon polyps most at risk for becoming malignant and for distinguishing aggressive colorectal cancer, the third leading cause of cancer death in the U.S.", according to Anthony Heaney, lead author of the paper.
It appears that PTTG1 plays an early and vital role in the sequence of events that converts normal cells to malignant ones. This means that at the time of colonoscopy or colon cancer surgery, cells could be tested to see if they express a high level of PTTG1. If they did show high levels, that would be a clear indication of neoplasia, enough to justify using more aggressive therapy. The discovery could also lead to better ways of monitoring individual patient response to therapy designed to curb the spread of colon cancer.
PTTG1 was first found in Shlomo Melmed's lab by Dr. Lin Pei and associates in 1997, when it was described as 'an oncogene that disrupts the normal steps of cell replication and activates a key player in tumor development called basic fibroblast growth factor, or bFGF'. When bFGF is turned on, it stimulates blood vessel growth to feed new tumors, helping them grow and spread.
PTTG1 occurs in low levels in most normal human tissues, but there are large amounts of the gene's product in malignant cells and in pituitary tumors. There are many different stages of cancer progression which can be identified in colorectal cancer, and Dr. Heaney theorized PTTG1 might be amplified in this disease at some stage. Starting from this hypothesis, Heaney's group examined samples from 68 colorectal tumors, including 48 carcinomas and 20 colonic polyps.
PTTG1 was highly expressed in every carcinoma and in 19 of the 20 polyps. That in itself looks like high-class confirmation of the hypothesis, but there was more: when a cancer had spread to a patient's nearby lymph nodes or the liver, the PTTG1 levels were significantly higher than if the cancer was confined to the bowel wall alone. On top of that, higher PTTG1 expression was also seen in tumors that were more vascular. 'More vascular' means having more blood vessels, and tumors with better blood supplies are the ones most likely to grow.
In a masterpiece of scientific restraint, Heaney writes that these '. . . findings indicate that the novel transforming gene, PTTG1, may be a marker of invasive colorectal carcinoma,' when a person less constrained by the rules of scientific behaviour would be up on the rooftops, beating drums, blowing whistles and shouting. However you look at it, this seems destined to be a key discovery.
The findings should also help unravel the biology of tumor development. As this occurs, normal cell growth processes are disrupted and abnormal cell proliferation begins. It appears that the gene is a very early player in that process and acts in the early stages to sensitize the cell to other molecular events, which then drive the normal cell along a path towards cancer.
One odd thing: PTTG1 is a shortened version of Pituitary Tumor Transforming Gene, yet if Melmed's team had been studying colon cancer in 1997, rather than pituitary tumors, the newly-discovered gene might have a different name. "This is how serendipity finds its way into science," Heaney said in a comment on the Internet. "We aren't oncologists, we aren't colon cancer scientists, we're endocrinologists who were looking at the pituitary when we found this gene. By chance, we were able to find it in other tumors as well."
\BKey names:\b Anthony P. Heaney, Shlomo Melmed and Lin Pei.
#
"New drugs for prion disease in mice",1387,0,0,0
(Feb. '00)
In late February, researchers from the US National Institute of Allergy and Infectious Diseases (NIAID) announced in \IScience\i that they have found a new class of compounds that can be used to slow the development of scrapie in mice. Scrapie was originally a \Jprion disease\j of sheep; it went on to become the cause of BSE (mad cow disease), and more recently, of a newer condition with symptoms very like \JCreutzfeldt-Jakob disease\j (CJD), but caused by eating "mad" cows.
Every prion disease involves the formation of clumps (aggregations) of toxic proteins in the brain, and this aggregation process is blocked by the new compounds. The researchers hope that if future studies show that the chemicals work in humans, this research may one day benefit not only patients with CJD, but also patients with non-prion diseases involving protein aggregation such as Alzheimer's disease and type 2 diabetes, both of which appear also to involve misfolded proteins, even though no prion seems to be involved.
On the evidence available, prions lie behind all the transmissible spongiform encephalopathies (TSEs), a group of rare, fatal disorders that slowly destroy the brain and nervous system. They include BSE, scrapie, chronic wasting disease (often abbreviated as CWD) in deer and elk, and CJD in humans. In these diseases, the brain accumulates abnormal forms of prion protein, where the protein is wrongly folded. Somehow, a single misfolded protein seems to be able to direct other proteins to take up the same wrong form.
The researchers from NIAID's Rocky Mountain Laboratories were looking at the effect of cyclic tetrapyrroles on TSEs. These are compounds that include drugs used in cancer therapy, but other work in the same labs had shown that cyclic tetrapyrroles block the conversion of normal prion protein to an altered, disease-causing form in the test tube. The new research shows that these same agents significantly slow TSE disease progression in mice.
The mice were injected with high doses of scrapie, then given one of three treatment compounds at different times during the 80-day incubation period. When given at the time of injection or when mixed with the injection, the compounds extended survival time dramatically, in some cases by 300%, but when given later in the disease, they had minimal effect.
The chemicals may be useful in treating blood products to inactivate infectious prions, but their other importance probably lies in showing us where to look next. Since diagnosis of CJD in humans can be made only after symptoms appear, the research group say they are trying to identify a compound that works later in the disease. "The extremely wide variety of these compounds available for testing makes our current search for one that can affect disease after the onset of symptoms much more promising," according to Suzette Priola.
\BKey names:\b Suzette Priola, A. Raines, and W. S. Caughey.
See also: \JPrions--how they work\j and \JNobel Prize in Physiology or Medicine (1997)\j for more background.
#
"Green tobacco sickness",1388,0,0,0
(Feb. '00)
A recent report in the \IAmerican Journal of Industrial Medicine\i indicates that American field workers are showing higher rates of green tobacco sickness, which is characterized by headache, nausea, vomiting and dizziness. Dr. Sara Quandt says that 41% of tobacco farm workers reported having green tobacco sickness at least once during the last northern summer.
The illness typically occurs after exposure to wet tobacco leaves in the morning while plants are still covered with dew, or after a rain. It is a form of acute nicotine poisoning caused by contact with nicotine on the plants, which is rapidly absorbed through the skin. The disease was first recorded in 1970, but it has been rare until now, mainly because small family tobacco farms involved less contact with wet tobacco. Today's hired workers are generally shorter than those who worked the family farms in the past, and shorter people make greater skin contact with the plants. As well, moving from farm to farm, the workers have greatly increased the time they spend in the fields, compared with those who worked only on their own farms.
Another factor is the spacing of the plants. Production research has shown that the yield is increased when rows are put closer together, and this increases skin contact again. Who could have predicted that such simple changes could have made a rare disease more common?
#
"HIV and overland heroin routes",1389,0,0,0
(Feb. '00)
A study published in \IAIDS\i in January reveals that outbreaks of injection drug use and HIV-1 in Burma, India, China, and Vietnam are associated with overland heroin trafficking routes which start in Burma and Laos. While drug users' needle-sharing and sexual behaviors were known to play a role in the early spread of HIV into unaffected areas, this is the first time that heroin trafficking routes have been related to the spread of HIV in South and Southeast Asia.
The main lesson to be drawn from this, says Chris Beyrer, the lead author, is that single-country narcotics and HIV programs are unlikely to succeed unless the regional narcotic-based economy is also addressed. The evidence for their conclusions came from a number of disciplines, and includes data on the molecular epidemiology of HIV-1, field-based research with drug users and their communities, and existing information on narcotics production and control. They were able to undertake a series of confidential key-informant interviews with injection drug users, drug traffickers, local and ethnic leaders, public health staff, and narcotics control personnel in India, Burma, China, and Thailand.
The Golden Triangle, made up of northern and eastern Burma, western Laos, and northern Thailand, has been a major center of opium poppy cultivation since the 19th century. Burma now produces about 60% of the world's heroin, and heroin use in Burma has been on the rise since 1988. Predictably, Burma has one of Asia's most severe epidemics of HIV infection. Laos is the world's third-leading producer of opium. Thailand is no longer a significant producer, largely through government efforts aimed at reducing poppy cultivation.
The study found that recent HIV outbreaks coincided closely with four main drug trafficking routes: eastern Burma to China's Yunnan Province; eastern Burma to northwestern China; Burma and Laos through northern Vietnam and into southern China; and western Burma to the Manipur State in northeastern India.
#
"Salmonella and arthritis",1390,0,0,0
(Feb. '00)
A study reported in \INature Medicine\i during February describes the establishment of an important link between specific bacterial infections and the development of autoimmune diseases such as arthritis. The researchers involved have shown clearly that immune system cells which fight bacteria can also attack normal cells carrying a specific mimic molecule, a molecule that closely resembles a bacterial protein.
So if there has been a previous bacterial infection, immune cells can attack "innocent bystander" cells, which have never even been attacked by the bacteria. This sort of attack by the immune system happens when the cells are stressed by exposure to irradiation, environmental toxins, or the body's stress chemicals.
As many as 10% of those who get \ISalmonella\i infections develop a 'reactive' kind of arthritis which lasts a few weeks, but a smaller, significant number of those patients get a severe, debilitating type of arthritis that is long-lasting. The bacteria have for some time been suspected of triggering the arthritis, and this now appears to be confirmed.
The researchers studied the behavior of a typical bacteria-fighting immune cell, the cytotoxic lymphocyte (CTL), as it approached infected body cells. A cell that has been infected by bacteria usually signals its plight by displaying small pieces of bacterial proteins on its surface. The CTLs, attracted by this protein 'flag', dock with the infected cells and trigger their rapid self-destruction.
The protein 'flag' in mouse cells infected with \ISalmonella\i is also found in certain bacteria associated with human arthritis, including \IBorrelia\i, better-known as the cause of Lyme disease. Unfortunately, it is almost identical to parts of a 'universal housekeeping molecule' found in humans, mice and all living organisms. This molecule helps proteins keep their shape.
When researchers artificially coaxed mouse body cells to display the \ISalmonella\i 'flag', the mouse CTLs would readily attack them, but they also went into attack mode if the cells displayed a piece of the mouse's own housekeeping molecule, or the identical human version.
In other words, the CTLs were responding to a molecular mimic, and in a normal \ISalmonella\i infection in mice, at least half of the CTLs are stirred up to recognize the mouse's own protein as well as the bacterial one, and given that the human system is so similar, it is likely that the human response will be the same.
#
"The structure of a cellular switch",1391,0,0,0
(Feb. '00)
The February issue of \ICell\i outlines the molecular structure of Cdc42, a molecular "switch" that turns on essential pathways in both normal and cancerous cells. The researchers also unveiled the structure of GDI (guanine-nucleotide dissociation inhibitor), a key regulator of the Cdc42 switch.
Cdc42 operates on the Ras oncogene (see \JCoffee and cancer\j, December 1999), which then induces a malignant state in cells. The switch is a protein complex believed to influence the malignant transformation of cells, so understanding how it works opens the way for the development of unique tumor-blocking drugs: small molecules which alter Cdc42 function.
Cdc42 is believed to play a dual role, alternating as an essential protein for normal cell growth and as a switch that allows protein from a mutated Ras oncogene to cause cancer, and many of the current strategies for cancer control target the Ras protein. This new knowledge may allow workers to go back a step in the chain, and given the essential function of Cdc42 in Ras-induced malignant transformation, it should be possible to block signals that lead to cancer by modulating Cdc42 activity.
The protein is not new: the mammalian Cdc42 protein used in the analysis was originally purified and cloned by researchers in Cerione's laboratory in 1990. Having the structure is an important breakthrough, but it will be some time yet before drugs can be 'designed' to block the switch.
The structure of the protein complex was mapped at MacCHESS, Cornell University's high-energy synchrotron source, using a technique called X-ray crystallography to reveal the three-dimensional arrangement of atoms in molecules.
Key names: Richard A. Cerione, Gregory R. Hoffman and Nicolas Nassar.
#
"Potato resistant to late blight and other diseases",1392,0,0,0
(Feb. '00)
The late blight fungus, \IPhytophthora infestans\i, might have met its match in a potato announced at the New York State Vegetable Conference in early February, and while the potato has been bred for specific New York conditions, the methods used were described in detail, offering us a useful general case study on the principles of plant breeding. This potato is multi-talented, able to fend off late blight as well as other pests such as golden nematodes, scab and potato virus Y (PVY). The new potato is said to be the best clone available that is resistant to both races of golden nematode, while its additional resistance to late blight, scab and PVY makes it a rare combination.
The potato, named New York 121, has a pedigree dating back more than 30 years. It was bred by traditional means, starting from seeds of potato varieties grown in the Andes mountains of South America and shaped by repeated selection to produce E74-7, the 'mother' of NY 121, a selection valued for its extreme resistance to potato mosaic viruses.
The other half, the 'father' of the new potato, came from the International Potato Center in Peru. It had resistance to multiple races of the golden nematode, a soil-borne pest, and one generation of breeding produced N43-288, the male parent of New York 121. This parent is mostly of Peruvian ancestry, but includes genes from a wild species from Argentina.
By dusting the female's (E74-7) pistil with the male's (N43-288) pollen nine years ago, Plaisted bred a potato with multiple resistance. From that point, it typically takes 14 years to get a new potato tested, developed, and onto the market, so this is a creditably short development time. The drawbacks: it is a mid-season potato that will be good for boiling, perhaps even baking, but it will be of little use for making French fries or chips.
The potato offers a slightly lower yield, according to the developers, but its resistance means lower costs and less damage from spraying, and it offers useful new material for other plant breeders to work with in raising yields. And while the potato strain makes novel chemicals to resist the different pests, and contains genes never before seen in potatoes in New York, it is not 'genetically modified' - which probably only serves to underline the inadequacy of the anti-GM camp's 'scientific' arguments.
Key names: Robert Plaisted, Bill Brodie and William Fry (whose name we will not comment on).
#
"Genetically engineered food and the developing world",1393,0,0,0
(Feb. '00)
The continuing scare tactics of the opponents of genetically manipulated food are giving these vocal opponents the high ground over scientists who understand the principles and risks involved, but who have elected not to take up the cudgels on behalf of science. To understand why this is so, we need to look at risk and acceptable risk, and see what the difference is, and then we need to look at how hearts and minds are won.
\BThe nature of risk\b
Driving a car, crossing a road, flying in an aeroplane or smoking a cigarette each has a risk attached to it, but every day, all over the world, people take these risks. They do so because they have assessed the risk, and either decided that it is too low to worry about, or that the payback makes the risk worthwhile. Their assessment may be wrong - and in the case of drug use, alcohol abuse and smoking, their assessment almost certainly \Iis\i wrong - but that is their choice.
When scientists talk about risk, they are aware of the residual risk, the small chance something might go wrong, and so they refuse to say that there is absolutely no risk. This responsible scientific caution is grabbed with glee by the Luddite opponents of science. They shout that the scientists have refused to say there is no risk, therefore the process is risky, ignoring the scientists' assessment of the risk as vanishingly small.
There is another aspect of risk. Responsible scientists feel the need to identify the potential for disaster and quantify it, where possible, making sure that the risk is identified and minimised. This sort of exercise goes on all the time, but in the wrong hands, can lead to all sorts of excitement in the media - see \JThe collider that ate the Earth!\j, July 1999 for an example. So quite often, when scientists talk about risk, they mean a faint possibility which they hope, by describing it, to prevent.
\BWinning hearts and minds\b
The problem, simply stated, is that the scientists considered the issue of GM many years ago, recognised the nature of the risks, and realised that the risks were not significant or could be controlled, and saw that the potential benefits were huge. This is immediately obvious to anybody who studies the matter, but scientists are naïve about the ordinary public, and about the media who survive by manipulating public opinion and running scare campaigns which excite the public.
In essence, the scientists expected the public to see what they saw, the obvious benefits of GM foods, and to have the insight to realise that, while there were risks, these risks were vanishingly small. So while the science community sat back and contemplated the good that would be done by genetic manipulation, they had no inkling of the harm that was being done by their opponents. They allowed the opposition to gain momentum, they allowed their rational and careful analysis of possibilities to be misrepresented as dire warnings, and they missed the boat.
Now, around the world, scientists are beginning to fight back, to say what should have been said at the outset, that while there \Imight\i be problems, and while they are working hard to identify these potential problems, the whole point of their work is to anticipate the risks and prevent any of them being translated into reality. They are also beginning to say now what should have been said at the outset: that even where there is a risk, the potential damage, even in the worst-case scenario, is far less than the certain benefits that will accrue.
In short, the scientists are saying that if we do not cross the road, for fear of the risk of being run over, we will never progress. And more importantly, they are now reminding people of what is on the other side of the road where, to mix a metaphor, the grass really \Iis\i greener.
In a talk at the AAAS meeting in Washington, D.C. in mid-February, Susan McCouch, associate professor of plant breeding at Cornell University, pointed out that in the developed world, societies enjoy abundant diets more varied now than at any other time in history. This, she said, was in stark contrast to the developing world where millions of people confront profound food insecurity every day. Part of the solution to righting this imbalance might involve something that is increasingly controversial in the developed world: genetically engineered food.
She posed the important question: is it ethical for well-fed people in the United States, Canada and Europe to ignore the potential of biotechnology to improve the nutritional status of hungry people around the world, particularly when the same technology is being used to extend life expectancy by producing pharmaceuticals?
According to McCouch, appropriately used agricultural biotechnology can help alleviate world hunger and malnutrition. She cited the development of genetically engineered rice to produce iron or provitamin A (beta carotene) (described in \JNew rice strains and vitamin A and iron deficiency\j, August 1999) as an example of a product that has the potential directly to improve the quality of life for millions.
The problem of \Jhuman starvation\j will not go away because people wave banners or shout slogans. It will go away when science has found ways around the problems that face humans today. According to McCouch, the " . . . impact of delivering those essential micronutrients through food products such as enriched yellow rice is parallel to fortifying milk with vitamin D, salt with iodine or orange juice with calcium." Transgenic rice, she noted, offers immediate assistance as a staple food to people in need.
One of the major lines of attack used by the opponents of GM has been to attack GM as no more than a ploy by wicked multinational companies to control markets. This is a total nonsense in the case of the GM rice, as our August 1999 story reveals, but McCouch argues that people will support the technology when they see societal benefits, not just corporate benefits. Some opinion surveys show that people initially are uncomfortable with the idea of using biotechnology to transfer genes between organisms, she said, but those reservations can be overcome if people perceive a particular ethically or morally persuasive benefit.
The term biotechnology was originally coined during the industrial revolution, and over the past 25 years, has gained a new meaning, as 'modifying living things to suit human needs and preferences'. As McCouch pointed out, "humans have practiced biotechnology for about 10,000 years, or as long as they have practiced agriculture."
"Most of our domesticated food and fiber species have been altered through traditional crossing and selection to such an extent that they are no longer capable of surviving in the wild," she added. yet even today, not all biotechnology involves genetic engineering. Instead, it is more about understanding what is going on in the cell and the hereditary process. As a result, said McCouch, we are on the edge of an ethical dilemma, because we can do far more than ever before. We need to discuss these issues, said McCouch, but the discussion needs to be informed.
Early in March, Sir Robert May, the Australian-born scientific adviser to the British prime minister, Tony Blair, was announced as the next president of the \JRoyal Society\j, a position previously held at different times by Sir Isaac Newton and Sir Joseph Banks. This story will be covered in more detail next month (\JNew RS President\j, March 2000), but it is worth noting that May, a physicist by original training, hit the headlines in Britain for his succinct analysis of the British press coverage of the GM issue, which he called 'crap'.
#
"Why Johnny can't do science",1394,0,0,0
(Feb. '00)
According to Donald Hayes, American students in high school pay the price for years of dumbed-down schoolbooks. Lulled into lexical laziness, he says, they find that science books are often too hard for them to read. In a report read at the AAAS annual meeting in February, Hayes and Loreen Wolfer outlined a plan to close the gap between science books and simplified texts for non-science subjects.
They argue that too many American students shun high school science for "easier" subjects, and they pass into adulthood as poorly educated science illiterates with a vulnerability to pseudoscience. Hayes comments that they " . . . are not prepared for science texts with all the domain-specific words, the equations and the longer sentences. There is a gulf between the two bodies of work in the schools, and the gulf isn't getting smaller."
There are a number of levels of scientific discourse. At the top, there is the full-strength scientific paper, published in a specialist journal, which uses the language of science, and of that branch of science, to say a great deal in a few words. This style normally assumes a high level of background knowledge, or expects the reader to gain that background by reading the references listed.
At a slightly more general level, there is a paper in \INature\i or \IScience\i, which will be read by the scientifically literate, who may be specialists in another area, but they are usually professional scientists. Journals like this will often provide a commentary or an editorial, explaining in simpler terms what the importance of a discovery is, and what its implications may be. These journals can assume general expertise in science in their readers, that the readers have some idea of what a field of research is about. Once again, the references point the reader to more detailed explanations if they wish to delve.
Serious popular journals of science like \IScientific American\i or \INew Scientist\i (and we place ourselves in this category where our science content is concerned) set out to take some new or interesting finding, and place it in a context, to show how it relates to other scientific knowledge. The writers assume that the readers have some familiarity with how science is done, and that they have absorbed the "big ideas", the principles that lie beneath all science, ideas like the concept of energy or biodiversity, or what species or atoms and molecules are. The writers will often provide suggestions for further reading, but they know they are writing for an audience which is on the same wavelength, more or less, coming to the text with a knowledge of the principles involved.
The same cannot be said for the writers of text books in science, who must explain all of the principles as they go along, at the same time introducing new terms, new concepts, and new frames of reference. They need to assume that their readers are completely naive and lacking in experience, and this alone makes it likely that a textbook will be more complex than a popular science journal.
One result of this is that school textbook writers tend to rely on other textbooks for their information and ideas for presentation, so wrong and outdated information is often enshrined and carried on into new titles. The text is kept simple, but the victim is accurate and reliable science.
If Hayes is to be believed, science textbooks are much harder than the texts in other subjects, but he argues that the answer is not to simplify the science textbooks. Rather, it is to increase the complexity, the richness and the difficulty of the other reading matter students are exposed to. Hayes described LEX, a computerized system which evaluates texts for accessibility or lexical difficulty.
"After World War II, we simplified books for history, English and other non-science subjects by shortening the sentences and avoiding rare, unfamiliar words that might challenge readers to learn new concepts," Hayes recalled. "The rarer the word, the more specific the concept."
But if these unchallenging books were good enough up to eighth grade, at the end of which US students move into high school, Hayes believes that hard times lie ahead for these students. And if they have to struggle through their textbooks, they will drop out without learning the science they need to withstand the snake-oil merchants.
Hayes has used his LEX method in the past to point to excessively dense prose. With a normal newspaper set at 0, \INature\i between 1946 and 1994 got a rating of +25.7, while \IScience\i was a bit lighter, at +20.1 on the LEX scale. \INew Scientist\i rates +7.2, and \ITime\i just +1.6.
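For the curious, here is a minimal Python sketch of the two ingredients Hayes describes - sentence length and word rarity. It is an illustration only, not Hayes's actual LEX formula, and the tiny list of 'common' words is invented for the example.

import re

# a real system would use a large frequency-ranked word list, not this stub
common = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "that", "was", "are", "on"}

def difficulty_profile(text):
    # report mean sentence length and the fraction of less-common words
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    mean_sentence_length = len(words) / len(sentences)
    rare_fraction = sum(1 for w in words if w not in common) / len(words)
    return mean_sentence_length, rare_fraction

# longer sentences and rarer words both push a text up a LEX-like scale
print(difficulty_profile("The cat sat on the mat. It was fun."))
print(difficulty_profile("Mitochondrial respiration requires transmembrane proton gradients."))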
Hayes argues that while the readers used in the most basic school grades have improved, US schools have retained the simple texts used in higher grades, so that students in the USA are working on texts much less challenging than their counterparts in other countries.
Hayes proposes an experiment in which some school districts that are planning to replace outdated, non-science textbooks adopt new, more difficult books with rarer words in more complex sentence structures, while others continue to use the old, oversimplified books. If his hypothesis is correct, later testing should show the students using the challenging texts performing better.
At the low end, a farmer talking to dairy cows has a LEX rating of -56.0, mothers' talk to children aged 5 scores -45.8, \IWinnie the Pooh\i scores -43.3, and second-grade schoolbooks in the period 1963-1991 are only -42.8. Eighth-grade schoolbooks from 1963-1995 rate -22.0 and recent high school English books -22.3, while high school science texts rate at -0.5.
#
"Mapping the world from the shuttle",1395,0,0,0
(Feb. '00)
During February, the space shuttle Endeavour carried out the Shuttle Radar Topography Mission (SRTM) which gathered high-resolution radar data which will help map much of the surface of the Earth to a higher resolution than ever before. To see why this is important, we will take one case study, to see how the maps can benefit humanity. The study looks at one of the more unusual beneficiaries, a group of geologists who have been trying to unravel the history of the Andes.
Especially on the eastern side of the Andes, where the mountains have continuous cloud cover, there are simply no topographic maps at all. No satellite ever 'sees' the mountains through the cloud, aerial photography is equally impossible, and the areas are just too remote for ground surveys.
Now, with the radar data, maps and pictures like the one shown here will be simple to gather. And a key target for the geologists is the Altiplano, a huge plateau surrounded by higher peaks and spreading through northern Argentina, western Bolivia and southern Peru. The Altiplano is about 300 km (190 miles) wide and 1000 km (620 miles) long, all at an altitude of around 3800 meters, and includes the world's highest navigable lake, Lake Titicaca. With good topographical maps of the sides of the area, the geologists hope to be able to puzzle out how the Altiplano was lifted up to its present position.
Part of the answer is that the eastern edge of the Nazca plate began shoving its way under the western edge of the South American plate, which makes up the continent itself and part of the South Atlantic Ocean floor, some 200 million years ago. The Nazca plate is a huge slab of the earth's crust which makes up most of the floor of the southeastern Pacific Ocean. As the Nazca plate drives down at 8 to 10 cm (3 to 4 inches) a year, the South American plate has been uplifted.
Until 20 million years ago, only the coastal regions of Chile and Peru were affected, but then the crust folded and began thrusting up mountains so sharply that there is a drop of about 12,000 meters (40,000 feet) from the western edge of the Andes to the Peru-Chile trench in the Pacific Ocean over a horizontal distance of only about 200 kilometers (125 miles). As a result, a chain of volcanoes erupted along the mountain range, and the area has remained active up to the present; it is also subject to devastating earthquakes. This is where the benefit to humanity comes in, since a clearer understanding of the area's geology might lead to a better understanding of earthquake risks.
But just as important is an understanding of the upper headwaters of the Amazon. This would help scientists understand the hydrology of the Amazon, and may shed light on global weather patterns. Scientists do not need to be told this: they know there is no such thing as useless knowledge, but creative scientists know that even the most interesting knowledge generally has other uses, and this looks like being a very useful piece of knowledge in the future.
The SRTM used two radar antennas, one on board the shuttle and the other at the end of a special 60-meter (200-foot) mast. When the signals from the two antennas are recombined, in a process called radar interferometry, scientists will be able to construct three-dimensional images of the land below. The principle is similar to the way we perceive depth by combining the images from our two eyes. The radar also returned information about the characteristics of the surface.
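As a rough illustration of the geometry (a sketch with invented numbers - the 233 km altitude is only a nominal shuttle orbit, and the target point is made up; real interferometric processing, which works with phase rather than explicit ranges, is far more involved), the two antennas can be treated as two known points in the across-track plane. Echo timing gives the range from one antenna, the phase difference gives the range to the other, and the two ranges intersect at a unique ground point, fixing its elevation:

import numpy as np

ant1 = np.array([0.0, 233000.0])        # main antenna (shuttle), metres
ant2 = ant1 + np.array([60.0, 0.0])     # outboard antenna at the mast tip
target = np.array([100000.0, 1500.0])   # ground point at 1500 m elevation

r1 = np.linalg.norm(target - ant1)      # range, from echo timing
r2 = np.linalg.norm(target - ant2)      # range, inferred from the phase difference

# recover the target as an intersection of two circles of radius r1 and r2
d = np.linalg.norm(ant2 - ant1)
a = (r1**2 - r2**2 + d**2) / (2 * d)
h = np.sqrt(r1**2 - a**2)
ex = (ant2 - ant1) / d                  # unit vector along the baseline
ey = np.array([-ex[1], ex[0]])          # unit vector perpendicular to it
p = ant1 + a * ex
print(p - h * ey)                       # recovers (100000, 1500); the other
                                        # intersection lies above the antennas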
As the shuttle orbited the earth, its orbit was tilted 57 degrees to the equator, which gave it access to all of the land from 60 degrees north to 56 degrees south, or about 80% of the planet. Once analysed, the radar data will deliver a resolution of 30 meters, meaning that objects and features as small as 30 meters across will show up. For most of the world, remote areas are generally mapped only to a resolution of 1 kilometer. NASA plans to develop topographic maps at the new resolution for all of the surveyed areas, but NASA will also supply the raw data and provide research grants to about 40 groups of scientists worldwide for more detailed analysis.
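The coverage figure is easy to sanity-check: on a sphere, the surface area between two latitudes is proportional to the difference of the sines of those latitudes. A quick, purely geometric check in Python:

import math

# fraction of a sphere's surface lying between 60 degrees N and 56 degrees S
frac = (math.sin(math.radians(60)) + math.sin(math.radians(56))) / 2
print(round(frac, 2))   # about 0.85 of the total surface, in line with the
                        # quoted figure of about 80% of the planet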
The group studying the Andes now expect to be able to identify active faults, and to analyze the river systems to estimate how fast the mountains are uplifting and eroding.
andes1.jpg
\BCaption\b: This exquisite Landsat image of the Andes region known as Altos de Pica, in northern Chile, was produced at a resolution of 1 km, but is reduced here. It shows part of the western range of the Andes, known as the Cordillera Occidental, flanked on the west by a valley known as the Pampa de Tamargul. To the east is the beginning of the Altiplano plateau. A large smooth area in the center, yellow in this false-color version of the image, is a lava flow from a spectacular volcanic eruption 17 million years ago. Picture copyright © Cornell University, used here with permission.
andes2.jpg
\BCaption\b: Detail of the lava flow in the centre of the first picture, at the original scale. Picture copyright © Cornell University, used here with permission.
andes3.jpg
\BCaption\b: Detail of the valley known as the Pampa de Tamargul, at the original scale. Picture copyright © Cornell University, used here with permission.
#
"The nucleus of NGC 5548",1396,0,0,0
(Feb. '00)
The nucleus of the galaxy NGC 5548 is unusually bright and associated with gas flowing around and into a giant black hole which forms the centre of the nucleus. Now, following a 24-hour observation using NASA's Chandra X-ray Observatory, a team of astronomers has taken measurements which demonstrate the existence of a blanket of warm gas that is expanding rapidly away from the black hole. A report of their findings will be published in the March issue of the European journal \IAstronomy & Astrophysics\i.
The team used the Low Energy Transmission Grating in conjunction with the High Resolution Camera to measure the number of X-rays present at each energy, and from this, they constructed an X-ray spectrum for the source.
In the central region, where gases are falling into the black hole, the inflow produces an enormous outpouring of energy that blows some of the matter away from the black hole. The Low Energy Transmission Grating allowed the team to get a view of the gas that forms a warm blanket that partially covers the innermost region where the highest energy X-rays are produced.
The high-energy X-rays streaming away from the region of the black hole heat the blanketing gas to temperatures of a few million degrees, and the blanket absorbs some of the X-rays from the central source. The result is a spectrum with both absorption lines and emission lines, and because each line relates to a particular element, this gave the team a roll call of the elements present in the gas. They were also able to determine the number of electrons each atom has retained in the hostile environment of the black hole, and how the gas is moving there.
Overall, they found lines from eight different elements including carbon, nitrogen, oxygen, and iron, and the amounts were about 100 times more than previous readings at optical and ultraviolet wavelengths.
Digital versions of the X-ray spectrum and other information are available on the Internet at: http://chandra.harvard.edu
Key names: Jelle Kaastra, Rolf Mewe, Albert Brinkman, Duane Liedahl and Stefanie Komossa.
#
"The end of the solar system",1397,0,0,0
(Feb. '00)
This is how the world, and its solar system, ends: in the short term, we either freeze or fry, while in the long term, we decay. That was the gist of a paper by Fred Adams, presented at the annual meeting of the American Association for the Advancement of Science (AAAS) in mid-February.
Adams, who is an associate professor of physics at the University of Michigan, has taken a different line from other scientists, who are mostly interested in how the universe and the solar system began. We can relax, though, as the problems are not due to start just yet, but we should not relax too much, because our descendants won't be here to see the end.
Over the next 7 billion years, said Adams, our sun will age, gradually exhausting its fuel supply and collapsing into a white dwarf, but before it does this, it will mushroom in size, shining so brightly that it will fry the surfaces of all the inner planets, including the earth. So we don't have 7 billion years, just 3.5 billion, give or take a few million years, before the biosphere is sterilised, swept clean of all life forms as we know them.
There was some good news in Adams' paper. First, life is only about half-way through its time on earth, and almost anything could happen in the intervening period. Aside from any strange evolutionary pathways that we could not even begin to guess about, we may be rescued by a close encounter between our solar system and a passing star.
Adams and Gregory Laughlin, a scientist at NASA's Ames Research Laboratory, used a computer to model possible interactions between nearby binary stars and the orbits of the Earth, sun and outer planets, especially Jupiter. It seems that Jupiter is vulnerable to gravitational interactions with a passing star, that its orbit could be disrupted quite easily by a star passing by, close to our sun. This in turn could affect the earth, with even a modest change in Jupiter's orbit having the potential to bring about catastrophic changes to the earth.
They estimate there is about one chance in 100,000 that the earth could be flung into the sun in the next 3.5 billion years, but the earth might also be thrown out into deep space, where it would take about a million years for the oceans to freeze solid, while life of sorts would continue near hydrothermal vents on the ocean floor, which are warmed by radioactive heat from deep within the Earth. In other words, life on earth might become similar to the life that scientists hope one day to find on Europa, the moon of Jupiter which has thick ice sheets that seem to cover liquid oceans, deep below the surface.
Long after our solar system has faded away, said Adams, our galaxy will move into the Degenerate Era. By then, the only stellar objects remaining will be white dwarfs, brown dwarfs, neutron stars and black holes. Over time, the available dark matter will be consumed; the mass of white dwarfs and neutron stars will then begin to dissipate through a process called proton decay, and eventually even the enormous mass of the black holes that will have formed must evaporate into thermal radiation, photons and other decay products.
It gets even gloomier, with the final result, according to Adams, being that once the black holes have radiated away, there will just be a diffuse sea of electrons, positrons, neutrinos and radiation suspended in nearly complete and total blackness.
See also \JEnd of the oceans\j, this month.
#
"Stardust begins work",1398,0,0,0
(Feb. '00)
NASA's STARDUST probe (see \JProject Stardust\j, February 1999) began collecting samples of a cloud of gas and dust from between the stars. Contrary to most science fiction movies which portray space as cold, black and empty, our solar system is sweeping through a void that is anything but vacuous.
In terms of what we can achieve on earth with vacuum pumps, interstellar space is still a hard vacuum, almost totally empty. But when you take into account the huge amount of space that carries clouds of gas and dust, there is quite a lot of material out there. More importantly, these dusty clouds are the places where new stars are born and where complicated chemical reactions form organic molecules, including amino acids.
We have known that the cosmic dust was there for many years, with detection coming from NASA spacecraft such as Pioneer, Voyager and Galileo. Detection is one thing, but capture and analysis is another thing altogether, and the STARDUST mission is out there with the task of catching dust grains which enter our solar system, moving at 30 km/s (around 20 miles per second), about 30 times faster than a speeding bullet and 90 times faster than the speed of sound. The dust grains may be small - about 1/1000 of a millimeter across - but they are still important.
To astronomers, the dust matters because it obscures their view of what lies beyond it, but the dust is also important to life. As it absorbs light energy, the dust in the cold dense clouds between the stars is a centre where complex organic molecules can form, perhaps providing the building blocks of life, in a sheltered place where the ultraviolet light from nearby massive stars cannot reach.
The craft is to collect dust which will be returned to earth in 2006, and the grains are expected to be mostly carbon, silicon, and oxygen. Even so, they are the major repository of heavy elements in our Galaxy outside of the stars, or so scientists think. So while STARDUST's main role is to collect dust spewing away from the comet Wild-2, which the craft will meet in 2004, it has now deployed its cosmic dust collector and pointed the device toward the stream of incoming interstellar gas.
Stardust uses aerogel, a material which is 95% air, and the lightest artificial material ever made. When a high-velocity particle hits aerogel, it buries itself in the material, creating a carrot-shaped track up to 200 times its own length. The American scientists say jokingly that this is like collecting invertebrates by 'scraping bugs off a windshield', with the exception that aerogel is able to collect the 'bugs' without destroying them. And unlike the windshield on a car, the collector is reversible, and right now, dust is being collected on the back, while dust from Wild-2 will be collected later on the 'front'.
The plan is to collect interstellar dust until May 25, after which the collector will be returned to its stowed position until mid-2002, when another period of interstellar dust collection is scheduled.
Nobody is sure what typical interstellar dust grains look like, but we can get some clues from the ways interstellar dust absorbs, emits, and reflects light. It is clear that the dust is quite unlike household dust. Recent work suggests that most of the dust grains are not spherical, and that the dust most likely follows a fractal adhesion model involving 'random conglomerates of spherical compounds of different properties'. But to really know what interstellar dust looks like, we will need to wait until 2006, and hope the mission survives the trip through the tail of comet Wild-2.
#
"The most distant quasar in the universe",1399,0,0,0
(Feb. '00)
A newly discovered \Jquasar\j, RD J030117+002025, in the constellation Cetus, has displaced the previous record holder, making it one of the earliest known structures ever to form in the universe. It was spotted using the 5-meter Hale telescope at Palomar Observatory in California and the 4-meter Mayall telescope at Kitt Peak in Arizona, and a spectral analysis of the quasar's light was completed at the Keck Observatory in Hawaii.
The new quasar (see \JThe farthest quasar\j, December 1998) has a red shift of 5.50, meaning that light travelling to us from this quasar has journeyed about 13 billion years to get here. Because the quasar is so far away, looking at it amounts to looking back at the earliest times in the universe.
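To see what a red shift of 5.50 means in practice, note that every wavelength is stretched by a factor of 1 + z. A minimal sketch (the choice of hydrogen's Lyman-alpha line is ours, for illustration):

rest_wavelength = 121.6                 # nm; hydrogen Lyman-alpha, far ultraviolet
z = 5.50
observed = rest_wavelength * (1 + z)    # 1 + z = observed / emitted wavelength
print(f"{observed:.0f} nm")             # about 790 nm, at the edge of the infrared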
Until recently, no one had discovered an object with a red shift beyond 5.0 (see \JAstronomical record set - from earth\j, March 1998), and even now only two other quasars make the cut-off, with red shifts of 5.0 and 5.03, so the odds of finding such a quasar were low, given that the portion of the sky under study was just 10 arcminutes by 10 arcminutes. The findings will be presented in an upcoming issue of the \IAstrophysical Journal Letters\i.
Key names: Richard Elston, Daniel Stern, Peter Eisenhardt, Hyron Spinrad, Steve Dawson, and Adam Stanford.
#
"Common algae can make hydrogen",1400,0,0,0
(Feb. '00)
One of the most interesting reports to come out of the AAAS meeting in February was about a metabolic switch that triggers algae to turn sunlight into large quantities of hydrogen gas. This discovery, with its potential to make valuable fuel cheaply, was described by one of the discoverers as "the equivalent of striking oil." First reported in the January issue of the journal \IPlant Physiology\i, the discovery has been a bit of a sleeper until a press conference at the AAAS.
Usually, hydrogen is extracted from natural gas, a non-renewable source, or sometimes by electrolysis at great expense, but this discovery makes it possible to harness nature's own tool, photosynthesis, to produce the promising alternative fuel from just sunlight and water. Small-scale cultures of the microscopic green alga \IChlamydomonas reinhardtii\i have already shown their hydrogen production capabilities in the laboratory, but current production rates are not high enough to make the process immediately viable commercially.
That, say the discoverers, is something that could be fixed by achieving a ten-fold gain in production, well within the possible scope of future research. They will be looking at other species, and seeking to select more efficient strains of the best species.
At that point, a hydrogen economy would be feasible, with a single, small commercial pond producing enough hydrogen gas to meet the weekly fuel needs of a dozen or so vehicles. For nearly 60 years, scientists have known that certain types of algae can produce hydrogen gas in this way in trace amounts, but not enough to be of interest. The breakthrough has come with finding a molecular switch that turns off the cell's usual photosynthetic apparatus, so that the cell uses stored energy, with hydrogen as a by-product.
All it takes, it seems, is the absence of an essential element, sulfur, from the microalga growth medium. The absence of sulfur stops photosynthesis and halts the cell's internal production of oxygen; with no oxygen, the cells are unable to burn stored fuel in the usual way, through aerobic respiration. They then switch to an alternative pathway, which may be universal in many types of algae, and this generates hydrogen gas.
Melis believes that this may be an ancient strategy that the organism developed to live in sulfur-poor anaerobic conditions. While the alga culture cannot live forever when it is switched over to hydrogen production, it can manage for a considerable period of time without negative effects if it is allowed first to grow in the normal way, and accumulate a generous supply of carbohydrates and other fuels. After this, the algae are transferred to stoppered glass bottles, with no sulfur present, and they are allowed to consume all the oxygen.
After about 24 hours, photosynthesis and normal metabolic respiration stop, and hydrogen begins to bubble to the top of the bottles, with yields of about 3 mL of gas for each liter of culture each day for four days, at the end of which time, the culture must be allowed to return to normal photosynthesis to recover, before being tapped again.
Key names: Tasios Melis, Liping Zhang, Michael Seibert, Dr. Maria Ghirardi and Marc Forestier.
#
"Moldable, 'tunable' magnets"
(Feb. '00)
A group of Canadian researchers reported in \IScience\i, late in February, on a novel magnetic material which may lead to entirely new uses in the future. It is early days yet, but they believe that uses might include high-density data storage, anti-static coatings for aircraft or spacecraft, " . . . and a host of other applications".
They believe the final product will offer a tough, lightweight, moldable material, with "tunable" magnetic properties, but what they have done so far is to tweak iron-and-polymer molecules, transforming them into a magnetic ceramic material, which can be molded into various shapes.
The key to their success is a technique for opening the rings in the starting molecules. They treated monomers of silaferrocenophane (SFP) with gentle heat to produce poly(ferrocenylsilane), or PFS. When this is poured into molds and heated gently, the molecules form a cross-linked network, loaded with trapped iron.
When this material is treated at around 500 degrees Celsius (930 degrees F) in a pyrolysis chamber, the iron atoms are set free to seek out other iron atoms, forming nanoclusters. Larger clusters are more strongly magnetic, or ferromagnetic, so the researchers could control, or tune, the material's magnetism by adjusting the temperature inside the pyrolysis chamber, with higher temperatures producing larger clusters.
In the researchers' own words, they have " . . . created a new class of magnetically tunable, shaped ceramics, which you could potentially form as powders, wires, films, or tapes, for example". And while they are the first to stress that there is more work to be done, they say that they have now characterized these materials to the point where one can sit down and decide what their utility might be.
Key names: Mark MacLachlan, Geoffrey Ozin, Ian Manners, Madlen Ginzburg, Neil Coombs, Thomas W. Coyle, Nandyala P. Raju, and John E. Greedan.
#
"The end of the oceans",1401,0,0,0
(Feb. '00)
An American meteorologist believes that the Earth's oceans will disappear in about one billion years due to increased temperatures from a maturing sun, but, as he told the AAAS meeting in February, the planet's problems may begin well before that, because of falling levels of carbon dioxide in the atmosphere. A Penn State professor of meteorology and geosciences, James F. Kasting, says that as the sun progresses along the main sequence, it is getting brighter with time and that affects the Earth's climate.
Eventually temperatures will become high enough for the oceans to evaporate: at 140°F (60°C), water becomes a major constituent of the atmosphere, and much of it can be expected to migrate to the stratosphere, where it will be lost into space. Past predictions had assumed that the oceans would only evaporate when the sun left the main sequence, in about 5 billion years.
A main sequence star leaves the main sequence when it stops burning hydrogen, and in the case of the sun, this is the point when our yellow, G-2 star becomes a red giant, reaching out to the orbit of Mercury. That planet will then disappear, while Venus will lose its atmosphere and become a burnt-out planet. The Earth will suffer the same fate, even though it is outside the red giant's immediate reach.
But on the calculations Kasting has done with Ken Caldeira, now at Lawrence Livermore Laboratory, the time scale could be as short as a billion years, and before our planet is a waterless desert, other things will go wrong. As the climate becomes warmer, the cycle of silicate rock weathering will speed up. This cycle releases calcium ions which combine with carbon dioxide from the atmosphere and bury it in the oceans as calcium carbonate.
Over time, this may drag carbon dioxide levels down so far that plants are unable to survive. There are two main pathways for photosynthesis: the C3 and C4 pathways, with 95% of all plants (including most trees and crops) using the C3 pathway. Within half a billion years, the levels of carbon dioxide will have fallen to the compensation point for C3 plants. Below the compensation point, carbon dioxide is not concentrated enough for these plants to photosynthesize.
The C4 plants, which include corn, sugar cane and other tropical grasses, can still photosynthesize because they have an internal mechanism which concentrates carbon dioxide, but these plants alone cannot sustain the biosphere as we know it today.
Obviously such matters are not a pressing concern right now, so why the interest? Mainly, such studies help us to understand how long a planet remains in an orbit that allows life to exist on it. Around any star, there is only a thin spherical shell which is 'just right' for life, not too hot, and not too cold. As the star matures, this shell moves slowly outwards, until it leaves a given planet behind. So if planets lose their water supply earlier than expected, this creates a shorter window for livable planets.
"If we calculated correctly, Earth has been habitable for 4.5 billion years and only has a half billion years left," says Kasting.
See also \JThe end of the solar system\j, this month.
#
"Bird parents, predators, and family planning",1402,0,0,0
(Feb. '00)
A report in a late February issue of \IScience\i gives us an interesting new insight into the forces which act to control clutch size in birds, the number of eggs laid in a given season. In tropical and southern regions, female birds on average have fewer babies per nesting attempt than birds in temperate, northern climates. This has provoked a long-running debate over whether clutch sizes are limited mostly by food supply, by the effects of predators, or by both.
In 1949, Alexander Skutch tied these two theories together, suggesting that predators limit the meals available to bird babies because they force the parents to limit the rate at which they visit the nest, ultimately resulting in the evolution of smaller families. Birds who fly off the nest to find food run the risk of either attracting predators, or leaving their young open to attack. Until now, this hypothesis has not been tested.
The research team monitored 1331 nests in subtropical Argentina, and compared these with 7284 nests in Arizona. In Argentina, bird mothers produced about 2.58 eggs per nesting, while those in Arizona averaged 4.61. As well, clutch size was smaller for species with higher predation rates, and for those species the researchers also saw fewer but bulkier meal deliveries. In other words, the Skutch theory stood up, at least at first glance.
The overall results from Argentina were less supportive: predators were far less troublesome, and female birds brought food to their young faster, but they had smaller clutch sizes in comparison with the Arizona bird mothers. On the Skutch hypothesis, say the researchers, they should have seen higher predation rates and smaller, less frequent feedings among the South American bird families.
In Argentina, the clutches had roughly half the number of nestlings seen in Arizona, and yet, the parents were bringing food to them at higher rates. Clearly, the smaller clutch sizes in southern areas cannot be explained solely by food delivery or predation rates. The researchers suggest assessing parent mortality rates in northern and southern climates, to see if this is also a factor. Birds who must endure cold winters or long, arduous migrations may need larger clutches to ensure the survival of their species, they suggest.
Key names: Thomas E. Martin, Paul R. Martin, Chris R. Olson, Britt J. Heidinger, and Joseph J. Fontaine.
#
"The origins of the first Americans",1403,0,0,0
(Feb. '00)
Where did the first 'American Indians', the native Americans, the first humans in America, come from, and when? The clues from archaeology are unlikely to tell us much, for these were modern humans, inveterate innovators, who quickly adapted to new materials and new needs; whatever signs of their culture they left behind for the archaeologists, we are unlikely to glean a great deal from them. So that leaves us with the biological clues left behind in human remains and in their descendants.
University of Michigan anthropologist, C. Loring Brace, outlined to the AAAS in Washington D.C., some of the information to be found in craniofacial measurements of old and new skulls from around the world. By running a tape over thousands of ancient and modern skulls, collected over a period of 20 years and combining this with new data from Mongolia that became accessible recently, Brace has shown how the first inhabitants of the Western Hemisphere fit into several different groups based on craniofacial patterns.
Brace and his colleagues made two dozen measurements on each skull, and used these to generate a dendrogram, a tree-shaped structure which shows how a set of specimens relate to each other. The specimens may be thought of as being twigs at the ends of the branches of the tree, and the closer together two twigs are, the more closely related those specimens must be.
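For readers who want the flavor of the method, the sketch below builds a dendrogram from invented measurement data using standard clustering tools; the six specimens, their two dozen made-up numbers, and the choice of average-linkage clustering are all assumptions for illustration, not Brace's actual procedure.

import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
specimens = rng.normal(size=(6, 24))    # six skulls, two dozen measurements each
labels = ["skull A", "skull B", "skull C", "skull D", "skull E", "skull F"]

# cluster on the distances between measurement profiles
tree = linkage(specimens, method="average", metric="euclidean")
result = dendrogram(tree, labels=labels, no_plot=True)
print(result["ivl"])    # leaf order: neighbouring twigs are the most similar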
From the dendrogram, the descendants of the first humans to enter the New World, including natives of Mexico, Peru, and the southern United States, have no obvious ties to any Asian groups. This, says Brace, may be because they have been separated from Asia for such a long time, but Brace hopes that " . . . new samples from Novosibirsk, Moscow, and Saint Petersburg, which we've recently been given permission to measure, will illuminate their origins."
A second group, including the Blackfoot, Iroquois, and other tribes from Minnesota, Michigan, Ontario, and Massachusetts, appears to be related to the Jomon, the prehistoric people of Japan, and the Inuit appear to be a later branch from that same Jomon trunk. Tribal groups who lived down the eastern seaboard of the USA into Florida share this origin, according to Brace.
Yet another group, originating in China and including the Athabascan-speaking people from the Yukon drainage of Alaska and northwest Canada, spread as far south as Arizona and northern Mexico. Brace describes them as closer " . . . to the living Chinese than to any other population in either hemisphere."
While it is already possible to say that the original Americans are not just minor variants of the same people, there is more to come. Brace believes that this will happen when the University of Michigan Museum of Anthropology craniofacial database is augmented with new samples located in institutions in the former Soviet Union, from sites in Mongolia, Siberia, and Eurasia, since the database is lacking detail on these areas.
#
"Mustard plants mop up lead",1404,0,0,0
(Feb. '00)
A vacant city lot in Hartford, Connecticut, that had been polluted with toxic levels of lead has just been cleared of the metal, allowing a local soup kitchen to use the land to plant a garden that will help feed homeless people. The clearing was undertaken by six undergraduate students and two staff from Hartford's Trinity College, using a special breed of Indian mustard they planted at the lot over the northern summer of 1999.
The allowable lead level for soil that is to be used for residential or agricultural purposes is 500 parts per million (ppm), and the mustard treatment has taken the lead from greater than 1000 ppm to below the safety threshold. The process, once known under the more general term bioremediation, is now separately recognised as phytoremediation. It uses plants to remove pollutants from the environment or to render them harmless. The seeds and expertise were provided by EdenSpace, a biotechnology company which is exploring the possible commercialization of the process.
Lead contamination represents a particularly difficult problem to deal with, because there are no permanent, low-cost solutions for heavy metal contamination, and there are an estimated 30,000 such sites in the USA. The area of land, half a hectare (1.2 acres), is next to the soup kitchen, which will now use the area for a garden; it was once covered by a paint store which had been demolished, with much of the debris buried at the site.
The area had been ear-marked for a community garden, but this idea had to be dropped when the lead problem was found. Two crops were grown and harvested, roots and all; the plants were burned, and the ash was disposed of as a hazardous waste.
The project looks to be the sort of community project which might be carried out in other areas. Key words for use in any Web search on this process include the company Phytotech, and the terms phytoremediation, lead, mustard, Indian mustard, \IBrassica juncea\i and \IRaphanus sativus\i.
#
"Sewage, urban runoff and algal blooms",1405,0,0,0
(Feb. '00)
A recent paper in \IAquatic Microbial Ecology\i reveals that dinoflagellates, a common type of marine algae, may prefer urea, an organic nitrogen compound found in urine and in agricultural and urban runoff, over inorganic nitrogen sources such as the ammonium and nitrate ions that occur naturally in the ocean. When excess nutrients cross their paths, these single-celled organisms can grow into potentially toxic blankets of algae commonly known as red tides.
This finding has particular implications for communities which pump their sewage, treated or untreated, into the ocean. One particular bloom site of interest to the Californian researchers is, unsurprisingly, off the coast of California, where one of the blooms they looked at extended from the upper Baja peninsula in Mexico to Monterey Bay. It occurred after heavy urban runoff events in the southern California region, and now appears likely to have been triggered by the increased concentration of urea introduced to the ocean by urban runoff, say the researchers.
The 1995 bloom was the largest seen off the California coast since 1902, and the new research began with the knowledge that urea can nourish the growth of dinoflagellates under laboratory conditions. The new study showed that the dinoflagellate responsible for the 1995 bloom, \ILingulodinium polyedrum\i, can use organic urea as a nutrient source and even prefers it over the inorganic forms of nitrogen which are more usually monitored in pollution studies.
The researchers say that urea represents an average of one-third of the total nitrogen uptake supporting growth of phytoplankton in regions where red tides can occur, rising as high as 60% in Chesapeake Bay on the US east coast, at certain times of the year. While phytoplankton are essential to life in the sea, serving as the base of the marine food web, there can be too much of a good thing. When sunlight and nutrients are both available, the plankton may start rapid growth, or blooms, leading to dense patches of algae floating near the surface of the ocean that can double in size daily.
As a rule, blooms are not harmful, but a small number of phytoplankton species can produce potent neurotoxins when they form into a bloom, sometimes poisoning or killing higher life forms such as zooplankton, shellfish, fish, birds, marine mammals, and even humans as the toxin is passed up the food chain (see \Jalgae\j for more on one of these problems, ciguatera poisoning).
While \ILingulodinium polyedrum\i produces yessotoxin, a compound related to the class of poisons that cause paralytic shellfish poisoning, there was no evidence that the 1995 bloom was toxic, but large algal blooms of any type pose an additional risk by lowering the available oxygen in the surrounding water when they decay, causing small marine animals, such as zooplankton and fish, to suffocate.
Other blooms are more deadly. A bloom of the diatom species \IPseudo-nitzschia australis\i was identified as the culprit when more than 400 sea lions died and many more suffered from domoic acid poisoning on California's Central Coast in 1998. So far, there is no evidence that this diatom responds to urea.
Key names: Raphael Kudela and William Cochlan.
#
"The secret of ecosystem health",1406,0,0,0
(Feb. '00)
The diversity of plants and animals has long been regarded as a good indicator of a healthy ecosystem, but a study of 112 microcosms, living in Petri dishes in a laboratory, suggests that the health of a system may also be tied to a complex codependency between the plants and animals that produce organic matter and the simple organisms that break it down. The study was described in a paper in \INature\i in mid-February.
Producers (plants and algae) get their nutrients from inorganic sources supplied by decomposers (fungi and bacteria), and the decomposers in turn acquire the carbon they need from the producers. It seems that when there are more species in each group, the greater diversity brings more efficiency, and there is a clear message here for somebody planning to replace a rainforest with a banana plantation, a prairie with a wheat field, or an old growth forest with a pine plantation. Producer diversity, which we can see, is important, but decomposer diversity is also important.
Each microcosm was a Petri dish with 50 mL of a growth medium, kept in the same temperature and light conditions. In various combinations, the dishes were loaded with zero, one, two, four, eight or 12 species of bacteria and zero, one, two, four or eight species of algae. All combinations other than the zero-zero one were used.
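A quick sketch of that design as we read it from the description (the replication count is our inference from the total of 112 dishes, not something the report spells out):

from itertools import product

bacteria_levels = [0, 1, 2, 4, 8, 12]   # decomposer species richness
algae_levels = [0, 1, 2, 4, 8]          # producer species richness

treatments = [(b, a) for b, a in product(bacteria_levels, algae_levels)
              if (b, a) != (0, 0)]
print(len(treatments))   # 29 distinct treatments; 112 dishes implies that
                         # most treatments were replicated about four times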
Algal production varied greatly, and was shown to be correlated directly with the diversity of both algae and bacteria species. With galloping extinction (some researchers think half the world's species will be extinct by 2050), this interaction becomes even more worrying, for most biodiversity reduction affects both producers \Iand\i decomposers. It is quite possible that the first 50 years of the 21st century will see humans having as much impact on the world as the asteroid impact of 65 million years ago which is accused of wiping out the dinosaurs.
The only question is: will the humans be confined to play the part of the asteroid, or will they also appear on the bill, cast in the role of the dinosaurs in this production?
Key names: Shahid Naeem, Daniel Hahn and Gregor Schuurman.
#
"A 16th century epic drought",1407,0,0,0
(Feb. '00)
With the news (see \JWater shortages in the US Southwest\j and \JAnother dry summer in north America\j, this month) that La Niña may continue to drive droughts across the north American continent over the 2000 northern summer, it is probably small consolation to know that things were even worse in the second half of the 16th century in north America.
The evidence of a "mega-drought" in the 16th century, one that wreaked havoc for decades in the lives of the early Spanish and English settlers and American Indians throughout Mexico and North America, has been found in tree rings from drought-sensitive trees in Western North America, the Southeast and the Great Lakes. A drought of these proportions in modern-day America could cause a catastrophe unless water resources are wisely conserved.
A report of the finding appeared on the Internet during February, and will appear in detail in an upcoming issue of the journal \IEOS, Transactions of the American Geophysical Union\i. In brief, the evidence shows that dry conditions extended from the Sierra Madre Occidental in Mexico and the Southwest to the Rocky Mountains and the Mississippi Valley throughout the last half of the 1500s. Severe conditions occurred at times in Mexico, the Southwest, Wyoming and Montana, and the Southeast.
All the way back to 1200 AD, there was no other drought like this one. Instead of gentle fluctuations around the mean, "the basement collapsed and went down to another level," said David Stahle, professor of geosciences at the University of Arkansas. In places, the extended period of dryness lasted 40 years. What is more, the available records corroborate the story told in the tree rings.
Spanish settlers at Santa Elena on Parris Island, South Carolina, left archives recording a severe drought from 1566 to 1569, and in 1587, the year Sir Walter Raleigh's colony on Roanoke Island disappeared, the Parris Island settlers abandoned their colony. Tree ring records show the year was the region's worst drought in 800 years.
An historic drought of this magnitude should serve as a warning to nations to learn to use their water resources wisely, Stahle says. "If there's any lesson to be taken home from the paleo-record, it's that we need to conserve our water resources; it would help prepare us for the inevitable return of drought."
The growth of a tree depends on the water and nutrients it receives: it is rare for light to be the limiting factor. Cells laid down through the seasons in temperate areas vary in a regular way, and in bad years, there is irregular variation as well. As a result, researchers peering through microscopes can tell much about a region's climatic history by looking at the recorded tree ring growth from year to year, using pencil-thin core samples from living trees.
Because we have climate data for at least a century, we can use this to build statistical models which then can be used to reconstruct past climate changes going back hundreds of years. One tree may tell a slightly skewed story, because of local events, but a set of 30 or 40 tiny core samples from trees in the same region form a library with a shared recording of the climatic past.
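The calibration step is worth making concrete. The sketch below (Python, with invented numbers, not data from this study) shows the general idea: fit a simple model of climate against ring width over the instrumental period, then apply it to rings that predate the instruments:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: a century of instrumental rainfall (mm)
# overlapping with measured ring widths (mm) from drought-sensitive trees.
rainfall = rng.normal(500, 80, 100)
ring_width = 0.002 * rainfall + rng.normal(0, 0.1, 100)  # wetter -> wider

# Calibrate: fit a linear model over the instrumental period.
slope, intercept = np.polyfit(ring_width, rainfall, 1)

# Reconstruct: apply the fitted model to pre-instrumental ring widths
# (a hypothetical core sample), estimating the rainfall of each year.
old_rings = np.array([0.45, 0.90, 1.10, 0.30])
print(slope * old_rings + intercept)

Real reconstructions average many cores and use more careful statistics, but the logic - calibrate on the overlap, extrapolate into the past - is the same.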
The finding has a wider significance. It may help explain why some American Indians in the Southwest and northern Mexico abandoned their pueblos between 1540 and 1598, and going on the pattern of the affected area, the drought seems likely to have been an extreme form of La Niña, but clearly it was unrelated to more recent global warming. It would be nice to know the cause, since the extreme effects of El Niño and La Niña affect all parts of the world, one way or another, as extreme floods in Australia, Argentina, Madagascar and Mozambique demonstrated in early March, 2000.
#
"Another dry summer in north America",1408,0,0,0
(Feb. '00)
An examination of the evidence suggests that the La Niña-driven drought of 1999 in North America is likely to continue in the northern summer of 2000. The cool La Niña waters in the eastern Pacific Ocean, believed to be related to the 1999 summer drought in the eastern United States, are forecast by the National Oceanic and Atmospheric Administration to strengthen and persist through spring and early summer.
A westward and northern movement in drought conditions is expected through the coming summer, climatologist Jim Newman says in \IGeophysical Research Letters\i. He says that overall, La Niñas are associated with higher drought risks across the continental United States and Canada. "Drought is easily the No. 1 natural disaster in terms of its potential impact on food production and supply," he said in a comment released on the Internet.
#
"Water shortages in the US Southwest",1409,0,0,0
(Feb. '00)
The Pacific Decadal Oscillation (see \JMove Over El Niño\j, October 1997) may be shifting into a new phase, according to some climate experts. The PDO is a long-term Pacific sea temperature and sea surface pressure pattern, and if they are right, the American Southwest could be poised at the beginning of a drought that could last 10 years or longer.
An Internet news release during February warned that it might be ten years before we know for certain, but researchers in Arizona are already looking at how a drought similar to the one that occurred in the same area in the 1950s would affect Phoenix and Tucson water supplies in the year 2025. During recent 'good' years, populations in the area have boomed, and they are trying to find out now if the area could be at risk in the future. Their prognosis is not encouraging.
The PDO is a fairly regular pattern of high and low pressure systems over the northern Pacific Ocean, off the coast of Alaska and Canada, which operates on a 20 to 30-year time scale. Previous shifts occurred in 1925, 1947 and 1977, and some climatologists believe that the PDO shifted again around 1995.
The PDO correlates with relatively wetter and drier periods in western North America, and recent research suggests that the PDO enhances El Niño and weakens La Niña conditions in one phase, then weakens El Niño and enhances La Niña conditions in its alternate phase. Since 1977, as the new research would predict, the American Southwest has been blessed with wetter winters during El Niño years and not-so-dry winters in La Niña years. But if the change has occurred, then the current short-term drought caused by La Niña (see \JAnother dry summer in North America\j, this month) may only be the start of a long dry spell.
There was a prolonged and severe drought in the Southwest of the US during the 1950s, and the coming droughts could be expected to be as bad.
Key name: Barbara Morehouse.
#
"Borehole temperature readings say it's getting hotter",1410,0,0,0
(Feb. '00)
A careful new study of borehole temperatures from more than 600 sites around the world has revealed that the planet's 500-year warming trend accelerated considerably in the 20th century. The bad news is that, at least in the Northern Hemisphere, the 500-year warm-up has been even greater than previously estimated with other techniques.
That is the main thrust of a paper in \INature\i in mid-February. Since 1500, Earth's temperature has increased about 1 °C (1.8 °F), with half of that increase taking place in the 20th century: in the Northern Hemisphere, the temperature change has been 1.1 °C (2 °F) over the past five centuries and 0.6 °C (1.1 °F) in the 20th century.
The analysis is based on temperature readings taken with sensitive thermometers lowered into holes drilled from Earth's surface. Because of heat conduction, temperature changes at the surface generate "signals" that travel downward into subsurface rocks, say the researchers. While signals from short-term daily or seasonal variations penetrate only a few meters, and the Earth quickly "forgets" them, the temperature changes that take place over hundreds of years are preserved in deeper rock.
The temperature changes travel slowly, about half a meter a year, so the top 500 meters of the planet's crust is an archive of the last millennium. While some of the results are a bit patchy and inconsistent, the overall pattern is extremely clear on a global scale.
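The physics behind this "archive" is ordinary heat conduction, and a toy model shows why the signal is both preserved and smoothed. The sketch below (Python, explicit finite differences, with an assumed thermal diffusivity typical of crustal rock; everything else is illustrative) imposes a one-degree step warming at the surface and watches it creep downward over five centuries:

import numpy as np

kappa = 1.0e-6                  # assumed thermal diffusivity of rock, m^2/s
dz, dt = 10.0, 1.0e7            # grid spacing (m) and time step (s)
steps = int(500 * 3.15e7 / dt)  # about 500 years of simulated time

T = np.zeros(61)                # temperature anomaly at 0, 10, ... 600 m
T[0] = 1.0                      # 1 degree C step warming at the surface

for _ in range(steps):
    # explicit finite-difference update for 1-D heat conduction
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0] = 1.0                  # surface held at the new temperature

for depth in (0, 100, 200, 300, 400):
    print(f"{depth:4d} m: {T[depth // 10]:.2f} of the surface signal")

The warming is clearly visible one or two hundred meters down, but blurred: exactly the behavior that preserves century-scale trends while "forgetting" daily and seasonal wiggles.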
An earlier analysis of borehole temperature data from 358 sites in eastern North America, central Europe, southern Africa and Australia showed a similar worldwide warming over the past 500 years; this study adds extra sites and extends the geographical coverage. Like other measures, such as studies of tree rings (see \JA 16th century epic drought\j, this month), ice cores (see \JA quick cold snap?\j, July, 1999), lake sediments (see \JAfrica Hot\j, August 1998) and coral growth (see \JA coral thermometer\j, January 2000), this yields consistent results.
According to Pollack, "All the methods generally show a very unusual 20th century, and ours does, too. The 20th century is the warmest century of the last five, and the one which is most rapidly changing. What we show that is somewhat different is that the total temperature change over the past five centuries has been greater than some of the other methods are showing."
Key names: Shaopeng Huang, Henry N. Pollack and Po-Yu Shen.
#
"The anomalous lower atmosphere",1411,0,0,0
(Feb. '00)
Even though scientists are now agreed that the world is warming up, there still remains the problem of why the lowest 8 km (5 miles) of the earth's atmosphere has not warmed as quickly as the earth's surface. A paper in \IScience\i in mid-February suggests three causes: the thinning of the ozone layer, emissions from the Mt. Pinatubo volcano, and the influx of sulfate aerosols and greenhouse gases into the atmosphere, which may be behind the difference in temperature trends at the surface and in the lower troposphere.
The large group involved in the study looked at three observational data sets:
- a century of thermometer readings of sea surface temperatures and air temperatures a few meters above land
- a half century of radiosonde measurements of troposphere and lower stratosphere temperatures
- two decades of global observations of tropospheric temperatures (up to eight kilometers) taken by a series of satellites that measure the upwelling microwave radiation from oxygen molecules
In the period 1979 to 1998, the surface data show a warming of 0.2 °C to 0.4 °C, while the radiosonde and satellite data show no warming or only a slight temperature rise (0.1 °C) in the lower troposphere over the same period. The researchers are fairly confident that ozone depletion and the Mt. Pinatubo emissions are likely candidates for explaining at least part of the cooler temperatures in the lower to middle troposphere compared to the more intense warming at the surface, but it is unlikely that we have heard the last word on this difference, which is the only remaining hope for those who say that there is no such thing as global warming.
Key names: Ben Santer, Tom Wigley and Gerald Meehl, and ten other authors.
#
"Death by global warming?",1412,0,0,0
(Feb. '00)
Ecologist David Pimentel, last seen in our news on the problems of poor quarantine laws (see \JWhat is a quarantine system worth?\j, January 1999), suggested to the AAAS meeting in Washington D. C. during February that global warming will create a favorable climate for disease-causing organisms and food-plant pests, making for a much more challenging planet for humans struggling to survive. He suggested that even if coroners in the future do not bring in verdicts of 'death by global warming', perhaps they should.
Already, he said, there are noticeable increases in human diseases worldwide, mostly due to environmental factors such as infectious microbes, pollution by chemicals and biological wastes, and shortages of food and nutrients, but global warming will only make matters worse. There were, argued Pimentel, seven worrying trends, as he set them out on the Web, with additional comments here in [square brackets]:
- Today, infectious disease causes approximately 37% of all deaths worldwide, but the estimated number of deaths due to a variety of environmental factors is higher and still growing. Environmental diseases are attributed especially to organic and chemical pollutants, including smoke from various sources such as tobacco and wood fuels. [See, for example, \JChina and sulfur emissions\j, November 1999]
- More than 3 billion people currently are malnourished, the largest number and proportion of humans in desperate need of food and nutrients in human history, and that number increases every year. Malnutrition increases susceptibility to infectious and environmental diseases, such as diarrhea and pollution-related illnesses. [See \JNew rice strains and vitamin A and iron deficiency\j, August 1999, for background]
- A population increase to 12 billion in the next 50 years (based on current growth rates) will exacerbate the spread of disease globally, the Cornell ecologist said. Densely crowded urban environments, especially those without adequate sanitation and nutrition, should be of great public-health concern because they are sources of disease epidemics. Dengue fever is spread by the \IAedes aegypti\i mosquito [and also by other mosquitoes in the \IAedes\i genus in certain parts of the world], which breeds in old tires and other water-holding containers. This group of mosquitoes is expanding rapidly in crowded tropical cities. With global warming, this mosquito and others will spread north [and south], transporting dengue and other diseases from the tropics.
- Waterborne diseases - already accounting for nine out of 10 deaths from infectious disease in developing countries - will become more prevalent in a warmer, more polluted and crowded planet. For example, only eight of India's 3,120 towns and cities have full wastewater treatment facilities. Hundreds of millions of people in India and other developing countries are forced to use untreated water for drinking, bathing and cooking.
- Today, air pollutants adversely affect the health of more than 4 billion people worldwide, and air quality in many places is getting worse. The number of automobiles worldwide is growing approximately three times faster than the world population. Meanwhile, an expanding world population is burning more fossil fuels for domestic and industrial purposes. The grim history of lung cancer - a three-fold increase from 1950 to 1986 - could be repeated, Pimentel predicted, in developing countries. About 4 billion people in developing countries who cook with wood and coal over open fires suffer continuous exposure to smoke. Wood smoke is estimated to cause the death of 4 million children each year.
- The more than 3 billion of the world's people who are malnourished increasingly are susceptible to infectious and environmental diseases - cropland has been diminished by 20% in the last decade, per capita fertilizer production has fallen by 23%, and per capita irrigation water supplies have dropped by 12%.
- And increasing global climate change will result in a net loss of available food. Although there may be some benefits in crop production from warmer climates, these beneficial effects will be more than offset by the projected decline in rainfall in critical crop-growing regions like the U.S. Corn Belt. Crop losses from pest insects, plant diseases and weeds will increase in a warmer climate: as it is, insect pests, plant pathogens and weeds cause the loss of more than 40% of the world's food - despite the application of 5 billion pounds of pesticides each year.
#
"A faster rate of global warming",1413,0,0,0
(Feb. '00)
An article to be published in \IGeophysical Research Letters\i on March 1 reveals that the rate of global warming is accelerating: in the past 25 years, it has reached two degrees Celsius (four degrees Fahrenheit) per century, a rate that had previously not been expected until the 21st century.
In 1997-98, a string of 16 consecutive months saw record high global mean temperatures, a pattern never seen before in the entire history of systematic instrumental temperature records, which began in the 19th century. According to Thomas R. Karl, the lead author, there is only a one-in-20 chance that the string of record high temperatures in 1997-1998 was simply an unusual event, rather than a change point, the start of a new and faster ongoing trend.
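Karl's one-in-20 figure comes from a formal statistical model, but the flavor of the question can be captured with a quick Monte Carlo. The sketch below (Python; the trend and noise values are assumed round numbers, and this is emphatically not the authors' actual analysis) asks how often a steadily warming but otherwise random temperature series happens to produce 16 consecutive record-setting months:

import numpy as np

rng = np.random.default_rng(1)

months = 120 * 12               # ~120 years of monthly data
trend = 0.6 / 1200              # assumed warming: 0.6 C/century, per month
sigma = 0.15                    # assumed monthly noise, deg C

def longest_record_run(series, baseline=360):
    # The first 30 years seed the record books; runs of new records
    # are counted only after that.
    best = series[:baseline].reshape(-1, 12).max(axis=0)
    run = longest = 0
    for i in range(baseline, len(series)):
        m = i % 12              # record is per calendar month
        if series[i] > best[m]:
            best[m] = series[i]
            run += 1
            longest = max(longest, run)
        else:
            run = 0
    return longest

trials, hits = 1000, 0
for _ in range(trials):
    series = trend * np.arange(months) + rng.normal(0, sigma, months)
    if longest_record_run(series) >= 16:
        hits += 1

print(f"{hits}/{trials} simulations produced a run of 16 record months")

If such runs are rare under a steady trend, then observing one suggests the trend itself may have changed, which is the logic of the paper's argument.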
Since that analysis, they have worked their way through the 1999 temperature data, which should have shown results that were well down the scale, since it was a La Niña year, but outside the band between 20 degrees north latitude and 20 degrees south latitude, 1999 was the second warmest year of the 20th Century, just behind 1998, an El Niño year.
So either the world is getting warmer even faster than before, or something else is happening which leads to the appearance of an accelerating rate of warming. Unusual and chance fluctuations are always likely to be part of the weather patterns we can observe, but in the absence of any other evidence, it seems likely that the rate is accelerating, and given the time it will take to smooth out any such increase, the authors have urged that studies be conducted to enable society to minimize the risks of climate change and prepare for more, and perhaps even more rapid, changes to come.
Key names: Thomas R. Karl, Richard W. Knight, and Bruce Baker.
#
"The approaching ice age",1414,0,0,0
(Feb. '00)
Most climate researchers are more worried right now about global warming, but paleoclimatologists are more concerned about the prospects for the next ice age, even though that is probably 5000 years away. During the past 10,000 years since the end of the last ice age, the period known as the Holocene, our Earth has enjoyed a relatively warm and stable climate. This period has also coincided with the emergence of human civilization and the dominance by our species of the globe, but how will we fare when the ice sheets return?
Note that the term is when, not if. In a paper published in \IScience\i during February, arising out of a conference in October 1999, George Kukla argues that a currently popular interpretation of Holocene climate as uniquely benign is both mistaken and misleading: it is quite likely to put the wind up us in more ways than one.
The geological record shows that Earth has seen an ongoing cycle of ice ages dating back millions of years, in which cold glacial periods are broken by briefer, warmer periods called interglacials. The Holocene is just another interglacial that is more than half over, Kukla said. The pattern of glaciation closely matches cyclic variations in Earth's orbit around the sun, and many researchers believe that the form of our orbit is what drives glaciation. The match-up of orbit and climate is called the Milankovich cycle, after the scientist who analyzed and popularized it in the 1920s.
The link is too clear-cut to allow much room for doubt, and since it is unlikely that climate drives the orbit, it seems definite that the orbit drives the climate, which means we are in for another ice age, though for now the actual mechanisms that initiate and drive glaciation remain a mystery. Right now, the configuration of the sun and earth is fast approaching what it was 116,000 years ago, when the last interglacial period ended. While the annual mean temperature on Earth is now rising, polar mean temperatures remain steady, and ice fields in the upper elevations of Greenland are actually expanding.
The last interglacial, the Eemian, ran from about 128,000 to 106,000 years ago, and it is clearly recorded in ocean sediments, lake sediments and ice cores from the polar regions, but all of these require interpretation, because the data they offer are indirect measures of relative conditions. We interpret the climate during the Eemian by looking at the oxygen isotopes in foraminiferan shells, and oxygen isotope data from water molecules trapped in successive layers of old ice, to estimate the actual polar temperatures when the ice formed. Plant pollen measures from northern European lake sediments offer a record of the transition on land between temperate woodlands and the sparse grasslands that dominate during cold periods.
The idea that the Eemian showed major swings came from an interpretation of a single ice core collected in Greenland by a European team in 1992. The oxygen isotope data from that core showed the Holocene to have been uniformly mild in comparison to the Eemian, which appeared to have witnessed relatively severe temperature swings. Once that notion was published, scientists began looking for other evidence of Eemian climate extremes, and that, according to Kukla, is where we went wrong. As more people weighed in, looking for confirmatory evidence, they found it, and so we became a little more convinced that the Holocene was special, because it was so much more stable.
In Kukla's view, we should look harder at the data from another core later collected near the first one, as well as the overwhelming weight of ocean sediment data. These lines of evidence show the Holocene to be similar to the early and mid Eemian, he said.
This may lead us to look in the wrong place, according to Kukla. He points out that glaciation starts in the polar regions, which together comprise only 14% of Earth's surface, and that the evidence suggests the ice ages begin building at the poles thousands of years before their effects are felt elsewhere. So it is possible that we should look away from the global mean temperature, and look rather at the temperature difference between the poles and the equator. A greater difference would probably increase the flow of water vapor from the tropics toward the poles, where it would fall as snow to feed the growing ice fields.
Indeed, argued Kukla, it is possible that greenhouse warming could even hasten the transition to glacial conditions by exacerbating the polar/equatorial temperature difference and increasing the rate of water transport poleward. On his scenario, and based on the record revealed in ocean and lake sediments, the volume of polar ice will slowly grow, with sea levels slowly dropping even as the polar/equatorial temperature difference stays constant. The oceans and continents will remain relatively warm, except near the poles, with a climate that grows more unstable with time, until the polar ice surges into the mid-latitude oceans, and another ice age is upon us.
See also: \JMilankovich, Milutin\j.
#
"March, 2000 Science Review",1415,0,0,0
\JHow animals use Earth's magnetic field\j
\JAre magnetic fields bad for you?\j
\JAre magnetic fields good for you?\j
\JNew vaccines\j
\JNight-lights and nearsightedness\j
\JNew TB test announced\j
\JAnticancer agent in hazelnuts\j
\JBreast implant dangers minimal\j
\JComfrey and liver damage\j
\JThe caterpillar and heart research\j
\JA link between common virus and heart failure\j
\JModified rice could end food shortages\j
\JFruit fly genome is published\j
\JFive Little Pigs\j
\JPiglets open the way to xenografting\j
\JAvoiding Bt resistance\j
\JMarcus Wallenburg prize, 2000\j
\JNew RS President\j
\JMathematicians: Double Soap Bubble Had It Right\j
#
"How animals use Earth's magnetic field",1416,0,0,0
(Mar. '00)
Many birds, salamanders, salmon, hamsters and even humans have been shown to have a built-in magnetic sense (see \Jmagnetism\j for more details), but up until now, nobody has been sure how this sense worked. A report in the February issue of the \IBiophysical Journal\i suggests that a blue-light photoreceptor found in nerve layers of the eyes and brain may play a part, or even form the whole of the magnetic compass that lets migratory birds and many other creatures find home using the Earth's magnetic field.
The receptor is called cryptochrome, and it is known to play a prominent role in regulating an animal's circadian (day-and-night) rhythm. Chemical experiments and computational modelling described in the report indicate that cryptochrome may be the site of a neurochemical reaction that lets birds, for example, process visual cues from the magnetic field and stay on course.
Typical biomolecules interact with Earth's magnetic field too weakly for the field to alter the course of their chemical reactions, but some evidence gathered by Klaus Schulten suggested that certain chemical reactions involving so-called radical pairs can be influenced by weak magnetic fields, like the field of a door magnet. Now Thorsten Ritz, a member of Schulten's team, has found theoretical evidence that a biochemical reaction involving cryptochromes can be influenced by an Earth-strength magnetic field.
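The reason "typical biomolecules interact too weakly" is easy to put in numbers: the magnetic energy of a single electron spin in the Earth's field is minuscule next to thermal energy at body temperature, so ordinary equilibrium chemistry would never notice the field. The comparison below (Python, standard physical constants, Earth's field taken as roughly 50 microtesla) makes the point:

mu_B = 9.274e-24        # Bohr magneton, J/T
k_B = 1.381e-23         # Boltzmann constant, J/K
B_earth = 50e-6         # Earth's magnetic field, tesla (approximate)
T_body = 310.0          # body temperature, kelvin

zeeman = mu_B * B_earth   # magnetic (Zeeman) energy of one electron spin
thermal = k_B * T_body    # thermal energy scale

print(f"magnetic energy: {zeeman:.1e} J")
print(f"thermal energy:  {thermal:.1e} J")
print(f"ratio:           {zeeman / thermal:.1e}")   # about 1e-7

A radical pair dodges this limit because its two electron spins evolve coherently, out of thermal equilibrium, for long enough that even a tiny Zeeman interaction can steer the reaction's outcome.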
In many cases, migratory birds and other animals are unable to distinguish between north and south based on magnetic information alone: they can only detect the angle of the magnetic field lines with the horizon, a limitation the theory explains through symmetries in the visual modulation patterns. The theory put forward by Schulten was that, if radical-pair reactions in cryptochromes were connected by photoreception to the vision of animals, the magnetic field might modulate visual sensitivity. Animals would "see" the geomagnetic field by superimposing information about the field's direction onto their visual images.
Behavioral biologists have now found that many magnetic responses require light, and that the orientation of some animals was erratic when they were exposed to monochromatic red light. Such findings strengthened the theory, Schulten and Ritz say, because radical-pair reactions require light above a certain energy threshold.
In a comment on the Internet, Schulten says, "The visual modulation patterns that we found show surprising agreement. The hunt for the elusive magnetoreceptor is not over, but we have provided a new, promising track."
#
"Are magnetic fields bad for you?",1417,0,0,0
(Mar. '00)
A nested case-control study, reported in \IOccupational and Environmental Medicine\i, suggests that there may be an increased risk of suicide after prolonged exposure to low-frequency electromagnetic fields. The study looked in detail at a sample of 6000 of the 139,000 electricians and other field technicians employed at any of five electric power companies in the United States at any time between 1950 and 1986. Those sampled had an average length of time in the industry of 16 years.
The suicide level was twice as high among those whose work regularly exposed them to electromagnetic radiation. The highest risk of suicide was found among those with the highest levels of exposure, particularly in the year preceding death. The association was even stronger among those whose death occurred before the age of 50.
The authors suggest that electromagnetic fields may reduce the production of melatonin, a hormone that maintains daily circadian rhythms, including the sleep and wake cycle. Reduced levels of melatonin are associated with depression, but, while this is a possible explanation, it does not rule out other possible causes for this significant link.
Key name: David Savitz, Department of Epidemiology, University of North Carolina
#
"Are magnetic fields good for you?",1418,0,0,0
(Mar. '00)
In a report circulating on the Internet, Henry Lai, a University of Washington research professor of bioengineering, proposes using oscillating magnetic fields to defeat malaria. According to Lai, the malaria parasite \IPlasmodium\i appears to lose vigor and can die when exposed to such fields, which Lai thinks may cause tiny iron-containing particles inside the parasite to move in ways that damage the organism.
Drug resistance in malarial parasites has lifted the world level of malarial infections to 500 million people, with 2.7 million deaths each year, 1 million of them children. If Lai's solution works, it is highly unlikely that \IPlasmodium\i could develop a resistance to magnetic fields.
While the cure may sound a little "quackish" at first glance, it makes good logical sense when you look into it. The parasites consume the hemoglobin of the red blood cells they invade, breaking down the globin portion of the hemoglobin molecule. But the iron portion, or the heme, is left intact because the parasite lacks the enzyme needed to degrade it.
Free heme molecules can damage the membranes in the parasite's cell through a chain reaction of oxidation of unsaturated fatty acids. In response, \IPlasmodium\i has evolved the ability to make the heme molecules harmless by binding them into long stacks. It is these stacks, Lai thinks, which are magnetic. Whether the explanation is correct or not, laboratory studies have shown a drop, under a weak alternating (oscillating) magnetic field, of between 33 and 70% in the parasite levels, and also a reduced level of metabolic activity by the parasites.
It is possible that the field disrupts the 'stacking' process, leaving the harmful heme loose to play havoc within the parasite or, if the parasite has already bound the heme into stacks, the oscillating field could cause the stacks to spin, causing damage and death to the parasite. But it is early days yet, and Lai says more research is needed.
"We need to make certain that it won't harm the host," he warns. "My guess is that it won't. It's a very weak magnetic field, just a little stronger than the earth's. The difference is that it is oscillating." If the method is proven effective and safe, Lai envisions rooms equipped with magnetic coils to produce the oscillating field.
"It would be very easy," he says. "People could come to the room and sit and read or whatever while they're being treated. Or you could set it up in the back of a big transport truck, then drive from village to village to treat people."
Key names: Henry Lai, Jean E. Feagin, and Ceon Ramon.
#
"New vaccines",1419,0,0,0
(Mar. '00)
The 7th Biennial Meeting of the Vaccines and Immunotherapeutics Conference was held in Victoria, Australia, during March, where more than a hundred vaccine scientists from Australia and beyond shared information on the latest strategies to improve human and animal health. Highlights were details of new vaccines against cancer, tuberculosis and malaria, which owe a lot to advances in gene technology.
The conference targeted both human and veterinary health applications for new vaccines and therapeutic treatments, signalling the approach of a different era in vaccine development. Unlike our position at the start of the 20th century, the simpler tasks, like vaccines for polio, measles, diphtheria and scarlet fever, have been dealt with, and we are left with the more complex and difficult health problems, such as tuberculosis (TB), HIV, cancer, and malaria.
One Australian group, from the Ludwig Institute for Cancer Research in Melbourne, is investigating a treatment for melanoma (see \JMelanoma vaccine trials succeed\j, November 1999 for similar work). Across a wide range of cancers, scientists are now investigating the differences between cancer cells and normal cells in the hope that these differences could be used as the basis for a therapeutic vaccine to treat cancer patients.
One topic that is gaining a lot of attention involves using natural immune boosters, called cytokines, to help combat cancers such as melanoma and a range of human and animal diseases, including HIV/AIDS. The ongoing problem of developing a vaccine against malaria was also discussed at the conference. The latest notion involves 'teaching' the immune system to fight a toxin that is produced by the malaria parasite. This project is being undertaken by the Walter and Eliza Hall Institute of Medical Research in Melbourne.
Other methods described in papers included vaccination with naked \JDNA\j and engineering new 'designer' antibodies. Dr Marion Andrew, from the Australian CSIRO's Animal Health division, commented that "There are many examples of creative approaches to disease prevention and treatment. DNA vaccination is proving to be an exciting new technology to deliver safe and effective vaccines. The use of modified, harmless viruses, to carry vaccines to specific parts of the body where they are needed is another example."
Dr Andrew, who was a conference presenter and co-organiser, added that "In many cases, an approach which uses a number of new techniques will be needed to beat those hard-to-solve disease problems." According to another CSIRO scientist, Dr Peter Hudson, 'designer' antibodies and their fragments now represent over 30% of all biological proteins undergoing clinical trials for diagnosis and therapy in humans.
Other issues raised included trials of vaccines for contraceptive use in animals such as mice, wild foxes and Australian possums (a major pest in New Zealand) in an attempt to achieve control that has limited 'knock-on' effects on other species sharing the same habitat. (See \JHuman breast cancer and mice\j, January 2000, for one reason why control of the house mouse is important.)
#
"Night-lights and nearsightedness",1420,0,0,0
(Mar. '00)
Contrary to an earlier report (see \JNear-sighted children and the light\j, May, 1999), it seems that night-lights are not to blame for children being near-sighted after all. A further study, published like the first one in \INature\i, looked at 1220 children, but this one found no significant differences: of 417 children who had slept without a light on, 20% became myopic (near-sighted); of 758 children who had slept with a night-light on, 17% became myopic; and of 45 children who had slept in a fully lit room, 22% became myopic.
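With the published group sizes and percentages, readers can check the "no significant difference" claim for themselves. A quick test (Python with scipy; the counts are reconstructed from the rounded percentages, so they are approximate):

from scipy.stats import chi2_contingency

# Approximate counts: ~20% of 417, ~17% of 758, ~22% of 45.
myopic = [83, 129, 10]
group_n = [417, 758, 45]        # darkness, night-light, fully lit room
table = [myopic, [n - m for n, m in zip(group_n, myopic)]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

The p-value comes out far above the usual 0.05 threshold, consistent with the study's conclusion.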
The study covered a full range of ethnicity in four areas across the United States, and involved surveying parents for the study, asking what kind of night-time lighting had been used in their children's rooms before age 2. Eyes grow rapidly during the first two years of life, but myopia usually does not develop until much later. The average age of the children surveyed in this study was 10 years.
The first study did not take into account the eyesight of the parents, and the researchers in this study noticed that near-sighted parents were more likely to use a night-light in their child's room. Genetics plays a significant role in causing myopia, and this habit may well account for the findings in the first study. So it appears that parents can rest easy, and leave the night-light on.
Key names: Karla Zadnik, Donald Mutti, Lisa Jones and others.
#
"New TB test announced",1421,0,0,0
(Mar. '00)
A new test for tuberculosis was described in the March issue of the \IJournal of Clinical Microbiology\i. Called microscopic observation broth-drug susceptibility assay (MODS), the test can quickly and cheaply detect tiny amounts of the \IMycobacterium tuberculosis\i bacillus, as well as determine whether a particular strain of the bacillus is resistant to any drug.
The test is at once inexpensive and sensitive enough to be used in the field by health officers in developing countries. The majority of the estimated 8 million new cases of clinical tuberculosis and 3 million deaths caused by TB each year happen in developing, resource-poor countries, where the conventional tests for the bacillus are simply not feasible because of their high costs and equipment requirements.
The effectiveness and sensitivity of the MODS assay has been tested in Peru, where it was shown to give results comparable to those of other high-tech, more expensive, and time-consuming methods.
#
"Anticancer agent in hazelnuts",1422,0,0,0
(Mar. '00)
\JTaxol\j is now a registered trademark, but the drug known by this name, or as paclitaxel, has now been found in hazelnuts. Although it has previously been found in a fungus, this is the first report of the potent chemical being found in a plant other than the yew tree. The discovery was announced at the 219th national meeting of the American Chemical Society in late March.
The study began with an attempt to find out why some hazelnut trees are resistant to a plant disease known as Eastern Filbert Blight. A chemical analysis of extracts from these hazelnut trees identified paclitaxel in some of the trees, with the chemical being isolated from the nuts, branches and shells of the trees.
The U.S. Food and Drug Administration (FDA) has approved Taxol® for the treatment of ovarian cancer, breast cancer and AIDS-related Kaposi's sarcoma. Commercial supplies of Taxol® are now manufactured by a semi-synthetic method that relies on extracts from the leaves of another yew species. While paclitaxel has been synthesized artificially in the laboratory without using any yew parts, this method is currently too complex and expensive to implement commercially, so this new discovery could be important. This is especially so as researchers are now finding an increasing number of other medical applications that are boosting demand for it. Clinical studies have shown that the drug is promising for the treatment of psoriasis, polycystic kidney disease, multiple sclerosis and Alzheimer's disease, among others.
The yield is lower than in the yew tree, which gives 60 to 70 ppm of paclitaxel from dried material, while the hazelnut yield is more like 6 to 7 ppm on a dry weight basis, but it is easier to gather. But if you have cancer, don't rush out to stock up on hazelnuts: there is probably not enough paclitaxel in a handful of nuts to make a difference medically.
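The dry-weight figures allow a quick sanity check on that advice. A rough calculation (Python; the size of a "handful" is assumed, and the dose comparison is only order-of-magnitude):

ppm = 7                    # paclitaxel in hazelnut, parts per million (dry)
handful_g = 25             # assumed handful of dried hazelnut, grams

micrograms = handful_g * ppm        # 1 ppm of 1 g is 1 microgram
print(f"{micrograms} micrograms, about {micrograms / 1000:.3f} mg")

A handful yields a fraction of a milligram, while therapeutic courses are measured in hundreds of milligrams: a gap of roughly a thousand-fold.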
#
"Breast implant dangers minimal",1423,0,0,0
(Mar. '00)
A mid-March report in the \INew England Journal of Medicine\i outlines the largest, most comprehensive study done to date of the possible link between silicone breast implants and connective tissue diseases, and indicates that there is no evidence that the implants impair women's health. The study looked at such illnesses as rheumatoid arthritis, lupus erythematosus, scleroderma and other systemic conditions, but could find no link between the implants and any of these conditions.
The researchers performed a series of meta-analyses, which means combining and analyzing information from previous studies. Meta-analysis is particularly useful when previous studies have lacked enough subjects to produce a clear result, according to Dr Janowsky, the lead author, who added that their " . . . conclusion is consistent with earlier work indicating no relationship."
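The mechanics of a meta-analysis are simple at heart. The sketch below (Python, with invented relative risks, not the paper's data) shows the standard fixed-effect, inverse-variance method: each study's estimate is weighted by its precision, and pooling shrinks the uncertainty:

import numpy as np

rr = np.array([0.9, 1.2, 1.0, 0.8, 1.1])       # hypothetical relative risks
se = np.array([0.30, 0.25, 0.15, 0.40, 0.20])  # standard errors of log(RR)

log_rr = np.log(rr)
w = 1.0 / se**2                                # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)        # weighted mean of log(RR)
pooled_se = np.sqrt(1.0 / np.sum(w))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f})")

A pooled relative risk whose confidence interval straddles 1.0, as here, is exactly the "no association" pattern the implant studies produced.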
Altogether, the workers found 757 citations in the world medical literature and they reviewed all of the potentially relevant work on the effects of silicone breast implants. Finally, they used data from nine cohort studies, nine case-control studies and two cross-sectional studies that met various standards, including the presence of an internal comparison group and having sufficient numbers available to create tables for the meta-analyses. Most of the studies originated in the USA, but the researchers also used investigations from Canada, Australia, the United Kingdom and northern Europe.
The study did not include women who had had direct injections of any material into their breasts, including silicone, but there was not enough information to assess the effect of implants if they leaked or ruptured while inside women's breasts. The review did include two studies that included enough information to analyze the effect of how long implants had been in place. The analysis suggested that the likelihood of any illness did not increase over time.
It now seems that intensive publicity about suggested adverse health effects of breast implants could have made women more aware of their symptoms and more observant, compared with women who did not have such symptoms. In spite of that, say the authors, " . . . there was no evidence of an association between breast implants in general, or silicone-gel-filled breast implants specifically, and any of the individual connective-tissue diseases, all definite connective tissue diseases combined, or other autoimmune or rheumatic conditions."
This does not mean that women with implants will not develop these conditions, but indicates that, as a group, they are no more likely to develop the various complaints than women without implants.
Key names: Esther C. Janowsky, Lawrence L. Kupper and Barbara S. Hulka.
#
"Comfrey and liver damage",1424,0,0,0
(Mar. '00)
The herb comfrey is popular as a laxative and anti-inflammatory medication, but it can cause severe liver damage and should be banned. That was the opinion of a scientist addressing the first international scientific conference on "The Efficacy and Safety of Medicinal Herbs", at the University of North Carolina - Chapel Hill, in early March. Pyrrolizidine alkaloids present in the plant have been linked to poisoning after people consumed them in tea and contaminated cereals.
Dr. Felix Stickel of the University of Erlangen and Salem Medical Center in Heidelberg, Germany, pointed out that comfrey is banned in Germany and Canada, but is still freely available in the United States and that, in some cases, the use of the herb can lead to patients needing a liver transplant.
The actual mechanisms by which damage happens are uncertain, but the main injury appears to be destruction of small veins, which leads to cirrhosis and eventually to liver failure, Stickel said. He was one of the presenters at the conference where speakers from Canada, England, Germany and the United States discussed eight of the world's most widely used herbs: garlic, ginseng, \IGinkgo biloba\i, comfrey, saw palmetto, feverfew, echinacea and St. John's wort.
According to Dr. Lenore Arab, professor of epidemiology and nutrition at UNC-CH, people are self-medicating without any guidance. They might be taking extremely high, possibly dangerous doses because they think more is better, or they may be getting so little that they are simply wasting their money. Health professionals lack knowledge about herbal remedies, but the public are not asking in any case. Arab and two colleagues confirmed that garlic, in both its raw and cooked forms, appears to protect somewhat against colorectal and stomach cancers. Supplements and over-cooked garlic may be much less effective. The herb also exhibits an ability to lower fats in the blood, reduce clotting, promote circulation, protect the liver and enhance the immune system, and Arab believes that there may be a case for eating a clove a day, since it has a measurable effect.
Other speakers confirmed the usefulness of a \IGinkgo biloba\i extract in treating dementia, as demonstrated in at least two large randomized clinical trials. Ginseng was given a sort of approval as a good antioxidant, but it was pointed out that we still do not know if ginseng's ability is due to a specific component or represents a synergistic action by a mixture of phytochemicals. In the same way, feverfew got limited approval as a method of preventing migraine headaches: most studies favored feverfew over inactive compounds and few side effects were found; however, the clinical effectiveness of the herb in preventing such headaches was not proven.
Saw palmetto seems to be an acceptable treatment for urologic symptoms among men with benignly enlarged prostate glands. Echinacea, the highest selling supplement in the USA, may be of some use in preventing colds or making them less severe, but it should not be taken for systemic illnesses or if allergy appears, according to another paper.
A St John's wort study revealed that nearly half of the patients studied had already tried it on their own, but often irregularly, or at incorrect doses, and the speaker recommended that anyone with persistent symptoms of low mood or loss of interest should see a health practitioner rather than self-medicate. So the verdict on herbs overall: most of them are useful, but the dosage needs to be better managed, and we need to know more about how the herbs operate, so we can manage that dosage better. Just lay off the comfrey. A book summarizing the presentations will be published later in 2000.
#
"The caterpillar and heart research",1425,0,0,0
(Mar. '00)
The cabbage looper caterpillar, generally regarded as a pest, has become an important tool for making a protein found in the human heart that is responsible for muscle relaxation. Discovering how to stop the protein from relaxing the heart could save thousands of lives. Calvin Hale, a scientist at the MU Dalton Cardiovascular Research Center, who has been working on this says in a recent issue of the \IJournal of Protein Expression and Purification\i: "Before, we had to grow the protein in Petri dishes and it was very time consuming. It would take quite a long time and even then, we would get a very small amount. Now, with just one caterpillar, we are able to get more protein than we could grow in ten Petri dishes in a much faster time. This allows us to study the protein at much greater levels, closer to what is found in the heart."
The heart muscle is forever contracting and expanding, and this process is controlled by calcium in the muscle. First, the cells take in calcium, and as the level of calcium rises, the heart begins to contract; then, for the heart to relax, calcium must be transferred out again. This is all managed by a special protein that acts as a sort of intelligent pump, hauling calcium ions in one direction, and sodium ions in the other.
If everything is normal, the contraction-relaxation process happens at regular intervals, but for patients suffering heart failure, the heart lacks tone and tends to relax much longer than it should. The solution, say researchers, will be to find out how this "exchanger" protein does its job so they can block it to keep the heart from relaxing as much. The problem has been that supplies of the protein have been hard to come by, with tiny yields from the Petri dishes.
Enter the caterpillars, and a genetically altered virus, a baculovirus, which is not dangerous to humans. This virus was modified so that it would spread throughout the caterpillar body and produce the exchanger protein. Then, the only difficulty was in harvesting the protein in useful amounts. Hale and his team got around this by developing a new method of purifying the heart membranes that actually carry the protein they are targeting, and they have now raised some 1500 of the caterpillars.
The next step, already under way, is to crystallize the protein to make it easier to determine the structure and the ultimate design of a drug. Scientists have already cloned the gene for the protein and mapped it, but they are still trying to discover the protein's structure. Without the caterpillar, none of this would be possible.
#
"A link between common virus and heart failure",1426,0,0,0
(Mar. '00)
A report in the April issue of \INature Medicine\i reveals that a gene in humans has been discovered which allows one of the most common and highly contagious viral infections to trigger deadly heart disease. The gene, called p56lck, allows a common coxsackievirus to attack the heart, causing heart failure and even death in some patients.
Coxsackieviruses are part of an extremely common family of viruses that live in the human digestive tract. They are highly contagious, and an estimated 70% of the population has been exposed to Coxsackievirus B, which spreads easily from person to person like the flu. There is no vaccine against coxsackievirus infections (in contrast to its famous relative, the polio virus) and no cure. Although the most common result of a coxsackievirus infection is a case of "the flu", the viruses can also cause pancreatitis leading to diabetes, arthritis, meningitis and myocarditis (an infection of the heart muscle), which can lead to heart failure. Children with coxsackieviral myocarditis can develop flu-like symptoms followed quickly by heart failure and death.
Coxsackievirus B can be detected in the hearts of between 30 and 50 percent of all adults with heart failure due to heart muscle weakness, a condition that frequently leads to the need for a heart transplant. The difference, it seems, is whether or not the victim carries the p56lck gene. In those people at risk, the p56lck gene helps the virus to trigger the immune system to turn against the heart muscle.
It seems that Coxsackievirus B viruses use T-cells to hitch a ride into the heart muscle, where the virus stimulates the immune system to attack the heart muscle tissue. Specially engineered mice that lack the p56lck gene were completely immune to heart disease, despite being exposed to large doses of the virus, while normal mice (with the p56lck gene) developed severe inflammation of the heart muscle and died from heart failure.
Heart disease is the number one killer in the Western world, and one in eight cases of heart failure may be blamed on coxsackievirus B, but now there is a potential way to identify the people at risk in advance. But perhaps more importantly, the discovery of a cellular factor that controls the body's reaction to a coxsackievirus B infection should lead to finding the genetic trigger, and eventual block, for the viruses that cause the common cold, diabetes and diarrheas as well.
Key names: Peter Liu, Josef Penninger.
#
"Modified rice could end food shortages",1427,0,0,0
(Mar. '00)
Just ahead of the announcement in early April that the whole rice genome has been worked out (see \JRice genome complete\j, next month), comes a story in \INew Scientist\i of a strain of genetically modified rice that boosts yields by a massive 35%. It was announced to almost no media attention at all, at an international conference on rice biotechnology in the Philippines during March. The GM rice, which has been tested in China, Korea and Chile, extracts as much as 30% more carbon dioxide from the atmosphere than controls, offering a way of curbing global warming, according to the story. While the logic of this latter claim is at best tenuous, the increased crop yield is a solid and remarkable result.
The Manila conference, hosted by the International Rice Research Institute (IRRI), heard Maurice Ku of Washington State University say that the bumper yields approach the targets the IRRI says will be needed to feed the world's population over the coming 20 years. Ku has yet to publish the work in a peer-reviewed journal, but presenting it at a conference like this is an acceptable way of revealing scientific developments.
\INew Scientist\i quotes grudging support from Kevan Bundell of the charity Christian Aid, who commented that "this rice may have potential provided it doesn't make poor farmers more reliant on expensive external inputs, such as herbicides". There is no evidence that the rice will make any such demands, but this form of grudging faint praise accompanied by a piece of misdirection such as this is becoming more common when lay people are asked to comment on GM issues that are beyond their comprehension.
(According to one prominent Australian geneticist, it would be " . . . as logical to comment that the rice may have potential so long as it does not sneak into houses at night and strangle babies in their sleep". She declined to make this comment on the record, suggesting that in today's climate, it would lead to her either being accused of baby-strangling, or of breeding organisms with this power. It must be a matter of concern that scientists feel the need to react in this way to lay critics of science.)
Ku inserted maize genes which boost photosynthesis, and the new genes enable the plant to absorb more CO\D2\d and also to stop oxygen from blocking sugar production. Ku and his colleagues used maize genes because maize is one of the more "advanced" plants that uses the C4 photosynthetic pathway, while the "less advanced" rice uses the C3 pathway.
In C4 plants, carbon dioxide is taken in and stored as four-carbon acids such as oxaloacetate, malate and aspartate, while the C3 plants, including most of our major food crops (like potatoes, rice, wheat and other cereals), lack the enzymes to perform this trick, and must create three-carbon compounds such as phosphoglycerate first. A side-effect of this is that the C3 plants later release much of the CO\D2\d that they originally absorbed.
Ku and his colleagues Mitsue Miyao and Makoto Matsuoka inserted a maize gene for phosphoenolpyruvate carboxylase, the enzyme that initiates photosynthesis in maize, and this produced a 12% increase in yield. But adding the gene that makes pyruvate orthophosphate dikinase, another enzyme vital for C4 plants, boosted the yields by 35%. They are now working on adding a third gene, which codes for nicotinamide adenine dinucleotide phosphate-dependent malic enzyme.
#
"Fruit fly genome is published",1428,0,0,0
(Mar. '00)
Over nine decades, the tiny fruit fly \IDrosophila melanogaster\i has yielded many of the most fundamental discoveries in genetics. The first inheritable mutations were seen there; the first genetic maps were made when the fly's chromosomes were mapped crudely at the time of World War I; and, by 1916, we knew that genes are located on chromosomes because there were four linked sets of genes, and four pairs of chromosomes in \ID. melanogaster\i.
But while we speak of four "pairs" of chromosomes, the fly actually has five distinct chromosomes: three paired autosomes, plus the X chromosome and the Y chromosome, which determine sex, just as they do in humans. Together, the chromosomes contain 13,601 genes in some 215 million base pairs, with 80% of the genes on the large chromosomes 2 and 3.
In late March, an almost complete genome for the fly (an estimated 97 to 98% of all of the genes) was published for the first time in \IScience\i. It appeared as a series of articles jointly authored by hundreds of scientists, technicians, and students from 20 public and private institutions in five countries.
The work was led by Gerald Rubin of the University of California at Berkeley and the Howard Hughes Medical Institute (HHMI), who heads the Berkeley Drosophila Genome Project (BDGP), and by J. Craig Venter of Celera Genomics in Rockville, Maryland. It brings \Ialmost\i to an end a project that has been going on since 1991.
When work began in earnest in 1998, there were already extensive but incomplete maps of the location of specific DNA sequences on the fly chromosomes, and about 20% of the fly genome had already been sequenced in detail. Most of this work had been done at BDGP, but the aim of the collaboration was to see if a method favored by Celera, called whole-genome shotgun sequencing, could be used on larger organisms having many thousands of genes encoded in millions of DNA base pairs.
Before that, the method had worked well on small bacterial genomes, but there was some doubt about whether the strategy would scale up to larger organisms. If it did, this would clear the way to use the method on the human and mouse genome projects, where it would be faster and more efficient than traditional methods. The short answer: the method has withstood the demanding challenge of solving the genome of a more complex organism.
In May 1998, the BDGP was one year into a three-year National Institutes of Health grant and had finished 20 percent of the sequencing when Rubin was approached by Venter with what Rubin calls "an offer that was too good to turn down." Venter proposed that his newly formed company, Celera, would sequence the \IDrosophila\i genome free-of-charge using their whole genome shotgunning technique.
The Celera technique requires cutting the \IDrosophila\i DNA into three million random clones with overlapping ends. These clones are then sequenced by automated DNA sequencing machines at Celera, where there are some 300 sequencers, each costing $300,000. After the sequencing machines have done their work, massive computing power is put to work to assemble the complete genome sequence in a process similar to reconstructing a jigsaw puzzle, by finding ways of linking the fragments by their overlaps.
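The "jigsaw" step can be illustrated with a toy version of overlap assembly. The sketch below (Python) greedily merges the pair of fragments with the longest suffix-prefix overlap until nothing overlaps any more; production assemblers are vastly more sophisticated, handling sequencing errors and millions of reads, but the core idea is the same:

def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(frags):
    """Repeatedly merge the pair of fragments with the best overlap."""
    frags = list(frags)
    while len(frags) > 1:
        best_n, best_i, best_j = 0, None, None
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    n = overlap(a, b)
                    if n > best_n:
                        best_n, best_i, best_j = n, i, j
        if best_n == 0:
            break                          # disjoint contigs remain
        merged = frags[best_i] + frags[best_j][best_n:]
        frags = [f for k, f in enumerate(frags) if k not in (best_i, best_j)]
        frags.append(merged)
    return frags

# Toy "reads" cut with overlapping ends from one source sequence.
reads = ["ATGGCGT", "GCGTACG", "ACGTTAG", "TTAGGCA"]
print(greedy_assemble(reads))   # ['ATGGCGTACGTTAGGCA']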
Venter formed Celera as a commercial venture to sequence the human genome by 2001, several years before the date projected for completion by the international Human Genome Project. While promising the data would be made available to researchers, Venter was also betting that Celera could make money by licensing early looks at the sequencing data to the pharmaceutical industry, but first the shotgun method needed to be validated.
The \IDrosophila\i genome was a 'proof-of-principle' for the whole genome shotgun strategy. It seemed like a good idea to do a medium-sized organism in which there was extensive scientific interest, and one about which a lot of good information was already available in terms of map and sequence data that could be used to validate the strategy. Celera started the sequencing last April and finished collecting the raw data in early September. Now it appears that all of the effort was indeed worthwhile.
Roger Hoskins and his team, at the BDGP physical mapping project, set out to produce a physical map of that part of chromosomes 2 and 3 that expresses genes - about 45 percent of the chromosomal material is highly condensed heterochromatin (see below), which does not encode genes. A physical map is not a complete sequence, which identifies every single base-pair along a given stretch of DNA, but a good map pins down the locations of unique short sequences that can be used to establish the correct long-range order of copies of longer DNA sequences, and thus of any genes they represent.
The method involved taking sections of \ID. melanogaster\i DNA that have been inserted into the bacterial workhorse of genetics, \IEscherichia coli\i. These segments are known as "bacterial artificial chromosomes", or BACs, and each BAC is an accurate copy of a small stretch of the genome. The map marks each BAC with at least one unique "sequence-tagged site" (STS), or better still, with two or more such sites. Then, when the sequence is complete, each of the 17,000 BACs can be assigned to its proper place along the map.
Of the 215 million bases or so in the genome of \ID. melanogaster\i, about 120 million bases are in the form of euchromatin. This is DNA that can unwind and open, the DNA that encodes genes. The rest is material known as heterochromatin, which forms the centers and ends of chromosomes and consists mostly of noncoding sequences. Much of the heterochromatin resists sequencing because it occurs in very short sequences of bases that are repeated many times over, in sets known as long tandem arrays. The target for the sequencers, then, is the euchromatin.
Researchers use probes tailored to each sequence-tagged site, and this lets them locate an STS wherever it occurs in a random collection of clones of euchromatin fragments. There are 1923 of these STS markers, which means they are spaced roughly every 50,000 bases in the euchromatin they were working on. When they match up these sites on overlapping clones, sets of clones of different lengths can be lined up with one another and linked together into longer sequences, yielding an STS content map.
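A minimal sketch of the STS-content idea, with hypothetical clone and marker names: clones that share a marker are transitively linked, and each connected group of clones is a contig. Ordering the markers along each contig, which the real project also had to do, is left out here.

# Sketch of STS-content mapping: clones that share a sequence-tagged
# site (STS) are linked into contigs. Hypothetical toy data; a real
# map must also order the markers along each contig.

from collections import defaultdict

clone_sts = {                      # clone -> STS markers it contains
    "BAC01": {"sts1", "sts2"},
    "BAC02": {"sts2", "sts3"},
    "BAC03": {"sts3", "sts4"},
    "BAC04": {"sts7", "sts8"},     # shares no marker: separate contig
}

def contigs(clone_sts):
    # Group clones whose STS sets overlap, directly or transitively,
    # using a simple union-find structure.
    by_sts = defaultdict(set)
    for clone, markers in clone_sts.items():
        for m in markers:
            by_sts[m].add(clone)
    parent = {c: c for c in clone_sts}
    def find(c):
        while parent[c] != c:
            parent[c] = parent[parent[c]]
            c = parent[c]
        return c
    for clones in by_sts.values():
        first = next(iter(clones))
        for other in clones:
            parent[find(other)] = find(first)
    groups = defaultdict(list)
    for c in clone_sts:
        groups[find(c)].append(c)
    return sorted(sorted(g) for g in groups.values())

print(contigs(clone_sts))
# [['BAC01', 'BAC02', 'BAC03'], ['BAC04']]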
The DNA is purified from whole flies that are frozen and ground up. The DNA is then cut into pieces using enzymes, and these are then inserted into various hosts that replicate numerous copies of them, the clones. Choosing a host is a matter of compromise because, while short viral clones reproduce sequences of bases very accurately, they often match more than one location in the genome. At the other extreme, YACs (yeast artificial chromosomes) are very long stretches of DNA, up to millions of bases. Using YACs may help reduce the number of steps needed to create a physical map, but YACs are unstable and may incorporate many sequence errors.
The BDGP researchers relied on BACs, DNA clones that are stable at lengths up to hundreds of thousands of bases. They were able to 'tile' many different clones, and typically had 13 different overlapping clones at each point along chromosomes 2 and 3. Once these are sequenced and cross-checked, this gives a highly reliable sequence for each of the genes on the chromosomes under attack.
With the much shorter chromosomes 4 and X already mapped, the BDGP researchers made a 'rough draft' sequence of the genome with shallow coverage (less than two clones deep), which served as a check against Celera's whole-genome shotgun sequence, and this is being used to close some of the 1600 gaps in the Celera sequence.
The BDGP method was successful mainly because it deliberately targeted BAC end-sequences, a special type of STS usually found within 500 base pairs of the end of a BAC. Over a third of the sequence-tagged sites in the \IDrosophila\i mapping project were end-sequence tags, and the BDGP team's primary recommendation for future genome-sequencing projects is that they concentrate on increasing the number of BAC end-sequences.
One important upshot of this work is that, of a set of 289 human genes implicated in diseases, 177 are closely similar to fruit fly genes, including genes that play roles in cancers; in kidney, blood, and neurological diseases; and in metabolic and immune-system disorders. "We can find human tumor-suppressing genes in flies easier than we can in the mouse," says BDGP's Susan Celniker, pointing out that experiments can be done using fly genes that would be impractical (or unthinkable) using human subjects.
Aside from the practical applications of the work, the almost-complete genome stands as both a milestone in the history of genetic research and a doorway to new methods of progress. But the gaps are interesting as well: the five short gaps in the euchromatin map that were not spanned by any clone were also present as gaps in the whole-genome shotgun sequence produced in collaboration with Celera Genomics.
Another interesting aspect of the gaps requires us first to think back to the classic days of genetics, when workers spent a great deal of time looking at the chromosomes in the larvae of \IDrosophila\i. To do this, you had to pull the head away from the body of a larva with very fine forceps, and this pulling action dragged the salivary glands along as well. These glands could then be separated and squashed under a cover slip, before being stained with acetic orcein, which made bands appear on the giant chromosomes found in the glands.
The salivary gland chromosomes are 'giant' because they are made of multiple, perfectly registered copies of the same DNA, repeated side-by-side. This helps the larva produce lots of a particular protein in a short time: to make lots of 'protein X', the larvae make lots of copies of all of the genes, including the protein X genes. These chromosomes are called polytene, and their banding patterns unambiguously identify regions of the chromosomes. For many years, classical geneticists got their training preparing such slides and drawing them under high-powered microscopes.
In the modern use of this effect, BDGP workers separated DNA segments by a method called gel electrophoresis to provide distinct visual identification of the many BAC clones. These clones were then stained a different colour and allowed to hybridise with the polytene chromosomes, at which point each clone would attach itself to the matching portion of the chromosome. When all of the clones were stained and added, it was possible to assess how much of the chromosomes had been covered: their estimate is 97.8% of the euchromatic portion of chromosomes 2 and 3.
Curiously, flies have only twice as many genes as yeast, so going from a simple, single-cell fungus to what one researcher described as an animal that can fly around without crashing into walls, and which has tissues, nerves, muscles, memories and other kinds of complicated behaviors like circadian rhythms, takes very little extra genetic information. This is a little bit like finding that you can not only make a supercomputer from the same components that are used to make a PC, but the parts you need are about what it would take to make two PCs.
Meanwhile, Celera's human sequencing work has already begun and should "start to look like a genome" toward the end of the year, according to a company spokesman. Celera is naturally optimistic, but Rubin has only reasonable hopes at this stage: "It worked better in \IDrosophila\i than most people expected it would. I think it will work for humans. But the problems are more complex for humans, so we'll have to wait and see."
#
"Five Little Pigs",1429,0,0,0
(Mar. '00)
The Edinburgh-based company PPL Therapeutics has announced the birth of five pig clones, produced by the same methods which gave us Dolly the sheep (see \JSheep cloning a success\j, February 1997, and \JFirst transgenic cloned sheep\j, December 1997, among other entries). Since Dolly, researchers have produced cloned sheep, cows, mice, goats and now pigs, and all of these have been produced by nuclear transfer from quiescent cells, so far the only successful cloning method.
The five piglets, which were born on 5 March, have been named Millie, Christa, Alexis, Carrel and Dotcom: Millie for the Millennium, Christa for Christiaan Barnard, Alexis and Carrel for Nobel prize-winner Alexis Carrel, and Dotcom for the sort of company which seems to be most valuable these days. Only four piglets were expected, based on ultrasound scans, and one of the names was obviously a last-minute addition (the answer to this riddle is below).
The piglets were created from adult cells using newly patented "nuclear transfer" technology similar to that which led to the birth of Dolly. The piglets are important because their cloning brings closer the time when we can make modified pigs whose organs and cells can be successfully transplanted into humans, the only near-term solution to the worldwide organ shortage crisis. PPL Therapeutics expects clinical trials into the use of animal organs for human transplant to start in about four years (see \JPiglets open the way to xenografting\j, this month). This is the first time cloned pigs have been successfully produced from adult cells.
With Dolly, the source of her nucleus, the empty egg cell and the surrogate mother were all from distinctive sheep breeds; with the piglets, the proof of cloning was more direct. DNA from blood samples taken from the piglets was shown in independent tests to be identical to DNA from the cells used to produce the piglets, but clearly different from DNA taken from the surrogate mother.
The UK patents GB 2318578 and GB 2331751 on the use of quiescent cells in nuclear transfer were granted quietly to the Roslin Institute (http://www.ri.bbsrc.ac.uk/), the Biotechnology and Biological Sciences Research Council (BBSRC) (http://www.bbsrc.ac.uk) and the Ministry of Agriculture, Fisheries and Food (MAFF) (http://www.maff.gov.uk) in January 2000. A Notice of Allowance was received at about the same time from the U.S. Patent Office for a patent application filed to protect this technology in the U.S. These patents originate from applications filed by Roslin Institute on 31st August, 1995. The techniques have been licensed to PPL Therapeutics (http://www.ppl-therapeutics.com/) and the Geron Corporation (http://www.geron.com/).
And the afterthought name for the last piglet? It was Dotcom, of course, though the researchers were happy to point out that any association between a company's name and "dotcom" seemed to be good for the share price.
#
"Piglets open the way to xenografting",1430,0,0,0
(Mar. '00)
The cloned piglets born during March (see \JFive Little Pigs\j) have brought us one step closer to the point where modified pig organs and cells can be successfully transplanted into humans in a process called xenotransplantation. Pigs are the preferred species for this on scientific and ethical grounds. This is big money territory, as the market for solid organs alone could be worth $6 billion annually, with as much again possible from such cellular therapies as transplantable cells to produce insulin for the treatment of diabetes.
The cloning of pigs by nuclear transfer has been something of a challenge in part because, in the language of medical science, pig reproductive biology is "inherently more intractable" - more difficult to manipulate - and in part because pigs need a minimum number of viable fetuses to maintain pregnancy. Sheep and cows, on the other hand, need only one fetus.
While there are similarities with the cloning method used on Dolly, researchers say that the work, carried out by PPL's US staff in Blacksburg, Virginia, also used a new technique, for which they are now seeking a patent. Needless to say, the details of the method are not available at this stage. But cloning the piglets is just the first stage in a project aimed at producing a "knock-out" pig, a pig which has a specific gene inactivated.
The gene targeted is \Balpha 1-3 gal transferase\b, which is responsible for adding a particular sugar group to the branched sugar chains that occur on all cell surfaces. Humans and primates differ from other animals in lacking this enzyme and, as a result, our immune systems recognise the sugar's presence on grafted pig organs as foreign, a signal to attack. The immune response then sets off an effect called 'hyperacute rejection' of the transplanted organ.
PPL's strategy is to inactivate or "knock out" the gene in pigs that makes the enzyme which attaches this sugar group. Without this signal, the xeno organ will not trigger the initial attack. PPL has already knocked out this gene in sheep cells, but it still has to do the same in pig cells and produce cloned pigs from them. After that, it will be necessary to introduce perhaps three other genes to control the two causes of delayed xenograft rejection.
PPL researchers expect to add a gene to the pigs to produce a naturally occurring protein to moderate the action of the immune system. This gene addition strategy, combined with immune suppression drugs, has already been used with some success by Imutran and other companies that have achieved survival of pigs' hearts in primates for several months. One advantage of this protein is that it reduces the need to use immunosuppressive drugs.
The transplanted xeno organ can still be rejected in the fairly short term by two other mechanisms, collectively known as delayed xenograft rejection (DXR). The first of these results from a loss of the anti-coagulation factors that normally exist on the surface of blood vessels to stop blood clotting and blocking those vessels; these protective factors are lost when a xeno organ is transplanted. PPL plans to overcome this problem by adding a second new gene to the pig that will replace the anti-coagulation factors when, and only when, they are needed after the organ has been transplanted, avoiding the sort of artificial hemophilia that could result if they were supplied too soon.
DXR is also caused by the appearance, again on the surface of blood vessels, of excessive amounts of a molecule called VCAM, which is normally present only in small quantities. VCAM assists in attracting white blood cells from the blood and infiltrating them into sites of infection and inflammation. It is naturally overproduced in response to infection or inflammation.
VCAM is also overproduced in a xenotransplant situation, with the result that white blood cells pour in to destroy the organ. To overcome this problem, PPL will add a third extra gene to the pig to produce a new protein inside the cell that traps VCAM and prevents it reaching the cell surface. As before, the production of this trapping protein is controlled so that it only comes into play when needed, after the organ has been transplanted.
But why bother? Why not use human organs instead? The sad reality is that there are not enough donors to go around and, even in countries like Austria, where an "opt-out" card scheme assumes organs can be used after death unless the person carries a card to say "no", there are not enough organs for the people needing them. In the US, for example, more than 62,000 patients are now waiting to receive donated hearts, lungs, livers, kidneys and pancreases. A new name is added to the list every 16 minutes, and every day 11 people die waiting.
The shortage arises mainly because most potential organ donors die of old age, cancer or heart disease, which makes their organs unusable. In the future, so-called "transgenic cloning" may be used to take the patient's own cells to form whole new organs, but right now, that is a long way off. There are patients now who cannot afford to wait, and xeno organs offer the prospect of a virtually unlimited supply. This can allow a transplantation early in the disease progression, before damage has been done to other parts of the body as the damaged organ loses its function.
The final cause of organ rejection is long-term rejection, which occurs in both allo (human to human) and xeno transplants. In allotransplantation, long-term rejection is controlled by continuous drug therapy, but the magnitude of the response to a xenograft is such that drug therapy alone is unlikely to be enough. Even if a xeno organ only lasted, say, five years, it would not be a problem because a new one could be supplied when required.
In practice, potential transplant patients will be given a transfusion containing modified cells taken from the carefully selected strain of pigs that will supply the organs. This transfusion will "tolerise" the patient and thereby reduce long-term rejection. This treatment would inactivate just those T cells of the immune system that would otherwise attack the pig organ rather than targeting all of the T cells and leaving the patient open to the risk of infection.
In essence, xenotransplantation is the transfer of organs from one species to another. The fundamental problem with transferring organs between species is rejection by the recipient's immune system, so these steps, taken together, should make the use of organs developed in pigs a reality, with clinical trials on humans expected to begin in about 2004.
The real risk of pig viruses known as PERVs (porcine endogenous retroviruses) attacking the patients is one that is being watched carefully and, during the next four years, a number of safety studies and safety systems will be put in place to minimise the possibility of introducing pig viruses into humans. There is already evidence, collected by Novartis from some 160 worldwide patients given porcine cells, that suggests viral transmission will not occur easily.
Researchers are also considering whether they can identify the small number of viral DNA sequences embedded in pig genes that are essential to the formation of active viruses. If these can be spotted, it may be possible to target these sequences and remove them from the cloned lines.
#
"Avoiding Bt resistance",1431,0,0,0
(Mar. '00)
The March issue of the journal \INature Biotechnology\i reveals that creating a refuge in a crop field reduces the chance of insects developing resistance to transgenic insecticidal plants. Previously, the notion was just a good theory, but this study shows that the whole concept of a refuge really works in a field situation. According to the lead author, Anthony Shelton, "This is all about managing resistance, and we found that, yes, it is important to have a refuge and to manage those insects within the refuge carefully."
The refuge is a section of plants within the field that have not been genetically engineered to contain the insecticide. It provides a place where insects with no immunity to the insecticide can survive. The insecticide is a protein derived from a bacterium, \IBacillus thuringiensis\i, and the bacterial protein, known as Bt, is regarded as the safest insecticide from an environmental standpoint.
Bt is not harmful to humans, so the only real worry about using it is that insects resistant to the Bt crops may be selected. Last year, farmers in the United States planted nearly 19 million acres of those transgenic Bt crops approved by the Environmental Protection Agency (EPA). Using Bt transgenic plants can greatly reduce the use of broader spectrum insecticides, but there is concern that this technology may be short-lived due to insect resistance, according to the article.
The study looked at the likelihood of the diamondback moth developing resistance to Bt broccoli plants. Theory said that a refuge was needed, and also predicted that the refuge should be kept separate from the main crop. In the study, the researchers compared a "20 percent mixed refuge," in which the Bt and non-Bt plants were mixed randomly, with a "20 percent separate refuge," in which a block of non-Bt plants was grown next to the Bt plants.
Then all they had to do was assess the insects' level of resistance over the course of the season. Their results backed up the theoretical models that indicated a separate refuge would be more effective in keeping the diamondback moth from becoming Bt-resistant.
The point of the refuge is to allow a large number of non-resistant insects to mate with the very few resistant insects that might survive and emerge from the Bt crops. Thus, the resistance genes would be diluted, avoiding the very real risk of getting an established population that is homozygous for Bt resistance, that is, carrying two copies of the resistance gene. As expected, the results demonstrated the importance of making sure sufficient insects are generated on the non-Bt plants to mate with any resistant insects that may have survived on the Bt plants.
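The study itself was experimental, but the dilution logic can be seen in a toy population-genetics model. The sketch below (all parameters invented, and resistance assumed to be a recessive, single-gene trait) tracks the resistance-allele frequency under random mating when a given fraction of the field is refuge; note that it captures only the effect of refuge size, not the mixed-versus-separate layout the study actually tested.

# Toy one-locus model of Bt resistance with a refuge. Resistance is
# assumed recessive: on Bt plants only rr insects survive, while in
# the refuge everyone survives. Random mating each generation.
# Parameters are illustrative, not taken from the Shelton study.

def next_gen_freq(q, refuge):
    """Resistance-allele frequency after one generation of selection."""
    p = 1 - q
    # Genotype frequencies before selection (Hardy-Weinberg).
    rr, rs, ss = q * q, 2 * p * q, p * p
    w_rr = 1.0                    # resistant homozygotes survive anywhere
    w_rs = w_ss = refuge          # susceptibles survive only in the refuge
    mean_w = rr * w_rr + rs * w_rs + ss * w_ss
    return (rr * w_rr + 0.5 * rs * w_rs) / mean_w

for refuge in (0.05, 0.20):
    q, gen = 0.05, 0              # resistance allele starts at 5 percent
    while q < 0.5 and gen < 100:
        q = next_gen_freq(q, refuge)
        gen += 1
    print(f"refuge {refuge:.0%}: resistance allele passes 50% in generation {gen}")

Run as written, the larger refuge delays the allele's rise by several generations, which is the dilution effect the refuge strategy banks on.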
Key names: Anthony M. Shelton, Elizabeth Earle, Juliet Tang, Richard Roush and Timothy D. Metz
#
"Marcus Wallenburg prize, 2000",1432,0,0,0
(Mar. '00)
In October this year, an Australian scientist will receive an award from the King of Sweden, HM King Carl XVI Gustaf, for his pioneering work in building performance and safety. Dr Bob Leicester is to be recognised for the way his research has advanced the fundamental understanding of safety, fire performance and durability of wood as a building material. Only one winner is selected worldwide each year.
Leicester's work has led to the development of codes and standards for wooden structures in many countries, and to the construction of wooden high-rise apartment blocks in areas where such buildings were previously disallowed because timber was considered too risky a material for high-rise and prestige buildings.
Dr Leicester came to CSIRO in 1966 after completing his PhD in Structural Engineering from the University of Illinois, Champaign-Urbana in the United States, and he is now Chief Scientist at the CSIRO's Sustainable Materials Engineering division.
The Marcus Wallenberg Prize was established in 1980 by the Swedish forest products company Stora Kopparbergs Bergslags Aktiebolag (STORA), now merged with the Finnish company Enso, to recognize, encourage and stimulate research and development of a pioneering nature that significantly increases knowledge and technological progress in areas affecting the forest products industry.
#
"New RS President",1433,0,0,0
(Mar. '00)
Sir Robert May is to be nominated as the next President of the Royal Society, to replace Sir Aaron Klug, whose five-year term ends on 30 November 2000. Sir Robert, for many years best known as an Australian physicist who turned his interests to the applications of chaos theory, is currently Chief Scientific Adviser to the UK Government.
The election will take place in April and the result will be announced on 13 July. Sir Robert will step down as Chief Scientific Adviser to the Government on 1 September and return to research at the University of Oxford. The Presidency of the Royal Society is an honorary, unpaid position. The President chairs the Council, playing a leading role in the policy direction of the Society.
#
"Mathematicians: Double Soap Bubble Had It Right",1434,0,0,0
(Mar. '00)
In science, even a simple-sounding claim can be hard to prove, so scientists are always happy when they can use mathematics to establish the truth of something. The only problem is that sometimes even the simplest of problems can prove very challenging, like the four-color map problem (see \JMöbius, August Ferdinand\j for more on this).
One apparently simple problem that has proved very difficult is the conjecture that the familiar double soap bubble is the optimal shape for enclosing and separating two chambers of air, giving the maximum volume of air for the minimum surface area. An engineer might argue that soap bubbles always act to minimise their surface area for a given volume, since this is when they are most stable, so the answer is obvious. Scientists and mathematicians have to allow for the possibility that there may be an even more stable formation that the double bubble simply cannot reach in a natural way. They have to allow for possible 'wild' solutions, and it is this which makes the proof a challenge.
In mid-March, four mathematicians announced a mathematical proof of the Double Bubble Conjecture in an address to the Undergraduate Mathematics Conference in Indiana. In summary, they explained that when two round soap bubbles come together, they form a double bubble and, unless the two bubbles are the same size, the surface between them bows a bit into the larger bubble. The separating surface meets each of the two bubbles at 120 degrees.
The mathematicians have now shown that this precise shape has less area than any other way to enclose and separate the same two volumes of air, even wild possibilities in which the second bubble wraps around the first, and a tiny separate part of the first wraps around the second. (They showed that the wild possibilities are unstable by a new argument that involves rotating different portions of the bubble around a carefully chosen axis at different rates.)
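The proof itself is nothing so simple, but the easier half of the claim, that the standard double bubble beats two separate round bubbles, can be checked numerically. The sketch below works the equal-volume case, where the separating surface is a flat disk and the centers of the two unit spheres sit one radius apart so that the films meet at 120 degrees.

# Numerical sanity check (not the proof): for two equal volumes, the
# standard double bubble has less total area than two separate round
# bubbles. Equal bubbles meeting at 120 degrees: unit spheres with
# centers one radius apart, separated by a flat disk.

from math import pi

R = 1.0                          # radius of each spherical cap
h = R / 2                        # height of the cap removed by the disk
# Volume of one bubble: sphere minus the cut-off spherical cap.
vol = (4/3) * pi * R**3 - pi * h**2 * (3*R - h) / 3        # = 9*pi/8
# Exposed spherical area of both bubbles plus the separating disk.
area_double = 2 * (4 * pi * R**2 - 2 * pi * R * h) + pi * (R**2 - h**2)

# Two separate spheres, each holding the same volume.
r = (vol / ((4/3) * pi)) ** (1/3)
area_separate = 2 * 4 * pi * r**2

print(f"double bubble : {area_double:.4f}")    # ~21.21 (= 27*pi/4)
print(f"two spheres   : {area_separate:.4f}")  # ~22.43

The double bubble comes out at 27π/4, about 21.2, against roughly 22.4 for separate spheres; the hard part of the proof is ruling out every wild alternative, which no amount of numerical checking can do.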
The breakthrough follows a 1995 proof of the special case where the bubbles are equal, and a group of undergraduates has now extended the theorem to 4-dimensional bubbles. Ben Reichardt, Yuan Lai, Cory Heilmann and Anita Spielman have found a way to extend the proof to 4-space and, in certain cases, 5-space and above.
A preprint of the whole mathematical paper is on the Web at http://www.ugr.es/~ritore/bubble/bubble.htm and the mathematical research announcement is also to be found on the Web at http://www.williams.edu/Mathematics/fmorgan/ann.html
Key names: Frank Morgan, Michael Hutchings, Manuel Ritoré and Antonio Ros.
#
"SKA: the next generation radio telescope",1435,0,0,0
(Mar. '00)
While it will not see 'first light' for some time yet, plans for SKA, the Square Kilometer Array radio telescope, were advanced somewhat at a conference of more than 60 radio astronomers held at the end of February at Arecibo Observatory. SKA will be a radio telescope composed of perhaps 1000 antennas spread out over more than 1000 km (600 miles) and costing more than US$600 million, making it probably the largest scientific instrument ever assembled.
The global astronomy community hopes that if funding and technical problems can be overcome, the massive instrument will be focusing on the distant universe within a decade or so. They expect it to make possible high-resolution probes of the outer edges of the universe, giving a window on the evolution of galaxies, the birth and death of stars and a detailed portrait of our own solar system.
Planning for SKA was begun in 1997 by institutes in six nations, and Holland, Canada and China have each produced a different telescope design. The instrument will be made up of a large number of small antennas, each acting as a separate radio telescope, and it opens up a whole series of new challenges: how to prevent radio interference from affecting such a hugely sensitive instrument, how to write the complex software needed to operate the telescope, where to site it, and how to pay for it. The general assumption is that a third of the money will come from the United States, but the rest is open to guesswork.
Using as many as a thousand antenna "stations" would deliver superb imaging fidelity, with more accurate calibration than is currently possible, and with each station capable of imaging objects in many directions simultaneously. The telescope would be an interferometer, which means the signals would need to be compared in real time, so the system would depend heavily on electronics. Estimates are that the array would need about 5000 km (3000 miles) of connecting fiber-optic cable. But while the telescope would extend over 1000 km, the actual collecting area of all the antennas together would be about a square kilometer, or roughly four-tenths of a square mile.
In spite of the gaps in the array, processing allows the final image to be built up by taking radio signals from distant objects in the universe that have been captured by separate antennas and brought together at a central processor. The resolution would then be equivalent to that of an image produced by a single antenna 1000 km in diameter, and the array would be 100 times more sensitive than today's most powerful radio telescopes, such as Arecibo and the Very Large Array (VLA), west of Socorro, New Mexico.
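The resolving power of such an interferometer follows the usual diffraction rule, θ ≈ λ/D, where D is the longest baseline. A quick sketch, assuming observations at the 21-centimeter hydrogen line (the article does not name an observing frequency):

# Angular resolution of an interferometer goes as wavelength over the
# longest baseline, theta ~ lambda / D. Assumed wavelength: the 21 cm
# hydrogen line (the article does not give an observing frequency).

RAD_TO_MAS = 180 / 3.141592653589793 * 3600 * 1000   # radians -> milliarcsec

wavelength = 0.21           # meters
for name, d in [("single 100 m dish", 100.0), ("SKA 1000 km baseline", 1_000_000.0)]:
    theta = wavelength / d                            # radians
    print(f"{name}: ~{theta * RAD_TO_MAS:.4g} milliarcseconds")

Sensitivity, by contrast, scales with total collecting area, which is why the design separates the two numbers: a 1000-km baseline for resolution, a square kilometer of antennas for sensitivity.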
The location is also a matter still to be decided, since it must be relatively free of radio interference and in an accessible region of a politically stable country with no weather extremes. At the top of the list right now is the Upper Gascoyne-Murchison region of Western Australia, followed by the Southwestern United States. There is no indication at this stage of who will make this decision and how.
The big problems will be getting reliable costing and then staying within the budget, as the major technical problems facing the planners are mainly cost-related. Astronomers will need to recall that the Superconducting Super Collider overran its budget and was canceled by the U.S. Congress. If the budget can be met and if radio interference problems can be solved, SKA should be able to probe "by many factors of 10" more of the universe than is now possible with the Arecibo radio telescope, according to one speaker, James Cordes. It would allow the detection of tens to hundreds more pulsars in other galaxies, and may even answer such questions as "What is the endgame for neutron stars?" and "What is the relation of neutron stars to supernovae?"
An array like this would also be important for space exploration. If SKA had been in place when the Mars Climate Orbiter spacecraft was lost as it went into Martian orbit, said Sandy Weinreb of NASA's Jet Propulsion Laboratory, it could well have ensured that the mission's off-course problems were detected and corrected.
\BRelated Web sites\b: the SKA U.S. Consortium is at http://www.usska.org and the Arecibo Observatory is located at http://naic.edu
#
"A new spin on a strange class of pulsar",1436,0,0,0
(Mar. '00)
An Anomalous X-ray Pulsar (AXP) is reported to have experienced a quake, a sudden, catastrophic shifting of the star's interior similar to the quakes seen in regular neutron stars. This provides strong confirmation that the AXP is indeed a neutron star, and that it has properties surprisingly similar to those of its "non-anomalous" cousins.
This was revealed by Victoria Kaspi at the Rossi 2000 meeting at NASA Goddard Space Flight Center in late March, when she outlined work done with undergraduate student Jessica Lackey and Dr. Deepto Chakrabarty. There is speculation now that Kaspi's finding may also support the magnetar hypothesis, which predicts the existence of neutron stars up to a thousand times more magnetic than ordinary, already strongly magnetic, neutron stars.
According to Kaspi, the problem has been the lack of the right instrument to check whether earthquake-like events might occur on these stars, since no X-ray astronomy satellite in the past had the ability to observe these objects as often and as regularly as was needed. She said in a NASA release that "Thanks to the Rossi X-ray Timing Explorer (RXTE) satellite, we now know for certain that glitches occur in AXPs, and we can study the interiors of these unusual objects using a form of 'seismology', rather like the way geologists study the interior of the Earth from earthquakes."
A neutron star is the fast-spinning remnant of a star once several times more massive than the sun that has exhausted its nuclear fuel and subsequently exploded its outer shell. The inner core, which is all that remains, is very massive, but it has a diameter of only about 15 km (around 10 miles). A pulsar is a type of neutron star that pulses, emitting radiation from its north and south poles, where it is channeled by the star's strong magnetic fields. If the polar region points at us only some of the time, we see a series of blips, much as we see flashes of light when a lighthouse beacon points at us.
Pulsars have been known since 1967 as radio-wave emitters, but once the first X-ray satellites were launched, astronomers found that many pulsars also gave out X-ray radiation, and then they found a handful of pulsars that emit their light exclusively as X-rays. The slow-spinning AXPs are among these X-ray-only pulsars, and the puzzle astronomers are trying to unravel is just how AXPs are related to the well-studied radio pulsars, if at all.
One key piece of evidence would be to show that AXPs have many of the same quirks as the well-studied radio pulsars, oddities like occasionally, without warning, spinning faster than normal. These sudden spin-up events are called "glitches." They are probably rather like earthquakes, attributed technically to the unpinning of vortex lines, and are thought to happen beneath the stellar crust.
Kaspi's group has been monitoring the spins of several of the five known AXPs in the hope of detecting a similar spin-up event. Finally, one of their targets, called 1RXS J1708-4009, suddenly started spinning faster in a glitch almost identical to those seen in the Vela radio pulsar, a well-studied neutron star left behind by a nearby supernova some twenty thousand years ago, providing clear evidence that this AXP is a neutron star with an internal structure just like the radio pulsars.
Finding a glitch in an AXP supports the magnetar hypothesis, but does not provide irrefutable proof, and more searching is going to be needed. There may be many more AXPs, but they are hard to identify because they are isolated and emit only X-ray radiation. This rules out the use of radio and optical telescopes to locate them, and RXTE, launched in December 1995 and still going strong, is one of only three or four satellites best optimized for AXP hunts. See also \JMore magnetar news\j, September 1998.
#
"NEAR team reports exciting first month",1437,0,0,0
(Mar. '00)
NASA's Near Earth Asteroid Rendezvous (NEAR) spacecraft is now officially renamed NEAR Shoemaker, in honour of Dr. Eugene M. Shoemaker, a legendary geologist who influenced decades of research on the role of asteroids and comets in shaping the planets (see \JObituary for July 97\j). And after barely a month in orbit around asteroid Eros, the craft is providing evidence of geologic phenomena that could have originated on a much larger parent body from which Eros was derived.
The high resolution images coming back are surprising scientists with the abundance of ridges, chains of craters, and boulders. NEAR Shoemaker's first x-ray detection of Eros has demonstrated the presence of magnesium, iron and silicon, and possibly of aluminium and calcium. On March 2, when NEAR was 212 kilometers (131 miles) from Eros, a brilliant solar flare caused elements on the asteroid to fluoresce and emit x-rays. The spacecraft's x-ray spectrometer was then able to view the asteroid and measure the x-ray burst from four times farther away than it was designed to do.
"It was only a 600-second window of opportunity but it is a huge bonus for the mission. This detection at the higher orbit gives us confidence in our ability to develop elemental maps when we're at our operational orbit of 50 kilometers," said Jacob I. Trombka of NASA's Goddard Space Flight Center. The spacecraft's laser rangefinder, operating 290 kilometers (180 miles) from Eros, measured topographic profiles of chains of pits or craters and, in the last two weeks, the NEAR multispectral imager has returned more than 2,400 images. The mission is due to end in February 2001. Between now and then, updates will be available on the Web at http://near.jhuapl.edu
#
"Pioneer 10 checks in",1438,0,0,0
(Mar. '00)
Pioneer 10 left Cape Kennedy (Cape Canaveral) aboard an Atlas Centaur rocket on March 2, 1972, headed out on a two-year mission to Jupiter and, 28 years later, the probe is still going. It is now about twice as far from the Sun as Pluto, moving at 13 km/s (28,000 miles per hour), headed more or less in the direction of the first magnitude star Aldebaran in the constellation of Taurus the Bull, about 65 light years away. In about another 300,000 years, it will reach the star Ross 248 in the constellation of Andromeda, some 10.3 light years away.
By that time, humans may well have got there first, or they may have sent probes there that travel much faster, and we will certainly have lost contact with Pioneer 10 by then. For now, though, we are still in contact, and during February, Pioneer 10 called home.
The craft is powered by electricity derived from the warmth of decaying plutonium-238 and, while this has a half-life of about 88 years, the thermocouples that convert heat energy to electricity are degrading faster. As a result, mission controllers think that there will not be enough electricity to power Pioneer's transmitter for much longer. It survived dangerous passages through the asteroid belt and Jupiter's magnetosphere, but it cannot withstand the slow and steady breakdown of its components.
The signal that leaves the spacecraft has a power of 8 watts, but the signal reaching NASA's Deep Space Network antennas has a strength of 0.3 billionths of a trillionth of a watt, and contains very little useful information, which raises the question: why bother? The answer is that by tracking the craft, scientists contribute to a study of chaos theory and learn more about conditions in the solar system beyond Pluto, at the edge of the heliosphere.
To improve reception, ground controllers sent commands to Pioneer instructing the craft to make a turn, but, because Pioneer is so low on power, its transmitter had to be switched off to allow it to execute the turn. It flew blind for 90 minutes, and then the transmitter was reactivated. Pioneer is now 11 billion km (6.8 billion miles) away, so radio signals take just over 10 hours to reach us, but after waiting the appropriate time, anxious ground controllers received a signal that the turn had been successfully completed.
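Both of the key numbers in this story are one-line calculations. A small sketch (using the 87.7-year half-life of plutonium-238, close to the figure quoted above; the RTG output also falls for other reasons, as the article notes):

# Back-of-envelope checks on the Pioneer 10 story: one-way light time
# at 11 billion km, and the decay of the plutonium-238 heat source.

C_KM_S = 299_792.458                   # speed of light, km/s

distance_km = 11e9
one_way_s = distance_km / C_KM_S
print(f"one-way light time: {one_way_s / 3600:.1f} hours")   # ~10.2

def rtg_heat_fraction(years, half_life=87.7):
    """Fraction of the original radioisotope heat output remaining."""
    return 0.5 ** (years / half_life)

print(f"heat remaining after 28 years: {rtg_heat_fraction(28):.0%}")  # ~80%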
Not everything has gone smoothly with Pioneer - on December 8, 1992, when Pioneer was 8.4 billion km (5.2 billion miles) away, the craft made an unexpected course change. The most likely explanation, astronomers say, is that the craft passed close to a Kuiper Belt Object, or KBO. KBOs are thought to be frozen asteroid-sized bodies, similar in composition to Pluto, circling the sun at vast distances beyond the outermost planets.
Yet even that was a victory for science, since this is just the second time in history that a solar system object has been discovered by its gravitational effect alone. The first, of course, was when Neptune was discovered in 1846, after its position was predicted by its gravitational tug on the planet Uranus. (The use of gravitational data is alive and well - see \JUranus' lost moons\j, this month, for a recent application.)
Pioneer 10 is not alone in moving out into space beyond the Solar System, but it is the only craft moving in the opposite direction to the Sun's motion through the galaxy. It is now probably close to the outer limits of the heliosphere, a bubble carved out of the gaseous interstellar medium by the solar wind, but these limits were once placed somewhere near Jupiter. Since then, the Pioneer and Voyager space probes have shown us that the heliosphere is much bigger, extending out to at least twice the orbit of Pluto. So by tracking Pioneer 10 for as long as possible, we may be able to set a limit to the bubble that holds us all.
#
"New small planets",1439,0,0,0
(Mar. '00)
Until now, the 30 known planets of stars other than the Sun have all been gigantic, around the size of Jupiter or larger. Saturn-sized planets can now be found orbiting distant stars, and the discovery of two such planets during March offers hope that many stars harbor smaller planets in addition to the Jupiter-sized ones. The discovery was made by the usual team: Geoff Marcy and Paul Butler of the Carnegie Institution of Washington, and Steve Vogt of the University of California, Santa Cruz, using the mighty Keck telescope on Mauna Kea, Hawaii.
Their finds include a planet at least 80 percent the mass of Saturn orbiting 6 million km (3.8 million miles) out from the star HD46375, 109 light-years away in the constellation Monoceros, and a planet 70 percent the mass of Saturn orbiting 52 million km (32.5 million miles) away from the star 79 Ceti (also known as HD16141), located 117 light-years away in the constellation Cetus.
It is significant that these planets are very close to their stars and so have short orbits. They whirl around their parent stars with periods of 3.02 days and 75 days, respectively, which makes them easy to spot quickly. But even if these are oddities, finding them reinforces the theory that planets form by a snowball effect of growth, from small ones to large, in a star-encircling dust disk. This theory, which has been around for about 20 years, predicts there should be more smaller planets than large planets. But even if the theory does not hold up, we are reassured in another direction: there was always the nagging possibility that some of the extrasolar planets might really be stillborn stars, called brown dwarfs, which would form like stars through the collapse of a gas cloud. Now we are reassured that the "Jupiters" are accompanied by many more planets that are the mass of Saturn or smaller.
Marcy describes the process as being like looking at a beach from a distance: "Previously we only saw the large boulders, which were Jupiter-sized planets or larger. Now we are seeing the 'rocks,' Saturn-sized planets or smaller. We still don't have the capability of detecting Earth-like planets, which would be equivalent to seeing pebbles on the beach."
The detection feat is quite amazing. The change in the stars' velocities as they wobble under the attraction of the small planets is just 11 meters per second (36 feet per second), slightly faster than a sprinter in the 100 meter dash, but seen from rather a long way away. But while these planets are smaller, they would not be good for life: the planet orbiting 79 Ceti has an average temperature of 830°C (1530°F), while the planet around HD46375 has an average temperature of 1130°C (2070°F). They are presumed to be gas giants, made mostly of primordial hydrogen and helium, rather than the rocky material Earth is made of.
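That 11 m/s figure can be checked against the standard radial-velocity amplitude for a circular orbit, K = (2πG/P)^(1/3) · m sin i / (M* + m)^(2/3). The sketch below assumes an edge-on orbit (sin i = 1) and a roughly solar-mass star, neither of which is given in the article:

# Check the quoted stellar wobble with the standard radial-velocity
# amplitude for a circular orbit:
#   K = (2*pi*G / P)**(1/3) * m_planet / (M_star + m_planet)**(2/3)
# Assumes sin(i) = 1 and a roughly solar-mass star.

from math import pi

G     = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30             # kg
M_SAT = 5.683e26             # kg
DAY   = 86_400.0             # s

def rv_amplitude(m_planet, period_s, m_star=M_SUN):
    return (2 * pi * G / period_s) ** (1/3) * m_planet / (m_star + m_planet) ** (2/3)

# 79 Ceti: roughly 0.7 Saturn masses in a 75-day orbit.
K = rv_amplitude(0.7 * M_SAT, 75 * DAY)
print(f"predicted wobble: {K:.1f} m/s")     # ~10 m/s

For the 79 Ceti planet this lands within about ten percent of the quoted wobble; the 3.02-day planet, being much closer in, tugs its star harder still.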
Perhaps more significantly, the planets would have disrupted the orbits of any smaller terrestrial planets like Earth. These "marauding" gas giants seem more the rule than the exception among the planets surveyed so far because Marcy and Butler's detection technique favors finding massive planets in short-period orbits, and thus may be giving us an entirely biased picture of other solar systems.
#
"Uranus' lost moons",1440,0,0,0
(Mar. '00)
One day before the space shuttle Challenger tragedy, Voyager II scientists proudly announced the discovery of two small moons around the planet Uranus. Initially dubbed 1986 U7 and 1986 U8, the moons later officially received the Shakespearean names Cordelia and Ophelia; however, in the 14 years since their detection, astronomers have been unable to find them.
Now they have been found again, using physical theory and images from the Hubble Space Telescope. Astronomers believe these two moons keep Uranus' thin epsilon ring from gradual radial spreading and eventual dissolution, and this is why they are referred to as "shepherd moons." They are small, maybe only 30 to 45 km (20 - 30 miles) across, and so far away that even the powerful Hubble Space Telescope has had trouble seeing them.
While the orbits were calculated in 1986, Voyager II only provided data over two weeks before it swept on out of range, making it impossible to predict with useful accuracy where Ophelia and Cordelia would be in their orbits a decade later. Erich Karkoschka, a researcher with the University of Arizona's Lunar and Planetary Lab, found a way around this using Hubble Space Telescope images taken in 1997. He electronically stacked dozens of Hubble images on top of each other, matching them pixel for pixel and allowing for the orbital motions of the moons.
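The stacking step is described only in outline, but the principle, often called shift-and-stack, is easy to demonstrate: shift each frame to undo the moon's predicted motion, then average, so the moon reinforces while the noise averages away. A toy NumPy version with synthetic frames (every number in it is invented):

# Minimal shift-and-stack, the principle behind recovering a faint
# moving moon: undo the predicted motion frame by frame, then average.
# Synthetic data; the real work used dozens of Hubble frames.

import numpy as np

rng = np.random.default_rng(0)
n_frames, size = 32, 64
moon_row, moon_col0, col_step = 32, 10, 1      # moon drifts 1 px/frame

frames = []
for k in range(n_frames):
    frame = rng.normal(0.0, 1.0, (size, size))         # background noise
    frame[moon_row, moon_col0 + k * col_step] += 1.5   # invisible in one frame
    frames.append(frame)

# Shift each frame back by the moon's predicted drift, then average:
# the moon adds coherently while the noise shrinks as 1/sqrt(N).
stack = np.mean(
    [np.roll(f, -k * col_step, axis=1) for k, f in enumerate(frames)],
    axis=0,
)
peak = np.unravel_index(np.argmax(stack), stack.shape)
print("brightest pixel after stacking:", peak)   # (32, 10): the stacked moon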
When the images were compared in this manner, Ophelia popped clearly into view, but Cordelia was still missing. Richard French and Philip Nicholson had already analyzed precise measurements of the radii of both ring edges obtained from stellar occultation data going back to 1977. They had been seeking wavelike distortions that might provide direct evidence of gravitational interactions between the shepherd moons and the ring. The two astronomers found a telltale pattern of ripples at the ring's edge, with amplitudes and wavelengths that matched the predictions of the shepherding theory. These ripple patterns revolve around the ring at rates that match the orbital motions of Cordelia and Ophelia.
When they examined the data, Karkoschka's measurement of Ophelia's position matched their wave-derived prediction of where one of the moons should have been and, looking further into their data, they could see where Cordelia must be.
The ball then passed back to Karkoschka, who returned to the Hubble Space Telescope images and, sure enough, found Cordelia exactly at the expected position. The two once-lost moons are now permanently plotted and located.
#
"Lunar cratering and the Cambrian explosion",1441,0,0,0
(Mar. '00)
What triggered the "explosion of life" in the Cambrian era? So far, the jury is out, but a mid-March report in the journal \IScience\i notes that impact cratering on the moon (and, by inference, on the earth), after dropping away around 3500 million years ago, suddenly increased about 400 million years ago, just when life on Earth took off with a dramatic burst in the number and diversity of species.
While we assume that impacts are necessarily bad (see \JAsteroid devastation\j, this month), impacts at a lower level may act like the pruning of a tree: rather than stimulating growth and the production of more fruit, they could have had the effect of opening up new niches for unusual life forms to move into and flourish.
The earliest records of life on earth date from the period approximately 3.5 billion years ago, when the report shows the intensity of impacts was decreasing. This led one of the authors to speculate that life began on Earth many times, but the comets only stopped wiping it out about three or four billion years ago.
Dating the chronology of impact craters on earth is difficult because of erosion, sedimentation, and plate tectonics, but how do you form a picture of overall impact patterns on the Moon without visiting each crater? The answer is simple: you collect a gram of lunar soil and look for the impact record preserved in spherules, microscopic glass beads formed when droplets of molten basalt, splashed out of a crater by the heat and force of an impact, subsequently cooled and hardened.
The researchers obtained from NASA a gram of lunar soil, in which they found 155 spherules ranging in size from less than 100 microns to more than 250 microns. The age of each of these tiny spherules was determined at the Berkeley Geochronology Center with an ultrasensitive technique based on the ratio between two argon isotopes. The ratio of argon-40 to argon-39 was measured by neutron irradiation followed by laser-driven mass spectroscopy, and then it was just a matter of looking up a standard table to determine the age of each spherule.
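The "standard table" encodes the usual argon-argon age equation, t = (1/λ) ln(1 + J·R), where R is the measured argon-40 to argon-39 ratio and J is a calibration factor fixed by irradiating a standard of known age alongside the sample. A sketch with a purely hypothetical J value:

# The standard argon-argon age equation: t = (1/lambda) * ln(1 + J*R),
# where R is the measured 40Ar/39Ar ratio and J is an irradiation
# calibration factor from a standard of known age. Numbers below are
# hypothetical, for illustration only.

from math import log, expm1

LAMBDA_K40 = 5.543e-10        # total decay constant of 40K, per year

def ar_ar_age(ratio, j_factor):
    """Age in years from the 40Ar/39Ar ratio."""
    return log(1 + j_factor * ratio) / LAMBDA_K40

def ratio_for_age(age_years, j_factor):
    """Inverse: the ratio a sample of a given age should show."""
    return expm1(LAMBDA_K40 * age_years) / j_factor

J = 0.01                      # hypothetical irradiation parameter
r = ratio_for_age(400e6, J)   # a 400-million-year-old spherule
print(f"expected ratio: {r:.1f}")
print(f"recovered age : {ar_ar_age(r, J) / 1e6:.0f} million years")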
The sample came from lunar soil picked up in 1971 by the Apollo 14 mission crew near Mare Imbrium (Sea of Rains), the dark lava plain that dominates the Moon's face. According to statistical analysis, the spherules came from 146 different craters. So even though the crater that any particular spherule came from will always be unknown, the researchers had a picture of the age distribution of craters on the Moon.
The slow decrease in impacts is easy enough to explain: it comes from the action of Jupiter in slowly filtering the solar system, dragging the dangerous flying bits down to its surface. But how do we account for the sudden increase 400 million years ago? Perhaps, just perhaps, this is the evidence we have been seeking for a companion star to the sun. When the existence of such a star was postulated in 1984, Richard Muller, one of the authors of this report, suggested that it be called Nemesis, after the Greek goddess of retribution. Perhaps, says Muller, some other passing star perturbed Nemesis into a more eccentric orbit that would account for the increase in impacts.
Perhaps. Most scientists will probably want a bit more evidence than that before they take the Nemesis theory on board, but it is certainly a \Ipossible\i explanation.
Key names: Richard Muller, Timothy Becker, Paul Renne and Timothy Culler.
#
"Asteroid devastation",1442,0,0,0
(Mar. '00)
If or when a huge asteroid hits the Earth, the catastrophic destruction it causes and even the "impact winter" that follows might be only a prelude to a different, but very deadly, phase that starts later on, what two scientists call "ultraviolet spring." The analysis, published in \IEcology Letters\i, makes frightening reading.
First would come the enormous devastation, huge tidal waves, and a global dust cloud that would block the sun and choke the planet in icy, winter-like conditions for months, as spelt out in certain recent popular movies. But Andrew Blaustein and Charles Cockell examined an asteroid impact of a magnitude similar to the one that occurred at the Cretaceous-Tertiary (K-T) boundary, when there was a massive extinction of many animals, including the dinosaurs. That one is believed to have hit off the Yucatan Peninsula with a force of almost one trillion megatons.
After the immediate effects already listed, the atmosphere would become loaded with nitric oxide, causing massive amounts of acid rain. As a result of the acid, lakes and rivers would have reduced amounts of dissolved organic carbon, which would allow much greater penetration of ultraviolet light. At first, dust clouds would keep the ultraviolet out, but this just sets the stage for a greater disaster later on, since many animals depend on some exposure to ultraviolet light to keep their biological protective mechanisms against it in working order. In the absence of any UV, these protective systems would cease to operate, leaving the survivors unprotected when the ultraviolet spring came.
The authors estimate that the dust cloud would shield the Earth from ultraviolet light for just over a year; as the dust settled, UV levels would climb back to today's values and then keep rising, reaching at least double current levels about 600 days after impact. They estimate the levels of ultraviolet-related DNA damage would be about 1,000 times higher than normal, while general ultraviolet damage to plants would be about 500 times higher than normal.
So why did the K-T event not have this sort of effect? Luckily, that impact hit a portion of the Earth's crust that was rich in anhydrite rocks. This produced a 12-year sulfate haze that blocked much of the ultraviolet radiation. It was a close call, though, as such rocks cover less than 1% of the planet's surface. We may not be so lucky next time, and the key point is to note that there will be many synergistic effects on the environment that go far beyond the initial impact. The collision will be devastating, and the "impact winter" deadly, but it will be the ultraviolet spring that will help finish off the survivors.
#
"Noisy snow",1443,0,0,0
(Mar. '00)
For anybody who has been out in falling snow, there is a sense of silence as the flakes drift slowly down, covering the ground in a soft, sound-absorbing layer, but water animals may have a slightly different view of snow. To those water animals with a keen sense of hearing, these falling flakes can create an enormous racket just below the surface of the water, according to a report in a recent issue of the \IJournal of the Acoustical Society of America\i. The flakes can also pose problems for electronic "ears" by blurring sensitive sonar readings.
The problem was uncovered by analyzing recordings made underwater during winter storms, and the noise is being blamed on oscillating bubbles that are too small and short-lived to be seen by the naked eye. As a snowflake falls onto the water surface, it deposits a tiny amount of air just beneath the surface. Then, before the bubble reaches the surface and pops, it sends out a piercing sound, ranging from 50 to 200 kilohertz, too high-pitched to be heard by human ears, which generally pick up nothing higher than 20 kHz, but well within the range used and heard by porpoises. When the pocket of air is submerged, it has to adjust its volume, and it will do so by oscillating. And when it oscillates, it emits noise, up to 30 decibels above the ambient level.
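The article does not give the physics, but the pitch of such a bubble is commonly estimated from the Minnaert resonance, f = (1/2πa)√(3γP/ρ), which ties frequency to bubble radius a. A sketch with radii chosen to bracket the reported band:

# Minnaert resonance of an air bubble in water (a standard result,
# not quoted in the article): f = (1/(2*pi*a)) * sqrt(3*gamma*P/rho).
# Bubble radii chosen to bracket the 50-200 kHz band reported.

from math import pi, sqrt

GAMMA = 1.4          # ratio of specific heats for air
P     = 101_325.0    # ambient pressure, Pa
RHO   = 1000.0       # density of water, kg/m^3

def minnaert_freq(radius_m):
    return sqrt(3 * GAMMA * P / RHO) / (2 * pi * radius_m)

for radius_um in (16, 33, 65):
    f = minnaert_freq(radius_um * 1e-6)
    print(f"bubble radius {radius_um} um -> ~{f/1000:.0f} kHz")

On these assumptions, the 50 to 200 kilohertz band corresponds to bubbles only a few hundredths of a millimeter across, consistent with the report's invisible, short-lived bubbles.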
The proof came first from a motel swimming pool in Roanoke, Virginia, where sensitive hydrophones (underwater microphones) were used to capture the sound of snowflakes falling on the pool surface. The acoustical "fingerprints" of the snowflake sounds turned out to be identical to those of bubbles. At this point, Andrea Prosperetti, an internationally respected expert in bubble physics who had earlier conducted studies of underwater noise generated by raindrops, was drawn into the study.
His first step was to brush up on his snowflake science. "Snow is incredibly complex," Prosperetti explains in a news story posted on the Internet. "The shape and size of a snowflake depends very critically upon the temperatures to which it has been exposed, not just the temperature on the surface of the Earth, but also in the cloud where it formed."
Prosperetti says that snowflakes are made of many ice crystals arranged together in such a way as to leave a lot of space for air, making them about one-tenth the density of water. So when this bundle of 90% air and 10% ice hits the water, the ice part melts, and the air inside is released as a bubble.
Aside from the obvious defence aspects (the study was funded by the US Office of Naval Research), this snowflake noise can create electronic "clutter" for people who use sonar devices to track migrating fish. The researchers believe their work could help engineers develop equipment that can filter out sounds made by snow falling on water.
Key names: Lawrence A. Crum, Andrea Prosperetti, Hugh C. Pumphrey and Ronald A. Roy.
#
"Satellite images of massive Antarctic iceberg",1444,0,0,0
(Mar. '00)
A giant iceberg, close to record size, has just peeled off Antarctica's Ross Ice Shelf and is now adrift in the Ross Sea. The giant ice block is about 295 kilometers (183 miles) in length and 37 kilometers (23 miles) wide, with an area of some 11,000 square kilometers (4300 square miles). The US National Ice Center numbers icebergs as they are discovered, so the new iceberg has now been named B-15. (The rules used to name icebergs will be found in \JGiant iceberg spotted\j, October 1998.)
Images of the iceberg from two polar orbiting satellites are available at the following URL: http://www.news.wisc.edu/newsphotos/iceberg.html, while real-time satellite imagery can be obtained from the Antarctic Meteorological Research Center at http://uwamrc.ssec.wisc.edu/amrc/iceberg.html
Another iceberg, about 130 by 20 kilometers (80 by 12 miles), with an area of 2480 square kilometers (960 square miles), has since broken off, and predictions are that B-15 will collide repeatedly with the Ross Ice Shelf and cause other large bergs to split off. A small piece, now called B-16, has also broken off B-15, and this newest berg already appears to have broken into as many as four smaller pieces.
#
"Better charcoal, better metals",1445,0,0,0
(Mar. '00)
Europe was changed in the Middle Ages as charcoal burners cut down forests to make charcoal for use as fuel, and also for use in winning metals from their ores. Later, coal became the fuel of choice, called sea-coal in Britain, since it was usually transported by ship, and coke made from coal was used to reduce metal ores to their metals.
In this way, the nations of Europe avoided wasting their forests. But now, in the name of avoiding forest wastage, Australia's \JCommonwealth Scientific and Industrial Research Organization (CSIRO)\j is looking at returning to charcoal as the reductant. The raw material for the charcoal in this plan is to be waste timber from forestry.
High-purity silicon demands the use of high-purity carbon, which means it really needs the cleaner and purer wood-based carbon to remove oxygen efficiently from silica ore, leaving a pure metal that can sell for around $2000 a ton. Australia's small but burgeoning silicon industry is worth $20 million a year. High-purity silicon is used in products like chemical silicones and also further refined for integrated circuits and solar cells.
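The chemistry behind "removing oxygen from silica ore" is, in simplified overall form, SiO2 + 2C → Si + 2CO; the real furnace chemistry runs through intermediates the article does not go into. A quick mass balance on that simplified equation:

# Simplified overall carbothermic reduction: SiO2 + 2 C -> Si + 2 CO.
# A rough mass balance for the carbon a smelter needs per tonne of
# silicon, ignoring losses and intermediate reactions.

M_SI, M_C = 28.09, 12.01      # molar masses, g/mol

carbon_per_si = 2 * M_C / M_SI
print(f"~{carbon_per_si:.2f} tonnes of carbon per tonne of silicon")  # ~0.86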
The charcoal is made by combining a process known as Fluidised Bed Carbonisation with one similar to briquetting. The small particles of carbon made from sawdust are useless as reducing agents in silicon production, which is where the briquetting part comes in. Step one is the CSIRO's unique Fluidised Bed Carbonisation Process, now licensed to Enecon Pty Ltd. This converts all forms of wood residue - bark, sawdust and chipped solid offcuts - to charcoal. There is no shortage of raw material, as 40 to 65% of a sawlog ends up as waste, requiring landfill space or incineration.
A major plus is that the Fluidised Bed Carbonisation Process also burns the smoke, which contains the volatile chemicals produced during carbonisation. Aside from reducing pollution, this releases the energy stored in the volatiles. The energy is recovered and used to generate electricity that can then run drying kilns at the timber mills or reduce non-renewable electricity consumption at power-hungry smelters.
Step two is to form the charcoal into lumps the right size for industry using the briquetting process. Briquettes for the metallurgical industry need to have low contamination of critical elements (ash content and other inorganic impurities), be impact resistant and have enough strength at high temperatures to support the weight of the charge on top. The CSIRO has now achieved this.
#
"The coal-powered supersonic transport",1446,0,0,0
(Mar. '00)
The 219th national meeting of the American Chemical Society was held in San Francisco in March, and one of the more implausible-sounding papers discussed how coal could be used to power jet aircraft. The attraction is that coal-derived fuel is less likely to form engine-clogging coke deposits in jet engines at higher speeds and temperatures, according to John Andrésen, of the Energy Institute at Pennsylvania State University.
The petroleum-based fuel now used in commercial jet aircraft is typically exposed to operating temperatures below 300C (about 572F). As aircraft speeds increase, temperatures are expected to reach 482C (900F) and higher, said Andrésen. Coal-derived liquid jet fuels can withstand temperatures up to about 500C for several minutes, because the coal-derived fuel is rich in cycloalkanes, which are more thermally stable than the straight-chain alkanes found in petroleum.
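For readers converting between the scales, the standard formula is F = (9 x C/5) + 32, so 482C works out to (9 x 482/5) + 32 = about 900F, while the 500C limit of the coal-derived fuels is around 930F.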
The cycloalkanes are ring compounds, in which a number of carbon atoms form a closed loop. They do not share the weakness of the linear alkanes, which are highly susceptible to thermal degradation at temperatures above 400C, resulting in the formation of solid particles, or coke. Using coal-derived fuels would allow jets to operate at high speeds for much longer, and would have the additional advantage that, in the US, coal is much more plentiful than oil, which could reduce reliance on imports.
The Penn State team are working on bituminous coal, the most abundant type of coal in the United States. Other coal-rich countries like China and Australia could also benefit from this sort of work, but they are richer in brown coal, which is of a lower grade, and may not get the same advantages from the new technology. However, while more processing is needed to produce jet fuel from coal than from petroleum, Andrésen believes it is still economically viable and competitive.
#
"Terabit transmission record",1447,0,0,0
(Mar. '00)
Scientists at Bell Labs, the research and development arm of Lucent Technologies, reported in March at the Optical Fiber Communications (OFC) Conference that they had achieved the world's first long-distance triple-terabit data transmission. In fact, they went further, and sent a record 3.28 terabits (a terabit is a trillion bits) of information per second over 300 kilometers of an experimental Lucent TrueWave(r) optical fiber.
They did this using three 100-kilometer fiber spans, transmitting 40 gigabits per second (Gb/s, billions of bits per second) over each of 40 wavelengths, or colors, of light in the conventional C-band frequency range, and 40 Gb/s over each of 42 channels in the long-wavelength L-band range. They used both dense wavelength division multiplexing (DWDM), a technology that combines multiple wavelengths onto a single fiber, and distributed Raman amplification, a technique that allows optical fiber to amplify the signals traveling through it.
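The headline figure is simple arithmetic on those channel counts: (40 x 40 Gb/s) + (42 x 40 Gb/s) = 1600 + 1680 = 3280 Gb/s, or 3.28 terabits per second across the two bands combined.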
#
"Fossil snake with legs discovered",1448,0,0,0
(Mar. '00)
In March, a report in the journal \IScience\i described the discovery of a new species of fossil snake some 95 million years old, found in a limestone quarry north of Jerusalem. The remains were actually dug out of the quarry more than 20 years ago, but have only now been recognised for what they are. The snake is named \IHaasiophis terrasanctus\i after the Hebrew University professor George Haas, who obtained it from quarry workers at Ein Yabrud quarry and placed it in storage.
The well-preserved fossil sat largely unstudied on a shelf until Mike Polcyn, an American graduate student in paleontology, photographed it and other undescribed specimens during a 1996 trip to Israel and carried the pictures back to the United States. Once the significance of \IHaasiophis\i, which is a little over a meter long and has hind legs about 2 cm long, was recognised, a group of scientists went back to mount a more detailed search of the quarry.
The original specimen has both the head and lower jaws intact, which is unusual, as snake heads usually get spread and scattered, but it is not the first snake with legs to come from the quarry: Haas described the first limbed snake, \IPachyrhachis problematicus\i, in 1979, though at the time he placed it not as a snake but as a marine lizard known as a dolichosaur. It has long been widely regarded as the most primitive snake, and as evidence of a link between mosasaurs, giant swimming lizards of the Cretaceous Period, and true snakes.
Now the analysis of the head and jaws of \IHaasiophis\i presented in the \IScience\i article gives us a different insight. It shows the two fossil snakes as closer to more advanced terrestrial snakes such as boas and pythons: their head structures are well along the way to those of snakes as we know them, so they lose their status as potential snake ancestors. On this reading of the facts, neither \IPachyrhachis\i nor \IHaasiophis\i has anything to do with snake origins, and it now seems less likely that snakes evolved from mosasaurs.
Snakes like boas and pythons have a distinctively mobile skull structure. They can almost unhinge their jaws, and 'walk' their skulls over their prey, dining on meals larger than the diameter of their own heads. Both the Ein Yabrud fossils now appear to have skull architecture similar to these modern serpents, while earlier interpretations of \IPachyrhachis\i had concluded that this animal was incapable of such feats.
From this, the earlier workers concluded that \IPachyrhachis\i had a modified gape similar to that of the mosasaurs. On that interpretation, the \IPachyrhachis\i skull was an intermediate step between the rigid skulls of lizards and the mobile skulls of higher snakes.
Now, with the similar and almost-perfect \IHaasiophis\i to study, this interpretation has been ruled out, and the mosasaurs now seem less likely as potential ancestors of the snakes. The origin of snakes still lies somewhere among the lizards: during the evolutionary process, their head-bones changed, and they developed long bodies with specialized vertebrae and muscles. They also lost their legs, but we now believe, from what we can see in \IHaasiophis\i, that the unhinging jaw of the boas and pythons, the long body form and the snake locomotor pattern all evolved before the hind legs disappeared.
Of course, it is always possible that snakes lost their legs more than once or that they re-evolved them. The only question now is what the tiny legs were used for, as they were far too short to be used in locomotion. One possibility is that they were somehow used in reproduction and stimulation, much as the spurs on anacondas are used today. These are mere stubs of cartilage, tipped in bone, but they are clearly derived from legs.
The researchers favour the idea that these snakes regained their legs after ancestors had 'lost' them, but this is not quite the same as suggesting that the legs were evolved a second time 'from scratch'. The point to keep in mind is that the 'leg genes', the genes required to form legs, are probably still there in today's snakes, but they are switched off, or repressed.
If so, under the right circumstances, the leg genes could be activated again. So we cannot entirely rule out the 're-evolution' theory simply because it seems unlikely. In fact, the researchers argue that losing legs is such a common process in snake and lizard lineages that it must be easy to 'switch legs off', and this probably means they could be switched on again just as easily.
The location of the quarry in the Middle East, in an area that was shallow sea in Cretaceous times, tells us that these snakes lived in the sea. Other fossils found at Ein Yabrud include sharks, turtles, primitive mosasaurs and plants.
Key names: Eitan Tchernov, Olivier Rieppel and Hussam Zaher.
#
"Dominant birds stay leaner",1449,0,0,0
(Mar. '00)
Perhaps it is true that the early bird gets the worm, but a recent research report in \IThe Condor\i suggests that it is better to be the late bird, dining just before going to bed and packing in food just when birds need it most, before a chilly winter night.
The curiously named (for his field) Professor Thomas Grubb and his colleagues have been looking at the weight of socially dominant birds and when they eat. They find that such birds are generally leaner than their subordinate peers of the same species, probably because they can eat when they want and do not face as great a risk of starvation. They stay lean during the day, which helps keep them maneuverable during attacks by predators. The leaner birds also have more time during the day to watch for predators, rather than spending most of their time looking for food.
But while natural selection would tend to keep a bird as thin as possible, birds need enough energy to get through the night. By eating late, the dominant birds can reduce their risk of predation without increasing their risk of starvation. Birds lower in the pecking order do not have such a predictable food supply during the day, because they must look for food in places that dominant birds would not normally bother with. And if the dominant birds do move in on some lesser food supply, they can displace the subordinate birds, chasing them away from the food. This means the subordinate birds must carry more fat during the day as an 'insurance policy,' making them more vulnerable to capture by hawks and other predators.
The work focused on three woodland species in central and northeastern Ohio: the Carolina Chickadee, Tufted Titmouse and White-breasted Nuthatch, and used data gathered from November to March in the years 1988 to 1997. In these birds, dominant status depends on gender and age. In each species, adult males top the hierarchy, followed by juvenile males, then adult females and, finally, juvenile females.
Using the data from hundreds of birds, they showed that dominant birds remained lean throughout most of the day, while the subordinate birds tended to be heavier. During daylight hours, a bird can gain as much as 10% of its total body mass in fat each day. Gaining fat before nightfall can help birds survive in winter because they often go into hypothermia as a survival mechanism, dropping their body temperature from 40C (104F) to as low as 17C (62F) as they use up their accumulated body fat during the night.
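To put that 10% figure in perspective (the weight is a typical value, not taken from the study): a Carolina Chickadee weighs roughly 10 grams, so a bird topping up at dusk may carry about an extra gram of fat into the night.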
When the dominant bird feeds at nightfall, the extra supply of stored energy means its temperature doesn't drop so far, leaving it more aware of its surroundings and so reducing its risk of being caught by predators in the night.
Key names: Thomas Grubb, Vladimir Pravosudov, Paul Doherty Jr., C.L. Bronson, Elena Pravosudov and Andrew Dolby.
#
"Neandertals not like us",1450,0,0,0
(Mar. '00)
A paper in \INature\i at the end of March describes mitochondrial DNA (mtDNA) analysis of material taken from a carbon-dated Neandertal infant. This is only the second time molecular analysis of a Neandertal has been possible, and the first molecular analysis undertaken on a specimen that has been radio-carbon dated and shown to have lived at the same time as modern humans.
The results show that modern man was not in fact descended from Neandertals, supporting the out-of-Africa model of modern human evolution, in which modern humans emerged from Africa around 100,000 years ago. They then replaced archaic predecessors such as the Neandertals, who had come out of Africa in earlier waves. Researchers at the University of Glasgow's Human Identification Centre and co-workers in Russia and Sweden have used molecular genetic techniques to compare mitochondrial DNA sampled from this infant, who lived 30,000 years ago, with modern human DNA, and concluded that Neandertals and modern humans diverged around 500,000 years ago.
To assess this, they focused on the hypervariable region of the mitochondrial DNA, which evolves rapidly and is commonly used to establish evolutionary relationships between species. While this is generally seen as conclusive proof of the out-of-Africa hypothesis, the theory's tenacious opponents can still argue that it does not rule out the possibility of Neandertal males mating with more modern human females, since the mtDNA story is passed down only in the female line.
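The paper does not spell out the arithmetic, but divergence dates of this kind generally rest on a molecular clock argument. In its simplest form, if two sequences differ at a fraction k of the sites compared, and the region accumulates substitutions at a rate r per site per year, then the time since the lineages split is roughly t = k / (2r), the factor of 2 arising because both lineages have been accumulating changes independently since the split.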
#
"Thumb-length primates",1451,0,0,0
(Mar. '00)
Fossils of the world's smallest known primates, with one species estimated to have weighed only 10 grams, have been recovered from a limestone quarry in Jiangsu Province, China, according to a major paper in the April issue of the \IJournal of Human Evolution\i, released in late March. Found in 1996, these distant relatives of monkeys, apes and humans were the prey of owls around 45 million years ago, in the middle Eocene. There appear to have been about six separate species, though this report deals with just two.
For comparison, some insectivores alive today weigh as little as 1.5 to 3 g (less than a tenth of an ounce), while the smallest primate known today, the mouse lemur of Madagascar (\IMicrocebus myoxinus\i), weighs 26 to 27 g (about an ounce). Both of the fossil species are related to a branch of primate evolution that eventually leads to humans. Primates are mammals, characterized by bigger brains, grasping hands and feet, nails instead of claws and eyes located in the front of the skull.
Hundreds of animal fossils, including those of at least three tiny primate species, have been culled from a commercial limestone quarry 100 miles west of Shanghai. While the limestone itself is of Triassic age, dating back to the very beginning of the Age of Dinosaurs some 220 million years ago, it was exposed and fissured in the middle Eocene, when a rainforest occupied the site. However, it was an odd rainforest, for it seems to have had only small primates, and no large ones.
Most of the analysis is based on calcanei, the heel-bones of the primates. While this may seem like slim evidence, the calcaneus is a bone that can reveal a great deal to a careful observer and, while there are no complete skeletons, the researchers have as many as 50 foot bones belonging to primates weighing less than 100 grams. Moreover, they say that markings on the fossils have led them to believe the primates were once the prey of owls. This could account for the lack of larger fossils, since it raises the possibility that only the small primates were carried by the owls to the fissured regions. This means there may be fossils of larger primates to be found in mid-Eocene deposits elsewhere in the province.
Key names: Daniel Gebo, Marian Dagosto, K. Christopher Beard and Qi Tao. See also: \JPrimate (biology)\j.
#
"World Petroleum Resources",1452,0,0,0
(Mar. '00)
With oil prices soaring around the world in March, the US Geological Survey was remarkably timely in publishing, during that month, its latest assessment of the undiscovered oil and gas resources of the world. The USGS reports an increase in global energy resources, with a 20 percent increase in undiscovered oil and a decrease in undiscovered natural gas. (Their definition of 'undiscovered' means the volume of oil and gas, exclusive of the U.S., that may be added to the world's reserves in the next 30 years.)
The report says that about 539 billion barrels of oil have been produced outside of the U.S. since oil became an important energy source about 100 years ago. It estimates the total amount of future technically recoverable oil, outside the U.S., to be about 2120 billion barrels.
The assessment results suggest that there is more oil and gas in the Middle East and in the offshore areas of western Africa and eastern South America than previously reported, but less oil and gas in Canada and Mexico, and significantly lower volumes of natural gas in the former Soviet Union.
The assessment rests on a rigorous geologic foundation for estimating the undiscovered energy resources of the world. According to a USGS background document, the results have important implications for energy prices, policy, security, and the global resource balance. They add that such an overview provides exploration geologists, economists and investors with a general picture of where oil and gas resources are likely to be developed in the future.
The study divides the world into about one thousand petroleum provinces, based primarily on geologic factors, that are then grouped into eight regions roughly comparable with the eight economic regions defined by the U.S. State Department. At the moment, significant petroleum resources are known to exist in 406 of the geologic provinces.
The study also took into account the growth of reserves that can arise from a number of sources:
1. As drilling and production within discovered fields progresses, new pools or reservoirs are found that were not previously known.
2. Advances in exploration technology make it possible to identify new targets within existing fields.
3. Advances in drilling technology and oil recovery techniques make it possible to recover oil and gas not previously considered recoverable in the initial reserve estimates.
While the preliminary results were issued in March, the final report will be released at the World Petroleum Congress in Calgary in June. Additional information is available on the Web at http://energy.usgs.gov
Summary 1: Between 1993 and this study, the estimates of undiscovered reserves have changed as follows:
1. Undiscovered oil went from 539 billion barrels to 649 billion barrels.
2. Undiscovered natural gas went from 915 BBOE (billion barrels of oil equivalent) to 778 BBOE.
3. Undiscovered natural gas liquids went from 90 BBOE to 207 BBOE.
4. The world total went from 1544 BBOE to 1634 BBOE.
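The bookkeeping is internally consistent: 539 + 915 + 90 = 1544 BBOE for the 1993 estimates, and 649 + 778 + 207 = 1634 BBOE for the current ones.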
Summary 2: The volumes of undiscovered oil and undiscovered natural gas in various regions changed as follows, where oil is measured in billions of barrels (bb) and natural gas in trillions of cubic feet (tcf) (the figures in brackets indicate the percentage of world totals, excluding the United States):
1: Former Soviet Union: oil 116 bb (17.9%), gas 1611 tcf (34.5%)
2: Middle East and North Africa: oil 230 bb (35.4%), gas 1370 tcf (29.3%)
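These percentages can be checked against the Summary 1 totals: for oil, 116/649 = 17.9% and 230/649 = 35.4%, as quoted. For gas, assuming the common conversion of roughly 6000 cubic feet of gas per barrel of oil equivalent, the 778 BBOE of undiscovered gas corresponds to some 4670 tcf, and 1611/4670 = 34.5%, again matching the quoted figure.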
"Diversity of trees and soil pathogens",1453,0,0,0
(Mar. '00)
One of the most amazing features of a forest, as compared with a plantation, is the way that species are dotted around. The farms of humans, with their monoculture fields, are marvellous hunting grounds for disease organisms and pests: once these arrive in a field, they have a wealth of hosts to prey on. Now it appears that the diversity in some forests may be driven by the microbes in the soil beneath a parent tree. An article in \INature\i in mid-March by Alissa Packer and Keith Clay suggests that soil microbes may kill most of a tree's seedlings in the area close to the tree. This would open the way for unaffected seeds of other species to flourish near the parent tree, promoting the diversity of trees in forests.
The researchers looked at the patterns of seedling mortality in a temperate tree, the black cherry, and showed that a soil pathogen causes high mortality close to the parent tree and low mortality farther away. They believe that animal predators and herbivores may be less important than microbial pathogens in the soil in creating the diversity of tree species in temperate forests. Trees attract a range of herbivores, parasites, fungi and diseases, many of which are specialized to attack just that one species. Typically, these attackers gather around their target species. While a mature tree is usually up to the challenge, seedlings are not, and other species, immune to the predators and pathogens, will find the area more hospitable.
This idea is not entirely new, since one of the best-known ideas in ecology, the Janzen-Connell hypothesis, argues for just such an effect in tropical forests where host-specific natural enemies kill offspring around parental trees, making openings for other species to become established. However, this is the first time the effect has been shown in temperate forests.
Black cherry (\IPrunus serotina\i) produces large numbers of bird-distributed fruits that are deposited throughout the forests of eastern North America. Earlier studies had revealed high mortality among black cherry seedlings growing in soil collected from beneath black cherry adults, but low mortality in soil collected from beneath adults of other species. As well, the distance from adult trees had a greater effect on seedling survival than the density of seedlings, the more traditional factor. Survival of black cherry seedlings beneath adult trees is 35 percent less than survival farther away.
Sterilization of the soil changed the survival rate of seedlings when the soil came from beneath a black cherry tree, but made no difference when the soil was collected from farther away, suggesting that a microbe, rather than any chemical exuded by the parent plant, was responsible. Packer and Clay isolated the fungus \IPythium\i sp. (the abbreviation 'sp.' after a genus name indicates that the species is still unidentified) from the roots of dying black cherry seedlings and injected the fungus into healthy seedlings. This decreased the seedling survival rate by 65 percent, making it likely that this is the group of soil organisms that makes it so difficult for black cherry seedlings to survive within a few meters of their parent tree.
This is borne out by a count of the saplings beneath three black cherry trees that had high densities of black cherry seedlings. The researchers found only four black cherry saplings over half a meter high within 10 meters of the tree, but 41 saplings of this size from other species, including multiple individuals of beech, sugar maple, dogwood and ash, growing within that distance.
The distance-dependency may help to explain why trees go to so much trouble to spread their seeds far and wide, often wrapping them in tempting fruit that birds and other animals will carry away. Interestingly, Packer argues that, if any one tree species begins dominating a woodland, seeds may escape their parents only to fetch up under another tree of the same species, where the pathogen again takes its toll. This would reduce the numbers of the dominant species until it no longer dominates.
Significantly, the forests further from the equator, in places like Canada or Sweden, for example, may be composed of a small number of species of conifer over large areas, but these cool areas are probably less able to support pathogens. However, it is unlikely that this is the whole story. Biodiversity is just too diverse to be driven by a single cause, but it appears to be at least a chapter in that story.
#
"Logging and monkey health",1454,0,0,0
(Mar. '00)
A Purdue University doctoral student, William Olupot, found some unexpected differences in weight between gray-cheeked mangabeys living in primary forest, areas that have never been logged, and those living nearby in a forest that was logged in the 1960s and 1970s. After further study, he has now reported that secondary forests, areas that have been logged and allowed to regenerate, may provide second-rate habitat for primates, even after decades of regeneration.
Olupot looked at a total of 31 male mangabeys and found that those living within the secondary forests weighed, on average, 15 percent less than males living in relatively untouched primary forests. The differences were not related to the skeletal measurements or the age of the animals, which suggests that the cause was nutritional. The details, uncovered in Olupot's doctoral research, will be written up in an upcoming issue of the journal \IConservation Biology\i.
It is significant that the logging was done 20 to 30 years ago, but the mechanism causing the weight loss is not completely clear. Olupot believes that the groups living in the logged forests are less stable, and male mangabeys in these groups are much more likely to strike out on their own. Mangabeys spend most of their lives in groups made up of a stable core of females and several males, but the mangabeys in secondary forests tend to break into smaller groups, with a higher rate of turnover in males entering and leaving the groups.
Melissa Remis, an assistant professor of anthropology at Purdue who studies the impacts of human activities on gorillas and other wildlife in central Africa, believes that logged forests, with fewer trees and resources to provide shelter and food, may create a more stressful environment for the primates.
#
"Biodiversity a century from now",1455,0,0,0
(Mar. '00)
A major article in \IScience\i in March, with 19 authors (from the United States, Latin America, Europe and Australia), offers the first systematic look at how biodiversity is likely to be affected by several agents of human-caused global change. It also sounds an ominous warning: global warming and climate change aren't necessarily the principal factors that we should be worrying about.
The report identifies five primary influences, or "drivers," in global change that, in turn, affect biodiversity: global atmospheric carbon dioxide, climate change, biotic change (the introduction of new species to an ecosystem), nitrogen deposition and land-use change (for example, development and agricultural and forestry practices).
In the next hundred years, land-use change is likely to be more important than climate change: if climate change ceased tomorrow, land-use change would still drive down biodiversity. If we are to maintain biodiversity, we need to reduce land-use change and biotic introductions, says the report.
The researchers also identified 10 biomes (terrestrial biological communities) and tried to assess how sensitive each was to a particular driver. The biomes included alpine, arctic, boreal forest, deserts, grasslands, Mediterranean systems, northern temperate forest, savanna, southern temperate forest and tropical forests. The researchers also examined the effects on freshwater ecosystems, both lakes and running water, and on soil.
Biotic change is a much greater threat to freshwater systems, particularly lakes, than to terrestrial biomes. Aquatic species have typically evolved in relatively isolated habitats, and introduction of a new species often produces stress on the food chain or some other aspect of native species' existence. Freshwater ecosystems are very sensitive to poor land-use practices because, as low points on the landscape, they accumulate damaging silt and excess nutrients from runoff.
Subsurface species, from bacteria to worms, are currently the life forms most harshly affected by land-use change. This is because it can take a century to make a couple of centimeters of soil, and just a few minutes to rip a ditch across an entire tract, disturbing an ecosystem containing thousands of species. Even cultivating a garden disturbs the local balances.
Carbon dioxide levels are probably the hardest of the global change drivers to assess in terms of their direct influence on soil biodiversity, but they will probably matter indirectly, through the effects that increased carbon dioxide concentrations have on plants.
The Mediterranean-climate and grassland ecosystems will probably experience the greatest proportional change in biodiversity, because all five factors affect them. The northern temperate ecosystems will probably experience the least biodiversity change, mainly because they have already been so extensively affected by major land-use change that there is little scope for further damage. Land-use change itself will be smallest in the arctic and alpine biomes, and greatest in the tropical forests and the temperate forests of South America.
On the other hand, climate change will have its smallest effects in the tropics, with intermediate effects in most other biomes, except at high latitudes, which will be affected the most of all by climate change. Nitrogen deposition will have its greatest effects near northern cities in temperate zones and its least in the arctic and southern temperate forests.
Their conclusion is that land-use change is the dominant driver, and that comes back to human population. It is followed by climate change, which again is dependent on human population levels. The only exception, they say, is in fresh water, where land-use change has the most effect on rivers, while biotic change has the most effect on lakes.
#
"Fungus versus cocaine plants",1456,0,0,0
(Mar. '00)
Cocaine is prepared from coca plants: if there are no coca plants, there will be no cocaine and, while it is possible to attack coca plantations with chemical sprays, there is nothing to stop farmers from moving to a new site, planting a new crop in the same place, or scattering coca in among another crop. To eliminate coca plants, what is needed is some sort of biological control agent, a watchdog that can kill just the one crop, and which then waits around in case the crop returns.
A strain of the fungus \IFusarium oxysporum\i that attacks coca plants has been located in an experimental coca crop being grown in Hawaii, and there are now moves afoot to spread this strain in the coca fields of Colombia, according to a report in \INew Scientist\i during March. The fungi of the genus \IFusarium\i are common around the world, where different species and strains attack different plants by invading their roots.
Trials are now being proposed, aimed at finding out whether the fungus attacks any other plants, and also at finding a way to distribute the fungus, such as coating it on rice grains that could be spread from a plane. Objectors argue that any such attempt may lead to more deforestation and more migration, displacing people even deeper into the Amazon. Greenhouse tests have shown no problems with 50 related species of plant, but the report says that coca farmers in Peru claim (not very surprisingly) that a \IFusarium\i outbreak there recently damaged coca plants and spread to food crops. The report quotes a supporter of the trials as saying (equally without surprise) that investigators who looked into the reports found that the food crops were not infected with \IFusarium\i after all.
#
"Potato late blight hits Russia",1457,0,0,0
(Mar. '00)
Potato scientists at Cornell University's CEEM (Cornell-Eastern Europe-Mexico) Potato Late Blight Program have reported the emergence of new virulent types of the potato late blight pathogen, \IPhytophthora infestans\i, in Russia. Just as the Irish depended on the potato crop in the 1840s, so millions of people in Russia depend on the potato today, making this yet another problem to be faced by the floundering Russian economy. The strain is particularly aggressive, having evolved through sexual mating and, unlike the old strains, the new pathogen can survive harsh winters in the soil to endanger potato crops in later years.
The strain will be hard to control, but at least we know how it got there and what contributed to its development. The blight originated in Mexico, where there are two mating types of this organism, the A-1 and the A-2, both of which are fairly short-lived on their own. The Irish famine was caused by an escape of the A-1 strain; that blight subsided gradually during the late 19th century with the discovery of Bordeaux mixture, the first fungicide effective against the fungus.
In 1976, there was a drought-caused potato shortage in eastern Europe. As a result, the USSR and its satellite countries in the then communist bloc imported 25,000 tons of potatoes from Mexico, and these potatoes carried the A-2 strain. During the 1980s, fungicides kept the blight in check, but Russia's current economic problems make fungicides too expensive for the nation's many small farm holdings.
Once the A-2 strains arrived in Europe in 1976, the A-1 and A-2 organisms were able to reproduce sexually and create oospores, the resting state of the pathogen. The oospores not only over-winter more easily; sexual reproduction also allows new combinations of genes to arise whenever two mutant forms meet and mate. This makes it far more likely that fungicide-resistant blight strains will develop.
Up until now, fungicides have proved fairly effective when farmers could afford them, so that, over the past 40 years, world potato production has climbed from 30 million tons to 85 million tons. Russia's small farmers plant 3.4 million hectares (8.4 million acres) with potatoes each year for an average yield of 10 tons per hectare. Total Russian potato production each year is between 34 million and 39 million tons.
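The production figure follows directly from the plantings: 3.4 million hectares x 10 tons per hectare = 34 million tons, the bottom of the quoted range, with better-than-average yields presumably accounting for the years closer to 39 million tons.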
In the last 10 growing seasons, the St. Petersburg region of Russia has seen at least seven blight years, the Moscow region has seen five, and Siberia has seen three. The federation's Sakhalin Island, north of Hokkaido, Japan, has seen blighted potato harvests every year for the past decade. According to CEEM scientists, there are no quick fixes, and it will take some years to develop new resistant strains (see \J???\j, February 2000).
See also: \Jpotato blight\j, \Jpotato\j, \JPotato blight returns\j.
#
"Methane cleans up nitric oxide",1458,0,0,0
(Mar. '00)
Methane is an important greenhouse gas, but it can also be put to useful work, according to an article in a recent issue of the journal \ICatalysis Today\i. In it, Umit Ozkan, Junko Mitome and Enrique Aceves describe a way to use methane to remove toxic nitric oxide emissions from the stack gases of coal-burning power plants. The process extracts up to 100% of the nitric oxide from stack gases, more safely and less expensively than any currently available method.
Nitric oxide is a common byproduct of combustion, and cars and fossil-fuel burning power plants are two chief sources of the molecule. In the lower atmosphere, it falls out as acid rain, while in the upper stratosphere, it depletes ozone. At the moment, nitric oxide is removed by injecting ammonia into the exhaust gases; over a vanadium-oxide-based catalyst, the pollutant is then converted into nitrogen and water. The drawbacks are that ammonia is expensive and difficult to handle, and it is highly corrosive and toxic if released. Worse, if the temperature inside a smokestack climbs too high, the ammonia may react with oxygen over this catalyst to form more nitric oxide, not less.
Given the right catalysts, methane, often already present in the smokestack emissions, can play the role otherwise played by ammonia without any of the risks associated with ammonia. The palladium catalyst is expensive, but only a small amount is required to thinly coat the interior surface of a ceramic support that contains thousands of tiny honeycomb-shaped passages. The result is carbon dioxide, water and nitrogen.
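The article names only the products, but a plausible overall stoichiometry for the methane route, assuming complete conversion over the catalyst, is CH4 + 4NO -> 2N2 + CO2 + 2H2O, with the nitrogen leaving the stack as harmless N2 rather than as nitric oxide.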
There is one problem: sulfur dioxide reduces the effectiveness of the catalyst so that, after six hours, it only removes 85% of the nitric oxide. The trick, say the researchers, may be to have two catalytic reactors and switch between them, giving the catalyst time to recover. Research is continuing on this process.
#
"The acid-loving microbe and mine pollution",1459,0,0,0
(Mar. '00)
Old metal mines cause large amounts of environmental damage, largely through a process called acid mine drainage, the primary environmental problem associated with the extraction of metal ores from the earth. Now, according to a mid-March report in \IScience\i, it seems the blame rests with a newfound microbe that eats iron and lives in acid-drenched conditions.
The microbe, \IFerroplasma acidarmanus\i, is an archaeon that thrives when metal sulfide ores are exposed to air and water, conditions that mimic hot battery acid. In fact, it goes further, making its own acid when it converts remnant sulfide found in metal ores to sulfuric acid. This acid then contaminates mining sites and drains into nearby rivers, streams and groundwater.
The microbe offers an interesting puzzle, as it has no cell wall, giving the lie to the standard view that microorganisms tough it out in nasty environments with the help of durable external walls to shield themselves from extreme conditions. And while the microbe was found in large numbers in one mine in Redding, California, its discoverer, Katrina Edwards, suggests that it may be found all over the place, living off ore bodies exposed naturally to air and water and geochemically impacting iron and sulfur cycles. Normally, it would be hard to detect, but under the conditions created by the mining of metal ores, where many tons of ores and tailings are exposed, the organism thrives and revs up the production of sulfuric acid.
Of the dozen or so microbes in the mine, this one appears to be the key player: it was the most active and numerous in the mine, and it oxidizes iron, forms slimes and grows on pyrite sediments. It will take more research before this organism is fully understood, but the lack of a cell wall will be a key issue. It may be that its seemingly fragile cytoplasmic membrane confers an advantage as yet unknown to science.
Key names: Katrina J. Edwards, Jillian Banfield, Brian Fox and Charles Kaspar.
#
"April, 2000 Science Review",1460,0,0,0
\JOne human gene and many diseases of old age\j
\JLearning to suture\j
\JNew anticancer drugs\j
\JHypermethylation marks cancers\j
\JCancer cures, cancer quackery\j
\JBreast cancer and diet\j
\JBreast cancer vaccine hopes\j
\JAVAX in Australia\j
\JTelomerase and the cancer vaccine\j
\JPatterns of prostate cancer recurrence\j
\JPeter Duesberg at it again\j
\JSouth African AIDS panel meets\j
\JGetting drugs into cells\j
\JGulf War syndrome dizziness linked to nerve gas\j
\JKeeping antibiotics out of GM crops\j
\JRice genome complete\j
\JCelera and the Genome Race that never was\j
\JThree human chromosomes complete\j
\JHuman chromosome 21 update\j
\JMouse junk DNA has uses\j
\JThe tobacco industry and second-hand smoke\j
\JSkeleton from royal tomb is not King Philip II\j
\JThe causes of coronal mass ejections\j
\JBOOMERANG yields good returns\j
\JImages of Jupiter's inner moons\j
\JInternet2 astronomy\j
\JIsing Model fails in three dimensions\j
\JNew modulator to speed communications.\j
\JColor maps for the color blind\j
\JFossil dinosaur heart\j
\JAncient toothpicks tell a tale\j
\JGenomic imprinting in the platypus and opossum\j
#
"One human gene and many diseases of old age",1461,0,0,0
(Apr. '00)
A number of age-related diseases, such as cancer, atherosclerosis, arthritis and Alzheimer's disease, may have a common genetic link, according to a report in a mid-April issue of the \IProceedings of the National Academy of Sciences\i. This finding suggests that it may be possible to control or prevent all of these diseases using a drug (or drugs) designed to target this common link.
According to the report, the activation of a single gene known as p21 may contribute to the development of a multitude of diseases of old age. The gene works like a brake, stopping cells from growing when they are damaged by toxins or radiation. While this gives the damaged cells a chance to repair themselves, during normal aging, the p21 gene also stops cells from dividing as they grow old (or in the language of science, senescent).
The researchers used recombinant DNA methods to start up the p21 gene in human cells grown in the laboratory. Then they used the tools of modern genomics, along with a knowledge of the human genome, to identify the effects of p21 on thousands of other genes, showing that this single gene affected a wide range of genes previously shown to be involved in aging and age-related diseases.
Once the p21 gene was switched on, the cultured cells showed all the signs that are seen in aging cells. They stopped growing, they became flat and granular in appearance and, most importantly, they started making enzymes typically produced by senescent cells. In total, the p21 gene selectively inhibited more than 40 genes that are known to be involved in DNA replication and cell division, and it was this which brought cell growth to a halt.
At the same time, though, the p21 gene increased the activity of about 50 other genes, and about 20 of these turned out to be genes that manufacture proteins that are secreted by the cell into its environment. Some of these proteins emitted by the newly senescent cells either inhibited the death of neighboring cells, or stimulated the growth of those neighbors.
This odd stimulation effect is significant because it resembles the functions of certain cells found in human cancers, causing the researchers to speculate that the p21 gene may be involved in promoting the development of tumors.
Back, though, to the aging issue. A number of the genes stimulated by p21 produce proteins that either cause or are associated with several age-related diseases such as atherosclerosis, arthritis and amyloidosis. As well, the proteins include the precursor of the beta-amyloid peptide, the building block for the main component of plaques found in the brains of Alzheimer's patients. The plaques form after beta-amyloid is modified by an enzyme called transglutaminase, which the researchers found was also stimulated by p21.
Key names: Igor Roninson, Bey-Dih Chang, Keiko Watanabe, Eugenia Broude, Jing Fang, Jason Poole and Tatiana Kalinichenko.
#
"Learning to suture",1462,0,0,0
(Apr. '00)
All patients expect skilled treatment, which raises the question: where do health workers go to learn and practise their skills without causing howls of outrage and anguish from patients? They have traditionally looked for alternatives, like the orange, long used to receive the first tentative punctures of a beginner's hypodermic needle. Oranges have the right 'feel', and they do not feel pain.
Oranges, however, are less useful for trainee surgeons, who need to learn to wield needles of a different sort. Suturing needles need to stitch wounds neatly so that they heal with minimal scarring. Now, Rod Cooter, from Australia's Adelaide University, has come up with synthetic human flesh, made from plastic, which can be used to teach the skills of wound care. Cooter's work has been performed through the Cooperative Research Centre (CRC) for Tissue Growth and Repair. (The concept of the CRC is a highly successful Australian innovation in research management that brings together funding and the skills of people from various institutions. The nation's CRCs are located in many cities across Australia.)
The material is part of a kit called Practical Skin Wound Management, an interactive multimedia educational aid. It includes a CD-ROM offering eight modules of demonstrations and self-instruction, along with a manual and facilities for self-testing, covering topics ranging from skin anatomy and biology, to anesthesia and surgery, suturing, dressing and scar management.
The new material is formed into a suturing pad which reproduces the texture of human flesh, and which can be cut in a variety of ways to simulate different wounds. Surgical instruments and materials are supplied, allowing the students to practise suturing and hand tying, copying examples presented on the CD-ROM.
Rod Cooter, who is also the Director of Plastic and Reconstructive Surgery at the Royal Adelaide Hospital, believes that it is better for medical and nursing students to learn about skin anatomy, pathology, anaesthesia and surgery at the same time, if the aim of training is to get the best results in wound management. The idea actually began with workshops aimed at training general practitioners in wound management and, after feedback at that level, the kit was developed and tested on a group of 80 medical students.
So, rather than learning the skills in an apprenticeship situation or from two-dimensional diagrams, the students have been able to learn from three-dimensional examples on the CD-ROM. As often happens with work generated in a CRC, the success of this program has prompted the CRC to form a new enterprise, Innovative Surgical Technologies, and they say there are other projects in the pipeline.
Meanwhile, their first kit is now being used in a distance-learning program of the Royal Australasian College of Surgeons, while the Australian military and the Royal Flying Doctor Service have expressed interest. The kit will also be marketed in Asia.
There should be a Web site available through the CRC's home page in July 2000.
#
"New anticancer drugs",1463,0,0,0
(Apr. '00)
A report in early April in \IOrganic Letters\i describes the synthesis of two anticancer drugs, each estimated to be at least 100 times more powerful than the potent anticancer medication Taxol(r) (paclitaxel). The work was carried out by Elias J. Corey, winner of the 1990 Nobel Prize for chemistry.
The main drug, ecteinascidin, is not new. It was first produced in 1996, but it has been little used until now because it is so difficult to make. Corey's achievement was to find a simpler way of producing the molecule, and his new method should speed and simplify mass production of the antitumor substance, potentially making the drug more available to patients suffering from soft-tissue sarcomas, cancers of the muscles, tendons and blood vessels.
The work was not easy. According to Corey, ecteinascidin is probably the most complicated molecule ever to be made on a commercial scale. Ecteinascidin and a simpler, easier-to-make form called phthalascidin seem to act differently from every other cancer cell treatment: unlike conventional chemotherapy, they prevent tumor cell division without killing off the cells.
Ecteinascidin is currently undergoing Phase II human testing in the USA, in Boston, New York and Houston. The human trials are expected to finish by the end of 2000, and Phase III testing will begin soon after. Phthalascidin appears to be equally effective and is more stable, but has not yet begun human trials.
The manufacturer that owns the rights to the drug, Pharma Mar, reports that it will seek 'fast-track' approval of the medication through the US Food and Drug Administration. Barring unforeseen complications, ecteinascidin could be on the market in 2002.
The natural form of the drug comes from a reef-living tunicate (or 'sea squirt'), \IEcteinascidia turbinata\i, found in the West Indies. Extracts from the animals were reported to have immunosuppressive and antitumor activities as far back as 1975 (Lichter et al., \IProc Soc Exp Biol Med\i \B150(2)\b: 475-8, 1975, with other even earlier reports involving the authors of that 1975 paper). Previous attempts at chemically creating sufficient amounts of the substance failed. Even though just 14 milligrams (mg) can last a patient up to six months, yields were too small to meet even this level of demand, so the drug had remained impractical.
The world need, given that requirement for each patient, is for about 5 kg (11 pounds) of the drug each year. This should be feasible, and the news should make the sea squirts of the Caribbean breathe a little more easily.
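That estimate squares with the dosage figures (the patient count is our extrapolation): at 14 mg per patient per six months, a year's treatment needs about 28 mg, so 5 kg, or 5,000,000 mg, would cover on the order of 180,000 patient-years annually.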
Key terms for web searches: '\IEcteinascidia turbinata\i' and Lichter
#
"Hypermethylation marks cancers",1464,0,0,0
(Apr. '00)
In a Johns Hopkins study, researchers report a significant molecular alteration in breast cancer. In 91% of breast cancers, researchers detected either increased methylation or hypermethylation (which can switch genes off) of a gene known as 14.3.3 sigma. In the breast cancer cells where there was no 14.3.3 sigma expression, due to hypermethylation of the gene, the cells showed higher accumulations of chromosomal breaks and gaps when exposed to radiation.
(A gene on a chromosome can be 'turned off', or prevented from coding for a protein, by the attachment of one or more methyl groups, which are made up of a carbon atom and three attached hydrogen atoms. Hypermethylation is the addition of extra methyl groups to the section of a chromosome where the target gene is found. When a gene \Idoes\i code for a protein, it is said to be expressed. When it is switched off, geneticists say that there is no expression of the gene.)
It appears that the alteration of this gene leads to breast cancer by inhibiting cell cycle checkpoints, thus allowing genetic defects to accumulate without being detected. The researchers suggest that it would be worthwhile targeting the methylated 14.3.3 sigma gene, perhaps using drugs to demethylate it and turn on its expression as a way of fighting tumors. They say that hypermethylation and the resulting loss of 14.3.3 sigma gene expression are the most consistent molecular alterations identified in breast cancer to date.
Hypermethylation markers have also been found for prostate cancer. In yet another Johns Hopkins report, hypermethylation of a specific gene called GSTP1 is said to be a common and early genetic alteration occurring in prostate cancer. The alteration is found in the majority of prostate cancers, but is absent in normal tissue.
In this study, the researchers studied GSTP1 methylation-positive prostate tumors and urine samples obtained from 22 prostate cancer patients. DNA from prostate cancer cells shed in the urine was isolated and examined for GSTP1 alterations. The researchers detected GSTP1 methylation in only a third of the urine samples, but this is the first time that molecular diagnosis of prostate cancer has proved feasible.
#
"Cancer cures, cancer quackery",1465,0,0,0
(Apr. '00)
In the same broadside of cancer reports that has yielded the last two items, Johns Hopkins researchers reported a successful conclusion to a 12-week, multi-center randomized Phase II study of a prostate cancer drug called ABT-627. This compound works by blocking endothelin, a protein believed to be involved in the development and progression of prostate cancer.
Of the 131 patients in the study, 50 were given the compound. The rate of rise of PSA (prostate-specific antigen) in those patients stayed constant, and improvement in biologic markers was observed. ABT-627 was well tolerated and displayed anticancer activity in the patients examined.
But if ABT-627 has been doing well, the folk remedy of shark cartilage lost some of its prestige during April. The basis for sales of shark cartilage lies in claims that sharks never get cancer, but scientists at a news conference at the annual meeting of the American Association for Cancer Research noted that sharks can even get chondromas, cancers of the very cartilage now being sold as a cancer cure!
Using very strict diagnostic criteria, scientists were able to find 40 cases of tumors in sharks and related fishes like skates and rays. It looks very much as though those selling and taking shark cartilage pills are doing so on very faulty data, with no proof that it works, and taking a top-level predator out of an ecosystem that is at risk.
Of course, there is always the outside possibility that a cure \Imight\i be lurking inside the cartilage of sharks or other animals; however, for the moment, there is no justification for believing so. The cost to the environment, the high cost of the pills, and the cost to patients who are passing up better established treatments cannot be justified.
The myth that sharks do not get cancer sprang up in part because of their isolation from humans, which reduces both their exposure to carcinogenic pollutants and the likelihood that sharks with tumors will be caught. As well, those who argue for the cartilage cure use scientific evidence very selectively, pointing to the failure of a single attempt to give sharks tumors in a laboratory. However, several different efforts to give tumors to English sole also failed, even though that fish has high cancer rates in habitats such as Puget Sound.
The advocates for shark cartilage have claimed that angiogenesis, a tumor's ability to encourage the growth of new blood vessels, is the secret of a tumor's success, and that cartilage is a tissue with relatively few blood vessels. Sharks have skeletons made not of bone but of cartilage and, taken together, these facts have been combined into what appears to be an unjustified conclusion: none of them means a thing, given the observed cases of cancer in sharks and their relatives.
Key names: Gary Ostrander and John Harshbarger.
#
"Breast cancer and diet",1466,0,0,0
(Apr. '00)
A Johns Hopkins study of women with varying risks of breast cancer has shown that women who ate flame-broiled food more than twice a month significantly increased their risk of developing breast cancer. The study, which involved 110 cases and 113 matched controls, points the finger squarely at carcinogens known as heterocyclic amines, or HCA, known to be produced when meat is cooked in direct heat, such as flame-broiling.
The risk was greater among women who carried genetic risk factors such as alterations of the gene NAT2, which is involved in the activation of HCA, or with alterations of GSTM1, GSTP1, and/or GSTT1, genes which may break down or detoxify the carcinogens produced by flame-broiling.
Another Johns Hopkins study has found that carotenoids, which are found in many fruits and vegetables, especially green leafy vegetables, may protect against the development of breast cancer. The research involved looking at a number of micronutrients, substances the body requires in small amounts, and their relationship to any later development of breast cancer.
To do this, the researchers checked for the effects of retinol, retinyl palmitate, gamma-tocopherol, alpha-tocopherol, lutein, cryptoxanthin, lycopene, and the carotenoids alpha-carotene, beta-carotene, and total carotene on a sample of 295 women who donated blood for a serum bank in 1974 and again in 1989.
Although some of the other micronutrients showed some protective properties, carotenoids were the only ones shown to offer significant protection against breast cancer.
#
"Breast cancer vaccine hopes",1467,0,0,0
(Apr. '00)
AVAX Technologies, Inc., the company that is about to open a melanoma vaccine facility in Australia (see \JAVAX in Australia\j, this month), has reported good results from the use of an autologous cell cancer vaccine in a mouse model of breast cancer. According to a presentation to the 91st Annual Meeting of the American Association for Cancer Research (AACR), the dosages used in these animal experiments are similar to those being used in two current US human trials: the company's registration trial of M-Vax(tm) for metastatic melanoma and the Phase 2 studies of O-Vax(tm) for ovarian cancer.
The mouse model study compared the therapeutic potential of vaccination with irradiated tumor cells modified with the hapten DNP (this is AVAX's AC Vaccine technology, explained in \JMelanoma vaccine trials succeed\j, November, 1999) with treatment with tumor cells that were not modified via the AVAX process. It used a specific murine (mouse) mammary carcinoma that closely models human breast cancer in terms of its immunogenicity, metastatic properties and growth characteristics.
Treatment with the DNP-modified irradiated tumor cells improved relapse-free survival of the mice, while treatment with the unmodified cells had no effect. The study also revealed that vaccination with the modified tumor cells caused an immune response, identified by activation of CD8+ T-cells, and showed that these cells are a necessary component for the vaccine to achieve its effect in this model. The vaccine is already undergoing a Japanese study that is the equivalent of a Phase 1/2 study in the US.
This study was supported by a grant from AVAX and was conducted at the University of Illinois, Chicago, by Dr. Margalit B. Mokyr.
#
"AVAX in Australia",1468,0,0,0
(Apr. '00)
AVAX Technologies' Australian subsidiary, AVAX Australia Pty. Ltd., has signed a manufacturing agreement with Bio Enterprises Pty. Ltd., Sydney, for the manufacture of AVAX's AC Vaccine(tm) (autologous cell vaccine) M-Vax(tm) for the Australian market. AVAX received approval in 1999 for M-Vax to be made available for commercial sale for the treatment of Stage III melanoma, subject only to licensure of a manufacturing facility, which is now very close to completion. (See \JMelanoma vaccine trials succeed\j, November, 1999 for an earlier story.)
This represents a major breakthrough, and the plant has been placed appropriately in Australia, where a combination of fair skins, temperate to tropical climate, and an emphasis on outdoor lifestyles has ensured that melanoma rates are higher than anywhere else in the world.
According to AVAX, M-Vax, a therapeutic cancer vaccine, is an essentially non-toxic, experimental post-surgical treatment for Stage III and Stage IV melanoma. To date, over 350 patients have been treated with M-Vax on an outpatient basis in the US. In Phase 2 studies, patients with Stage III melanoma who were treated with the vaccine demonstrated a five-year survival rate of 55%. It is one of several products being developed by the company - see \JBreast cancer vaccine hopes\j, this month, for more details.
#
"Telomerase and the cancer vaccine",1469,0,0,0
(Apr. '00)
Telomerase is an enzyme commonly found in a variety of human tumors, where it serves to make the cancer cells 'immortal'. But now the enzyme looks set to become a target that may be used to provoke the body's own immune system to attack and kill those same cancer cells, according to a report in an early April issue of the \IProceedings of the National Academy of Sciences\i.
Telomerase works by maintaining the ends of the chromosomes, the telomeres, which usually fray and slowly unravel, marking the chromosome and its cell as 'old'. In a cancerous cell, the telomeres are built up as fast as they break down, so the biological markers for aging are missing. That is why cancer cells are effectively immortal.
Maurizio Zanetti and his UCSD colleagues, in collaboration with the Institut Pasteur in Paris, have now successfully used a prototype vaccine to activate a type of lymphocyte called cytotoxic T-lymphocytes (CTL), or killer cells, to destroy cancer cells, using telomerase as a target. So far, they have only done this \Iin vitro\i, in the test tube, but at least it is a start.
Lymphocytes are white blood cells that patrol the body. When they meet foreign cells, they launch an attack against the invader. Killer cells target infected or cancerous cells by recognising and binding to proteins, called antigens, on the cell surface. "In cancer, the immune system becomes increasingly weakened and ineffective against rapidly proliferating malignant cells," said Zanetti in a recent press statement. "We wanted to see if the immune systems of individuals with cancer retained the ability to recognise telomerase, and if we could boost the immune response using telomerase in a prototype vaccine to expand CTL activity against cancer."
The vaccine works by bolstering the reaction, and it does this by providing enough of the antigen to start an immune response, amplifying production of CTL against specific targets. In short, killer cells targeting telomerase are generated and, because telomerase levels are higher in cancer cells, telomerase peptides could then serve as a beacon for CTL, which would zero in on the cancer cells and destroy them.
The researchers created a vaccine made from CTL-specific pieces of telomerase reverse transcriptase (hTRT), and tested this against blood cells from prostate cancer patients and, for comparative purposes, from healthy individuals. They report that the lymphocytes from prostate cancer patients were readily activated into CTL following immunisation with the prototype vaccine, attacking and killing the cancer cells.
Given that telomerase is over-expressed in the vast majority of all human cancers, they reasoned that CTL produced against one type of cancer would recognise and destroy other types of cancer as well. They found that the killer cells targeted the hTRT peptides in other human cancer cells - breast, colon, lung and melanoma - as well, and destroyed them just as effectively.
They then tested the vaccine on transgenic mice provided by the Institut Pasteur. The mice are genetically engineered to mimic the human immune system, expressing a common type of human transplantation antigen. The prototype telomerase vaccine induced a CTL response in these mice with no apparent negative side effects, demonstrating the potential of this vaccine in a live model.
Of course, telomerase is also essential in the normal process of cell division, so Zanetti and his colleagues also looked for negative effects or damage from this vaccine on normal human stem cells, which have a rapid reproduction rate and therefore higher levels of telomerase than normal cells. No adverse activity was detected. If the stem cells are in the clear, it is probable that normal cells, with low telomerase levels, would be completely safe, but this aspect will need more study, they say.
If there is ever going to be a universal cancer vaccine, then this work is likely to be a milestone in the progress towards that vaccine.
#
"Patterns of prostate cancer recurrence",1470,0,0,0
(Apr. '00)
Up until now, few studies have been carried out on the long-term effectiveness of external beam radiation treatment for prostate cancer. A study reported in the May issue of \IUrology\i suggests patients treated with radiation therapy who are disease-free at four years will remain disease-free. Or, in the slightly more conservative language preferred in medical counselling, if a patient remains disease-free after four years, the chances of his cancer coming back are incredibly slim.
Patients in the study who were treated with external beam radiation alone or in combination with short-term hormone therapy (androgen deprivation) show little risk of failure after four years. Ninety-nine percent of patients who show no evidence of recurrence five years after their treatment remain disease-free. None of those patients who lasted six years without a recurrence has since shown any sign of the disease.
If PSA (Prostate-Specific Antigen) levels rise, indicating that radiation treatment has failed, this happens in the first three to four years. As well, if radiation treatment fails, it happens earlier for men who had a higher PSA level before undergoing radiation therapy, justifying the normal medical assumption that higher PSA levels indicate a poorer prognosis.
The risk is greatest between 12 and 36 months, tapering to a low rate of failure at four years. Treatment for men with an initial PSA of up to 10 failed at a median of 28 months. Treatment for men with an initial PSA of 10 to 19.9 failed at a median of 25 months, and treatment for men with an initial PSA of 20 or higher failed at a median of 22 months.
There is a different pattern of failure when external beam radiation is combined with hormone therapy, but the time frame is similar for attaining a low risk of treatment failure. The highest risk of failure occurs immediately following treatment, but declines to a low risk of failure at 48 months.
Key name: Alexandra L. Hanlon.
See also: \JProstate cancer: a special report\j, September 1999.
#
"Peter Duesberg at it again",1471,0,0,0
(Apr. '00)
The history of science is full of stories of brave scientists who went it alone, taking a stand against the accepted view, like Galileo arguing for a heliocentric system of planets, or Wegener contending with scientific ridicule as he argued for something very similar to what we now call plate tectonics.
Most of the tales we read of heroic mavericks triumphing over the odds and having the last laugh are mere concoctions, hero-worshipping fictions with no validity whatsoever. Nobody laughed at Columbus for believing the world was round, for all educated people already knew that it was. Nor did any serious physicist laugh at Einstein: many had their reservations about his ideas, but they did not laugh at him.
Nor did they laugh at Newton, or most of the other scientists who changed the world by their ideas and discoveries. Nonetheless, the image of the lone scientist, derided by the mob and finally emerging triumphant when his (it is always 'his' in these myths) theories are vindicated, is a powerful one, and perhaps a dangerous one. It is also the image that, directly or indirectly, lends comfort to the lonely maverick who takes a viewpoint opposed to the standard version. Most of the time, they fade away unrecognised and unvindicated, but science needs the maverick minds that will question the standard wisdom, so long as they do no damage with their questioning.
The American molecular biologist Peter Duesberg is better known right now for his claim that HIV, the human immunodeficiency virus, is not the cause of AIDS. Soon, though, his reputation may rely more on his new controversial claim that cancer is not caused by a series of genetic mutations that drive a cell into wild, uncontrolled growth. Instead, says Duesberg, cancers result from aneuploidy, the disruption of the normal number of chromosomes in a cell, usually by the duplication of one or more chromosomes.
At one level, there may seem to be a logical case for his view, as you would expect from a highly competent scientist like Duesberg, and it is the logic, plus his competence, that allows him to argue his notions in the \IProceedings of the National Academy of Sciences\i. So Duesberg's views are not so much laughed at as looked at with considerable doubt by the majority of scientists, even as they sift through the evidence. Open minds are the order of the day on this issue, because the claim does no harm, may do some good, and it is open to being tested.
Aneuploidy is found in nearly every solid cancer studied to date, but it has always been considered a side effect of cancer, not the cause itself. Duesberg argues that if aneuploidy started the cancer rolling, rather than the other way around, this would explain many aspects of cancer that the genetic mutation theory cannot. He says the weakness of the mutation theory of cancer is that no one has successfully turned a normal human cell into a cancer cell by inserting mutated genes. Such a demonstration would definitively prove that mutations cause cancer, but it has yet to happen. "No one has found, even once, a combination of genes from any cancer that when inserted into normal cells turns them into cancer cells," he says.
A paper published in \INature\i in 1999 claimed that Robert Weinberg and his colleagues at MIT's Whitehead Institute had done just that, although Duesberg questions this. Weinberg and his colleagues said they had inserted two oncogenes, cancer-causing genes, into normal human cells and generated cancerous cells, showing that these genetic mutations "suffice to create a human tumor cell." Duesberg, on the other hand, requested and obtained samples of the cancer cells from Weinberg and found that all of them also had numerical chromosome alterations, or aneuploidy. Given this, the cause could have been either aneuploidy or genetic mutation, he argues.
The aneuploidy theory is not new: Duesberg has been pointing out problems with the genetic mutation theory of cancer since the mid-1970s. But now that this theory has become almost a dogma of medicine, he has been arguing more loudly, and gaining a certain amount of support, which \Imight\i lead, in the end, to a paradigm shift. The idea may also die out, but whatever happens will be based on the evidence, as assessed by rational scientists.
If Duesberg is right, a number of things will change, among them the form of cancer screening. Researchers would stop looking for mutations in biopsied cells, and start looking for aneuploidy as a sign of early cancer instead. As a test, a group of physicians at UC San Francisco is now screening for developing melanomas by looking for chromosomal anomalies in skin cells.
Another telling argument put forward by Duesberg is that nearly half of all cancer-causing chemicals appear not to cause mutations at all. Asbestos, arsenic, some hormones, urethane, nickel and polycyclic aromatic hydrocarbons are all known to induce cancer in humans, but none of them is a mutagen; however, according to the mutation hypothesis, all carcinogens should be mutagenic. Equally, if genetic mutations cause cancer, then the cancer should arise immediately after a mutation. Instead, cancers appear decades after exposure to a carcinogen.
Carcinogens all have a long latency period, or lead-time, before they cause the actual cancer. Duesberg comments that "Scientists argue that this is because cancer is a multi-step, epigenetic phenomenon, but that exactly describes aneuploidy." One of his more interesting and significant arguments is that the cellular disruption caused by having too many copies of an entire chromosome is much greater than that expected from a handful of mutated genes, and that this level of disruption is much more likely to affect the many cellular processes known to be fouled up in cancerous cells.
Together with David Rasnick, a visiting scientist in his laboratory, Duesberg has been looking at the evidence, and reported in 1999 that cancer cells exhibit massive overproduction and underproduction of a large number of proteins. They found thousands of proteins whose expression was doubled in cancer cells. If this finding is confirmed, it would certainly favor the view that aneuploidy is present in the cancerous cells being studied; however, this is not yet proof that the aneuploidy was the cause of the cancer rather than the other way around.
In 1998, \INature Biotechnology\i science editor Harvey Bialy noted that the only solid tumors whose cells contain a normal number of chromosomes are those very rare ones caused by retroviruses. Otherwise, some 5,000 known types of solid tumors show chromosome disruptions. Once again, there is the risk that an association may be mistaken for a cause, but this caution applies to both sides in the debate, who both tend to treat associations as causes.
On the Duesberg model, carcinogens enter cells and disrupt the spindle apparatus that drags chromosomes apart during normal mitotic cell division. As a result, the division is uneven, and some of the cells end up with excess chromosomes, leading to abnormalities. A single extra chromosome 21 causes \JDown's syndrome\j, while an extra copy of most other chromosomes is fatal when it is found throughout the body. Perhaps significantly, Down's syndrome carries with it a 100-fold increased risk of leukemia. This condition, where there is a single extra chromosome, is called a trisomy, but few trisomies are observed in living organisms, because most are fatal and lead to spontaneous abortion at an early stage in the development of a fetus.
Just as most trisomies are fatal, Duesberg believes that many aneuploidies are fatal as well, so they do not lead to cancers. But occasionally, he says, the chromosome abnormalities will generate a cell that survives better than the normal cell, and it will grow into a cancer. Duesberg and his colleagues have shown that, of the non-mutagenic carcinogens, most of them do in fact cause aneuploidy, even though they do not create genetic mutations.
One problem commonly found by the maverick scientist who goes against the flow is getting funding to test alternative theories, because they do not accord with today's received wisdom. If Duesberg is later proved wrong, this conservatism in funding will be seen by the very few who recall it as a wise action; however, if he turns out to be right, the decision-makers will be ridiculed for their blind adherence to a scheme which, in hindsight, is easily seen to be flawed.
Fifty years ago, aneuploidy was seriously considered a possible cause of cancer, but now it does not even rate a mention in the textbooks. According to Duesberg, it was dismissed too soon; according to others, it should be given a decent burial and left alone. On balance, the aneuploidy theory seems to raise some interesting questions and explain some odd effects, but whether it is the cause of cancer or an effect of cancer is still to be determined.
Meanwhile, Duesberg's old claim to fame, his view on AIDS and HIV, continues to wreak havoc, but truth is also suffering. Reports in early May claimed that South African president Thabo Mbeki had barred the use of AZT to treat pregnant women because Mbeki, lacking any medical training, and flying in the face of clear evidence, had accepted Duesberg's claim that AZT causes AIDS, rather than preventing HIV from causing AIDS.
In fact, Duesberg blames recreational drugs, not AZT, for AIDS, and Mbeki, as a result of what he read "on the Net", has concluded that AZT is toxic. It is also expensive, and with soaring rape rates in South Africa, the South African government is under pressure to provide AZT to rape victims and pregnant women. So the direct cause of the problem here is information posted on the World Wide Web by people who may or may not be qualified to comment and which appears to be in direct conflict with the available evidence. There can be no doubt at all that Mbeki has been inspired by the ideas set out by Duesberg.
The direct result of this failure to use AZT will be an increase in the number of children acquiring an HIV infection from their mothers, and eventually dying of the disease. (See \JSouth African AIDS panel meets\j, this month.)
We need our maverick theories, because sometimes they turn out to be right, or partly right, after all. It is unfortunate when a half-baked maverick theory is taken up by people and declared to be correct, no matter what evidence is available. Science does not flow from political power or votes, and that is why Duesberg is free to argue his theories. But if the cost is innocent deaths, that may be a step too far.
#
"South African AIDS panel meets",1472,0,0,0
(Apr. '00)
South Africa's national AIDS advisory panel, which was to meet in early May, includes Peter Duesberg and David Rasnick (see \JPeter Duesberg at it again\j, this month), as well as other 'AIDS skeptics' from the US and Canada, and a number of more mainstream researchers like Luc Montagnier, the co-discoverer of the HIV-AIDS link.
More than two-thirds of the people who carry HIV, the virus that most scientists associate with AIDS, live in Africa. About 4 million South Africans, or about 10% of the population, are HIV-positive and will die within a decade unless a cure is found. Jerry Coovadia, a South African medical researcher and chairman of the world's Thirteenth International AIDS Conference, set for July in Durban, commented as the panel was convening: "We don't need to prove any longer whether HIV causes AIDS. That's an old and discredited debate. What this panel is doing is cruel, because it deflects our attention from the real urgent problem of finding real treatments for the disease."
This provoked strong responses. For example, Dr. Mark Wainberg, president of the International AIDS Society, said there is little doubt that the statements of HIV deniers have caused "countless" individuals to contract the deadly immunodeficiency virus. "People have died as a consequence of the Peter Duesbergs of this world," he said, criticising the claim that HIV is harmless and that AIDS is caused by drugs, including those used to treat HIV.
What really angers public-health advocates most is the dissidents' view that condoms and safe sex are "irrelevant." Dr. Wainberg said he believes in free speech, but that limits are justified when speech grossly undermines public-health efforts. "If we could succeed and lock a couple of these guys up, I guarantee you the HIV-denier movement would die pretty darn quickly," he said in a speech at the closing of the annual conference of the Canadian Association for HIV Research.
Coovadia also acted to bar ACTUP (the AIDS Coalition to Unleash Power) from exhibiting at the Durban conference in July, which led to ACTUP writing to Mbeki in protest, complaining that "ACT UP San Francisco feels that if the 13th International AIDS Conference is not committed to open scientific debate and the free exchange of ideas it is nothing but a farce and a source of embarrassment to the hosting nation."
This story is by no means over yet.
#
"Getting drugs into cells",1473,0,0,0
(Apr. '00)
Over the eons, cells have evolved powerful mechanisms to take in the molecules they need, while refusing entry to all others. Even drugs that might save the cell from disease are refused admittance in many cases, forcing scientists to engage in trickery to get the necessary drugs into the cell, generally by disguising the drug molecule in some way. The problem is becoming even more important because gene therapy in the future will require synthetic DNA to be carried into the cell.
One method described in a recent issue of the \IJournal of the American Chemical Society\i is likened by the authors to enclosing the molecule in an umbrella - on a molecular scale. A molecule in the watery solution outside a cell has trouble getting to the watery cytoplasm inside the cell because water-soluble substances are not compatible with the fabric of the cell membrane.
Cell membranes are made up of back-to-back layers of chain-like fatty molecules called lipids. Each lipid molecule has a hydrophilic ('water-loving') end and a hydrophobic ('water-hating') end. The membrane is made up of a double layer of these lipid molecules, with the hydrophilic heads on the surface, and the hydrophobic tails on the inside. Peptides and nucleic acids cannot easily enter this 'oily' inner region of the membrane because these water-soluble molecules don't dissolve in oily solvents.
Past methods for introducing molecules into cells have included hollow, channel-like molecules inserted into the cell wall which can act like a pipeline to carry small molecules through, or using lipid-like molecules that can carry DNA past the oily region. Some viruses have also been engineered so that the natural mechanism they evolved for injecting their own nucleic acids into cells is conscripted to inject important drug molecules instead. Another possibility involves giving the molecules an oil-soluble coating to get them through the membrane, but then they need to be able to shed it on the other side.
This brings us to the 'molecular umbrella', which closes around the molecule, protecting it as it passes through the membrane, but opening up to release the molecule once it has crossed over. The difference is that a normal umbrella protects against water when it is open, while this 'umbrella' offers no protection against water or anything else when it is open, but it offers protection against the lipid layer of the membrane when it is closed.
To create an umbrella that will open and shut automatically in watery and oily environments, the researchers gave their umbrella a water-loving inner face and a water-fearing (oil-loving) outer face. The canopy is two hydrocarbon molecules shaped rather like gently curved leaves, on the concave faces of which are attached water-loving molecular groups. The 'leaves' are joined together at one tip by a short linking chain, the 'hub' of the umbrella.
The peptide to be carried through is attached to this hub, from which it hangs like the handle of the umbrella. In water the assembly floats with the 'leaves' open, exposing the water-loving faces. But in an oily medium like the inside of a cell membrane, the leaves close around the peptide so that only the outer hydrocarbon faces show.
The hydrocarbons make the closed molecular umbrella compatible with the lipid tails, which means the molecule can pass freely through the cell wall. On the far side, the umbrella opens when it comes into contact with the cytoplasm. The bond holding the peptide to the umbrella's canopy can then be cut to release the 'drug' inside the cell.
The researchers have shown that this system works for transporting peptide 'drug mimics' into 'model cells' called vesicles, cell-sized synthetic 'bubbles' with lipid walls. They are now testing the system with genuine peptide and nucleic-acid drugs.
Key names: V. Janout, C. Di Giorgio and S. R. Regen.
#
"Gulf War syndrome dizziness linked to nerve gas",1474,0,0,0
(Apr. '00)
The argument about what causes 'Gulf War syndrome' continues, with scientists still undecided about whether the effect is real or not and, if it \Iis\i real, what causes it. Many things have been listed as possible suspects, including stress, pollutants from the burning of oil wells, unspecified biological weapons, pesticides, and either leaking nerve gases or the drugs given in advance as protection against nerve-gas attacks that never happened.
The University of Texas Southwestern Medical Center at Dallas has long housed a major research group dealing with the syndrome. In a paper published in the March issue of \IOtolaryngology-Head and Neck Surgery\i, UT researchers have reported that, in medical tests analyzing brain function, Gulf War veterans who complain of dizziness showed test results similar to those of victims of the 1995 Tokyo subway nerve-gas attack.
Using similar medical tests, the researchers found evidence of brain-stem damage in Gulf War veterans similar to that seen in victims of the sarin nerve-gas attack in Japan, which injured nearly 6,000 people. In other words, it now seems more likely that the veterans were exposed to chemicals and nerve agents in the Gulf War, which tends to rule out stress as a cause. A 1996 (US) Department of Veterans Affairs study published in the \INew England Journal of Medicine\i revealed that Gulf War veterans were 50% more likely to die in a motor vehicle accident than were military personnel not sent to the Gulf War. This new evidence seems to offer an explanation for why Gulf War veterans face a greater risk of death in car accidents.
The highly specialized tests measured eye movements and electrical brain activity to assess the brain and middle-ear function of the veterans, whose condition involves dizziness and loss of balance, among other symptoms which, though a little vague individually, seem to come together as a syndrome.
"We find that ill Gulf War veterans don't usually mention their dizzy spells because they are not as incapacitating as the severe fatigue, body pain and memory problems," Robert Haley, one of the authors says. "After several in-depth interviews with these veterans, we realized the problem. We tried these specialized tests, and the tests located the cause of the symptoms in the brain."
In a groundbreaking paper in the January 1997 issue of the \IJournal of the American Medical Association\i, Haley identified three distinct syndromes, each linked to various Gulf War chemical exposures, including pesticides, insect repellants, chemical-warfare agents and pyridostigmine bromide (PB) tablets soldiers took to combat the effects of nerve gas.
Then in June 1999, Haley and fellow researchers published a paper showing that genetically low levels of an enzyme, type Q paraoxonase, may have caused some Gulf War soldiers to be more sensitive to low levels of nerve gas. This work is described in \JA genetic cause for Gulf War syndrome?\j, June 1999.
And in an abstract presented at the November 1999 meeting of the Radiological Society of North America, Haley and colleagues presented evidence found through sensitive brain scans that shows brain damage in sick Gulf War veterans. Haley is currently working on a study of amyotrophic lateral sclerosis (ALS), or Lou Gehrig's disease, in Gulf War veterans.
#
"Keeping antibiotics out of GM crops",1475,0,0,0
(Apr. '00)
When an organism is being given new genes, the process of genetic manipulation often involves adding two genes. One of these is a desirable gene, while the other is a gene that helps plants to resist some poison or other. While this matter is fully explained in \JAdding new genes\j, July 1999, the purpose of the second gene is to allow scientists to weed out the 'failures', the organisms which have not been genetically altered.
The most common second gene is one for resistance to some minor antibiotic, since a number of these genes exist in nature and they have been fully studied. The process involves adding a desirable gene and the gene for antibiotic resistance and, after a suitable wait for the genes to 'take', treating the results with an antibiotic. This kills off those organisms which have not received the gene for antibiotic resistance, which means they probably do not have the other gene either. The resistance gene serves as a flag to identify successful conversions.
The antibiotics used are ones that are resisted by a large number of bacteria in the wild, because the world's bacteria have been subjected to these antibiotics over a long period of time, while rice plants, potato plants and young trees have no such natural resistance. If any bacterium needs to gain the gene for resistance, it can do so very easily by gene transfer from other bacteria, a process which has been shown to happen readily, even between bacteria that are only distant relatives.
There have been no known cases of bacteria obtaining genes from higher organisms such as plants, and the prospect of it happening is somewhere between highly unlikely and impossible - certainly far less likely than the chance of the gene being gathered from some other bacterium in the wild. But like so many things in science, we cannot assert that the prospect is absolutely impossible, and this careful use of scientific reserve can be seized upon by opponents.
At this point, politics and public relations become more important than science, because protesters have already interfered with development work, claiming that the practice is dangerous (see \JThe activists and the poplars\j, July 1999). While there is no identifiable scientific need to remove the resistance gene from crops, there is a clear PR reason to do so, and doing so can do no scientific harm.
So, to avoid this sort of protest and vandalism, scientists would be well advised to make sure that the modified plants do not carry any extraneous genes. While the likelihood of any risk from antibiotic trees is extremely low, there is always the chance that there \Imight\i be a problem of some sort. Therefore, applying the precautionary principle (which can be simply stated as 'better safe than sorry'), scientists in Leeds reported in \INature Biotechnology\i recently that they have now developed a way of clearing these unneeded marker genes from GM crops before they leave the laboratory.
What the group has done is both elegant and clever, since it involves the plants actually cleaning up after the researchers. They now have a strain of GM tobacco (\INicotiana tabacum\i) plants that neatly snip out these surplus genes from their own genomes. The technique relies on DNA's strong self-attraction. DNA regularly splits and then recombines with itself at a different place, and certain 'homologous' regions where the base sequences are very similar are more likely to get together.
The team created a DNA string, or 'construct', containing an antibiotic-resistance gene placed between two homologous sections. The antibiotic involved was kanamycin, a common tool in GM work, and an effector gene (that is, one that was needed to improve the crop) was placed outside the homologous 'bookends'.
When these constructs are successfully spliced in, kanamycin resistance develops, but a few generations down the line, the resistance disappears in about half the plants as the gene is deleted, with a few of the plants also losing the desirable effector gene. This means that the deletion is a bit 'hit and miss', but since the various strains can easily be tested, only the tobacco plants which have lost their kanamycin resistance would be used in the field as crops.
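To make the logic concrete, here is a minimal sketch in Python of the excision scheme just described. Everything in it - the sequences, the recombination probability, the number of generations - is an invented placeholder rather than the Leeds group's actual construct; the point is simply to show why recombination between two identical 'bookends' deletes whatever lies between them, and why the outcome is hit and miss across a population.

# Toy model of marker-gene excision by recombination between direct repeats.
# All sequences and probabilities are invented placeholders.
import random

HOMOLOGY = "ATTGCCGTAGGCATT"         # stand-in for one homologous 'bookend'
MARKER   = "<kanamycin-resistance>"  # selectable marker between the repeats
EFFECTOR = "<effector-gene>"         # the desirable gene, outside the repeats

def build_construct():
    # Effector outside the 'bookends'; marker flanked by two identical copies.
    return EFFECTOR + HOMOLOGY + MARKER + HOMOLOGY

def next_generation(genome, p_recombine=0.3):
    # With some probability, recombination between the two repeats loops out
    # everything between them, leaving a single copy of the repeat behind.
    first, last = genome.find(HOMOLOGY), genome.rfind(HOMOLOGY)
    if first != -1 and last > first and random.random() < p_recombine:
        genome = genome[:first] + genome[last:]
    return genome

random.seed(1)
plants = [build_construct() for _ in range(10)]
for _ in range(3):                   # a few generations down the line
    plants = [next_generation(p) for p in plants]

# Field candidates are the plants that have lost the marker. (The real
# process can occasionally delete the effector gene too; for simplicity,
# this sketch does not model that.)
keepers = [p for p in plants if MARKER not in p]
print(len(keepers), "of 10 plants have lost the kanamycin-resistance marker")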
Tobacco plants have long been a favourite for genetic research, since they grow easily, but the researchers believe there " . . . is no general reason why this technique shouldn't work with all plant types that GM technology has been applied to, though there could be specific problems with individual species."
While this is not the first way of removing antibiotic-resistance marker genes to be developed, it is quicker, simpler and more convenient than anything else offered so far. Previous techniques required either the addition of another foreign gene to the plant to express æhelperÆ proteins that induce DNA deletion, or cross breeding plants and selecting those which have received the effector gene but not the antibiotic resistance gene.
Key names: Peter Meyer, Charles Scutt.
#
"Rice genome complete",1476,0,0,0
(Apr. '00)
In news that will concern opponents of genetic manipulation, researchers sponsored by the Monsanto Company have produced a working draft of the rice plant genome. The good news is that, spurred in no small part by the criticism aimed at the company by opponents, Monsanto was quick to establish its credentials as a good world citizen in this matter, making the data freely available to the world.
One or two qualifiers are needed here. First, the draft map is complete, and can be understood in its totality, but there are small pieces missing, like a word or two missing from some paragraphs. But according to Gregory G. Mahairas, who headed the project, with this information ". . . you can read and start to understand the entire book of life for rice." Second, Monsanto has agreed to share the rice genome sequencing data with members of the International Rice Genome Sequencing Project (IRGSP), a consortium established to sequence the entire genetic make-up of rice. In fact, the information has been transmitted to Japan's Ministry of Agriculture, Forestry and Fisheries (MAFF), which serves as the lead agency of the IRGSP, but a MAFF spokesman indicated that it will be shared with all members of the IRGSP. To all intents and purposes, the genome is publicly available where it is needed.
So the potential now exists for even greater steps to improve the production of rice, a vital food source for half of the world's population, along the lines of work described in \JNew rice strains and vitamin A and iron deficiency\j, August 1999 and \JModified rice could end food shortages\j, March 2000.
Rice is important both because it has the smallest genome of the major cereals and because it is a model species for learning about traits such as yield, hybrid vigor, and single and multigenic disease resistance for all grass plants, including wheat and corn. It is the first plant to be mapped in working draft form, and deciphering the genetic code of rice is expected to lead to development of new varieties of rice that will produce greater yields, be more resistant to pests and disease, and grow in different types of climates and soils.
The Monsanto Company financed the research project that also tested a method for rapidly sequencing large genomes. The method had been described in an earlier scientific paper written by Leroy Hood, who was quoted as saying "It is gratifying to see the successful application of the theory we developed more than three years ago. It will reduce by several years the schedule for creating a complete detailed map of the rice genome."
Next up: the functional genomics phase of this project, when scientists begin to develop an understanding of what the genes really do.
Key names: Gregory G. Mahairas and Leroy Hood. (More than 200 other staff, 80 high-throughput DNA sequencers, robotic machines and powerful data processing computers were also involved in the project.)
Web information: this site is a good set of links for searchers wanting technical information about genomics: http://ars-genome.cornell.edu/other_sites.html, while the site at http://www3.ncbi.nlm.nih.gov/Entrez/Genome/main_genomes.html is an excellent resource for further information.
#
"Celera and the Genome Race that never was",1477,0,0,0
(Apr. '00)
Celera Genomics reported during April that it has finished sequencing the entire human genome. The news came in a surprise announcement at a US Congressional hearing that had been scheduled to discuss the future of the Human Genome Project. "We've finished the sequencing phase," Celera president Craig Venter said at the hearing. This means that the unofficial race between Celera and the public Human Genome Project is now over, several months ahead of Celera's own schedule. This, however, is just the raw genomic sequence: an assembled, annotated map of the genes is what needs to follow.
While the 'race' has been entirely unofficial and deniable by the losers, scientists and companies aligned with Celera have been praising the effort and the victory, while those associated with the HGP have been playing the achievement down, saying that it is a rather small step in a long process. The truth probably lies somewhere in the middle, and Celera's work will only begin to make sense when the mapping annotation is assembled and the fragments are linked together. The end result will be the location and mapping of all 3 billion units of a human being's DNA.
Then, once the fragments are pieced together, researchers will be able to use the information for genetic diagnosis and to develop gene therapies. Venter also said that Celera will have a complete annotated map by the end of the year, and that Celera will make the genome available to researchers who want to use it, despite repeated accusations that the company wants to restrict access to the information. At the same time, Celera plans to patent up to 500 genes. Celera filed for 6,500 patents on gene sequences last year, but now says it has only submitted preliminary paperwork on the genetic information and will only ask for patents on genes that may be significant for drug development.
To map the genome, Celera used what it says is the largest civilian supercomputer and their 'shotgun' technique (see \JFruit fly genome is published\j, March 2000, for details on the technique) to gather the sequences that make up the double helix of DNA. Operating more slowly, but with greater precision and certainty, the Human Genome Project says it has mapped 80% of the human genome, and will soon have a first draft of 90% of the human genome with four to five times coverage (redundant mapping for accuracy). It expects to have a final draft in 2002 or 2003.
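The value of 'four to five times coverage' is easier to appreciate with a little arithmetic. Under the classic Lander-Waterman model of random shotgun sequencing - a textbook result, not anything specific to Celera or the HGP - the expected fraction of a genome covered at c-fold redundancy is 1 - e^-c. The short Python sketch below applies that formula to a genome of 3 billion bases to show why even several-fold redundancy leaves gaps to be closed.

# Expected coverage under the Lander-Waterman model: at c-fold redundancy,
# the chance that any given base is sequenced at least once is 1 - exp(-c).
import math

GENOME_SIZE = 3_000_000_000   # roughly the human genome, in bases

for c in (1, 2, 4, 5, 8):
    covered = 1 - math.exp(-c)
    missed = (1 - covered) * GENOME_SIZE
    print(f"{c}x coverage: {covered:.4%} of bases covered, "
          f"~{missed:,.0f} bases still missed")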
#
"Three human chromosomes complete",1478,0,0,0
(Apr. '00)
With Celera claiming the whole genome during April, the announcement that researchers at the Department of Energy's Joint Genome Institute in Walnut Creek, California, have decoded in draft form the genetic information on human chromosomes 5, 16 and 19 went missing from most of the news pages. These chromosomes contain an estimated 10,000 to 15,000 genes, including those whose defects may lead to genetically linked diseases such as certain forms of kidney disease, prostate and colorectal cancer, leukemia, hypertension, diabetes and atherosclerosis.
The US Secretary of Energy, Bill Richardson, applauded the work at the 25th Annual American Association for the Advancement of Science Colloquium on Science and Technology Policy in Washington, D.C. "Three chapters in the reference book of human life are nearly complete," he said. "Scientists can already mine this treasure trove of information for the advances it may bring in our basic understanding of life as well as applications such as diagnosing, treating and eventually preventing disease."
The three chromosomes contain more than 300 million base pairs, or an estimated 11% of the total human genome. So far, the researchers have sequenced a working draft of the three chromosomes, leaving some scattered gaps in less gene-rich areas. Further research will continue to improve both the completeness and accuracy of the genetic information as the final sequence of the chromosomes is developed over the next several years, but already researchers have been able to provide a summary of the 'contents' of each chromosome:
\BChromosome 5\b contains an estimated 194 million bases, or about 6% of the human genome. Disease-linked genes on this chromosome include those for colorectal cancer, basal cell carcinoma, acute myelogenous leukemia, salt-resistant hypertension and a type of dwarfism.
\BChromosome 16\b contains about 98 million bases, or about 3% of the human genome. Studies have implicated genes on this chromosome in the development of breast and prostate cancer, Crohn's disease and adult polycystic kidney disease, which affects an estimated five million people worldwide. Half the affected people require dialysis or kidney transplant.
\BChromosome 19\b contains 60 million bases, or about 2% of the human genome. Genes involved in repair of DNA damage as well as those associated with atherosclerosis and diabetes mellitus are located on chromosome 19.
The information on chromosomes 5, 16 and 19 is freely available without restrictions to researchers in academia and industry through the public database, GenBank. Details about the chromosomes' draft sequence are expected to be published in mid-2000 as part of scientific articles describing the entire draft sequence. Chromosome 22 is also complete (see \JHuman chromosome 22 sequenced\j, December 1999), and the rest is coming fast, perhaps even faster, given the announcement by Celera (see \JCelera and the Genome Race that never was\j, this month). See also \JHuman chromosome 21 update\j, this month.
#
"Human chromosome 21 update",1479,0,0,0
\BLate news\b: on May 8, the journal \INature\i announced that a team of 62 scientists had completed the mapping of human chromosome 21. It turns out to have around 225 discernible genes, many fewer than were anticipated from what \INature\i calls 'in silico' calculations - a play on the more usual scientific terms \Iin vivo\i and \Iin vitro\i, and referring to the output of computers. The chromosome contains around 1% of the entire human genome on its long arm, while the short arm of the chromosome appears likely to be mainly 'junk DNA' (but see \JMouse junk DNA has uses\j, this month). If we consider this to be a typical chromosome, extrapolation suggests that the total number of genes in humans could be as low as 40,000, rather than the normal estimate of 100,000.
As we shall explain in more detail next month (\JChromosome 21 published\j, May 2000), there are good reasons why we should wait before getting too excited about the low value obtained by extrapolation. It is contradicted by a comparison with chromosome 22, which is almost the same size as chromosome 21, but carries 545 genes. However, even that needs to be carefully qualified because different researchers use different criteria to identify genes on the chromosomes they are studying.
#
"Mouse junk DNA has uses",1480,0,0,0
(Apr. '00)
One issue to keep in mind about recent genomics work is that much of it is based on the assumption that there are two kinds of DNA: coding DNA, about 5% of the DNA in humans, and widely interpreted as 'the genes', and non-coding DNA, widely interpreted as 'junk'. A \IScience\i paper in early April reveals that the comparative analysis techniques used to identify DNA sequences coding for genes in mice and humans can also be used to identify sequences that regulate the 'expression', or activation, of genes.
By comparing human and mouse sequences scientists can now identify those segments of the genome that contain information which instructs surrounding genes on when and where they are to be active. Identifying these sorts of regulatory sequences using classical biological approaches is normally labor-intensive and difficult, so this is a most useful discovery.
The key to the discovery lies in evolutionary conservation of non-coding DNA sequences that play an important role in regulating gene expression, just as organisms have conserved the DNA sequences that code for equivalent genes across different species. It has probably been 70 to 90 million years since the mouse and human lines diverged, so if evolution has conserved a sequence over that huge period of time, it most probably is useful genetic material of some sort, rather than junk. In other words, it either determines the structure of a protein coded for by a gene or it regulates gene expression in some way. Even if the function is unclear, these are very clearly \Bnon-junk\b pieces of 'junk'.
It has long been known that some of these sequences have important duties, including the regulation of gene expression. Scientists also believe that these non-coding sequences have been conserved between related species such as mice and humans, just like sequences that code for genes. To search for conserved non-coding sequences (CNSs), the researchers examined a stretch of DNA about a million base-pairs in length from mice and humans that contained the same 23 genes in both species, including three interleukin genes (IL-4, IL-13, and IL-5). Previous studies indicated that these interleukin genes are similarly regulated and that their regulatory sequences may be conserved in mice and humans.
The Berkeley researchers looked for CNSs that were at least 70% identical in both species over at least 100 base-pairs. Of the 90 CNSs they identified that met these criteria, most were also present in DNA from a cow, a dog, a pig, a rabbit and a rat. Their presence in these other mammals suggests they have been conserved because they perform an important biological function.
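The search criterion itself is simple enough to express in a few lines of code. The Python sketch below is a toy illustration, not the Berkeley group's actual pipeline: it scans a pair of pre-aligned sequences for windows at least 100 bases long in which at least 70% of positions match, and merges overlapping windows into maximal conserved segments.

# Toy scan for conserved sequences: windows of >= 100 aligned bases with
# >= 70% identity, merged into maximal conserved segments.
def conserved_segments(seq_a, seq_b, min_len=100, min_identity=0.70):
    assert len(seq_a) == len(seq_b), "sequences must be pre-aligned"
    match = [a == b for a, b in zip(seq_a, seq_b)]
    segments = []
    for start in range(len(match) - min_len + 1):
        if sum(match[start:start + min_len]) / min_len >= min_identity:
            end = start + min_len
            if segments and start <= segments[-1][1]:
                segments[-1] = (segments[-1][0], end)   # extend last segment
            else:
                segments.append((start, end))
    return segments

# A made-up 'alignment' that is 75% identical at every position:
human_like = "ACGT" * 50
mouse_like = "ACGA" * 50
print(conserved_segments(human_like, mouse_like))   # -> [(0, 200)]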
The largest of these sequences, CNS-1, contains 401 base-pairs and is located between IL-4 and IL-13. Using multiple lines of transgenic mice, the researchers have concluded that CNS-1 is a 'coordinate regulator' of the three interleukin genes, activating them by modulating the structure of chromatin. There is, the authors say, no standard in-vitro assay that could have been used to make this determination. Only computational analysis of mouse and human sequences could reveal the interleukin regulatory element CNS-1, and it points to a great deal of interesting analysis which can be launched once the entire genomes of mice and humans are sequenced.
Scientists are often asked what the use of their research is and, at the time, they may find it difficult to answer. Developments like this, converting the sequence information coming from the genome program into meaningful biology, could never have been predicted at the start of the rush of genomic analysis that has happened in the past decade.
Key names: Edward Rubin, Kelly Frazer, Gabriela Loots, Cathy Blankespoor, Richard Locksley, Zhi-En Wang, and Webb Miller.
#
"The tobacco industry and second-hand smoke",1481,0,0,0
(Apr. '00)
An April paper in \IThe Lancet\i claims that an unprecedented tobacco industry campaign undermined a report on second-hand smoke and cancer. Two researchers at the University of California at San Francisco, Elisa Ong and Stanton Glantz, say that a 10-year study conducted by the International Agency for Research on Cancer (IARC), examining the links between second-hand smoke and cancer, was subverted by what they call a misinformation campaign coordinated by the tobacco industry, and that the campaign resulted in misleading media reports of the European scientific study even before it was published.
Glantz, a long-time scholar and critic of tobacco industry strategies and professor of medicine at UCSF, commented on the Internet: "The extent of tobacco industry money and effort spent to discredit a single study is unprecedented". Ong is a medical student at Stanford University who collaborated with Glantz.
The reason the industry was so concerned about the paper, Glantz believes, is that while scientific reports on second-hand smoke had already stimulated legislation on clean indoor air in the U.S., European countries have been slower to change. Tobacco industry strategists were apparently trying to head off the possibility of sentiment growing for similar restrictions in their European markets, and that is why they hit this report with all they had, he believes. Glantz adds: "There seems to be little regard for the truth in the information they tried to spread."
In their paper, Ong and Glantz describe how the tobacco industry worked to undermine the conclusions and potential impact of the largest European study of passive smoking, which was conducted by IARC, the research arm of the World Health Organisation. They say the Philip Morris tobacco company spearheaded a three-pronged inter-industry strategy in the mid 1990s to subvert IARC's work. The scientific strategy attempted to undercut IARC's research and to develop industry-directed research to counter the anticipated findings; the communications strategy planned to shape opinion by manipulating the media and the public; and the government strategy sought to prevent increased smoking restrictions. The IARC scientific study cost roughly $2 million over 10 years, while Philip Morris planned to spend $2 million in one year alone and up to $4 million on research, the authors report.
Part of Philip Morris' strategy was to use consultants sympathetic to the tobacco industry who were asked to find out more about the IARC report. Those consultants did not always disclose their industry links while seeking information from IARC investigators. The documents and interviews suggest that the tobacco industry continues to conduct a sophisticated campaign against conclusions that second-hand smoke causes lung cancer and other diseases, "subverting normal scientific processes," the authors conclude.
The IARC study demonstrated a 16% increase in the risk of lung cancer for non-smokers exposed to second-hand smoke, a result consistent with earlier studies. Although the results were clear and comparable to those found by others, the number of people in the study was too small for the result to reach statistical significance (at the 95% confidence level).
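To see how a genuine 16% excess risk can still miss significance, consider the standard 95% confidence interval for a relative risk, computed on the log scale. The Python sketch below uses invented counts chosen only to give a relative risk of 1.16; they are not the IARC study's actual figures.

# 95% confidence interval for a relative risk, using the usual normal
# approximation on the log scale. The counts are invented for illustration.
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)   # standard error of log(RR)
    return rr, math.exp(math.log(rr) - z*se), math.exp(math.log(rr) + z*se)

rr, lo, hi = relative_risk_ci(a=58, n1=500, b=50, n2=500)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# Prints RR = 1.16 with an interval of roughly (0.81, 1.66). Because the
# interval straddles 1.0, a real 16% excess is not statistically
# significant with groups this small.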
As Ong and Glantz have documented, the tobacco industry exploited the degree of statistical uncertainty by providing selected newspapers with the misinformation that the study had demonstrated "no risk" of cancer from second-hand smoking, which was clearly not the study's finding. These incorrect conclusions were published in the British press before the scientific study was published and at the same time as an official British report reviewing second-hand smoke's health effects was released.
#
"Skeleton from royal tomb is not King Philip II",1482,0,0,0
(Apr. '00)
A skeleton, previously thought to be that of King \JPhilip II (of Macedon)\j, the military leader better known as the father of \JAlexander the Great\j, seems likely now to be no more exciting than one of Alexander's half brothers, a much less prominent figure in the royal line-up of ancient Greece, according to a paper by Antonis Bartsiokas which appeared in \IScience\i in mid-April.
The undoubtedly royal tomb was found in 1977 at \JVergina\j, the site of the ancient capital of \JMacedon\j, Aigai. Most of the other Vergina tombs have long since been thoroughly looted, but this one contained a cremated male and female skeleton and a variety of royal Macedonian artefacts.
The skeleton was originally identified as King Philip II because the skull seemed to show traces of an injury Philip is known to have suffered when an arrow sliced through his right eye. By default, the female was assumed to be the wife of Philip, who reigned in the years 359 to 336 BC. Philip is important mainly because he quelled the military and political turmoil in Macedonia and also took control of Athens and Thebes, thus laying the groundwork for his more famous son to conquer his own vast empire.
More recently, research has indicated that the artefacts in the royal tomb are from approximately 317 BC, a generation after Philip's assassination in 336 BC. Since Alexander the Great was buried in Egypt (see \JDeath of Alexander the Great\j, June, 1998), this would suggest that the male in the tomb was one of Alexander's successors, except for the evidence of the apparent wound in the skull.
Bartsiokas used a technique called macrophotography to study the skeleton in meticulous detail, and he has concluded that the marks on the bone are no more than normal anatomical quirks, accentuated by the effects of cremation and incorrect reconstruction. He used photographs of the bones magnified at the same scale as a conventional microscope lens to examine the sites of the supposed wounds in the skull.
Two marks on the roof of the skull's right eye socket have been offered in the past as evidence of Philip II's famous eye injury. One was a groove in the inner corner of the arch near the nose, which was interpreted to be an indentation caused by the arrowhead. The other feature was a bump closer to the center of the arch, thought to be a healed-over nick from the incoming arrow. According to Bartsiokas, both the groove and the bump are simply normal anatomical features of the eye socket.
The bump, he believes, is part of the supraorbital notch, an opening in the skull's frontal bone through which a bundle of nerves and blood vessels pass. Most people can feel this notch by pressing their fingers underneath the ridge of bone beneath the eyebrow.
Bartsiokas also rejects the theory that the skull's general asymmetry and a crack below the right eye socket provide evidence that an arrow had smashed into the cheekbone after hitting the eye. He believes these are not related to an injury, since there is no evidence of healing in the bone fabric. Rather, he suggests, the cheekbone probably cracked while being cremated and was later glued back together imperfectly. Incorrect reconstruction and the effects of cremation were also responsible for the skull's asymmetrical distortion, Bartsiokas claims.
So whose tomb was it? After Alexander died, the throne went to one of Philip's other sons, Alexander's half brother, Philip III Arrhidaeus. Arrhidaeus, who was probably mentally ill or physically disabled, was king in name only because Alexander's friends divided the empire among themselves. According to the ancient historian \JPlutarch\j, Alexander's jealous mother Olympias caused Arrhidaeus' condition by poisoning him at a young age.
According to tradition, Arrhidaeus' skeleton was cremated under somewhat unusual conditions after he was assassinated by Olympias or a conspirator. Historians think Arrhidaeus' successor, Cassander, later exhumed, cremated, and re-buried the skeleton as a gesture of honor intended to promote his own legitimacy as king. But as forensic scientists know, bones cremated 'dry' show key differences from bones cremated 'fleshed.' The 'dry' bones are only warped slightly if at all, and they have a few small, straight fractures. The 'fleshed' bones are affected by the retraction of the relatively fresh collagen part of the bone during cremation, which causes warping and curved fractures.
The bones in the royal tomb, Bartsiokas reports, were clearly burned dry, as Arrhidaeus' bones would have been. In this way, the anthropological evidence helps settle the question raised by the age of the artefacts in the tomb, says Bartsiokas. Further, because the skeleton in the tomb is probably King Philip III Arrhidaeus instead of King Philip II, many of the artefacts may have been inherited from Alexander the Great. Among the treasures were a gilded silver crown, an iron helmet, an elaborate ceremonial shield, and an iron and gold cuirass that closely resembles the one worn by Alexander in a famous Pompeii mosaic depicting him.
The chances of anybody ever finding the real King Philip II in the future seem slim, according to the experts. Excavation at Vergina does continue, but looters have long ago emptied the contents of most of the tombs.
#
"The causes of coronal mass ejections",1483,0,0,0
(Apr. '00)
Coronal mass ejections, or CMEs, are large-scale eruptions on the Sun. They can cause bright auroras, damage sensitive satellite instruments in space, and even disrupt power generation and transmission by inducing strong electric currents below the surface of the Earth.
A new source for some of these eruptions was reported in \IGeophysical Research Letters\i in mid-April. The authors report that shock waves launched from solar flares can cause CMEs elsewhere in the solar corona. While most CMEs come from structures directly above the eruptive flare, these less common CMEs come from off to one side of the flare.
The authors are both currently working at Japan's Institute of Space and Astronautical Science (ISAS) in Kanagawa, where they conducted their investigations using the Yohkoh ('sunbeam') satellite. Using X-ray images of the Sun, they found that very large hot loops of solar material sometimes connect sunspot regions in the Sun's northern and southern hemispheres. These loops, called interconnecting X-ray loops, can simply disappear suddenly, ejecting huge amounts of material.
They noted three such disappearances, on May 6, 8, and 9, 1998, and in each case the vanishing loops were followed by loop-like CMEs. Using images obtained with another instrument, the Large Angle Spectroscopic Coronagraph (LASCO) on the Solar and Heliospheric Observatory (SOHO) satellite, each CME event could be tracked far out into the corona.
The link between these disappearing loops and solar flares is novel and interesting, according to the report, and may shed some light on what causes the ejections. Solar flares push out large amounts of energy across the electromagnetic spectrum, while prominence eruptions are ejections of large masses of cool material suspended in the Sun's hot outer atmosphere, or corona. Some, but not all, prominence eruptions and CMEs are associated with flares.
Prominence eruptions and CMEs are clearly tied in to the X-ray brightenings seen in the corona below these ejections, but it is less certain which causes which - or whether both are caused by some other effect. If the causal linkage were known, scientists would know which is the more important phenomenon to study.
So it is significant that the flare is not located directly below the CME in the cases under study here. Rather, it is off to one side, in a sunspot region outside the structures that erupt to become part of the CME. In other words, the mass ejected in the CME does not come from the structures directly above the flare. From the Yohkoh X-ray images, Khan and Hudson were able to determine fairly accurately the timing of the disappearance of the interconnecting loops. The solar flare comes first, before the loops disappear, which means before the start of the coronal mass ejection.
Then, by comparing the X-ray data and simultaneous data from radio telescopes, they found evidence for shock waves. In every case they looked at, they found that the interconnecting loops disappear when the shocks cross their vicinity. So it now appears that, for these unusual CMEs at least, a shock wave generated by the flare crosses a large interconnecting loop, causing it to become unstable and erupt. The eruption then ejects hot X-ray material, which becomes a significant fraction of the coronal mass ejection.
Key names: Josef I. Khan and Hugh S. Hudson.
#
"BOOMERANG yields good returns",1484,0,0,0
(Apr. '00)
BOOMERANG, short for 'balloon observations of millimetric extragalactic radiation and geophysics', is a study of the \Jcosmic background radiation\j, now usually abbreviated to CMB, for Cosmic Microwave Background. In January 1999, the international BOOMERANG consortium sent a balloon aloft which circumnavigated the South Pole in just 10 and a half days, gathering the most detailed measurement yet made of the CMB. The results of the analysis of the data appeared in \INature\i in late April 2000.
The analysis reveals that the curvature of the universe is flat, neither positive nor negative, but this conclusion could only be reached after huge amounts of computing: like so much else in science, work like this relies heavily on computing power to extract the information buried in the data. During the flight around the South Pole, the BOOMERANG Long Duration Ballooning mission carried instruments which made close to one billion measurements of tiny variations in the temperature of the CMB across a wide area of the sky.
The trick is to build a map of the CMB's temperature fluctuations, and then use this to derive a 'power spectrum'. This is a curve that registers the strength of the fluctuations on different angular scales, and which, in the words of the scientists, " . . . contains information on such characteristics of the universe as its geometry and how much matter and energy it contains."
Andrew Jaffe, a member of the team, describes the calculation of the most likely power spectrum as " . . . a task that is conceptually simple but computationally challenging." So challenging, in fact, that it took a 696-node Cray T3E supercomputer and a software package that Julian Borrill developed, called MADCAP ('microwave anisotropy dataset computational analysis package') about three weeks to complete. The same task would have required some 50,000 hours of processor time, almost six years, to complete if run on a desktop personal computer. The time was worth it: with the accurate data obtained from the flight of the balloon, the researchers have now been able to determine a number of fundamental cosmic parameters to within a few percent.
About 300,000 years after the moment of the Big Bang, the universe cooled down enough for protons and electrons to form hydrogen atoms. When that happened, photons were freed from what had been a hot primordial soup of subatomic particles and, ever since then, these energetic photons have been traveling through space, their wavelength now stretched to microwave scale and their frequency reduced to the equivalent of radiation from a black body at only 2.73 K. The purpose of CMB experiments is to gather as much information as possible about the state of the universe at that critical point.
The first step is to produce a map of the fluctuations in the background radiation - and we are talking about seriously tiny temperature differences, less than 1 part in 100,000. Small as they are, these differences reflect the inhomogeneities in the early universe, at a time when it was in a much simpler state than it is today.
Confusing the search, there is a fair amount of spurious information, some of it 'instrument noise' caused by random effects, and some of it from foreground sources of microwave radiation, such as dust. Only after these have been pinned down and eliminated can the rest be accepted as genuine variations in the CMB. A telescope on the balloon sweeps across the sky, making 50 million observations for each of 16 channels at four frequencies. However, these are not entirely independent of one another, and that is where the processing comes in, painstakingly building up a map of the universe, pixel by pixel, by milking the measurements for the information that lies hidden within.
The map still contains the foreground sources like the plane of the galaxy and distant quasars, but these are easily spotted, and so are the dust clouds, by their spectral signature - which is why maps are built from a variety of frequencies. The signal-to-noise ratio can be improved further by comparing observations made at different times. This is computationally expensive, but less so than the next stage, deriving the power spectrum.
Essentially, this involves ignoring everything else, and just finding the contribution of the cosmic microwave background. That means reducing thousands of pixels to a dozen or so numbers that represent points on the power spectrum curve, and then comparing those measured points with the curves that would be generated by different models of the universe. So what the scientists are doing is asking what the CMB would look like on that patch of sky if the universe had such-and-such a shape and history, and then looking for a match.
Inside the computer, this means comparing each pixel in the chosen dataset with every other pixel, a dozen times or more, in order to find each of the dozen or so points on the power spectrum. With today's algorithms and today's supercomputers, say the scientists, they have gone about as far as they can go. However, NASA's planned MAP satellite, to be launched later in 2000, and the European Space Agency's PLANCK satellite will deliver much larger datasets and, right now, we have no real way of coping with them.
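To make the scale of that pixel-by-pixel comparison concrete, here is a minimal sketch in Python of the brute-force approach: estimating the angular correlation of a toy temperature map by comparing every pixel with every other pixel. It is a didactic illustration only, not the MADCAP algorithm, and the map it works on is just random noise of roughly the right amplitude.

import numpy as np

# Toy sketch of the O(N^2) pixel-pair comparison described above.
# This is NOT the MADCAP algorithm - just an illustration of why the
# cost explodes as maps grow: every pixel is compared with every other.

rng = np.random.default_rng(0)
npix = 2000                    # real CMB maps run to hundreds of thousands

# Random pixel directions on the sky (unit vectors), with fake
# temperature fluctuations of about 1 part in 100,000 of 2.73 K.
vecs = rng.normal(size=(npix, 3))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
dT = rng.normal(scale=2.73e-5, size=npix)      # kelvin

nbins = 12                     # "a dozen or so numbers" on the curve
sums = np.zeros(nbins)
counts = np.zeros(nbins)

for i in range(npix):          # every pixel against every later pixel
    cosang = vecs[i + 1:] @ vecs[i]
    theta = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    b = np.minimum((theta / 180.0 * nbins).astype(int), nbins - 1)
    np.add.at(sums, b, dT[i] * dT[i + 1:])
    np.add.at(counts, b, 1)

corr = sums / np.maximum(counts, 1)   # mean correlation per angular bin
print(corr)                           # close to zero for pure noise

Doubling the number of pixels quadruples the work, which is why the real analysis needed a supercomputer, and why the much larger MAP and PLANCK datasets pose such a problem.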
Still, with what we \Ican\i do, at least we know that the universe is 'flat', that its geometry is Euclidean, not curved. The emerging consensus among scientists is for the 'concordance model' of a flat universe filled with dark energy that may correspond to the cosmological constant first proposed by Albert Einstein in 1917.
BOOMERANG's website is at http://www.physics.ucsb.edu/~boomerang/ (mirrored in Europe at http://oberon.roma1.infn.it/boomerang/).
Key names: Julian Borrill, Andrew Jaffe, Andrew Lange and Paolo de Bernardis.
#
"Images of Jupiter's inner moons",1485,0,0,0
(Apr. '00)
Jupiter's inner moons are hard to reach because they lie inside the planet's lethal radiation belts. This helps to explain why it is only now, late in the Galileo mission, that the spacecraft has ventured to capture the highest resolution images yet of three of the planet's four innermost moons, Thebe, Amalthea and Metis (\JAdrastea\j is the missing one).
Two views of Jupiter's 250-kilometer-long (155 miles), irregularly shaped moon Amalthea, obtained by Galileo's Solid State Imaging camera last August and November, show for the first time that a bright surface feature named Ida is a streak of bright material about 50 kilometers (31 miles) in length.
The most recent images of the three moons show surface features as small as 2 kilometers (1.25 miles) across. A prominent impact crater on Thebe is about 40 kilometers across and has been given the provisional name Zethus (in Greek mythology, the husband of Thebe). A large white region near the south pole of Amalthea is the brightest patch of surface material seen anywhere on the three moons. Its composition is unknown, and it sits inside a large crater named Gaea.
The images dribbled back at 40 bits a second over several months using a single low-power antenna because, early in the mission, the main, umbrella-shaped, high-gain antenna (100,000 bits per second) on the spacecraft had failed to open. The images were then enhanced, with filtering to remove the 'noise' caused by charged particles striking the camera's light-sensitive charge-coupled device; through this computer enhancement, the full quality of the images of the moons was slowly revealed.
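The report does not say exactly which filter JPL applied. Charged-particle hits show up as isolated, saturated pixels, and a median filter is one standard way of removing that kind of impulse noise, so the Python sketch below (working on made-up image data) assumes one.

import numpy as np
from scipy.ndimage import median_filter

# Charged-particle hits on a CCD appear as isolated hot pixels. The
# article does not specify JPL's actual filter; a 3x3 median filter is
# a standard choice for this kind of impulse noise, assumed here.

rng = np.random.default_rng(1)
image = rng.normal(loc=100.0, scale=5.0, size=(256, 256))  # fake frame

# Sprinkle in a few hundred isolated 'radiation hits'.
hits = rng.integers(0, 256, size=(300, 2))
image[hits[:, 0], hits[:, 1]] = 4095.0     # saturated pixel values

# Each pixel is replaced by the median of its 3x3 neighbourhood, which
# discards lone outliers while preserving edges better than a mean.
cleaned = median_filter(image, size=3)
print(float(image.max()), float(cleaned.max()))   # hot pixels suppressed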
Gathering the images was only possible because the Jet Propulsion Laboratory (JPL) decided to lower Galileo's orbit and risk three flybys of the volcanically active moon Io. Io lies some 300,000 kilometers closer to Jupiter's center than Europa, which, at 700,000 kilometers from the planet's center, had previously marked the closest the craft had come to the magnetic fields and charged particles of the close-in radiation belts. It was these lowered orbits that brought Galileo close to three of the inner moons.
The image data were stored on a tape recorder, but this offers only temporary storage, and the JPL staff needed a clever method of stripping the data they wanted from the tapes before the tapes were required for other mission projects. As the spacecraft sped toward the far extreme of its elliptical orbit after capturing the images of the moons, the data were relayed in highly compressed form, sacrificing detail but greatly reducing downlink time, giving researchers a chance to learn where the moons were located within each camera frame.
With this information to hand, the team was able to select the portions of each frame that they needed to see in high resolution, so that the data could be downloaded before the next flyby needed the limited tape storage. In this way, a set of views captured on January 4 was relayed in compressed form by January 25, with data selected for transmission at full resolution being downloaded by February 14.
Ida, the previously observed feature on Amalthea, is now revealed either as ejecta from a nearby meteoroid impact crater or possibly the crest of a local ridge. In either case, this is an advance on previous images, which made Ida appear as a round, bright 'spot.' There are other patches of relatively bright material to be seen elsewhere on Amalthea, although none has Ida's linear shape.
The images of Amalthea also reveal a large meteoroid impact crater about 40 kilometers (25 miles) across. There are also two ridges, tall enough to cast shadows, which extend from the top of the crater in a V-shape, rather like two rabbit ears.
See also: \JThebe\j, \JMetis (astronomy)\j and \JAmalthea\j. The Galileo Project home page is located at http://www.jpl.nasa.gov/galileo
Key names: Michael Belton, Damon Simonelli, Joseph Veverka, Peter Thomas, Nirattaya Khumsame and Laura Rossier.
#
"Internet2 astronomy",1486,0,0,0
(Apr. '00)
The University of Hawaii and the Association of Universities for Research in Astronomy (AURA) announced the connection of 11 of the world's leading astronomical observatories to Internet2 networks via the Mauna Kea Observatories Communication Network (MKOCN). With a capacity of 45 million bits per second, they say the new link will dramatically expand the ability of astronomers around the world to use telescopes located on the Hawaii mountaintop remotely.
The only snag for the public is that the Internet2 connection, nearly one thousand times faster than a typical modem, is only available to professional astronomers. In certain applications it will be possible for astronomers with access to Internet2 networks to 'observe' with the Keck telescopes from authorized mainland sites, but only from those sites.
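The arithmetic behind 'nearly one thousand times faster' is easy to check. Assuming a 56 kbps dial-up modem as the baseline (the article does not say which modem speed is meant), a short Python calculation gives the flavour.

# Rough check on 'nearly one thousand times faster than a typical
# modem', assuming a 56 kbps dial-up modem (the article does not say
# which modem speed is meant).

link_bps = 45_000_000          # MKOCN capacity: 45 million bits/second
modem_bps = 56_000             # late-1990s dial-up modem

print(link_bps / modem_bps)    # about 800 times faster

# Time to move a (hypothetical) 100-megabyte telescope image:
image_bits = 100 * 1_000_000 * 8
print(image_bits / modem_bps / 3600, "hours by modem")
print(image_bits / link_bps, "seconds over the MKOCN link")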
Astronomers around the world are now also able to connect in real time to the Gemini North control center in the University of Hawaii at Hilo Research Park, and a high-performance connection to the Gemini South facility in Chile is planned as well.
It is not all bad news for the public, though. The high-performance connection will allow the observatories to share more of their findings with the public through techniques such as virtual observatory tours and live video from Mauna Kea to museums, planetaria and classrooms worldwide. The University of Hawaii at Hilo will be developing a new Mauna Kea Astronomy Education Center in its Research Park, which will use the high-performance connections to the observatories along with a planetarium, videoconferencing and instructional facilities.
#
"Ising Model fails in three dimensions",1487,0,0,0
(Apr. '00)
Flocking birds, water molecules forming ice crystals, neural networks in computers, and many other systems have been assumed for many years to behave in accordance with something called the Ising Model, developed by Ernst Ising in 1925 as part of his doctoral work. He proposed the model to describe magnetism in crystalline materials, but problems as different as iron particles forming into domains, proteins folding, and even economic questions and the behaviour of glassy substances have been tackled in the past by applying the Ising Model of phase changes.
Enthusiastic investigators have used the model in one, two and three dimensions, but its validity has only been proven in one and two dimensions. Now computational biologist Sorin Istrail reports that he has shown that the solution of Ising's model cannot be extended into three dimensions. His paper is to be published in May in the \IProceedings of the Association for Computing Machinery's (ACM) 2000 Symposium on the Theory of Computing\i. The implication of this negative proof is that exact solutions can never be found in three dimensions, but it may also mean that the model may finally be shown to be incorrect.
The Ising model sets out to describe how a large array of 'agents' will act when each can exist in two different states. The agents are arranged on a regular lattice (like a chessboard), and the state of each 'agent' influences that of its immediate neighbours. To understand Ising's own one-dimensional model, think of a chain of particles like little magnets, able to take either an 'up' or a 'down' position, where each particle's position acts on the particles to either side of it.
In the Ising model of a magnet, each agent is a magnetic atom that can have its magnetic 'poles' pointing in one of two opposite orientations. In ferromagnets (such as iron), the interactions between neighboring atoms will make them line up with their magnetic poles in the same direction. When all the 'atomic magnets' are orientated in the same way, they add up to give the material a net magnetization.
When iron is heated to around 770°C (1043 K), the thermal motion of the atoms is strong enough to break this alignment. But as the temperature falls below this temperature (the \JCurie point\j), certain kinds of material switch in a sudden 'phase transition' from a non-magnetic to a ferromagnetic state, and the Ising model describes this.
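Although no exact three-dimensional solution exists, the model itself is simple to state and to simulate. The sketch below is a minimal Metropolis Monte Carlo simulation of the two-dimensional model in Python (an illustration of the model described above, not taken from Istrail's paper): started fully magnetized, the lattice keeps its order below the critical temperature, while above it thermal motion wins.

import numpy as np

# Minimal Metropolis simulation of the 2D Ising model: a chessboard of
# spins, each +1 or -1, each influenced only by its four neighbours
# (units where the coupling J and Boltzmann's constant are both 1).

rng = np.random.default_rng(2)
L = 32                                   # lattice side

def sweep(spins, T):
    """One Metropolis sweep at temperature T."""
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours, with wrap-around edges.
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nn        # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1            # accept the flip

for T in (1.5, 2.27, 3.5):               # the critical point is near 2.27
    spins = np.ones((L, L), dtype=int)   # start fully magnetized
    for _ in range(200):                  # let the lattice settle
        sweep(spins, T)
    print(T, abs(spins.mean()))           # magnetization: high, falling, ~0

Simulating the transition this way is easy even in three dimensions; as the article explains below, what Istrail's result rules out is an exact closed-form solution, not number-crunching.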
It took almost 20 years for Lars Onsager to extend the model to a two-dimensional (chessboard) lattice, where the magnetic moments, or spins, of each magnet influence the behavior of other nearby magnets. Although this model offered a wider application in the material world than the simpler chain, it still did not cover most real-life situations, where materials are three-dimensional.
When the model is expanded into three dimensions, the properties of the model can be calculated to a high degree of accuracy, but the figures are not exact. There were nice neat mathematical solutions for one or two dimensions, but nobody has been able to find an exact solution to any three-dimensional lattice problem.
This did not deter modelers across a wide range of fields: one estimate says more than 8000 papers were published in this area from 1969 to 1997. And given the usefulness of the model, it is hardly surprising that the proof for three dimensions has been pursued so hard. Nobel laureate Richard Feynman was probably the most famous hunter, but many others, including Onsager, have given time and energy to seeking a solution to the three-dimensional model.
The sad fact is that the Ising Model does not stand up in three dimensions. Even that, however, is not a complete loss: as Istrail commented on the Web: "Naturally, it's not as useful as finding the Holy Grail. We all 'wanna be like' Lars [Onsager]. But at least no one now needs to spend time trying to solve the unsolvable."
Istrail used a method called computational intractability, which identifies problems that cannot be solved in humanly feasible time, to show that the solution could not be extended. There are some 6000 such problems known across all areas of science and, as they are mathematically equivalent to one another, an efficient solution to any one of them would be a solution to all of them - something generally regarded as impossible. What Istrail did was show that the Ising problem, for any three-dimensional lattice, is one of these problems. For this reason, it is computationally intractable.
This proof does not mean that a computer cannot find the transition by sheer number-crunching alone, since this can clearly be done. What it tells us is that any attempt to formulate an exact equation relating the transition temperature below which a substance becomes magnetic to the model's basic parameters would take longer than is humanly feasible. And that means it can't be done.
Footnote 1: Ernst Ising, generally regarded as a genius by the world's finest minds who have worked on aspects of his model, had the misfortune to be a German-Jewish scientist, so he was barred from teaching when Hitler came to power, and was restricted to menial work. He survived the Second World War and taught afterwards in the United States, but he never published again.
Footnote 2: Some years ago, two CERN scientists wrote a paper called 'Does the one-dimensional Ising model show intermittency?' This in itself may not be notable, but the abstract of the paper must surely be a world record for brevity. It read 'No.'
#
"New modulator to speed communications.",1488,0,0,0
(Apr. '00)
A report in \IScience\i in early April describes a new device, operating at less than one volt, that can convert electrical signals into optical transmissions at a rate of 100 gigabits per second. At that sort of speed, the notion of download time would disappear from the Internet, and many other applications that rely on fiber optics would be revolutionised. For example, the new electro-optic modulator could also be applied to technologies like aircraft navigation and anti-collision radar in cars.
A modulator like this codes electrical signals into flashes of light, ready for transmission over optic fiber. While other electro-optic materials such as inorganic lithium niobate crystals can do the same sort of conversion, they can only do so at a high operating voltage, and that limits the strength of the signal and increases its level of distortion.
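The article does not describe the device's optical layout, but the standard configuration for an electro-optic modulator is a Mach-Zehnder interferometer, whose transmitted intensity follows I = I0 cos^2(pi V / 2 Vpi), where the 'half-wave voltage' Vpi is the drive needed to swing the output from fully on to fully off. A short Python sketch, on that assumption, shows why a sub-one-volt device matters; the two Vpi values compared are illustrative, not measured figures from the paper.

import numpy as np

# In a Mach-Zehnder electro-optic modulator (the standard geometry,
# assumed here since the article does not specify the device layout),
# transmitted intensity follows I = I0 * cos^2(pi * V / (2 * V_pi)).
# V_pi, the half-wave voltage, is the drive needed to go from full-on
# to full-off; a low V_pi gives full-depth modulation from a weak signal.

def transmission(V, V_pi):
    return np.cos(np.pi * V / (2.0 * V_pi)) ** 2

for V_pi in (0.8, 5.0):    # hypothetical sub-volt polymer vs a crystal
    depth = transmission(0.0, V_pi) - transmission(0.8, V_pi)
    print(f"V_pi = {V_pi} V: a 0.8 V drive gives modulation depth {depth:.2f}")

# The 0.8 V device swings the light fully off with a 0.8 V signal; the
# 5 V device barely dims it, so it needs a much stronger drive.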
The modulator consists of organic molecules called chromophores embedded in a polymer matrix. Chromophores have excellent electro-optic capabilities, but whenever previous researchers tried to use chromophores in such a device, they were troubled by interactions between the electrical fields of the chromophores that reduced their efficiency.
The researchers say they got around this obstacle by turning back to some of their earlier theoretical work, when they showed that changing the shape of the chromophores can minimize the clashes of their electrical fields. After that, it came down to a bit of tinkering to convert the structure suitably to achieve the elusive goal of a high-bandwidth polymer modulator device below one volt.
According to Larry Dalton, the team leader, the device offers the ability to take " . . . telephone signals, computer data, TV signals, any type of signal you can think of, put it on fiber optic, route it around the world with almost no optical signal loss, and accomplish this with infinite bandwidth. It has the potential of revolutionizing the way we all function."
That aside, Dalton also sees a future for the device in fields as different as radar and satellite communications, navigation systems for commercial and military aircraft, and even 'smart cars' that warn drivers about possible collisions on the road ahead.
As well, says Dalton, these new modulators can be packed together in a variety of sophisticated, high-density packages without optical energy leaking between them or overheating. This may make them useful components in increasingly powerful computers that are jam-packed with heat-generating transistors, and it also distinguishes them from their lithium niobate competitors, which generate significant amounts of heat and so cannot be integrated directly on to silicon chips.
#
"Color maps for the color blind",1489,0,0,0
(Apr. '00)
Around 8% of men, and a much smaller proportion of women, suffer from color blindness. The genes involved are carried on the X chromosome, so a man needs only one deficient copy, while a woman must inherit deficient copies on both of her X chromosomes. To people who do not have this impairment, it is a matter for amusement, but to those who are color blind, it is not just a fashion inconvenience, but an impairment that makes reading maps and other visual data difficult, if not impossible.
Now Cynthia A. Brewer, an associate professor of geography at Penn State, has developed color schemes that allow most color-blind people to interpret the images. Instead of using the rainbow schemes that are very popular in maps designed to convey information, she has found a better way to present information using a set of colors that are just as effective for people with standard color vision, while being clear to most of those who are color blind.
There are a number of types of color blindness, with only a very few people viewing the world in shades of grey alone. The majority of color blind people are red/green color blind because they are missing either the red or the green receptor cone cells in their retina, which means they miss some of the cues that other people use when they compare colors.
These people can still recognise traffic lights because there are other clues, like position, brightness and hue to go on; however, they may have problems recognising a single port or starboard light on a boat, even though they are able to distinguish between them when shown both. If two lights or colored objects differ only in the intensity of their green component, a person lacking green receptors will see no difference, while somebody lacking red receptors will have the same problem with stimuli which differ only in their red intensities.
This is where the rainbow schemes fail, because they make a totally unsystematic use of the red and green components to produce a set of shades that contrast nicely for those with all of the color receptors, but leave the color blind viewers floundering. Although a few people confuse shades of yellow and blue, around 95% of all color blind men are red/green blind, and it is this large majority that Cynthia Brewer has targeted with her new color range.
"I am only attempting to accommodate those who are red/green color blind because combined they are the most common forms of color blindness," says Brewer. "It would be a different and more difficult task to accommodate everyone on the same map."
Brewer explains that color confusion occurs along lines through the full color space enclosed by the spectrum. Different sets of confusion lines are specific for the type of color blindness. In fact, the confusion lines for the two types of red/green color blindness are different, but luckily, they are similar enough to allow colors to be chosen to serve both sets of individuals.
The usual textbook explanation that red/green color blindness is just a problem with confusing red and green is less than the whole picture. The color blindness also causes problems with any pairing of colors that falls on the same confusion line. For example, problems can occur distinguishing between blue-green and pink or blue-green and purple. Color-blind individuals may not be able to distinguish between olive-colored and rust-colored socks, while they could distinguish between bright green and olive socks, rust and red socks or rust and bright green socks.
One way to avoid confusion is to alter the lightness and darkness of the colors. The color-blind person may still see the same color, but they can tell that the areas colored with these colors are different. "We do not know what color-blind people see," says Brewer. "Actually, we do not know what color vision looks like to anyone but ourselves. However, if a map has adjacent patches that someone sees as the same color, the information stored in the map will be inaccessible."
Brewer has recently had the opportunity to design red/green color-blind-friendly color schemes for the \IAtlas of US Mortality\i, produced by the National Center for Health Statistics. These epidemiological maps contain large amounts of information that provide a thousand words in one picture - if the viewer can see the picture as it is intended. One way to accommodate red/green color blindness and still use a rainbow scheme is simply to skip over one of the offending colors. "If we use a scheme that goes red to blue and leaves out the greens then it is not a problem," says Brewer.
Other successful color combinations include blues and yellows, magenta-violets and yellow-reds and blue, green, yellow sequences. Again, combining alterations in color purity and hue can also help color-blind people to distinguish between colors. The key in choosing color schemes is to choose colors that do not lie on confusion lines. For example, in a color scheme using blue, magenta, green, red-orange and yellow, the last three colors fall on the same confusion line and would cause problems. A solution would be to choose blue, light green, greenish white, grey magenta and dark green, a scheme that avoids confusion lines.
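Brewer works from measured confusion lines, but a rough screening heuristic gives the flavour of the idea: flag a color pair as risky for red/green-blind readers when almost all of the difference between the two colors lies along a red-versus-green axis, with little difference in luminance or blue-versus-yellow. The Python sketch below uses crude opponent-channel approximations chosen for clarity; it is an illustration only, not Brewer's actual procedure.

# Rough screening heuristic (an illustration, not Brewer's method): a
# pair of colors is risky for red/green-blind readers when the two
# differ mainly in red-vs-green content, with little difference in
# luminance or along a blue-vs-yellow axis. The opponent channels
# below are crude approximations chosen for clarity.

def opponent(rgb):
    r, g, b = rgb
    lum = 0.3 * r + 0.59 * g + 0.11 * b   # approximate luminance
    by = b - (r + g) / 2.0                # blue-vs-yellow axis
    return lum, by

def risky_pair(c1, c2, tol=30.0):
    """True when the pair differs mainly along the red-green axis."""
    (l1, by1), (l2, by2) = opponent(c1), opponent(c2)
    return abs(l1 - l2) < tol and abs(by1 - by2) < tol

olive = (128, 128, 0)
rust = (183, 65, 14)
bright_green = (0, 255, 0)
print(risky_pair(olive, rust))          # True: the sock example above
print(risky_pair(olive, bright_green))  # False: luminance differs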
Cartographers typically spend time designing demographic maps for deaths, births, populations, agricultural and industrial production and disease. If they consider the color blind when making their decisions and choose colors that do not lie on the same confusion line, the information they are trying to communicate will be accessible to nearly everybody who sees it.
An example of the color schemes in use can be found on the Web in a series of maps of mortality across the US, stored at http://www.cdc.gov/nchs/products/pubs/pubd/other/atlas/atlas.htm
#
"Fossil dinosaur heart",1490,0,0,0
(Apr. '00)
According to standard theory, fossil finds are made up of two parts: fossilised bones, and matrix, the \Icompletely useless\i rock that surrounds the skeleton and lies inside it. On this understanding, matrix is just the stuff that has to be stripped away with rock drills, jack hammers, probes and needles, acid baths, or whatever else seems appropriate for the task. On this view, the matrix is just so much junk to be sent to the tip.
Now, a computerized tomography (CT) scan of the chest cavity of a new dinosaur fossil has given the world's palaeontologists reason to wonder if they have not been throwing away the best part all of this time. The scan has revealed something that looks remarkably like a heart preserved in the supposedly useless matrix of a dinosaur nicknamed 'Willo'. It was only a grapefruit-sized reddish clump, but the structures within the clump are remarkably heart-like. As well, X-ray diffraction analyses confirmed the presence of iron in Willo's heart but not in the sediments surrounding the heart or skeleton, supporting the view that the fossilized concretion in Willo's chest was a heart.
According to a report in \IScience\i in mid-April, the heart more closely resembles a bird or mammal organ rather than a modern reptile's heart. This observation will yield comfort to those palaeontologists who have long argued that the dinosaurs, or some of them, were warm-blooded, with a relatively high metabolism. At the very least, it seems to suggest that this dinosaur, an unclassified two-legged, 4-metre (13-foot) thescelosaur, a member of the 'bird-hipped' ornithischian group of dinosaurs, was warm-blooded.
\IThescelosaurus\i means 'marvellous lizard' and, although scientists have not yet conclusively identified which species of \IThescelosaurus\i Willo is, it is most likely to be \IT. neglectus\i. The species name \Ineglectus\i translates as 'neglected one', and was assigned because the first fossil, in 1891, was considered so unremarkable that it sat, unidentified and unstudied, in its packing crate at the Smithsonian Institution for 22 years. It was only in 1913 that palaeontologist Charles Gilmore examined the fossil and discovered it to be a previously undescribed type.
After this, it is unlikely that this remarkable and marvellous lizard will be neglected any more. Willo is the only \IThescelosaurus\i ever found with a complete skull and with remnants of the soft tissues usually lost to decay. Tendons are still connected to its spine, and fossilized cartilage remains attached to its ribs. As well, shadows and shapes revealed by the scanning suggest that Willo may contain other fossilized organs, so the story of this 66-million-year-old dinosaur probably still has a few chapters to be written.
The search was set off by Michael Hammer, a professional fossil collector and co-author of the \IScience\i paper, who discovered the remains of a dinosaur in South Dakota. Hammer thought, given the unusually well preserved ribs of the specimen, that the chest cavity might hold some traces of the animal's internal organs. This was a reasonable suspicion, since it has long been known that in oxygen-poor environments such as the sediment beneath a streambed, an animal's soft tissues can sometimes become fossilized, although the process involved is still unknown.
So, rather than just clearing the matrix away from the bones, Hammer cleaned the surface and, before sending the dinosaur to its new owners at the North Carolina State Museum of Natural Sciences, physician Andrew Kuzmitz had its chest region CT scanned.
The scan shows two neighboring cavities and a single tubular structure positioned like an aorta. The cavities are probably the heart's lower chambers, the ventricles, as these are the heavily muscled second chambers that push blood around the body. The upper chambers, the atria, have thin walls, and so they are assumed to have collapsed when the dinosaur died. The aorta is a fairly good indication that there would have been two separate ventricles.
Most modern reptiles have a single, imperfectly divided, ventricle. This 'hole in the heart' condition is often described as primitive compared to that of the mammalian heart. Yet the condition is of immense use to the sea snake, which can dive much deeper than would otherwise be possible and avoid the effect known in humans as \Jdecompression sickness\j, or 'the bends', by making good use of the link between the two halves of the ventricle.
In mammals, body blood goes back to the heart, to the lungs, to the heart and back to the body. However, the sea snakes can bypass the lung, sending blood instead to capillaries in the skin, where carbon dioxide and nitrogen are lost and oxygen is gained from the sea water. So when we speak of the heart of modern reptiles as 'primitive', it is important to keep in mind that it may simply reflect an 'advanced' adjustment to the conditions under which snakes evolved (see \JFossil snake with legs discovered\j, March 2000).
Whatever the trap that lies in terms like 'primitive' and 'advanced', traditionalists have long considered the dinosaurs to be related more closely to the modern reptiles, which have two aortas and a simpler heart structure that allows some mixing between the oxygen-rich blood from the lungs and the oxygen-poor blood from the body. With no evidence of what dinosaurs' heart structures were, the traditionalists have argued that the dinosaurs must have had hearts like modern reptiles, and that the mixing reduced the overall oxygen content of the blood supplied to the reptiles' tissues. From that, they concluded that the dinosaurs could not have had a fast metabolism or been warm-blooded.
The four-chambered hearts of birds and mammals deliver completely oxygenated blood to the body, which fuels relatively fast metabolism, while the living reptiles tend to be more sluggish and can get by with less oxygen. On the new evidence, 'Willo' may be more like birds and mammals than like any crocodile, snake or turtle, even though it is an ornithischian.
Most palaeontologists believe that the saurischians, the 'lizard-hipped' dinosaurs, rather than the ornithischians, or 'bird-hipped' dinosaurs, are actually the ones that eventually gave rise to the birds. So, finding a bird-like heart in a dinosaur far from the bird line of descent suggests that four-chambered hearts may have been quite common among the dinosaurs. That raises the question of what else we may have missed over the years, according to Alex Ritchie, the Australian palaeontologist who carried out the scientific work that opened up the famous Canowindra fish-kill site.
The authors of the \IScience\i paper are well aware of this, and they urge that fossil hunters make a point of looking for traces of soft tissue in their discoveries. 'Willo' now has a home at the North Carolina Museum of Natural Sciences in the museum's new $71-million building, which opened in early April. 'Willo' can also be found on the Web at http://www.dinoheart.org
Key names: Michael Hammer, Andrew Kuzmitz, Michael Stoskopf and Reese Barrick
#
"Ancient toothpicks tell a tale",1491,0,0,0
(Apr. '00)
The teeth of modern humans suggest that we are natural omnivores: they are half-way between those of committed carnivores like the cat family and those of equally determined herbivores like cattle, sheep and goats. But somewhere along the way, as humans moved out of the trees and onto the plains, they must have changed from eating plants to eating meat and, when they started doing this, their teeth would still have been the teeth of a herbivore.
So how do we tell what our ancestors ate? If we are only looking back a few centuries, the teeth have the story of what we \Icould\i eat, while fossilised cesspits in places like York can reveal remnants of the foods eaten a thousand years ago. But our ancestors at the dawn of humanity were less organised, and have left no deposits for scientists to sift.
Sometimes, micro-scratches on fossil teeth may yield a few clues about the food that might have made the scratches, but Peter Ungar believes he has something better: the oldest evidence of toothpick use, extracted from a tooth unearthed in excavations at the bottom of a gully at Olduvai Gorge in Tanzania. Ungar, a University of Arkansas associate professor of anthropology, presented his findings at the American Association of Physical Anthropology annual meeting in San Antonio, Texas in mid-April.
The tooth from Olduvai Gorge was found by Mary Leakey and her colleagues, and it dates back as far as 1.8 million years. When Ungar and his colleague were cleaning the tooth for examination, they noticed some strange grooves on it. Under the electron microscope, these showed up as tiny, parallel lines that repeated along the sides of the tooth.
According to Ungar, the picture is of somebody trying to shove something narrow into a small space between their teeth - in other words, a toothpick. The object would have to have been sharp to leave marks on the enamel and the dentine, and Ungar believes the users may have wielded pieces of bone, or perhaps grit on a stick caused the grooves. More importantly, grooves like this are only found in hominid teeth.
Modern apes do not do this - not even chimps, which use sticks as tools - and Ungar believes that the key element was meat turning up in the diet of early hominids. Scientists have already noted the sudden appearance at about this time of cut marks on animal bones in the archaeological record. Hominids' teeth are not well designed for eating meat, so our early ancestors had to use toothpicks, Ungar believes.
Key names: Peter Ungar, Fred Grine, Mark Teaford, and Alejandro Perez Perez.
#
"Genomic imprinting in the platypus and opossum",1492,0,0,0
(Apr. '00)
The simple Mendelian model describes how organisms carry each of their genes as a pair of alleles. This has been part of the standard view of genetics during the 20th century, except among the few people who know that sometimes, just sometimes, the thing that really counts is whether a gene came from the mother or the father. This principle, called genomic imprinting, is not new - it was demonstrated for the first time in mice in 1984. But interest in it has really only grown in the past few years, and it is still only a minority, even of science graduates, who are familiar with the idea.
Genes exist in pairs, one allele from the mother and one from the father. Most of the time, one cannot be told from the other, and the two are equally active. But this is not the case with imprinted genes, where each carries a marker that reveals which parent the gene came from, and also determines whether or not the gene will be active inside the cell. In other words, some genes are active \Ionly\i if they come from the sperm, while others are active \Ionly\i if they come from the egg's original supply of genes.
Many biologists found this idea of genes behaving according to their parentage hard to believe, but by 1990, the first imprinted gene had been pinpointed in mice and humans, the gene for a growth hormone called insulin-like growth factor-II (or IGF2, for short). At least 15 more imprinted genes have now been identified in mice and humans, with the true total probably somewhere in the hundreds, and the time has come to try to explain what is going on.
The standard theory is that genomic imprinting represents a genetic 'battle of the sexes', a competition between paternally imprinted genes that lead to enhanced fetal growth and maternally imprinted genes that restrict growth, saving nutrients for the mother herself. If this is the case, then the \Jduck-billed platypus\j and the \Jopossum\j should be able to shed some light on the mechanisms of genomic imprinting, since the genes in animals without a lengthy fetal development stage were not expected to be imprinted. Marsupials like the opossum have infants that leave the womb while still embryonic and develop in external pouches, while the platypus is one of the egg-laying monotremes.
So much for theory. In a mid-April issue of the journal \IMolecular Cell\i, a research report indicates that a particular gene called M6P/IGF2R is imprinted in the opossum but not in the platypus. While the platypus finding is in line with the expectation that genomic imprinting is part of a 'battle of the sexes', the opossum finding is a surprise, since it suggests that M6P/IGF2R imprinting evolved a lot farther back than anybody thought likely.
The M6P/IGF2R gene codes for the receptor for IGF2, whose gene is also imprinted in many mammals. Both genes are involved in growth, but imprinted M6P/IGF2R is maternally expressed, while IGF2 is paternally expressed.
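The parent-of-origin rule is simple enough to state as a few lines of Python. The sketch below is a toy model only; the two table entries echo the genes just discussed, as they are imprinted in mice, and the whole thing is an illustration rather than a real genetic database.

# Toy model of parent-of-origin expression. The two entries echo the
# genes discussed in the text (as imprinted in mice); this is an
# illustration, not a real genetic database.

IMPRINTED = {
    "IGF2": "father",        # paternally expressed: active only from dad
    "M6P/IGF2R": "mother",   # maternally expressed: active only from mum
}

def is_active(gene, inherited_from):
    """inherited_from is 'mother' or 'father'."""
    required_parent = IMPRINTED.get(gene)
    if required_parent is None:
        return True          # non-imprinted genes: both copies active
    return inherited_from == required_parent

for parent in ("mother", "father"):
    state = "active" if is_active("IGF2", parent) else "silenced"
    print("IGF2 copy from the", parent, "is", state)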
Significantly, opossums may produce as many as 50 young in a litter, but the female only has 12 nipples in her pouch, and once the young opossum is attached, it cannot let go of the teat until it is developed enough to open its mouth. So it looks as though the genomic imprinting in this case may have more to do with competition for survival among the young, and it may be this which triggers the genomic imprinting rather than a lengthy fetal development period.
One problem with the researchers' discussion of their work is that they make the assumption, now regarded as a fallacy by most Australian researchers, that the marsupial is a more 'primitive' form of mammal compared with the placental. Recent Australian research has shown that the placentals were in Australia at the same time as the early marsupials, and were out-competed by the marsupials. It is easy to step from there to the assumption that the more 'primitive' marsupials are in some way closer to the ancestral mammals than the more 'advanced' placentals.
Certainly the researchers recognise that their finding has implications for understanding the mammalian evolutionary tree. Some people still want to see the monotremes and marsupials as one 'twig' on the tree because this evidence supports the proposed version of the mammalian evolution in which monotremes branched off the main evolutionary trunk before the marsupials and the placentals split apart. The only alternative would be that imprinting would have had to evolve twice, once on the marsupial twig and again on the placental branch, in a process called convergent evolution, and that is generally regarded as unlikely.
Nonetheless, the discovery is an important one, and it should trigger some interesting further research, especially in marsupials of the Australian region, which may or may not show the same pattern (Australia's marsupials are all far closer, genetically, to each other, than they are to any of the opossums of the Americas - even the apparently similar 'possums' of Australia are closer to the kangaroos, the Tasmanian devils and the wombats than they are to the American opossums.) It does not, however, bring us any closer to an indication of whether or not humans need these imprinted genes. Many of them are growth promoting and growth inhibiting genes, and it is possible that, if one copy of a gene is switched off, the other could become a target for developmental diseases and cancer. That means, if the balance is altered, somebody could lose a tumor-suppressing gene or turn on an oncogenic one.
Mutations of IGF2 and M6P/IGF2R appear to be early steps in a wide variety of cancers. In many tumors, both copies of growth-inducing IGF2 are turned on, despite the gene being imprinted in humans. M6P/IGF2R is not imprinted in humans, and both copies normally function, but in more than 60% of human liver cancers, 30% of breast cancers and 50% of lung cancers, at least one copy of this growth-suppressing gene does not work.
So if humans have two functioning M6P/IGF2R genes, but mice have only one due to imprinting, it would be reasonable to expect that humans should be more resistant to cancer-causing agents, and research is going on right now to test this prediction.
Curiously, the entire region of DNA that controls the imprinting of M6P/IGF2R in mice was missing in the opossum, according to other researchers, which means that the actual controlling region has yet to be found. According to Jirtle, "Either the opossum has a completely different, unique, and as-yet-unidentified mechanism to control imprinting of this gene, or the proposed region in the mouse is not the actual controlling region."
Key names: Keith Killian and Randy Jirtle.
#
"The armpit effect",1493,0,0,0
(Apr. '00)
People may prefer to call it 'self-referent phenotype matching', especially when it refers to humans, but researchers feel free to call it 'the armpit effect' when it applies to hamsters, which, like humans, are able to distinguish strangers from unfamiliar kin by comparing body odors to their own. The matter has been aired (if we may use that expression) in the \IProceedings: Biological Sciences\i of the Royal Society in early April.
Of course, hamsters do not really have armpits, and the scent involved comes from glands on their flanks, according to two Cornell University psychologists, Jill M. Mateo and Robert E. Johnston. They have managed to demonstrate the effect in golden hamsters (\IMesocricetus auratus\i), which use their own scent to distinguish unrelated hamsters from their biological siblings.
A phenotype is a trait, and like most traits, it is inheritable, so that genetically similar individuals with similar genotypes will have similar traits. That is, they will have similar smells, and phenotype matching is a matter of comparing two individuals' traits, often for purposes of kin recognition.
As Mateo explains it, "To determine if someone is related to you, you might compare their traits to your memory of what your relatives are like. Self-referent phenotype matching is when you use yourself as a referent, rather than your close relatives."
Mateo and Johnston devised a separated-at-birth test for laboratory hamsters. They took newborn laboratory hamsters from their mothers and siblings, separating them before their odor-sensing capabilities had developed, and placed them with unrelated mothers and unrelated young hamsters. Raised among strangers, the hamsters had no kin smell cues except those obtained by checking their own smell.
At the age of seven weeks, the young females were sexually mature and their odor-sensing capabilities were as good as they would ever be. Given a choice of flank-gland scents from a variety of other hamsters, some of them related, some of them completely unrelated, the females consistently 'preferred' unrelated strangers over unfamiliar biological siblings or unrelated foster siblings (the unrelated hamsters they were raised with). In other words, their tendency was to behave in a way that would favor out-breeding, a choice which is highly important in any species, because it favors the mixing of genes.
To avoid confusion with other cues, such as the sight of other hamsters, the scent chemicals were presented on glass slides; donors of the flank-gland smells, which are odorless to most humans, were out of sight.
Then, while researchers stood by with stopwatches, the hamsters indicated their preference by moving quickly to scents of unfamiliar non-relatives and spending the most time sniffing those. They took somewhat longer to approach the scents of their foster siblings and they spent less time smelling them. They took the longest amount of time to approach odors of their unfamiliar, biological siblings, and spent the least time smelling them.
Hamsters engage in agonistic marking behavior, in which they rub their flank glands to mark territories and warn off undesirable animals. In a test, the females were less likely to engage in marking after smelling unfamiliar non-kin. In other words, they were less agonistic toward potential mates and more likely to warn off close relatives at mating time.
Curiously, the hamsters were drawn from a highly inbred group of laboratory animals with a very similar genetic make-up: all of the world's laboratory golden hamsters (and most hamster pets) trace their ancestry back to several wild \IM. auratus\i that were captured in Syria in the 1930s. Even so, there is enough genetic difference between individual hamsters to permit the production of sufficiently different smells.
As in humans, the scent production in hamsters is believed to be influenced by the major histocompatibility complex (MHC), the set of genes underlying animals' immune systems and responsible for recognition of self and non-self protein.
But does the effect really apply to humans as well? Tests have shown that blindfolded mothers can tell the smell of their newborn babies. Other tests have shown that a woman will prefer the smell of T-shirts worn by men who are genetically dissimilar to her, making them a potential mate or an unrelated partner, over the smell of males genetically similar to her.
The researchers are not keen to be drawn into a discussion of how or where the armpit effect is used, preferring simply to record that it exists, and is not an impossibility in an evolutionary sense. They are still puzzled as to how the hamsters use the armpit effect to remember their own scent, and they say they have never actually seen the hamsters sniffing themselves, yet they certainly know what they smell like. And given the design of the experiment, they say there can be no doubt that there was only one way for them to know the smell of family, and that was by comparing the smell of other hamsters with their own.
#
"How the insects lost their abdominal legs",1494,0,0,0
(Apr. '00)
The standard theory for the evolution of insects comes from a variety of fossils collected from around the world. The fossil story sees the insects evolving from a many-legged animal, perhaps something like the velvet worm \IPeripatus\i or a centipede, combining some segments with legs attached into the thorax, or central body region, and more segments with legs attached into the abdomen, or tail section. Then, at some time before all of the modern insects diverged, the number of legs was reduced to just six, all attached to the thorax.
According to a report in the \IProceedings of the National Academy of Sciences\i during April, it is now possible to make some informed guesses about how the other legs were lost. In fact, Randy Bennett at Brigham Young University has been able to turn back the evolutionary clock and tweak a developing beetle so that it grows many legs, like a centipede.
Bennett's specialty is evolutionary developmental biology (commonly known in the trade as 'evo-devo'), which has only really opened up in the past decade as the concept of Hox genes has developed. These operate right across the animal kingdom as a set of master controls, switching other genes on and off, whether the animal is a human, a fruit fly, a mouse, or a worm.
Changing even one of the Hox genes in a tiny way can lead to huge changes in the final form of the animal. The two huge changes which made the insects successful would seem to be the reduction of the legs to just six and the addition of wings, and it is reasonable to assume that such variations involve Hox genes in some way.
Drawing on previous work done in fruit flies, Bennett was aware that two Hox genes, abdominal A (Abd-A) and Ubx, had something to do with limb development in insects' abdomens. However, because flies are very specialised insects, he chose to use the beetle, which is generally regarded as a more primitive insect, for his study.
Removing the Ubx gene produced beetles with no abdominal legs, and removing just Abd-A led to a limb-like stub on the beetles' abdomens, but still no legs. Removing the two genes at once, an obvious next step, had been difficult until recently because the two genes are close together on the same chromosome in beetles. But using a new technique, only recently developed by other researchers, Bennett injected a strand of RNA that invaded the beetle embryo and specifically neutralized both target genes. The result: "legs everywhere", according to an interview Bennett gave afterwards.
Bennett argues that the natural blueprints for abdominal limb development are still present in insects, but along the way, the gene Abd-A has evolved to turn off those instructions. As well, Ubx has evolved to modify any abdominal limbs into something else, but if both of these genes are taken out, the beetle can return to the ancestral model and sprout legs out of each segment on its body. But this still leaves us wondering what brought about the changes in the roles of these genes.
Human Hox genes are very similar to those in insects, so there is a chance that the discovery may lead to practical medical applications in the future. But for now, the study is just a matter of fascination for the devotees of evo-devo. And we are still left with the challenging question: why six legs, and not four or eight, or some other number?
Key names: Randy Bennett, David L. Lewis and Mark DeCamillis.
#
"Mount Usu erupts",1495,0,0,0
(Apr. '00)
In late March 2000, Japan's National Coordination Committee of Volcanic Eruption Prediction warned of the imminent eruption of the Mount Usu volcano. There were more than 1600 volcanic earthquakes on 29 March, and almost 2500 on 30 March, with the hypocenter of these earthquakes located on the northwestern slope. Approximately 10,000 people living in surrounding towns were required to evacuate by the afternoon of 29 March and, the next day, cracks as long as 100 m (330 feet) were visible on the northwestern part of the caldera rim. The first eruption lasted approximately 2 hours and came from a new vent about 1.5 km (a mile) to the northwest of the summit of Usu in a small valley. No large pyroclastic flows occurred.
By 3 April, Japan's Meteorological Agency had discovered three new fault lines near the crater of Mount Usu. On 4 April, smoke plumes soared to 1200 meters (3900 feet) and more fault lines were discovered. The possibility of deadly pyroclastic flows remained high. On 5 April, officials said the dome of hardening lava was growing, and the next big eruption could occur within the next two weeks. Rain then triggered a large-scale mud and debris flow down Mount Usu's western slope. More faults were reported to be developing on the north and northwestern sides of Mount Usu.
On 17 April, the Mount Usu volcano exploded yet again. Craters on the west side of the volcano were active, expelling black smoke due to underground magma explosions. Residents of the remaining Usu communities were moved to evacuation centers as seismic activity began to increase. So far, no catastrophic eruption has occurred and, by the month's end, some residents were moving back again.
#
"Mount St Helens recovers",1496,0,0,0
(Apr. '00)
Mid-May will see the 20th anniversary of the explosion that destroyed Mount St Helens in 1980, and a number of scientists have been posting information, both about the eruption and about the recovery of the area after the eruption as life begins to return to areas wiped out by lava and ash.
A Washington State University Vancouver biology professor, John Bishop, has been studying the re-emergence of life on Mount St Helens for the past 10 years, working in the most devastated area on the mountain's north side, between the crater and Spirit Lake, an area aptly named the Pumice Plains. Here, there was no biological legacy, says Bishop. Plants, animals, bacteria and organic matter were all blown away, burned or buried by rock.
The first invader arrived in 1981, when a single lupine plant (\ILupinus lepidus\i) was found in the Pumice Plains, much to the surprise of ecologists. This lonely plant was several kilometers from any surviving vegetation. The question of how the seeds were dispersed still remains, as lupine seeds are rather large, and lack any adaptation for dispersal. But they are important first arrivals because lupines are part of the legume family and add nitrogen to the soil, thus enabling plant growth.
Today lupine patches of more than 20 acres in size exist on the Pumice Plains, and new vegetation has grown in areas where lupine first settled. Lupine was thought to act as an 'ecosystem engineer', accelerating and encouraging revegetation. In the language of classical ecology, a plant succession is developing.
At the same time, the lupines have not spread as people expected. Bishop explains: "As a graduate student in the early '90s, I saw that the lupines weren't spreading and became interested in why. I found lupines here and there, but they weren't flourishing like the original patch. Now a lot of things like to eat lupine. I found that insects were preventing lupines from reproducing so fast. They were running the revegetation show."
He had found caterpillars boring into and living in the roots of lupine plants, feeding on lupine seeds and weaving little protective tunnels out of leaves and gravel. The idea that herbivores control revegetation is a new one, which raises the larger question of whether herbivores in general control plant populations. Like Krakatoa, Mount St Helens looks likely to provide the ecologists with many years of interesting study.
#
"Bitou bush in Australia",1497,0,0,0
(Apr. '00)
Bitou bush, or \IChrysanthemoides monilifera\i subspecies \Irotundata\i, has become an invasive weed that is choking 70,000 hectares of south-east Australia's coastal ecosystems and, as such, it is about to come under concentrated attack. Listed in 1999 as a 'Key Threatening Process to Biodiversity' in NSW and a 'Weed Of National Significance', the plant was originally introduced by accident.
It is thought to have arrived in Australia through the dumping of ballast by ships coming from South Africa. After the plant became established, the Soil Conservation Service planted bitou bush extensively along the NSW coast to prevent erosion and rehabilitate land following mining operations. This practice stopped when the plant's weedy nature was recognised, but by then it was a serious pest. Now it is all-out war, as the Cooperative Research Centre for Weed Management Systems (Weeds CRC) and the Australian government research body, CSIRO, have released a best practice management guide to help control bitou bush.
The guide is aimed at land managers and community groups and provides background information on bitou bush, illustrations and descriptions to aid identification, and simple but comprehensive recommendations for management. The Weeds CRC is also releasing management guides for six other environmental weeds, each of which is costing Australia millions of dollars in control and impacts on biodiversity. The primary aim of these guides is to bring the latest research and ideas on integrated weed management directly to the land managers who have responsibility in this area.
The bitou bush's range stretches from Queensland's Sunshine Coast down to the south coast of NSW. Small patches survive in inland areas around the Menindee Lakes, in western NSW, and near Melbourne, in Victoria. The woody shrub displaces native plants, leading to decline in floral diversity and consequent changes in the diversity of birds, mammals and ground-dwelling insects.
Mature plants can produce up to 48,000 large (5-7 mm long) egg-shaped seeds per year. In heavily infested areas, one square metre of soil can contain up to 5000 seeds. To help control the weed, the Weeds CRC released two insects from South Africa to attack bitou bush. One of these, the bitou seed fly (\IMesoclanis polana\i), destroys the seeds developing inside the flower head.
The fly was released in 1996 and within two years it had spread across more than 1200 kilometers of the east Australian coastline, occupying virtually the entire range of bitou bush. According to Dr John Vranjic, author of the Bitou management guide, "Detailed studies are still under way to determine the impact of the insect, but certainly along the north and central coast of NSW it is beginning to exert a strong influence on the soil's seed bank".
The usual problem with a single control agent is that the host develops some form of genetic resistance to the control agent. To help prevent this, there are applications for approval to bring out other biocontrol agents for assessment and release, if they are found to be suitable. For now and in the medium-term though, management is relying on the more costly and labour intensive methods of chemical spraying, mechanical removal and burning.
A field trial currently under way in the Eurobodalla National Park, near Moruya on the NSW south coast, is testing a management strategy that integrates fire, herbicides and biocontrol agents. A section of the park with dense bitou bush was aerially sprayed with herbicide, then marked areas were later burnt. The burnt areas exhibited almost 10 times as much bitou bush regermination as the unburnt areas, and were then resprayed. This combination of treatments has reduced the soil seed bank.
Preliminary observations indicate that the seed fly is attacking seeds on both the burnt and unburnt plots. To date, about one third of seeds produced on the burnt plots are affected. The other best practice management guides to be released are on the blackberry, boneseed, bridal creeper, broom, horehound and St John's wort.
In all, 17 weeds are now considered to be capable of totally and permanently modifying the natural ecosystems they invade. Most of them arrived on the island continent either as ornamental plants (like rubber vine, \ICryptostegia grandiflora\i and the shrub lantana, \ILantana camara\i), or to provide pasture for the country's livestock (like buffel grass, \ICenchrus ciliaris\i or \ICenchrus pennisetiformis\i). The best known weed in Australia is the giant sensitive plant, \IMimosa pigra\i, but bitou bush and water hyacinth, \IEichhornia crassipes\i, are high on the list of future targets.
#
"Drought strikes Ethiopia again",1498,0,0,0
(Apr. '00)
Ethiopia, with a population growing at 3% each year, is facing another drought and famine, which may yet prove as serious as the huge famine of 1984-85 that was reported so widely in the media. A small amount of coverage appeared on the world's TV screens and front pages during April but, as people begin to die, the world remains largely oblivious to it all.
Typically, the region around Ethiopia receives rain twice a year: the \Ibelg\i in February or March (which failed in 2000), and a more significant rainfall, the \Imeher\i, in June to September. The area has now been largely without rain for three years, and desperately needs some rain to kick-start the recovery. The country now needs 800,000 metric tons of food, and funding of around $200 million.
While aid is one answer, help and self-help is even more important, according to the organisation Canadian Physicians for Aid and Relief (CPAR), which has been working with farmers to develop ways to improve soil conditions in the more temperate mountains of the northern highlands region of Ethiopia.
The northern mountains of Ethiopia have always had more rainfall than the southern, mostly pastoral, lands. The southern regions have been exhausted and hit hard by droughts over the last decade, and people have been migrating into the highlands. The problem is that the highlands can no longer support the populations that are flocking to these already stressed areas. There is now a desperate need to develop feasible ways of rehabilitating soil quality, even as the increasing population reduces the sizes of small holdings and puts even more pressure on the land.
The plots have lost their nutrients, and all the firewood has been harvested from the nearby hills, so manure from livestock is being burned as fuel rather than being spread on the fields. At the same time, farmers have terraced the mountainsides in the hope of bringing some 'new' land into productivity, and they are concentrating on high-yield crops so they can grow more on every acre they have. This raises the risk of huge expanses of a single crop, which are always vulnerable to pests and disease, and even to climate change. In response, the farmers have begun to diversify, growing a variety of different plants, from traditional forest trees to food crops that will feed their families.
Some exotic species are being added to the mix to help make good the losses the soil has already suffered. Trees such as \IEucalyptus\i, well adapted to arid environments and familiar to Ethiopians for most of the 20th century, are being brought in from Australia to provide future firewood and building materials.
But drought does more than damage soil and, in one state, 90% of the oxen have died, making ploughing almost impossible. Worse, farmers in a number of regions have either needed to eat their seed stock, or they planted it hopefully, only to lose it when the \Ibelg\i rains failed. The prospects of some Ethiopians recovering before the drought following this one look remote indeed.
The CPAR has a Web site, located at http://www.web.net/~cpar/ that provided some of the information used here and is updated on a regular basis. Web searches for 'meher AND belg AND Ethiopia' should provide relevant information, especially if you only select recent pages. Meanwhile, the Ethiopia-Eritrea war continues, costing more than $1 million a day for armaments alone, and killing more than are dying of drought - for the moment.
#
"Spring ozone highs in the Arctic",1499,0,0,0
(Apr. '00)
In the miserable cold of an Arctic spring, where ground temperatures are as low as -30°C, scientists are looking at an odd phenomenon, given what we read every day about ozone holes and ozone depletion. The February-May mission is led by the (US) National Center for Atmospheric Research (NCAR), and flies out of Colorado to staging points inside the Arctic Circle once a fortnight. The C-130 Hercules military transport plane is packed with scientists and specialized instruments that are designed to assess the annual springtime rise in lower-atmosphere ozone levels.
As people use more fossil fuels, ozone plumes form in polluted cities and drift around the world, and background levels continue to rise in the lower atmosphere even when they are falling in the upper levels. Ozone levels in the Arctic \Jtroposphere\j (the lower eight kilometers, or five miles, of the polar \Jatmosphere\j) increase from 30 to 40 parts per billion (ppb) in winter to 50 to 60 ppb in the spring - about half the concentration above Los Angeles on a bad day. At the same time, in the \Jstratosphere\j above, the returning springtime sun triggers chemical reactions that deplete ozone, creating a smaller, northern version of the Antarctic ozone hole.
The peculiar chemistry of the Arctic spring is key to understanding ozone and pollution processes across the northern latitudes. The researchers fly first to Churchill, Manitoba, on the Hudson Bay, 1400 land miles (2250 km) away from Colorado, rising and falling to measure chemical compounds at various altitudes along the way. From Churchill, the team sometimes flies another 1400 miles to Thule, Greenland, and then on to Alert, the last settlement on the northern tip of the last piece of North America - Ellesmere Island.
So long as the levels stay within normal bounds, there are no problems, but more work is needed to find out what these levels are - and if they are being met. "Ozone is produced and destroyed all of the time, but if the balance gets too skewed, we may end up with more pollution than we can tolerate," says NCAR's Elliot Atlas, chief scientist for the experiment.
Scientists suspect that some ozone sinks from the stratosphere into the troposphere, but how much? As springtime weather changes circulation patterns, ozone and ozone-producing compounds travel into the far north from the polluted regions of northern and central Europe. To what extent does this influx speed up the chemical processes that accompany the return of sunlight? Scientists believe measurements of 20 or so chemical species throughout the troposphere will provide answers. Already they have found surprises in the levels of important compounds.
To complicate matters more, scientists have found ozone-free bands about 50 km (30 miles) across at ground level. To explore these areas, the C-130 coasts just 30 meters (100 feet) above Hudson and Baffin Bays and the Arctic Ocean, sampling the chemistry occurring over the ice and open leads. The NCAR researchers and their colleagues hope to find the crucial data to explain why ozone builds up in the lower atmosphere even as it vanishes entirely from some surface areas.
#
"May, 2000 Science Review",1500,0,0,0
\JThe newest volcano\j
\JThe first humans out of Africa?\j
\JAstrophysicists detect cosmic shear\j
\JMAXIMA experiment confirms BOOMERANG\j
\JNew photos of Mercury's surface\j
\JThe origins of Eros\j
\JHigh winds in space\j
\JIo scrutinized\j
\JSmart antennas for a wireless Net\j
\JNew magnet tested in Japan\j
\JReduced genetic variation can be good\j
\JNo sex for 40 million years\j
\JMorning sickness has benefits\j
\JSecretin and autism\j
\JMeasuring autism by MLU\j
\JMultidrug-resistant tuberculosis\j
\JThe pill turns 40\j
\JMarathon runners can drink too much water\j
\JIndoor hot tubs and lung disease\j
\JCPR: a new look\j
\JMutations to order\j
\JChromosome 21 published\j
\JGM foods and vitamin A (1)\j
\JGM foods and vitamin A (2)\j
\JGenetic brothers populate the Middle East\j
\JBioMed Central brings a new era\j
\JThe Internet and old books\j
\JRestaurant noise\j
\JColonial spiders and web space\j
\JHarpin: a protein that protects plants\j
\JResidential radon exposure and lung cancer\j
\JCounting Grizzly Bear Numbers\j
\JThe ancient Antarctic and global climate change\j
\JEvidence found for ice-age El Niño\j
\JEl Niño Update\j
#
"The newest volcano",1501,0,0,0
(May '00)
Australia's government oceanographic research vessel, the R. V. \IFranklin\i, returned to Australia in early May from a Pacific cruise with two highlights: film of the birth of a new island, and a geological trophy lifted from the ocean floor. The international science team on board the vessel witnessed the dramatic birth of a new volcanic island on top of the Kavachi seamount, off the Solomon Islands in the southwest Pacific Ocean. It brought back spectacular video images of the event, as well as hauling up a world record size 'black smoker' chimney from the bottom of the Bismarck Sea.
They collected the chimney on the first leg of their cruise, and then found by chance that the Kavachi seamount had entered a new phase of eruptive island-building activity after nine years of apparent dormancy. According to expedition Chief Scientist Brent McInnes of CSIRO Exploration and Mining, "We arrived at the seamount site to find waves breaking on the volcanic peak. Violent eruptions were taking place every five minutes."
The eruption was throwing molten lava up to 70 meters above sea level, and sulfurous steam plumes mushroomed to 500 meters. The ship approached to within 750 meters (as close as it was safe to go), and the crew found that the volcano had grown dramatically since it was last surveyed in 1984. They were able to do something most unusual, to systematically collect samples of freshly formed volcanic rocks from the flanks of an erupting submarine volcano.
One interesting aspect was the discovery of sulfide-rich volcanic samples similar to gold ores from other volcanic centers like the Lihir mining operation in Papua New Guinea, another site investigated by the CSIRO-led team. The volcano has had a significant effect on the chemistry and turbidity of the ocean surrounding it. Gary Massoth, a New Zealand scientist from the New Zealand Institute of Geological and Nuclear Sciences reported detecting numerous chemical and particle plumes in the water column that extend at least 5 km from the center of the volcano.
Dr Ray Binns of CSIRO Exploration and Mining and his team recovered the huge submarine chimney from the bottom of the Bismarck Sea. The world record chimney weighs a tonne (800 kg in the water, where displacement provides some degree of support), stands 2.7 meters (9 feet) tall, measures 80 cm (32 inches) across at the base, and was dragged from an active volcanic hot spring at a depth of 1700 meters. It came up "swarming with remarkable microbes", according to Binns, and it is expected to be rich in zinc, silver and gold.
The chimney was formed by the slow formation of solid material from the mineral-rich water pouring from a black smoker, a sea floor vent. It is more than just a splendid trophy, having been collected as part of a probe to understand better the way in which giant Australian ore bodies like Mt Isa and Broken Hill were formed. Their origin seems to lie in vast hydrothermal systems on ocean floors that spew out plumes of superheated mineral-rich fluids that somehow create the mineral deposits we can later mine.
This search is what brought the scientists to an eerie landscape almost two kilometers below the surface of the ocean. Across the submarine plain, smoking undersea chimneys pump mineral fluids from deep in the earth's crust into the surrounding seawater. The chimneys are tubular encrustations that build up slowly around the vents until some calamity destroys them. These dot the plain with shattered mineral columns that resemble ancient ruins, spread out over undersea hills that are covered in snow-white carpets of bacteria and organic hydrates - compounds which can only exist at the extreme pressures of the deep ocean.
It was during the search that the \IFranklin\i's dredge snagged the huge chimney of a black smoker, a lucky chance event. The dredge fell right over the top, anchoring the ship for an hour, but finally the chimney snapped off at the base. Then it became wedged in the dredge frame on its point of balance, so it stayed there while it was winched all the way up to the ship, where it could be identified as mainly sphalerite, a mineral form of zinc sulfide.
The scientists say that the chimney must have been actively venting, because live snails dropped into the dredge bag, and fluid dripping from the chimney was quite acidic. Binns said that their prize lacked the characteristic smell of rotten eggs (from hydrogen sulfide gas) often found with smaller chimneys, but it was teeming with bacteria and archaea (also called \Jarchaebacteria\j), very ancient and primitive life forms.
The microbiologists aboard were delighted, according to Binns, since one of the major goals of the expedition was to identify particular microbes that can be used to process minerals on dry land in an effort to develop more efficient and cleaner ways to win metals. Located in the violent environment surrounding volcanic vents, these are 'extremophile' microbes which have the natural ability to process minerals at high temperatures. There is every prospect that these microbes could help to make Australia's $37 billion mineral export industry cleaner, greener, safer and more competitive.
The mineral-mining microbes are possibly relatives of some of the earliest forms of life to emerge on the planet more than three billion years ago, when conditions across the world were rather more like what we now see in these seafloor hydrothermal vents: high temperatures, lots of volcanic activity, and total darkness, with the nutrients that life needs pouring out of the earth itself.
The scientists on board the vessel came not only from Australia and New Zealand, but also from Papua New Guinea, the Solomon Islands and the USA.
See also \JChainsaw-equipped robot goes after smokers\j, July 1998, and \JTracking the missing minerals\j, November 1999 for another story on Dr. McInnes.
#
"The first humans out of Africa?",1502,0,0,0
(May '00)
Some very interesting human remains found at Dmanisi in the Republic of Georgia (part of the former USSR) were reported in \IScience\i in mid-May. A nearly complete fossil cranium and another skullcap seem to offer evidence of the first hominid species to journey out of Africa, and the finds have raised a few questions about the standard assumptions scientists have made about early humans and proto-humans.
The fossils can be reliably dated to an age of 1.7 million years, and they are the first fossils discovered outside of Africa to show clear signs of African ancestry. The age and skeletal characteristics of the Dmanisi skulls link them to the early human species \IHomo ergaster\i, a species known previously from Koobi Fora in Africa that some researchers believe is the African version of \IHomo erectus\i.
The standard scenario has \IH. erectus\i leaving Africa at an undetermined date, equipped with an advanced tool kit called the Acheulian or hand-ax tradition. The \JAcheulian tools\j are supposed to have given \IH. erectus\i the flexibility and power to become the first human group capable of surviving in the challenging environments outside of their African cradle. The actual date when these technologically advanced people left Africa is still hotly debated, but it now appears that there were people at Dmanisi, using a more primitive form of stone tool, at a date which is most probably earlier than the first \IH. erectus\i presence outside of Africa.
The tools found at Dmanisi are of the less sophisticated 'pebble-chopper' type that preceded the Acheulian in Africa, and the site itself is older than any known Acheulian tools anywhere. When we consider the tools along with the fossils' anatomy and the age of the site, the case for early, pre-Acheulian migrations out of Africa becomes a strong one.
The fossils were uncovered during the course of archaeological investigations of a medieval castle at Dmanisi. Careful geological investigation confirms that the human fossils, like the accompanying animal bones and tools, come from sediment-filled, irregularly-shaped 'burrows,' scooped out of the ancient strata by the flow of groundwater. A hominid jawbone had been found at the same site in 1991, in the same layer and excavation pit as the two skulls, provoking some debate about its correct species. But while jawbones offer fewer useful clues about the species they come from, the well-preserved crania have provided enough diagnostic detail for the researchers to compare them with other fossil human species, revealing close similarities with \IHomo ergaster\i.
The Dmanisi site offers a beautiful set of chronological clues, ranging from the isotope dates for the layer of basalt rock running beneath the site, to the paleomagnetic signature and the animal fossils in overlying deposits. The underlying basalt is dated at 1.77 million years, setting an upper limit to the age, while the paleomagnetic signature of the sediment burrows themselves points to a period from 1.77 million to a little over a million years ago. Luckily, the European faunal record, the animal fossil history of Europe, is very well documented.
As a result, the associated animal fossils at Dmanisi became the key to understanding the age of the site. Small rodents known from other European finds to have lived more than 1.7 million years ago occur with the hominids, and that sets a lower limit on their age.
Somewhere in the 70,000 year range from 1.77 to 1.7 million years ago, there were primitive humans living in Georgia, feeding successfully and making the tools they were used to making. Although the area offered plenty of raw material suitable for making Acheulian tools, all of the stone artefacts, more than a thousand of them, that have been recovered from the Dmanisi fossil layers are of a pre-Acheulian type that appeared in Africa as early as 2.4 million years ago.
So if improved technology did not lead the first hominids out of Africa, what were they doing in Georgia? According to the authors, the move might have been appetite-driven. The African savanna was expanding, so there was more 'protein on the hoof' and, with the appearance of members of the genus \IHomo\i, our ancestors had bigger bodies than before, bodies that required more energy to run, and therefore needed higher quality sources of protein as fuel.
As the early humans shifted their diets to include larger amounts of animal protein, they probably shifted their range to coincide with their new foods, according to Susan Antón. Until a better theory comes along, that one will have to do. But if her theory stands the test of time, it means that humans moved out of Africa almost as soon as they evolved a larger brain, and before they learned to make better stone tools.
Key names: Reid Ferring, David Lordkipanidze, Carl C. Swisher III, Susan Antón.
#
"Astrophysicists detect cosmic shear",1503,0,0,0
(May '00)
To a physicist, shear is a deforming force, or more loosely, the result of that force. During May, a group of astrophysicists announced in \INature\i that they had achieved the first observations of shear in the universe, an effect called cosmological shear that is predicted by Einstein's theory of relativity. What this means is that we now have some clear indications of where the \Jdark matter\j of the universe is located.
We cannot see the dark matter with a telescope, hence its name, but it can be detected by its effects when it forms a \Jgravitational lens\j. In this study, the lensing was revealed by systematically examining 145,000 galaxies for distortion, because the pattern of that distortion hints at how the dark matter in front of them is distributed. The observations were made with the US National Science Foundation's Cerro Tololo Inter-American Observatory in Chile and were based on a method known as weak gravitational lensing.
Gravitational lensing follows from a prediction of the general theory of relativity that gravity bends light. The trick was to look for evidence of distortions in very distant galaxies produced by the gravitational pull of dark matter that lies in the foreground, between the source and the observer. The effect of the lensing is to distort the galaxies into elliptical shapes, and it is this effect which the physicists call cosmic shear. The galaxies are not really distorted, they just look that way because the light has been affected as it travels towards us.
From measurements of the distortion, astrophysicists can calculate how much dark matter there is and, according to models favored by cosmologists, the amount of cosmic dark matter will determine whether the universe will continue to expand forever, slow down to a halt, or one day collapse on itself. From the data now available, they have already concluded that the 'standard cold dark matter model' can be ruled out. Under this model, there is enough ordinary matter and dark matter in the universe to eventually stop its expansion through gravitational force. Instead, the observations support an alternative model, in which the universe contains a certain amount of vacuum energy that causes it to expand more rapidly over time.
The observations were made with the Big Throughput Camera, an instrument specially designed and built to measure cosmic shear, which was installed in the upgraded 4-meter Blanco telescope at Cerro Tololo. The main problem lay in controlling imaging errors that are introduced when the light from distant sources passes through the Earth's atmosphere as well as through the telescope's optics. They achieved this by using thousands of foreground stars to correct the errors.
When these systematic errors are eliminated, any general trend that remains has to be caused by cosmic shear. The aim then is to look for certain similarities in the images of background galaxies that appear close together on the sky. The light from these galaxies, following a similar path, passes through similar intervening volumes of dark matter and gets bent by the dark matter's gravity. In other words, the distorted light carries information about the dark matter distribution.
Processing thousands of images is not easy, not least because galaxies are typically football-shaped in the first place, which makes any additional stretching of a background galaxy hard to detect. The answer is to average over thousands of galaxies: this study was limited to 145,000, but future work will look at tens of millions of distant galaxies to refine the estimates further. And when these measures are added to observations of the cosmic background radiation (see \JMAXIMA experiment confirms BOOMERANG\j, this month), our understanding of the cosmos will take a giant leap forward.
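To see why such a large sample is needed, consider a rough numerical sketch (a toy model, not the team's actual pipeline). If each galaxy's intrinsic ellipticity scatters randomly by around 0.3 while the lensing shear is a coherent distortion of around 0.01, the shape noise on the averaged measurement falls as one over the square root of the number of galaxies; all the values below are illustrative assumptions.

    # Toy sketch: recovering a small coherent shear by averaging the
    # ellipticities of many galaxies. All values are illustrative.
    import numpy as np

    rng = np.random.default_rng(42)
    true_shear = 0.01          # assumed coherent lensing distortion
    intrinsic_scatter = 0.3    # random 'football-shaped' galaxy shapes

    for n_galaxies in (100, 10_000, 145_000):
        # each measured ellipticity = shape noise + lensing signal
        measured = rng.normal(0.0, intrinsic_scatter, n_galaxies) + true_shear
        uncertainty = intrinsic_scatter / np.sqrt(n_galaxies)  # ~1/sqrt(N)
        print(f"N={n_galaxies:>7}: estimate {measured.mean():+.4f} "
              f"+/- {uncertainty:.4f}")

With 145,000 galaxies the shape noise on the average drops to well under a tenth of the assumed signal, which is why large samples, and the still larger surveys planned, are essential.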
Key names: David Wittman, Anthony Tyson, David Kirkman, Ian Dell'Antonio and Gary Bernstein.
#
"MAXIMA experiment confirms BOOMERANG",1504,0,0,0
(May '00)
Close behind the publications of the BOOMERANG data (see \JBOOMERANG yields good returns\j, April 2000) comes the news that a second balloon-borne telescope has provided confirmation of the earlier data. Called the Millimeter Anisotropy eXperiment IMaging Array (MAXIMA), the experiment's data, together with a detailed analysis, appeared in two papers submitted on Monday, May 8, to \IAstrophysical Journal Letters\i, and released on the Internet the following day.
The main point about the results is that they confirm that '\Jdark matter\j' and 'dark energy' make up most of the cosmos. The papers appear on the Internet, at the time of writing, at http://xxx.lanl.gov/list/astro-ph/new (as they will move, it is worth noting that the numbers of the papers are #0005123 and #0005124).
Unlike BOOMERANG, which mapped a much larger swath of sky, MAXIMA concentrated on a smaller, higher-resolution map of a square area of sky 11 degrees across in a northern region near the constellation Draco. For comparison, the sun and the moon are 0.5 degree across, and an adult's four fingers, at arm's length, cover about 10 degrees.
Like BOOMERANG, the MAXIMA data show that the universe is 'flat', rather than curved, that it conforms to the sort of geometry described by \JEuclid\j, some 2300 years ago. Only 5% of the universe's mass and energy is made up of ordinary matter, the common stuff of which the Earth, the stars and humans are made. The rest is either cold dark matter, the unseen mass that holds galaxies together, or dark energy, a mystifying pressure or repulsive force that seems to be accelerating the expansion of the universe. The dark energy is often referred to as the \Jcosmological constant\j.
Cosmology cannot be an experimental science in the normal sense, so cosmologists develop models of what might have been, calculate what traces different models would have left, and then look for those traces. The MAXIMA data fit very well with a subset of cosmological theories involving inflation, dark matter and a cosmological constant. Inflation is currently the most popular cosmological theory describing the early history of the universe, so this is a good confirmation of the standard cosmology.
There are 22 collaborators from 13 institutions and five countries in the MAXIMA team. While some of them are also involved in BOOMERANG, the two analyses were done completely independently, so the fact that these independent experiments give such similar results is the best indication that both groups are getting the science right. "These are extremely difficult experiments, and yet the data from MAXIMA and BOOMERANG show spectacular agreement," according to cosmologist Adrian Lee, a leader of the MAXIMA project.
In simple terms, both projects have mapped the temperature variations of the universe, producing thermal maps of what the universe was like 300,000 years after the Big Bang, when the universe first cooled down enough to allow actual atoms of matter to assemble. Over time, the apparent average temperature has fallen until it stands now at 2.73 kelvin, some 270°C (454°F) below zero. This temperature is detectable in microwave radiation that can be 'seen' in space, with variations that are no more than a few parts in 100,000. From the size of the spots in the thermal map, which are clustered in a size range of about one degree across, physicists can calculate that the universe is flat.
That is to say, it has the sort of geometry where parallel lines always remain the same distance apart. It need not be so: on the surface of a sphere like the earth, two meridians of longitude, both pointing north, will meet at the north pole. While that is only curvature on a two-dimensional surface, mathematicians can conceive of a curved three-dimensional space.
It is the flat nature of the universe, combined with recent data on the universe obtained from studies of distant supernovas, that lends support to the inflationary theory of the universe. And in particular, the experiments peg the amount of normal matter in the universe at about 5%, the amount of dark matter at about 30%, and the amount of dark energy, the \Jcosmological constant\j, at about 65%. It now looks very much as though the cosmological constant is necessary (see \JWhen was the Big Bang again?\j, May 1999).
While the clustering at a one-degree scale supports a flat universe, analysis reveals clusters at smaller angular sizes that are what scientists would expect from inflationary theories. There is other information as well, like a zero 'tilt' to the power spectrum, which essentially means that, immediately after inflation, the fluctuations in the energy in the universe were uniform over all size scales. Inflationary theories make two main predictions - a flat universe and a power spectrum with no tilt - and the MAXIMA data support both of these predictions.
The history of the \Jcosmic background radiation\j began in 1965, when it was first accidentally discovered and hailed as proof that the Big Bang happened. Theories explaining the evolution of the universe from its explosive birth then multiplied until 1992, when the Cosmic Background Explorer (COBE) satellite provided the first evidence that the microwave glow is not uniformly 2.73 kelvin.
The variations, around 100 millionths of a kelvin above or below the average, are traces of clumps and wrinkles in the very early universe, irregularities which presumably evolved into the clusters and superclusters of galaxies we see today. Previous experiments hinted at a flat universe, but MAXIMA and BOOMERANG have given cosmologists far clearer evidence for this.
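A quick calculation confirms that the two figures quoted, 'a few parts in 100,000' and 'around 100 millionths of a kelvin', are the same statement. A minimal check in Python, taking an illustrative value for 'a few':

    # consistency check on the quoted CMB figures
    mean_temperature = 2.73        # kelvin, average sky temperature today
    fractional_variation = 4e-5    # 'a few parts in 100,000' (assumed)

    delta_t = mean_temperature * fractional_variation
    print(f"fluctuation: {delta_t * 1e6:.0f} millionths of a kelvin")
    # prints about 109 millionths of a kelvin - 'around 100', as quoted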
MAXIMA 1 flew in August 1998 for one night at 40,000 meters (130,000 feet) and then returned to the ground by a parachute. It is a 1.3 meter telescope which focuses the microwave radiation on detectors cooled to one tenth of a kelvin by high-tech refrigerators, sampling the sky at angular resolutions from 5 degrees to 1/6 degree, where COBE could get no smaller than 7 degrees. MAXIMA 2 flew in June 1999 and observed roughly twice the area that MAXIMA 1 observed. MAXIMA 3 will fly in late 2000, and will attempt to measure the polarization of the cosmic background radiation, which has never been observed so far.
The MAXIMA web page is at http://cfpa.berkeley.edu/group/cmb, and print-quality maps of thermal fluctuations in the microwave background can be found at http://cfpa.berkeley.edu/group/cmb/sanders/wmap300npix.gif and at http://cfpa.berkeley.edu/group/cmb/sanders/wmap300npix.ps
#
"New photos of Mercury's surface",1505,0,0,0
(May '00)
The surface of Mercury has always been a challenge, ever since Galileo first tried to examine the planet in 1609. More than a quarter-century ago, the Mariner 10 spacecraft flew past Mercury and for the first and only time transmitted satellite-based photos of half of the surface of the planet closest to the Sun. But like Galileo, modern astronomers have found getting images of the surface of Mercury with a ground-based telescope very daunting. Now in the May issue of \IThe Astronomical Journal\i, and at the American Geophysical Union, astronomers from Boston University have released images revealing details of Mercury's surface.
The images were taken at the Mt. Wilson Observatory in California in late August 1998. The images, taken with a digital camera and stored on CD-ROMs for subsequent processing, reveal surface markings similar to the bright craters and dark lunar 'seas' (or maria) found on the Moon.
The challenge comes from Mercury's closeness to the sun, which means that astronomers can only see the planet just before sunrise or shortly after sunset. Then there is the need for 'the seeing' to be right, when the air is clear, so that researchers are looking through a minimum of turbulence in Earth's atmosphere. Opportunities even from space are limited, because light-sensitive equipment on the Hubble Space Telescope is not allowed to 'look' at objects close to the Sun, such as Mercury or Venus. The reason for this restriction is simple: the ban is to avoid the possibility of an accidental pointing error which might let too much light fall on the sensitive surfaces of an instrument.
In this case, the observations were made shortly after sunrise, before the Sun's heating of the atmosphere distorted the images captured by the telescope. The camera took very short exposures, 1/60th of a second each, continuously for some 90 minutes. There are 340,000 pictures in all, from which a set of 30 excellent shots (though 60 would have been better) could be combined into the equivalent of a half-second time exposure, long enough to capture detail on Mercury's surface.
To do this, the astronomers developed sophisticated computer techniques to identify the sharpest images, those taken during rare instants of 'perfect seeing'. They then combined these successfully, but they plan to go further, later in 2000, pushing the technique to try to capture images of the planet's weak atmosphere in a new set of observations.
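This select-and-stack approach, often called 'lucky imaging', can be sketched in a few lines of Python. The sketch below is a minimal illustration under assumed conditions - the sharpness metric, frame sizes and function names are hypothetical, not the Boston University team's actual software:

    # Minimal 'lucky imaging' sketch: rank short exposures by sharpness
    # and average only the best (real pipelines also align the frames).
    import numpy as np

    def sharpness(frame):
        """Score a frame by the variance of its intensity gradients;
        moments of good seeing give crisper, higher-contrast images."""
        gy, gx = np.gradient(frame.astype(float))
        return float(np.var(gx) + np.var(gy))

    def stack_best(frames, keep=30):
        """Average the 'keep' sharpest frames into one longer exposure."""
        ranked = sorted(frames, key=sharpness, reverse=True)
        return np.mean(ranked[:keep], axis=0)

    # e.g. thirty 1/60 s frames stack into a 0.5 s effective exposure
    frames = [np.random.rand(64, 64) for _ in range(1000)]  # stand-in data
    best_image = stack_best(frames, keep=30)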
Mercury has a thin atmosphere created by the ejection of atoms from its surface, a process that also occurs on our Moon. One of the elements in the Mercurian atmosphere is sodium, which forms a thin gas which reflects sunlight very well, and this year's work will concentrate on building a more sensitive detector system and using it to detect the sodium.
Key names: Michael Mendillo, Jeffrey Baumgardner, Jody Wilson, Mead Misic
#
"The origins of Eros",1506,0,0,0
(May '00)
Just before the first of the \Jasteroids\j was discovered in 1801, a young philosopher called Hegel wrote a dissertation to show that seven planets was not so much a chance situation as a necessity of nature. While politely ignoring Hegel's gaffe, astronomers have wondered ever since what these strange objects were made from.
Now, almost two centuries later and more than 150 million kilometers (nearly 100 million miles) from Earth, the NEAR Shoemaker spacecraft has provided an account of the chemical composition of the asteroid Eros. The analysis of this information suggests that Eros is a primitive relic of the emergence of the solar system from a cloud of gas and dust. Researchers operating the X-ray/gamma-ray spectrometer (XGRS) believe that the 34 km (21 mile) long asteroid is a very primitive body that has remained largely unchanged since the solar system was first formed.
The analysis was reported in a paper delivered to the spring meeting of the American Geophysical Union in Washington, D.C., at the end of May, after the data gathering became possible as the result of a solar flare in early May. This produced a 30-minute burst of X-rays, which in turn produced a 'glow', called X-ray fluorescence. The XGRS recorded the different elements on the asteroid's surface fluorescing at different wavelengths, allowing researchers to identify the elements in an area 6 kilometers (3.7 miles) across. This small sample area was dictated by the spacecraft's narrow field of view.
In particular, three of the XGRS's four detectors were able to separate out the signs of the elements magnesium, silicon and aluminium. Identifying the elements present is crucial if we are to know whether the asteroid is more like Earth and the moon - a complex body that has gone through intense melting in which a crust and a dense core have been formed - or whether it is a primitive body that has escaped heating and melting, remaining largely unchanged. The evidence now points clearly to this second choice.
Eros is one of the common S-type asteroids, so it is fairly reasonable now to assume that many other S asteroids also might show this primitive characteristic. Against this, astronomers warn that we only have a sample from one small part of just one asteroid, so that, as one researcher commented, "One should extrapolate from Eros to the rest of the S asteroids with great caution."
The XGRS data obtained so far do not answer such essential geological questions as whether Eros is a rubble pile or a collision fragment. They do, however, show that the asteroid is not composed of basalt or any kind of \Jigneous rock\j. The evidence for no melting comes from the observation that the asteroid, like a group of meteorites called chondrites, is homogeneous. If the material had been heated and cooled, crystals would have formed at different temperatures, and there would be more variation visible, in part as a layered structure.
Equally importantly, if a body is melted, heavy materials, like nickel-iron metal, tend to sink to the center, while lighter materials, composed mostly of silicon and oxygen, rise to the surface. If Eros were composed mostly of light materials, it would probably be a fragment from near the surface of a larger body, while an abundance of heavy materials would point to the origin of Eros as the core of a larger, broken-up asteroid. In fact, there is no extreme either way, no excess or deficiency of heavy or light materials. In the language of science, the asteroid appears to be undifferentiated and most probably primordial.
The ratios of elements in chondritic meteorites are very similar to the ratios of many elements in the sun and, in Eros, the ratios of silicon, magnesium and aluminium are all preserved. According to standard theory, the chondrites were formed by condensation from the solar nebula, the gas and dust from which the solar system emerged. Since the asteroid's element ratios appear so similar to the composition of the chondrites, we must assume that it shares their primitive origin.
It is still possible that the surface composition data have come from a dusting of micrometeorites on the surface of Eros, but as the spacecraft gets closer, similar analysis should be possible for the rest of the surface, even without solar flares. NEAR Shoemaker was orbiting at 50 km (31 miles) in early May, and will approach to 35 km (22 miles) in early July, and then as close as 19 km (12 miles).
See also: \JHegel, Georg Wilhelm Friedrich\j
#
"High winds in space",1507,0,0,0
(May '00)
The Chandra X-ray Observatory has detected what NASA calls a '1 million mph wind' streaming away from a black hole in the active galaxy NGC 3783. The wind is caused by the radiation produced by matter as it plunges into the black hole. The radiation heats the surrounding gas and drives the huge wind.
Researchers have used the High Energy Transmission Grating in combination with the CCD X-ray camera aboard Chandra to study the properties of the wind. At X-ray wavelengths, they have been able to probe the gas flows which were previously suspected, but which had remained invisible up until now. The grating spreads the incident X-ray beam into a spectrum, rather like a rainbow with hundreds of colors instead of the usual seven.
The next step was to plot the spectrum, and then use this to detect elements (including oxygen, neon, magnesium, silicon, sulfur, argon, and iron) in the stream by the sharp absorption dips in the plot. By examining the widths and locations of these dips, the researchers can use the same Doppler principle used by a police radar gun to measure velocities in the extreme environment of the galaxy's core. The analysis shows that the wind almost completely surrounds the black hole.
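The Doppler relation behind the measurement is simple: for speeds well below that of light, the velocity is the speed of light multiplied by the fractional shift of a line from its rest wavelength. A minimal sketch with illustrative numbers, not the actual NGC 3783 data:

    # velocity from a spectral line's Doppler shift (non-relativistic)
    C = 299_792.458  # speed of light, km/s

    def doppler_velocity(observed, rest):
        """Line-of-sight velocity in km/s; negative means motion toward
        us, i.e. a blueshifted, outflowing wind."""
        return C * (observed - rest) / rest

    # a hypothetical line blueshifted by 0.15% of its rest wavelength:
    print(doppler_velocity(0.99850, 1.00000))
    # about -450 km/s, roughly the '1 million mph' wind quoted above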
The black hole's event horizon has a diameter that is about a hundred times that of the sun, but it produces more radiation than a billion suns as gas is sucked into the black hole at nearly the speed of light. Some of the radiation is absorbed by the gas, where electrons on the atoms are energized or stripped away completely as the gas is heated to a hundred thousand degrees Celsius or more and driven away from the black hole into the galaxy.
Key names: Rita Sambruna, George Chartas, Gordon Garmire, and John Nousek.
#
"Io scrutinized",1508,0,0,0
(May '00)
Five reports on Jupiter's moon Io were featured in \IScience\i in mid-May, based on data gathered by NASA's Galileo spacecraft and the Hubble Space Telescope. As with previous studies, these reveal a bizarre world of hot volcanoes and sulfurous snowfields, but there were some surprises as well. Among these were giant, erupting plumes migrating with lava flows, red and green deposits that change as unstable sulfur compounds condense from huge plumes, and mountains that may split and slide sideways for hundreds of kilometers.
The view from Galileo, looking at Prometheus, reveals a volcanic field similar to Hawaii's volcanoes, but more active and much larger. Prometheus features an 80-kilometer (50-mile) tall plume of gas and particles erupting from near the end of the lava flows, like the steam plume seen where Hawaiian lava flows enter the sea. But there is a difference: while the size and shape of the plume have remained constant since at least 1979, the plume location wandered about 85 kilometers (53 miles) to the west between 1979 and 1996. One of the studies looks at this, and suggests that the Prometheus plume is fed when a 'snowfield' of sulfur dioxide and/or sulfur vaporizes under the lava flow and material erupts through a rootless conduit in the flow.
In the past, scientists had speculated that bright red material on Io came from unstable forms of sulfur condensing from sulfur gas. Now, by combining Galileo and Hubble data, more light has been shed on this. Hubble's ultraviolet spectrograph has revealed that there is a 350 kilometer (220 mile) high cloud of gaseous sulfur in the plume ejected by the volcano Pele. The sulfur atoms are grouped in S\S2\s molecules, similar to oxygen molecules, but these molecules are only stable at the very high temperatures found in the throats of Io's volcanoes.
When these molecules fall onto Io's frozen surface (about -160°C or -250°F) away from the volcanoes, they probably recombine into larger molecules with three or four sulfur atoms. The latter types of sulfur are red, so the Hubble results explain the 1200 kilometer (750 mile) wide, red debris ring around Pele.
Galileo has found many other smaller red patches near Io's active volcanoes where this sulfur conversion process probably also occurs. The red deposits are found near calderas or shield volcanoes where the lava first reaches the surface, often distant from plumes like Prometheus, where lava flows apparently vaporize surface materials. Eventually the atoms rearrange into their most stable configuration, rings of eight atoms (S8), which form ordinary pale yellow sulfur.
The composition of bright green materials on Io has been puzzling. In some places, it appears that when red material is deposited onto fresh lava flows, especially on caldera floors, it transforms into green material. It is possible that the surfaces are still warm, which accelerates the transformation of the red types of sulfur and the sublimation of sulfur dioxide.
Io is the most volcanically active body in the solar system, but its mountains (up to 16 kilometers or 10 miles high) are not volcanoes. There are no volcanic vents or flows on them; instead, they appear to be giant tilted blocks of crust. There are giant depressions on Io which are probably calderas formed by collapse over empty magma chambers, but unlike Earth's calderas, many Io depressions have very straight margins, sharp corners, and are located next to mountains.
At low resolution, many of the dark features, called pateras, appear to be calderas. However, higher resolution images suggest a different origin. In the case of Hi'iaka Patera, the northern and southern margins of the patera have very similar shapes which appear to fit together. This may indicate that the crust has been pulled apart here and the resulting depression has subsequently been covered by dark lava flows.
In other words, the new images of the Hi'iaka Patera depression and the adjacent mountains appear to show two mountain blocks that have split and slid apart by 145 kilometers (90 miles), forming a pull-apart basin, or \Jrift valley\j, like California's Death Valley or the Salton Sea in the \JColorado Desert\j. On Earth, such large-scale lateral movements are usually caused by plate tectonics, but there are no indications of a similar process on Io.
"We consider it more likely that lateral movements may be driven by deep 'mantle plumes' of rising hot rock masses within Io," said Dr. Alfred McEwen of the University of Arizona, Tucson, lead author of one of the papers. Clearly, there is still more to come in this story.
#
Soot damage to artwork
(May '00)
A thing of beauty is a joy forever, but not if our modern environment gets to it. An old painting looks dimmer than a modern work because soot, tiny pieces of black carbon emitted from trucks burning diesel fuel or factories burning coal, slowly builds up on paintings, causing the image to darken over time. Soot particles are small enough to elude most museums' air filters.
A study of this effect was published in the print edition of \IEnvironmental Science and Technology\i in mid-May. The work was carried out by Leon Bellan, a high school student at the time, who was named a national finalist in the prestigious Intel science competition for his work.
Once soot sticks to artwork, says Bellan, it is difficult or even impossible to remove, so any extension of our knowledge of this problem can help us control damage to priceless works of art, frescoes and family heirlooms. Bellan's research was overseen by Glenn Cass, a professor at the Georgia Institute of Technology.
Part of the research involved using a laser printer that randomly deposited microscopic dots on eight different colored samples. When the samples were viewed separately, more than 12% of a sample had to be covered with dots before people could accurately see the difference between clean and soiled sheets. If the sheets were viewed side by side, differences between 'clean' and 'dirty' samples were easier to detect: people could see a difference when less than 4% of a sheet was covered with dots simulating soot.
Previous studies in southern California have shown soot deposition rates that ranged from 0.08 to 2.7 micrograms per square meter of artwork per day, and these rates are generally applicable to a modern building anywhere in the world.
At these rates, it would take somewhere between five and 300 years for a painting to reach the visual threshold of darkening, with shorter times likely in Europe, where more diesel fuel adds additional soot to the environment, and during the winter in colder climates, when fuel use goes up. But at that sort of rate, the changes in objects in a museum or art gallery will be so gradual as to escape the notice of people who see the collection on a daily basis.
The future does not look bright for works of art.
#
"Smart antennas for a wireless Net",1509,0,0,0
(May '00)
The wireless Internet is rushing upon us fast and, as more people adopt cellular mobile phones which are Internet-capable, clever methods will be needed to avoid Internet traffic jams and dropouts. In the United States, the number of people downloading data with wireless devices is expected to surge from the current 3% of the online population to as high as 78% over the next year. And while the rest of the developed world can be expected to follow close behind, the developing world, which lacks a large telecommunications infrastructure, may achieve a rapid catch-up by going straight to this technology.
That will bring with it the need for smart antennas and clever signal processing, according to an Internet announcement from Michael Zoltowski, a professor in Purdue University's School of Electrical and Computer Engineering. He was describing a paper he, Thomas P. Krauss and Geert Leus have prepared for delivery in Turkey in June at a conference sponsored by the Signal Processing Society of the IEEE (see http://ICASSP2000.sdsu.edu for conference details). The real problems, he says, will arise when too many people seek to download data from within the same cell at the same time while relying on wireless links.
Just as people hear better with two ears than with one, future wireless communications devices may have two or more antennas so they can outperform conventional, single-antenna versions. With such antennas, cellular communications equipment will be better able to access the Internet and download large amounts of data, including video files.
The real problem is interference, but by using multiple antennas, there is the potential for increasing reception accuracy by as much as 100 times and enabling three times as many wireless users to operate within the same frequency band. The antennas are called 'smart' because they are able to reject the interference and compensate for the 'multipath effects' caused by signals reflecting off buildings and other structures.
Zoltowski reports that Texas Instruments, which is partially funding research to develop this technology, is testing a prototype cell phone that has two antennas. The second antenna is a patch-like strip instead of the standard whip antenna. A similar setup might also be used for laptop computers, he says: the goal for the next generation of cellular communication systems is for a user to be able to fire up a laptop and download information from the Web while in a car or on a train, with no wire connections.
Cellular communications rely on base stations, which are arranged in a sort of honeycomb pattern spanning large geographical areas. The performance of these systems is hindered by two types of interference that foul up a widely used technique critical to the economical functioning of cellular communications.
That technique, code division multiple access or CDMA, makes it possible for many users to operate on the same frequency band at the same time. Multipath effects arise when signals bounce off buildings or mountains on their way to and from cellular base stations. CDMA only works when the signals for numerous users are transmitted in a precise sequence with exact timing, but the multipath signals produce echoes that interfere with other signals, destroying the precise timing. Known as 'multi-user access interference', this sort of problem would be greatly reduced by having more than one antenna.
CDMA systems also suffer interference when the user approaches the boundary of two adjacent cells and has to be 'handed off' from one base station to the next. At this time, the user has two base stations talking to the receiver at the same time, which means that one of them is necessarily causing interference. Having two antennas, says Zoltowski, allows the receiver to distinguish between the two base station signals because they are arriving from different directions, in much the same way that having two ears gives us the ability to determine from which direction sound is coming.
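The 'two ears' analogy can be made concrete: a plane wave arriving off-axis reaches one antenna slightly before the other, producing a measurable phase difference. The sketch below is a textbook illustration rather than the Purdue group's actual algorithm, and the carrier frequency and half-wavelength antenna spacing are assumptions:

    # Estimating a signal's arrival angle from the phase lag between
    # two antennas (illustrative two-element direction finding).
    import numpy as np

    SPEED_OF_LIGHT = 3.0e8    # m/s
    CARRIER_HZ = 1.9e9        # an assumed cellular carrier frequency
    WAVELENGTH = SPEED_OF_LIGHT / CARRIER_HZ
    SPACING = WAVELENGTH / 2  # assumed half-wavelength antenna spacing

    def phase_difference(angle_deg):
        """Phase lag (radians) between antennas for a plane wave
        arriving at the given angle from broadside."""
        extra_path = SPACING * np.sin(np.radians(angle_deg))
        return 2 * np.pi * extra_path / WAVELENGTH

    def angle_from_phase(phase_rad):
        """Invert the relation to recover the arrival angle (degrees)."""
        return float(np.degrees(np.arcsin(
            phase_rad * WAVELENGTH / (2 * np.pi * SPACING))))

    # two base stations at different bearings give distinct phase lags:
    for bearing in (10.0, 45.0):
        print(bearing, "->", angle_from_phase(phase_difference(bearing)))

Because the two base stations' signals arrive with different phase lags, a two-antenna receiver can tell them apart, just as the text describes.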
Another solution may be to use sophisticated techniques called 'space-time equalization', to try to restore the delicate timing and sequence of the codes that are used to transmit data, and this can be done in conjunction with multiple antennas.
In theory, CDMA should be able to accommodate up to 64 users in each band but, in reality, because of interference effects, systems can only allow about 20 users in a given band - at most, a third of the potential number. Computer simulations have shown that using two antennas and space-time equalization would enable the simultaneous use of all 64 channel codes in each frequency band.
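The figure of 64 users per band reflects the fact that CDMA assigns each user one of 64 mutually orthogonal spreading codes. A minimal sketch of the standard construction (Walsh codes, built by Sylvester doubling of a Hadamard matrix); it is the precise chip timing of these codes that multipath echoes destroy:

    # Build 64 mutually orthogonal +/-1 spreading codes (Walsh codes).
    import numpy as np

    def walsh_codes(n):
        """Return an n x n matrix (n a power of two) whose rows are
        mutually orthogonal spreading codes."""
        h = np.array([[1]])
        while h.shape[0] < n:
            h = np.block([[h, h], [h, -h]])  # Sylvester doubling step
        return h

    codes = walsh_codes(64)
    # every distinct pair of rows has a zero dot product, which is what
    # lets 64 signals share one frequency band - until multipath echoes
    # destroy the chip timing that orthogonality depends on.
    assert np.array_equal(codes @ codes.T, 64 * np.eye(64))
    print("64 orthogonal codes of length 64 verified")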
As always, there is a downside. While the accuracy can be increased by about 100 times when two antennas are used instead of one, adding more antennas will increase power consumption for portable equipment, and with small size a key selling point for new telephones, who wants to make a portable phone with a larger battery?
#
"New magnet tested in Japan",1510,0,0,0
(May '00)
A 150-ton magnet, which will be central to an international experiment for fusion energy research, has passed its initial operating test, according to May reports. The magnet has now been run under full operating conditions, producing a 13 tesla magnetic field. This is about 260 thousand times more powerful than the earth's magnetic field. The magnet draws a 46,000 ampere current, about three thousand times the peak current handled by typical household wiring.
This is the world's most powerful pulsed superconducting magnet, but it is only a test-bed for even larger magnets planned for the International Thermonuclear Experimental Reactor (ITER), which is to be used in a range of studies demonstrating the feasibility of nuclear fusion as an energy source. The ITER magnets will provide the magnetic fields needed to start and maintain the plasma, an electrically charged gas-like state of matter that is needed for the fusion reaction.
Tests are to continue, aimed at reaching the 13 tesla point sooner and at dropping away from that level faster because the magnet is only doing its job when the magnetic field is changing. The most recent tests showed change rates on the upward side of 0.005 tesla/second, with a dropping rate of 0.7 tesla/second, while the end goal is pulsed operation at rates up to 1.2 tesla/second, both going up and coming down.
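The practical meaning of those ramp rates is easy to work out from the figures quoted: a full sweep to 13 tesla at the measured ramp-up rate takes about three-quarters of an hour, while the 1.2 tesla/second target would allow the same sweep in roughly 11 seconds. A small check in Python:

    # time for a full 0-to-13-tesla sweep at each quoted rate
    FIELD_T = 13.0  # peak field, tesla

    for label, rate in [("ramp-up (measured)", 0.005),
                        ("ramp-down (measured)", 0.7),
                        ("target, both directions", 1.2)]:
        seconds = FIELD_T / rate  # tesla / (tesla per second)
        print(f"{label}: {seconds:,.1f} s (~{seconds / 60:.1f} min)")
    # ramp-up at 0.005 T/s takes about 2600 s (43 min); at the 1.2 T/s
    # target the same sweep would take about 11 seconds.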
The magnet is made of two modules, one designed and built in the United States, the other in Japan, which were brought together in Japan last year. Other components have been produced in other parts of the world. The ITER partners are the European Union, Japan, and the Russian Federation. The US was a partner until late 1998 and is presently participating in the testing of the magnet through a bilateral agreement with Japan.
See \JA better magnet for better fusion\j, February 1999 for an earlier story on the magnet.
#
"Reduced genetic variation can be good",1511,0,0,0
(May '00)
One of the 'givens' of biology is that genetic diversity is as important to a species as biodiversity is to an ecosystem but, like all good rules, nature seems all too willing to go the other way. In a mid-May paper in the \IProceedings of the National Academy of Sciences\i, a group of University of California, San Diego biologists has concluded that a lack of diversity can sometimes be a good thing, at least in one species of ant.
The species in question is the Argentine ant, \ILinepithema humile\i, formerly called \IIridomyrmex humilis\i (and still often listed on the Internet under this name), which was introduced into a number of areas around the world and reached California some time before 1905. This is an aggressive species that drives other ants from the area and, in its native habitat, competes with other ants of its own species.
The tiny dark-brown and black ants, which are about two millimeters in length, are thought to have entered the United States aboard ships carrying coffee or sugar from Argentina, probably during the 1890s. They then expanded throughout California and the southern parts of the United States. In the Southeast and much of the South, their spread is now limited to some extent by the introduction of fire ants: see \JWhat is a quarantine system worth?\j, January 1999, and \JFire ants and the elderly\j, September 1999, for more on these American pests.
The study deals with the success of the ant in invading an area from San Diego to Ukiah, 100 miles (160 km) north of San Francisco. The researchers credit this success to reduced genetic diversity, which has allowed a giant 'supercolony' of closely related ants to grow unchecked across the range. In doing so, it has become a major pest that has invaded homes and displaced native species of ants in much of the coastal regions of the state.
In Argentina, fighting among the more genetically dissimilar territorial ants has managed to keep these insects in check and in smaller, much more sharply defined colonies than those in California. "When we did our field work in Argentina, it was surprisingly difficult to find Argentine ants, compared to our experience in California," says Ted J. Case, a professor of biology at UCSD who headed the research team. "They are a relatively inconspicuous feature, both in the urban and in the natural environment."
In California, the ants thrive in the temperate and damp coastal regions, killing and displacing native ants, many of which are 10 times larger in size. This is a problem not only for those species, but also for the species like the coastal horned lizard that feed on the larger native ants. In Argentina, native Argentine ant colonies living close together were territorial and aggressive toward one another, literally tearing one another apart whenever they came into contact.
So why were the Californian ants not doing the same thing? Neil Tsutsui, a graduate student, used some of the same DNA fingerprinting techniques employed by criminologists to show that the ants in their native Argentina were twice as genetically diverse as the introduced California ants. Thus, the California ants regard individuals up and down the coast as close kin, while neighboring colonies in Argentina treat one another as strangers.
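One standard way to put a number on such diversity is expected heterozygosity, the chance that two alleles drawn at random from the population differ. The sketch below is purely illustrative - the allele frequencies are invented to echo the bottleneck, not taken from the study's data.

def heterozygosity(freqs):
    # Expected heterozygosity: H = 1 - sum of squared allele frequencies.
    return 1 - sum(p * p for p in freqs)

argentina = [0.25, 0.25, 0.25, 0.25]    # four alleles, evenly spread
california = [0.75, 0.15, 0.05, 0.05]   # one allele dominates post-bottleneck

print(f"native range H = {heterozygosity(argentina):.2f}")    # 0.75
print(f"California H   = {heterozygosity(california):.2f}")   # 0.41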
Because of the similarities, an Argentine ant from San Diego dropped into a colony in San Francisco would be welcomed, while an Argentine ant dropped into a colony two hundred meters away in its native country would be torn apart. "They have an innate ability to recognize other members of their colony based on how genetically similar they are to themselves," says Tsutsui. "Since they evolved in their native range, where colonies are set up as family structures, they tend to recognize other members of their colony by how closely related they are."
So how did this lack of diversity arise? The answer appears to be the genetic bottleneck, or founder effect, which was generated when all of the California ants came from a relatively small founding population of genetically similar ants from one single colony, or closely related colonies, in Argentina. "Because they've gone through a genetic bottleneck, everybody's genetically similar and everybody recognizes everybody else as a member of their own colony," says Tsutsui. "In essence, the supercolony that we see in California is in fact one big colony."
"The thing thatÆs surprising about this result is that, typically, reduced genetic variation or diversity are considered bad for populations," he adds. "With Argentine ants, it really looks like this is beneficial for the species, at least for the short term." And as if that was not enough, just a few days later, the standard assumption of evolutionary biology that successful organisms rely on sexual reproduction to spread their genes appeared to have a similar challenge; see \JNo sex for 40 million years\j, this month, for more on this story.
#
"No sex for 40 million years",1512,0,0,0
(May '00)
One of the standard 'givens' of evolutionary biology is that sexual reproduction is important to the survival of a species, but like all good rules, nature seems all too willing to go the other way. Just a few days after another hallowed principle of biology was shown not to apply in Argentine ants (see \JReduced genetic variation can be good\j, this month), a paper in \IScience\i outlined the evidence that some tiny bdelloid rotifers have seemingly evolved without sex for millions of years and probably do not even exist in male form. (A \Jrotifer\j is a very small aquatic animal.)
David Mark Welch and Matthew Meselson described an odd pattern of differences between versions of the same genes in the bdelloid genome, a pattern that is probably most easily explained by an extremely long period of asexual reproduction. Today, there are about 2 million named species, and a bare 2000 of them rely on asexual reproduction alone. Few of these lineages seem old, and fossil evidence has suggested that asexuality is a dead end. In other words, the reason why there are few asexual species is not a matter of the asexual forms not arising, but more a matter of their inability to last.
Sexual reproduction means taking each parent's 'deck' of genes, shuffling and cutting the deck, then combining half of it with half of the other parent's deck, so that sooner or later, surviving mutations which arose in separate individuals will come together in a single offspring. This way, any possible interactive effects can occur. In asexual species, the only way such mutations will be combined is if two mutations occur in the same line of clones.
There are a variety of ideas about the advantages of the sexual cut-and-shuffle, but they come down to two main groups. One group of explanations argues that sexual reproduction speeds the process of dumping bad mutations, losing them from the population, and the other group focuses on spreading benefits. For example, the delightfully named 'Red Queen hypothesis' finds the main benefit of sex in shuffling the genome quickly, making it difficult for parasites to lock onto weaknesses.
The oldest bdelloid rotifer so far recovered from amber died some 40 million years ago and, despite their apparent asexual status, bdelloids have evolved into around 360 recognized species. That said, there will always be some doubt as to whether we are correct in assuming asexuality. There was one case of a scale insect which turned out to have males, contrary to what scientists had previously thought, but the males are tiny things that stick to the females' legs, and so they are hard to detect and recognize.
This is when biologists reach for the test tubes and start doing a bit of biochemistry. For a molecular test, Meselson focused on four bdelloid genes, to see how they had varied over time. In an asexual cloned line, each individual just goes on copying the genes passed down to it, and in different branches, over time, small glitches arise and are passed on. These are harmless changes, but they build up over time, so that one copy of a gene in an ancient asexual form can develop very different 'mistakes' from the same organism's other copy. The more different two copies of a gene are, the longer it must be since the organism used sexual reproduction.
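That logic can be turned into a rough number with a molecular-clock calculation, sketched below. Everything in it is invented for illustration - the two 'copies' and the assumed mutation rate bear no relation to the actual bdelloid data.

copy_a = "ATGGCTAGTTCAGATCCAGGA"      # two copies of the same gene
copy_b = "ATGACTAGCTCAGTTCCAGGA"      # in one (imaginary) individual

diffs = sum(1 for x, y in zip(copy_a, copy_b) if x != y)
divergence = diffs / len(copy_a)      # differences per site

MU = 1e-9     # assumed substitutions per site per year (hypothetical)
years = divergence / (2 * MU)         # both copies accumulate changes
print(f"{diffs} differences -> roughly {years:,.0f} years without sex")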
Without sex to spread them around, copies of the same gene within an organism can look as different from each other as if they began diverging when sex stopped. That is the pattern of variation that Welch and Meselson found when they checked genes in four bdelloid species and, while it will require more studies for confirmation, it appears likely that bdelloids are, at the very least, doing something seriously weird with their genome.
#
"Morning sickness has benefits",1513,0,0,0
(May '00)
A study published in the June issue of \IThe Quarterly Review of Biology\i, released in May, offers a new insight into \Jmorning sickness\j, known in medical circles as NVP (for nausea and vomiting in pregnancy). The two thirds of pregnant women who suffer from this unpleasant problem can take some small comfort in knowing that it appears to be Mother Nature's way of protecting mothers and fetuses from food-borne illness and also shielding the fetus from chemicals that can deform fetal organs at the most critical time in development.
Samuel M. Flaxman and Paul W. Sherman believe their findings help to explain why many pregnant women develop an aversion to meats, as well as to certain vegetables and caffeinated beverages, in early pregnancy and prefer bland-tasting foods instead. 'Morning sickness' is a complete misnomer, they say, and commenting on the Internet after publication, Sherman argued that "NVP doesn't occur just in the morning but at any time during the waking hours, and it's not a sickness in the pathological sense. We should change the name to wellness insurance."
Sherman, a professor of neurobiology and behavior at Cornell, and Flaxman, a Cornell biology graduate student, analyzed hundreds of studies covering tens of thousands of pregnancies. They say that their analysis suggests that morning sickness and the aversion to potentially harmful foods is the body's way of preserving wellness of the mother at a time when her immune system is naturally suppressed. This suppression is designed to prevent rejection of the child that is developing in her uterus, since this is essentially a 'foreign object', with half of its genes different from those of the mother. During pregnancy, a woman has reduced defenses against food-borne pathogens.
During the key first trimester, the first three months in which the cells of the tiny embryo are differentiating and starting to form structures, this protection is important. The developing structures and organ systems, including arms and legs, eyes and the central nervous system, are vulnerable at this stage.
For example, many plants make what are called teratogenic phytochemicals, which means plant chemicals that interfere with the development of an embryo or fetus. The plants make these chemicals to defend themselves against disease and insects - and maybe also against browsing animals which eat the plants, such as us. In any case, the chemicals are there and, while they don't do us any good, they don't normally do us any harm either. (Some scientists believe or suspect that small amounts of these chemicals might even be beneficial because of their antioxidant properties and trace elements.)
During pregnancy, the rules change, and women with morning sickness are probably shielding the developing unborn from the harsh chemicals by vomiting and by learning to avoid certain foods altogether until the fetus develops beyond the most susceptible stage. There is certainly a wealth of circumstantial evidence to back this up: women who experience morning sickness have their symptoms peak precisely when embryonic organ development is most susceptible to chemical disruption, between week 6 and week 18 of pregnancy.
And while it may be a small comfort at the time, women who experience morning sickness are significantly less likely to miscarry than women who do not, while women who actually vomit are significantly less likely to miscarry than those who experience nausea alone.
A pattern of avoidance and aversion to certain foods can be seen during the first trimester for many pregnant women. The foods most avoided include meats, fish, poultry and eggs, the very foods most likely to carry harmful microorganisms and parasites before the advent of modern refrigeration and food-handling processes. (We have to regard a response like NVP as an evolved response and, as such, it is unlikely to stop as soon as refrigerators appear on the scene, but is far more likely to hang on for millennia after that time.)
We can add to that the observation that strong-tasting vegetables, as well as alcoholic and caffeinated beverages, also are disliked by many pregnant women. The researchers identified seven traditional societies with virtually no morning sickness and noted that, in these cases, animal products are not generally a dietary staple. Plant-based foods and corn in particular are the dietary staple in six of seven societies with little or no morning sickness. The edible parts of the corn plant, the kernels, have very low levels of phytochemicals.
In the past, NVP has been explained in a number of ways. These include hormones, mother-offspring genetic conflict, or communicating to nearby males and kin that women are pregnant (resulting in decreased sexual activity and increased help from family members). The problem with this last argument, Sherman says, is that intercourse during the peak period of morning sickness, the first trimester, is generally not harmful for pregnant women.
The genetic-conflict hypothesis predicts that there should be more morning sickness later in pregnancy when the embryo is able to take resources, but, the biologists observe, morning sickness actually peaks early in pregnancy. Flaxman and Sherman do not dispute that hormonal effects are important, but they question why the response should be nausea and food aversions rather than some other symptom, such as headaches.
So should women worry if they do not experience morning sickness? The answer, say the researchers, is that these women should not be alarmed. "Our analysis of thousands of pregnancies shows that most women in Western societies bear healthy babies whether or not they experience morning sickness," Sherman says. "The lack of NVP symptoms does not portend pregnancy failure any more than experiencing NVP guarantees that the pregnancy will have a positive outcome."
The important message, they say, is that trying to ease the symptoms of 'normal' (not severe) NVP probably will not improve the outcome of a pregnancy and could have the opposite effect if treatment interferes with the expulsion or avoidance of potentially dangerous foods. At the same time, encouraging women to eat foods they dislike during pregnancy will not improve the pregnancy outcome and could increase the embryo's exposure to pathogens and harmful chemicals.
"We are not suggesting that pregnant women cut meat and vegetables out of their diets. In other words, listen to your body," Sherman says. It may sound a little 'New-Age-ish', but given the science behind it, this looks like good advice.
See also: \JSalt hunger\j, August 1998
#
"Secretin and autism",1514,0,0,0
(May '00)
In 1998, an autistic boy was given secretin in a diagnostic test for diarrhea, and his mother later claimed that her son showed improvements in his autistic symptoms afterwards. The chemical has one diagnostic application, no known therapeutic use, and few known side effects, although one version of the drug has been associated with allergic and anaphylactic reactions. It has not been tested for safety or efficacy in children, either in a single-dose or as a long-term medication.
In spite of that, since the first report, parents have been dosing their autistic children with secretin and reporting that their children have shown remarkable improvements. Word of the product spread over the Internet, but there have been no controlled tests so far to see if secretin can really improve autistic children's ability to talk and interact with others. At a mid-May meeting of the Pediatric Academic Societies/American Academy of Pediatrics in Boston, some light was shed on the subject, and the news was not good.
Twenty autistic children, aged 3 to 6, were given formal language testing before infusion with secretin and in four follow-up tests in ensuing weeks. None of the children showed significant changes in either receptive or expressive language.
This was contrary to the claims that had been aired on the Internet, in the \IWall Street Journal\i and on Dateline NBC, all claiming amazing results for secretin in treating \Jautism\j. The first boy was reported to have shown increased alertness, eye contact and expressive language within days after receiving a single dose of the hormone. An article in 1998 in the \IJournal of the Association of Academic Minority Physicians\i described marked improvements in language over a short time, and reported effects lasting up to five weeks.
The problem with these reports is that they were all based on subjective assessment, and could really not rate any higher than anecdotal evidence. But as the story spread, it appears that thousands of children began receiving secretin in repeated doses. It appears that the whole process was out of control, with some web entrepreneurs charging inflated prices for secretin for autistic children, and with the material, in some cases, being administered in unproven ways, such as by mouth or through the skin using a solvent called DMSO.
Autism is a syndrome which begins in the first few years of life, and which involves severe deficits in social and communication skills. These deficits show up mainly in the child's ability to process incoming information from the social and physical worlds, to interact with others and to use language. As yet, there is no identified cause, although a number of theories exist. Nor is there a proven cure, but some autistic children show remarkable improvement of their language and social skills under what are called 'behavioral interventions', while older individuals' obsessive behaviors can be eased by some medications.
The main effect of secretin is to stimulate the pancreas to increase production of digestive fluids. Impaired ability of the pancreas to respond to stimulation can result in chronic diarrhea. The usual medical use of secretin is to diagnose any such pancreatic problems when people suffer chronic diarrhea.
The children in the study, led by Jenifer Lightdale, were about the same age and were given a similar intravenous dose of secretin to that given to the three children described in the 1998 journal article. They were all tested before the treatment, and then at one, two, three and five weeks after treatment, using a standardized test called the Preschool Language Scale - 3. They were also videotaped during play and scored for specific behaviors that are characteristic of autism.
The PLS-3 test showed no quantifiable changes either in the way children understood language or the way they were able to use words and gestures to express themselves. The videotaped behavior tests were assessed by three independent reviewers who never met the children and were not told which week of the study a given test represented, but these results are still to be published. The authors appear to imply that there will be no surprises forthcoming when this analysis is presented.
They do not find this negative result surprising. As Bryna Siegel, one of the researchers, commented on the Internet: "The claims made for this drug do not hold up well to any neuro-developmental model of how new skills are acquired," she said. "Language ability depends on changes in the brain as the child goes through activities that stimulate the acquisition of vocabulary and grammar structure. The child has to develop a two year old's language ability to go on and learn to speak like a three year old. A pill can't do that for him."
Interestingly, 18 sets of parents filled out surveys about their child's condition, and 15 of them indicated that they felt their child had moderate to significant improvements in language skills following the secretin treatment, even though the researchers could detect no changes.
A December 1999 report in the \INew England Journal of Medicine\i described another study with similar negative results in a group of children aged 3 to 14, half of whom received secretin while the other half received a placebo. In that study, there were no significant effects (in fact, there was a slight improvement on the behavior test in the placebo group), but a majority of parents, including those whose children received the placebo, retained their interest in using secretin even after they were told the results. Hope, it seems, is stronger than scientific rationality, especially when it relates to a child with a chronic disability.
"Hope is essential when you care for a child with a chronic disability and there is no definitive treatment," Siegel said. "But sometimes hope lets people believe more than they truly can count on. It can be a roller coaster that, in the end, is just another source of strain."
That may be so, but there \Imay\i just be the tail of an interesting idea here, as we explain in \JMeasuring autism by MLU\j, this month.
Key names: Melvin Heyman, Bryna Siegel and Jenifer Lightdale
#
"Measuring autism by MLU",1515,0,0,0
(May '00)
Secretin, the new hope (or fad) for parents of autistic children, is now under serious scrutiny (see \JSecretin and autism\j, this month) to see if it can be useful in any way, with most of the evidence so far suggesting that it has no effect on autism. It is just possible, however, that the ways in which any such effects are measured may need to be refined.
A second report at the mid-May meeting of the Pediatric Academic Societies/American Academy of Pediatrics outlined Donna Moriarty's pilot study of secretin to treat autism. Ten children (8 boys, 2 girls) " . . . with autism/PDD with a mean age of 5.6 years were entered into a double-blind crossover study of the impact of a secretin infusion on their neurobehavioral profile over a 12-week period". Children were scored on weekly behavioral rating scales as well as formal testing and videotapes. Scores for pragmatic language and mean length of utterance (MLU) were used to monitor change.
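MLU itself is a very simple statistic: the average number of morphemes per utterance, often approximated by counting words. A minimal sketch, with invented utterances:

# Words stand in for morphemes here; clinical scoring is more careful.
utterances = [
    "want juice",
    "mommy go car",
    "no",
    "doggie sit down now",
]

mlu = sum(len(u.split()) for u in utterances) / len(utterances)
print(f"MLU = {mlu:.2f}")    # (2 + 3 + 1 + 4) / 4 = 2.50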
While the therapist who performed the evaluations was able to correctly identify those who received the secretin in 90% of cases, parents were accurate only 50% of the time. And while the various measures as a whole could not distinguish the treatment group from the placebo group, both secretin and placebo effects were significantly different from baseline.
More importantly, data analysis revealed an improvement in MLU, but only in the secretin group. Given the small numbers involved, more studies will be needed, but if there are just some aspects of autism influenced by secretin, or if there are just some autistic people who are influenced, this might give researchers a useful scale on which to assess people after various treatments.
#
"Multidrug-resistant tuberculosis",1516,0,0,0
(May '00)
An editorial in the \IJournal of the American Medical Association\i (JAMA) in mid-May underlines the problem of drug-resistant tuberculosis (TB), which was first observed in 1948, not long after the first trials of streptomycin as a TB treatment. Ever since then, drug-resistant TB has been recognized to occur as the result of sub-optimal TB treatment.
According to the editorial, a recent survey of 35 countries revealed that 12.6% of \IMycobacterium tuberculosis\i isolates were resistant to at least one drug, and 2.2% were resistant to both of the primary drugs used against TB, isoniazid and rifampin. In other words, most cases of TB can be defeated by drugs when they begin, and will only become drug resistant if treatment is inadequate.
This is why the World Health Organization is concentrating its efforts to combat multidrug-resistant TB on a strategy of preventing the generation of new multidrug-resistant TB cases. The program, called directly observed therapy short-course (DOTS), relies on governments using case detection by sputum microscopy, followed by directly observed treatment with a standard therapeutic regimen, maintenance of an uninterrupted drug supply, and monitoring outcome with a standard reporting system.
That is to say, samples from people who may be infected are studied under the microscope, and those found to have TB are then given the drugs under supervision to ensure that the treatment is not broken off early. Typically, a number of patients will stop the drugs as soon as the symptoms ease off, at a time when there are still bacteria present and some of them are at least partially resistant to the antibiotics or other drugs being used. In the absence of treatment, the remnant bacteria can multiply, populating the patient with bacteria which are harder to dislodge. The fact that many of the victims of TB tend to be poor and either homeless or itinerant makes the task of continuous treatment harder.
The principle that has to be emphasized to new patients is this: the best way to generate new cases of resistant bacteria is to allow infected people to take small and irregular doses of a drug, exposing the bacteria to low levels which a few bacteria can survive. Then, when the treatment is stopped, these survivors can multiply, providing a stronger population to face the next round of the battle against the drug(s) being used.
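A toy simulation makes the point. The growth and kill rates below are invented - real pharmacodynamics are far more complicated - but the pattern is the one described above: each interrupted course knocks down the sensitive bacteria while the resistant remnant regrows, until the resistant strain dominates.

sensitive, resistant = 1_000_000, 10      # starting bacterial counts

def interrupted_course(sens, res, days_on=5, days_off=9):
    for _ in range(days_on):              # patient takes the drug ...
        sens = int(sens * 0.5)            # sensitive cells are killed
        res = int(res * 1.1)              # resistant cells keep growing
    for _ in range(days_off):             # ... then stops too early
        sens = int(sens * 1.3)
        res = int(res * 1.3)
    return sens, res

for n in range(4):
    sensitive, resistant = interrupted_course(sensitive, resistant)
    print(f"course {n + 1}: sensitive={sensitive:,}, resistant={resistant:,}")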
The DOTS approach has been tested and proved effective. In Tarrant County, Texas, standard TB therapy without direct observation was used from 1980 to 1986. In 1986, directly observed therapy was introduced and used exclusively until 1992. There was effective treatment for multidrug-resistant TB during both periods, but after the DOTS approach was started, primary drug resistance declined from 14% to 2.1% of cases.
More importantly, among relapses, multidrug-resistant TB was reduced by 80%. The editorial cites a modeling study which reveals that a poorly functioning program is worse than none at all, since such a program actually produces multidrug-resistant TB, while a well-functioning TB control program can reduce the incidence of multidrug-resistant TB.
Around the world, says the \IJAMA\i editorial, 119 countries now have DOTS programs and, in a number of cases, have succeeded in preventing increases in cases of multidrug-resistant TB where preexisting levels of multidrug-resistant TB were low. Examples cited include Chile, in which only 0.4% of TB cases have multidrug-resistant TB, and Benin, in which the rate is 0.5%.
Sadly, the DOTS program has been less successful in reducing levels of multidrug-resistant TB in those countries where the levels of resistance are already high. Medical workers had believed that an effective DOTS program would lead to a reduced spread of multidrug-resistant TB by people with persistent disease and could eventually lead to its disappearance. The reality has now been exposed by a report appearing in the same \IJAMA\i issue, which shows that excellent DOTS programs have been operating for several years in Peru and South Korea, but the prevalence of multidrug-resistant TB is still high among people with newly diagnosed TB.
Since these patients have not been previously treated, their disease cannot have resulted from failed therapy and must be the result of primary transmission of multidrug-resistant TB from others in the community. Multidrug-resistant TB has not been markedly reduced or prevented by the DOTS program and it is unlikely that these rates of resistance will decline to low levels without further intervention.
The editorial recommends that new treatment strategies be devised for patients with multidrug-resistant TB, but it is also essential that an effective DOTS program be ensured before beginning a multidrug-resistant TB treatment program. Without this, a poor TB control program would generate multidrug-resistant TB cases more rapidly than a treatment program could treat them. It is proposed that these new programs be known as DOTS Plus.
The issue is not just a problem in the Third World. Many of the isolates of \IM. tuberculosis\i that cause disease in the United States and other First World countries already come from abroad, either in people born abroad or in residents who become infected with TB while traveling overseas, and the editorial notes that these strains have been shown to spread rapidly in the United States once introduced. In other words, if drug-resistant TB increases in developing countries, rates in developed countries will increase as well. The rates have recently increased by 50% in Denmark and Germany.
The problem is a serious one. Currently available second-line antibiotics to treat multidrug-resistant TB are four to ten times more likely to fail than standard therapy for drug-susceptible TB, and about 100 times more costly. There are no new drugs that might be effective in treatment of multidrug-resistant TB currently undergoing clinical trials, and effective new drugs for TB are generally estimated to be at least a decade away. As a result, it is quite possible that we may see a return to the 18th and 19th centuries, when as many as 20% of all adult deaths in Europe and the United States resulted from TB, then known as consumption.
#
"The pill turns 40",1517,0,0,0
(May '00)
The contraceptive pill, first introduced on May 9, 1960, is now used by more than 100 million women around the world, and is the most popular contraceptive method in 78 of 150 surveyed countries, according to a report in \IPopulation Reports\i, a quarterly journal published by the Johns Hopkins Population Information Program in the USA. Apart from China and India, where family planning programs have emphasized long-term or permanent methods, 'the pill' is the most popular contraceptive method, used by some 12 percent of married women.
Taken regularly, the pill prevents pregnancy almost without fail, and it also helps to reduce excessive bleeding which can commonly lead to anemia in developing countries where diets are deficient in iron. The new lower dose pills are less likely to increase the risks of certain circulatory system diseases associated with the original pill.
Perhaps the greatest use is in the world's richer nations, where it is estimated that 80% of women born since 1945 have used the pill at some time. And even in China, where the percentage of married users of the oral contraceptive is just 3%, that still adds up to 7.6 million users, the largest number in any country in the world. There are 6.8 million married oral contraceptive users in Germany, 6.1 million in Indonesia, 6.0 million in Brazil, 5.7 million in Bangladesh, and 5.6 million in the United States. The low-dose pills were approved for contraceptive use in Japan only in September 1999. (For more country-by-country statistics on oral contraceptive use, go to http://www.jhuccp.org/pr/a9/a9suptab.stm)
Some of the highest levels of pill use in the world are among sexually active unmarried women in developed countries. Outside Eastern Europe and Asia, an estimated 36 percent of sexually active unmarried women in developed countries use this method. In Europe, the figure is 45 percent and in North America, 36 percent.
As a rule, 40 years on, modern oral contraceptives are safe, carrying a lower risk than pregnancy and childbearing for almost all women, particularly in countries with high maternal mortality rates. There are measurable increased risks of heart attack and stroke for older users of 'the pill' who have hypertension or who smoke. For women who do not smoke and who do not have high blood pressure, the risks associated with today's low-dose pills appear to be minimal.
The so-called Emergency Contraceptive Pills (ECPs) are oral contraceptive pills containing the progestin levonorgestrel or norgestrel, with or without estrogen. These are becoming more common as a line of defense in those cases where women did not or could not use contraception, or where there was reason to suspect that their regular method failed to operate as normal. According to the report, ECPs can be used by almost all women, even those with medical conditions that rule out the ongoing use of normal oral contraceptives. The ECPs reduce the probability of pregnancy by around 88%, compared with unprotected sex, during the second or third week of the cycle, but they do not disrupt an established pregnancy.
To see a full-text copy of the \IPopulation Reports\i issue, Oral Contraceptives--An Update, go to: http://www.jhuccp.org/pr/a9edsum.stm
#
"Marathon runners can drink too much water",1518,0,0,0
(May '00)
Picture a long-distance runner, and the odds are that you will think of somebody seizing a drink on the run, but it appears that drinking too much water while running a marathon can kill you. This has been reported a number of times, but the actual cause of death has been something of a mystery until now. It appears that the excess water can help to cause the brain to swell and fluid to leak into the lungs, either of which can be fatal. That, at least, is what researchers at the University of California, San Francisco, and Baylor College of Medicine in Houston have suggested in a paper in the \IArchives of Internal Medicine\i in early May. They also list the cure: a simple intravenous dose of salt water.
Obviously marathon runners need to keep hydrated, but a few hours of sweating while drinking fresh water can lead to a condition called hyponatremia (literally, 'low blood salt'). The researchers looked at seven athletes who suffered hyponatremia while running a marathon. All had been nauseous, vomiting, or confused at some point during their run. Six of the seven patients survived after intravenous treatment with a high salt solution. All of them had fluid in their lungs, a standard indicator for hyponatremia, and all had low levels of sodium and oxygen in their blood.
X-rays indicated that six of them had significant brain swelling. The researchers believe that the body tries to maintain a balance of salt and water levels between the blood and tissues by drawing water out of the blood. This causes a puffiness in the skin and swelling in the brain. The brain then responds by triggering the release of water into the lungs, and the lung fluid, or the brain pressure, eventually kills most patients with hyponatremia.
In simple terms, they either suffocate or their brain herniates. All of the patients had been taking ibuprofen-based pain relievers, raising the suspicion that these drugs may have promoted the water retention, making hyponatremia even more likely, but with such a small sample, this can only be regarded as a strong suspicion, not as a proven fact.
A further note of caution in the report: women may be more at risk than men, since the hormone estrogen can act in combination with another hormone called anti-diuretic hormone (ADH) to constrict the blood vessels in the brain. ADH is normally produced to save water in response to heavy perspiration.
Hyponatremia produces symptoms similar to those caused by heart failure, so there is a risk of misdiagnosis here, say the researchers. The trick for correct diagnosis is to measure blood sodium levels and give a chest X-ray. To avoid hyponatremia, the researchers recommend that athletes take salt supplements with their water before, and even during, marathons or longer endurance events.
While the researchers mentioned salt tablets in their recommendation, salt tablets are no longer recommended by most Australian medical practitioners as a prophylaxis against water intoxication, as hyponatremia is called in Australia, because of the risk of gastrointestinal upsets. Instead, they recommend a dilute solution of salt water, about 0.1% (i.e., one gram per liter of water), or sports drinks. Sports drinks are designed to provide similar levels of salt to those lost in sweat by reasonably well heat-acclimatised people, since heat acclimatisation reduces the salt level in sweat to about 25% of normal.
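The arithmetic behind that concentration is worth seeing once ('w/v' means weight per volume):

# 0.1% w/v means 0.1 g per 100 mL, so 1% w/v is 10 g per liter.
percent_wv = 0.1
grams_per_liter = percent_wv * 10
print(f"{percent_wv}% w/v = {grams_per_liter:.0f} g of salt per liter")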
This additional salt is not recommended unless you are losing a lot of sweat and doing so for many hours. Ultramarathon competitors in a hot climate, soldiers engaged in long range patrols in the heat, miners working underground where temperatures can exceed 50°C, adventurers who walk across deserts in summer, and other people engaged in similar activities need extra salt. Very few other people need anywhere near the salt they currently ingest, according to nutritionists, who stress that salt is strongly associated with an increased risk of hypertension (high blood pressure), the major risk factor for stroke. People who think they need extra salt should obtain competent medical advice first.
Key names: Allen Arieff, Carlos Ayus, and Joseph Varon.
#
"Indoor hot tubs and lung disease",1519,0,0,0
(May '00)
Indoor hot tubs generate aerosols, fine suspensions of liquid droplets in the air, and given the right conditions, these aerosols can spread bacteria related to tuberculosis (TB) to people making regular use of the tubs, according to a paper delivered at an American Thoracic Society conference in Toronto during May.
The bacteria, \IMycobacterium avium\i and \IM. fortuitum\i, are known collectively as nontuberculosis mycobacteria, or NTM. Nine people, including four children, were described as having the disease, and the bacteria in question were found in the hot tub water and/or in the air of the homes of the people diagnosed. The hot tubs in these cases were located inside homes, near family and living rooms, and bedrooms.
The jets from hot tubs aerosolize the bacteria, and bacteria-rich bubbles rise, bursting and spreading the bacteria around the room, ready to be breathed in. Luckily, NTM is not contagious like TB, and the bacteria, generally found in brackish ocean water such as tide pools, only rarely cause infection. But this may change if the cases described are any indication, since hot tubs, once 'luxury items', are now becoming more common in the developed world.
NTM involves fever, tiredness, night sweats, cough and weight loss, and removing the hot tub from the home is the primary treatment for a mild case. In more severe cases, treatment may involve corticosteroids and/or antimycobacterial antibiotics. Sometimes three to four antibiotics must be given at once.
NTM is often misdiagnosed as sarcoidosis, characterized by inflamed, microscopic growths called granulomas most often found in the lungs, or even as tuberculosis. Once the hot tub link is known, this common error should be less of a problem, according to Dr Cecile Rose, who gave the paper.
#
"CPR: a new look",1520,0,0,0
(May '00)
In Seattle, bystanders do not perform cardiopulmonary resuscitation in almost half of witnessed cardiac arrests, even though extensive training in CPR has been given to the citizens of that city. When CPR instructions are delivered by telephone, covering both chest compression and mouth-to-mouth ventilation can take as much as 2.4 minutes of valuable time. In an experimental study described in the \INew England Journal of Medicine\i during May, chest compression alone was associated with survival rates similar to those with chest compression plus mouth-to-mouth ventilation.
The test involved fire-department telephone dispatchers giving bystanders at the scene of apparent cardiac arrest instructions in either chest compression alone or chest compression plus mouth-to-mouth ventilation. The primary end point for the study, the measure of success, was survival to hospital discharge.
There were 241 patients randomly assigned to receive chest compression alone and 279 assigned to chest compression plus mouth-to-mouth ventilation. Complete instructions were delivered in 62% of episodes for the group receiving chest compression plus mouth-to-mouth ventilation and 81% of episodes for the group receiving chest compression alone (this difference was significant at the 0.005 level - statistical language for something which is highly unlikely to have happened by chance: see \Jcorrelation\j).
The instructions for compression alone required 1.4 minutes less to complete than instructions for compression plus mouth-to-mouth ventilation, and survival to hospital discharge was better among patients assigned to chest compression alone than among those assigned to chest compression plus mouth-to-mouth ventilation (14.6% against 10.4%). But this difference was not statistically significant (the probability of such a difference arising by chance was 0.18, about one chance in five - well above the 0.05 threshold usually required before a result is accepted).
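For readers who want to see where such probabilities come from, the sketch below reconstructs the comparison with a standard two-proportion z-test. The survivor counts are back-calculated from the rounded percentages, so the p-value only approximates the published figure.

import math

n1, n2 = 241, 279                                # patients in each group
s1, s2 = round(0.146 * n1), round(0.104 * n2)    # approximate survivors

p1, p2 = s1 / n1, s2 / n2
pooled = (s1 + s2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = math.erfc(abs(z) / math.sqrt(2))       # two-sided p-value

print(f"z = {z:.2f}, p = {p_value:.2f}")         # well above 0.05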
At the very least, though, the outcome after CPR with chest compression alone is similar to that after chest compression with mouth-to-mouth ventilation, and chest compression alone may be the preferred approach for bystanders inexperienced in CPR, say the researchers.
The \INew England Journal of Medicine\i thought this important enough to provide an editorial on the matter in which Gordon A. Ewy wrote that "Authorities in CPR have come to realize that our now standard method of performing basic CPR is very difficult for the average lay person to learn, retain and perform." Dr. Ewy is an authority in the field, and recognized internationally for his CPR research. In February, he received a 'CPR Giant' award from the American Heart Association Emergency Cardiovascular Care Committee in recognition of his many contributions in the area.
In a study reported in the \IArchives of Internal Medicine\i about five years ago, a group of University of Arizona College of Medicine cardiologists found that 82% of people questioned were 'very concerned' or 'moderately concerned' about the possibility of getting a disease while giving mouth-to-mouth. This may help to explain why bystanders rarely initiate CPR, but as Ewy points out, "survival with CC-CPR (chest-compression CPR) is dramatically better than no bystander CPR."
At the same time, he emphasizes that standard CPR is almost always essential in children and young adults. He also encourages the early use of automatic external defibrillators (electronic devices that shock the heart to restore normal contraction rhythms) in patients suffering cardiac arrest due to ventricular fibrillation, in which the ventricles contract in a rapid and uncoordinated way.
#
"Mutations to order",1521,0,0,0
(May '00)
Usually, when a crop plant is being altered by traditional genetic engineering techniques, the operation is rather haphazard, involving the introduction of many copies of foreign genes at random positions on the chromosome. This blunderbuss approach usually works, but sometimes it causes variation in the level of foreign gene expression and, occasionally, it may even trigger the suppression (silencing) of similar genes that occur naturally in the plant. As well, this allows objectors to claim that fish genes are being placed in plants, which is 'contrary to the laws of nature', or to make other similarly emotive claims.
Usually, the desired gene is very similar to a gene already in the target plant, so a simpler method would be to make a change in the plant gene so that it becomes more efficient. Biologically, this makes genetic manipulation about as dangerous as looking for natural mutants in a crop and selecting those, while being a great deal more efficient than traditional methods.
Enter Chris Baszczynski and his colleagues, who have adapted a technique, originally developed for gene repair in mammalian cells, which allows them to introduce single base changes into DNA. They do this with small hairpin-shaped molecules made up of RNA and DNA (so-called chimeric oligonucleotides). The method works: they have already generated herbicide-resistant plants with just a single change in the genetic code.
The key point here is that, as more and more genomes are mapped out, we can identify the nucleotide sequences in different organisms and see what changes are needed to change the gene in, say, a petunia, so it is the same as that in a herring. Then, rather than 'adding a herring gene', scientists can tweak the petunia's gene so that it works like the herring gene, while avoiding the objections of those who do not understand the process.
A chimeric oligonucleotide is made up of a short self-complementary stretch of double-stranded DNA flanked by a longer stretch of RNA to protect it from degradation inside the cell. The DNA portion is identical to the region of the gene to be modified, but with a single nucleotide change. When the chimeric oligonucleotide is introduced into the nucleus, it homes in on the gene of interest which has the matching sequence, and then it triggers the plant's own DNA repair machinery to substitute the oligonucleotide-encoded sequence with the single base alteration for the original plant sequence.
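At the risk of oversimplifying, the targeting step can be pictured as a find-and-replace on the gene sequence. The sketch below is a cartoon only: the sequences are invented, and in the real process the substitution is carried out by the cell's own DNA-repair enzymes, not by string editing.

gene = "TTGACCGGAATGCTTCAGGTACCA"        # invented plant gene fragment
target = "GGAATGCTTCAGG"                 # region matched by the oligo
corrected = "GGAATGATTCAGG"              # same region, one base changed

assert target in gene
edited = gene.replace(target, corrected)

changed = [i for i, (x, y) in enumerate(zip(gene, edited)) if x != y]
print("edited gene:", edited)
print("positions changed:", changed)     # exactly one site differs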
The group had previously used chimeric oligonucleotides to alter the gene encoding the plant enzyme acetohydroxyacid synthase (AHAS) in masses of undifferentiated plant cells called callus tissue. Once this gene was altered, the plant cells gained resistance to the broad-spectrum herbicide imidazolinone. The problem was that the conversion was inefficient, affecting about one cell in 10,000, and there was no evidence that the modified AHAS gene was maintained in subsequent generations. The new work has produced whole plants that carry the gene, and has shown that the resistance is passed on to plants in later generations.
If the technique can be made more efficient, it could avoid many of the problems associated with traditional plant transformation used for GM crops. It will also make a mockery of current labeling laws, which require genetically modified crops to be identified as such, since there is no way that this new type of GM plant can be distinguished from a natural mutant spotted in a field, selected and used, just as humans have been doing for the past 10,000 years.
#
"Chromosome 21 published",1522,0,0,0
(May '00)
A German-Japanese consortium published the sequence of human chromosome 21 in \INature\i in mid-May. This chromosome is one of the smallest among the 24 different human chromosomes (22 autosomes, plus the X and Y chromosomes), but it is important because it is associated with Trisomy 21, one of the most common genetic diseases, also known as Down's syndrome (or Down syndrome in the USA). In this condition, affecting about one live birth in 700, an extra copy of chromosome 21 results in severely aberrant physical and mental development in children, leading to mental retardation.
After chromosome 22, published in December, this is the second human chromosome that has been deciphered completely. More importantly, virtually all of the genes encoded by chromosome 21 have now been identified by computer-based analysis of the genomic sequence. This ends a process that began in the early 1990s, with the German-Japanese consortium being formed in 1995. The end result was the completed sequencing of all of the 33,546,361 base pairs of the chromosome in an effort that involved a total of 170 workers.
There are 225 genes in the chromosome. Of these, 127 are clearly characterized genes, whereas the remaining 98 have been discovered by computer gene predictions that rely on the fact that characteristic structural patterns are common to most human genes. This means the computer can scan the sequence to identify where the coding regions are located.
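As a caricature of what such a scan does, the sketch below hunts for open reading frames - an ATG start codon followed by a run of codons and a stop - in an invented stretch of DNA. Real gene-prediction software models splice sites, codon bias and much more.

STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=4):
    # Scan all three reading frames for ATG ... stop stretches.
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in STOPS and start is not None:
                if (i - start) // 3 >= min_codons:
                    orfs.append(seq[start:i + 3])
                start = None
    return orfs

dna = "CCATGGCTGATCGTAAGGGATGTTTCCAGCATAGTT"
print(find_orfs(dna))    # one short ORF in this invented sequence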
The function of 103 out of the 127 characterized genes is known because the corresponding protein was either previously identified or its activity in a defined biochemical pathway is known. The next step will be to characterize in detail the remaining novel genes and to find their functions. Some of these genes will be involved in diseases such as several forms of deafness, several solid tumor cancers, and a form of manic depressive psychosis, all known to map to chromosome 21, and all still to be linked to a specific gene.
There are 14 known genes localized on chromosome 21 that have variations that are linked to diseases such as Alzheimer's disease, a particular form of epilepsy, auto-immune conditions, and also an increased probability of developing leukemia.
As mentioned last month, the number of genes in the human genome is generally accepted as being somewhere between 80,000 and 140,000, with 100,000 as a commonly quoted figure. If all of the 3 to 4 billion base pairs were part of a gene (they are not), the number of genes could be as high as 3 million, but if we extrapolate from the number of genes in chromosome 21, there could be as few as 40,000 genes in the human genome. This matter will become less confusing as more human chromosomes are completed.
While people with chromosome 21 trisomy have many problems, they are at least born alive, whereas fetuses with most other trisomies are aborted spontaneously. Already, a number of commentators have asked: if the number of genes in chromosome 21 is unusually low, could this explain why trisomy 21 humans survive where others do not?
#
"GM foods and vitamin A (1)",1523,0,0,0
(May '00)
In mid-May, a collaboration between Greenovation and Zeneca was announced to distribute a genetically modified rice freely in developing countries. The rice (see \JNew rice strains and vitamin A and iron deficiency\j, August 1999) is designed to provide increased levels of vitamin A, a vitamin desperately needed in much of the developing world, where vitamin A deficiency is the cause of 500,000 cases of irreversible blindness each year.
The two companies will be working to deliver this remarkable technology free-of-charge for humanitarian purposes in the developing world. At the same time, Zeneca will explore commercial opportunities for sales of 'Golden Rice' into the growing market for healthy foods in the developed world. The collaborators anticipate that 'Golden Rice' will not be available for local planting and consumption until 2003 at the earliest.
Zeneca is a large commercial biotechnology company, while Greenovation is a German university spin-off biotechnology company which performs and funds research and development in plant biotechnology for agricultural and phytopharmaceutical applications.
Key names: The inventors of 'Golden Rice' are Professor Ingo Potrykus of the Institute for Plant Sciences, Swiss Federal Institute of Technology, Zurich, Switzerland, and Dr Peter Beyer of the Centre for Applied Biosciences, University of Freiburg, Germany.
#
"GM foods and vitamin A (2)",1524,0,0,0
(May '00)
British researchers report that they have used a single bacterial gene in tomatoes to triple the amount of β-carotene. Lycopene and β-carotene are major carotenoids that are essential to good health and can benefit chronic conditions such as coronary heart disease, some cancers, and macular degeneration. Tomatoes are the main dietary sources of these substances, often referred to as phytonutrients. Unfortunately, most people fail to consume the recommended daily allowance.
As well, a dietary deficiency of β-carotene, which is a provitamin A carotenoid, can lead to blindness and premature death, and UNICEF estimates 1 to 2 million deaths of children aged 1 to 4 years could be prevented each year by improved vitamin A nutrition.
Peter Bramley and his colleagues at Royal Holloway, University of London, announced during May that their new tomato plant contains altered proportions of carotenoids - specifically, increased levels of β-carotene, the provitamin A carotenoid. They say that the modification has no effect on growth or development of the plant and can be passed on to subsequent generations.
The bacterial gene that they have added is called crtI, and its function is to produce phytoene desaturase, an enzyme that converts phytoene into lycopene, cyclization of which results in the formation of α- and β-carotene. But, while the levels of β-carotene went up as much as 3.5 times normal, the overall carotenoid levels did not increase, and the lycopene content was often twofold lower. This is an interesting result, and Bramley suspects that there is some form of feedback inhibition within the pathway. This, he suggested, could have implications for the effective manipulation of carotenoid levels in transgenic plants.
#
"Genetic brothers populate the Middle East",1525,0,0,0
(May '00)
For all the strife in the Middle East, the people of that area appear to be more closely related than the extremists among them may care to consider. That is the message from genetic evidence, based on a study of the Y chromosomes of men in the area. And the evidence is clear: Jews are the genetic brothers of Palestinians, Syrians and Lebanese, and they all share a common genetic lineage that stretches back thousands of years.
If a common heritage conferred peace, then perhaps the long history of conflict in the Middle East would have been resolved years ago, says a senior author, Harry Ostrer, of a report published in the \IProceedings of the National Academy of Sciences\i in May. "Jews and Arabs are all really children of Abraham, and all have preserved their Middle Eastern genetic roots over 4,000 years," he says.
The Y chromosome is usually passed on unchanged from father to son, but occasionally a random variation occurs, something which is then transmitted in that male line, and which can be spotted by modern genetic analysis. Related populations carry the same specific variations, which allows scientists to track descendants of large populations and determine their common ancestors.
The researchers analyzed specific regions of the Y chromosomes of 1371 men from 29 worldwide populations, including Jews and non-Jews from the Middle East, North Africa, sub-Saharan Africa, and Europe. The study found that Jewish men shared a common set of genetic signatures with non-Jews from the Middle East, including Palestinians, Syrians, and Lebanese, and these signatures diverged significantly from non-Jewish men outside of this region. In simple terms, Jews and Arabs share a common ancestor and are more closely related to one another than to non-Jews from other areas of the world.
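In schematic terms, the comparison asks how often men from two populations carry identical marker patterns on the Y chromosome. The sketch below uses made-up three-marker 'haplotypes', not anything from the actual study.

pop_a = ["AGT", "AGT", "ACT", "AGT"]     # invented Y haplotypes
pop_b = ["AGT", "ACT", "AGT", "AGT"]
pop_c = ["TCA", "TGA", "TCA", "TCA"]

def sharing(p, q):
    # Fraction of cross-population pairs with identical haplotypes.
    same = sum(1 for x in p for y in q if x == y)
    return same / (len(p) * len(q))

print("A vs B:", sharing(pop_a, pop_b))  # high: recent common ancestry
print("A vs C:", sharing(pop_a, pop_c))  # zero: long-separated lineages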
The analysis also shows that since the \JDiaspora\j, the time around 556 BC when Jews migrated out of Palestine, the world's Jewish communities have generally not intermixed with non-Jewish populations. If they had, then Jewish men from different regions of the world would not share the same genetic signatures in their Y chromosome. Ancient Jewish law states that Jewish religious affiliation is assigned maternally, meaning that there was a potential for non-Jewish men to contribute to present-day Jewish genetic diversity. But there appear to be no signs of any such contribution, according to Michael Hammer, the lead author of the new study, who commented, "It was surprising to see how significant the Middle Eastern genetic signal was in Jewish men from different communities in the Diaspora."
#
"BioMed Central brings a new era",1526,0,0,0
(May '00)
A new online publishing house, BioMed Central, started receiving manuscripts in late May for the fast-track electronic publishing of biological and medical research. They also announced a star-studded editorial board, including Nobel laureate Dr Harold Varmus; Professor Elizabeth Blackburn, the co-discoverer of telomerase; Professor Philippe Kourilsky, head of the Pasteur Institute; Dr Paul Nurse, Director-General of the Imperial Cancer Research Fund; and Professor Sir David Weatherall from Oxford University, as well as a range of other senior researchers from around the world. More 'A-list names' from the world of life sciences are expected to join over the next few weeks.
BioMed Central is part of the Current Science Group, a group of independent companies that collaborate closely with each other to publish and develop information and services for the professional biomedical community. The Group has its head office in London (UK), with other offices in Philadelphia, New York and Tokyo.
BioMed's system will allow researchers around the world to access peer-reviewed research articles free of charge, without the traditional barriers to access imposed by conventional publishers. The system will offer free access to research and will allow users to distribute work among their colleagues and other contacts. This, we are promised by BioMed, will allow scientists the freedom to participate in a truly worldwide community of scholars.
According to Harold Varmus, "If this works as hoped, we will move much closer to achieving the revolution in scientific publishing that the Internet promises." It was Varmus who, in 1999, proposed the concept for PubMed Central, an on-line repository for archiving, organizing and distributing peer-reviewed reports from journals.
All primary research published by BioMed Central will be made available free and placed immediately and in full in PubMed Central, the archive of biomedical research sponsored by the US National Institutes of Health (NIH). Authors will retain the copyright for their work and will be allowed to distribute their work however they see fit, including archiving it on their own Web sites.
Information for prospective authors (and others) is available at http://www.biomedcentral.com/ifora.asp and this gives a good general introduction for other interested people as well.
See also \JVarmus, Harold E(lliott)\j, \JTelomerase broadens its scope\j, \JSerious publishing\j, August 1999 and \JPaperless publishing\j, September 1999.
#
"The Internet and old books",1527,0,0,0
(May '00)
At the same time that the Internet is revolutionizing scientific publishing, (see \JBioMed Central brings a new era\j, this month), it is also raising the value of and market for used and rare books. A recent survey of 189 booksellers around the world reveals that online sales have increased the number of used books sold by an average of 12.5%, even though some of the books sold online cost more than the same books sold in a conventional store.
To a book-lover, there is a special romance about buying books, whether old or new, and something entrancing about walking into a book store. But the real joy, as Helene Hanff reminds us in \I84 Charing Cross Road\i, is the receiving of new (to the buyer, if not to the world) books. Hanff, in New York, bought her books purely by post from a London dealer in used books, and it seems that the spirit of Helene Hanff in New York, and Frank Doel in London, is still alive and well on the Internet.
Online sales have stabilized or decreased the price of the more common, plentiful used book titles, but at the same time, the price of rare books has climbed. The problem is that there is a finite supply of collectable books, and the Internet reaches many more potential customers than any normal shop.
The survey was carried out by two Ohio University researchers, Phyllis Bernt, a professor of communication systems management, and Joseph Bernt, associate professor of journalism. They reported the results of their poll of affiliates of the Advanced Book Exchange (ABE), (http://www.abebooks.com) at the Popular Culture Association Conference in New Orleans during May. ABE operates as a Web search engine for 5,200 book dealers who are engaged in e-commerce.
They reported that 29 of the 131 owners in the study had closed their conventional shops to conduct business solely in cyberspace, but the traditional book store is not entirely replaced since dealers still need a place to store their books, and some sort of storefront to attract sellers of books. About a third of the respondents were book lovers who had been recently encouraged to start an online operation. Some of them were there just to sell a private book collection, something they couldn't do before the advent of the Internet.
Most reported that the overhead and start-up costs for online stores are lower, but it seems that they are not passing this saving on to their customers. Even so, the buyers have not been deterred, and Joseph Bernt believes this may relate to the convenience that the electronic market can offer.
Another aspect is the overseas market, with 22% of those surveyed reporting that their international market was 30% or more of their sales volume, but then, anybody who was familiar with \I84 Charing Cross Road\i would know that already.
#
"Restaurant noise",1528,0,0,0
(May '00)
There seems to be a scientific Law of Restaurants that the quality of the food is proportional to the sound levels, or, failing that, it seems that restaurant proprietors think this is the case. All over the world, eateries offer hard surfaces, high ceilings and open kitchens, while large crowds respond to the noisy conditions by speaking louder. In fact, according to a report in the May issue of \IAudiology Today\i, the noisiest restaurants are so loud they may be damaging the hearing of waiters and other workers who put in full shifts during the dinnertime rush.
Diners, according to the report, are not at risk of hearing loss, but the workers, exposed to the noise for longer periods, most certainly are at risk, and even the diners feel pain. Robert Sweetow and Lisa Tate, both audiologists, report that noise in restaurants is one of the most common complaints among diners, particularly those with sensitive ears or hearing problems.
The researchers measured sound levels in five restaurants, ranging from a quiet bistro to a very noisy restaurant/bar, on Thursday, Friday, or Saturday evenings between 6 pm and 10 pm. They took sound readings for several one-hour blocks of time over the three-day period and used the data to calculate the average noise exposure a waiter or other worker might get over an 8-hour period.
They found maximum noise levels ranging from 85.5 decibels (dBA), about as noisy as heavy city traffic, to 109 dBA, which is equivalent to a loud dance club. Averaged over eight hours, the exposures ranged from 50.5 dBA to 90 dBA, but continued exposure to noise at 85 dBA or higher can eventually cause hearing loss, according to standards set by most occupational health authorities (a few set the level more conservatively at 80 dBA).
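Averaging decibel readings is not a matter of simple arithmetic, because the decibel scale is logarithmic. The minimal Python sketch below shows one standard way to combine spot readings into an 8-hour equivalent level; the hourly figures are invented, and the researchers' exact averaging method is not specified here.

import math

# Hypothetical sound levels (dBA) for each hour of an 8-hour shift.
hourly_dba = [78, 82, 85, 88, 90, 89, 84, 80]

def leq(levels_dba):
    # Energy-average sound level (a 3-dB exchange rate, as in ISO 1999).
    # Decibels are averaged on an energy basis, not arithmetically:
    # convert to relative intensity, average, then convert back.
    mean_intensity = sum(10 ** (l / 10) for l in levels_dba) / len(levels_dba)
    return 10 * math.log10(mean_intensity)

# Prints about 86.2 dBA, noticeably above the arithmetic mean of 84.5.
print(f"8-hour equivalent level: {leq(hourly_dba):.1f} dBA")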
The researchers recommend a system adopted by the \ISan Francisco Chronicle\i, which rates restaurant noise on a scale from one bell to four bells, or uses a bomb symbol if the noise exceeds 80 decibels. The two noisiest ratings are stated as:
4 bells - can talk only in raised voices - 75-80 dB
1 bomb symbol (a stick of dynamite) - too noisy for conversation - over 80 dB
Perhaps, they suggest, if other media followed suit, diners would know what they are walking into, and waiters and other workers would know whether their workplace might be damaging their hearing.
#
"Colonial spiders and web space",1529,0,0,0
(May '00)
Some orb weavers, web-spinning spiders, live in colonies, making scientists wonder how large female spiders in colonies are able to claim enough territory to rebuild their daily webs in the face of competition from other large spiders and smaller ones. It would be reasonable to assume that the spiders engage in aggressive behavior, but a group of researchers from Cincinnati and Cornell universities have found that this just does not happen. Instead, the larger spiders just get there first, pre-empting the best spaces in the colony before the other spiders can get a look-in.
The work has been described in the May issue of the journal \IAnimal Behaviour\i, and looks at the interactions in a colony of spiders in the species \IMetepeira incrassata\i (Araneidae) living in Mexico. This spider lives in groups of hundreds or even thousands, but even though they are colonial, they need their own private space to build their webs.
This is no different to the problem confronting other colonial animals, from seabirds to marine invertebrates. Among the orb weavers, only one spider can efficiently operate a single web and catch prey, and this means that, once the webs are created, tolerance can take over. But up until that time, they seem to compete non-belligerently for big spaces in the colony.
Female spiders with egg sacs attach their egg sacs to lines near the orb web, but once the sacs are in place, the females cannot move them. This means that the females then have to stay put and guard the eggs or they will be parasitized. So the large females with eggs build much earlier than those without eggs, which means they are assured of enough space to build a large orb web.
All is not sweetness and light, though. If a small spider starts building while the large spiders are still building, a large spider will threaten it by shaking the silk lines so that it ends up scared and stops building. But once the larger spiders have their webs complete, they settle down and the threats diminish.
The result is that small spiders that start early lose so much time to these disruptions that they finish later than small spiders that wait until the large spiders have finished building their webs. This tends to produce a situation where belligerence is unnecessary, since the smaller spiders simply wait their turn.
When the researchers removed the big spiders from the colonies, the little ones finished their webs over an hour earlier because they were not being distracted by the actions of the larger spiders.
Key names: Linda Rayor and George W. Uetz. Rayor was one of those whose laboratory research (see \JEngineered corn and monarch butterflies\j, May 1999) purported to show a risk to butterflies from GM corn, a risk which has since been dismissed as unrealistic by those familiar with the technology Rayor was attacking: see \JGM: the scientists begin to speak out\j, September 1999, for the refutation.
#
"Harpin: a protein that protects plants",1530,0,0,0
(May '00)
The US Environmental Protection Agency allowed conditional registration for the first commercial agricultural use of harpin, a protein that enhances plant growth and also makes a plant mobilize its own defenses against pathogens and insects.
The protein, discovered in 1991 by Steven V. Beer, a Cornell professor of plant pathology, will be combined with other ingredients and sold under the name Messenger(tm) under license from the Cornell Research Foundation, which aids in the development of Cornell-discovered technology. In a press release, Beer explains that treating plants with the harpin protein signals the plant to turn on its natural defense systems. The plant must be treated before the pathogen attacks, and it takes several days for the plant's system to mobilize its own defenses. The product has been proven in over 500 field trials on about 45 crops in four countries.
Oddly, the protein is derived from a plant pathogen, \IErwinia amylovora\i, the bacterium responsible for fire blight, a scourge in fruit orchards all over the world since the 18th century. The bacterium gets its name because it attacks apple and pear trees and many ornamentals in the rose family, leaving blackened branches, trunks, leaves, flowers and fruit. But if the bacterial blight is bad news for plants, the protein has surprisingly helpful properties.
Plants have a response to attack called the hypersensitive reaction, which develops in the few cells that are in direct contact with an invading pathogen in the plant's intercellular spaces. This is effectively the suicide of plant cells attempting to thwart disease. By using a technique called molecular mutagenesis, two of Beer's graduate students, Eva Steinberger and David Bauer, identified a number of hrp (pronounced 'harp') genes of \IE. amylovora\i. These genes are involved both in fire blight infection and in the development of the hypersensitive response.
One hrp gene produces harpin and, if this is placed in a few of the intercellular spaces of tobacco, tomato or geranium leaves, the plant cells collapse and die within 24 hours, just as if bacteria were present. If they \Iwere\i present, the collapsed plant cells would immobilize the bacterial cells, preventing the spread of further infection.
As well, treating plants with harpin was shown to induce a second response called systemic acquired resistance, or SAR, which provides protection against a broad range of pathogens. When harpin-treated tobacco plants were later exposed to bacteria or viruses that would normally infect them, they rejected the pathogens. After this was noted, researchers found that harpin-treated plants grew larger and faster than plants not treated with the protein. Then Thomas Zitter found that harpin-treated pepper plants were less damaged by insects than control plants not treated with the protein.
"It's a really peculiar protein," Beer says. "When the isolated protein from the bacteria is applied to many sorts of plants, pathogen resistance develops and other beneficial effects occur too."
Key names in the development process: aside from those named above, Alan Collmer, Zhongmin Wei, Sheng Yang He, Ron Laby and Cathy Zumoff.
#
"Residential radon exposure and lung cancer",1531,0,0,0
(May '00)
A paper published in the June 1 issue of the \IAmerican Journal of Epidemiology\i argues that long-term exposure to radon in the home is associated with lung cancer risk and presents a significant environmental health hazard.
The study looked at 1,027 Iowa women aged from 40 to 84 who had lived in their homes for the past 20 years or more: 413 of them newly diagnosed with lung cancer and 614 'controls'. The study looked at both smokers and non-smokers, and concentrated on women because they are less likely to be exposed in the workplace to substances that may cause lung cancer, and historically have spent more time in the home.
Almost 60% of the basement radon concentrations for both the lung cancer cases (study participants with lung cancer) and the control group (participants without lung cancer) exceeded the US Environmental Protection Agency action level for radon of 4 picocuries per liter (pCi/L). Radon concentrations also exceeded that level in 33% of the living areas of the lung cancer cases and 28% of the living areas of the control group.
According to William Field, lead author of the article, this was the most sophisticated radon exposure analysis ever performed in a residential epidemiologic study. Previous studies have shown that Iowa has the highest average radon concentrations in the United States, making Iowa the natural place to conduct the study. Radon is a naturally occurring, odorless, tasteless and colorless radioactive gas that is produced by the breakdown of radium in soil, rock and water. The high concentrations in Iowa and the upper Midwest are due primarily to glacial deposits that occurred more than 10,000 years ago.
Even at the EPA action level of 4 pCi/L, there was about a 50% excess lung cancer risk among the women in the study after correcting for the impact of smoking. The method involved placing at least four radon detection devices in each of the study subjects' homes for one year. The multiple radon measurements were then combined with estimates on radon exposure outside the subjects' homes and the subjects' past mobility history to construct detailed exposure estimates for each study participant.
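As a rough illustration of what such an exposure estimate involves, here is a minimal Python sketch that combines concentrations from several locations, weighted by the time spent in each. All of the figures and location names are invented, and the study's actual model was considerably more detailed.

# Hypothetical annual average radon concentrations (pCi/L) and the
# fraction of the subject's time spent in each location.
exposure_profile = [
    ("bedroom",     4.8, 0.35),
    ("living area", 3.9, 0.30),
    ("basement",    7.2, 0.05),
    ("workplace",   1.1, 0.20),
    ("outdoors",    0.4, 0.10),
]

# The time fractions must account for the whole year.
assert abs(sum(frac for _, _, frac in exposure_profile) - 1.0) < 1e-9

# Time-weighted average exposure: concentration times time fraction, summed.
weighted = sum(conc * frac for _, conc, frac in exposure_profile)
print(f"Estimated average exposure: {weighted:.2f} pCi/L")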
While Iowa has the highest levels in the USA, radon is also a problem in areas with granite geology, especially where houses are actually made from granite blocks, as happens in parts of Europe. Granite contains small amounts of radioactive elements that break down to form radon, which may then escape from the rocks.
#
"Counting Grizzly Bear Numbers",1532,0,0,0
(May '00)
The grizzly bear (\IUrsus arctos\i), or \Jbrown bear\j, once roamed most of the North American continent, but habitat destruction and direct conflict with humans have reduced its range by 99% in the 'lower 48 states', the part of the USA which excludes Hawaii and Alaska. Today, there may be fewer than 800 of these bears south of the Canadian border, and in 1975 grizzly bears were listed as threatened under the Endangered Species Act.
According to conservation authorities, the Greater Glacier National Park may provide one of the only opportunities for long-term survival of this species south of Canada. That, in turn, raises the question of how you count the bears without contributing to their diet, but science now has the answer. Recent advances in genetic technology allow identification of species, sex, and individuals from DNA extracted from bear hair and scats without handling bears. The US Geological Survey has published a large amount of information about the methods they have been and will be using, which offers a useful example of the way genetics moves into all areas of science, given the chance.
Their plans involve setting up a grid of systematically positioned hair traps and using these to obtain minimum counts and a baseline index of population size. Bears identified from the hair-trap collections will also be used in a mark-recapture model to estimate population density, providing an independent calibration of the population index developed from survey routes where bear 'sign' has been collected. In addition, the bears' DNA profiles will provide information on the degree of genetic variation, the relatedness of individuals, and sex. This information will be used to address bear conservation issues.
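The models used in the study are elaborate, but the core logic of mark-recapture can be shown with the classic Lincoln-Petersen estimator, sketched below in Python with invented counts: if a known number of 'marked' bears (here, bears already identified by their DNA) turn up again in a second sample, the proportion recaptured indicates the total population size.

def lincoln_petersen(marked_first, caught_second, recaptured):
    # Estimate population size N from M bears identified in the first
    # sampling session, C identified in the second, and R seen in both.
    # The basic estimate is N = (M * C) / R; the Chapman correction
    # used here reduces bias when samples are small.
    return ((marked_first + 1) * (caught_second + 1)) / (recaptured + 1) - 1

# Hypothetical hair-trap sessions: 120 bears identified in session one,
# 110 in session two, and 35 of those identified in both sessions.
print(f"Estimated population: {lincoln_petersen(120, 110, 35):.0f} bears")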
Hair follicles (roots) are a good source of DNA, and a single bear 'guard' hair can yield sufficient DNA for identifying species, individuals, and gender, while DNA shed from the intestinal lining can also be extracted from bear fecal samples. It is more difficult, say the scientists, to extract DNA from fecal samples than from hair because DNA is present in smaller quantities, is degraded, and herbivore feces (this includes most bear scats) contain plant polysaccharides that inhibit the polymerase chain reaction (PCR; see \JMullis, Kary (Banks)\j for more detail, or search under 'polymerase' in earlier science review articles) used to amplify the genetic material. Without PCR, there is not enough DNA to allow reliable identification.
Hair traps work with barbed wire and a scented lure (brewed from aged fish, cattle blood, and other goodies) to bring the bears in to the 620 traps that were set out. Remote cameras are used in some cases to record bear behavior around the traps, and the traps are moved at regular intervals, always remaining within an 8 km (5 mile) square. Hair samples are also collected from trees that the bears rub against. Using tweezers, all hair is collected and placed in envelopes. Each hair sample is examined under a microscope to identify samples with intact follicles prior to DNA extraction.
Scat samples are collected by National Park Service back country rangers at least three times a year on 1200 km of trail located throughout Glacier National Park. The method involves taking a one-tablespoon sample, placing it in a 120 mL specimen cup on filter paper over silica gel beads, which serve to dry out the scat, stopping microbial activity. The remnant of the scat is removed from the trail so that it will not be sampled again, and the location and estimated age of the scat are recorded. Once the scat is dried, it is frozen and sent off for laboratory analysis.
Results so far indicate that there are an estimated 437 grizzly bears (within a range from 349 to 590) in the northern third of the Northern Continental Divide Ecosystem, and an estimated 332 (from 241 to 549) in Glacier National Park. The park estimates are higher than the previous 1970s estimate of 200 grizzly bears in the park, but that figure came from bear sightings only and would not stand up under today's scientific standards for reliability.
The continuing studies will soon be completed, but the data gathered will allow researchers to estimate the survival rate of bears between 1998 and 2000, an important factor to understand when managers set conservation actions and priorities.
#
"The ancient Antarctic and global climate change",1533,0,0,0
(May '00)
Hanging off the world, as we usually view our globe, the continent of Antarctica may seem to be a bit of an isolated irrelevance. Yet this huge lump of ice, located where the Pacific, Atlantic and Indian oceans converge and mix, is not only influenced by the world's climate, it also exerts effects that can be felt worldwide. In many ways, climatologists see the icy continent as a very central part of world systems.
Finding out how Antarctica has responded to changes in the past will help scientists to understand the global climate changes that concern us today, and that explains what Leah Joseph and her colleagues were doing, describing their studies of Antarctica's ancient environment to the American Geophysical Union during May.
They have been analyzing sediment samples from a site on the Maud Rise in the Weddell Sea. Because Antarctica is covered with ice, scientists cannot study its history with the usual surface geology techniques, but they can do it indirectly by examining materials that have eroded from the continent in the past and settled into the surrounding oceans. In this case, the researchers have looked at three measures - mass accumulation rate, grain size, and 'magnetic fabric' - for clues to past climate changes.
When it is warm and wet, soil and rock erode into rivers, which carry sediment into the sea. When ice sheets are present and mobile, the masses of moving ice drag sediment along, and the meltwaters from receding ice sheets help transport sediment from the continent to the sea. If the ice is cold and stable, there is a much lower mass accumulation rate (this is a measure of the amount of sediment deposited in a given time).
When the researchers look at grain size, this yields more clues about where sedimentary material came from and how it was deposited. For example, very fine dust grains were most likely swept off from dry, desert-like areas by the wind and slowly settled at random on the sea floor, uninfluenced by ocean currents. Medium-sized grains represent mud that flowed out in plumes from rivers or mud that was piled into drifts by currents that flow along the bottom of the ocean floor. Large grains were most probably deposited by very fast currents, perhaps those bringing material down quickly to the ocean bottom from the continental shelf.
Magnetic fabric analysis provides information about the orientation of the grains in a sample, and that in turn is related to the velocity of ocean currents at the time and place they were deposited, with faster currents making the grains align more with one another.
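As a rough illustration, the interpretive rules described in the last two paragraphs can be written down explicitly. In the minimal Python sketch below, the size cut-offs are generic clay/silt/sand boundaries chosen for illustration, not values taken from the study itself.

def infer_transport(grain_size_um):
    # Map a grain size (micrometres) to a plausible transport mechanism,
    # following the rules of thumb described above. The cut-offs are
    # generic sediment-class boundaries, used here only as placeholders.
    if grain_size_um < 4:
        return "wind-blown dust settling slowly (dry, desert-like source)"
    elif grain_size_um < 63:
        return "river plumes or bottom-current drifts (warmer, wetter source)"
    else:
        return "fast currents moving material down from the continental shelf"

for size in (2, 20, 150):
    print(f"{size} um grain: {infer_transport(size)}")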
Joseph and her colleagues were looking at the relatively dry time span between the late Cretaceous Period and the middle Eocene Epoch, 70 million to 45 million years ago. Previous studies seemed to indicate that there were desert-like conditions on Antarctica in this period, but the new evidence seems to go against this. The fine, wind-blown grains that should be found in desert-like conditions were only found much later, suggesting that the climate was dry, but maybe not as dry as people believed in the past. The data also indicate a gradual trend from dry conditions toward a warmer, wetter Antarctica, which agrees with other investigators' research.
The study helps to fill gaps in understanding what happened before the Antarctic ice formed, when it formed and how stable it has been, and that has implications which are relevant to today's global warming concerns. Right now, the stability of the Antarctic ice sheet is a very big issue, and Joseph's information provides a several-million-year picture of when the ice sheet might have been more mobile.
The data also suggest that it is not just a case of simple temperature changes dictating what happens to the Antarctic ice. The feedback is not as simple as warm weather causing the ice to melt, but rather, there is a more complex relationship involving ocean circulation patterns, and before scientists can do more useful climate modeling prediction, they will need to understand those feedback effects.
#
"Evidence found for ice-age El Ni±o",1534,0,0,0
(May '00)
An odd effect noticed by an American student working on her master's thesis has now been interpreted as evidence that El Niño events happened during the last Ice Age, according to a paper published in \IScience\i in mid-May. Up until now, archaeological evidence from South America had been taken to indicate that El Niño events belong to warm-weather conditions, and no evidence had been seen that El Niños occurred during glacial time periods. And perhaps even more surprising than finding the evidence on the North American continent was its location, right at the toe of the last ice sheet.
Tammy Rittenour's project involved looking at the way New England's glacial Lake Hitchcock drained at the end of the last Ice Age. While she found evidence to explain that, she also found evidence of El Niño effects in New England's climate 17,500 to 13,500 years ago.
As trade winds in the Pacific Ocean weaken, they allow unusually warm currents in the western Pacific to flow eastward across the equatorial Pacific to the western coast of South America. Each time this cycle is repeated, it changes global weather patterns, and some of these changes leave a mark on the environment.
Lake Hitchcock was formed when the farthest advance of the Laurentide Ice Sheet built a terminal moraine that acted as a natural dam to the runoff when the ice sheet melted. This flooded what is now the Connecticut River valley from southern Connecticut to northern Vermont, and later, the lake drained again. The question Rittenour and her colleagues were trying to answer was whether the lake drained gradually, in stages or catastrophically.
Working at the University of Massachusetts (UMass) at Amherst, Rittenour was effectively sitting on the edge of the ancient lake, so she and her colleagues used well-drilling equipment at a site on a campus athletic field to pull two cores from the ancient lake bed. The first core was 105 feet (30 meters) long, and a second core of 25 feet (eight meters) was taken because the longer core was missing the uppermost portion of the geologic record. The two cores were matched up to form one 110-foot core.
Sediments from deposits like this are varved, composed of alternate summer and winter layers which mark the years very clearly. Like any glacial waters, Lake Hitchcock was filled with fine sediment that was formed when the glacier dragged rock over rock, grinding and pulverizing the rocks to form a rock flour that is eventually carried out by meltwater.
Trapped in the waters of the lake, the larger, light-colored silt particles could fall to the bottom, but the stirring of winds on the surface and the new run-off pouring into the lake served to keep the smaller particles in suspension. Then in winter, when the surface was sealed by a sheet of ice and run-off had stopped, the fine dark sediment could fall out. Then summer would come, and only the heavy particles would fall, and so on, building a fine pattern of bands called varves.
Rittenour counted 1389 varves, and by using radiocarbon dates, she was able to determine the time that the ice retreated from Amherst and how long glacial Lake Hitchcock existed in Massachusetts. This was what she expected to achieve, but she also found that there was variation in the thickness of the annual layers. She began to suspect that there might be a pattern to the variation, and she knew that if there was, it had to be climate-related.
Ernst Antevs was a Swedish geologist who came to North America in the early 20th century and developed what is now called the New England varve chronology by examining and measuring remnants left by Lake Hitchcock and other now-dry glacial lakes in New York and New England. He correlated the varves, year-for-year and lake-for-lake, developing a chronology that covered 4,000 years of sedimentation, but Antevs could only do this because the amount of glacial flow entering the lakes was controlled by seasonal variations.
It soon became clear that Rittenour's patterns matched Antevs's, even though his samples came from different sites. The match extended down to a pattern of thicker layers, suggesting that the sedimentation pattern was the same all over the lake, and in other lakes as well. Normally, lakes have different sedimentation in different regions, so there was something special about these glacial lakes, and that something imposed the same variability on all the glacial lakes across the region.
They eliminated the earliest, thickest varves from their study because these were formed when the ice sheet was almost on top of the core site, so that their thickness only reflected the local effect of the melting ice sheet, not climate. Then Rittenour and her colleagues performed a statistical analysis to look for climate signals and a spectral analysis to look for periodicities.
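For the curious, here is a minimal Python sketch of the kind of spectral analysis involved: a simple periodogram picks out the dominant period in a varve-thickness series. The series below is synthetic, with a four-year cycle deliberately buried in noise, so this illustrates the method rather than reconstructing Rittenour's actual analysis.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1389)  # one synthetic 'varve' per year, as in the core

# Synthetic varve thicknesses: a 4-year cycle buried in random noise.
thickness = (1.0 + 0.3 * np.sin(2 * np.pi * years / 4.0)
             + 0.2 * rng.standard_normal(years.size))

# Periodogram: power at each frequency, after removing the mean.
spectrum = np.abs(np.fft.rfft(thickness - thickness.mean())) ** 2
freqs = np.fft.rfftfreq(years.size, d=1.0)  # cycles per year

peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the zero-frequency bin
print(f"Dominant period: {1.0 / peak:.1f} years")  # close to 4.0 here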
They found a grouping of three peaks of varve thickness. The first one lay between 2.5 and 2.8 years, the next between 3.3 and 3.5 years and the third from four to five years. In other words, right in the modern El Niño climate frequency. Says Rittenour: "The fact that there are three peaks is related to the way El Niño operates and this gives us a better idea that this is El Niño. We expected some climate signals were recorded in the sediments because of how the thickness of the lake sediments change from year to year. But we expected the climate to be influenced by something called the North Atlantic Oscillation, which is closer to New England, and not a climate signal from the tropical Pacific."
So now we know that El Niños have been around for at least 17,000 years, and there has been new evidence published this year that they existed 120,000 years ago during the last interglacial period. We shouldn't just expect them to go away, says Rittenour.
So how did Lake Hitchcock go away? Gently, says Rittenour, draining in stages, and that was the major part of her master's thesis, but it was the minor chance discovery that gained her a publication in the most prestigious American scientific journal.
Key names: Tammy Rittenour, Julie Brigham-Grette and Michael E. Mann.
#
"El Ni±o Update",1535,0,0,0
(May '00)
Late news: in early June, 2000, the El Niño Southern Oscillation (ENSO) Index went negative, indicating the likelihood of the drought in North America breaking, the prospect of the drought breaking in Ethiopia (see \JDrought strikes Ethiopia again\j, April 2000), and drought breaking out in Australia. The ENSO index is inclined to jump up and down, but the trend in mid-June was beginning to look convincing.
#
"June, 2000 Science Review",1536,0,0,0
\JThe Iceland Healthcare Database, ethics and bioinformatics\j
\JThe Arabidopsis Information Resource (TAIR)\j
\JNew McGill Centre for Bioinformatics\j
\JCreating a virtual plant\j
\JAdult stem cells are versatile\j
\JThe genetics of blood stem cells\j
\JAir-breathing rockets\j
\JAntimatter and space\j
\JFarewell to Compton\j
\JWater on Mars\j
\JHow much water on Mars?\j
\JAncient salts\j
\JNIAID lets AIDS research contract to Australia\j
\JMale circumcision and HIV\j
\JIslet transplantation and Type I diabetes\j
\JGrowing insulin-secreting cells\j
\JCocaine use and ADHD in children\j
\JNew measles DNA vaccine\j
\JAsthma and garbage\j
\JAsthma and insecticide sprays\j
\JAsthma and thunderstorms\j
\JA breast cancer metastasis suppressor gene\j
\JNovel prostate cancer vaccine\j
\JThe human genome working draft\j
\JThe one-day genome\j
\JCaterpillars safe from some types of Bt corn\j
\JFamily size and IQ\j
\JA gecko on the ceiling\j
\JThe secret of the Neandertal bones\j
\JAnts in the plants\j
\JBirds and dinosaurs\j
\JT. rex may be in for a name change\j
\JHawaiian volcanoes, hot spots and ridges\j
\JPredicting toxic algal outbreaks\j
\JPacific leatherback turtles face extinction\j
\JHow cancer keeps on keeping on\j
\JFruit fly model: how mosquitoes carry malaria\j
\JFood chains and pond size\j
\JAn unusual crocodile fossil\j
\JModifying malaria mosquitoes\j
\JHumans can regrow liver cells from bone marrow\j
#
"The Iceland Healthcare Database, ethics and bioinformatics",1537,0,0,0
(Jun '00)
Bioinformatics has suddenly become a buzz-word, with a major article on the topic in the July 2000 issue of \IScientific American\i, and breakthroughs and developments turning up all over the place. See, for example, \JThe Arabidopsis Information Resource (TAIR)\j or \JNew McGill Centre for Bioinformatics\j and \JCreating a virtual plant\j, all this month. Or consider the wider implications of the Iceland Healthcare Database (IHD), discussed in the \INew England Journal of Medicine\i during June, and described below. Or consider the new database created to describe the results outlined in \JThe genetics of blood stem cells\j.
In simple terms, bioinformatics is the junction between computerised management of large data banks and all of biology. A classic example is the human genome (\JHUGO\j) project, which relies heavily on computers for its completion, and uses computers to store, transmit and compare the data from different sources.
But it also involves gathering information on human populations, whether it is the extended Corbett family in Australia, about half of whose 200 members carry the gene for an unpleasant bone condition called \JPaget's disease\j, or the whole population of Iceland.
Why Iceland? Well, Iceland has medical records for all its citizens going back to World War I and detailed genealogic information going back even further. Because Iceland's small population (270,000) has long been isolated and homogeneous, it is thought by many to be an ideal place to search for disease-related genes. That means that, if the nation's people were willing to be tested, the whole world (and the bioinformatics companies involved) would benefit. In the case of the Corbett family, the family stands to benefit, as future generations will be able to be diagnosed more readily, but there will also be benefits for others beyond the family, so even here, the key ethical question of 'data ownership' is an issue. There are also questions of discrimination as well: if a family can be shown to have a higher probability of possessing a damaging gene, should they pay higher insurance premiums, or be barred from certain jobs, training, or other benefits?
So the new technology offers interesting possibilities in developing methods to understand diseases better, but at the same time, it also presents new ethical challenges. A large part of the solution would seem to be a question of informed consent. Only in this way can science gather the information it needs for the high-powered data analysis that will answer the questions we thought were out of reach.
In 1997, deCODE genetics, a genomics company located in Iceland, proposed the construction of a centralized database called the Iceland Healthcare Database (IHD). The aim was to allow researchers 'to cross-reference phenotypic information with a large amount of genotypic and genealogic data.' In ordinary terms, they needed measures on people (the phenotype), together with their genetic make-up and their family relationships (the genealogy).
The IHD's stated aim is 'to discover new methods to diagnose, prevent, and cure common diseases.' As soon as this was announced, it raised many questions about scientific ethics, protection of privacy, and the possible corruption of science by commercial interests. There was a vigorous debate before the Icelandic parliament passed a law permitting the construction of the IHD. This was to be a database made up of information from the medical records of all Icelandic citizens.
The debate involved 700 newspaper articles, more than 100 radio and television programs, and several town meetings all across Iceland, and it led to 75% support for the law, the Health Sector Database Act, when it was passed late in 1998, with the remaining 25% opposed. A survey taken in April 2000 revealed that "90% of those expressing an opinion support the law now, with just 10% against it."
This is a slightly glib use of statistics, for what it leaves unsaid. To see this, you need to know that the 'don't know' group was around 14%, meaning that actual support for the IHD had only shifted from 75% to 77.4% in that time. The opponents to the IHD argue that it is more accurate to call the process "community consultation" than to speak of "community consent."
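The arithmetic behind that caveat is easy to verify, as the short Python calculation below shows, using the roughly 14% 'don't know' figure quoted above.

# 90% support "of those expressing an opinion" is not 90% of everyone,
# because the 'don't know' group is excluded from the denominator.
dont_know = 0.14
expressing = 1.0 - dont_know            # 86% expressed an opinion
support_of_expressing = 0.90
support_of_population = support_of_expressing * expressing
print(f"Support among all respondents: {support_of_population:.1%}")  # 77.4%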
The law allows the IHD data to be taken under the assumption of 'presumed consent.' 'Presumed consent' is a nebulous concept, but in the context of this project, it is now treated as the 'consent of society to the use of health care information according to the norms of society,' according to the IHD collectors.
The questions to consider are:
\I* What right do I have to deprive my society of data about me that may help my society to prevent others, perhaps my descendants, and certainly my relatives, suffering from genetic conditions that I may suffer from?
* Will I suffer in any way if I am seen to be protecting my privacy?
* To what extent can the information I allow to be taken be used against me?
* Will the benefits be shared fairly?\i
Another way to put this would be to ask: whose rights are the most important, those of the individual, or those of society? Of course, it is always possible that we might be able to respect both, but in general, favoring one set of rights must damage the other set.
The IHD law allows individuals to opt out of the database by filling out a form which is freely available. The list of those who have opted out of the IHD shows no names, and social-security numbers which appear on the list are encrypted to diminish the likelihood that opting out of the IHD will lead to discrimination.
Of the 8.6% who remain strongly opposed, 80% - or 7% of the nation's population - have opted out of the database. The remaining individuals can be cross-matched on genotypic data, but only in accordance with strict rules set out by the Data Protection Commission of Iceland. (That, at least, is the way deCODE puts it. Critics say that the 18,000 who have opted out represent more than 10% of Iceland's adult population.)
An unresolved issue is whether deCODE will be allowed to ask for broad consent from participants to correlate any information in the database with data on variance in their genomes (genotypic data). "Broad consent" as applied here indicates consent in which the potential subjects cannot be informed in the same detail required by informed consent.
At the same time, broad consent is a far cry from "blanket consent," which would give researchers an unrestricted right to use the data or the biologic sample.
According to IHD, "With broad consent to use the genotypic information to study the genetics of health and disease it would be possible to use combinatorial analysis systematically to seek the best fit between all regions in the genome and all phenotypic variants recorded in the data base. The interests at stake are not trivial, because, without broad consent, the data base would be only an extraordinarily effective tool for classic gene mapping, rather than a revolutionary method for studying the interplay between genetics and environment in human disease and health."
Broad consent is generally accepted by bioethicists, although there are those who argue that broad consent has problems because of the difficulties in making certain that consent is informed. Nobody, they say, should participate in biomedical research unless he or she makes an informed decision to do so, and nobody should be coerced or tricked into making such a decision. The goal is to protect the autonomy of the individual; the tool is informed consent.
By their decision to engage in the debate on bioethics, such people have marked themselves out by placing the rights of the individual who can be identified above the rights of the unknown individual who may be condemned, unnecessarily, to die an unpleasant death from a hideous disease. At the other extreme, there are bound to be medical specialists, fervent in their mission to beat disease, even if they need to risk causing harm to a few for the sake of the many. In other words, there is no clear and easy answer, but as we stow more and more information in bioinformatic systems, the need to reach a consensus becomes more important.
Then there is the question of tainted data. If somebody \Idid\i breach the guidelines and misuse data, then found the 'cure for cancer', should we use it? The issue arose some years ago when people realised that data obtained from human guinea pigs in concentration camps during World War II were available for use. Should we ignore the data, making the victims' deaths totally pointless, or should we recognize that we cannot undo what is done and use the data to save lives? Once again, there is no easy answer.
There are a few simple rules that nobody would argue with: we should not ask children or mentally impaired people to offer consent, and it is important, in a democratic society, to preserve the individual's right to make uninformed and even foolish decisions. Yet, if that decision was to commit suicide, we would have no compunction about stopping them. So where do we draw the line? Should we force somebody to agree to providing data which would save their lives? If we take away the freedom to murder, should we not take away the freedom to withhold information that would save lives? How about information that only \Imight\i save lives?
In the end, it probably comes down to trust, creating an environment where science and scientists are seen as being a benefit to society. But in a world where more people know their star sign than know what a ribosome does, or what granite is made of, or what the laws of thermodynamics are about, what chance is there of that?
According to deCODE, there are three parts to genetic research in their database: 'The first is consent to the acquisition of the biologic material that is used as the source of DNA. The second is consent to the genotyping of the DNA. The third is consent to the use of the genotypic information that results.'
The first two issues are common to many studies, but the third aspect is different because the operators seek broad consent, which will allow them to go on fishing expeditions, looking for the unexpected correlations that may lead to breakthroughs. Yet the analysis generates information about the group, not about the individuals involved, so there should be no real cause for concern. Unfortunately, the consent process for correlating genotypic data with the medical-records information contained in the IHD has yet to be defined by the committees that will oversee the IHD, and given a wide public inability to distinguish correlations from causes, it will be a matter for special concern.
This is where clever design comes in. The IHD data will be copied from medical records that are filed under individual names and social security numbers at the various health care institutions, but stored with identification numbers encrypted by the Data Protection Commission of Iceland. More importantly, information will be retrievable from the database only for groups of 10 or more people, not for individual persons. The end result is that the confidential information would be much easier to get from the original records than from the IHD.
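The 'groups of 10 or more' rule is a simple but effective privacy device, and the minimal Python sketch below shows the idea: a query layer that refuses to answer when the matching group is too small to hide an individual. The record fields, names, and enforcement details here are all invented for illustration.

MIN_GROUP_SIZE = 10

# Hypothetical records, keyed by encrypted identifiers.
records = [
    {"id_hash": f"enc{i}", "diagnosis": "asthma" if i % 3 else "diabetes"}
    for i in range(25)
]

def query(predicate):
    # Return matching records only if the group is large enough for no
    # individual to be singled out by the query.
    matches = [r for r in records if predicate(r)]
    if len(matches) < MIN_GROUP_SIZE:
        raise PermissionError("Query refused: group smaller than 10")
    return matches

print(len(query(lambda r: r["diagnosis"] == "asthma")))   # allowed: large group
try:
    query(lambda r: r["id_hash"] == "enc7")               # refused: one person
except PermissionError as err:
    print(err)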
The deCODE staff argue that it is probably better for a private company to hold this information than for the state to do so, since governments can violate the privacy of individuals to advance the interests of society as a whole. On this logic, government should hold the data, since a private company might use it to make a profit, something governments seem unable to do. This still raises the question of why a welfare state like Iceland should give away for nothing a data set that will allow the private company to make considerable profits, even if they benefit humanity at the same time.
The deCODE case also suggests that if a health care database managed by a private company violates privacy, the company can be closed down, while violations can be dealt with by prison sentences and fines. It is hard to see why such limits could not also apply to a government database.
But what of possible benefits for participants? The structure of the database makes it difficult to connect discoveries made through the use of the database directly with individual persons. Still, if the appropriate authorities granted permission, it would be relatively easy to identify and contact all persons in Iceland who had a particular risk factor for disease. As part of this, people giving genetic material under explicit consent will have the option to specify that their data should be linked to their names, and whether they should be notified about any associations between alleles they carry and specific diseases or predispositions to the development of disease.
But even a scenario like this has its problems. Suppose research shows that Snorri's syndrome sufferers all have a mutated g5-alpha gene. Until we know what proportion of the population carries that gene, we can say little about the linkage. After all, every one of those people with Snorri's syndrome will also have genes for good hemoglobin, good digestive enzymes, and many other common genes as well.
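A short worked example makes the point. In the hypothetical Python calculation below (all figures invented), a variant carried by every patient turns out to mean very little, because nearly everyone else carries it too.

# Base rates matter: a variant found in 100% of (hypothetical) Snorri's
# syndrome patients tells us almost nothing if it is nearly universal.
patients_with_variant = 1.00      # all patients carry the variant
population_with_variant = 0.95    # ...but so does 95% of everyone else

enrichment = patients_with_variant / population_with_variant
print(f"Enrichment: {enrichment:.2f}x")  # about 1.05x: no meaningful linkage

# Contrast with a variant that is rare in the general population:
print(f"Rare-variant enrichment: {1.00 / 0.02:.0f}x")  # 50x: worth pursuing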
This outline is based partly on material provided by Jeffrey R. Gulcher and Kari Stefansson, both of deCODE genetics, and so may understate the concerns felt by opponents to the IHD plans, although every effort has been made to add these concerns at the appropriate points. The criticisms have been described by George Annas in the same issue of the \INew England Journal of Medicine\i. Annas, who clearly opposes the whole concept, argues as follows on the consent issue:
"With Glantz and Roche, I have previously proposed that new consent would not be necessary for research if the DNA were stripped of its identifiers, so that it could not be traced back to the individual person. Such research with nonlinkable data, sometimes termed "anonymous" research, does not violate individual privacy, since the individual source of the DNA, by definition, cannot be identified and thus cannot be harmed. On the other hand, if identifiers are retained, then persons can be tested, screened, and analyzed without informed consent each time a new genetic research project is initiated. This is not acceptable, because by definition such prior consent is not informed consent (since the person does not know what he or she is agreeing to); therefore, the authorization is actually a waiver of consent, which is unacceptable for research."
Having shown that there are serious problems with linked DNA, Annas goes on to note: "The only question for nonlinkable DNA samples, which are simply identified as "Icelandic," is whether individual citizens are willing to permit their DNA to be used in simplistic and reductionist genetic research that could label the Icelandic people in a negative way -- for example, as being prone to violence or alcoholism."
This is not the first time that ethical concerns like these have been raised, nor will it be the last.
#
"The Arabidopsis Information Resource (TAIR)",1538,0,0,0
(Jun '00)
As part of this month's lead theme on bioinformatics, it is worth noting the methods being used to ensure that scientists around the world have access to the latest information on the genome of the small workhorse of plant geneticists, the mustard plant \IArabidopsis thaliana\i.
Over the next decade, geneticists plan to reach an understanding of the function of every gene in \IArabidopsis\i. Now, the data needed to isolate essentially any gene in the plant have been released by The Arabidopsis Information Resource (TAIR). TAIR is funded by the (US) National Science Foundation (NSF), which has announced the release while acclaiming it as representing "a new paradigm in public-private information sharing for the genomic study of model organisms." TAIR offers a comprehensive list of more than 39,000 polymorphisms between subspecies of the plant via a Web site, http://www.arabidopsis.org/cereon
The key to the excitement at the NSF comes in the last part of the URL, because the information was gathered by Cereon Genomics, LLC, which has taken a groundbreaking step in making privately owned data available to the public sector. If followed by others, this lead will revolutionize the sharing of biological data in the post-genomic era.
Polymorphisms between subspecies of an organism typically are used to map genes on a chromosome and eventually isolate the gene to study its function. No other organism has such a rich collection of polymorphisms accessible to the academic and nonprofit sectors.
Researchers in the academic and nonprofit sectors can access the new Arabidopsis data by registering at www.arabidopsis.org/cereon and agreeing to the terms of access set by Cereon. The agreement is based on the reasonable premise that the Cereon information cannot be used in a commercial setting (i.e., selling parts of the database for profit).
This is not a restriction on any results gained from the use of the information (i.e., if the data lead to the discovery of a gene, the restriction does not apply to the gene). Registered users will get a free license and can publish up to 20 polymorphisms without obligation to Cereon but cannot sell or redistribute the data. Other potential users should contact Cereon by email at athal@cereon.com for a commercial license.
#
"New McGill Centre for Bioinformatics",1539,0,0,0
(Jun '00)
Once a new branch of science reaches a certain level of complexity, the specialists in that branch need to work closely together, to spark off each other, and to train the next generation of specialists. Bioinformatics has just reached that stage, and McGill University in Montreal, Canada, which almost a century ago was gathering in some of the best minds to work on radioactivity, has just set its sights on establishing a Centre for Bioinformatics.
Geneticists now have 90% of the human genome available and, as we have been reporting over recent years, many smaller genomes have also been sequenced in their entirety. This leaves us about to enter what is being called the "post-genomic age," which experts say will transform our society to an extent unparalleled in history.
So what is bioinformatics, and what is its promise?
Bioinformatics is the tool we will use to understand the integrated behavior of 100,000 genes. It will help us understand how they turn one another on and off in cells and between cells. It will also bring us to a better understanding of the signaling networks within and between cells.
It is now very clear that biology needs computational and mathematical techniques to deal with the flood of data coming out of the world's genomic laboratories. We have RNA chips which can tell us how active hundreds of genes are in a given cell type, and the only way to deal with that sort of information is by computer analysis. This is especially so as the data are combined into ever larger databases, snapshots of different cells at different moments, snapshots which need to be welded into a 'movie' that starts to reveal developmental pathways.
In the future, we should be able to direct the movies, giving a group of cells a hormone, and then watching to see the order in which genes become active and then drop from the cellular scene, building the picture up by computer analysis of the data - that is, by bioinformatics.
Most of the work being done now relies on a standard method of statistics called cluster analysis. This looks for groupings of genes, seen by the computer as 'clusters' in many-dimensional space, as these clusters are reflected in chip data. In this sort of picture, genes that cluster on a regular basis are likely to be related in some way, offering clear hints about the functions of unknown genes as soon as the function of one gene is found.
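To make the idea concrete, here is a minimal Python sketch of the heart of the method: genes whose expression profiles rise and fall together score a high correlation and would land in the same cluster. The genes, conditions, and numbers are all invented; real analyses involve hundreds of genes and more elaborate clustering algorithms.

import math

# Hypothetical expression levels for five genes across four conditions.
expression = {
    "geneA": [1.0, 2.1, 3.9, 8.2],
    "geneB": [1.1, 2.0, 4.2, 7.9],   # tracks geneA: likely related
    "geneC": [8.0, 4.1, 2.2, 1.0],   # moves in the opposite direction
    "geneD": [7.8, 3.9, 2.0, 1.2],   # tracks geneC
    "geneE": [3.0, 3.1, 2.9, 3.0],   # flat: tracks neither group
}

def correlation(x, y):
    # Pearson correlation: +1 means the two profiles move in lockstep.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Genes with high mutual correlation would be grouped into one cluster.
print(f"geneA~geneB: {correlation(expression['geneA'], expression['geneB']):+.2f}")
print(f"geneA~geneC: {correlation(expression['geneA'], expression['geneC']):+.2f}")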
Another aspect involves looking at the genomes of different organisms to see if there are any identified genes that are similar to the genes in a new genome. Many human genes, for example, are immediately recognisable because of their similarity to genes in mice, worms or yeasts, but it takes a computer to carry out the search of an entire haystack to find a single needle.
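The same idea can be sketched in a few lines of Python. Real tools such as BLAST use far more efficient indexing and careful statistics; the 'genome' and query sequence below are invented, and the scoring is the crudest possible, one point per matching base.

def best_match(query, genome):
    # Slide the query along the genome and return the best-scoring
    # position, counting exact base matches at each offset.
    best_score, best_pos = -1, -1
    for i in range(len(genome) - len(query) + 1):
        score = sum(q == g for q, g in zip(query, genome[i:i + len(query)]))
        if score > best_score:
            best_score, best_pos = score, i
    return best_pos, best_score

genome = "ATGGCGTACCTAGGATCCGTAAATGCGT"   # invented target sequence
query = "CTAGGATCAG"                    # invented query, imperfect match
pos, score = best_match(query, genome)
print(f"Best hit at position {pos}: {score}/{len(query)} bases match")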
The methods available now are not particularly useful when it comes to deciding that gene X triggers the activity of gene Y, which produces an enzyme that turns off gene X, while at the same time triggering gene Z into activity. The promise of bioinformatics is that it will go far beyond cluster analysis. It will not replace traditional laboratory studies, but it will focus them on the key questions that need to be answered or confirmed.
Bioinformatics should help us to unscramble the whole mysterious area of the signals that pass from cell to cell, an area that we only understand faintly right now. We can also anticipate that scientists will find that many of their simplistic linear pathway models of development will turn out to be simply the clearest lines in a complex network. Identifying the whole network is another part of the promise of this new field.
Bioinformatics should give us the power to deal with the cells of a tumor so that they differentiate into some other kind of cell, unable to behave as cancer cells do. Or we may be able to trigger them to apoptosis, the controlled cell death that our body uses in so many other ways to maintain us as we are.
The prospects for controlling cancer are fascinating. Most cancer cells are monoclonal, derived from some single cell that went wrong, but as cancer cells differentiate, the cloning process is 'leaky' -- some of the resulting cells are normal. If bioinformatics can track down why some cancerous cells revert to normal, and if scientists can then use this to drive the whole of the cancer back to normality, cancer will cease to be a worry. There are several big 'ifs' there, but this is where bioinformatics may lead us.
But the promise does not end there. Rather than extracting islets from cadavers to infuse into living diabetics (see \JIslet transplantation and Type I diabetes\j, this month), we will be able to regenerate the islet cells in the pancreas of a diabetic, avoiding all of the problems of immunosuppression that we now face.
This brings us to McGill's new centre, which is to be fully operational by the end of 2001. It is, says the university, "to be led by an internationally-renowned scientist, will have 20 full-time researchers working together, including current academic staff involved in genomics, structural biology, genetic epidemiology, physiology, computer science, bio-instrumentation, proteomics and other related fields."
That list is probably as good a short definition as anybody can give right now of the extent of bioinformatics, though for a detailed definition, McGill's list of desired skills (below) is also a help. Right now, we are in the position of the first atom-splitters, trying to see what a nuclear weapon and a nuclear power industry might be like, but completely unaware of the potential for nuclear medicine. Bioinformatics will be all of the things in those lists, but there are bound to be some surprises as well.
One thing that is certain is that bioinformatics is an inherently interdisciplinary field. At McGill, medicine, science, engineering and agricultural and environmental sciences will all have a part to play, but the rest is something we will have to wait to see.
But in this age of 'virtual everything,' where will the scientists be? On the campus, according to Luc Vinet of McGill. Research academics know that one of the key elements to successful research is a good tea room, where staff can sit, away from their bubbling vats and hissing jeroboams, discussing their progress and their problems, and Vinet probably recognizes this as well. He says he wants to see bioinformaticians working in the immediate vicinity of genomics and proteomics experts, since they constitute the main driving force behind bioinformatics.
No doubt we will see other institutions moving in this direction over the next two or three years. For those who think they'd like to get involved in something like this, here is the skills shopping list that McGill has set up in its hunt for staff:
* \BBioinstrumentation\b: Instrumentation and experimental methods will be needed for measuring and acquiring information about the structure and function of biologically active molecules, including DNA, RNA, and proteins.
* \BFunctional genomics\b: Collaborative work between medical scientists and computer scientists will identify clearly defined medical problems that are amenable to careful analysis by accumulating accurately timed gene expression data. This work will identify novel control points to target pharmaceuticals.
* \BStructural biology\b: A key problem is to figure out the structure and function of biomolecules based on the sequence information, and to rationally design new drugs based on known physiology and biochemistry.
* \BComputation and DNA\b: This work is directed at developing novel approaches to the interface of computation and molecular biology.
* \BGenetic analysis\b: The genetic dissection of disease, particularly the complex human disorders such as asthma, heart disease, diabetes, cancer, etc. requires the development of a firm theoretical and practical understanding of how to best apply methods for pedigree and population genetic analyses.
* \BData mining and visualization\b: Data mining methods are being developed to analyze hidden information in large databases. Computational methods are being applied to analyze databases of physiological function, e.g., heart rate variability. This area entails joining powerful computer methods for data storage and analysis to biological data.
* \BGenetic system modeling and identification\b: Enhanced computer power makes possible realistic modeling of physiological systems on scales ranging from the molecular to the organism. Modeling plays an increasingly important role in understanding the dynamics of gene expression; protein synthesis; ion channels; function of heart, nerve and endocrine cells, and properties of aggregates of cells.
* \BBiomedical imaging\b: New power in imaging is having an extraordinary impact on experimental systems and on understanding the human body in the clinic. Collaborative work between basic scientists, computer scientists and medical scientists will extend the new technologies in both experimental and clinical applications.
There seems almost to be a 50-year cycle going on in biology. It is probably just coincidence or selective attention by the observer, but the 1850s saw the emergence of our modern notions of evolution, while the first ten years of the twentieth century saw the foundation of basic genetics, and the 1950s saw the unravelling of the structure of DNA. Now, it seems, we are poised on the brink of yet another surge of activity, but it is a surge that has its roots in the whole of science.
The revolution of the 1950s, the working out of the structure of DNA, came about when the field of biology was invaded by physicists and a few chemists. Bioinformatics will see an invasion of mathematicians and a few physicists. If the pattern holds good, who will enter the field of biology in the 2050s?
#
"Creating a virtual plant",1540,0,0,0
(Jun '00)
In yet another variation on the bioinformatics theme, a group of plant scientists is proposing that the data available on the genetic workhorse plant, \IArabidopsis thaliana\i, are sufficient to consider the development of a 'clickable plant,' a four-dimensional computer model that can show the biological machinery of a plant in all three physical dimensions plus time. The information is not all there at the moment, they say, but there is enough to make a framework which can be added to.
They plan to seek support from the US National Science Foundation for a multinational effort to determine the function of all the genes and proteins in \IArabidopsis\i so they may be built into the virtual plant that can then be used to examine every aspect of a plant's development.
In a brief announcement on the Internet during June, Joanne Chory of the Salk Institute, one of the proponents of the plan, said, "We'd love, for instance, to see a four-dimensional view of a plant that covers all the details from when the seed germinates to when the next generation seeds fall off the mother plant. And we'd like to be able to stop the process at any phase in the plant's life-cycle and see which proteins are expressed and how they interact."
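To make the idea of a 'clickable plant' concrete, here is a minimal sketch in Python of the kind of query such a model would answer. The gene names and the (stage, tissue) entries are invented for the illustration; a real virtual plant would hold genome-scale expression data for every phase of the life-cycle.

    from collections import defaultdict

    # Toy model: index protein expression by (life-cycle stage, tissue).
    # All entries below are invented for the example.
    expression = defaultdict(set)
    expression[("germination", "root tip")].update({"AUX1", "PIN2"})
    expression[("flowering", "petal")].add("AP3")

    def proteins_at(stage, tissue):
        """'Stop the process at any phase' and ask what is expressed there."""
        return expression[(stage, tissue)]

    print(proteins_at("germination", "root tip"))   # {'AUX1', 'PIN2'} (in some order)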
#
"Adult stem cells are versatile",1541,0,0,0
(Jun '00)
A \IScience\i report in early June indicated that reprogrammed adult neural stem cells may be able to generate all sorts of cell types, such as the cells in heart, liver, muscle, intestine and other tissues. When the adult neural stem cells from mice were grown with embryonic cells or within an embryo, the adult stem cells were able to revert to an unspecialized state and give rise to different cell lineages.
While this sort of versatility is expected in embryonic stem cells, which are the "blank slates" of an organism that are capable of developing into all types of tissue in the body, the Swedish discovery of similar abilities in the adult stem cells is something of a surprise.
Scientists have been interested in the potential of embryonic stem cells for some time. The cells might be used some day to create new tissues for organ transplants and replacements for cells destroyed by diseases like diabetes or trauma like spinal cord injuries. However, there are ethical and legal problems in using the embryonic cells, making the adult cells an easier choice - if only they could perform. (For more on the embryonic stem cells, see \JA breakthrough with human embryonic stem cells\j, November 1998.)
Until recently, researchers thought that the more specialized adult stem cells, found in areas of the body like the skin, nervous system, and blood and lymph systems, could only give rise to their own kind. Now evidence is coming together that adult stem cells may be capable of reprogramming themselves. In early 1999, a mouse study showed how brain stem cells transplanted into bone marrow could produce blood cells, and now this.
The Swedish team exposed genetically tagged mouse neural stem cells to a variety of tissue types by growing them together with embryo cell cultures in the lab and injecting them into early-stage chick and mouse embryos. They did this to test the influence of environment on adult stem cell destiny, and found that the adult stem cells take cues from their cellular environment to produce offspring of the same type as the cells that surround them.
In lab cultures, the offspring of the stem cells switched their identities to become muscle cells. Depending on which early cell layer they managed to infiltrate in the developing chick and mouse embryos, the various stem cell progeny contributed to heart, lung, intestine, kidney, liver, nervous system, and other tissues.
Even lone neural adult stem cells displayed this ability to differentiate themselves into various cell types. In one case, the researchers managed to achieve apparently normal and beating embryonic mouse hearts containing very large numbers of these derived stem cells.
As yet, it is a mystery how the environment manages to influence an adult stem cell's fate: "The short answer is that we have no clue," says coauthor Jonas Frisén of the Karolinska Institute in Stockholm. "We can speculate that the crucial elements are extracellular signals, or secretions from the embryonic cells. There is probably a cocktail of various factors involved, but we have no solid data yet about what these molecules are."
Obviously, this is the next step in the puzzle, because if scientists can determine the molecular composition of these extracellular signals, they can direct adult stem cells toward several different cellular lineages, without exposing them to embryonic cells at all. Or, to use more lurid language, we will be just a step or five away from growing spare organs for people.
Some of the answers may come from testing other types of adult stem cells, not just neural cells, to see if they have similarly plastic potential. (Note that 'plastic' in this context means 'changeable'.) This will be an important question to answer, because neural stem cells are the least accessible of all the stem cell types. The research team is also planning future experiments to see how long the transformed stem cells survive within these tissues, and whether they retain their new commitments indefinitely.
See also: \JThe genetics of blood stem cells\j, this month
#
"The genetics of blood stem cells",1542,0,0,0
(Jun '00)
A group of American scientists reported in the journal \IScience\i in early June that they have developed a database of more than 2,000 genes which gives the first comprehensive picture of the workings of blood stem cells. This will contribute to solving one of the great mysteries of biology: how blood cells replenish themselves. The short answer is that blood stem cells are the master component of bone marrow that gives rise to all cellular constituents of blood, from red and white cells to platelets, while the long answer is still coming - but now it is closer.
The prospect is that biologists will now have a better insight into diseases of the blood, such as leukemias, and also into how blood stem cells may better be used therapeutically in transplantation and in eventual gene therapy scenarios. The research also may yield insights into other types of stem cells throughout the body, such as those responsible for the production of skin, intestinal cells, and liver tissue.
One of the fascinating aspects of stem cells is their balancing act between maintaining their own numbers and spawning progeny that go on to become the many types of mature specialized cells of different tissues and organs. Very little is known, in any mammalian stem cell system, about how the underlying molecular biology allows these cells to make choices about their fates.
The researchers created a library of gene fragments from blood stem cells of mice. They also created a library of genes from a sample of mature blood cells that had been depleted of stem cells. Then it was a matter of arithmetic to subtract one database from the other to take out the majority of commonly expressed 'housekeeping' genes while enriching for those that are preferentially expressed in the immature stem cells.
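The logic of that subtraction can be sketched in a few lines of Python. The gene names are invented, and the real procedure worked on hybridized cDNA libraries rather than tidy lists, but the underlying idea is a simple set difference:

    # Hypothetical gene lists; the real libraries held many thousands of clones.
    stem_cell_library = {"geneA", "geneB", "geneC", "housekeep1", "housekeep2"}
    mature_cell_library = {"geneC", "geneD", "housekeep1", "housekeep2"}

    # Removing everything also present in the mature, stem-cell-depleted sample
    # strips out the common 'housekeeping' genes, enriching for genes that are
    # preferentially expressed in the immature stem cells.
    stem_enriched = stem_cell_library - mature_cell_library

    print(sorted(stem_enriched))   # ['geneA', 'geneB']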
The next step was to analyze the DNA sequences in the 'subtracted' library using sophisticated computational techniques and comparing them to the sequences of many other genes and proteins. With this approach, the researchers have so far identified more than 2,000 genes that are likely to be active in stem cells.
This beats hands-down the traditional method of searching for an animal that has a stem cell disorder and looking for the gene mutation that causes the problem, or any method which relies on focusing on small numbers of genes previously identified in other systems. It is a tremendously exciting approach that we can expect to see applied in other areas.
This big-picture approach could answer a host of critical questions, say the researchers. What, for example, prevents the proliferation process from getting out of hand, as it does in the leukemia blood cancers? And how are the correct proportions of mature blood cells produced on a daily basis throughout life?
Best of all, thanks mainly to the efforts of postdoctoral researcher Robert Phillips, there is a Web site that carries the database of genes and outlines many aspects of stem cell biology, located at http://stemcell.princeton.edu and open to all visitors, although it is heavily technical. This database is to be developed further, and looks set to be one of the more innovative tools of the coming revolution in bioinformatics.
Key names: Ihor Lemischka, Robert Phillips, and G. Christian Overton.
See also: \JAdult stem cells are versatile\j, this month
#
"Air-breathing rockets",1543,0,0,0
(Jun '00)
NASA's work on developing an air-breathing rocket engine continues to move ahead: in May 2000, such an engine -- formally, a rocket-based, combined-cycle engine -- achieved a successful one-hour burn at the General Applied Sciences Laboratory (GASL) in Ronkonkoma, NY.
This is the longest burn yet on an air-breathing rocket engine, and brings closer the day when space transportation becomes safe, reliable and affordable for ordinary people. Powered by engines that 'breathe' oxygen from the air, the spacecraft would be completely reusable, take off and land at airport runways, and be ready to fly again within days.
The engines are to get their initial take-off power from specially designed rockets, called air-augmented rockets, that boost performance about 15% over conventional rockets. When the vehicle's velocity reaches twice the speed of sound, the rockets are turned off and the engines rely totally on oxygen in the atmosphere to burn the hydrogen fuel. Once the vehicle's speed increases to about 10 times the speed of sound, the engine converts to a conventional rocket-power mode to propel the vehicle into orbit.
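The three operating regimes can be summarized in a small Python sketch. The Mach 2 and Mach 10 thresholds come from NASA's description above; the function itself is purely illustrative:

    def propulsion_mode(mach):
        """Return the engine regime for a given vehicle Mach number."""
        if mach < 2:
            return "air-augmented rocket (take-off boost, about 15% over a conventional rocket)"
        elif mach < 10:
            return "air-breathing mode (atmospheric oxygen burns the hydrogen fuel)"
        else:
            return "conventional rocket mode (final push to orbit)"

    for mach in (0.5, 5.0, 12.0):
        print(mach, "->", propulsion_mode(mach))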
The next step, according to NASA, will involve defining the flight-weight structures and engine systems required for in-flight demonstration of this advanced propulsion system.
#
"Antimatter and space",1544,0,0,0
(Jun '00)
Every space opera scriptwriter knows all about antimatter and its potential as a fuel. The details are never all that clear, but during battle scenes, you may reasonably expect the chief engineer to complain that the containment fields are about to go down. This would be fairly worrying because the thing about antimatter is that, when it comes in contact with matter, it flashes out of existence, converting the mass of both the matter and the antimatter to energy, lots of it.
That, at least, is the standard understanding for science fiction purposes: antimatter may be talked about, but there isn't a lot of it around. In reality, there is quite a lot of it about, but only in tiny lumps, far too little to vaporise any starship and its crew while sterilizing every dust mote within 15 parsecs.
A number of serious scientists believe that antimatter may well be the key to the future of human space travel. The energy yields are certainly high, but there are still the problems of gathering and storing the antimatter until it is needed. We have known about antimatter for the better part of a century, and we even use it (see below), but we still have a great deal to learn.
The antimatter story goes back to 1928, when P. A. M. Dirac, a British physicist, came up with a theory for the motion of electrons in electric and magnetic fields. This was not the first such theory, but Dirac had managed to include the effects of Einstein's Special Theory of Relativity in his theory.
Dirac's equations worked exceptionally well, describing many attributes of electron motion that previous equations could not. There was just one oddity: it followed from his theory that the electron must have an 'antiparticle,' identical with the electron - having the same mass but a positive electrical charge, while the standard electron has a negative charge.
This was the heyday of quantum theories, when any decent physicist had to expect to believe three impossible things before breakfast, so the world of science was unconcerned by the prediction. However, physicists around the world set out to discover the truth of the matter by experiment.
In fact, it took until 1932 for Carl Anderson to observe this new particle experimentally; it was named the 'positron,' a name that may give the clever reader a clue as to how we use antimatter today. This was the first known example of antimatter, but by 1955, an antiproton had been produced by the Bevatron at Berkeley, and, in 1995, scientists at the CERN research facility in Europe created the first antihydrogen atom by combining the antiproton with a positron. They had a rather limited time to celebrate, as these antihydrogen atoms, created at almost the speed of light, lasted for only about 40 nanoseconds.
For most physical purposes, the fundamental particles in nature are identical to their corresponding antiparticles. More importantly, \Ievery\i fundamental particle must have an equivalent antiparticle, differing only in the mathematical sign of some property, and many particle-antiparticle pairings have been established by experiment, even oddities like the neutron and the antineutron, both with a neutral charge, but still capable of annihilating each other. Other properties, such as magnetic moment, are also reversed in sign.
Antiparticles can be captured in a Penning trap, which uses a combination of low temperatures and electromagnetic fields to store antimatter, though nowhere near enough to power a starship's warp drive. Still, just as box kites were the forerunner of modern airliners, so we may one day look back at the Penning trap and see it as the place where antimatter drives began.
Yet, if there is no real difference between particles and antiparticles in particle physics theories, we are left with a question. Most physicists believe that at the time of the Big Bang antiparticles and particles were created in almost equal numbers, so why is antimatter so rare today?
It seems (for now, anyhow, and subject to change) that the key word is 'almost.' If particles outnumbered antiparticles in the Big Bang by as little as one part in 100 million, then the present universe could be explained by those extra particles that were not annihilated by an antiparticle counterpart. That, at least, is where the current theory stands.
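A back-of-the-envelope calculation shows just how small that imbalance is. This sketch assumes nothing beyond the one-part-in-100-million figure quoted above:

    # If particles outnumbered antiparticles by 1 part in 100 million at the
    # Big Bang, what fraction of the original particles survives once every
    # antiparticle has annihilated with a partner?
    asymmetry = 1e-8
    particles = 1.0 + asymmetry
    antiparticles = 1.0

    surviving_fraction = (particles - antiparticles) / particles
    print("%.1e" % surviving_fraction)   # about 1e-08 -- everything we see today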
Other physicists argue that even if the amounts of antimatter and matter created in the Big Bang were the same, the physics of the two types are nearly, but not quite, the same. If this is so, then the difference would explain why there was some matter left over after all original antimatter had been annihilated.
Of course, all of this theorizing would go out the window if there were significant amounts of antimatter left somewhere else in the universe. But how could we tell? If there is still any matter-antimatter annihilation going on in the universe, we could tell very easily indeed, since the reaction would produce telltale radiation.
So far, none of the orbiting gamma ray observatories have uncovered evidence for vast amounts of antimatter in the universe, but NASA's Compton Gamma Ray Observatory (see \JFarewell to Compton\j, this month) has detected photons with an energy of 511 keV, the energy created when a positron and an electron collide and annihilate, coming from the center of the Milky Way galaxy, where a jet of mutually annihilating electrons and positrons appears to exist.
So large supplies of antimatter, enough to power a starship, may still be a long way off, but that does not stop us using very small amounts of antimatter every day in medical diagnosis. Positron Emission Tomography (PET) scans are used routinely in medical imaging, and positrons have other uses as well. We get these positrons from the decay of radioactive isotopes, and the world's largest particle accelerators, all together, produce something like 1 to 10 nanograms of antimatter each year.
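It is worth doing the arithmetic on that annual output. Taking the upper estimate of 10 nanograms, and remembering that an equal mass of ordinary matter is consumed in the annihilation, E = 2mc\U2\u gives:

    # Annihilation energy of one year's worth of accelerator-made antimatter.
    c = 2.998e8                      # speed of light, in meters per second
    antimatter_kg = 10e-9 / 1000.0   # 10 nanograms, expressed in kilograms

    energy_joules = 2 * antimatter_kg * c**2
    print("%.1e J" % energy_joules)   # about 1.8e+06 J

That is roughly two megajoules, about the chemical energy in a few tablespoons of petrol.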
It is likely to be quite a while before any chief engineer will need to lose much sleep over the state of a starship's containment fields.
#
"Farewell to Compton",1545,0,0,0
(Jun '00)
One of astronomy's most useful tools of the last decade ended its career in early June when NASA commanded the Compton Gamma Ray Observatory to re-enter the atmosphere, where it broke up and plunged into the Pacific Ocean.
For nearly nine years, Compton's Burst and Transient Source Experiment, otherwise known as BATSE, maintained a watch on the universe's mysterious gamma ray bursts. The gamma rays, which come from all sorts of sources -- black holes, pulsars, quasars, neutron stars and other exotic objects -- offer clues to the way in which stars, galaxies and the whole universe form, develop, and eventually die.
In its time, BATSE answered some of the most perplexing questions in astronomy, and even answered a few questions that astronomers did not know they should ask. It has also provided new questions for the future.
In fact, the gamma ray experiment was just one of four major science instruments aboard Compton. It was made up of detectors at each of the eight corners of the rectangular satellite, which scanned the entire universe for bursts of gamma rays lasting somewhere from fractions of a second to minutes. Once it found a burst, BATSE set the other three instruments to give the source a closer look.
Gamma rays are too powerful to be brought into focus by a conventional telescope mirror, so the heart of a BATSE module was a large, flat, transparent crystal that generated a tiny flash of light when struck by a gamma ray.
Amplified and recorded, the flashes were transmitted back to Earth, while the other instruments, with limited fields of view, closed in on the source identified by BATSE. In its nine years, BATSE found more than 30 new exotic astrophysical objects and other phenomena stretching from Earth's own atmosphere to the edge of the universe. In 1996, it was one of the most-cited experiments in scientific papers. BATSE research won prizes for ten astronomers, and it provided the material for 18 Ph.D. theses. It also pioneered the use of the Internet in astronomy: in January 1999, the instrument used the Internet to cue a computer-controlled telescope at Los Alamos National Laboratory in New Mexico within 20 seconds of registering a burst.
BATSE was also used to point the Hubble Space Telescope, the Chandra X-ray Observatory and major ground-based observatories toward new and exciting events. In the end, even though the science instruments were functioning normally, one of the three on-board gyroscopes used to steer the orbiting observatory malfunctioned, and NASA decided in March to return Compton to Earth's atmosphere for safety reasons. With insufficient fuel to boost itself to a higher orbit, the craft eventually would have fallen from orbit, slowed by Earth's atmosphere, and it was safer to bring it down under control.
The problem was that Compton was too large, at 15,422 kg (17 tons), to burn up entirely in the atmosphere during re-entry, so it could have exposed populated areas to the risk of falling debris. So its instruments were turned off, and the satellite came down uneventfully in a large and empty piece of the Pacific Ocean, after aircraft and ships were warned away from the anticipated path.
The danger area was a rectangular block 26 kilometers (16.1 miles) wide and 1,552 kilometers (963.7 miles) long. The area was about 1,770 kilometers (1,100 miles) from any land.
The satellite was expected to come down in "hot and fast chunks" ranging from paper-thin debris to 7 to 10 kg (15- to 20-pound) pieces. These were expected to fall at several hundred kilometers per hour, having slowed from a re-entry speed of 9.7 kilometers (6 miles) per second.
(Significantly, even after the unit-conversion problems that destroyed the Mars Climate Orbiter, NASA continues to give 'British' units in preference to metric units in its announcements.)
As it turned out, there were no witnesses close enough to assess the accuracy of these predictions and estimates, and the observatory came down without incident.
It will be replaced in about 2005, when the Gamma Ray Large Area Space Telescope, or GLAST, is due for launch, carrying aloft an instrument called the Gamma-Ray Burst Monitor, hopefully to pick up where BATSE left off. But while the Compton Observatory had to come down, it has left us a rich legacy of discoveries, according to NASA, who published a list of BATSE's achievements on the Internet. Among other things, it:
* settled a long-running scientific debate by determining that bursts originate from the farthest reaches of the universe, not from inside our home galaxy, the Milky Way;
* expanded the list of known gamma ray burst sources from a few hundred before Compton's launch to more than 2,600;
* created an unprecedented online database of gamma ray bursts that is available to scientists worldwide, now with more than 200 users;
* was instrumental in obtaining the first simultaneous observation of a gamma ray burst source in both optical and gamma ray regions;
* created the BATSE Coordinates Distribution Network, an Internet-based system for rapid notification of burst locations to observatories and astronomers around the world;
* discovered mysterious gamma ray flashes above thunderstorm regions, lasting only for thousandths of a second, similar to the lightning commonly visible below them;
* discovered several of the brightest X-ray sources in the sky, thought to be the result of matter spilling from a normal star into a black hole;
* discovered a new class of bursting X-ray pulsars, rotating bodies that emit enormous energy, that contradicted prevailing theory by emitting both regular pulses and occasional high-powered bursts of X-ray radiation;
* discovered several objects, believed to be black holes, that produce numerous jet-like radio wavelength emissions exploding at nearly the speed of light from the central core of their source;
* discovered new Soft Gamma Ray Repeaters -- thought to be neutron stars that undergo occasional blasts, emitting most of their energy in lower frequency gamma rays and fading in the snap of one's fingers -- and helped pinpoint the location of others;
* coordinated observations with other satellites of several objects like black holes, active galaxies and others in several regions of the radiation spectrum simultaneously to improve knowledge of their behavior; and, last of all,
* confirmed the existence of magnetars, super-magnetized neutron stars with magnetic fields a thousand trillion times stronger than Earth's magnetic field -- so strong they could erase credit cards and pull pens out of a pocket at least halfway from Earth to the Moon.
And that, according to NASA scientists, is a record that GLAST may find hard to beat -- but they hope that it does.
#
"Water on Mars",1546,0,0,0
(Jun '00)
NASA announced in late June that pictures from the Mars Global Surveyor spacecraft revealed puzzling signs of water seeping into what appear to be young, freshly-cut gullies and gaps in the Martian surface. And unlike the 'canals' of Mars that were described in the 19th century, there is quite a lot of evidence to support the belief that there is water somewhere on the planet, some of it due to be published in mid-July, but described in \JHow much water on Mars?\j this month.
These are not isolated oddities, either. Structures appear in more than 250 pictures from 120 locations on Mars and, oddly, in the coldest places on the planet. This is perplexing, because logic says that the sub-zero temperatures and thin atmosphere of Mars should have kept those wet areas from forming. Most of the sites (but not all) are in the Martian southern hemisphere and usually appear on slopes that get the least amount of sunlight during each day, at latitudes of between 30° and 70°. On Earth, they would lie between the edges of Antarctica and a circle defined by Sydney, Easter Island, Santiago in Chile, and Cape Town. In the north, the area would be between the northern margins of Iceland, Europe, Russia and the far north of Canada's land area, down to New Orleans, Houston, Cairo, Delhi, and Shanghai.
NASA scientists say they are quite surprised and confused to find present-day evidence of water, "because it doesn't really fit our models of what Mars is like." What they think they can see is evidence of flash floods, where repeated flows of mud and debris have poured down the slopes, but until somebody actually digs into the formations with a shovel, it will be hard to be certain.
For now, NASA believes that there must be water lying some 100 to 400 meters (about 300 to 1300 feet) below the surface, limited to specific regions of Mars. They estimate that each flow down a gully may have involved some 2,500 cubic meters of water, enough to fill seven community-sized swimming pools or supply 100 households for a month. There are no lakes or flowing rivers on Mars, and they have not found hot springs that might support life forms such as the extremophiles found on Earth, but they now believe that there may be water on Mars, near enough to the surface to support life.
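NASA's household comparison is easy to check. Assuming a 30-day month, the arithmetic runs:

    # 2,500 cubic meters of water shared among 100 households for a month.
    flow_volume_m3 = 2500
    households = 100
    days = 30

    litres_per_household_per_day = flow_volume_m3 * 1000.0 / (households * days)
    print("%.0f L per household per day" % litres_per_household_per_day)   # ~833 L

At a little over 800 liters per household per day, the comparison is broadly consistent with typical suburban water use.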
NASA had known about the images, taken in January, for about a month. The most striking features found in these locations were "entrenched, steep-walled, V-shaped channels" that appear to be cut by water action within the gullies, and the original plan was to publish first in \IScience\i on June 30. However, the rumors began to flow, and so the story was broken early to counter what NASA called incorrect accounts of the discovery.
"We see features that look like gullies formed by flowing water and the deposits of soil and rocks transported by these flows. The features appear to be so young that they might be forming today. We think we are seeing evidence of a ground water supply, similar to an aquifer," said Michael Malin. He is the principal investigator for the Mars Orbiter Camera on the Mars Global Surveyor spacecraft at Malin Space Science Systems (MSSS), San Diego, CA. "These are new landforms that have never been seen before on Mars." Malin and Ken Edgett have coauthored the \IScience\i paper, and appeared together at the press conference.
"Twenty-eight years ago the Mariner 9 spacecraft found evidence -- in the form of channels and valleys -- that billions of years ago the planet had water flowing across its surface," Edgett said. "Ever since that time, Mars science has focused on the question, 'Where did the water go?' The new pictures from Global Surveyor tell us part of the answer -- some of that water went underground, and quite possibly it's still there."
The gullies observed in the images are on cliffs -- usually in crater or valley walls -- and are made up of a deep channel with a collapsed region at its upper end (called an alcove) and an area of accumulated debris (an apron) that appears to have been transported down the slope at the other end. Relative to the rest of the surface, the gullies appear to be extremely young, meaning they may have formed in the recent past. "They could be a few million years old, but we cannot rule out that some of them are so recent as to have formed yesterday," Malin said.
The atmospheric pressure on the surface is about 1% of the sea-level pressure on Earth, and that means any surface water should immediately boil furiously, even explosively, even though Mars is extremely cold. That, argues Malin, is the reason why the process must involve repeated outbursts of water and debris, similar to flash floods on Earth.
"We've come up with a model to explain these features and why the water would flow down the gullies instead of just boiling off the surface. When water evaporates it cools the ground -- that would cause the water behind the initial seepage site to freeze. This would result in pressure building up behind an 'ice dam.' Ultimately, the dam would break and send a flood down the gully," said Edgett.
The process that starts the water flowing remains a mystery, but the team believes it is not the result of volcanic heating. "I think one of the most interesting and significant aspects of this discovery is what it could mean if human explorers ever go to Mars," said Malin. "If water is available in substantial volumes in areas other than the poles, it would make it easier for human crews to access and use it -- for drinking, to create breathable air, and to extract oxygen and hydrogen for rocket fuel or to be stored for use in portable energy sources." For more on the importance of the water, see \JBringing Mars into the Iron Age\j, March 1999.
#
"How much water on Mars?",1547,0,0,0
(Jun '00)
The crust of Mars may hold two to three times more water than scientists had previously believed, based on a study by Dr. Laurie A. Leshin of Arizona State University. She compared the amount of deuterium, an isotope of hydrogen, found in a meteorite of martian origin with the amount found in the martian atmosphere. Her report was scheduled to be published in \IGeophysical Research Letters\i in mid-July, although the word on her findings was 'out' before the NASA announcement in \JWater on Mars\j, this month.
Deuterium has twice the mass of normal hydrogen, and it combines with oxygen to make 'heavy water.' In today's thin Martian atmosphere, water has a deuterium-to-hydrogen ratio five times higher than the ratio found in water on Earth. While deuterium and 'normal' hydrogen are chemically very similar, the lighter hydrogen is able to escape Mars' weaker gravity more easily, leaving the atmosphere enriched in deuterium - or so we used to think.
Leshin thinks otherwise. From the deuterium level in a meteorite known as QUE94201, found in Antarctica in 1994 and believed to have been blasted off Mars three million years ago, she has a measure to compare with the known atmospheric figures. She analyzed tiny water-bearing crystals in the meteorite on the ion microprobe instrument at the University of California at Los Angeles.
These crystals contain hydrogen from the martian interior, which was not affected by atmospheric escape, and they do indeed reveal a smaller percentage of deuterium than current martian atmospheric measurements. Then comes the bite: instead of this ancient water demonstrating the same deuterium-to-hydrogen ratio as Earth water, as had been assumed, Leshin's research shows that Mars had a deuterium-to-hydrogen ratio nearly double that of Earth before any atmospheric escape could have occurred.
Leshin believes that this could have resulted from loss of hydrogen very early in martian history as a result of extreme ultraviolet radiation from the young Sun, a mechanism different from the current escape process. Alternatively, she writes, it could imply that comets, which share the same deuterium-to-hydrogen ratio as martian interior water, supplied most of the water found on Mars today.
So, while there was nothing wrong with the calculations, they were based on an assumption that is now seen to be wrong, and Leshin concludes that the martian atmosphere needed to lose only one-third to one-half as much water through the eons to arrive at the isotope's current atmospheric level. That water should still exist today on Mars, she says, located within the planet's crust, and there should be two to three times as much as was believed at the start of 2000. In fact, evidence from this and previous research on martian meteorites supports the idea that a significant martian groundwater reservoir currently exists. How big the reservoir is will remain to be discovered on the ground at some stage in the future.
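The heart of Leshin's argument can be put into rough numbers. This sketch uses the approximate ratios quoted above (an atmosphere five times Earth's deuterium-to-hydrogen ratio, meteorite water nearly double Earth's) and ignores the details of the fractionation model, so it shows only the direction and rough size of the revision:

    # D/H ratios expressed relative to Earth's ocean water.
    d_h_atmosphere = 5.0   # today's martian atmosphere
    d_h_start_old = 1.0    # old assumption: Mars began with Earth-like water
    d_h_start_new = 1.9    # Leshin's meteorite result: nearly double Earth's

    # The enrichment that atmospheric escape must account for:
    print(d_h_atmosphere / d_h_start_old)             # 5.0 under the old assumption
    print(round(d_h_atmosphere / d_h_start_new, 1))   # about 2.6 under the new one

    # Halving the required enrichment means far less hydrogen ever escaped,
    # so two to three times more of the original water should remain in the crust.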
#
"Ancient salts",1548,0,0,0
(Jun '00)
The 'Zag' meteorite is a recent acquisition of science, having fallen in Morocco in 1998, and an interesting one because it includes brine-pocketed salt crystals. But while the meteorite is new to science, it appears that the salt crystals may be among the oldest materials found in the solar system. That, at least, was the claim put forward in a paper in \IScience\i during June. One of the implications of the claim, if it is correct, is that hospitable conditions for life might have arisen earlier than previously thought.
The dating is based on iodine-129/xenon-129 ratios in the salt crystals which indicate that the salt crystals probably formed within about two million years of the solar system's birth. This suggests that the dust, gas, and ice swirling around the newborn sun clumped together into rocky fragments far more quickly than researchers have assumed.
These fragments were then the parent bodies for primitive meteorites like Zag and the essential building blocks for asteroids and planets. The scenario they offer sees Zag's parent body accreting rapidly into a rocky mass containing water and radioactive isotopes. Then the isotopes' decay generated enough heat to melt any ice within the rock matrix, and soon caused the liquid to evaporate altogether.
At this point, the salt crystals, mainly sodium chloride, or 'halite' to geologists, precipitated, forming a rare form of meteorite. Only one other meteorite, the Monahans meteorite, has been found with halite crystals, and the crystals in that meteorite also contained microscopic inclusions of water.
Water, of course, is a key ingredient in all life forms, so the finds have raised hopes of learning more about the possibility that life might have evolved elsewhere in the solar system. However, to draw any conclusions, scientists need to know how old these deposits are. Previous attempts could only place the time of liquid water within 100 million years after the solar system's formation. The Monahans halite had been dated using rubidium dating which, while it also pointed to an ancient origin, is less precise than the iodine/xenon method.
The big problem was making sure that the results were not contaminated by terrestrial sources after the meteorite landed. The researchers analyzed xenon, iodine, and argon isotopes extracted from a tiny sliver of a Zag halite crystal. They found a surprisingly large amount of xenon-129, which forms when iodine-129 decays. This was significant because iodine-129 was present in the early solar system but is not found on Earth. Only about 26.4% of terrestrial xenon is xenon-129, so when they found a much stronger peak for this isotope, the researchers knew they were dealing with non-terrestrial material.
Iodine-129 may not occur on Earth, but its half-life is known: it is 1.7 x 10\U7\u years, and with this knowledge, the group could then calculate the age of the halite. They estimate that the crystals formed about two million years after the birth of the solar system 4.57 billion years ago. In other words, the liquid water departed from its parent body soon after it had appeared.
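The age calculation itself is straightforward radioactive-decay arithmetic once the iodine-129/iodine-127 ratios are in hand. In the sketch below, the initial solar-system ratio and the ratio inferred for the halite are representative values chosen to reproduce a roughly two-million-year interval; they are not the paper's measurements:

    import math

    half_life_years = 1.7e7   # iodine-129, as quoted above
    ratio_initial = 1.00e-4   # assumed initial 129I/127I of the solar system
    ratio_halite = 0.92e-4    # hypothetical 129I/127I when the halite closed

    # Time for the initial ratio to decay to the halite's ratio:
    interval_years = half_life_years * math.log2(ratio_initial / ratio_halite)
    print("%.1f million years" % (interval_years / 1e6))   # about 2.0 million years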
Up until now, the oldest materials in the solar system were thought to be chondrules, glassy spheres that make up much of the mass in primitive meteorites. These are believed to have formed fairly early as well, about 5 million years after the solar system's birth. Now we have a picture of a batch of halite crystals growing on an asteroid that had just formed from the collision of smaller particles. Then, some 300 million years later, another large impact smashed the loose fragments together into a more solid conglomerate, a piece of which became the Zag meteorite.
The liquid water has a greater significance, since volcanic activity is closely tied with the availability of water, which plays a major role in the formation of magma. This has important implications for understanding the geology of moons and planets with large amounts of heat in their interiors, say the researchers.
Key names: James Whitby, Ray Burgess, Grenville Turner, Jamie Gilmour and John Bridges.
#
"NIAID lets AIDS research contract to Australia",1549,0,0,0
(Jun '00)
An Australian consortium led by the University of New South Wales (UNSW) was awarded a US$16 million development contract from the US government in late June to develop and clinically test a potential preventative HIV vaccine. A formal announcement on June 28 indicated that contracts had been signed between UNSW on behalf of the consortium and the US National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health.
This is one of four NIAID contracts to engage in novel public-private partnerships to accelerate development of promising HIV/AIDS vaccines for use around the world. The program responds directly to President Clinton's call to increase public-private cooperation in developing vaccines against globally important diseases such as AIDS, tuberculosis, and malaria.
The work will rely on a 'prime and boost' vaccine technique which, together with a patented technology known as Co-X-Gene™, will encourage cellular immunity by making white blood cells that fight HIV infection. The 'double whammy' method seems to induce unprecedented levels of immunity.
At the moment, it requires two DNA vaccines and two fowlpox virus boosts, using genetically modified fowlpox, but the aim is to improve it and narrow it down to fewer vaccinations. If tests show that the 'prime and boost' technique is both safe and effective in the laboratory, the vaccine will be evaluated in two clinical trials: the first in Australia, the second in a country of the developing world.
The vaccine strategy starts with a priming with a DNA vaccine made from HIV genes. This is then boosted with a genetically engineered fowlpox virus that contains parts of the HIV genome, together with DNA for a human cytokine (a protein made naturally by white blood cells to stimulate the immune system). This second vaccine is where the Virax Co-X-Gene technology comes in: these vaccines generally combine a protein from the disease agent (antigen) with a cytokine.
The prototype boost vaccine will be manufactured in Melbourne by the Institute of Drug Technology Australia Limited on behalf of the Melbourne-based biotech company, Virax Holdings Limited, the licensee of Co-X-Gene™, who have licensed the technology from the Australian government research body, the CSIRO, and the Australian National University.
Key names: David Cooper, David Boyle, Stephen Kent, Sean Emery, Susan Kippax, Rosemary Ffrench, Ian Ramshaw and Ian Ramsay. Key Web sites: http://www.med.unsw.edu.au/nchecr for specific information, http://www.iavi.org/2/ for general background on AIDS/HIV vaccine research.
#
"Male circumcision and HIV",1550,0,0,0
(Jun '00)
An Australian study, reported in the \IBritish Medical Journal\i during May, suggested that uncircumcised men are more likely to be infected with HIV than circumcised men. Roger Short and his colleagues at the University of Melbourne looked at information from over 40 previous studies to reach this conclusion.
They suggest that the virus targets specific cells found on the inner surface of the foreskin. These cells possess HIV receptors, making this area particularly susceptible to infection, and the researchers believe that male circumcision provides significant protection against HIV infection by removing most of the receptors.
The best evidence for this theory comes from a 30-month study of couples in Uganda in which each woman was HIV positive and her male partner was not. During the survey time, no new infections occurred among 50 circumcised men, whereas 40 of 137 uncircumcised men became infected. This was in spite of all couples being given advice about preventing infection and an offer of free condoms.
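The strength of that result is clear from the raw proportions. A quick check in Python, using the figures quoted above:

    # Couples followed for 30 months; figures from the study quoted above.
    infected_uncirc, total_uncirc = 40, 137
    infected_circ, total_circ = 0, 50

    print("Uncircumcised: %.1f%% infected" % (100.0 * infected_uncirc / total_uncirc))
    print("Circumcised:   %.1f%% infected" % (100.0 * infected_circ / total_circ))
    # Prints 29.2% against 0.0% -- a striking difference, though the sample is small.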
So, even though there are strong cultural and religious attitudes towards male circumcision in some cases, the researchers recommend that male circumcision should be seriously considered as an additional means of preventing HIV in countries with a high level of infection. If this is unacceptable, they say, an alternative approach -- developing a 'chemical condom' that blocks the receptors in both the penis and the vagina -- could provide a more acceptable form of HIV prevention in the future.
#
"Islet transplantation and Type I diabetes",1551,0,0,0
(Jun '00)
Type I diabetes, the sort that is treated with regular insulin injections, is generally agreed to be caused by the diabetic's immune system attacking those cells in the pancreas which secrete insulin. This attack on the cells in the islets of Langerhans in the pancreas is called an autoimmune reaction, and the result is that the victim is unable to make insulin, a hormone that the body uses to manage the balance of sugar in the blood.
Insulin injections can keep a more-or-less stable insulin level in the blood, but they are unpleasant, disconcerting to others who may find themselves witnessing the self-injection, and seem to offer more of the more-or-less, and less of the stable. As a result, severe diabetics are likely to lapse into a coma or to suffer bodily damage such as blindness as a result of their diabetic condition. One year after diagnosis, though, only 8% of type I diabetics are able to manage without injections.
Now comes the good news that seven patients in the United States, all with severe diabetes, have successfully been given transplants of islet cells, and all seven quickly reached 'sustained insulin independence' after the transplantation. That means their bodies were able to manage their own blood sugar levels without any need for injections.
The patients all required islets from two donor pancreases, and one required a third transplant from two donors to achieve sustained insulin independence, but all seven have been successful, bringing to an end a long run of disappointments with 267 allografts since 1990. These were not all failures, but only 12.4% resulted in insulin independence for periods of more than one week, and just 8.2% have done so for more than a year, so seven consecutive successes is a great triumph.
The secret, say the researchers, lies in their choice of immunosuppressive agents. Most of the failures, they say, used antibody induction with an antilymphocyte globulin combined with cyclosporine, azathioprine, and glucocorticoids. They developed a regimen that was glucocorticoid-free, and this seems to have tipped the balance by suppressing the immune reaction without damaging the crucial beta cells that actually secrete the insulin within the islets of Langerhans in the pancreas.
The new treatment combines "sirolimus, low-dose tacrolimus, and a monoclonal antibody against the interleukin-2 receptor (daclizumab)," according to the report in the \INew England Journal of Medicine\i, which will not be formally published until July. The paper was released early on the \INEJM\i Web site because it is of such great importance.
Another difference with the current procedure is that many of the previous cases were transplantation attempts that accompanied kidney transplants, while the new cases were all just islet transplantation. All of the patients were regarded as having had type I diabetes for at least five years, and all had serious blood sugar problems, even with insulin injections -- in other words, they were extreme cases of uncontrolled diabetes.
The islets were obtained from brain-dead donors under normal donation protocols, and the patients, once a supply was ready, were treated with antibiotics including the powerful agent, vancomycin, to which very few bacteria have any resistance, as well as vitamins. Antiviral agents were administered after the transplantation.
The donor pancreases were removed and stored before being treated with an enzyme which broke up the organ gently, allowing the islets to be separated and purified. Once 'harvested' (the normal medical term for this procedure), the islet cells were transplanted almost immediately to avoid any problems that might have arisen during any prolonged culturing.
The islets were, however, incubated first and tested for their response to both high and low glucose levels before being transplanted into a tissue-matched recipient. According to the paper, "Each islet preparation from a donor was matched to the recipient's blood type and cross-matched for lymphocytotoxic antibodies, but no attempt at HLA matching was made."
The transplantation was achieved using a catheter guided by fluoroscopy, and ultrasound imaging was used to check the effects of the operation afterwards, along with liver function tests.
In summary, the seven patients, aged from 29 to 54 years (median 44), who had suffered type 1 diabetes mellitus for a median of 35 years (range 18 to 50 years), underwent islet transplantation between March 1999 and January 2000. In June 2000, the median time since transplantation was 11.9 months and, in all seven patients, exogenous insulin therapy (insulin injections, to the lay reader) quickly became unnecessary once sufficient numbers of islets were transplanted. The patient with the lowest number of islets has required insulin on four occasions during other illnesses, and another patient required insulin on one occasion during an illness.
The researchers report no cases of acute cellular rejection, and perhaps more importantly, there seems to have been no recurrence of the autoimmune reaction which attacked the patients' own islets in the first place. This, they suggest, is a side-effect of the immune suppression treatment that they used to manage the graft itself.
These were all extreme cases, and all have had the quality of their lives greatly improved. There have been no other problems, although the widespread use of this method will be limited by the supply of donors, with two or more donors required to meet the needs of one recipient. They indicate in passing that less than one third of the available pancreases in the United States are actually transplanted, but say that it is clear that higher islet levels are required than can be obtained from a single donor. (And the good news is that there is hope in sight - see \JGrowing insulin-secreting cells\j, this month.)
While the patients no longer fell within the American Diabetes Association criteria for a diagnosis of diabetes, their glucose control was still not entirely normal. This is not a miracle cure, but it is an amazingly effective improvement in their conditions. The researchers note that, "In patients with type 1 diabetes, glycemic control can also be achieved with intensive insulin therapy and pancreatic transplantation." They also note that this transplantation is an invasive procedure with a substantial risk of death to the patient.
Key names: A.M. James Shapiro, Jonathan R. T. Lakey, Edmond A. Ryan, Gregory S. Korbutt, Ellen Toth, Garth L. Warnock, Norman M. Kneteman, Ray V. Rajotte.
#
"Growing insulin-secreting cells",1552,0,0,0
(Jun '00)
The American Diabetes Association's 60th Annual Scientific Sessions were told in mid-June about a hopeful new approach to treating diabetes. Scientists have successfully cultured human beta cells that grow indefinitely, and that could potentially serve as an unlimited source of insulin-producing tissue for transplantation to cure people with diabetes.
Beta cells are found exclusively in the pancreas, and they secrete insulin in response to glucose stimulation. When these cells are defective or when the body fails to utilize insulin properly, the result is diabetes, characterized by high blood sugar levels. While transplantation of either the whole pancreas or of the islet cells that contain beta cells is possible, it is limited by the scarcity of donor tissue from cadavers (see \JIslet transplantation and Type I diabetes\j, this month).
"Even if you had unlimited success with tissue transplantation, there is simply not enough donor tissue to treat the millions of people who have diabetes," according to Fred Levine, associate professor at the UCSD Cancer Center and the Whittier Institute in San Diego, whose laboratory reported the successful results. "We have now been able to create an immortal human cell line, and have demonstrated in mice that these cells are functional when transplanted, secreting insulin in response to glucose stimulation."
Levine and his colleagues have been working on this project for eight years, trying to create a system where beta cells can endlessly reproduce, while counteracting the malignant behavior in order to prevent tumor formation. So far, the system has only worked in mice, but human studies may be possible within a few years.
Other key names: Dominique Dufayet and Tonya Halverson.
#
"Cocaine use and ADHD in children",1553,0,0,0
(Jun '00)
A possible explanation for attention deficit hyperactivity disorder (ADHD) has been flagged in a finding to be published in \IBehavioral Neuroscience\i in August, but released on the Internet in June. The study reveals a connection between cocaine use during pregnancy and attention dysfunction in children. The study found that rat fetuses exposed to cocaine levels comparable to daily recreational use in humans show lasting dysfunction, especially in the area of attention -- and while the work was done on rats, it appears to be applicable to humans.
According to Barbara Strupp, a coauthor of the study, "Although prenatal cocaine exposure does not seem to affect most areas of cognitive function, the deficits in attention are consistent and lasting -- seen in adult animals long past the period of exposure. In humans, this type of dysfunction could significantly impact the lives of affected children, as seen in cases of ADHD."
She added that, while there were some differences between the behavior of the affected animals and normal human ADHD, in both cases attentional dysfunction is prominent while many other cognitive processes are unaffected. Crack cocaine is being used increasingly by poor and young pregnant women, and a 1996 survey estimated that 1.1% of pregnant women in the United States, some 45,000 women, smoke crack cocaine each year. A study in 1998 suggested that somewhere between a sixth and a third of all children needing emergency medical care in some inner-city areas might have been exposed to cocaine before they were born.
The main problem is that it is very difficult for researchers to isolate the effects due to a range of confounding factors such as prenatal undernutrition, maternal stress, prenatal and postnatal medical care and accompanying exposure to alcohol, nicotine, marijuana, opiates and amphetamines.
Half of the pregnant rats in the study received cocaine by catheter at the levels described above, while the other half were given a harmless saline solution. After birth, the offspring were given a series of cognitive tasks. Basic learning ability, short- and long-term memory (both implicit and explicit), impulse control and cognitive and behavioral flexibility were all unaffected by the cocaine treatment.
This was not the case for selective attention -- the ability to stay focused despite environmental distractions. Instead, the rats' attention tended to be 'captured' by the strongest cues in the environment. Because of this, when the most noticeable cues were the ones needed to perform a task, the performance of the cocaine-exposed animals did not differ from that of controls.
On the other hand, when the most salient cues were irrelevant or distracters, the cocaine-exposed animals were significantly impaired in performing the tasks. This type of deficit could have serious effects on the school performance of affected children.
Key names: Barbara Strupp, Hugh Garavan, Russell Morgan, David Levitsky, Charles Mactutus and Rosemarie Booze.
#
"New measles DNA vaccine",1554,0,0,0
(Jun '00)
A new DNA vaccine has been established as a successful defence against measles, according to the July 2000 issue of \INature Medicine\i, released in late June. Measles remains a major cause of worldwide mortality, in part because young infants cannot be immunized effectively, but the researchers believe they are now on the way to developing a new measles vaccine that can be used for immunizing infants in developing countries.
Results of the study show that the DNA-encoded vaccine, which used either of the surface proteins known as hemagglutinin (H) or fusion (F), provided protection against measles, and there were no adverse effects reported. Vaccines made from inactivated viruses in the past have always carried the threat of causing a severe strain of atypical measles, but there is no evidence of this problem with the DNA vaccine.
Key name: Diane Griffin.
#
"Asthma and garbage",1555,0,0,0
(Jun '00)
Several interesting asthma stories, due for release in the first few days of July, were notified to science writers in late June. Appearing in the \IEuropean Respiratory Journal\i, the stories come from a number of countries.
A French and Danish study showed that garbage collection may give rise to harmful consequences for the airways among some of the workers involved. Torben Sigsgaard and Suzanne Mamas and their colleagues exposed five men and five women, all garbage workers, to the same substances as they came into contact with daily. The scientists filled the nostrils of the volunteers with various liquid solutions which contained either one of the toxic compounds commonly found in garbage or a harmless placebo. In the course of this double-blind study, neither the volunteers nor the scientists knew which solutions were which until after the work was complete.
The volunteers were split into two groups: one with five garbage workers who had stopped working at least five years earlier because of occupational asthma-like (OAL) symptoms, such as wheeze, chest tightness and dyspnea. The control group of five persons included garbage workers who were still on the job and had never exhibited OAL symptoms.
The first discovery made by the scientists in their hunt for the garbage trouble-makers was that endotoxins (LPS, or lipopolysaccharides) of bacterial origin play a major part in setting off inflammation of the mucous membrane of the nose. These endotoxins are known to cause severe disorders in the lungs, but this is the first time scientists have clearly identified their effect on the nasal mucous membrane.
After administering LPS to the volunteers, Sigsgaard and his colleagues found that the OAL patients and the control group differed considerably in the concentration of neutrophils, a variety of white blood cell often referred to as "scavenger cells." The scientists counted up to six times more neutrophils in the control group than in the OAL group, a finding that has to be interpreted as a sign of a stronger immune response in persons who have never shown OAL symptoms. The authors note a few complications, such as smoking, but there would seem to be a clear indication here of areas needing further enquiry.
#
"Asthma and insecticide sprays",1556,0,0,0
(Jun '00)
An Australian study in the \IEuropean Respiratory Journal\i reports that fly sprays, used enthusiastically in Australia, can affect asthmatic patients in several ways: reduced breathing capacity, coughing or eye irritation. Cheryl Salome and her colleagues of the Institute of Respiratory Medicine of the University of Sydney looked at 25 patients in a carefully designed double-blind study which revealed that the sprays can reduce respiratory capacity by an average of 9% compared with an environment without insecticides.
Worse, in just over a quarter of the people tested, it fell by more than 15%, an amount which would be considered clinically important, according to Salome. The study focused on a sensitive category of people, i.e., asthmatics who had already told a doctor that they experienced cough or irritations of the eyes or nose when using these sprays. Salome and her colleagues compared the effects of standard shop insecticides with those produced by a recently marketed spray designed to cause little irritation and with those of a "neutral" control aerosol giving off only water droplets.
Patients were given a 'typical' 4-second jet, then 15 minutes later, received a second longer jet, lasting 32 seconds, which was supposed to simulate over-zealous use of the product. Salome comments: "It is important to stress that the effects were short lived, and breathing capacity returned to baseline levels within 30-40 minutes".
In an interview, Salome said that they were a little surprised by the result, as they had expected to show that even among self-selected patients, there were no physical asthma symptoms triggered by the sprays.
#
"Asthma and thunderstorms",1557,0,0,0
(Jun '00)
In the southern spring of 1997, 215 patients attended the emergency department of a hospital in Wagga Wagga, in rural New South Wales, all with asthmatic conditions, in the few hours after a thunderstorm in the inland city. Of these patients, 41 required hospital admission. Similar events have occurred in London and Birmingham in Britain and also in Melbourne and Tamworth in Australia.
An Australian study reported in the \IEuropean Respiratory Journal\i describes an attempt to find out who is at risk of thunderstorm-induced attacks and whether any particular behaviors or actions protected susceptible individuals from experiencing severe attacks. The study compared the patients with asthma who attended hospital at the time of the thunderstorm and patients with asthma who attended at other times.
It appears that people with hayfever and/or allergy to ryegrass pollen were most at risk. Nearly all the people with severe thunderstorm-induced attacks had these sensitivities, and some of those who attended the hospital had not previously been diagnosed with asthma. Those who presented themselves at the hospital were less likely than the average patient to be taking inhaled corticosteroids ('preventers' or 'puffers').
#
"A breast cancer metastasis suppressor gene",1558,0,0,0
(Jun '00)
A group of American researchers, led by Danny R. Welch, reported in the journal \ICancer Research\i in early June that they had identified a gene which blocks the ability of breast cancers to spread and colonize secondary sites. The gene has been named BRMS1, and it is expected to provide a target to develop therapies that keep cancer localized. It may also prove useful in making a proper diagnosis, they say.
The same research group had previously shown that the introduction of a normal human chromosome 11 reduced the ability of a human breast cancer cell line to spread or metastasize by 70 to 90 percent, suggesting the presence of one or more suppressor genes on that chromosome. Other laboratories have estimated that between half and two thirds of patients with late stage breast cancer lose copies of chromosome 11.
The group compared cells containing an added chromosome 11, in which metastasis was suppressed, with metastatic breast cancer cells and found that BRMS1 was more highly expressed (i.e., the gene is turned "on") in the nonmetastatic cells. When they cloned this single gene, it was able to suppress metastasis in mice, stopping the cancer from spreading to the lungs and lymph nodes without affecting the cancer cells' ability to form a tumor in the breast.
Most of the cells in a tumor cannot complete the multistep process of metastasis, according to Welch, simply because there are multiple steps in the chain, and if the cell fails any one of them, it has failed the whole process.
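Welch's point can be made concrete with a toy calculation (the number of steps and the per-step success rate below are illustrative assumptions of ours, not figures from the study):

    # Illustrative only: if metastasis requires completing every one of
    # several independent steps, the overall chance is the product of the
    # per-step chances, so most cells fail somewhere along the chain.
    steps = 6        # assumed number of steps in the metastatic cascade
    p_step = 0.1     # assumed chance a cell completes any single step
    print(f"Chance of completing all {steps} steps: {p_step ** steps:.0e}")
    # -> 1e-06, about one cell in a million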
#
"Novel prostate cancer vaccine",1559,0,0,0
(Jun '00)
Scientists in the United States have announced that they are investigating a new vaccine for treating prostate cancer that has spread beyond the prostate gland. While the work has not yet been formally published, they have described an interesting approach to the problem of metastasis of prostate cancer. The trial is sponsored by the Eastern Cooperative Oncology Group, a national cancer clinical trial cooperative group funded by the (US) National Cancer Institute.
The aim is to kill off malignant cells while leaving healthy cells alone. While most of a prostate cancer may be dealt with by surgery or carefully directed radiation, or even by chemotherapy, these treatments can only mop up the last few cells with difficulty, if at all, and be a danger to the patient. A vaccine, on the other hand, should be harmless to the body as a whole and just target the remnant cancerous cells.
A trial is under way with men who have already undergone surgery or radiation therapy for prostate cancer, and who still have measurable levels of prostate specific antigen, usually called PSA. The PSA is normally produced by the prostate, but patients enrolling in this trial have had their prostates removed or treated with radiation, so their PSA levels should be very low or undetectable. Any remaining measurable PSA suggests that their cancer has spread.
The idea is to use the PSA test to identify a recurrence as early as possible in order to eliminate the cancer before it is re-established. But these are early days, and the final form of the vaccine is still to be decided. One version uses the vaccinia virus, a relatively harmless virus used in smallpox vaccines, while the second uses a fowlpox virus as the carrier. Each virus is no more than a minor problem for humans, but each has been genetically altered to express human PSA. The PSA-bearing virus is attacked by the human immune system and, hopefully, the body will just as readily attack the PSA tumor cells.
#
"The human genome working draft",1560,0,0,0
(Jun '00)
It drew great ballyhoo in the press and on television, it interested the general public and the more excitable opponents of genetics, but it hardly made a ripple in the scientific community, who knew it was coming. It was the declaration that the human genome had been mapped. In fact, most media reports implied that the base sequence of the human genome had been worked out, but all that was declared complete was the map set that will allow the sections of DNA to be strung together to make longer sequences.
The map takes the form of a working draft, with certain identifiable genes located in a sequence. The next step is to take chromosome fragments that overlap, determine all of their base sequences, and then line them up along the map.
The muted response of the scientific community does not do justice to the achievement, which the US president and the UK prime minister have hailed as "the most wondrous map ever produced by human kind." The enthusiasm of the press, hailing this as "a breakthrough," comparable with the invention of the wheel or the splitting of the atom, misses the point that what was announced is just another step rather than a one-off event. There will never be a real anniversary of the unraveling of the human genome, for the work has been going on for years and will go on for years more, growing by degrees, getting better in small steps, day by day.
To an extent, the June announcement, put out jointly in Britain and America, was an attempt to manufacture a date in history that can be recalled hereafter. This was just the day on which it was possible to say that 85% of the bases had been sequenced and 97% of the chromosomes had been mapped.
It will be another three years before a fully finished genome is available, and right now only 24% of the genome is in the 'gold standard' format. This is why the actual White House press release referred to the achievement as the 'completion of the first survey of the entire human genome', and later as 'an initial sequencing of the human genome'. In fact, there will probably never be a day when we can say that the genome is known 100%, as there are likely to be a few small gaps, a few errors here and there, so a manufactured special day is probably as good as any other.
But manufactured history or not, the simple fact is that the revolution is here, and the knowledge we have already is fast coming into use, offering cures, and the hope of cures, for diseases that have long challenged medical science, and giving us new insights into how living things work. And with the hopes come the fears of the naysayers, who have been making what we can only hope are self-negating prophecies. Having heard them, scientists can now make strenuous efforts to stop these prophecies coming true, but it won't be easy.
The future worlds they depict are slickly described in the emotive language of the negativists as the prophets warn us of 'designer babies' and 'Frankenstein foods' and hold out scenarios of people being denied insurance or employment because they carry (or fail to carry) certain genes.
It is true that there will be a slippery slope. People may wish to free their children of some genetic inheritance that they themselves have had, like a blood disease or shortsightedness, and that might seem acceptable. But what about somebody wanting to save their child from having an overly large nose? Somebody is going to have to decide where to draw the line so that children are not made to suffer indignities, including the indignity of being a designer baby.
In the same way, if we know that somebody has a gene that makes them prone to a heart condition that would make them unable to fly a plane after the age of 35, it might make sense not to spend money training that person as a pilot. After all, we use medical tests now to rule out people unfit to fly airliners, so why not step in a bit earlier? Somehow the line has to be drawn between what is acceptable and what is not.
It is reassuring, therefore, that the US president and the UK prime minister were prepared to be involved in the announcement, but we have to question whether the rest of the world's politicians are ready to play their parts in drawing the lines. In Australia, where scientists are taking a strong line on genomics and GM, it seems that it is mostly politicians who oppose the technology who are speaking out.
There were, in fact, two real stories there for the science community. The first was the news that the five separate maps, from five separate human races, showed no real differences: race is only skin-deep, and not visible when you look at the genes underneath.
The second story was that the two competing groups had come together for the announcement after spending recent months brawling over ownership and access to the data. Dr Francis Collins, leader of the publicly funded Human Genome Project (HGP), and Dr Craig Venter, head of the private company Celera Genomics, were both at the White House for the conference, where Mr Clinton heralded the end of these arguments by announcing a program of cooperation between the two parties.
The rough draft has been completed years ahead of schedule thanks to the introduction of new robotic technology and the competition Celera gave the HGP when it started work in 1998. The speed of operation could hardly have been contemplated, even a decade ago. When scientists identified the gene for cystic fibrosis in 1989, it was at the end of a nine-year search; but by 1997, a gene for Parkinson's disease was mapped in only nine days -- and see also \JThe one-day genome\j, this month. As another example, Celera announced on June 1, 2000, that it had sequenced approximately 1.15 billion base pairs of mouse DNA. Celera began to sequence the mouse genome on April 6, 2000, and expects to have the entire mouse genome completed by the end of 2000.
Celera's own press release, naturally enough, concentrates on the Celera achievements in the race to get the whole human genome. This, they say, contains 3.12 billion base pairs, but Celera has assembled no fewer than 26.4 million sequences, each 550 base pairs long, for a total of 14.5 billion base pairs sequenced, or what they call "4.6X sequence coverage." This simply means that, \Ion average\i, each base in the genome has been sequenced 4.6 times.
Most importantly, they believe that this coverage means that 99% of the human genome is now covered and stored in their supercomputer, which also contains the data from GenBank, the public database, produced primarily by the public genome effort. They report that processing so far has required more than 20,000 CPU (central processor unit) hours on the supercomputer.
This, they say, is believed to be the largest computational biology calculation to date, involving 500 million trillion base-to-base comparisons. The data total is now of the order of 80 terabytes, enough to fill about 1000 normal hard disc drives, or 125,000 CD-ROMs that would stack up into a pillar 180 meters (almost 600 feet) tall -- and that is when they are out of their cases!
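Those figures can be checked with a few lines of arithmetic; in this Python sketch the capacity per CD-ROM is our own round-number assumption:

    # Check Celera's coverage and storage figures.
    reads = 26.4e6              # sequences assembled
    read_len = 550              # base pairs per sequence
    genome = 3.12e9             # base pairs in the human genome (Celera's figure)
    bases = reads * read_len
    print(f"Total sequenced: {bases / 1e9:.2f} billion bp")  # ~14.52 billion
    print(f"Coverage: {bases / genome:.2f}X")                # ~4.65X

    cd_capacity = 650e6         # bytes per CD-ROM (our assumption)
    print(f"CD-ROMs for 80 TB: {80e12 / cd_capacity:,.0f}")  # ~123,000

The small difference from the published 125,000 discs simply reflects the capacity assumed per disc.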
The next stage will be 'annotation,' in which located genes are studied to better understand their function. After this, researchers will be able to compare genetic data from the fruit fly, human and mouse in order to make discoveries that could lead to improved treatments and possible cures for diseases.
#
"The one-day genome",1561,0,0,0
(Jun '00)
The human genome will have taken years to sequence, the mouse genome is expected to take some nine months, but the entire genome of the "superbug" \IEnterococcus faecium\i was unraveled in just a day, say researchers at the Joint Genome Institute (JGI) in Walnut Creek, California. The bacterium is a harmful, antibiotic-resistant germ that is one of the leading causes of hospital-acquired infections.
The rate of infection by bacteria such as \IE. faecium\i and other enterococci surged during the past 20 years. Most alarming has been their escalating resistance to antibiotics, including Vancomycin, usually considered the treatment of last resort.
\IE. faecium\i can spread throughout the body and cause serious infections in the blood, heart, urinary tract, central nervous system, and also in wounds. Only a few new antibiotics have been identified in test tube studies that show promise against it.
The main importance of scientists being able to sequence \IE. faecium\i in a day is that, if a new disease emerged, either through an act of bioterrorism or by natural means, it could now be very rapidly sequenced, allowing scientists to probe the new germ for its weaknesses.
In fact, the claim is slightly exaggerated. The first day saw researchers complete the first phase of genome sequencing (the shotgun sequencing phase), permitting essentially all of the organism's genes to be identified. Future work will complete the assembly of the genome and provide a more complete analysis of its genetic structure. But even if the claim is exaggerated, the work completed is amazing. In one day, 2.8 million base pairs of DNA were sequenced.
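To get a feel for the scale of that single day's work, here is a rough estimate of the number of shotgun reads involved; the read length and the target redundancy are assumptions of ours, not figures from JGI:

    # Rough estimate of shotgun reads for a one-day draft of E. faecium.
    genome = 2.8e6     # base pairs sequenced, from the announcement
    coverage = 8       # assumed redundancy for a draft assembly
    read_len = 550     # assumed base pairs per read
    print(f"Approximate reads: {genome * coverage / read_len:,.0f}")  # ~40,700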
#
"Caterpillars safe from some types of Bt corn",1562,0,0,0
(Jun '00)
Originally, there was the claim that monarch caterpillars were at risk from the pollen produced by a genetically modified corn (see \JEngineered corn and monarch butterflies\j, May 1999), a claim later rejected by a number of more experienced scientists.
Now a report of a more carefully designed study indicates that the Bt corn variety grown widely in east central Illinois in 1999 had no adverse effect on black swallowtail caterpillars that thrive in weeds alongside cornfields, either in the field or in laboratory tests. Another variety of Bt corn, however, was shown to cause mortality in nearby caterpillars.
The differences between the studies are that a different species of caterpillar was involved, and, more importantly, many of the tests were carried out in the field with natural levels of pollen, rather than the massive levels used in the first laboratory study. The report appeared in the \IProceedings of the National Academy of Sciences\i during June.
The black swallowtail larvae are just as likely as monarch caterpillars to encounter corn pollen in the field during a key developmental time between late June and mid-August, but the researchers could detect no signs of mortality that could be attributed to the pollen under field conditions.
The researchers grew Pioneer variety 34R07, which contains Monsanto event 810 -- a particular genetic configuration of corn carrying the gene that encodes the \IBacillus thuringiensis\i toxin that is fatal to European corn borers. The borer is a pest that causes severe damage to corn in some parts of North America, but the Bt varieties kill the borer. The researchers then assessed the pollen levels from the Bt corn at a variety of locations, ranging from 1/2 meter to 7 meters (18 inches to about 22 feet) from the cornfield.
Large numbers of caterpillars died, as would be expected, given that a female butterfly lays up to 800 eggs, but the deaths did not appear to be due to anything connected to the corn or the corn pollen. The key observation here was that there was no correlation between mortality and distance from the cornfield or between mortality and pollen load. If the Bt pollen was causing deaths, there should have been more deaths closer to the corn and on more heavily dusted leaves.
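The reasoning here is a simple correlation test: if pollen were killing caterpillars, mortality should rise as distance falls and as pollen load rises. A minimal sketch, with invented numbers standing in for the field data:

    # Pearson correlation of mortality against distance from the cornfield.
    # The data below are invented for illustration; see the paper for the
    # real measurements.
    from statistics import mean, stdev

    def pearson(x, y):
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
        return cov / (stdev(x) * stdev(y))

    distance_m = [0.5, 1, 2, 3, 5, 7]                  # meters from the field
    mortality = [0.31, 0.28, 0.35, 0.30, 0.33, 0.29]   # fraction dying (invented)
    print(f"r = {pearson(distance_m, mortality):.2f}")  # near zero: no pollen effect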
Some of the deaths were due to spiders, carnivorous insects and other environmental factors and, when surviving caterpillars were weighed, there was no negative pattern that might suggest a problem in their growth and development.
In laboratory studies, some of the field uncertainties can be more carefully controlled, and caterpillars were exposed to Bt corn pollen from the plants in the field as well as pollen from non-modified but genetically similar corn plants. The endotoxin from the same Bt corn again had no effect on survival of the caterpillars, nor did the pollen from non-modified corn.
Pollen from another transformed variety, Novartis Max 454, however, did cause mortality in the laboratory. Antibody assays of the Novartis event showed that it contained, on average, 40 times as much endotoxin as did the 810 variety. So while a high-Bt strain of corn may be more desirable in stopping corn borers from developing resistance, it \Imay\i cause an unacceptable level of risk to nearby harmless animals.
Then again, the level of risk might be acceptable, since the use of Bt crops helps to reduce the amount of conventional insecticide sprayed on crops, so the pollen damage, while undesirable, might still be less than that following from chemical sprays. Moreover, the Bt toxin has been used for many years as a conventional spray without raising concerns, and further studies will be needed to show whether the pollen damage is as great as the damage from Bt in spray form.
Until that research is done, we need to keep an open mind on the Bt corns, and also on the possibility that the objections are based on a double standard. GM products and their dangers need to be judged relative to the alternatives, not on some absolute scale, as has been the case in the past. At the same time, the benefits from GM products need to be assessed in the same way.
Key names: May Berenbaum, Lydia Wright and Mark Carroll.
#
"Family size and IQ",1563,0,0,0
(Jun '00)
The IQ, while it is always a bit dubious as an individual measure, is a useful research tool for population studies because the individual variations that mask an individual's abilities can be merged into general trends. One of the standard and widely accepted generalizations about IQ is that, when there are multiple children in a family, the later children, on average, have a lower IQ. The explanation of this has always been that younger children tend to play and interact with their older siblings, rather than with adults, so that they are exposed to a lower level of stimulation and an impoverished intellectual environment.
Now it appears that having more children born into a family does not necessarily result in lower-IQ children, according to new research published in the June issue of the \IAmerican Psychologist\i, based on data from the National Longitudinal Survey of Youth (NLSY). The data provide a large random national sample of US families that includes children whose academic performance has been reviewed multiple times throughout their academic careers, having started out as a household probability sample of the nation's youth ages 14 to 22.
Since its beginnings, the NLSY has followed 11,406 young people at yearly intervals, but since 1986, the children born to the original female respondents have been surveyed every other year. And because it is so detailed, the authors of the report say they are able, for the first time, to assess both "across-family" and "within-family" measures.
Previous research has typically drawn within-family conclusions while relying only on across-family measures. As an example, family size is an "across-family" measure which compares families, while birth order is a "within-family" measure, applying within a family unit.
They say that previous conclusions about the apparent links between birth order and offspring intelligence, and between family size and offspring intelligence, are in fact spurious, caused by mistaking across-family effects for within-family effects.
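The distinction is easiest to see with a toy dataset (wholly invented): an across-family comparison looks at family averages, while a within-family comparison looks at how each child differs from its own family's average. A sketch:

    # Toy data: (family id, birth order, child IQ) -- invented values.
    rows = [(1, 1, 104), (1, 2, 103), (1, 3, 102),
            (2, 1, 96),  (2, 2, 95),
            (3, 1, 112), (3, 2, 111)]

    from collections import defaultdict
    fam = defaultdict(list)
    for f, _, iq in rows:
        fam[f].append(iq)

    # Across-family measure: family size versus mean IQ.
    for f, iqs in sorted(fam.items()):
        print(f"family {f}: size {len(iqs)}, mean IQ {sum(iqs) / len(iqs):.1f}")

    # Within-family measure: each child's deviation from its family mean.
    for f, order, iq in rows:
        m = sum(fam[f]) / len(fam[f])
        print(f"family {f}, child {order}: {iq - m:+.1f}")

In data like these, later-born children can look worse when families are pooled (lower-IQ families tend to be larger), even though within any one family birth order makes almost no difference -- which is exactly the confusion the authors describe.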
The argument gets fairly technical, but it appears that, if you are planning to limit the size of your family, concerns about the intelligence of your children, as measured by an IQ test, should not be high on the list. More important is the question of parental IQ, and this has been one of the confusing factors because, on average, parents with a lower IQ tend to have more children. But family environment and genetic heritage may also play roles in both family size and children's intelligence.
Key names: Joseph Lee Rodgers, H. Harrington Cleveland, Edwin van den Oord, and David C. Rowe.
#
"A gecko on the ceiling",1564,0,0,0
(Jun '00)
Across the world, there are some 850 species of gecko lizard, most of them nocturnal, most of them able to climb happily up slippery walls, running up glass at one meter per second, and they can even wander across smooth ceilings as they hunt for insects to eat. The secret of their glue-like cling lies in about two million microscopic hairs on their toes which take a firm grip on even the smoothest surface, and now engineers are wondering if they can learn from the humble gecko.
A paper published in \INature\i during June describes the forces that these hairs, called setae, can exert on the surface they are gripping. Working with Tokay geckos (\IGekko gecko\i) from Southeast Asia, a team of engineers and biologists found that the force exerted by all of the setae is around 10 times the force actually needed to haul a gecko off a ceiling. It seems that geckos use only a fraction of the hairs at one time, although they have been known to hang from the ceiling by just one toe.
There is a pad called a spatula at the tip of each hair. These pads are only 1/4000 of a millimeter (1/100,000 of an inch) across, and they get so close to the surface that weak interactions between molecules in the pad and molecules in the surface become significant.
Looking like miniature broccoli flowerettes, the spatulas are outstanding adhesives, but the most amazing thing is that, as it scoots across the ceiling, a gecko lifts and replaces its foot 15 times a second. Sticking to the ceiling is not that much of a problem, but managing to come unstuck that often is most impressive -- and that is the secret the researchers think they have solved.
Using a fine probe, they established that touching the end of a hair to a surface is not good enough because the hair slips right off. Instead, the gecko pushes the pad in and pulls slightly downwards, which gives it 600 times greater sticking power than friction alone could account for. They believe that there is an uncurling motion as the gecko attaches its foot to a surface, and this manages the attaching action automatically.
Detaching is not just a matter of letting go, because the attachment is too strong -- a single hair could raise 1/50 gram (1/1500 ounce), so that a gecko's foot could lift a 20 kg (45 pound) child. If the hair is angled at 30 degrees, the spatula will just pop off, and the gecko manages the angled pull simply by peeling its toes off the surface.
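Those figures scale up directly; the published numbers do not all agree exactly, so treat this Python sketch as an order-of-magnitude check rather than a precise claim:

    # Back-of-envelope adhesion, using the figures quoted in the text.
    grams_per_hair = 1 / 50     # each seta supports about 1/50 gram
    hairs = 2_000_000           # setae across a gecko's toes
    total_kg = grams_per_hair * hairs / 1000
    print(f"All setae engaged at once: about {total_kg:.0f} kg")  # ~40 kg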
There are no glue glands on the gecko's foot, and no signs of glue residue when a gecko has passed by. Geckos can stick to the wall in a vacuum, which rules out suction; the hairs do not interlock with the surface, as they would with Velcro; friction is unlikely; and other researchers have ruled out electrostatic attraction.
This leaves intermolecular forces as the most likely explanation. Usually, these forces are masked by other stronger forces, but with the gecko's feet getting so close to the surface, the intermolecular forces become important again.
The most likely forces are van der Waals forces, though the researchers say they cannot rule out water adsorption or some other type of water interaction. The forces may be small, but they are sufficient to keep layers of graphite together, and van der Waals forces are also responsible for the attraction between enzymes and their substrate.
But if nature can grip that well, what about robots? Could we make a machine that could climb walls and run across ceilings? Perhaps we could, but there are other possibilities as well, since the same sticking technique could make a strong but dry adhesive.
The first development, though, is likely to be in robotics. The researchers also have an association with IS Robotics, Inc., in Boston, that has led to a mechanical gecko that uses the peeling action of geckos to walk on vertical surfaces. At the moment, the small robot uses an adhesive glue to stick to the wall, but they are hoping to imitate nature more closely.
Key names: Robert J. Full, Kellar Autumn, Ronald Fearing and Thomas Kenny.
#
"The secret of the Neandertal bones",1565,0,0,0
(Jun '00)
Students of human evolution have long accepted that the early Neandertal people ate meat, but scientists have never been able to agree about how the Neandertals got their meat. Were they merely scavengers who snatched the leftovers of nature's predators, or were they themselves high-level carnivores with adept hunting skills?
The case for the Neandertal hunters includes the assumption that the hominid line evolved into humans as they adapted to a meat diet, but that assumption might be wrong, or maybe all of the hominids before our line were just opportunist scavengers.
Now, however, the question appears to have a definitive answer -- or it does in the eyes of the authors of a paper in the \IProceedings of the National Academy of Sciences\i in mid-June. They say that the bone chemistry gives a very clear indication that the Neandertal people feasted on meat, much like other top-level carnivores from the time period, such as wolves and lions.
This finding is the basis for some interesting speculations about the social structure of the Neandertals. If they had a diet dominated by animal protein, this tells us they must have been effective predators. This leads Erik Trinkaus to comment that this " . . . implies a much higher degree of social organization and behavioral complexity than is frequently attributed to the Neandertals."
The analysis was carried out on a jawbone and skull bone from two Neandertals recently dated to about 28,000 years ago. The fossils were recovered at the Vindija cave site, located some 50 kilometers (about 30 miles) north of the Croatian capital of Zagreb. The scientists measured the levels of key stable isotopes in the Neandertal bones, and also in other central European animals of the same time period, including wolves, wild cattle, mammoths, arctic fox and cave bear.
Almost every element in the periodic table has a number of forms of the atom, differing only in the number of neutrons in the nucleus, and these are called isotopes. Some isotopes, like carbon-14, are unstable, or radioactive, and their nuclei slowly decay; the proportion of carbon-14 in the atmosphere stays roughly constant, representing a balance between the natural decay and the natural formation of new carbon-14 atoms high in the atmosphere. Once an organism dies, it takes in no new carbon, so the proportion of carbon-14 in its remains falls steadily, and this is what makes radiocarbon dating possible.
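The dating arithmetic is simple: after each half-life (about 5,730 years for carbon-14), half of the remaining atoms have decayed. A quick check for fossils of the Vindija age:

    # Fraction of the original carbon-14 left after t years.
    half_life = 5730      # years, carbon-14
    t = 28000             # approximate age of the Vindija fossils
    print(f"Carbon-14 remaining: {0.5 ** (t / half_life):.1%}")  # ~3.4%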
The other isotopes are stable and, while they are nominally identical, a slight difference in mass can influence the rate at which the different isotopes are used in a chemical reaction, which can influence the proportion of the isotopes found in the remains of a dead animal or plant. Isotope counts can even tell us about the relationship between ants and the trees they live in (see \JAnts in the plants\j, this month).
Archaeological evidence on its own -- the remains of animal bones, and the stone tools that were used for hunting -- provides only a glimpse of Neandertal diets, and on that evidence it is possible to argue that there is little sign that the Neandertals were accomplished hunters.
Stable-isotope signatures of the foods we have eaten in our lifetimes are evidence that requires very little interpretation, and allows very little argument, but there is still some room for doubt, according to Fred Smith. "It's still hard for us to know for certain, but it doesn't appear that they were getting much in the way of nutrients from something other than meat," he said.
The problem is that, while meat-eating can be detected in this sort of record, plant foods are almost invisible in the analysis, making it impossible to estimate accurately their dietary importance. Still, the new findings, along with data from older samples of Neandertal fossils in France and Belgium, indicate a pattern of European Neandertal adaptation as carnivores, the researchers claim.
As with most interpretations and reports in this area of science, the underlying question is whether or not the Neandertals were part of the modern human family tree, whether any of them are among the ancestors of modern humans. It is this debate which brings out the strongest polemic among scientists, for it also touches on the 'out-of-Africa/mitochondrial Eve' debate, and much more.
This study can be seen in context: a recent report, also appearing in \IPNAS\i, on the recent age of the Neandertal fossils from Vindija (see \JA more recent date for the last Neandertals\j, October 1999) demonstrated a joint presence of the Neandertal people and the Cro-Magnon people in Europe over long periods of time, and another report offered evidence for Neandertals and modern humans interbreeding (see \JDid Neandertals and modern humans interbreed?\j, April 1999).
The Neandertals are often described as prehistoric humans of limited capabilities who were rapidly replaced and driven to extinction by superior early modern humans, once the latter appeared in Europe. If they were cunning and effective hunters, with the social structure that this demands, then they were far from the shambling brutes that we so often think of when the terms 'cavemen' or 'Neandertal' are raised.
"There's no reason to believe Neandertals were any less efficient exploiters of the environment than modern humans," Smith says.
"The new bone-chemistry data combined with evidence of sustained Neandertal coexistence and interbreeding with early modern humans offer a positive picture of the Neandertals and may make it easier for some to accept the possibility that the Neandertals were among the ancestors of early modern humans," Trinkaus commented.
Key names: Erik Trinkaus, Fred H. Smith, Michael P. Richards.
#
"Ants in the plants",1566,0,0,0
(Jun '00)
Two researchers from Arkansas, studying the relationships between ants and the plants they live on, have concluded that the ants are feeding the plants, as well as feeding on them. While the finding is restricted to a few Caribbean islands, it suggests that some relationships which appear to be one-sided exploitation may, under careful analysis, turn out to be true mutualism.
The results, already announced on the Internet, will appear soon in \IOecologia\i. The study looked at \IAzteca\i ants, some of which live their entire lives in \ICecropia\i trees, nesting in the hollow stems and foraging for food among their leaves. Ants with this sort of habit are sometimes predators of the plant, sometimes act as herders, managing aphids which they 'milk,' and sometimes serve as guards, repelling insects which might eat the plant.
In this case, previous studies had identified the exploitation of the plant by the ant, but had not detected any reciprocal benefit to the plant. According to the new report, earlier studies had also overestimated the amount of food the ants take from the \ICecropia\i.
The \ICecropia\i trees have huge lobed leaves that form in spirals on slender stems. The plants often grow in disturbed landscapes of moist lowland tropical forests. The plants have cigar-shaped male and female flowers that grow on separate trees. The \IAzteca\i ant queens colonize the hollow \ICecropia\i stems and the worker ants forage among the leaves and stems for beetles, caterpillars and specialized food bodies produced by the plant.
These food bodies are significant, since standard evolutionary logic tells us that if the plant is diverting resources to benefit the ants like this, it must also be obtaining a significant benefit, sufficient to outweigh the loss of resources that otherwise could be put into growing more seeds. The researchers set out to discover what the ants ate and what the plants used for nitrogen sources, knowing that the ants live only on the trees, which meant they either ate the plant, or parts of the plant, or insects that land on the plant.
To find out what the ants eat, you need to look at subtle differences in the ratios of the stable isotopes of carbon and nitrogen. Plants and animals have distinct isotope patterns, allowing researchers to identify the contribution of each item to the ant diet, just as the diet of the Neandertals could be assessed in another study this month (see \JThe secret of the Neandertal bones\j, where there is more information on the method).
In this case, the researchers looked at the ratios of carbon stable isotopes in the insects and potential diet items, and also on the nitrogen stable isotope ratios found in the host plants. From the analysis, they concluded that the worker ants get only about 19% of their food from the plant, and the ant larvae get about 42% of their food from the plant. The rest of their food comes from insects that land on the \ICecropia\i leaves or stems.
They also compared the nitrogen isotope signatures of colonized stems with uncolonized stems. The colonized \ICecropia\i stems received 93% of their nitrogen from the ants. According to Sager, the ants are feeding the plants, and nutrients are exchanged in both directions: "These results suggest that mutualism may persist in this system, not through mutual exploitation, but by a reduction in the sources and degrees of conflict between parties."
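Percentages like these come from a standard two-source mixing calculation: when the two possible sources have distinct isotope signatures, the signature measured in the consumer fixes the proportion of each. A minimal sketch, with invented delta values:

    # Two-source stable-isotope mixing model; delta values are invented.
    def source_fraction(sample, source_a, source_b):
        """Fraction of the mixture contributed by source_a."""
        return (sample - source_b) / (source_a - source_b)

    plant = -28.0      # invented carbon signature of Cecropia tissue
    insects = -24.0    # invented carbon signature of captured insects
    workers = -24.8    # invented signature measured in worker ants
    print(f"Plant share of worker diet: {source_fraction(workers, plant, insects):.0%}")
    # -> 20% with these invented values, close to the study's 19% figure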
Key names: Cynthia Sager and Shauna Gingers.
#
"Birds and dinosaurs",1567,0,0,0
(Jun '00)
A recent paper in \IScience\i has proposed a new ancestor for the birds, a small reptile with feathers called \ILongisquama insignis\i that glided among the trees some 220 million years ago in what is now central Asia. This reptile was a contemporary of the earliest dinosaurs and lived some 75 million years before the first birds, giving it time to be an ancestor. More importantly, though, fossils of \ILongisquama\i show the same unique sheath that surrounds growing feathers in modern birds.
"Everything you can make out is consistent with it being a small, tree-living, gliding animal, which is precisely the thing you'd expect birds to evolve out of," says Larry Martin of the University of Kansas, one of the authors of the study. This goes against the trend, which sees the dinosaurs as surviving into the present day as birds. But, while it may be nice to think our budgerigar is a cousin of \ITyrannosaurus rex\i, Martin believes it is no more than wishful thinking.
This view is not shared by his colleague, David Burnham, who has just assembled \IBambiraptor feinbergi\i, a 3-foot-tall skeleton of a 75 million-year-old dinosaur that had a roadrunner's body, birdlike shoulders, and long-clawed arms that folded like wings. It also had a large brain that shares a number of features with modern birds and, with more than 90% of the skeleton recovered, it is easy to understand why Burnham is confident that birds evolved from dinosaurs. He questions whether the 'feather-sheaths' are actually proto-feathers, suggesting instead that they look more like hard scale-like structures that are attached to the ribs. As well, he notes, they do not cover the body like feathers would.
The next five to 10 years, the two men believe, will see the question resolved. And they agree that when it happens, it should resolve whether flight evolved upwards, with \IBambiraptor\i or something like it leaping, or whether it evolved downwards, with something like \ILongisquama\i parachuting down from the trees.
#
"T. rex may be in for a name change",1568,0,0,0
(Jun '00)
What's in a name? asked Juliet. Quite a lot if you are well-known, like \ITyrannosaurus rex\i, or \IBrontosaurus\i, now correctly called \IApatosaurus\i. It now appears that \IT. rex\i may also have a name change coming. The problem is that there are very clear rules for the naming of specimens. But the fossil hunters of the late 19th century were very good at assigning new names to small portions of skeletons, and each was competing with the other to be the first to name a specimen, so any old bit of bone would do, and so would any old name.
The world's zoologists have long since set their house in order, creating a watertight system of priority for names, in which the oldest name takes precedence. So when somebody realized that Othniel Marsh had already labelled a juvenile form \IApatosaurus\i before he gave the adult the much more impressive thunder-lizard name of \IBrontosaurus\i, that was it. Impressive or not, the name Marsh gave to the adult had to go, because it had come second.
Now it may be the turn of \IT. rex\i to change to \IManospondylus gigas\i. It seems that Edward Drinker Cope, for many years the bitter rival of Marsh, found the first piece of a \ITyrannosaurus rex\i in 1892, but the "tyrant lizard king" wasn't actually given its name until Henry Osborn discovered a more complete skeleton in 1902. From his single bone, found in South Dakota, Cope named a new genus and species, \IManospondylus gigas\i, but with only that small find, there was not enough to determine what kind of dinosaur it was, so the name didn't stick.
Paleontologists from the Black Hills Institute of Geological Research have just dug up several dinosaur bones in the same region of South Dakota. They have found ribs, a jaw and parts of a skull of a \ITyrannosaurus rex\i, and they think the bones they found and the single bone Cope found belong to the same animal. In other words, what Cope did was to get in first and slap a name on it. One of the problems, of course, is that both Cope and Marsh used to have their employees spoil sites if they thought the other, or some other rival, might come and dig there. So there has to be some doubt about whether it is the same site.
Luckily, there are exceptions to the priority rule. In order to change the name, the case would have to be brought before the International Commission on Zoological Nomenclature. The commission is an arbitration body that makes binding decisions when names are in dispute. In this case, Cope lacked enough material to determine the type of dinosaur it was. Even if the bones found recently turn out to be from the same fossil Cope found, it may not mean \IManospondylus gigas\i automatically takes precedence.
The other key issue is how much the name \IManospondylus gigas\i has been used over the last century. If the first name given is seen as a well-used term, there may be a case for the name change. If the name has fallen into relative obscurity, it could be ruled as defunct and set aside. Then again, there is some doubt: Cope's find may be in the family of meat-eating dinosaurs but not actually be the same species as \IT. rex\i. It was slightly different in the other case because paleontologists originally had two sets of bones, one called \IBrontosaurus\i, the other one known as \IApatosaurus\i, and both names were in common use. It was only after some time that people realized the two sets were actually from the same animal.
So there is every chance that \IT. rex\i will still be \IT. rex\i after the dust has settled. After all, \ITyrannosaurus rex\i is probably the most famous dinosaur in the world, and lay people and scientists alike are used to the name. And that is bad luck for young dinosaur enthusiasts, who can be relied on to pour scorn on any adult rash enough to use the name \IBrontosaurus\i, because unless there is a name change for \IT. rex\i, the young people will not get their chance.
#
"Hawaiian volcanoes, hot spots and ridges",1569,0,0,0
(Jun '00)
The volcanoes of Hawaii are of particular interest to scientists because they are part of the world's most powerful scientific nation, but they would probably be of interest in any case. The volcanoes seem to originate at a 'hot spot' (see \JFinding a hotspot\j, November 1998), but that is not quite the full picture. New research, set out in \INature\i in early June, reveals that some of the seamounts originating from the volcanic chain of the Hawaiian hot spot were significantly influenced by the existence of a mid-ocean ridge that has long since been subducted under present-day Siberia.
The finding is just as relevant to Iceland, or any other volcanic region in an ocean, as it is to Hawaii, since the same systems work there as well. That said, the Hawaiian hot spot is the largest and one of the oldest hot spots in the world, responsible for creating not only the Hawaiian Islands, but also a string of underwater seamounts that stretches across the Pacific Ocean toward the east Asian coastline.
A hot spot is a place in the ocean where magma comes up from deep within the Earth. When it reaches the surface, it cools, creating volcanic rock formations. As the tectonic plates shift over millions of years, these formations move away from the hot spot and others form in their place, one after another, rather like a giant volcano conveyer belt.
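The conveyer-belt picture also carries a simple check: a seamount's distance from the hot spot should be roughly plate speed multiplied by its age. The plate speed below is a typical textbook figure for the Pacific, assumed by us rather than taken from the study:

    # Distance a seamount drifts from the hot spot: speed x time.
    speed_cm_per_yr = 9     # assumed Pacific plate speed
    age_yr = 80e6           # age of the oldest formation studied
    print(f"Drift: {speed_cm_per_yr * age_yr / 1e5:,.0f} km")  # ~7,200 km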
In this case, a research team looked at one of the oldest of these formations, thought to be about 80 million years old, drilling through 900 meters of sediment to reach rock, and then drilling a further 87 meters to retrieve samples. They then used chemical analysis to compare the rocks with those from mid-ocean ridges such as the Juan de Fuca ridge. While the rocks were chemically similar to what you would expect from a mid-ocean ridge, the chemical "signature" was unlike anything known from Hawaii, even though it came from the same hot spot.
In fact, the composition at the old seamount indicated that the rocks came from magma that had melted relatively close to the surface. The rocks were quite low in potassium, strontium and rubidium, which are easily removed from a rock when it is melted near the surface. Rock that melts further down has higher levels of these elements in it. And that is exactly what you find on Hawaii today.
We will probably never be able to see a hot spot, but scientists have usually imagined hot spots as giant tubes that extend far into the Earth. They seemed to remain in one place, but now there is some evidence suggesting that these hot spots actually migrate. This theory will be tested when another research team explores the northernmost seamount in the Hawaiian chain in the northern summer of 2001. Being one step north of the seamount studied in the present research, it is expected to be older still, because the chain moves in that direction, but whether it is the oldest volcano, or whether there were others in that chain that have already been subducted, remains to be seen.
The aim of the next expedition will be to use paleomagnetism, the magnetic information stored in the rock when it went solid, to find the latitude at which the rock turned solid, since this will tell us if the hot spot has moved.
#
"Predicting toxic algal outbreaks",1570,0,0,0
(Jun '00)
When rivers are full and flowing freely, nutrient levels in the rivers stay low, diluted by the water or washed downstream. But in times of drought, when rivers may even stop flowing altogether, the concentration of the nutrients can increase alarmingly, especially as human activities continue to add nutrients to the rivers.
This is a particular problem in Australia, where the rivers flow huge distances with a limited drop that makes the rivers slow and winding, in a hot climate that evaporates much of the water away. In times of drought, the situation gets even worse, and that is the time when blooms of toxic blue-green algae, or cyanophytes, clog the rivers, making the remaining water unfit for animals or humans to drink.
With evidence that the El Niño Southern Oscillation is swinging to negative (which means an end to drought in those parts of the world suffering drought, but severe drought in countries like Australia), the work of an Adelaide (Australia) Ph.D. student, Gavin Bowden, is very timely. Bowden has developed a computer model that can predict blooms of blue-green algae in the Murray River, Adelaide's main water source, and part of Australia's largest river system, four weeks before they occur.
Using a neural network, Bowden's system can 'learn' the key factors which contribute to an algal outbreak using data compiled from water samples collected every week over the last 20 years at a single station on the Murray.
The model looks at a range of environmental factors in the water, including nutrients such as nitrogen and phosphorus, the flow and temperature of the water, and the turbidity and color of the water. While there are other factors involved in triggering a bloom, with a neural network, even a few measures will give the model a reasonable power of prediction.
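As a sketch of the general approach -- not Bowden's actual model, whose details and data are not published here -- a small feed-forward network can be trained on water-quality measurements using a library such as scikit-learn. Everything in the example below (features, data, network size) is invented for illustration:

    # Sketch: predict an algal measure four weeks ahead from water quality.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Weekly samples: nitrogen, phosphorus, flow, temperature, turbidity, color
    X = rng.random((1000, 6))
    # Invented relationship standing in for the algal count 4 weeks later
    y = 3 * X[:, 0] + 2 * X[:, 1] - X[:, 2] + rng.normal(0, 0.1, 1000)

    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    model.fit(X[:800], y[:800])                # train on the earlier weeks
    print("Test R^2:", round(model.score(X[800:], y[800:]), 2))

The point of the design is that the network is fitted to the historical record and then judged on weeks it has never seen, which is also how the real model's four-week forecasts can be tested.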
Because the data are historical, it is possible to test the model's predictions against what really happened and, while the match is imperfect, the output clearly shows the peaks and troughs in algal bloom development, and it can detect the trend four weeks before the bloom develops. At the moment, the model is still being refined, but by the end of 2000, Bowden expects to deliver a fine-tuned, fully workable model. And at that time, on current indications, Australia should just be heading into an El Niño drought that should see a real need for the model.
Bowden is now looking to adapt the computer model for other uses, such as the all-important problem in the Murray, the salinity of the water in its lower reaches. He comments: "If you knew a few weeks in advance that a large volume of highly saline water was moving down the Murray, water could be pumped to the Adelaide reservoirs before or after that to avoid pumping the saline water."
Bowden believes the model could be used throughout the world to predict problems of water quality. With most experts believing that the wars of the next half-century will be fought over potable water supplies, any resource that improves water quality is going to be important.
#
"Pacific leatherback turtles face extinction",1571,0,0,0
(Jun '00)
A study, running over 12 years, seems to indicate that there will soon be no leatherback turtles (\IDermochelys coriacea\i) left in the Pacific Ocean. That is the conclusion of James Spotila and his colleagues, based on a survey of the nesting population of leatherbacks in Playa Grande, Costa Rica. This is the world's fourth largest colony of the species, and the group has constructed a mathematical model of future population trends.
Spotila says this population is "in the midst of a collapse." The number of nesting females has fallen from 1367 in 1988-89 to just 117 in 1998-99. Each year, about one-third of the population dies after falling prey to fishing nets or lines and, unless there are serious changes to fishing practices in the Pacific, there will be fewer than 50 nesting females in the entire ocean by 2004.
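The direction of the projection is easy to reproduce with a bare-bones decline model; the group's real model includes recruitment and age structure, so the exact dates differ, but this sketch, assuming only the one-third annual mortality quoted above, shows how fast the numbers fall:

    # Project nesting females assuming one-third are lost each year.
    females = 117.0        # nesting females in 1998-99
    for year in range(1999, 2005):
        print(year, round(females))
        females *= 2 / 3   # two-thirds survive; recruitment ignored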
The leatherback is the world's largest turtle. The turtles live mostly in the ocean, approaching land only during the breeding season. Most breed every other year, laying clutches of between 80 and 100 eggs at roughly 10-day intervals before returning to the sea.
"The leatherback has outlasted dinosaurs, has outlasted all sorts of catastrophes," said Pamela Plotkin, a coauthor of the paper. "Now, at the hands of man, it is right on the brink of extinction." The most probable cause for the turtles' eminent extinction is mortality from gill-net and longline swordfish and tuna fisheries.
To save the turtles, humans must close to fishing those areas of ocean where the turtles congregate, get rid of all gill nets and require all countries to use turtle-excluder devices, says Spotila. "The only effect this is going to have is to make fish more expensive. It makes no difference if you sell 10 fish for $100 or 100 fish for $10. Most of these fish are not being eaten by poor people."
The release that appeared on the Internet was notable for a key typographical error where the word 'mortality' was intended: "The dwindling numbers of leatherback turtles are signaling a threat to bio-diversity in the oceans. A mathematical model based on our assessment of a once-large leatherback population predicts that unsustainable adult morality, apparently due to human fishing activity, will soon drive this population to extinction."
Which may be amusing to humans, but for the turtles, there is little time left for amusement.
#
"How cancer keeps on keeping on",1572,0,0,0
(Jun '00)
The main problem with cancer is that it does not die. Instead, it goes out of control, and the cells keep on reproducing. So whoever finds out how cancer cells manage this will have found the perfect place to look for a chink in the cancer's armor. Now, researchers at The Salk Institute say they have found the answer. They have deciphered the three-dimensional structure of a molecule, appropriately named 'survivin', which gives cancer cells their ability to persevere. Now the race will be on to find drugs that might block its activity and lead cancer cells to 'commit suicide.'
Growth and development are controlled by the process called apoptosis (see \Jmetamorphosis\j or \JControlling apoptosis\j, April 1999, and a number of other science review articles on this topic), often loosely called 'suicide,' although the word comes from Greek and originally described the falling of leaves. 'Suicide' is a poor description because, for instance, the development of fingers from the original hand pad takes place when lines of cells between the future fingers die off, shaping our hand. As well, most cells that 'go wrong' trigger apoptosis. The cells may die, but they do so for the benefit of the whole organism.
Survivin is an anti-apoptosis chemical that stops this natural cleaning-up process. It is present at high levels in a number of cancer cells, which gives them a way of surviving and multiplying. Survivin is turned on in almost half the malignancies seen in the Salk clinic, and it is especially common in breast, lung, prostate and colon cancers, all among the most commonly occurring cancers.
The structure was determined by X-ray crystallography, and it has been reported in the July issue of \INature Structural Biology\i. Knowing the three-dimensional shape of the protein may allow scientists to design molecules in the laboratory that can stick to survivin and knock out its activity. Logic says that there will be critical regions on its surface that are hot spots for allowing survivin to promote cell division, and any drug that binds to these sites will stop survivin from using these critical pockets.
There will be some need for caution, though, as survivin is normally active in growing embryos or rapidly dividing cells such as those that comprise the immune system, and the planned drugs, if they crossed the placenta, could harm a developing fetus. And in all patients, they would have the potential to harm a patient's immune system. So a better answer may be to seek an explanation for survivin being switched on, for example, in mature breast or prostate cells. By going back a step, scientists may be able to achieve a cure that is less harmful than the disease.
The molecule probably attaches to the 'mitotic spindle,' a foundation in the nucleus of cells that pulls newly divided chromosomes into the two daughter cells created during cell division, and it probably sets the stage for the assembly of this foundation. If the foundation is missing, the cell is unable to pull itself apart and dies.
Key names: Mark A. Verdecia, Tony Hunter, Joseph P. Noel, Han-kuei Huang, Erica Dutl, and Donald A. Kaiser.
#
"Fruit fly model: how mosquitoes carry malaria",1573,0,0,0
(Jun '00)
At the very end of June, scientists announced that they have found a way to turn a fruit fly into a surrogate mosquito that is able to carry malaria and infect chickens with the disease. While this may sound bizarre, the breakthrough is likely to pave the way for better antimalarial transmission-blocking vaccines as well as engineered mosquitoes that are resistant to malaria.
The report appeared in late June in \IScience\i (June 30). With malaria directly killing more than a million people a year (see \JWHO warns of microbial threats\j, June 1999), and damaging the lives of many more indirectly, it is one of the most devastating public health menaces in the world today.
Malaria is caused by a single-celled parasite of the genus \IPlasmodium\i that can only grow inside red blood cells. The sickness from malaria is mainly due to the destruction of red blood cells. Until now, most antimalarial drugs and vaccines have been directed at the parasitic form that infects humans. Most of these attempts have failed because mosquitoes transmit the disease from one human to another before the first person is cured, and this is why eradicating malaria from a population by drug or vaccine has been difficult.
Scientists know very little about how mosquitoes carry the disease and transmit it from person to person, mainly because the insects are difficult to manipulate and are not easy to use in laboratory studies. The fruit fly, \IDrosophila melanogaster\i, on the other hand, is the geneticist's best friend. Not only are there many genetic markers, but now we have almost the entire genome of this species (see \JFruit fly genome is published\j, March 2000), so the fly provides a perfect tool for study.
One important aspect of the mosquito-malaria relationship is that only a few species of mosquito can transmit the parasite, and those that do transmit the parasite kill 99.9% of the parasites they take from the blood of a victim. So parasite-killing factors exist in mosquitoes, and this offers a way to beat the parasite, a method called a transmission-blocking strategy, which sets out to exploit these factors.
Enter the problem: designing a transmission-blocking strategy requires an understanding of how malarial parasites grow inside the mosquito. The basic techniques to study mosquito-malaria interactions are not available, so the factors in mosquitoes that determine successful or unsuccessful transmission of malaria have been difficult to identify.
That is where the fruit fly comes in. The researchers have found a way to grow the parasite in \IDrosophila\i, and these flies make a good laboratory model for studying the effects of \IPlasmodium\i. The researchers injected a form of \IPlasmodium\i, the parasite that causes malaria in chickens, into the body cavity of the flies. This, it must be stressed, was not a human malarial parasite, but \IPlasmodium gallinaceum\i, a chicken parasite.
It is important to recognize that when they carry malaria, mosquitoes do not act merely as dirty hypodermic needles, passing infected blood from one person to another. Instead, they act as an incubation chamber, where the parasite passes through a two-week growth phase before being transmitted to a new host when the mosquito takes another blood meal.
This is the significance of the observation that the \IPlasmodium\i parasites thrived in the fruit flies; even more significant was the discovery that the parasites developed to an infective stage that could cause malaria in chickens when injected into them. But before people start worrying about catching malaria from fruit flies, the parasite used does not infect humans, and, more importantly, the fruit fly, while it can be used to grow infective parasites, has no way of passing them on. The chickens were only infected artificially, with parasites that had been grown in the flies.
In other words, it is now possible to culture \IPlasmodium\i in open laboratories with none of the protection that would be required if there were malarial mosquitoes being grown, including special caging and a special out-of-bounds area where public access is restricted, according to Mohammed Shahabuddin at NIAID. The \IDrosophila\i model, he says, does not require any of these because the flies can be kept literally on the laboratory bench tops.
David Schneider also found that a component of the fly's immune system, known as a macrophage, was able to destroy the parasites in an attempt to fight the infection. These results suggest that the fruit fly can indeed serve as a model for studying malaria parasite development.
Both the mosquito and the fruit fly are part of the same insect order, the Diptera, and so they have comparatively similar genes. Because the whole genome of the fruit fly is already known, if the fruit fly has a genetic capacity to fight off malaria, it should be possible to pin down this gene or its analogs remarkably fast. There is a hint here that it may also be possible to produce mosquitoes that are resistant to the malarial parasite. In theory, such mosquitoes would have an advantage over those that carry the parasite (it parasitizes them as well!) so that, in time, all of the mosquitoes in a species might become malaria-resistant, thus breaking the malarial cycle.
This scenario requires a lot of supposition, but it may be the closest we have seen to real hope of a solution to this disease in a long time. "One advantage of using fruit flies instead of mosquitoes is that we can do large-scale genetic screens to find mutants," says Schneider. "We can knock out genes to determine whether the parasites grow better or worse when a particular gene is missing."
As well, fly studies might reveal what factors are critical to the parasite's survival inside the mosquito, highlighting weak points in the parasite's lifecycle that could be potential targets for antimalarial drugs and vaccines. Another plus is that parts of the human immune system are very similar to the immune systems of flies, so this approach may also help scientists understand how we fight infection and could even lead to new ways to treat malaria.
The best result would be some light on the rapid way that the \IPlasmodium\i parasite becomes resistant to antibiotics. "The first drugs used to treat malaria were inexpensive and had few side effects, but they are no longer effective because the parasite has become resistant," Schneider explains.
A release from the Whitehead Institute (where Schneider works) reveals that chloroquine resistance continues to increase in Africa and, with fears of toxicity and decreased effectiveness for sulfadoxine/pyrimethamine, the World Health Organization has declared that there is an urgent need for an affordable, effective, and safe alternative to chloroquine. "The alternative drugs available today are very expensive and not a viable option for the vast majority of people suffering from malaria, who happen to live in some of the world's poorest countries," Schneider says.
Malaria vaccines have been particularly elusive because the \IPlasmodium\i parasites hide inside human cells, making it difficult for the immune system to locate and purge them from the body. Furthermore, the parasite acts like a chameleon, constantly changing its surface so the immune system is unable to recognize it. The fly studies carry the potential for developing a transmission-blocking vaccine.
Such a vaccine would give no direct benefit to the person who originally became infected with malaria. Instead, it would prevent the disease from spreading throughout an entire village by prohibiting the parasite from developing inside the mosquito, and by breaking the cycle, it would benefit the whole group of humans.
Key names: David Schneider and Mohammed Shahabuddin
#
"Food chains and pond size",1574,0,0,0
(Jun '00)
Which would be safer to eat: a fish from a small pond or a fish from a large lake? Obviously, there are lots of factors at play in a choice like this, but all other things (including water pollution levels) being equal, it is safer to take the fish from the small pond, say ecologists at Cornell University and the Institute of Ecosystem Studies (IES) in the USA.
The ecologists have found that food-chain length, the number of mouths food passes through on the way to the top predators, is determined by the size of an ecosystem, not by the amount of available food energy, according to a report in \INature\i, June 29. The finding appears to resolve a major part of one of the oldest questions in ecological science: how long are food chains and what determines their length?
The so-called top predators, such as eagles, falcons and most of the fish that anglers catch to eat, are all much more likely to accumulate high concentrations of mercury, PCBs and other contaminants when they live in large ecosystems. Humans, as the ultimate predators, are at the highest risk of all, but they can reduce the load a bit by choosing the fish from the small pond.
Bioaccumulated contaminants are chemicals which pass almost unchanged from prey to predator. The levels build up as a predator gathers in 50 or 100 of its prey and retains most of the contaminants, which are then accumulated to an even greater degree, as that predator and fifty just like it are eaten by an even higher predator. The effect is one of progressive build-up until toxic levels are reached.
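The arithmetic behind this build-up is easy to sketch. The short Python fragment below is a toy model only: the figures for prey eaten per predator and for the fraction of contaminant retained are assumptions chosen for illustration, not numbers from the study.

# Toy model of bioaccumulation: each predator eats many prey and
# retains most of the contaminant they carried, so the burden per
# individual multiplies at every link in the chain.
# All numbers here are illustrative assumptions, not study data.

PREY_EATEN_PER_PREDATOR = 50   # assumed: prey consumed per predator
FRACTION_RETAINED = 0.8        # assumed: fraction of contaminant retained

def burden_at_level(base_burden_ug, links):
    """Contaminant burden (micrograms) in one individual the given
    number of links above the base of the food chain."""
    burden = base_burden_ug
    for _ in range(links):
        # each extra link multiplies the burden about 40-fold here
        burden *= PREY_EATEN_PER_PREDATOR * FRACTION_RETAINED
    return burden

for links in range(5):
    print(f"{links} links above algae: {burden_at_level(0.001, links):.3g} micrograms")

With these assumed values, four extra links turn a thousandth of a microgram into several thousand micrograms, which is why a longer chain matters so much to the top predator.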
The report in \INature\i says it all in its title: "Ecosystem size determines food-chain length in lakes," which calls into question the standard view of the 20th century that food-chain length is determined by energy availability. The researchers' results show that larger lakes have longer food chains than smaller lakes, and they suggest that the same principle may apply to food chains in terrestrial ecosystems such as large forests and smaller forest fragments.
The large lakes used in the study were Erie and Ontario, while the medium lakes included Champlain, between Vermont and New York, and some of the larger Finger Lakes in western New York. The small lakes and ponds were near Madison, Wisconsin, and West Point, New York. The food chains start with green photosynthetic algae and end in large fish like walleye, lake trout, northern pike and largemouth bass. In between, depending on the size of the ecosystem, there are dozens and perhaps hundreds of organisms such as tiny invertebrates, little fish and medium-sized fish.
So how do you assess something as complex as a food chain? You can't really hang head-down in the water watching what eats what, but there is an easier way: take tissue samples and measure the ratio of the stable isotopes of nitrogen that are found in all organisms as an essential part of proteins, nucleic acids, and other chemicals in the body.
Nitrogen in the atmosphere is 99.63% nitrogen-14, while the remaining 0.37% is nitrogen-15, and when nitrogen is 'fixed' by microorganisms, the isotopes are fixed in that same proportion. As nitrogen moves up the food chain, more nitrogen-15 than nitrogen-14 accumulates in the tissue of animals at higher 'trophic positions.' So, if you measure the ratio of nitrogen-15 to nitrogen-14 of fish at the top of the food chain, and you compare this with the ratio of nitrogen-15 to nitrogen-14 for organisms at the bottom, you have a direct indication of the length of any food chain.
The isotope ratio can identify the trophic position of various organisms in food chains: a human vegetarian occupies the same trophic position as a cow, and would have the same amount of nitrogen-15 in their tissue when both are eating the same plants. A carnivore that ate the cow (or a cannibal that ate the vegetarian) is at a higher trophic position, and would have more nitrogen-15 in his or her tissue than the vegetarian or the cow.
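In practice, ecologists turn the isotope measurement into a number. The Python sketch below shows the general approach rather than the authors' exact procedure: the 3.4 per-mil enrichment per trophic level is a commonly quoted average assumed here, and the sample figures are invented for the example.

# Sketch of reading a trophic position off nitrogen isotope data.
# delta-15N is the standard per-mil measure of the 15N/14N ratio
# relative to atmospheric nitrogen. The 3.4 per-mil enrichment per
# trophic level is a commonly quoted average, assumed here; it is
# not taken from the paper discussed above.

ENRICHMENT_PER_LEVEL = 3.4  # per mil of delta-15N per trophic step (assumed)

def trophic_position(delta15n_consumer, delta15n_base, base_level=1.0):
    """Estimate how many trophic levels a consumer sits above the
    base organisms (base_level=1.0 for primary producers)."""
    return base_level + (delta15n_consumer - delta15n_base) / ENRICHMENT_PER_LEVEL

# Hypothetical example: algae at 4.0 per mil, a top fish at 14.2 per mil
print(trophic_position(14.2, 4.0))   # ~4.0: three links above the algae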
The real surprise is that productivity and energy availability are less important to food-chain length than the size of the ecosystem. They found that small, nutrient-rich and highly productive lakes have shorter food chains than do the larger, low-productivity, crystal-clear lakes, and big lakes have longer food chains than smaller lakes with the same productivity levels.
The fragmentation of forests has long been known to result in smaller numbers of species than in unfragmented forests (see \JAre we in the middle of a mass extinction?\j, August 1999), and this finding may explain the greater diversity of life forms in a large forest.
Key names: David M. Post, Michael L. Pace and Nelson G. Hairston Jr.
#
"An unusual crocodile fossil",1575,0,0,0
(Jun '00)
Several reports appeared on the Internet at the end of June about a fossil crocodile named \ISimosuchus clarki\i that has been discovered in Madagascar, off the east coast of Africa. Far from the usual image of a fearsome all-devouring monster lurking in the muddy waters of a river, this animal seems to have lived primarily on land, survived mainly on plants, and dug burrows.
"This creature is not something that, if it were alive today, people would be running from," said crocodile expert Gregory Buckley in one short piece. "It's something very different from the crocodiles we see now."
The unusual fossil had a short, blunt snout and clove-shaped teeth with the multiple points that are usually associated with plant-eating animals like iguanas and herbivorous dinosaurs. Such teeth have never before been seen in either fossil or living crocodiles, which have teeth with a single point that is used to impale animal prey.
#
"Modifying malaria mosquitoes",1576,0,0,0
(Jun '00)
An interesting approach to controlling malaria was raised in \INature\i in late June: modifying the mosquito that spreads the disease so that it is able to resist the disease. Scientists already know that most mosquitoes resist the \IPlasmodium\i parasite, and even those that do not still manage to destroy most of the parasites (see \JFruit fly model: how mosquitoes carry malaria\j, this month), so this is a reasonable approach. Now, with the news that a team of researchers from several European countries has succeeded in modifying the germline of \IAnopheles stephensi\i, a mosquito which carries the malarial parasite, it is also a feasible approach.
Translated from science-speak, the workers have changed genes in a mosquito and had those genes passed on to later generations of the same mosquito. This is only a first demonstration experiment, but there is now an established technique for altering mosquito characteristics.
If resistant mosquitoes could be bred in large numbers and released in an area, most of the next generation of mosquitoes would be resistant. And if this could be repeated several times, the number of carrier mosquitoes would be knocked right down. It is even possible that the resistant mosquitoes would have a selective advantage, since the \IPlasmodium\i parasite also saps the energy of the mosquito that it rides in. If that happened, the number of carrier mosquitoes would stay low, and the transmission link from human to human would be broken. However, the actual way to use the new breed remains speculation until such a mosquito is produced.
The method commonly used to introduce genes into the germline of insects such as fruit flies involves transposable elements, mobile DNA segments that shepherd new genes into place. Until now, efforts to transform \IA. stephensi\i have been hampered by the lack of a transposable element that would work in this mosquito.
Andrea Crisanti suspected that the fruit fly Minos element might also function in \IA. stephensi\i. The test was simple: inject a fluorescent gene flanked by particular DNA sequences, along with the gene for the enzyme transposase, into mosquito embryos. If all went well, the transposase would recognize these sequences, cut out the DNA, and insert it into the mosquito's genome.
It may seem a little odd to set out to create a mosquito that glows when exposed to the right sort of light, but this is standard practice in genetic studies because the successes are easy to spot. In fact, after injecting a total of 885 embryos, Crisanti's team found that an average of 29% of the embryos survived, with about 50% of the larvae being fluorescent.
About 10% of the larvae survived to adulthood, and the overall transformation efficiency was 7%, which is comparable with the results obtained with the Minos transposon in other insects. Crisanti has yet to try the technique on \IAnopheles gambiae\i, the main carrier of malaria in sub-Saharan Africa, but he expects the minor technical problems that he and his team have run into to be overcome without any real difficulty.
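For those who like to see the head-counts behind the percentages, the arithmetic is set out below in Python; the rounding is ours, and the 7% overall efficiency is presumably measured on a different base (germline transmission among surviving fertile adults), so it cannot be recovered by simply dividing adults by embryos.

# The reported percentages, turned back into approximate head-counts.
# Percentages are from the report above; the rounding is ours.
embryos_injected = 885
larvae = round(embryos_injected * 0.29)   # ~29% of embryos survived
fluorescent = round(larvae * 0.50)        # ~50% of larvae fluoresced
adults = round(larvae * 0.10)             # ~10% of larvae reached adulthood

print(larvae, fluorescent, adults)        # roughly 257, 128 and 26
# Note: adults / embryos_injected is about 2.9%, not 7%, so the quoted
# transformation efficiency must be measured against a different base.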
What sort of genes could the mosquitoes be given? Suitable candidates would be genes that interfere with the development of the malaria parasite, or perhaps genes involved in malaria resistance. The problem will be to locate and identify these genes, and the fruit fly work mentioned above could well play a role here. At least scientists now know that there is a point in searching for these genes.
Key names: Andrea Crisanti
#
"Humans can regrow liver cells from bone marrow",1577,0,0,0
(Jun '00)
It is a fairly standard view that new human tissue comes from stem cells but, while several studies have tried to find liver stem cells, no evidence for them has emerged. As a result, some researchers have even suggested that livers are just regenerative organs which look after their own renewal without the aid of any stem cells at all. Now, though, there is compelling evidence that the human liver can regenerate its tissue with a cell type from outside the organ: stem cells from human bone marrow.
Appearing in the journal \IHepatology\i in late June, following on from preliminary reports in December 1999 and January 2000, the discovery has some interesting implications for future stem cell research. If a bone marrow stem cell can become liver, what is to stop it becoming anything else as well? That said, the result is not entirely unexpected since the effect has been seen in other animals, but this is the first time it has been seen in humans.
The evidence came remarkably easily. Compatibility for a transplant depends on a lot of things, but the sex of the donor is not one of them, so it is quite common for the donor of bone marrow to be of the opposite sex. Aside from this, the key issue is that only male cells carry the Y chromosome, and a Y chromosome can be detected in a cell because it takes up a specific dye in a particular way. Knowing this, Neil Theise and his colleagues were able to study samples of liver tissue from two women who had received bone-marrow transplants from male donors.
They prepared the tissues and then stained them with chromosome-specific dye (Y chromosomes fluoresced as light green dots, X chromosomes as red). Between 5 and 20 percent of the women's liver cells (hepatocytes) and bile-duct cells (cholangiocytes) exhibited green dots. As there was no other source for the Y chromosomes, " . . . the cells with Y chromosomes could only have come from their male marrow donations," explained Theise.
In the longer term, this might lead to a situation where the patient's own stem cells could potentially provide his or her own liver-cell transplant, solving both organ shortages and rejection issues. All that might be needed would be to harvest the stem cells from a blood sample, grow them in culture, and then reinject them.
Again, given that stem cells can be excellent targets for gene therapy, it might be possible to extract bone marrow stem cells, adjust them, and then return them to the body, permanently correcting inherited liver diseases. In fact, Theise mentions that this has already been demonstrated in mice by scientists at StemCells California Inc., so this part of the future may be closer than we thought.
#
"July, 2000 Science Review",1578,0,0,0
\JSilicon chip artificial retinas\j
\JBridge problems?\j
\JIBM's new biggest-ever computer - for now\j
\JA new view of the Ngandong hominids\j
\JThe moon's dust clouds\j
\JGalileo to crash into Jupiter\j
\JThe Durban Declaration\j
\JA new HIV vaccine\j
\JProtein biochips\j
\JProteomics 101\j
\JASM Issues Statement On Genetically Modified Organisms\j
\JThe Chandler Wobble solved\j
\JCancer causes\j
\JEdible vaccine scores a win\j
\JSpam in the news\j
\JScorpion venom can be good for you\j
\JHow meiosis gets started\j
\JAnts and the state of the environment\j
\JHow genes jump\j
\JSwitching off the appetite in mice\j
\JPrions and genes\j
\JChandra captures flare from brown dwarf star\j
\JSaving Vavilov\j
\JAutism and language development\j
\JMicroscopic life at the South Pole\j
\JLonger-running CDs\j
\JDeep Space 1 revived\j
\JNevirapine, mothers, children and HIV\j
\JPulsars may be much older\j
\JBioengineered corneas\j
#
"Silicon chip artificial retinas",1579,0,0,0
(Jul '00)
On June 28, the first artificial retinas made from silicon chips were implanted in the eyes of two blind patients who had lost almost all their vision because of retinal disease. The patients were both released from hospital the following morning, and preliminary tests indicated that no complications had occurred. A third transplant was carried out the next day.
The same team performed all three operations and was led by Dr Alan Chow, president and CEO of Optobionics Corporation of Illinois, the company that invented and developed the artificial retina. Also involved in the work were Gholam Peyman and Jose Pulido, and all three are feeling very pleased: "We've completed the first part of our journey to the Holy Grail of restoring eyesight to the blind," Pulido said in a press release.
The Artificial Silicon Retina (ASR) was invented by Chow and his brother Vincent Chow. It is a silicon microchip about one-tenth of an inch in diameter and one-thousandth of an inch thick - less than the thickness of a human hair. It contains some 3500 microscopic solar cells that convert light into electrical impulses. The chip is intended to replace damaged photoreceptors, the 'light-sensing' cells of the eye, which normally convert light into electrical signals within the retina.
Photoreceptor cells are typically lost when people suffer retinitis pigmentosa (RP) and other retinal diseases. All three patients who received the implants have lost almost all their vision from retinitis pigmentosa. The two men and one woman, two of whom use guide dogs, are between 45 and 75 years of age, and the operations were performed as part of a feasibility and safety study of ASR implantation.
These trial runs used a small version of the implant - placed in a side portion of the retina - and, if the implants are able to stimulate the retina, patients may develop some degree of vision over the location of the implant within the next month.
The insertion is done using standard techniques of microsurgery. It starts with three tiny incisions in the white part of the subject's eye, each no larger than the diameter of a needle. Instruments are then introduced to remove the gel from the eye and replace it with a saline solution. The next step is to inject fluid under the retina, raising the retina to make a small pocket in the 'subretinal space' just wide enough to accommodate the ASR.
Once the implant has been slipped into this space, they reseal the retina over the ASR, introduce air into the middle of the eye to gently push the retina back down over the device, and close the incisions. Once the eye is sealed, the air bubble is reabsorbed and replaced by fluids created within the eye.
"The use of the subretinal space to hold a device that artificially stimulates the retina seems a logical step in replacing the loss of photoreceptor cells of the retina," Peyman explained. "If the implant is tolerated well and is able to successfully stimulate the retina, it may open up new opportunities for restoring sight in patients with the end stages of retinitis pigmentosa."
#
"Bridge problems?",1580,0,0,0
(Jul '00)
Modern bridge experts like to believe that the collapse of the Tay Railway bridge in 1879 was the start of an era of thirty-year disaster cycles: Tay 1879, Quebec 1907, Tacoma Narrows 1940, and box girder bridges in several countries around 1970. On that accounting, they say, something else is presumably lined up to happen in the next few years. Some of the smart money is on a scenario that sees engineers get too confident and start making cable-stayed bridges longer than is safe.
On the other hand, the next failure might be something rather more mundane. There are more than 7200 pretensioned deck-beam bridges in the American state of Illinois, and two beams from such bridges have fallen apart in recent times, setting off an unfortunately named "crash course" in bridge inspection and repair. The bridges were intended to last 50 years but, according to Neil Hawkins, a professor of civil and environmental engineering at the University of Illinois, ". . . several of the state's pretensioned deck-beam bridges have shown unacceptable levels of corrosion damage after only 20 to 30 years of service."
Engineers are now working on better visual inspection procedures to assess the corrosion damage in deck beams, the rate at which the damage is spreading, and the remaining service life of a bridge. Each pretensioned deck beam is basically a long, hollow box, made in a standard way. First, high-strength steel cables are stretched tightly, additional reinforcing bars are added, and then concrete is poured around the steel. After the concrete has cured, the cables are cut at each end of the beam, pulling the concrete into compression.
After that, the beams are laid side by side and bolted together, then the joints are grouted with cement. Over time, the joints can crack, allowing water and salt to infiltrate in winter, when salt is used to deal with ice and snow. The salt corrodes the steel reinforcing cables within the concrete and, as the strands break, the bridge weakens.
The problem is that the cables are hidden, so the emphasis has been on external assessment of the number of strands likely to be broken. The aim is to come up with a method to assess the hidden damage and evaluate the remaining capacity and life of the structure by inspecting the location and amount of rust on the surface.
The problem will never be as bad again because builders now add corrosion inhibitors to the concrete, and this will significantly reduce the likelihood of corrosion and increase the service life of the bridges, but there is a lot of catching up to do.
#
"IBM's new biggest-ever computer - for now",1581,0,0,0
(Jul '00)
On the last day of June, IBM started shipping the components for the world's largest supercomputer. Twenty-eight large trucks carried just one quarter of the entire machine off to California, where it will be assembled at the Lawrence Livermore National Laboratory (LLNL). When it is operating, ASCI White, as it is called, will be the size of two basketball courts and draw enough electricity to power a small town.
The LLNL, one of America's leading nuclear research laboratories, will use ASCI White to simulate nuclear test blasts at an unprecedented level of detail and speed. Even so, one of the simulations will run for 30 days, but, if that seems slow, a Cray supercomputer built in 1995 would take 60,000 years to perform the same calculations.
Under provisions of the 1996 Comprehensive Test Ban Treaty, the U.S. is prohibited from testing warheads by exploding weapons from its aging nuclear stockpile, where the fissile material is decaying in accordance with the laws of nuclear physics. So the supercomputers will allow scientists to predict how the volatile materials in the warheads behave as they age and change.
ASCI White can perform a mind-boggling 12.3 trillion operations a second, or 12.3 teraflops. It is three times faster than the previous fastest machine, another IBM giant known as ASCI Blue, which runs at 3.8 teraflops. It is 1000 times more powerful than Deep Blue, which generated 200 million chess moves every second to famously defeat World Chess Champion Garry Kasparov in May 1997 (see \JDeep Blue beats Kasparov\j, May 1997) and later was shunted to other uses (see \JDeep Blue goes data mining\j, June 1997).
ASCI White is not a single computer but a massively parallel machine made from 512 of IBM's RS 6000 servers. Each server has 16 processors, each a supercharged version of the PowerPC chips used in Apple's Macs, giving a total of 8192 processors running in parallel. Each of the 512 servers is about the size of an air conditioner. They are stacked in refrigerator-sized racks which, arranged row after row, will fill a giant hall, with the whole machine running a parallel version of IBM's AIX, Big Blue's version of UNIX.
There are plans in place to have an even faster version in operation by 2004, running at 100 teraflops - a pace faster than that predicted by Moore's law, which says that computing power doubles every 18 months.
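That claim is easy to check with a little arithmetic. The Python fragment below takes the dates and speeds from the story at face value and asks what doubling time they imply; treating the interval as a flat four years is our simplification.

import math

# Back-of-envelope check on the Moore's-law comparison: 12.3 teraflops
# in 2000, 100 teraflops planned for 2004. Dates and speeds are from
# the story; the flat 4-year interval is our simplification.

start_tflops, end_tflops = 12.3, 100.0
years = 2004 - 2000
doublings = math.log2(end_tflops / start_tflops)   # ~3.0 doublings
months_per_doubling = years * 12 / doublings

print(f"{doublings:.2f} doublings, one every {months_per_doubling:.1f} months")
# ~16 months per doubling: faster than the 18 months of Moore's law,
# though not dramatically so.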
Your reporter floated to a group of power users the notion of having the power of one of today's hundred-million-dollar supercomputers on their desks just ten years from now, and asked them to consider what they would do with that power. Proposals ranged from making realistic movies and art works to doing away with the keyboard in favor of voice recognition, but the consensus seemed to be that a large corporation would be able to create an operating system that would put such a computer firmly in its place.
Such is the cynicism of the computer user.
#
"A new view of the Ngandong hominids",1582,0,0,0
(Jul '00)
The Ngandong hominids of Indonesia are commonly regarded as \IHomo erectus\i and, according to the dominant 'out-of-Africa' model, they should have been entirely replaced by later waves of \IHomo sapiens\i who left the evolutionary cradle in Africa and spread across the globe.
This matter came under question late in 1996 when Carl Swisher and his colleagues used electron spin resonance and uranium series dating to date antelope teeth found in the same deposit as the fossils at Ngandong on Java. They got dates of between 53,000 and 27,000 years, and argued that the hominid skulls found there were of similar age.
The skulls were originally found associated with animal fossils that are typical of the last 100,000 years, but scientists had assumed that the skulls were much older. The thickness of the bone, and the shape of the skulls seemed to make it quite clear that these are \IHomo erectus\i remains, and the fossil association was explained by assuming that the skulls had eroded from old sediments, and been deposited with younger material.
The Ngandong skulls are considered to be intermediate between the earlier Sangiran skulls of Java ('\JJava Man\j,' believed to be about 1 million years old) and the earliest \IHomo sapiens\i finds from the region. After Swisher's work, scientists were left wondering how to deal with a scenario which has \IHomo erectus\i and \IHomo sapiens\i living side by side over long periods of time. Most of them treated the matter cautiously, saying only that more investigation would be needed before they would commit themselves.
A report in the \IJournal of Human Evolution\i in mid-July (Vol. 39, No. 1, July 2000) may have part of the answer, but the out-of-Africa group will not like it. Working on Australian material provided by Alan Thorne of the Australian National University, a group which includes Milford Wolpoff has found close similarities between the Australian material and the Ngandong specimens. (It is worth noting here that both Thorne and Wolpoff entirely reject the out-of-Africa theory.)
It is important to realize that \Iany\i hominids found in Australia must be modern humans capable of building a boat that could make an 80 km (50 mile) sea voyage, even when sea levels fall, because there has always been a gap this large separating Asia from Australia. So if Australian finds are similar to Ngandong humans, this suggests that Ngandong humans were fully as human as we are. In other words, it begins to look as though any replacement theory of human origins has a few problems.
In simple terms, the ancestry of a single modern human specimen from Australia, known as WLH-50, can tell us a great deal because the replacement model and the multiregional evolution model make different predictions for the ancestry of this specimen and others like it.
The group tested these predictions using two methods described in the technical language of research as a discriminant analysis of metric data for three samples that are potential ancestors of WLH-50 (Ngandong, Late Pleistocene Africans, and Middle Eastern hominids from Skhul and Qafzeh) and a pairwise difference analysis of nonmetric data for individuals within these samples.
The key finding that they report is that the " . . . results of these procedures provide an unambiguous refutation of a model of complete replacement within this region, and indicate that the Ngandong hominids or a population like them may have contributed significantly to the ancestry of WLH-50." As a result, they believe that the Ngandong hominids should now be classified within the evolutionary species \IHomo sapiens\i.
Key names: John Hawks, Stephen Oh, Keith Hunley, Seth Dobson, Graciela Cabana, Praveen Dayalu and Milford H. Wolpoff.
#
"The moon's dust clouds",1583,0,0,0
(Jul '00)
A small cloud of dust hangs suspended less than a meter above the moon's surface. It was first photographed by the Lunar Surveyor spacecraft in the 1960s and later observed at close range by Apollo astronauts. For the past 30 years, it has been a minor puzzle to some planetary scientists, but the standard explanation now appears to have been confirmed, according to a report in \IPhysical Review Letters\i on June 26.
There is no air on the moon, and there is no known antigravity effect, so that pretty much leaves some sort of electrical or magnetic effect. And unless the dust cloud is made of magnetic monopoles (which are as hard to find as antigravity), that leaves only electrical effects. Magnetic particles, rather than remaining suspended, would sooner or later flip over and be drawn down to the surface.
Now the problem becomes one of explaining what would give the moon's surface, and the dust, the same charge. The most likely candidate was the photoelectric effect. Researchers have assumed the lunar dust levitation is caused by ultraviolet photons from the sun ejecting electrons from isolated grains of dust, giving each a positive charge. The same radiation also is thought to be knocking electrons off the moon's surface rocks, causing the electrons to bounce upward and negatively charge dust grains near the surface.
In the end, the negatively charged dust particles fall back to the surface, while a small belt of positively charged particles floats above the surface at the point where the downward force due to gravity and the upward force from the charge balance each other.
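The balance point is simply the place where the upward electrostatic force qE equals the grain's weight mg. The Python sketch below uses assumed order-of-magnitude values for the grain size, density and electric field - none of them figures from the experiment - just to show that the charges involved are modest.

import math

# Force-balance sketch for a levitated lunar dust grain: the grain
# floats where the upward electrostatic force q*E equals its weight m*g.
# Grain size, density and sheath field strength are assumed,
# order-of-magnitude values, not figures from the experiment.

G_MOON = 1.62            # lunar surface gravity, m/s^2
E_FIELD = 10.0           # assumed near-surface electric field, V/m
RADIUS = 0.5e-6          # assumed grain radius, m
DENSITY = 3000.0         # assumed grain density, kg/m^3
ELEM_CHARGE = 1.602e-19  # coulombs

mass = (4.0 / 3.0) * math.pi * RADIUS**3 * DENSITY
charge_needed = mass * G_MOON / E_FIELD        # from q*E = m*g
electrons = charge_needed / ELEM_CHARGE

print(f"grain mass {mass:.2e} kg needs about {electrons:.0f} elementary charges")
# A micron-scale grain needs on the order of a thousand elementary
# charges to float - a plausible photoelectric charge.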
So much for theory. In practice, you need an experiment to see if you can create the effect. Researchers dropped individual pepper-grain-sized particles of zinc, copper and graphite through an evacuated chamber that was illuminated by UV light from an arc lamp. The grains fell about a foot into a device called a Faraday cup, which directly measured the electrical charge of each grain, and the charges were found to be close to what theory had predicted. Next, they put a zirconium plate near the falling grains to simulate the effect of moon rocks emitting photoelectrons, and this produced negatively charged dust particles, as expected.
This appears to be the first time anybody has actually confirmed what many have believed to be highly probable, and the result may be important in future space exploration, since this odd dust phenomenon may also occur on asteroids, the rings of planets and even around spacecraft. According to one of the team, Amanda Sickafoose, "By understanding how and why these dust particles charge, scientists can find ways to better protect telescope lenses, spacecraft instruments and astronauts from the negative effects of charged space dust."
Key Names: Amanda Sickafoose, Scott Robertson, Joshua Colwell, Mihaly Horanyi and Bob Walch
#
"Galileo to crash into Jupiter",1584,0,0,0
(Jul '00)
What do we do with old spacecraft? If they are sufficiently controllable and close enough, we can bring them down to earth in a safe place, with most of the material burning up on the way in (see \JFarewell to Compton\j, June 2000).
The Galileo spacecraft is more of a problem. It is too far away to ever return to Earth, and some people have urged NASA to leave Galileo in orbit around Jupiter for the benefit of future space archeologists. There's just one problem: the space agency does not want to risk the spacecraft hitting any of the planet's three large moons that could harbor extraterrestrial life, or any of the other moons, for that matter, just in case.
The likelihood of damaging life on one of the three large moons is remote, given the size of even the smaller moons, and given the heat that will be generated when a fast-moving spacecraft hits even a smallish moon under free fall with no atmosphere to slow it down. But if there is even the smallest, remotest possibility of risk to alien life forms, then it is better to drop the spacecraft into the gravity well that is Jupiter, where the craft will be ripped apart and destroyed well before it reaches any possible life forms.
While Galileo could be placed in a safe orbit, there can be no guarantee that a passing body will not pull or push the craft from that orbit. Some astronomers believe that Halley's Comet may fail to appear in 2062 because there was a small outgassing seen from the comet after it passed Jupiter on its outward journey after the 1986 visit. While the impact was probably small, and the reaction to the outgassing was also small, it could be enough to jog the comet out of its elliptical orbit. A similar principle could send Galileo cartwheeling around the planet and, worst of all, plunging down to hit the surface of Europa, generally regarded as the worst possible scenario.
Scientists believe an ocean sloshes around below the ice cap that covers the moon's surface, and any microbes stowed away aboard Galileo since its 1989 launch could contaminate that ocean. So rather than take that risk, the (US) National Research Council's Space Studies Board has seconded NASA's preliminary plans to re-route the spacecraft so it impacts Jupiter, possibly as early as December 2002.
Before then, NASA will try to get as much science as possible out of the aging spacecraft, including a plan to fly past two of Jupiter's moons, Io and Amalthea, which could delay the spacecraft's destruction to 2003 or even 2004. But in the end, Galileo will have a fiery death in the atmosphere of Jupiter. That, say scientists, should deal with any tenacious stowaways.
#
"The Durban Declaration",1585,0,0,0
(Jul '00)
The mischief stirred up by the self-styled "AIDS skeptics" (see \JSouth African AIDS panel meets\j, April 2000) has met with a powerful response from the medical community as the XIII International AIDS Conference got under way in Durban. While South African President Thabo Mbeki continues to assert that there is a need for an African solution to the African problem and that AIDS is caused, not by HIV, but by poverty, the Durban Declaration has been posted, asserting that the world's acknowledged medical experts agree that HIV is the cause of AIDS and setting out the reasons for that belief.
According to \INature\i, which has published a detailed case for the HIV/AIDS link, outlined below, the " . . . declaration has been signed by over 5000 people, including Nobel prizewinners, directors of leading research institutions, scientific academies and medical societies, notably the US National Academy of Sciences, the US Institute of Medicine, the Max Planck institutes, the European Molecular Biology Organization, the Pasteur Institute in Paris, the Royal Society of London, the AIDS Society of India and the National Institute of Virology in South Africa. In addition, thousands of individual scientists and doctors have signed, including many from the countries bearing the greatest burden of the epidemic. Signatories are of M.D. and Ph.D. level or equivalent, although scientists working for commercial companies were asked not to sign."
By the middle of the conference week, there were 5195 signatories from 83 countries, all acknowledged experts in their fields; but science has never been a matter of democracy and popular votes, so what is going on? In simple terms, people who deal with sick and dying patients every day have had enough. They say that the actions and posturings of a vocal minority who continue to deny the evidence will cost countless lives. And so they have stood forward and put their names to a simple straightforward display of the facts, posted on the Web, at http://www.durbandeclaration.org/
The list includes not only the leaders of the medical community, but the people who are working to stop a disease which is now sweeping Africa. Thabo Mbeki is right in one respect: it is a peculiarly African problem, even though it is now loose in many other countries, and poverty does indeed play its part in supporting the spread of HIV. But there can be no denying the facts: there is not a single person with AIDS who does not show signs of HIV infection, and only a very, very few people who are infected with HIV avoid developing AIDS eventually.
The human immunodeficiency virus was found in 1983 and, according to the declaration, an estimated 34.3 million people worldwide are living with HIV or AIDS, 24.5 million of them in sub-Saharan Africa. In 1999 alone, 2.8 million people died of AIDS. Southern and Southeast Asia, South America and regions of the former Soviet Union are all expected to show huge increases in HIV/AIDS incidence in the next two decades.
The declaration stresses that AIDS spreads by infection, like many other diseases, such as tuberculosis and malaria and that, while both HIV-1 and HIV-2 first arose as zoonoses - infections transmitted from animals to humans - both forms now spread among humans through sexual contact; from mother to infant; and via contaminated blood. HIV-1 is responsible for the AIDS pandemic, and HIV-2 is prevalent in West Africa and has spread to Europe and India. HIV-2 is almost indistinguishable from a simian immunodeficiency virus (SIV) that infects sooty mangabey monkeys.
It is not unusual for diseases to cross the species barrier from animals to humans: bubonic plague came from rats and their fleas, influenzas from birds, and the new Nipah virus in Southeast Asia reached humans via pigs. Brucellosis comes from cattle, and the 'variant Creutzfeldt-Jakob disease' in the United Kingdom is identical to 'mad cow' disease.
All the same, the self-styled 'AIDS skeptics' deny the link between HIV and AIDS, even though the link is clear-cut, exhaustive and unambiguous, meeting the highest standards of science. The data fulfil exactly the same criteria as for other viral diseases, such as polio, measles and smallpox. In fact, they offer a textbook example of how the cause of a disease is identified. The following points are taken from the declaration, edited only by the addition of bullets for each point and the removal of the scholarly references found in the text:
* Patients with acquired immune deficiency syndrome, regardless of where they live, are infected with HIV.
* If not treated, most people with HIV infection show signs of AIDS within 5-10 years. HIV infection is identified in blood by detecting antibodies, gene sequences or viral isolation. These tests are as reliable as any used for detecting other virus infections.
* People who receive HIV-contaminated blood or blood products develop AIDS, whereas those who receive untainted or screened blood do not.
* Most children who develop AIDS are born to HIV-infected mothers. The higher the viral load in the mother, the greater the risk of the child becoming infected.
* In the laboratory, HIV infects the exact type of white blood cell (CD4 lymphocytes) that becomes depleted in people with AIDS.
* Drugs that block HIV replication in the test tube also reduce virus load in people and delay progression to AIDS. Where available, treatment has reduced AIDS mortality by more than 80%.
* Monkeys inoculated with cloned SIV DNA become infected and develop AIDS.
Your reporter can recall when the alarm bells first started ringing as a peculiar and rare cancer, Kaposi's sarcoma, suddenly became common. Today, people infected with HIV are more than 100 times more likely than uninfected people to develop Kaposi's sarcoma, which is linked to another virus, but which is clearly able to thrive when a patient's immune system is knocked out by HIV.
It is true that people who are malnourished, who already suffer other infections or who are older tend to be more susceptible to the rapid development of AIDS following HIV infection; however, many other disease organisms are more effective against such people, so none of these factors weakens the scientific evidence that HIV is the sole cause of the AIDS epidemic.
Right now, prevention is probably the most important step to take because the knowledge and tools to prevent infection are available. The sexual spread of HIV can be stopped by mutual monogamy, abstinence or by using condoms. Blood transmission can be prevented by screening blood products and by not reusing needles. Mother-to-child transmission can be reduced by half or more by short courses of antiviral drugs; however, right now, the costs of prevention are too high for many of the most heavily infected parts of the world.
The developing world needs new antiviral drugs that are easier to take, have fewer side effects and are much less expensive so that millions more can benefit from them. And perhaps, just perhaps, there may be some hope on the horizon: see \JA new HIV vaccine\j, this month.
The conclusion of the declaration is too important a message for it to be summarized. It deserves to be copied and displayed in doctors' waiting rooms, in classrooms, and anywhere else where impressionable minds may be found:
"There are many ways of communicating the vital information on HIV/AIDS, and what works best in one country may not be appropriate in another. But to tackle the disease, everyone must first understand that HIV is the enemy. Research, not myths, will lead to the development of more effective and cheaper treatments, and, it is hoped, a vaccine. But for now, emphasis must be placed on preventing sexual transmission.
"There is no end in sight to the AIDS pandemic. But, by working together, we have the power to reverse its tide. Science will one day triumph over AIDS, just as it did over smallpox. Curbing the spread of HIV will be the first step. Until then, reason, solidarity, political will and courage must be our partners."
Key names: there are 5195 of them so far, and the list is growing. See http://www.durbandeclaration.org/ for the list.
#
"A new HIV vaccine",1586,0,0,0
(Jul '00)
The XIII International AIDS Conference in Durban was told on July 11 of plans for human trials of the first AIDS vaccine candidate designed specifically for Africa. The International AIDS Vaccine Initiative (IAVI), Web site http://www.iavi.org/, says that the Medicines Control Agency (MCA) of the United Kingdom has approved Phase I testing of a DNA vaccine based on HIV subtype A, the most common strain in Kenya and in many other parts of Africa.
Volunteers will be recruited in Oxford during August and, pending local approvals, another trial is expected to follow in Nairobi three to six months later.
Key names: Andrew McMichael in Oxford and J. J. Bwayo in Nairobi. The Oxford/Nairobi Partnership is the first of four vaccine development partnerships funded by IAVI, which describes itself as ". . . a global scientific organization dedicated to accelerating the development of AIDS vaccines for use throughout the world."
#
"Protein biochips",1587,0,0,0
(Jul '00)
In late June, scientists at Indiana's Purdue University reported at a conference on nanoscience and nanotechnology that they have created the first protein 'biochips,' mating silicon computer chips with biological proteins. The idea is to have chips containing thousands of proteins in a device about the size of a handheld computer that could quickly and cheaply detect specific microbes, disease cells and harmful or therapeutic chemicals.
The biochip has proteins exposed on its surface. Each protein has a very specific shape or charge, which means that only a very few biological molecules will bind to it. If one of those molecules happens to be characteristic of the cell wall of a bacterium, any bacterial cell passing by is likely to attach to a matching protein on the chip and, when it does, the electrical properties of the chip alter in a predictable way. Other bacteria or molecules in the sample would not bind to the chip.
In other words, the protein biochip has the potential to detect just one bacterium in a sample, but that is just the beginning. Physicians could use devices containing biochips to quickly diagnose common diseases or to test the effectiveness of chemotherapy. Equally, soldiers might rely on sensors on the battlefield to sound the alarm in the event of a biological or chemical attack, while farmers could use biochips as sentinels in the fields, ready to alert them at the first sign of a disease.
Then again, medical scientists could use the biochips to investigate whether certain plants popular as folk remedies actually contain biochemicals that have useful biological activity and could develop new pharmaceutical products based on the findings.
Biochips are not new, of course: until now, they have mainly been used in the automated sequencing of genes, including the human genome. But as proteomics (see \JProteomics 101\j, this month) takes off, a whole new area of application is going to become important, because proteins are very specific about which other proteins or biochemicals they will interact with. The biochips reported this month are at best marginally related to proteomics right now, but there is always the possibility that these topics will come together later on.
Proteins match and bind in much the same way that a key fits a lock. The biochip is covered in 'keys' and, as random 'locks' stream or slosh past, any match will fit and cling. The first chip has the protein avidin attached to its surface. Avidin binds to a vitamin called biotin, and biotin molecules labeled with a fluorescent dye duly attached themselves to the avidin embedded on the biochip.
Proteins also bond to each other in very specific ways, so now that the first chip has been made and the first protein has been attached, it is only a matter of time, say the researchers, before protein biochips become everyday tools. The first use is already planned: the first non-laboratory application of the new biochips will be to develop sensors to detect the deadly pathogen \IListeria monocytogenes\i in foods.
The USA had some 2500 cases of infection with this disease in 1999 and, unlike other foodborne pathogens, a high number of cases - one out of five - of \IListeria monocytogenes\i infections are fatal, so better detection of this fatal food pathogen is a high priority for the food industry. "The problem is, however, that at the present time we can only detect the pathogen if we have a large sample," says Arun Bhunia, associate professor of food science at Purdue. "To get a large number, you have to let the bacterium grow in a laboratory. You typically don't see levels that high in a food system," he says. "It can take as much as five to seven days to grow, test and confirm the presence of a specific pathogen."
The answer, say the researchers, is to build a biochip containing antibodies to \IListeria monocytogenes\i obtained from rabbits or mice. These antibodies are themselves proteins that organisms use to recognize and disable harmful proteins by bonding to them. Only \IListeria monocytogenes\i could interact with the antibodies on the chip, so a definite determination of the absence or presence of the bacterium in a sample could be made within minutes.
From time to time, a key to a new lock may be found on an old and slightly rusty bunch of keys, a key which fits by chance, even though it was made for an entirely different lock. There are, after all, only so many designs for a key and a lock, so there will always be a small risk of a 'false positive' from a diagnostic biochip. This sort of risk can be avoided by looking for two or three proteins at the same time, but the same risk can also be harnessed to do some good.
For example, there are millions of species out there, all containing huge numbers of proteins, and some of these will be biologically active: they will attach to proteins in us, or in disease organisms. The problem is screening for them and locating them. "The real bottleneck in biological research is the lack of a way to quickly interrogate the chemistry of various organisms to find out if they contain any beneficial or harmful compounds," Michael Ladisch says.
The use of penicillin, derived from the \IPenicillium\i fungus, was discovered by chance and, up until now, says Ladisch, we have been uncovering the actual proteins or molecules at the rate of just a few a year. This development has the potential to increase that number several-fold: "What we would have would be a high-tech litmus paper. It would tell us the presence of molecules with specific properties and the concentrations. There are a lot of secrets still being held by Mother Nature. Maybe this will allow us to probe for some of the more obvious ones," he said.
The research question that scientists will answer with these first biochips will be along the lines of "is component x, or something with similar properties, present in this sample?" Later perhaps, the biochips may be able to answer a genuine question from proteomics, which is more in the form "what proteins are found in only one of these two samples of the same tissue, one of them diseased (or cancerous)?" where the aim is to find targets for further work.
Michael Ladisch's professional Web page: http://fairway.ecn.purdue.edu/IIES/Faculty_and_Staff/MLadisch.html
Key names: Michael Ladisch, Rashid Bashir, Rafael Gomez
#
"Proteomics 101",1588,0,0,0
(Jul '00)
By now, most people know vaguely what a genome is and have at least heard of genomics, even if they are unsure what it means. Our readers, at least, should know the terms, for more than 100 entries on the World Encyclopedia mention 'genome,' while more than a dozen entries discuss the fine detail of genomics, but we will start by defining those terms.
The \Bgenome\b is the entire genetic complement of an organism, the entire amount of DNA in the cell. Every somatic, or body, cell has a full set of all of the DNA so, even though not all of that DNA is expressed by any given cell, the whole genome is still found in each cell. \BGenomics\b is the study and application of knowledge of the genome, the application of the information that is stored within the genes. By studying the genetic blueprint encoded in an organism's DNA, scientists can search for the biological cause of a disease, often a mutation in a gene, which gives them an insight into possible ways of curing the disease.
And so to the proteome and proteomics. The term 'proteome' was coined by an Australian scientist, Marc Wilkins, in 1994 when he was completing a Ph.D. at Macquarie University in Sydney under the supervision of Keith Williams. The word 'proteomics,' also coined in 1994 by Wilkins, now a lead scientist at PSL, ". . . describes the study and application of PROTEins expressed by a genOME," according to Williams.
The word has come to mean something more like 'large scale analysis of proteins within a single experiment or series of experiments.' In other words, proteomics involves taking snapshots of the proteins found in a cell and drawing conclusions about the processes taking place, irrespective of what the genes may be doing at the time. In this form of the term, the proteome is the set of proteins at work in any particular tissue. This makes sense, because the proteins in different tissues do not interact with each other to any great extent, so a bit of reductionism will serve to cut down on the number of variables in play.
By 1995, Proteome Inc was founded in the US and, by 1998, specialist biotech firms were using the word among themselves. Proteome Systems Limited was formed in Sydney in early 1999. By March 1999, Proteome Inc had a proteomics-related US patent (5882495), and Proteome Sciences (UK) went public in April 1999. Yet, before March 2000, when Craig Venter at Celera announced his entry into the next big race, proteomics, hardly anybody took any real notice of the term.
Venter is putting a lot of money into this one, to the tune of $US 960 million, he says, but even that did not really make much of a splash. Early July seems to be the time when the notion exited from Boffinland into the real world, and now headlines around the world feature the terms 'proteome' and 'proteomics,' and any decent search engine will return 10,000 hits on 'proteomics' in early July 2000.
One delightful analogy for the new science is looking at the switchboard of a large department store, where the lights flicker on and off in an apparently meaningless way, but where a particular cluster that keeps lighting up actually corresponds to high sales in the hat department. At this point, we enter the realm of bioinformatics, described recently (see \JThe Icelandic Healthcare Database, ethics and bioinformatics\j, June 2000, for pointers to the other articles). Right now, this sort of analysis is being thrown at genomic data, but in the future, it may apply to proteomics as well. It is very hard to predict where a new science will go.
In an interview, Wilkins explained that, right now, proteins are separated by gel electrophoresis. This is a bit like the technique of paper chromatography that is usually used in school science, but it uses a gel and an electric charge to separate the proteins.
With paper chromatography, there are two main factors: the attraction between the chemical(s) and the paper they are travelling through, and the attraction between the solvent and the chemical(s). In gel electrophoresis, there are three important factors: the charge on the protein (its pI value), the mass of the protein molecule, and the ease with which the protein dissolves in water (its hydrophobicity).
With paper chromatography, chemicals which travel together in one solvent can often be separated by collecting the mixture and separating at right angles to the first run with a second solvent, or even on a new sheet. This flexibility is missing in gel electrophoresis. Worse, there may be 4000 proteins in the sample and, inevitably, some of these will travel together, but some degree of separation is possible with different solvents using the hydrophobicity of the different proteins.
After that, a mass spectrometer comes into play, reporting on the peaks that are found in comparable samples. This is where it is important to remember that the key question is "how do these two samples differ?", because that difference probably indicates some way of diagnosing a disease, of working out how it will develop (its prognosis), or even of spotting a target protein that can be worked on. So even though there may be many peaks in a sample, it is the few that are different which need to be looked at.
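As a rough illustration of that 'spot the difference' step, here is a small Python sketch: it takes two lists of mass-spectrometer peaks and reports only those present in one sample but not the other. The peak values and the matching tolerance are invented for illustration.

    def differing_peaks(sample_a, sample_b, tolerance=0.5):
        """Return the peaks found in one sample but not the other."""
        def unmatched(peaks, others):
            return [p for p in peaks
                    if not any(abs(p - q) <= tolerance for q in others)]
        return unmatched(sample_a, sample_b), unmatched(sample_b, sample_a)

    healthy = [412.2, 733.9, 1204.6, 1888.3]
    diseased = [412.2, 733.9, 1204.6, 1530.7, 1888.3]
    only_healthy, only_diseased = differing_peaks(healthy, diseased)
    print(only_diseased)   # [1530.7] -- a candidate marker worth a closer look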
For the moment, says Wilkins, biochips are not realistic, mainly because the results that they give are not reproducible, which is the absolute requirement for all experimental data.
The proteome is a result of the genome, but it is impossible to predict the proteome from the genome because there are just too many complex factors involved in the dynamic production of the proteome from the genome. Thus, proteomics just looks at the tens of thousands of proteins in a cell, ignoring what is happening at the DNA (genome) level. What that means is that proteomics gets directly at the total protein profile of a cell, which is essential for understanding the molecular basis of life.
The proteome is in constant change as the proteins interact with the cell, with each other, and with the genome, but it is this shifting population of proteins that provides the truest picture of what is happening inside the cell, unlike the genome, which remains very much the same from moment to moment. Proteins are modified, activated, deactivated, cleaved or degraded, all in the regular course of the cell responding to stimuli or an infection. Proteomics measures all these changes directly, whereas genomics is largely blind to them.
This story announces the birth of an infant science after a gestation of six years. We will be watching how it develops with great interest.
#
"ASM Issues Statement On Genetically Modified Organisms",1589,0,0,0
(Jul '00)
\IThe American Society for Microbiology (ASM) has issued this statement on genetically modified organisms. It is reproduced here in full, with the society's permission.\i
In recent months, public understanding of biotechnology has been challenged by controversy concerning genetically modified organisms. The public has been confronted with charges and counter-charges regarding the risks and benefits associated with using biotechnology to produce quality food in quantity.
Since biotechnology enables well-characterized genes to be transferred from one organism to another with greater precision and predictability than is possible using traditional breeding procedures, the American Society for Microbiology (ASM) is sufficiently convinced to assure the public that plant varieties and products created with biotechnology have the potential of improved nutrition, better taste and longer shelf-life.
Nothing in life is totally free of risk. However, to minimize risk, it is important to rely on fact rather than on fear, and the ASM is not aware of any acceptable evidence that food produced with biotechnology and subject to FDA oversight constitutes high risk or is unsafe.
Rather, plant varieties created with biotechnology are grown more efficiently and economically than traditional crops. This eventually should result in a more nutritious product at less cost to the consumer as well as to reduced pesticide use and greater environmental protection.
Those who resist the advance of biotechnology must address how otherwise to feed and care for the health of a rapidly growing global population forecast to increase at a rate of nearly 90 million people per year.
However, a continued expression of public concern at the current level should be understood by federal agencies as reason to support more research, and to improve the quality and public accessibility of information on the regulation of products of biotechnology.
The ASM, which represents over 42,000 microbiologists worldwide, has special interest in issues and policies related to biotechnology research and development.
The ASM includes scientists working in academic, governmental and industrial institutions with expertise in medical microbiology and infectious diseases, molecular biology and genetics, environmental microbiology, agricultural and industrial microbiology, including the microbiology of food.
ASM members pioneered molecular genetics and were principals in the discovery and application of recombinant DNA procedures which have advanced biotechnology's prominence. Moreover, ASM members have for several decades participated in discussions concerning biotechnology before federal agencies and Congress.
The ASM methodically reviews safety issues associated with biotechnology and its applications to assure that oversight and regulation are consistent with current scientific principles and practices.
The ASM has long held the position that oversight and regulation should be based on the risk associated with products of biotechnology, and not on the processes used to create or produce these products.
This is necessary not only to protect public health and the environment, but also to encourage continued biotechnological research and development which is in the national interest, and in the interests of the health and welfare of people worldwide.
Indeed, the Food and Drug Administration (FDA) is to be commended for constructing a framework for safety evaluation that is product-based, and for taking the position "that the critical consideration in evaluating the safety of (bioengineered) foods should be the objective characteristics of the food product or its components rather than the fact that new development methods were used."
Although the public appears to recognize a direct personal benefit from applications of biotechnology in medicine, it remains skeptical that similar benefit will result from applications of biotechnology in agriculture and the environment.
It is imperative that an understanding of the rigor of oversight of the science-based regulatory systems used by the FDA, the Department of Agriculture (USDA) and the Environmental Protection Agency (EPA) to manage biotechnology is shared with consumers in this country.
A greater public awareness of our regulatory process is needed. Nevertheless, the ASM recognizes that for the public to feel secure, mandatory FDA assessment of the safety of genetically modified foods is warranted.
Scientifically based regulatory systems to identify and monitor potential adverse effects on human health and the environment need to be established in every country to ensure and promote public confidence in biotechnical advances.
Fear of the unfamiliar has created a clamor for labeling of genetically modified products. The ASM believes that labeling on the basis of process is not scientifically warranted.
Genetic modification has long been used to enhance the production of plants and animals for food. Indeed, it is doubtful that there exists any agriculturally-important product that can be labeled as not genetically modified by traditional breeding procedures or otherwise.
Biotechnology as practiced in agriculture today is part of a continuum of ever-more-refined attempts to breed better plants and animals for food or show.
Plant and animal genomic research are legitimate areas for public funding, and they deserve increased attention and support from federal and state funding agencies. Because of the improved precision and predictability of biotechnology, it can be anticipated that in the future, food will be more, rather than less, safe.
Food labeling is justified if it identifies real risk and provides information for the safety of consumers. To label a product only because it is genetically modified would be punitive.
Moreover, labeling will probably impose significant costs to farmers and others who would have to separate genetically modified from non-genetically modified products in the field, during processing and in the marketplace. This increased cost ultimately would be borne by the consumer.
Since there are no simple, inexpensive procedures to differentiate genetically modified from non-genetically modified products, a requirement to label would invite deception, and be exceedingly difficult and costly to regulate.
The ASM is strongly involved in programs of science education for the public, and encourages its membership to strive collectively and individually to increase public understanding of biotechnology.
It is important for the public to comprehend just what biotechnology and genetic modification entail as well as their history and various applications in agriculture, the environment and medicine.
Familiarity will diminish fear, and reliable and responsible knowledge will result in informed choice.
#
"The Chandler Wobble solved",1590,0,0,0
(Jul '00)
The August 1 issue of \IGeophysical Research Letters\i is to reveal the force that maintains the Chandler Wobble, a small mystery that has been dogging researchers since 1891. That was the year in which Seth Carlo Chandler, Jr., an American businessman turned astronomer, discovered this wobble, one of several wobbling motions exhibited by the Earth as it rotates on its axis, much as a top wobbles as it spins.
This is not the sort of wobble that will shake the earth apart, because it only involves a movement of about 6 meters (20 feet) at the North Pole. It takes about 1.2 years, or around 433 days, to complete a full cycle, but it is an interesting wobble because it should have disappeared.
The thing about periodic vibrations is that they are commonly 'damped,' reduced by friction and other effects, and calculations have shown that the Chandler wobble should have completely died out in just 68 years. So much for theory: the wobble is still there, so clearly some force is driving it. Even back in Chandler's time, it was clear that 68 years was a small period in terms of geological time, but the driving force had not been identified until now.
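To see what 'damped' means in practice, here is a toy calculation in Python. The decay constant is an assumption chosen so that the amplitude falls to 1% of its starting value in 68 years; the real figure comes from detailed geophysical modelling, not from this sketch.

    import math

    # Assume 'died out' means the amplitude has fallen to 1% of its
    # starting value after 68 years; the decay time is then 68 / ln(100).
    tau_years = 68 / math.log(100)
    start_amplitude_m = 6.0   # polar motion of about 6 meters

    for t in (0, 17, 34, 51, 68):
        amplitude = start_amplitude_m * math.exp(-t / tau_years)
        print(f"year {t:2d}: amplitude ~ {amplitude:.2f} m")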
Over the years, various hypotheses have been put forward to account for the wobble, including atmospheric phenomena, continental water storage (changes in snow cover, river runoff, lake levels, or reservoir capacities), interaction at the boundary of Earth's core and its surrounding mantle, and earthquakes.
Now Richard S. Gross of NASA's Jet Propulsion Laboratory has found the answer. He writes that the principal cause of the Chandler wobble is fluctuating pressure on the bottom of the ocean, caused by temperature and salinity changes and wind-driven changes in the circulation of the ocean currents.
So how do you measure such effects over the long time-scales involved? Gross has applied numerical models of the oceans, which have only recently become available through the work of other researchers, to data on the Chandler wobble obtained during the years 1985 through 1995. He calculates that two-thirds of the Chandler wobble is caused by ocean-bottom pressure changes and the remaining one-third by fluctuations in atmospheric pressure, adding that the effect of atmospheric winds and ocean currents on the wobble was minor.
#
"Cancer causes",1591,0,0,0
(Jul '00)
The \INew England Journal of Medicine\i for July 13 has a challenging editorial on the causes of cancer: are cancers the result of environmental factors, our genes, or both? In the past, the debate has run back and forth, with geographic differences, trends over time in the risk of cancer, and detailed studies of migrant populations being cited to demonstrate that environmental exposures are major causal factors. As well, medical science has identified many of the responsible carcinogens, including tobacco, alcohol, radiation, occupational toxins, infections, diet, and drugs.
So strong has the evidence been that it is often estimated that 80 to 90 percent of human cancer is due to environmental factors. Against this, in the past 15 years, the explosion of molecular genetics has revealed the genetic mechanisms underlying cancer. Now there is a great deal of confusion about the environmental and genetic risk factors for cancer.
Enter Robert N. Hoover, M.D., of the (US) National Cancer Institute, a guest editorial writer, who says the confusion is not surprising. A major twin study, cited in the same issue of the journal ("Environmental and Heritable Factors in the Causation of Cancer -- Analyses of Cohorts of Twins from Sweden, Denmark, and Finland," by P. Lichtenstein and Others) was the starting point for his comments.
The study looked at cancer incidence in unrelated people, fraternal twins, and identical twins, and revealed that there was " . . . an increased risk among the twins of affected persons for stomach, colorectal, lung, breast, and prostate cancer. Statistically significant effects of heritable factors were observed for prostate cancer (42 percent of the risk may be explained by heritable factors; 95 percent confidence interval, 29 to 50 percent), colorectal cancer (35 percent; 95 percent confidence interval, 10 to 48 percent), and breast cancer (27 percent; 95 percent confidence interval, 4 to 41 percent)."
The conclusion of the study was that: "Inherited genetic factors make a minor contribution to susceptibility to most types of neoplasms. This finding indicates that the environment has the principal role in causing sporadic cancer. The relatively large effect of heritability in cancer at a few sites (such as prostate and colorectal cancer) suggests major gaps in our knowledge of the genetics of cancer."
Hoover comments that the study is population-based, the outcomes are derived from complete data on incidence, and the size of the population studied is four times as great as in any previous effort. At the same time, he says, it has flaws: it lacks data on specific types of exposure such as tobacco use, so issues of interactions between genes and environment cannot be addressed, and it lacks data on screening methods. The study is huge, involving some 10,000 cancers in 90,000 Scandinavian twins, but it almost seems as though this very size has locked out some valuable lines of inquiry because of the statistical methods used.
But even if the study has limitations, it still has value. It shows that, in general, environmental factors were the dominant determinants of the site-specific risk of cancer, at around 65%, while estimates of the proportion of susceptibility that was due to environmental factors were generally even higher for cancer at the six next most frequent sites studied. Hoover notes that these estimates are less precise, but they are consistent with other studies. For example, argues Hoover, rates of breast cancer among women who have recently immigrated to the United States from rural Asia are similar to those in their homelands and about 80 percent lower than the rates among third-generation Asian-American women, who have rates similar to or higher than those among white women in the United States. The new study estimates a 73% environmental causation.
Like the other common cancers, prostate cancer levels vary from nation to nation, but the risk among migrant groups tends to rise toward the level in the adopted country over several generations. This indicates a substantial environmental component of the risk of this cancer, but, unlike most of the other common cancers, it is hard to pin down risk factors for prostate cancer in people's exposures and life styles. Perhaps, argues Hoover, prostate cancer does have a greater heritable component than cancer at these other sites. If some of the inherited factors are involved in modifying the risk associated with environmental factors, then success in identifying these two kinds of influences may depend on direct exploration of interactions between genes and environment.
The main message, according to Hoover, is that the fatalism of the general public about the inevitability of genetic effects should be easily dispelled. Even in identical twins, the rate of match-up is less than 15%. That is, there is a low absolute probability that a cancer will develop in a person whose identical twin - a person with an identical genome and many similar exposures - has the same type of cancer. Thus, the medical profession finds it impossible to predict accurately who will contract a disease and who will not.
This is hardly surprising, says Hoover, because what we know about the risk of second primary cancers in paired organs tells us exactly the same thing. A woman who has developed a primary cancer in one breast has only a 0.8% chance of developing a primary cancer in the other breast, and here we are considering a person with, obviously, not only the identical genome, but also the identical complex of exposures.
Perhaps, concludes Hoover, the answer is to drop the competition implied by talking about a debate over nature versus nurture in favor of efforts to exploit every opportunity to identify and manipulate both environmental and genetic risk factors to improve the control of cancer. Our knowledge of genetic components may help us to identify previously unrecognized environmental risk factors, and information about types of environmental exposure that affect the risk of cancer could point to genes that might modify this risk.
#
"Edible vaccine scores a win",1592,0,0,0
(Jul '00)
The first successful case of a virus vaccine engineered into a plant has been reported in the July issue of the \IJournal of Infectious Diseases\i. The vaccine targets the pervasive Norwalk virus -- the leading cause of food-borne illness in the United States and much of the developed world. The 'Norwalk agent,' as it is sometimes called, causes diarrhea. It is also responsible for around 2% of all cases of waterborne disease caused by microorganisms, and about 30% of all adult cases of gastroenteritis.
Norwalk virus earned its name in 1968 when nearly 100 students in a Norwalk, Ohio, school simultaneously came down with nausea, vomiting, stomach cramps and diarrhea. It was not until four years later that scientists realized the cause was a virus, explaining the common alternative name of 'Norwalk agent'. Some 23 million people in the United States are infected annually by the Norwalk virus, or by Norwalk-like viruses, while around 79,000 cases result from \IE. coli\i contamination, 2,500 cases from listeriosis and 1.4 million cases from salmonella.
The vaccine has been engineered into a potato by scientists at two American universities (Cornell and Maryland) and the first successful human trials are also reported in the same paper. "This plant-based vaccine could be the first one readily accepted in the developed world. It's very exciting," says Charles Arntzen, president and chief executive of Cornell's Boyce Thompson Institute (BTI). "It's likely that in the United States, this Norwalk virus vaccine could easily be the first licensed product based on our plant biology research."
Arntzen and his colleagues previously conducted a successful clinical trial in triggering immune response in humans to the bacterium \IEscherichia coli\i through a transgenic potato vaccine (see \JEat up your greens\j, April 1998).
In April 1999, 20 volunteers ate two or three doses of BTI-developed transgenic raw potato containing the viral antigen. Tests showed that 19 of those who ate the transgenic potatoes developed an immune response to the Norwalk virus. One down, quite a few still to go . . .
#
"Spam in the news",1593,0,0,0
(Jul '00)
Once upon a time, Spam was a brand of canned meat, but then it became a cult joke for people who appreciated the Monty Python style of comedy. More recently, Spam became a dirty word among early users of the Internet, because 'Spam' had come to mean unsolicited junk e-mail. Now, as most of the developed world jumps onto the Internet and World Wide Web bandwagon, spam has lost its capital letter, and come to mean something which is a nuisance to large numbers of people.
The problem is that it is very cheap to send the same e-mail message to a very large list of e-mail addresses, and clever entrepreneurs have found ways of gathering e-mail addresses from lists, which they then sell to hucksters.
The idea is simple: a spammer obtains an e-mail address which will be used once and then discarded, and then sends out a message to, say, a million e-mail addresses. Most of the recipients will just ignore the e-mail, but if the spammer gets a 0.1% response, that is still 1000 live leads, and at no real cost. These leads call a phone number or write to an accommodation address, and they can then be contacted from the spammer's real business e-mail address.
Meanwhile, the rest of the recipients are left with a jammed mailbox, plus the costs in time and money of downloading the e-mail. If they respond to a 'please remove' address in the spam, they will probably just get themselves registered as a valid address. The cost of a mail-out campaign, maybe $1.00 per address for snail mail, has been borne by the recipient, and they are left fuming, without even the satisfaction of being able to place a leaflet in a recycling bin.
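The arithmetic of the two paragraphs above is easy to check, using only the figures quoted in the text:

    recipients = 1_000_000
    response_rate = 0.001                  # a 0.1% response rate
    print(recipients * response_rate)      # 1000.0 live leads, at almost no cost

    snail_mail_cost = 1.00                 # the quoted $1.00 per snail-mail address
    # the cost a paper campaign would incur, which the spammer avoids
    # and the recipients collectively absorb instead
    print(f"${recipients * snail_mail_cost:,.0f}")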
Owners of fax machines know the problem, with unsolicited advertising arriving in their offices overnight, or worse, during the busiest periods of the working day. Now a new form of spam has arisen in America, where AT&T customers with cellular phones equipped for text messages found they were getting text messages promoting services and accessories that they did not want.
There may be reason to hope, at least in America, and at least for fax-spam victims, as Representative Rush Holt (D-New Jersey) is currently drafting a bill similar to a law prohibiting unsolicited junk mail that will apply to fax machines. In all probability, though, the spammers will manage to stay a step ahead, as they seem to be able to step around the provisions of the current US laws.
There is a new bill, the Unsolicited Commercial Electronic Mail Act of 1999, which would require that an accurate return address be posted on unsolicited commercial e-mail, and would make it illegal for spammers to continue sending unsolicited junk mail after they've been warned by irked recipients or Internet service providers. However, it is reasonable to assume that the spammers' next step will be to begin sending e-mail from safe havens.
In the end, there is only one cure for spam, wherever it comes from: do not react, reply or respond in any way at all. Lacking feedback, the spammers will eventually lose heart and go away. Then all we will have left is the nuisance of people who spam us in meatspace, rather than in cyberspace, by conducting cellular phone discussions in public places.
Except in Campinas, Brazil, that is. Local laws just passed in this thriving city will have people sent from their movie seats, escorted out of libraries, or barred from classrooms if their cellular telephones ring. Warning signs are to be posted, with a hefty fine for failure to post the warning. It may be just in time: the total number of cellular phones in Brazil is expected to just about quadruple in three years to 58 million, meaning one phone for every three Brazilians.
Which makes Brazil sound like a good place to set up a cellular phone spamming operation, doesn't it?
See also: \JCybernews bits\j, June 1997, and \JFiltering Spam\j, November 1997.
#
"Scorpion venom can be good for you",1594,0,0,0
(Jul '00)
A report in the July 3 edition of the \IProceedings of the National Academy of Sciences\i described a new, synthesized chemical that affects cells similarly to scorpion venom, and which also can effectively suppress the immune system. The chemical could eventually be used to assist with organ transplants and to treat disorders like lupus, multiple sclerosis and rheumatoid arthritis in which the immune system causes the body to attack itself, according to the University of California at Irvine research team.
Because of its unusual action, the team believes that the chemical may well be without the severe side effects of existing anti-immune drugs. The chemical, called TRAM-34, binds to a channel located on T-cells in the immune system. TRAM-34 suppresses the activation of these T-cells and stops them increasing in numbers and performing their normal job of triggering immune responses to foreign objects.
This normal immune response protects the body against disease, but it also can cause the body to reject transplanted organs and, in autoimmune diseases, the response can induce the body to attack itself. Scorpion venom inactivates cell functions a little too enthusiastically, but the researchers used the venom's characteristics as an initial guide to develop a chemical with gentler, more therapeutic properties.
TRAM-34 suppresses T-cell function without affecting other biochemical processes in the cell, they say. Heike Wulff commented further: "If TRAM-34 proves effective in humans, we think it may be an effective way to keep the immune system from attacking itself in certain diseases or from rejecting transplanted organs."
It appears that TRAM-34 inhibits T-cells by blocking a cell membrane channel called IKCa1, the same channel that is blocked by the scorpion venom. This channel is found on all T-cells, and blocking the channel appears to curb the T-cell's ability to coordinate the body's attack against bacteria, viruses and other foreign bodies.
More importantly, tests show that TRAM-34 did not block the action of other important enzymes in the body, suggesting that the chemical may not produce the side effects of other drugs either being used or currently in clinical trials for treating immune disorders and assisting organ transplants. The side effects from cyclosporin, already used in transplant surgery, and clotrimazole, an anti-fungal drug being tested for sickle cell anemia and considered for future tests as an immunosuppressant, include a variety of gastrointestinal and urinary system problems.
But if scorpion venom was a jump-off for the synthesis of TRAM-34, that chemical will now be the starting place for further chemical work, according to the team. Recently, part of the larger group at UC Irvine found that chemicals related to sea anemone toxin also have an effect on another T-cell channel and may be able to provide yet another group of treatments against immune system disorders.
Key names: Heike Wulff, Michael Cahalan, George Chandy, Mark Miller, Wolfram Haensel and Stephan Grissmer.
#
"How meiosis gets started",1595,0,0,0
(Jul '00)
Cells divide in two ways: by mitosis, which produces two 'daughter' cells with as many chromosomes as the original cell, and meiosis, which usually produces four cells with half as many chromosomes as the original cell. The two types of division appear to have many similarities, but research at the Salk Institute for Biological Studies, reported in \INature Genetics\i at the start of July, shows that the two types of division are triggered in entirely different ways.
The work was carried out on yeast, but looks likely to have a far wider application, including mammals such as humans. The program began with strains of fission yeast with mutations in a number of genes known to be necessary for mitosis, the sort of division which is constantly going on in our bodies wherever new cells are needed.
Because mitosis happens all the time, and meiosis is a special process that takes place only at certain times and locations, the researchers thought that, although there were bound to be some differences, it seemed reasonable to assume that the DNA replication was the same in both cases. This turns out not to be the case.
The mutant yeast strains are unable to duplicate their DNA properly at temperatures above 33 degrees Celsius (slightly below the normal human body temperature of 37 °C, or 98.6 °F). The researchers set out to discover whether or not the mutant forms could carry out meiosis at elevated temperatures, and found to their surprise that, while some mutants faltered, many had no difficulty in completing meiosis normally.
In other words, the mutants seem to fall into two categories. One group seems to have mutations in genes responsible for what we'd call the 'mechanical' aspects of copying DNA, the machinery of the cell that shapes new DNA chains, and these mutants could not complete meiosis. The conclusion is that this part of the process is the same in both types of cell division.
The other group of mutant yeasts seem to have a problem in getting mitosis started, but they have no problem at all in getting meiosis started. This suggested that there is some form of 'switch' to start each type of division and that, while the mutant mitosis switch was unable to operate, the mutant meiosis switch was working fine.
This is the point where researchers start to get excited because, if you can identify the switches in each case, you are suddenly a lot closer to explaining all sorts of biological problems, so it is hardly surprising that the researchers are now working to identify those switches. In crude terms, the Salk workers have hit a rather nice little jackpot.
The two types of division also showed differences in the ways cells check to ensure that the DNA is accurately copied. These oversight mechanisms are important in preventing cancers in higher animals and in developing healthy and complete eggs and sperm, so this line of inquiry may lead to new insights on such problems as birth defects and spontaneous abortion.
Key names: Susan L. Forsburg and Jeffrey A. Hodson.
#
"Ants and the state of the environment",1596,0,0,0
(Jul '00)
As we become more aware of the ways land can be degraded, and as we become aware of the need to protect the land, managers in industries such as mining, farming and forestry are finding a greater need to assess the damage done to the land. Yet while such assessment is clearly needed, the costs are high and can tip some marginal enterprises from making a profit to making a loss.
According to Alan Andersen of Australia's CSIRO, the giant government research group, the answer may be to let ants do most of the work, and reduce the humans' effort by as much as 90%. Andersen is attached to the Australian Tropical Savannas Cooperative Research Centre, where he is involved in the CRC's project on invertebrate biodiversity and bioindicators. He presented results from his new ant sampling technique at the annual Australian Entomological Society Conference in Darwin in late June.
He pointed out that ants already have a proven track record as bioindicators in the mining industry: ". . . what we've done now is worked out a sampling technique that gives the same results as a comprehensive survey, but takes just 10 per cent of the effort," he says.
Most importantly, the method ". . . requires far less specialist knowledge, so it's much simpler and therefore cheaper. It can be readily used by a range of land managers, such as those in the pastoral and forestry industries," according to Andersen.
He carried out his research at one of the world's largest lead, copper and silver mines, Mt Isa Mines, in remote northwest Queensland, where he investigated the impacts of sulfur dioxide (SO\D2\d) emissions on wildlife. The research showed that emissions had an effect up to 10 km (6 miles) downwind of the smelter.
The initial survey recorded 174 ant species, and analysis revealed that their patterns reflected what happened with plants and other animals. The most recent survey collected just 41 species, yet the story they told was a close match with the results from the initial survey.
For example, the area close to the smelter showed higher than usual population levels of a bird in the honeyeater family, the Yellow-throated Miner, possibly because this species is able to exploit the affected areas. The new ant sampling technique reflected this, with one common ant species being virtually restricted to areas impacted by SO\D2\d. Other common ant species occur mostly in unaffected areas, so it may be possible in the future to assess the ecological health of an area just by sampling the ants, given the right sort of database on what ants are found where.
#
"How genes jump",1597,0,0,0
(Jul '00)
In 1951, Barbara McClintock proposed something she called "controlling elements" to explain genetic patterns she had observed in corn. She was ahead of her time, and many geneticists were slow to appreciate the importance of McClintock's discovery. In the end, her work \Iwas\i recognized and, in 1983, it earned her a Nobel Prize.
Since then, other researchers have made considerable progress in understanding the molecular nature of what scientists now call transposable elements, and the public call "\Jjumping gene\js". The July 7 issue of \IScience\i has now given us the first clear snapshot of a jumping gene caught in midair.
To put this in the formal language of science, the article describes the three-dimensional molecular structure of an enzyme that allows a transposable genetic element in a bacterium to "jump" from one part of DNA to another. The structure of this protein-DNA complex gives researchers a new framework for understanding how transposable elements operate, and the authors speculate that the finding may speed up the search for new drugs to inhibit AIDS. This is because the human immunodeficiency virus-1 (HIV-1) uses a process similar to DNA transposition to insert itself into human DNA, and it relies on similar enzymes.
It would be a mistake to think that jumping genes are just a novelty found in corn. Scientists now estimate that transposable elements make up as much as 30 percent of the human genome, and transposable elements are an important source of the mutations on which natural selection operates right across the living world. The transposable elements have the potential to remodel genomes and to make it easier to move genetic information, such as antibiotic resistance, from one place to another.
The new discovery relates only to the three-dimensional structure of the \IEscherichia coli\i Tn5 transposase bound to the Tn5 transposable element, but it is a start, and an important first step. It also reaches far beyond the question of jumping genes, and to the heart of one likely method of controlling HIV.
There are enzymes that are called, with the standard logic of biochemistry, transposases, which make transposition possible, while enzymes called integrases catalyze similar events in retroviruses, including HIV-1. Researchers have looked at the catalytic core of five different transposases and integrases, and they show remarkable similarity. That is why this study is important, because a clear image of any one of them provides greater understanding of all the others that are similar.
Researchers believe that they may be able to treat HIV-1 infections more easily if they can find compounds that inhibit HIV-1 integrase. Since HIV-1 integrase and Tn5 transposase have similar structures, the researchers believe they now have a model system that can help scientists identify or design compounds that could be effective in controlling HIV-1.
The work began with Goryshin and Reznikoff isolating and purifying the transposase, after which Davies and Rayment developed the DNA-enzyme crystals and analyzed them by X-ray crystallography. Then all four worked on solving the structure of the complex.
Previous studies of these enzymes and their structures have homed in on the core region that cuts the element from DNA, rather than on identifying what the entire enzyme looks like or how it binds to and interacts with DNA, but this is exactly what the new work has achieved. So now we can begin to understand the process. Before transposition, one copy of Tn5 transposase binds to a specific region at one end of the transposon and a second copy binds to an identical region at the opposite end. As the DNA moves around, the two enzymes meet and link up. The two enzymes then cleave the opposite end of the transposable element DNA from its initial binding site. The Tn5-enzyme complex can then move freely before it inserts itself into a new location.
Key names: Ivan Rayment, Bill Reznikoff, Douglas Davies and Igor Goryshin.
See also: \JMcClintock, Barbara\j
#
"Switching off the appetite in mice",1598,0,0,0
(Jul '00)
As more and more people in the Third World starve, more and more people in the developed world are suffering from a condition almost as debilitating as starvation: they are obese. They have become fat and overweight from eating too much, and too much of the wrong foods at that. And while one can cruelly contrast the fatness of people in the developed world with the starvation of others in large parts of several continents, being overweight carries its own risks of increased heart disease, adult onset diabetes, and more. Increasing levels of obesity in the western world are a real problem, and one that needs answers.
The cause is simple: if there is food available and we are hungry, we eat. The answer might be almost as simple: stop people feeling hungry, or make sure that the food they crave is not readily available. The problem: we lack the will power to keep the food at a distance, and there seems to be no easy way of controlling hunger - or so we thought. Now, that may have changed.
A group of Johns Hopkins scientists have claimed in \IScience\i that they have produced a compound capable of rapidly turning off appetite in mice and causing weight loss similar in many ways to that achieved by fasting. Called C75, this substance is injected into the mice. It appears to be nontoxic, it wipes out the animals' interest in food within 20 minutes, and the effect wears off a few days after injections stop, at which time the mice resume normal feeding.
The molecule is a small member of a family of organic molecules called butyrolactones. It was created to inhibit fatty acid synthase (FAS), an enzyme the body uses to create fatty acids. Among other things, fatty acids are the building blocks for body fat. The pathway for creating fatty acids becomes active during times that an animal takes in excess food. The usual response to excess food is a decreased appetite, and the researchers suspect that C75 triggers this response artificially by regulating the pathway, decreasing the levels of a hormone called neuropeptide Y (NPY) and so lowering the appetite in the target mice. NPY has long been known as a major appetite regulator, acting in the appetite centers in the brain's hypothalamus.
Under normal circumstances, an animal produces NPY when it fasts, and the appetite then jumps sharply. But when C75 injections are given, NPY production drops sharply, suggesting that C75 works by blocking NPY production in the brain.
Weight losses as high as 30% were seen, most of which came from the mice stopping feeding, and even a moderate dose reduced food intake by 90% on the first day, with a return to normal feeding behavior over several days as the effects of the injection wore off.
One interesting side issue is that the C75 caused a dramatic weight drop in leptin-free mice, which are inclined to be obese, and the compound also reversed the insulin-resistant form of diabetes commonly seen in those mice. This suggests that there could be implications here for the treatment of Type II (adult onset) diabetes in the future.
According to a spokesman for the research group, Frank Kuhajda, "We are not claiming to have found the fabled weight-loss drug. What we have found, using C75, is a major pathway in the brain that the body uses naturally in regulating appetite at least in mice." In all probability, he argues, there will be a similar pathway in humans, but whether C75 has the same effect on humans remains to be seen.
What is really interesting and strange, they say, is that this pathway exists in the brain at all. It would make sense to find such a pathway in the body's fat or liver tissue, where fatty acids need to be synthesized, but not in brain cells. So when we realize that this cascade of reactions is found in the very brain cells that are known to control appetite, the find is less strange, but even more interesting.
However, the scientists believe that two things can be ruled out: C75 does not seem to be tied in with the wasting that occurs with cancer or infectious diseases, the search that brought them to this discovery. Secondly, there seem to be no tie-ins with leptins, the substances produced by fat tissue which affect appetite and which had the medical world excited a few years ago, sparking headlines and unrealized hopes (so far) for the perfect diet drug.
One final point of interest: the mice injected with C75 resembled fasting mice in the way they lost body tissue, but they were different in other ways. When an animal fasts, some muscle is lost as well as fat, and this was the case with C75; however, there were differences in the metabolism. Says Kuhajda, ". . . if you try to lose weight by starving, your metabolism slows down after a few days: It's a survival mechanism that sabotages many diets. We see this in fasting mice. Yet metabolic rate in the C75-treated mice doesn't slow at all."
In the long run, could C75 help to transfer food resources from the First World to the Third World? Sadly, the answer is 'probably not,' because even if the food is going begging, the resolve and the transport to move the food around the world is probably not there.
Key names: Thomas M. Loftus, M. Daniel Lane, Donna Jaworsky, Gabriele V. Ronnett, Gojeb Frehywot, Craig A. Townsend, and Frank Kuhajda.
#
"Prions and genes",1599,0,0,0
(Jul '00)
A note of caution: this report is not about established scientific facts, and it is not about a sudden breakthrough. It outlines a rather breathtaking speculation which appears to have some experimental support. If the speculation is right, the people supporting it have indicated where the evidence is likely to be found, and they suggest that it will change the way we think about evolution.
The process we hope to bring you lies at the very heart of what scientists think science is all about, though it is not often exposed to the public gaze. Yet if the public is to understand the odd glimpses it gets of this process, the process needs to be examined thoroughly in a case study.
According to scientists, science cannot survive and progress without speculation, and careful testing of the speculations. Many of them will be overthrown, ruled out by the evidence, but it may be interesting and instructive to follow this particular line, to see where it goes, and what it reveals - or how the speculation eventually falls. But be warned: we are out on a limb here, and the arguments we will be reporting will not always be pretty, because scientists can get quite excitable about issues like this.
Most scientists now accept the existence of prions, proteins that have two stable shapes they can fall into. A few years ago, many scientists scoffed at the idea that prions could be the cause of diseases. Now the argument and scoffing is about a different matter: do prions carry hereditary information?
Remember that prions are proteins, and everybody 'knows' that hereditary information is carried on DNA (or, for a few viruses, on RNA), and not on proteins. That much was demonstrated as far back as 1944. Yet in January 2000, Howard Hughes Medical Institute researchers Liming Li and Susan Lindquist at the University of Chicago performed a set of simple experiments, providing the best evidence yet that prions exist in yeast.
In 1994, Reed Wickner had suggested that two yeast phenotypes, called URE3 and PSI+, were the result of prions. These phenotypes were being transmitted in a non-Mendelian way, and he offered four genetic arguments to support his case:
* prions have a non-Mendelian mode of inheritance, and so do these phenotypes;
* the phenotype can be cured reversibly, and so can prionic 'infection';
* prion formation is induced by overexpression of the normal protein; and
* the phenotype of the prion resembles that of a mutation in the gene encoding the normal protein.
According to Wickner, nobody objected to this hypothesis, but at that time, most people had trouble agreeing that prions were real, so it may have just been a case of scientists not even considering the hypothesis, and so not bothering to object.
Since Li and Lindquist reported their findings, there has been a small flood of new data about fungal prions, including the identification of several new suspected prions and information about their regulation, and the case now seems to be stronger than ever before that prions are a form of inheritance. But first, it may help to recall how prions are thought to work, because this is central to the view of prions as part of heredity.
A protein is made of a chain of amino acids, but the chain itself is inactive. It only becomes a working protein when the chain coils up into a stable shape that is 'pinned' in place by weak links formed between different amino acids when they come near each other. The shapes that a particular protein can form are limited by the larger amino acids and also by the locations of the link-forming amino acids. In fact, it was accepted for many years that a particular sequence of amino acids could only form one shape, or one stable conformation.
Prion theory says that a prion protein has two conformations, which can be thought of as an active conformation, frequently soluble within the cell, and an inactive form, which is protease-resistant and found in insoluble aggregates. Note that this does not say that all proteins have two conformations, but that some of them, the prions, have two forms.
The notion of proteins having two separate forms is not all that surprising when you come down to it. What \Iis\i surprising is that, once one protein in a cell takes on its inactive conformation, it influences the rest of the molecules of the same species to adopt a similar conformation and to aggregate together as an insoluble mass. And worse still (for the organism the prions are found in), this influence is continued even after the cell undergoes division and new protein is made, or if the cytoplasm of two cells is mixed, as in mating.
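A toy simulation makes the self-propagating nature of this change clearer. In the Python sketch below, the population size, the single 'seed' prion, and the rule that each inactive protein converts one randomly chosen protein per round are all invented for illustration; real conversion kinetics are far more complicated.

    import random

    random.seed(1)
    population = ["active"] * 99 + ["inactive"]   # one misfolded 'seed' protein

    for generation in range(8):
        converted = population.count("inactive")
        # each inactive protein converts one randomly chosen protein;
        # some targets are already inactive, so growth slows near the end
        for _ in range(converted):
            population[random.randrange(len(population))] = "inactive"
        print(f"generation {generation}: {population.count('inactive')} inactive")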
So prions are able to pass on a heritable phenotype that is based on a change in protein conformation alone, with no underlying alteration in the DNA. In other words, prions can be part of heredity, though usually it is not a particularly desirable form of heredity. From what we know of prions at the moment, they are seen exclusively as the causes of diseases: scrapie in sheep, bovine spongiform encephalopathy (more commonly known as mad-cow disease) in cattle, and \JCreutzfeldt-Jacob disease\j (CJD) in humans, all caused by the mammalian PrP protein. But what if there were other prions, ones with much less drastic consequences for the carrier? Could these be present in all sorts of life forms, but so far hidden from our scrutiny?
Li and Lindquist's experiments showed that prion behavior is transferred to a distinct protein simply by fusing a prion-determining domain to it, and that the prion behavior was transmitted to all the progeny in a non-Mendelian fashion. More importantly, the prion-like behavior continued even after the DNA plasmid that Li and Lindquist had used to encode the fusion protein was removed from the cell. If the process continued in the plasmid's absence, then the change has to be directed by the protein itself; it has to be, at the very least, prion-like.
Several other prion candidates have since been identified, and cellular regulators of prion formation are beginning to show up. Either the years of effort are beginning to pay off, or people are now sufficiently interested that, when they find an interesting effect, they stay with it.
The key finding is that prionic changes are only semipermanent or metastable, showing low levels of spontaneous switching from the off state to the on state, or \Ivice versa\i. If we compare this with a DNA mutation, which is essentially permanent, we can see that, if organisms can combine these two effects, they may have both permanent and semipermanent ways to adapt to an environment over an evolutionary time scale.
According to the enthusiasts, prions are likely to be widely found in biology, but even they still believe that nucleic acids remain the major agents of inheritance. All the same, now that we are beginning to learn how to recognize and find prions, perhaps we will find that they are more common than we used to think.
See also: \Jprion disease\j, \JPrions -- how they work\j, and \JYeast prion model\j.
#
"Chandra captures flare from brown dwarf star",1600,0,0,0
(Jul '00)
A \Jbrown dwarf\j is a body that has too little mass to sustain a significant nuclear reaction in its core. Brown dwarfs get most of their energy from the release of gravitational energy as they slowly contract, and they are very dim, less than 0.1% as luminous as the sun. Yet these dim objects are of great interest to astronomers because they are poorly understood. They are also probably a very common class of objects, intermediate between normal stars and giant planets.
In the middle of last December, NASA's Chandra X-ray Observatory detected a first: an X-ray flare seen coming from a brown dwarf, or failed star. This bright X-ray flare has implications for how we understand the activity and origin of magnetic fields in extremely low-mass stars, according to a report in the July 20 issue of the \IAstrophysical Journal Letters\i.
LP 944-20 is estimated to be about 500 million years old and has a mass that is about 60 times that of Jupiter, or 6 percent of the sun's mass. For fusion reactions to occur, the temperature in a star's core must reach at least three million kelvin. Because core temperature rises with gravitational pressure, the star must have a minimum mass, estimated to be about 75 times the mass of the planet Jupiter, or about 7 percent of the mass of our sun. The diameter of LP 944-20 is about one-tenth that of the sun and it has a rotation period of less than five hours. Located in the constellation Fornax in the southern skies, LP 944-20 is one of the best studied brown dwarfs because it is only 16 light years from Earth.
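Those mass figures are easy to cross-check, assuming the usual round value of about 1,050 Jupiter masses to one solar mass:

    JUPITERS_PER_SUN = 1050   # assumed round figure for this illustration

    for jupiters in (60, 75):
        percent = 100 * jupiters / JUPITERS_PER_SUN
        print(f"{jupiters} Jupiter masses = {percent:.0f}% of the sun's mass")
    # 60 -> ~6% and 75 -> ~7%, matching the figures quoted above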
During a 12-hour observation of this body on December 15, 1999, using the Advanced CCD Imaging Spectrometer (ACIS), Chandra detected no X-rays at all in the first nine hours, but then the source flared dramatically before it faded away over the next two hours. Astronomers dubbed the flare "the mouse that roared": nobody expected to see flaring from such a lightweight object, and it is just good luck that Chandra happened to be looking in the right direction for a long enough period.
The energy emitted in the brown dwarf flare was a billion times greater than observed X-ray flares from Jupiter, comparable to a small solar flare, and the energy is believed to have come from a twisted magnetic field. In fact, this explanation has an even more important corollary: the flare offers the strongest evidence yet that brown dwarfs and possibly young giant planets have magnetic fields.
One speculation suggests that the flare could have its origin in the turbulent magnetized hot material beneath the surface of the brown dwarf. In such a case, a subsurface flare could heat the atmosphere, allowing currents to flow and give rise to the X-ray flare, rather like a stroke of lightning. One thing is certain: astronomers are going to be spending some more time checking brown dwarfs for activity in the future.
See also: \JAn odd brown dwarf\j, October 1998.
#
"Saving Vavilov",1601,0,0,0
(Jul '00)
It took most of the last century to build, it survived the Stalin years, it came through the 872-day siege of Leningrad unscathed, but now one of the world's major treasures is at risk. If you describe it as one of the finest potato germ plasm repositories in the world, people may ask "so what?" But it matters to all of us that the valuable potato collection at the N. I. Vavilov All-Russian Research Institute of Plant Industry may soon be a worthless genetic morgue.
During World War II, the Nazis and the Finns besieged Leningrad and, during the first winter of the siege, scientists had to move the valuable germ plasm at the National Scientific Institute of Plant Growing experiment station, as it was known then, to an alternative site in the basement of a building on Gertzen Street in Leningrad which also housed a hospital. Soldiers, orderlies and even wounded patients from the hospital floors above broke up chairs, tables, buffets and other furniture to feed the small wood stove that kept the potato cultivars warm in the face of frigid temperatures. The stove was kept burning right through the winter, and the lines were preserved.
Now the accelerating breakdown that is modern Russia sees the collection under threat again, but the danger extends beyond Russia. Aside from the dependence of Russia on potato crops (see \JPotato late blight hits Russia\j, March 2000), many other parts of the world will one day have to rely on this plant as a food source. Sooner or later, a new form of blight or some other disease will threaten the world's potato crops, and then we will need the treasures of the Vavilov Institute.
There are about 10,000 individual plant entities of potato, called "accessions" by scientists. The collection lacks a good watering system and suitable greenhouse soil-fumigation procedures, but western authorities are unwilling to contribute because so much of the aid sent to Russia ends up in private pockets or being used in other projects. The wages of the staff, typically around $10 to $20 a week, are not being paid by the cash-strapped government; there is a lack of dependable electric heaters and ventilation, making the storage of tubers difficult; and torn window screens and broken windows allow pests in.
Right now, none of the tubers can be shipped out of the Institute because they are all quarantined to prevent the transfer of diseases that riddle the entire collection. Methods will need to be found to cultivate new plants from treated material to reconstitute the plants in a disease-free form, and old seed needs to be grown to maturity before the seeds die, just to produce new seed.
Sadly, there is no new science in this report, but it does give us an insight into the problems that will face the world if the rest of the world should go the way of Russia.
See also: \JVavilov, Nikolay Ivanovich\j.
#
"Autism and language development",1602,0,0,0
(Jul '00)
Twin and family studies have indicated fairly clearly that there are genetic effects involved in both autism and developmental language disorder, also referred to as specific language impairment (or SLI). Just how these genes make their mark, nobody knows, but while there is about one chance in a thousand of being autistic, the recurrence risk for autism in families is around 6 to 8%. With identical twins, there appears to be about a 65% 'concordance rate' for autism, while a study of fewer than 100 pairs of non-identical twins revealed no cases of concordance (this is probably an artifact of the small sample size: a small concordance effect would otherwise be expected).
With SLI, the evidence is similar. Relatives of SLI cases are much more likely to show signs of SLI, and concordance in identical twins is around 70%, while in non-identical twins, it is about 45%. While the case is less strong for SLI than it is for autism, it is still possible that SLI might be controlled by a single gene.
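One common way to turn twin concordance figures into a rough heritability estimate is Falconer's formula, which doubles the difference between identical-twin and fraternal-twin concordances. The Python sketch below applies it to the figures quoted above; strictly, the formula applies to correlations rather than raw concordance rates, so treat the output as illustrative only.

    def falconer_h2(identical, fraternal):
        """Rough heritability estimate: h^2 = 2 * (MZ - DZ)."""
        return 2 * (identical - fraternal)

    print(f"SLI: h^2 ~ {falconer_h2(0.70, 0.45):.2f}")   # about 0.5
    # For autism, 2 * (0.65 - 0.0) = 1.3, an impossible value above 1,
    # which is another hint that the fraternal-twin figure is unreliable.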
Autism is a disorder characterized by impaired social interactions. In contrast, children with developmental language disorder have problems with pronunciation, grammar, and verbal communication. The problem for researchers and diagnosticians is that the symptoms of these disorders sometimes overlap, and two studies reported in the August issue of \IThe American Journal of Human Genetics\i have now offered us a possible explanation for this overlap.
It seems that the genes for both conditions are located in much the same position in the human genome. The localization of both disorders to the same genetic region suggests that this overlap may be due to a genetic relationship between the disorders. These results could also explain the fact that language and reading difficulties are more common in siblings and parents of people with autism and, conversely, that autism is more prevalent in the families of children with the developmental language disorder. Although the researchers cannot say for sure whether genes are shared between the disorders, since the genes for the disorders have not been identified, the evidence suggests they could be the same.
See also \JDoes a virus cause autism?\j.
#
"Microscopic life at the South Pole",1603,0,0,0
(Jul '00)
An article in \IApplied and Environmental Microbiology\i, the journal of the American Society for Microbiology, indicates that a population of active bacteria, some of which have DNA sequences that align closely with species in the genus \IDeinococcus\i, exists at the South Pole during the southern summer. This extends the previously known limits of life on Earth, but still makes us wonder how the microbes survive the heavy doses of ultraviolet radiation and the extreme cold and darkness of winter at the South Pole. The radiation resistance, at least, is perhaps not so surprising, since the species of \IDeinococcus\i that was first discovered in cans of irradiated meat in the 1950s is able to withstand extreme dryness and large doses of radiation.
The bacteria were found to be metabolically active, synthesizing DNA and protein at local ambient temperatures of -12 to -17 °C (10.4 to 1.4 °F), and this surprised the researchers so much that they repeated their original observations in a second field season during January 2000.
Extremophiles like this are interesting because their existence has important implications for the search for life not only in other extreme environments on Earth, but also elsewhere in the solar system. From the point of view of applied science, the snow bacteria may possess unique enzymes and membranes able to cope with a subzero existence.
The find also sounds a note of both interest and warning with regard to the possibility of life at Lake Vostok. The samples were taken at the edge of the clean-air sector at Amundsen-Scott South Pole Station, which should have removed any risk of the samples being contaminated by bacteria from human habitation, yet in examining the snowmelt, the researchers still found coccoid and rod-shaped bacteria, some of which appeared to be dividing.
This means that it will not be safe to assume that the surface above Lake Vostok is sterile, so the risk of contaminating the lake if it is drilled into is greater than previously thought. It also raises the chance that there are already microbes present in what is suspected to be a vast pool of liquid water thousands of meters below the Antarctic ice sheet.
For more on \IDeinococcus\i, see also: \JDeinococcus radiodurans genome sequenced\j, November 1999, \JConan the Bacterium\j, December 1999. For more on life forms that live under challenging conditions, search on the term 'extremophile.' For information on Lake Vostok, see \JEuropa fly-by\j, October 1998, \JIs there life under the ice?\j, August 1999 and \JLife at Vostok\j, December 1999.
#
"Longer-running CDs",1604,0,0,0
(Jul '00)
Legend has it that the size of a \Jcompact disc (CD)\j was set to allow a single CD to hold the whole of Beethoven's longest symphony. But, as the humble CD edges closer to the carrying capacity of a \JDVD\j, composers will soon be able to contemplate twice that running time, allowing each of the operas of Wagner's Ring cycle to fit on just two CDs.
The secret lies in a new CD-RW device that offers twice the storage capacity of current solutions, announced in early July by Cirrus Logic. This new double-density encoder/decoder, designated the CR3490, makes it possible to store 1.3 gigabytes of data on a single CD-RW disc, and boasts the industry's fastest write (16x) and read (48x) speeds, according to Cirrus.
A release from the company claims that this will be the first device able to meet the new Double-Density CD-ROM/-R/-RW (or DDCD) specification from Sony and Philips when it is completed. They also claim that their new drive is expected to be the removable storage drive of choice, replacing the floppy, as more and more high- and medium-end machines use CD-RW drives.
Future uses might include things such as multimedia school assignments, archived family photograph albums, detailed architectural plans, or libraries and art galleries on a single disc. And discs holding two versions of the same Beethoven symphony. Wagner lovers will need to wait a few more years for the one-disc work to be available.
In the more popular market, 1.3 gigabytes translates into 60 minutes of real-time video, or 144 minutes of real-time audio, which amounts to more than 500 MP3 audio files, or 1000 WMA Internet files, or 0.5 Götterdämmerung performances, on a single removable CD.
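How many MP3 files actually fit depends entirely on the bitrate and track length you assume, so the 500-file figure is elastic. A short Python sketch (our own arithmetic, with assumed encodings rather than figures from the Cirrus release):

# How many compressed audio tracks fit on a 1.3 GB double-density disc?
CAPACITY_BYTES = 1.3e9            # capacity quoted for the DDCD format

def files_per_disc(bitrate_kbps: float, minutes: float) -> int:
    """Tracks of a given bitrate and running time that fit on one disc."""
    bytes_per_file = bitrate_kbps * 1000 / 8 * minutes * 60
    return int(CAPACITY_BYTES // bytes_per_file)

print(files_per_disc(128, 4.0))   # 4-minute tracks at 128 kbps: 338 files
print(files_per_disc(96, 3.5))    # 3.5-minute tracks at 96 kbps: 515 files

So the claim of 'more than 500' holds for shorter tracks at a modest bitrate, while longer, higher-quality files bring the count down considerably.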
#
"Deep Space 1 revived",1605,0,0,0
(Jul '00)
In space technology, nothing is more crushing than a mission that fails, and nothing is more uplifting than a mission that turns around and completes a bonus task. Deep Space 1 (DS1) had a primary mission of flight testing a dozen innovative technologies and, for a while, it looked as though that was all it would do. But now it is on its way to an important but lonely meeting with comet Borrelly. The ailing spacecraft had lain idle for seven months after it lost the use of its navigational camera last November, a loss that left it unable to orient itself in space.
The problem, which remains unexplained, arose two months after the completion of the primary mission, at a time when the spacecraft was in otherwise excellent health with nearly a full tank of propellant aboard. At one blow, the malfunction looked set to dash plans for the bonus flyby of the comet. Now, the advanced ion-propulsion engine is pushing the craft forward again, and the science camera has been jury-rigged as a navigational instrument. With any luck, Deep Space 1 should reach comet Borrelly in September 2001, just as the comet gets to perihelion, its closest point of approach to the sun, and so becomes very active and interesting.
The craft navigates with a system called "Star Tracker," which looks at the pattern of stars around it and uses this to align the craft in space. This was one of the established pieces of technology, not one of the dozen experiments, so when the camera in the Star Tracker went down, the system appeared to be dead and, if that was so, then the craft was dead as well, because its computers need to know which way the craft's rockets are pointing when it is told to move.
The Star Tracker system originally used a 9-degree field of view (a bit less than the area covered by your clenched fist at arm's length). The spacecraft carries another camera that can also see stars (the Miniature Integrated Camera and Imaging Spectrometer, or MICAS for short), but its field of view is just a hundredth of the original camera's. So, while the Star Tracker was able to estimate the probe's orientation in space four times every second, the MICAS camera produces a data file that takes more than 20 seconds to transfer to the computer where it can be analyzed. That meant an 80-fold drop in the speed of the guidance system, but the programmers decided it was still workable.
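For the record, the 80-fold figure is just the ratio of the two update rates quoted above:
\[
\frac{4\ \text{fixes per second}}{(1/20)\ \text{fixes per second}} = 4 \times 20 = 80.
\]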
The main point was that the primary mission was complete, so the team was able to go out on a limb and take on risks that would otherwise be unacceptable. There may still be problems, but the brief summary of the rescue attempt is "so far, so good." We cannot leave it there, however, because that would leave out the best part of the story.
The challenge to save DS1 had a number of novel features: DS1 was hundreds of millions of kilometers away, so there could be no hardware fixes, only software fixes. That meant programming the onboard computer to use MICAS data, and doing it before an early July deadline that had to be met if DS1 were to meet the comet. The software was complete by the end of May and had worked successfully in a DS1 simulator at JPL, but it is impossible to completely simulate space conditions on Earth.
The software was broken down into 90 files that were transmitted through one of the 34 meter (110 foot) diameter antennas of the Deep Space Network to be received by a 30 cm (1 foot) antenna at DS1. But now comes the challenge: with no tracking system, how do you get the small dish on DS1 to point in the right direction? This remains a mystery, as JPL news releases refer only to "an innovative method the DS1 team had developed earlier this year," so we leave that as a matter of wonder for our readers.
Within a few days, 81 of the 90 files were at DS1, but the spacecraft experienced a brief problem, and this triggered a safety reset procedure that made DS1 point its main antenna back at the Sun. The craft then rebooted the computer and deleted all the new files, so they had to start all over again, making the deadlines even tighter.
By June 8, testing was able to begin and, after three weeks, the system was declared acceptable, with just days to go to the firing deadline. Engineers brought the ion engine to maximum throttle on June 28 and began the journey toward comet Borrelly, and, as we reported, so far so good.
Comet Borrelly orbits the Sun once every 6.9 years in an elliptical path that will place it near perihelion in September 2001. During the encounter, the pair will be 1.34 AU from the Sun, roughly midway between the orbits of Earth and Mars. (1 AU is the distance from the Earth to the Sun, or 149.6 million kilometers.) At this distance, it is a fairly insignificant comet, perhaps explaining why it was only discovered, as a 9th magnitude object, in 1904. (The brightness of a comet is inversely proportional to the fourth power of the perihelion distance.)
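For readers who prefer the rule in symbols, the conventional approximation used by astronomers (a standard formula, not something peculiar to this report) gives a comet's total apparent magnitude m in terms of its distance r from the Sun and Δ from the observer, both in AU:
\[
m = M_0 + 5\log_{10}\Delta + 2.5\,n\log_{10} r,
\]
where M_0 is the comet's absolute magnitude and the activity index n is close to 4 for a typical comet. Since magnitudes are logarithmic, this is the same as saying that brightness varies as \(\Delta^{-2} r^{-n}\): the fourth-power dependence mentioned above.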
Comet LINEAR S4, which reached perihelion on July 26, 2000, for example, had a perihelion just 0.75 AU from the sun and, at that distance, could be seen with the naked eye. Still, even at 1.34 AU, Borrelly will have bits of dust and rocky debris boiling off from the comet's core, and these will shoot past DS1 at relative speeds of 17 km/sec, perhaps more.
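To see why such speeds worry mission planners, consider the kinetic energy of a single grain of debris; the one-milligram mass here is our own illustrative assumption, not a figure from the mission:
\[
E = \tfrac{1}{2}mv^2 = \tfrac{1}{2}\,(10^{-6}\ \text{kg})\,(1.7\times 10^{4}\ \text{m/s})^2 \approx 145\ \text{J},
\]
roughly the energy of a one-kilogram brick dropped from a five-story building, delivered to a point the size of a sand grain.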
NASA's Stardust spacecraft, currently on its way to comet Wild-2, and the ESA Giotto probe that passed within about 600 km of comet Halley in 1986 both had shields to protect them, but DS1 lacks this protection. Even the shielding was no guarantee: as Giotto approached Halley, it was hit by a piece of dust that pushed the craft slightly off course, and several of Giotto's instruments, including its camera, were damaged by flying bits and pieces.
So even if DS1 makes it there, the mission may yet fail. Much will depend on how close the spacecraft goes, but current plans are looking at a "safe" distance of 1000 km or more. This is still likely to return dazzling pictures of the nucleus, though at that time, the one camera will need to provide both navigational and research data.
More importantly, the spacecraft carries an ion mass spectrometer and a near-infrared spectrometer. These instruments should provide scientists with data on the mineral content of Borrelly's nucleus and the chemical composition of gases in the coma. There is life in the old flying laboratory yet!
See also: \JAsteroid visits\j, July 1999.
#
"Nevirapine, mothers, children and HIV",1606,0,0,0
(Jul '00)
Nevirapine is an inexpensive AIDS drug that has been under close observation for some time. A report issued towards the end of the 13th International AIDS Conference in Durban, South Africa, during July indicates that the drug greatly reduced mother-to-infant transmission of HIV up to a year after the medicine was given, around the time of birth.
The findings came from the continued follow-up of breastfeeding mothers and their babies who were enrolled in a clinical trial (HIVNET 012 - see \JPreventing HIV transmission to infants\j, September 1999), managed by the US National Institute of Allergy and Infectious Diseases (NIAID). Results reported last year indicated that a short regimen of Nevirapine given to both mother and child significantly reduced levels of HIV transmission. Today's announcement reported that this reduction in mother-to-infant transmission of HIV was sustained even though the infants were breastfed.
The study was carried out at the teaching hospital of Makerere University in Kampala, Uganda, and compared two treatments. In one, a group of women had a single dose of Nevirapine during labor and their infants received one dose within 72 hours of birth. The second group of women received AZT during labor while their newborns received twice daily doses for 7 days.
From the first part of the study, the researchers reported that neither treatment caused problems in the 6 to 8 weeks after dosing, but that mothers and infants who received Nevirapine had a 42 percent lower risk of HIV transmission than those receiving AZT. Now, the reduction at 12 months stands at around 39%, while preliminary data at 18 months indicate a 42% reduction in transmission with Nevirapine, even though the women were breastfeeding.
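Figures like "42 percent lower risk" are relative risk reductions. A tiny Python sketch shows the calculation, using purely illustrative transmission rates rather than the trial's published arm-by-arm numbers:

def relative_risk_reduction(risk_treated: float, risk_control: float) -> float:
    """1 minus the ratio of transmission risks in the two arms."""
    return 1.0 - risk_treated / risk_control

# Illustrative rates only (assumed for the example, not HIVNET 012 data):
# 12% transmission with Nevirapine vs 21% with AZT.
print(f"{relative_risk_reduction(0.12, 0.21):.0%}")   # prints 43%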
There is even better news: while treated mothers develop mutant strains of HIV that resist Nevirapine, those mutations fade from detection within 13 to 18 months of delivery. This suggests that repeat doses of Nevirapine given to the mother should continue to prevent HIV transmission during future pregnancies.
Another finding related the amount of HIV and CD4 T-cells in the mother's blood to the likelihood of her passing the virus on to her baby. As might be expected, women with more virus and fewer CD4 T-cells in their blood were more likely to transmit HIV to their offspring than were those with less virus and more CD4 T-cells.
In brief, then, a short Nevirapine treatment is an effective, simple, and extremely low cost method for preventing transmission of HIV from mother to child in developing countries, where cost is the major criterion for selecting treatment, breastfeeding is essential, and AIDS is very common.
Key Names: Brooks Jackson, Francis Mmiro and Tom Fleming.
See also: \JSaving children from AIDS\j, July 1999.
#
"Pulsars may be much older",1607,0,0,0
(Jul '00)
Pulsars are easy to explain these days. A \Jpulsar\j is a spinning, superdense \Jneutron star\j that emits powerful beams of radio waves and light, scientists will tell you with complete confidence. And, until recently, they would tell you with the same confidence that your average pulsar is quite young, because our standard model of a pulsar saw it as having a brief and rather spectacular life.
In a paper published in \INature\i in mid-July, Bryan Gaensler and Dale Frail report that evidence gathered with the National Science Foundation's (NSF) Very Large Array (VLA) radio telescope casts doubt on the standard model. The two scientists used the VLA to study a pulsar previously thought to be 16,000 years old, and they have concluded that the pulsar is at least 40,000 years old and may be as old as 170,000 years. The signals from a spinning pulsar are as regular as an atomic clock, and for three decades we have relied on the gradual slowing of those signals to estimate the ages of pulsars, but if the real ages are greater, then we need to rethink the standard theory of how pulsar signals are produced.
When a star much more massive than our sun dies, it ends in a violent \Jsupernova\j explosion, leaving behind an extremely dense neutron star. In some cases, the neutron star will produce the beam of electromagnetic radiation that characterizes pulsars. Gaensler and Frail calculated the speed of pulsar B1757-24, located 15,000 light years away in the constellation Sagittarius and now lying outside the shell of the supernova that created it. The pulsar and shell together form a remnant dubbed "the Duck," a name that reflects its shape.
The pulsar began in the middle of the shell and has been moving outward, even as the shell expanded, ever since. To have traveled from the center of the supernova remnant to its present position in just 16,000 years, the pulsar would have had to move at about 1600 km/sec (1000 miles/sec). But comparing a 1993 VLA image of the region with one the scientists made last year yields a speed of no more than about 600 km/sec (350 miles/sec), a figure that still needs to be confirmed by observations over a longer period.
In other words, the pulsar would either have to have been slowing down, and there is no way we could explain that, or it is older than we thought and the present theory needs a bit of fine-tuning. That fine-tuning, though, is likely to extend into previous conclusions about neutron stars and how they work, and that could easily change some of the standard assumptions of particle physics.
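As a rough check on the arithmetic behind those ages (our own, using round numbers from the figures above):

SECONDS_PER_YEAR = 3.156e7

def implied_age_years(distance_km: float, speed_km_s: float) -> float:
    """Travel time from the shell's center at a constant speed, in years."""
    return distance_km / speed_km_s / SECONDS_PER_YEAR

# Distance implied by the old estimate: 16,000 years at 1,600 km/s.
distance = 1600.0 * 16000.0 * SECONDS_PER_YEAR    # in kilometers
# The same journey at the measured ~600 km/s takes far longer:
print(f"{implied_age_years(distance, 600.0):,.0f} years")   # about 43,000

The same distance covered at roughly a third of the speed takes roughly three times as long, which is where the "at least 40,000 years" comes from; a slower measured speed, or a larger remnant, would push the age toward the 170,000-year upper estimate.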
#
"Bioengineered corneas",1608,0,0,0
(Jul '00)
Two research teams, one in Taiwan, the other in California, have reported success during July in restoring the eyesight of patients with previously untreatable corneal damage. In each case, the work was done using novel tissue bioengineering techniques. The Taiwanese report appeared in the \INew England Journal of Medicine\i, while the American report was in the journal \ICornea\i.
The Taiwanese group managed to reverse vision loss in six patients, while the Americans succeeded with ten patients, using corneal tissue from a donor. The Taiwanese group has done some interesting work on stem cells that makes their effort particularly noteworthy, so while both results are important, we will look more closely at the technique used by Tsai and his colleagues. In a gracious editorial in the \INew England Journal of Medicine\i, the two Americans have hailed the Taiwanese accomplishment as having potential beyond the cornea:
"Bioengineered or cultured tissue products are currently being produced to replace other tissues, and the progress with corneal-surface replacement indicates that such products are likely to revolutionize the treatment of many epithelial and even visceral diseases."
Many of the problems patients suffer with their corneas seem also to involve damage to the stem cells from which a natural cornea is repaired.
When a cornea is transplanted into another eye, the "donor corneal epithelium is gradually replaced, and the remaining transplanted corneal stroma, which is immunologically nonreactive, must ultimately be resurfaced with epithelial cells derived from the recipient's corneal stem cells." When there is a lack of stem cells, this triggers a cascade of reactions that leads to inflammation and eventually to a failure of the graft.
The obvious solution is to boost the stem cell population. The stem cells which maintain the corneal epithelium are located in the basal layer of the limbus. Grafts from a donor limbus (taken from a living relative or from a cadaver) are possible, but more than half of the donor limbus must be 'harvested,' which can put the donor eye at risk when the graft comes from a live donor. As a result, it would be better to use a bioengineered replacement tissue that replenished the pool of stem cells without endangering the corneal stem cells of the donor eye. But doing this demands "an ex vivo environment for stem cells that maintains both the replicative function of the tissue and its differentiated phenotype." That is, you need a way of culturing the stem cells away from the owner which keeps them dividing in just the right way and ensures that the products of division remain what they would be in the living body: proper corneal cells.
This is where Tsai and his colleagues came up with a clever ploy: using human amniotic membrane as the scaffold on which to grow their replacement corneal surface. This is not all that unusual, since there is already widespread use of human amniotic membrane for ocular-surface reconstruction, and Tsai's group cites numerous previous studies that show that this is an excellent choice. The main advance is the small amount of donor limbal tissue required (2 mm\U2\u), and the fact that the membrane provides a compatible extracellular matrix for the graft.
There are still problems to be solved, but patients with certain types of eye trauma, such as chemical burns, are now much more likely to see again.
Key Names: Ray Jui-Fang Tsai (Taiwan) and Ivan Schwab and Rivkah Isseroff (California).