Sunday, December 27, 2009

Electric Cars Are Coming!

Rapid improvements in battery technology, combined with lightweight materials for car construction, are making electric-powered cars the vehicles of the future. Now Tesla, the US manufacturer of electric sports cars, has set a world record at the Global Green Challenge race in Australia, where its vehicle travelled a formidable 501 km on a single battery charge.

Its design resembles that of the Lotus Elise. In the same race, a Honda covered 360 km on a single charge but was more efficient in terms of distance travelled per watt-hour of battery power: it achieved 85 watt-hours per kilometre, which is claimed to make it the most energy-efficient vehicle in the world.
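An electric car's range follows directly from pack energy and per-kilometre consumption. Here is a minimal back-of-the-envelope sketch in Python; the 53 kWh pack size is a hypothetical figure chosen for illustration, and only the 85 watt-hours-per-kilometre consumption comes from the article.

    # Range (km) = usable pack energy (Wh) / average consumption (Wh/km).
    # The 53 kWh pack size is a hypothetical value, not from the article.
    def estimated_range_km(pack_energy_wh, consumption_wh_per_km):
        return pack_energy_wh / consumption_wh_per_km

    print(round(estimated_range_km(53_000, 85)))  # about 624 km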

Now it is no exaggeration to claim that electric vehicles are going to be part of our personal lives.

Sunday, November 22, 2009

Global Economic Crisis Could not Stop Emissions Growth

A study by Norwegian and New Zealand scientists provides updated figures for carbon dioxide emissions from fossil fuels. While the global financial crisis may have slowed emissions growth, it has not been enough to stop it. From 2007 to 2008, global emissions from fossil fuels increased by 2.2 percent; from 2003 to 2007, fossil emissions grew by an average of 3.7 percent a year.

According to the study, published in Environmental Research Letters, coal overtook oil in 2006 as the largest source of carbon dioxide emissions. Emissions from gas and oil have grown at a fairly constant rate since 1990; coal is now the driver of the strong growth in fossil-fuel carbon dioxide emissions.

The growth rate of emissions has been slightly higher in India over the last two years. For the first time, India's emissions are increasing faster than China's.

According to the Intergovernmental Panel on Climate Change (IPCC), a large reduction in fossil emissions is required to limit global warming. The concentration of CO2 in the atmosphere has increased from 280 ppm in 1750 to 383 ppm in 2007. Around 75 percent of the increase so far is due to carbon dioxide emissions from fossil energy; the remaining 25 percent is due to changes in land use.

All the main IPCC scenarios of fossil-fuel carbon dioxide emissions show an increase over the next few decades, with a large spread in emissions estimates up to 2100. This increasing trend is driven by economic growth.
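To see how quickly growth rates of a few percent compound, here is a minimal sketch in Python. The 2008 baseline of roughly 32 gigatonnes of fossil-fuel CO2 is an illustrative assumption, not a figure from the study; the 2.2 and 3.7 percent growth rates are the ones quoted above.

    # Project fossil-fuel CO2 emissions under constant compound growth.
    # Baseline of 32 Gt CO2 in 2008 is an illustrative assumption.
    def project(baseline_gt, annual_growth, years):
        return baseline_gt * (1 + annual_growth) ** years

    for rate in (0.022, 0.037):
        print(f"{rate:.1%}/yr -> {project(32.0, rate, 20):.1f} Gt CO2 after 20 years")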

Tuesday, November 17, 2009

'Doomsday' 2012 Prediction Explained

According to Ann Martin, a doctoral candidate in Cornell University's Department of Astronomy, the world will NOT end on December 21, 2012, contrary to what the latest Hollywood blockbuster suggests. Her research focuses on the hydrogen content of galaxies in the nearby universe.

She says that the Mayan calendar was designed to be cyclical, so the fact that a great cycle ends in December 2012 is really of no consequence. It simply marks the end of one calendar cycle in Mayan society; it does not mean that the "world will end", she says.

For the past three years, Martin has been a volunteer with Cornell's "Curious? Ask an Astronomer" service, a website founded by astronomy graduate students in 1997. The site features answers to over 750 frequently asked questions about astronomy.

For further information see:

http://curious.astro.cornell.edu/question.php?number=686

Sunday, November 1, 2009

New DNA Method Makes It Easier To Trace Criminals

DNA samples often convict criminals. But many of today's forensic samples are so contaminated, by soil, tobacco and food remains, for example, that they cannot be used. Now researchers at Lund University in Sweden, working together with the Swedish National Laboratory of Forensic Science (SKL), have improved a critical part of the analysis process.

The first findings, published in the latest issue of the journal Biotechniques, indicate that the new method strengthens the DNA analysis so that previously negative samples yield positive and usable DNA profiles.

"The results are overwhelming. In my study I selected 32 truly difficult samples from the SKL archive, that is, with few cells, little DNA, and many so-called inhibitors, meaning lots of junk. With current methods it was impossible to get acceptable DNA profiles from any of them. But with the new method, 28 of the samples yielded more usable DNA profiles," says Johannes Hedman, an industrial doctoral candidate from SKL doing research at the Faculty of Engineering, Lund University.

Genetic information has become more and more common in forensic analyses. The analysis flow usually starts with taking a sample with a swab from a drinking glass or a blood spot, for example. The cells from the swab are then dissolved in water, and the DNA is extracted.
In forensics all over the world, much work has been done to improve the taking and handling of samples.

"The DNA analysis, on the other hand, has been something of a black box, since it is purchased as a finished product. No one has tried to improve it to be able to deal with dirty samples. But this is absolutely crucial, since the samples often have extremely small amounts of DNA. In this phase you copy certain parts of the DNA strands and then obtain a DNA profile that is unique to each person.

"In the copying step, I have optimized the chemical environment and replaced a key enzyme, a so-called DNA polymerase. This yields a clearer genetic footprint, a DNA profile, to work with," explains Johannes Hedman. He has also devised a new mathematical model that makes it easier to interpret the DNA analysis.

If the copying step is improved, stronger DNA evidence can be obtained from crime scenes that today yield only partial or entirely blank DNA profiles. In other words, the chances are greater that a person can be identified and linked to a particular crime.

The reason Johannes Hedman wound up at Lund is the fact that Peter Rådström, a professor of microbiological engineering, has been working since the late 1980s to improve DNA-based infection diagnostics and microbiological analyses for food. SKL was eager to find out whether these research findings could be applied to improving forensic DNA analysis.

"This collaboration opens new vistas for both SKL and Lund University, and we hope to be able to continue to work together with Peter Rådström's team. We have truly seen cross-fertilization," says Birgitta Rasmusson, research director at SKL.

Thursday, October 22, 2009

Nanotech Protection

Canadian engineers say that research into the risks associated with the growing field of nanotechnology manufacturing is urgently needed so that appropriate protective equipment can be developed.

Patricia Dolez of the Department of Mechanical Engineering, at the École de technologie supérieure, in Montréal and colleagues point out that skin is not an impervious membrane. This is the reason that protective clothing and gloves, in addition to respirators, are often an essential and common sight in the chemical industry. However, they wonder if standard protection against chemical risks is enough for workers who are handling nanomaterials.

According to the most recent estimates from the U.S. National Science Foundation, the nanotechnology market could reach as much as $1 trillion by 2011/2012. This, says Dolez, corresponds to about 2 million workers involved in nano-related activities. She adds that nanoparticles have already been shown to affect biological activity through oxidative stress at the cellular and molecular levels, although these effects have yet to manifest as health problems among workers.

The hazards associated with the incredibly diverse range of substances falling under the general and broad tag of "nanomaterials" remain largely unknown, and some scientists have suggested that we should remain vigilant for emerging health problems associated with them. The U.S. government recently updated its National Nanotechnology Initiative strategic plan to highlight the need for an assessment of nanomaterials' toxicity before production begins.

Dolez and colleagues suggest that as this area of manufacturing grows it would be prudent to develop adequate workplace protection sooner, rather than later. Indeed, those workers most likely to be exposed to nanomaterials will be working in cleaning, bagging and formulation activities as well as surface functionalisation of nanoparticles.

They explain that current regulations and standards testing for protective clothing and equipment are almost devoid of references to nanomaterials specifically. Moreover, although some researchers have concluded that certified respirators offer an appropriate level of protection against nanoparticles, large uncertainties remain, for example because the very small size of nanoparticles, a few billionths of a metre, increases the potential for leaks at the face seal.

Wednesday, October 21, 2009

Computer Helps Deaf Children To Learn Sign Language

Three PhD candidates have spent the last few years at TU Delft simultaneously working on sign language. One of these is Jeroen Arendsen: “Very little is known about the perception of sign language. The aim of this research was to expand our knowledge of human observation of signing. For instance, it turns out that sign language users only need to see a small part of a hand movement to know it is a sign and what it means.”

Automatic Recognition:

The insights obtained into human perception can subsequently be used for developing the technology for automatic sign language recognition by cameras and computers. In the TU Delft study, this led to a remarkable application of automatic sign language recognition: an Electronic Learning Environment (ELo) for children who are deaf or hard of hearing.

Pictures:

In conjunction with the NSDSK (Dutch foundation for children who are deaf or hard of hearing), the TU Delft researchers developed a computer system that can recognise sign language in real time and can therefore be used to teach new signs more quickly to children who are deaf or hard of hearing.

When shown a picture, the children aged three to six were asked to make the correct sign. The computer then had to assess the sign. Arendsen: “And that is quite difficult, because a computer is more easily confused than people by irrelevant fidgeting.”
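The article does not describe the recognition algorithm itself. One common way to compare a produced hand movement against a stored reference sign is to align the two trajectories with dynamic time warping (DTW) and accept the sign only if the alignment cost is low; irrelevant fidgeting tends to produce a high cost. A minimal sketch of that idea on 2-D hand positions, offered purely as an illustration rather than as the TU Delft method:

    import numpy as np

    def dtw_cost(a, b):
        """Dynamic-time-warping cost between two (T, 2) hand-position trajectories."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m] / (n + m)  # length-normalised

    # A produced sign is accepted if its cost against the reference template
    # falls below a tuned threshold; fidgeting usually pushes the cost above it.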

Comprehensible:

The task of fellow PhD candidate Jeroen Lichtenauer (EEMCS faculty) was to convert the signs into information which is comprehensible to a computer. As an Industrial Design Engineering student, Arendsen was more involved in the human aspects. Gineke Ten Holt was the third PhD candidate, who worked as a bridge between the two disciplines.

Sign Language Vocabulary:

Further research showed that the Electronic Learning Environment really does work. The sign language vocabulary of the children who had practised with it increased compared to that of a control group. This only applied to the slightly older children, however.

Tuesday, October 20, 2009

New Technology Detects Chemical Weapons In Seconds

Scientists at Queen's University Belfast are developing new sensors to detect chemical agents and illegal drugs which will help in the fight against the threat of terrorist attacks.

The devices will use special gel pads to 'swipe' an individual or crime scene to gather a sample which is then analysed by a scanning instrument that can detect the presence of chemicals within seconds. This will allow better, faster decisions to be made in response to terrorist threats.

The scanning instrument will use Raman spectroscopy, which involves shining a laser beam onto the suspect sample and measuring the energy of the light that scatters from it to determine what chemical compound is present. It is so sophisticated that it can measure particles on a minuscule scale, making detection faster and more accurate.

Normally this type of spectroscopy is not sensitive enough to detect low concentrations of chemicals, so here the sample is mixed with nanoscale silver particles which amplify the signals of compounds allowing even the smallest trace to be detected.
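As background on the measurement itself: a Raman spectrometer reports the energy lost by scattered photons as a wavenumber shift computed from the laser and scattered wavelengths. A minimal sketch of that standard conversion, where the 532 nm laser and the example scattered wavelength are illustrative values, not details of the Queen's instrument:

    def raman_shift_cm1(laser_nm, scattered_nm):
        """Raman shift in wavenumbers (cm^-1) from wavelengths in nanometres."""
        return 1e7 * (1.0 / laser_nm - 1.0 / scattered_nm)

    # Example: 532 nm excitation, light scattered at 562 nm
    print(f"{raman_shift_cm1(532.0, 562.0):.0f} cm^-1")  # about 1003 cm^-1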

It is hoped the new sensors will also be the basis for developing 'breathalyzer' instruments that could be of particular use for roadside drugs testing in much the same way as the police take breathalyzer samples to detect alcohol.

At present, police officers are only able to use a Field Impairment Test to determine if a person is driving under the influence of drugs. The accuracy of this method has been questioned because of concerns that it is easy to cheat.

To ensure the technology is relevant, senior staff members from FSNI (Forensic Science Northern Ireland) will give significant input into the operational aspects of the technology and give feedback as to how it might be used in practice by the wider user community.

Stan Brown, Chief Executive of FSNI said:

"We consider the work being carried out by researchers at Queen's University extremely important and potentially very useful in driving forward the effectiveness, efficiency and speed of forensic science practice. The combination of leading edge research and hands-on experience of FSNI's practitioners has already proven very fruitful and is likely to lead to significant developments in forensic methodologies across a range of specialisms."

In the future this technology could have a number of important applications and according to Dr Bell: "There are numerous areas, from medical diagnostics to environmental monitoring, where the ability to use simple field tests to detect traces of important indicator compounds would be invaluable."

Video Camera That Records At The Speed Of Thought

European researchers who created an ultra-fast, extremely high-resolution video camera have enabled dozens of medical applications, including one in which it can record 'thought' processes travelling along neurons.

The Megaframe project scored a staggering number of breakthroughs to create the world’s first 1024 pixel, photon-resolution, million-frame-per-second CMOS camera.

Their work has pushed the boundaries of CMOS (a type of semiconductor) miniaturisation and sophistication. But it is in the application of their technology that the most stunning impacts of the Megaframe project will be seen, particularly in medical applications.

That is because the camera can detect single photons a million times a second, so it can record molecular processes in unprecedented detail. “We need this sort of detail because biomedical scientists are studying processes at the intra-cellular and molecular levels,” underlines Edoardo Charbon, coordinator of the EU-funded Megaframe project.

Scientists have developed extremely ingenious ways to infer or deduce what is happening at the molecular level, and Megaframe could make that process even more detailed. Essentially, scientists use a variety of emissive materials to see what is happening in microscopic biomedical processes.

Take Fluorescence Lifetime Imaging Microscopy (FLIM). Here, a fluorescent material is introduced to the area of interest. Fluorescence has some interesting properties, for example a particular spectrum of emission and a rate of decay.

One particular fluorophore, Oregon Green BAPTA (OGB-1), decays at a rate that varies with the concentration of calcium present. Interestingly, calcium is an important indicator of neuron activity.

So it is possible, for example, to go inside neurons and look at their ion channels. These are the channels that allow neurons to communicate with other neurons. And you can basically see the amount of calcium that is present. You can probe optically how neurons communicate with other neurons just by looking at the concentrations of calcium in real time.

So scientists can use the OGB-1 to indicate the presence and concentration of calcium, and the whole process can be recorded in ultra-fine detail thanks to single-photon detectors, such as the ones present in the Megaframe camera. The camera is recording at the speed of thought.
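In FLIM, the lifetime is typically recovered by fitting a single-exponential decay to the histogram of photon arrival times collected by detectors such as Megaframe's. A minimal sketch of that fit with NumPy and SciPy, using synthetic data with an assumed 2 ns lifetime rather than real Megaframe measurements:

    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, amplitude, tau):
        """Single-exponential fluorescence decay model."""
        return amplitude * np.exp(-t / tau)

    # Synthetic photon-arrival histogram with a 2 ns lifetime plus noise
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 200)                      # nanoseconds
    counts = decay(t, 1000.0, 2.0) + rng.normal(0, 10, t.size)

    (amp, tau), _ = curve_fit(decay, t, counts, p0=(500.0, 1.0))
    print(f"estimated lifetime: {tau:.2f} ns")       # close to 2 ns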

“Biomedical scientists could in principle use this microscopic information about calcium to learn about macroscopic conditions like Parkinson’s, or Alzheimer’s or epilepsy,” Charbon stresses.

Megaframe could have a significant impact on any medical science that uses visible light emissive scanning technologies like FLIM. But it can even have an impact where visible light is not present.

Other Applications:

Other applications currently under exploration by Megaframe include intracellular DNA sequencing and proteomics, two huge areas for drug discovery, as well as basic scientific research for gene sequencing and protein-folding.

Other areas where Megaframe’s work could boost research results include cell membrane scanning, to discover what bacteria or other material are present, and this research could be extended to look at issues like water purity, and waterborne bacteria.

Exploring further Potential:

Another very promising technique is the combination of fluorescence imaging with MRI, or magnetic resonance imaging. “In MRI you need very strong magnetic fields in the cavity where you are performing the imaging, up to 10 Tesla, but conventional fluorescence technology won't work in these conditions,” says Charbon.

But Megaframe’s choice of photo detector – the Single-Photon Avalanche Diode (SPAD) – has been tested successfully in fields up to 9.4 Tesla, he reveals.

“Thus, it can be envisaged to have a system where fluorescence-enhanced imaging and functional MRI may be used simultaneously,” Charbon enthuses. “This is very useful in a number of biomedical applications, where one wants to monitor the correlation between the presence of certain molecules in organs, such as the brain, and their function.”

Again, pharmacology could benefit from this technique enormously, as well as epidemiological research.

“Our preliminary tests were conducted in an animal MRI, which in general has much higher fields than a human MRI. Human MRI tests will follow,” reveals Charbon, adding that the technique has been tested with other SPAD-based microsensors and has yielded good results.

“Even though we have not tested it with the Megaframe chip, it is a guaranteed success because the technology is in principle the same,” Charbon predicts.

The Megaframe project has just begun to explore the potential for their camera in biomedical applications, and the list just keeps on growing as their research continues. And that is just in the biomedical field. There are dozens of potential applications in fields as diverse as high-energy physics, entertainment and automotive diagnostics.

Sunday, August 23, 2009

Unleashing The Power In Beer

Wolfgang Bengel, the technical director at German biomass company BMP Biomasse Projekt, saw a business opportunity in solving the breweries’ grain waste headache. He reasoned that the leftover grain could be used to produce steam and biogas, which would provide energy for the breweries, lowering their energy costs as well as their costs of transporting grain to farms.

Bengel has successfully treated the residue from rice and sugar cane in boilers with atmospheric fluidized bed combustion systems to produce energy in China and Thailand, and he thought a similar process could be developed for the breweries’ wet spent grain. Water would first have to be removed from the wet spent grain; the grain would then be dried and burned to produce energy. “Beer making is energy intensive – you boil stuff, use hot water and steam and then use electric energy for cooling – so if you recover more than 50 percent of your own energy costs from the spent grain that’s a big saving,” says Bengel.

BMP turned to a long-standing business partner, fellow German biogas plant specialist INNOVAS, which had worked with it in China, to help develop the method as a EUREKA project. Germany’s BISANZ, which works on engineering projects, was also enlisted, as was Slovakian partner Adato, which designs boilers. By chance, BISANZ had been working on a boiler plant for a waste management company which entered bankruptcy, with assets being sold. The partners decided to buy the unwanted plant and to adapt the equipment to the process of burning spent grain.

Researchers had to add extra cleaning and filtering equipment to the combustion equipment they had bought. There are extremely high European standards for combustion and the team had to extend the research timetable as its initial burning tests failed to meet the requirements. “We had more than 50-60 test periods of burning mixtures of spent grain,” says Bengel.

They have managed to refine the process so that the burning meets the requirements. They have also perfected a process for the anaerobic treatment of waste water from breweries, thereby producing a complete system for breweries to treat their entire waste stream: wet spent grain and waste water. One of Germany’s technical certification bodies (TÜV) has certified the burning process as up to standard.

Breweries that sign up could become greener, creating their own energy and cutting down on lorries travelling to and from their factories. “Out of 100,000 tonnes of wet spent grain, you have 2,000 tonnes or even less of ashes,” says Bengel.

Sunday, May 24, 2009

SOUTH AMERICA: Rich in Biodiversity, Lagging in Protection

Although South America accounts for 40 percent of the world’s plant and animal species, it is the only region that has not submitted a report on its actions over the past year to implement the Convention on Biological Diversity, even though the deadline was Mar. 30.

"These reports are very important in order to combat threats against biodiversity," David Cooper, Programme Officer at the Secretariat of the Convention on Biological Diversity (CBD) told IPS, referring to the fourth annual report, which none of the 12 countries of South America have presented.

Without their reports, there is no record of the actions, strategies and progress they have made toward the protection of biodiversity, which holds up regional progress on the issue, the experts complained.

Cooper and more than 30 representatives of South American governments, environmental groups and other civil society organisations are in Lima participating in a high-level meeting to assess the region's progress and interests in relation to the CBD's 2010 biodiversity target of significantly reducing the current rate of biodiversity loss at the global, regional and national levels.

The two-day meeting at the headquarters of the Andean Community trade bloc ended Wednesday with a declaration on progress and limitations in reaching the target, and a list of priorities for action as the 2010 deadline looms.

The CBD, approved in 1992 at the Earth Summit in Rio de Janeiro, Brazil, is the foremost instrument for stemming the loss of biological diversity and ensuring equitable and sustainable access to the resources and benefits of that wealth. To promote compliance with the Convention, the Countdown 2010 Initiative was created within the scope of the International Union for the Conservation of Nature (IUCN), in 2004 for Europe and in 2007 for Latin America.

There are 150 partners in the initiative, including government representatives, private companies and organisations around the world committed to curbing the loss of biodiversity.

"Biodiversity is a local issue for each country, but we share a common responsibility," Sebastian Winkler, the head of the Countdown 2010 Initiative and an adviser on European policy, told IPS.

"But as a region, Latin Ameria has not lived up to the commitments made in signing the Convention because so far, the countries have not reported on their national strategies, unlike Africa, for example," he said.

Winkler stressed that the countries of Latin America must make an effort to "monitor the present state of biodiversity and play a more active part in international processes."

South America not only possesses 40 percent of the planet’s biodiversity, but also 25 percent of its forests and 26 percent of its fresh water sources. This vast natural wealth has also made it one of the most vulnerable regions. Among the main threats pointed out at the Lima meeting were the impact of climate change, which is driving animal and plant species to extinction; the uncontrolled extraction of natural resources; and changes in land use in the Amazon, that is, the expansion of areas devoted to agriculture and other productive activities to the detriment of the jungle.

Statistics on damage to biodiversity worldwide are worrying. According to the IUCN's Red List, last updated in 2008, there are 16,928 species threatened with extinction, equivalent to 38 percent of the species catalogued. Thirty-six million hectares of pristine forests have been lost every year since 2000.

The Andean Community, made up of Bolivia, Colombia, Ecuador and Peru, has encouraged countries to create a regional policy to make progress on their commitments, although those involved have admitted that they will not be able to meet the target.

On Tuesday, the first day of the Lima meeting, a review of progress was carried out, which was far from encouraging, and preliminary priorities for 2010 were proposed. Among these were strengthening alliances between government and civil society representatives, defining indicators to measure progress on the commitments, and enlisting the private sector in the defence of biodiversity. Another proposal was to create communication strategies to spread simple, non-technical information, in order to involve the public in this global problem.

Monday, May 18, 2009

Can One Inherit Happiness?

A new study suggests that the feelings we experience during our lifetimes could affect our children.

Dr. Halabe Bucay suggests that a wide range of chemicals that our brain generates when we are in different moods could affect 'germ cells' (eggs and sperm), the cells that ultimately produce the next generation. Such natural chemicals could affect the way that specific genes are expressed in the germ cells, and hence how a child develops.

In his article in the latest issue of Bioscience Hypotheses, Dr Alberto Halabe Bucay of Research Center Halabe and Darwich, Mexico, suggested that the hormones and chemicals resulting from happiness, depression and other mental states can affect our eggs and sperm, resulting in lasting changes in our children at the time of their conception.

Brain chemicals such as endorphins, and drugs, such as marijuana and heroin are known to have significant effects on sperm and eggs, altering the patterns of genes that are active in them.
"It is well known, of course, that parental behavior affects children, and that the genes that a child gets from its parents help shape that child's character." said Dr. Halabe Bucay. "My paper suggests a way that the parent's psychology before conception can actually affect the child's genes."

"This is an intriguing idea" commented Dr. William Bains, Editor of Bioscience Hypotheses. "We wanted to publish it to see what other scientists thought, and whether others had data that could support or disprove it. That is what our journal is for, to stimulate debate about new ideas, the more groundbreaking, the better."

Sunday, May 17, 2009

How Oil Gets Stuck Underground In Inaccessible Places

It is a mystery to many people why the world is running out of oil when most of the world’s oilfields have only been half emptied. However, some of the oil that has been located is trapped as droplets in small cavities in the surrounding rock, or is stuck to the walls of those cavities, and cannot be accessed by the techniques currently used in the oil industry.

Now, new research may have come up with an explanation as to where and how North Sea oil clings to underground rocks. This explanation could turn out to be the first step on the way to developing improved oil production techniques with the intent of increasing oil production from Danish oil fields.

A research group at the Nano-Science Center, part of the Institute of Chemistry at the University of Copenhagen, has investigated drill cores collected from North Sea oil fields using an atomic force microscope. Their investigations show that the spaces which contain oil have totally different surface properties than expected from our knowledge of the minerals that make up the rock. The rocks which contain oil in the Danish part of the North Sea are primarily chalk, the same type of rock that the cliffs of Stevns and Møns are made of. Assistant Professor Tue Hassenkam, who led the research, says that this is the first time investigations of this type have been carried out on chalk from a North Sea oil field. The preliminary results were published this week in the respected scientific journal PNAS (Proceedings of the National Academy of Sciences).

'Previous investigations were carried out on the surface properties of pure mineral crystals. But our investigation has shown that this chalk has a different and more complex structure' says Tue Hassenkam.

The oil-bearing layers in the subsurface are reminiscent of a sponge. The oil "hides" in tiny pores and gaps, and only some of it can be pressed out of the chalk and into the borehole by injecting water into the chalk layer. The rest is left behind as small droplets of oil surrounded by water, either in small gaps in the rock or stuck to the walls of the pores. The chalk particles ought to repel oil if they behave like particles of the mineral calcite, which chalk is almost entirely made of. However, the new investigations, carried out with a particularly powerful microscope, have shown that the surfaces of the pores in the chalk are partially covered in a material to which oil can stick. Assistant Professor Hassenkam believes that this surprising surface behaviour can be explained by studying how the chalk was formed.

'Chalk is actually the casings of ancient algae. The algae gave their casings a type of "surface coating" to make them resistant to water. And it is probably this surface coating that we can see in action here, even 60 million years later,' according to Assistant Professor Hassenkam.

If we can manage to squeeze even a few percent more oil out of the seabed under the North Sea, it could be worth millions of Danish crowns (DKK) for Denmark. Therefore Mærsk Oil and Gas AS, on behalf of DUC (Dansk Undergrunds Consortium), along with the Danish National Advanced Technology Foundation, is supporting a project carried out by Professor Susan Stipp's research group, the so-called Nano-Chalk Venture, which has been running for the last two years. Tue Hassenkam originally became interested in chalk because he found the algae casings so beautiful. Today, after a year's work in front of a microscope, he is glad that his work also has a practical application. An understanding of how the oil clings to the chalk may help in developing a method to release it. And that will be the second part of the Nano-Chalk Venture.

High Blood Pressure Could Be Caused By A Common Virus

A new study suggests for the first time that cytomegalovirus (CMV), a common viral infection affecting between 60 and 99 percent of adults worldwide, is a cause of high blood pressure, a leading risk factor for heart disease, stroke and kidney disease.

Led by researchers at Beth Israel Deaconess Medical Center (BIDMC) and published in the May 15, 2009 issue of PLoS Pathogens, the findings further demonstrate that, when coupled with other risk factors for heart disease, the virus can lead to the development of atherosclerosis, or hardening of the arteries.

"CMV infects humans all over the world," explains co-senior author Clyde Crumpacker, MD, an investigator in the Division of Infectious Diseases at BIDMC and Professor of Medicine at Harvard Medical School. "This new discovery may eventually provide doctors with a whole new approach to treating hypertension, with anti-viral therapies or vaccines becoming part of the prescription."

A member of the herpes virus family, CMV affects all age groups and is the source of congenital infection, mononucleosis, and severe infection in transplant patients. By the age of 40, most adults will have contracted the virus, though many will never exhibit symptoms. Once it has entered the body, CMV is usually there to stay, remaining latent until the immune system is compromised, when it then reemerges.

Previous epidemiological studies had determined that the CMV virus was linked to restenosis in cardiac transplant patients, a situation in which the heart's arteries "reblock." The virus had also been linked to the development of atherosclerosis, the hardening of the heart's arteries. But, in both cases, the mechanism behind these developments remained a mystery. This new study brought together a team of researchers from a variety of disciplines – infectious diseases, cardiology, allergy and pathology – to look more closely at the issue.

"By combining the insights of investigators from different medical disciplines, we were able to measure effects of a viral infection that may have been previously overlooked," explains Crumpacker.

In the first portion of the study, the scientists examined four groups of laboratory mice. Two groups of animals were fed a standard diet and two groups were fed a high cholesterol diet. After a period of four weeks, one standard diet mouse group and one high-cholesterol diet mouse group were infected with the CMV virus.

Six weeks later, the animals' blood pressures were measured by the cardiology team using a small catheter inserted in the mouse carotid artery. Among the mice fed a standard diet, the CMV-infected mice had increased blood pressure compared with the uninfected group. But even more dramatically, 30 percent of the CMV-infected mice that were fed a high-cholesterol diet not only exhibited increased blood pressure, but also showed signs of having developed atherosclerosis.

"This strongly suggests that the CMV infection and the high-cholesterol diet might be working together to cause atherosclerosis," says Crumpacker. In order to find out how and why this was occurring, the investigators went on to conduct a series of cell culture experiments.

Their first analysis demonstrated that CMV stimulated production of three different inflammatory cytokines – IL6, TNF, and MCP1 – in the infected mice, an indication that the virus was causing inflammation to vascular cells and other tissues.
A second analysis found that infection of a mouse kidney cell line with murine CMV led to an increase in expression of the renin enzyme, which has been known to activate the renin-angiotensin system and lead to high blood pressure. Clinical isolates of human CMV in cultured blood vessel cells also produced increased renin expression.

"Viruses have the ability to turn on human genes and, in this case, the CMV virus is enhancing expression of renin, an enzyme directly involved in causing high blood pressure," says Crumpacker. When the scientists inactivated the virus through the use of ultraviolet light, renin expression did not increase, suggesting that actively replicating virus was causing the increase in renin.

In their final experiments, the researchers demonstrated that the protein angiotensin II was also increased in response to infection with CMV. "Increased expression of both renin and angiotensin II are important factors in hypertension in humans," says Crumpacker. "What our study seems to indicate is that a persistent viral infection in the vessels' endothelial cells is leading to increased expression of inflammatory cytokines, renin and angiotensin II, which are leading to increased blood pressure."

According to recent figures from the American Heart Association, one in three U.S. adults has high blood pressure, and because there are no known symptoms, nearly one-third of these individuals are unaware of their condition. Often dubbed "the silent killer," uncontrolled high blood pressure can lead to stroke, heart attack, heart failure or kidney failure, notes Crumpacker.

"We found that CMV infection alone led to an increase in high blood pressure, and when combined with a high-cholesterol diet, the infection actually induced atherosclerosis in a mouse aorta," says Crumpacker. "This suggests that further research needs to be directed at viral causes of vascular injury. Some cases of hypertension might be treated or prevented by antiviral therapy or a vaccine against CMV."

This study was funded by grants from the National Heart, Lung and Blood Institute of the National Institutes of Health.

Helping the Economy May Hurt the Environment

The European Economic Recovery Plan devised by the European Commission last year to help deal with the financial crisis is likely to fast-track environmentally damaging projects in the new member states.

One of the tenets of the European Economic Recovery Plan (EERP), launched in November 2008 by European Commission (EC) president Jose Manuel Barroso, is acceleration of payments to new EU member states from the European Structural and Cohesion Funds and the European Investment Bank.

The accelerated funds, amounting to about 23 billion euros, are destined mainly for infrastructure development, and are considered essential by the EC to creating employment and assisting the economic recovery of the Central and Eastern European countries.

The EERP, which was approved by the European Parliament in March 2009, stresses the need for "smart" investments through promotion of clean technologies, support for micro-enterprises, and programmes for re-training labour.

But environment groups warn that these payments could be used by the new member states for infrastructure projects that are environmentally costly, have better alternatives, or are not sustainable in the long run.

Bankwatch, an independent group monitoring the impact of investments by financial institutions and corporations Europe-wide, has published a map of 55 EERP projects that are "environmentally threatening and economically unsound." The list includes 22 incinerators - 12 of them in Poland - and several transport routes that pass through naturally protected areas.

These projects are high on the list of government priorities, and are the ones most likely to get financing through the EERP.

"EU funds granted to post-socialist states provide hard cash for heavy investments, but fail to deliver capacity building and knowledge transfer for small-scale projects, which usually have more development effect for local and regional communities," Keti Medarova from Bankwatch told IPS. "Because of this, the money allocated for small initiatives cannot be absorbed, and gets re-allocated towards the ever-growing costs of large infrastructures."

Medarova warns that the combination of "the Keynesian approach promoted around the crisis to pump in public money for big infrastructure" can have negative consequences in the new member states, where politicians are keen to use this opportunity to "undertake grand promises and plans at the expense of promoting local and regional developments."

The 55 controversial projects lined up for EERP funding are still to receive the green light. Medarova says the map is intended as "an early warning" for the EC, which has a say in granting the money, and could also monitor procedures such as the environmental impact assessment and public consultation.

Many of the 55 projects lined up for EU funding have drawn considerable local opposition. Ignoring protests against incinerators in places such as Warsaw or Krakow in Poland, the government increased the number of planned incinerators from eight to 12.

Despite the technological progress achieved over recent years in making incinerators less damaging to health and the environment, most incinerators still run the risk of producing carcinogenic emissions such as dioxins and metal particles.

Bankwatch figures also show that the 12 Polish incinerators would use up 66 percent of the cohesion funds granted to the country for waste management, restricting investment in more environment-friendly and cost-efficient forms of waste management such as collection and recycling schemes. The EU is in fact going against its own policy of promoting recycling, reduction and reuse, according to Bankwatch. Currently, Poland recycles only 3 percent of its municipal waste.

According to the independent Global Alliance for Incinerator Alternatives (GAIA), lobbying from companies building incinerators has led to a policy that is more permissive for investors. In June 2008, the group revealed that Caroline Jackson, Member of the European Parliament and rapporteur for the EU Waste Framework Directive, held a remunerated position in the environmental advisory board of waste industry company Shanks PLC.

It is not just incinerators that are controversial. The R52 motorway planned in the Czech Republic to connect Brno city with Vienna would affect several sites protected under the European framework Natura 2000. After evaluating the environmental impact of the route, as well as an alternative proposed to it, the Czech government decided in June 2008 to go ahead with both projects, even though they would service the same transportation needs.

In Bulgaria, the planned nuclear plant at Belene, prioritised by the government for EU funding, has been opposed by environmentalists and specialists for years, principally on the grounds that it would lie on a highly seismic area, making it more prone to accidents.

Bankwatch has sounded an additional warning over the public-private partnership formula being promoted for investment through the EERP. The EC says public-private partnerships have strong stimulus effects for economic recovery, but critics say fast access to this money encourages corrupt politicians and greedy companies. The International Monetary Fund (IMF) has cautioned that "the money received by private corporations in the current setting is more likely to be hoarded than reinvested."

"The economic crisis should not be used as a political momentum to push forward controversial infrastructure projects with little recovery effects for the economy," says Medarova. "Instead, what the EC should insist on is even stricter implementation of environmental legislation, especially as regards the impact assessment procedures, the evaluation of alternative solutions - both for waste and transport - and increased transparency and public debate over how the money for economic recovery and stimulus measures is spent."

More Investment in Production Won’t Cure African Food Crisis

The food crisis in African states will not be solved by investment to spur agricultural production, because the problem is not food output but poverty, which makes food unaffordable for urban Africans.

This is the argument of Gilles Saint-Martin, the head of international relations for the French Agricultural Research Centre for International Development, known by its French acronym CIRAD. CIRAD’s approach to sustainable development focuses on the long-term ecological, economic and social consequences of change in developing communities and countries.

Saint-Martin talks to Hilaire Avril about the dire need for investment in African agricultural research; the effects that the economic partnership agreements will have on food production; and whether the African Union should adopt its own Common Agricultural Policy (CAP).

Q: You recently wrote that, despite last year’s food riots in many developing countries, ‘‘African agriculture is not disaster-stricken, and […] agricultural production has steadily increased across Africa since the 1960s, picking up even more speed since the 1980s.’’ How do you explain the food crisis currently affecting several African countries?

Gilles Saint-Martin (GSM): First and foremost, the problem is poverty. Last year’s food riots were mainly urban crises, affecting city-dwellers who could not afford to buy basic food anymore. The problem is not agricultural output, which is sufficient, but poverty, which makes it unaffordable.

The solution to the crisis is to tackle both issues, that is, to increase agricultural production and to decrease poverty by fostering rural economic activity.

The problem is that, for several years now, the traditional solidarity system between the African countryside and the cities has been severely undermined by repeated crises and that it has now broken down. Therefore, city-dwellers bear the full brunt of rising food prices.

Several CIRAD studies show that agricultural production has significantly risen in several African countries. Cassava outputs in Central and West Africa, for instance, have increased. But the problem is that the demographic increase is faster. We, at CIRAD, do not believe massive foreign investments will solve the problem of African food security. We’re still waiting to see what the outcome of investments such as those made in Senegal last year will be.

For the moment all the examples we’ve seen are geared towards ensuring food security for rich countries, which invest in agriculture for their own food security (and) not to share the production with the African country hosting the investment.
However, there are some under-reported but interesting initiatives - mainly in the Indian Ocean. Mauritius, for instance, which faces significant food supply challenges, says it would consider investing in countries with a high agricultural potential such as Mozambique, and sharing the production between investors, producers and the global market.

But, again, the main problem remains that of urban populations accessing affordable food. It’s going to get worse as migrants send fewer remittances back home to urban Africans because of the global economic downturn.

Q: The African Development Bank, the International Fund for Agricultural Development and other organisations recently created an ‘‘Investment Fund for African Agriculture’’. Are such funds the solution to the food crisis?

GSM: There is currently a trend towards increasing agricultural output rather than implementing regional policies that foster rural activity to help urban populations access affordable food while regulating prices on a regional level. What worries us is that, whether public or private, these funds’ position seems to be ‘‘let’s simply produce more, and we’ll all be fine’’.

Q: How do you expect the proposed EPAs (the trade liberalisation deals that the European Union is pressing African, Caribbean and Pacific countries to sign) to affect this situation?

GSM: We at CIRAD tend to think these agreements should be signed with caution. But many countries have signed them anyway, as they would otherwise have lost access to European markets.

Our partners in Cote d’Ivoire and Cameroon recently told us ‘‘if we hadn’t signed the agreement, we would not have been able to keep selling our bananas in the European Union’’. Several countries signed these agreements essentially under the pressure of producers’ associations, who were afraid they would lose access to markets.

I haven’t followed the latest developments but, as they were proposed two years ago, EPAs seemed based on an outmoded model.

Q: What is CIRAD’s answer to the food crisis?

GSM: We’re very preoccupied with innovation, which is a key element in solutions for the North as well as the South. CIRAD focuses on research, so we naturally invest in innovation, whether in urban or rural environments, to mitigate poverty and to enhance food security and a more efficient use of resources.

Unfortunately, supporting innovation and research has never been considered a priority. Many donors tell us ‘‘we want immediate results, so you must implement innovative ideas urgently’’. But the time needed to implement innovative ideas in real social settings, with tangible objectives, is not compatible with the expected response time to a food crisis.

Our African research partners have been entirely de-structured by the (International Monetary Fund’s) structural adjustment policies in the 1980s and 90s. Most have still not recovered. We work mainly with young African scholars and researchers, but young recruits are scarce in Africa.

This is our main warning call. African research capacities need to be rebuilt, created or consolidated in order to foster innovations allowing us to cope with evolving societies, to preserve limited resources and to secure food supply.

Also, after several years of soul-searching, we have identified the development of the rural sector, by intensifying ecological production, as a priority. That means not relying on more fertilizers or herbicides, but optimising the use of ecosystems’ natural cycles and learning more about the way plants and soil work.

Q: Should Africa adopt its own version of the European Union’s (EU) Common Agricultural Policy?

GSM: Some versions of it already exist. In West Africa, the Economic Community of West African States and the West African Economic and Monetary Union have adopted regional agricultural policies which are not structured like the CAP but resemble it in that they harmonise national policies, including tariffs.

The Southern African Development Community and East African Community are also thinking of similar schemes. I think it’s unavoidable. Solutions can’t be found on a national level, they have to be regional.

The 2005 food crisis in Niger, for instance, was not a national but a regional emergency, which could have been solved if regional procedures had been put in place to share resources between Niger, Nigeria and Mali. This must be the priority for food security policies.

Q: What role could Europe play in constructing these regional clusters?

GSM: Europe’s organisational model for regional agriculture cannot be replicated but the EU could assist in setting up African region-wide systems.

But these regions’ agricultural products must also be protected, from time to time. The European CAP was built on these principles and still protects European farmers to some degree. The CAP so far focuses on markets, resources and consumer protection. In 2013, when the CAP is to be reformed, I think it should include food security as one of its main objectives.

European farmers, when you talk to them, are preoccupied by their production and purchasing power, of course. But they are also conscious of the food security problems the world faces. Incorporating world food security in the CAP’s objectives would be a positive evolution. It would help decompartmentalise the EU from global agriculture.

Sunday, May 3, 2009

Scientists Warn: Two-Degree Rise Ever More Likely

Climate scientists are calling for a phase-out of fossil fuels because humans are now pumping so much carbon dioxide (CO2) into the atmosphere that the '2-degree-C climate balloon' will burst otherwise, new studies show.

That 2-degree C climate balloon has a maximum capacity of less than 1,400 gigatonnes of CO2 total emissions from the year 2000 to 2050, Malte Meinshausen and colleagues report in the current issue of Nature. The European Union and others consider a global temperature rise of more than 2 degrees C as dangerous and potentially catastrophic. Temperatures are already 0.8 C warmer than the pre-industrial period.

The reality is that global emissions for the last seven years amounted to almost 250 gigatonnes of these long-lived greenhouse gases, meaning that the current and growing rates of fossil fuel emissions would burst the balloon in about 20 years – or less. Even if emissions are held to 1,400 gigatonnes maximum for the next 40 years, there is still a 50-percent probability of exceeding 2 degrees C, said Meinshausen, lead author of the study and climate researcher at the Potsdam Institute for Climate Impact Research.
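The "about 20 years" figure is simple budget arithmetic: subtract the roughly 250 gigatonnes already emitted from the 1,400-gigatonne budget and ask how long the remainder lasts if emissions keep growing. A minimal sketch, assuming an illustrative starting rate of about 36 Gt CO2 per year and 3 percent annual growth (the starting rate is an assumption, not a figure from the study):

    # Years until the remaining CO2 budget is exhausted under compound growth.
    # 1,400 Gt budget and ~250 Gt already emitted are from the article;
    # the 36 Gt/yr starting rate and 3 % growth are illustrative assumptions.
    def years_to_exhaust(budget_gt, rate_gt_per_yr, growth):
        emitted, years = 0.0, 0
        while emitted < budget_gt:
            emitted += rate_gt_per_yr
            rate_gt_per_yr *= 1 + growth
            years += 1
        return years

    print(years_to_exhaust(1400 - 250, 36.0, 0.03))  # roughly 23 years under these assumptions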

Indigenous peoples from around the world also called for a phase-out of fossil fuels at the conclusion of the first Indigenous Peoples' Global Summit on Climate Change in Anchorage, Alaska, that concluded last week.

"That call is well-supported by the evidence in this study," Meinshausen told IPS.

However, the world's future global carbon budget is likely less than 1,400 gigatonnes. When other short-lived warming gases like methane are included, the total 'forcing', i.e. warming, could be 10 to 40 percent greater by the year 2100, said Meinshausen.

And some climate feedbacks - changes that will amplify or accelerate the warming - are absent from computer models. "Our modeling cannot account for emissions of methane from melting permafrost," he said.

Permafrost - permanently frozen bog and peatland - contains enormous amounts of organic carbon, perhaps enough to triple the amount currently in the atmosphere.

"Only a fast switch away from fossil fuels will give us a reasonable chance to avoid considerable warming," said Meinshausen. "We shouldn’t forget that a 2-degree C global mean warming would take us far beyond the natural temperature variations that life on Earth has experienced since we humans have been around."

This will be a serious challenge, he said, because there is plenty of carbon left in the ground. Proven reserves of oil, gas and coal represent four times the amount of carbon that would burst the 2-degree climate balloon. Burning just one quarter of what's left in the ground will bring humanity to the 50-50 point of tipping into dangerous climate change.

Delay is not an option when it comes to the fossil fuel phase-out, scientists stress. Even though a tonne of carbon is a tonne of carbon, whether released today or in 50 years' time, there is only so much the atmosphere can take before a 2-degree rise or more is inevitable, Meinshausen, Myles Allen of the University of Oxford and others write in a Nature Reports Climate Change commentary.

"Emitting CO2 more slowly buys time, perhaps vital time, but it will only achieve our ultimate goal in the context of a strategy for phasing out net CO2 emissions altogether," they conclude.

"Climate policy needs an exit strategy: as well as reducing carbon emissions now, we need a plan for phasing out net emissions entirely," Allen said in a release.

So what are the targets for negotiators at the United Nations Framework Convention on Climate Change (UNFCCC) conference in Copenhagen this December?

If negotiators heed the scientific evidence, then a new global agreement's goal will be to reduce global emissions by 50 percent compared to 1990, and to do so by 2050. To achieve this, the current three-percent annual growth in carbon emissions must flatline by 2015 and then start declining by 3 percent per year, reports Martin Parry of the Grantham Institute for Climate Change and the Centre for Environmental Policy at Imperial College London in another Nature study.

"If we do this it leaves an even chance of exceeding 2-degree C of warming," Parry and colleagues write.

If mitigation efforts are not substantial enough and emissions peak in the year 2025, then a 3-degree C rise in temperatures will likely occur. The damage from this level of warming could be substantial, placing billions more people at risk of water shortage and millions more at risk of coastal flooding. To avoid such damage will require massive investment in adaptation, such as improving water supply and storage, and protecting low-lying settlements from rising seas.

A final cautionary note: "The true sensitivity of the Earth system may well be higher, implying that any temperature-based target will become progressively harder to maintain as slower feedbacks kick in," write Gavin Schmidt of the NASA Goddard Institute for Space Studies and David Archer of the University of Chicago in a short article in Nature on Wednesday.

"The bottom line? Dangerous change, even loosely defined, is going to be hard to avoid," they said.

Like an oil spill, it is far better and cheaper to avoid making the mess in the first place, they conclude.

Friday, May 1, 2009

World Bank Provides Support to Improve Afghanistan’s Financial Sector

On April 30, 2009, the World Bank approved a US$8 million grant to help improve access to formal banking services in Afghanistan and to strengthen Da Afghanistan Bank’s core function of banking supervision and regulation.

In 2002 after the fall of the Taliban regime, the formal financial sector in Afghanistan was almost inoperative and the legal framework was virtually non-existent. Since then, Afghanistan’s financial sector has gone through two phases of development. During the first phase (2002-04), a basic legal and institutional framework for a modern financial sector was introduced, which laid the foundation for the re-establishment of Da Afghanistan Bank (DAB) as the central bank with autonomous regulatory authority to implement monetary policy and banking regulation and supervision.

In the second phase (2005-present), formal financial services emerged and a number of private commercial banks were established. Currently, there are 17 commercial banks operating in Afghanistan, which include 2 state-owned commercial banks, 10 private commercial banks, and 5 branches of foreign commercial banks. Despite these achievements, a weak financial sector still remains one of the major binding constraints to private sector development in Afghanistan.
The Financial Sector Strengthening Project supports Afghanistan National Development Strategy’s vision to establish a modern and competitive financial sector. The project will specifically strengthen the capacity of Da Afghanistan Bank (DAB) in the areas of banking supervision, accounting, internal audit, and human resource management. It will also develop necessary financial infrastructure such as public credit registry, collateral registry and Afghanistan Institution of Banking.


"The legal and regulatory framework of Afghanistan’s financial sector has improved significantly. But many challenges remain, notably increasing access to financial services as well as ensuring sustainability of the sector,” said Md. Reazul Islam, World Bank Senior Private Sector Development Specialist and Project Team Leader. “To overcome these challenges, the government needs to enforce implementation of rules and regulation. The World Bank remains committed to provide technical as well as financial resources necessary to build a sustainable and accountable financial sector in Afghanistan.”

The project also supports some of the key areas agreed by the Government of Afghanistan and its development partners in the Enabling Environment Conference Road Map in 2007.

The total cost of the project is estimated at around US$9.46 million. In addition to IDA’s US$8 million grant, the International Finance Corporation, the private sector arm of the World Bank Group, has provided US$0.59 million in technical support. Some US$0.87 million has been contributed as counterpart funding by Da Afghanistan Bank, the Afghanistan Banks Association and the Microfinance Investment Support Facility for Afghanistan (MISFA).

For more information on the Bank’s work in Afghanistan, please visit: http://www.worldbank.org.af

World Bank Provides More Support to India’s Small and Medium Enterprises

On April 30, 2009, the World Bank approved a US$400 million additional financing loan to the Small Industries Development Bank of India (SIDBI), designed to improve access to finance for Small and Medium Enterprises (SMEs). This additional financing will help scale up the fully disbursed original project, which had been approved by the World Bank on November 30, 2004.

Access to adequate and timely financing on competitive terms, particularly longer tenure loans remains a challenge for Indian SMEs. This problem has been exacerbated by the current global financial crisis, the ensuing liquidity constraints and the slowdown in credit growth in the Indian financial sector. In particular, credit growth to SMEs has declined over the last year, which has held back the growth of SMEs and impacted overall growth and development.

"This Project is part of a larger program of support in response to the Government of India request for funding in light of the financial crisis. It is targeted particularly at SMEs, to help address the credit slowdown that has resulted from the financial crisis,” said Roberto Zagha, World Bank Country Director for India. “Achieving and sustaining growth and employment will require a sharp step up in industrial and services growth. This needs to be spurred by SMEs which have the greatest potential to provide employment.”

The credit facility supported by the Project will channel long-term and working capital loans for SMEs in geographical areas beyond those that were covered in the original Project. This includes expanding to new geographical areas, possibly to India’s low-growth states, thereby promoting inclusive growth.

Under the credit facility SIDBI will also explore refinancing other banks and financial institutions for on-lending to SMEs. In addition, this Project will build linkages with an on-going DFID financed technical assistance component which is helping banks enhance the quality of their SME loan portfolios, strengthening business development services and building market linkage programs. “This integrated Project will help SMEs improve their profitability and competitiveness, and become more creditworthy,” said Niraj Verma, World Bank Senior Financial Sector Specialist and project team leader.

Finally, the Risk Sharing Facility supported by the Project will expand the coverage of this innovative initiative launched under the parent Project.

The lending from the original project has covered 927 SMEs spread across 10 Indian states. A survey showed that nearly two-thirds of the SMEs financed upgraded their technology, which helped increase productivity.

The loan, from the International Bank for Reconstruction and Development (IBRD), is backed by a Republic of India guarantee. It has a 15 year maturity which includes a 5-year grace period.

For more information on the Bank’s work in India, visit http://www.worldbank.org.in

Monday, March 23, 2009

Discovery of New Microorganisms In Earth's Stratosphere

Three new species of bacteria, not found elsewhere on Earth and highly resistant to ultraviolet radiation, have been discovered in the upper stratosphere by Indian scientists.

One of the new species has been named Janibacter hoylei, after the distinguished astrophysicist Fred Hoyle; the second, Bacillus isronensis, recognises the contribution of ISRO in the balloon experiments which led to its discovery; and the third, Bacillus aryabhata, is named after India’s celebrated ancient astronomer Aryabhata, who also gave his name to ISRO's first satellite.

The experiment was conducted using a 26.7 million cubic feet balloon carrying a 459 kg scientific payload soaked in 38 kg of liquid neon, which was flown from the National Balloon Facility in Hyderabad, operated by the Tata Institute of Fundamental Research (TIFR). The payload consisted of a cryosampler containing sixteen evacuated and sterilised stainless steel probes.

Throughout the flight, the probes remained immersed in liquid neon to create a cryopump effect. These cylinders, after collecting air samples from heights ranging from 20 km to 41 km, were parachuted down and safely retrieved. The samples were analysed by scientists at the Center for Cellular and Molecular Biology (CCMB), Hyderabad, as well as at the National Center for Cell Science (NCCS), Pune, for independent examination, with both laboratories following similar protocols to ensure homogeneity of procedure and interpretation.

The Findings

In all, 12 bacterial and 6 fungal colonies were detected, nine of which, based on 16S RNA gene sequence, showed greater than 98% similarity with known species reported on Earth. Three bacterial colonies, namely PVAS-1, B3 W22 and B8 W22, were, however, totally new species. All three newly identified species had significantly higher UV resistance compared to their nearest phylogenetic neighbours. Of these, PVAS-1, identified as a member of the genus Janibacter, has been named Janibacter hoylei sp. nov.; the second new species, B3 W22, was named Bacillus isronensis sp. nov.; and the third, B8 W22, Bacillus aryabhata.
The precautionary measures and controls operating in this experiment inspire confidence that these species were picked up in the stratosphere. While the present study does not conclusively establish the extra-terrestrial origin of microorganisms, it does provide positive encouragement to continue the work in our quest to explore the origin of life.
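
As a toy illustration of what a figure like "greater than 98% similarity" means, the sketch below computes simple percent identity between two already-aligned sequences. The real comparison relied on 16S gene sequences and standard alignment tools against reference databases; the sequences here are invented.

```python
# Toy percent-identity calculation between two aligned sequences.
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percentage of matching positions, assuming the sequences are pre-aligned."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    matches = sum(1 for a, b in zip(seq_a, seq_b) if a == b)
    return 100.0 * matches / len(seq_a)

known_species = "AGCTTAGGCTAACGTTAGCA"   # made-up reference fragment
new_isolate   = "AGCTTAGGCTAACGATAGCA"   # one mismatch
print(f"{percent_identity(known_species, new_isolate):.1f}% identity")
# 95.0% here; in the study, colonies above 98% were counted as known species.
```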

This multi-institutional effort had Jayant Narlikar of the Inter-University Centre for Astronomy and Astrophysics, Pune, as Principal Investigator, with veteran scientists U.R. Rao of ISRO and P.M. Bhargava of Anveshna serving as mentors of the experiment. S. Shivaji from CCMB and Yogesh Shouche from NCCS were the biology experts, and Ravi Manchanda from TIFR was in charge of the balloon facility. C.B.S. Dutt was the project director from ISRO, responsible for preparing and operating the complex payload.

This was the second such experiment conducted by ISRO, the first one being in 2001. Even though the first experiment had yielded positive results, it was decided to repeat the experiment by exercising extra care to ensure that it was totally free from any terrestrial contamination.

Thursday, March 19, 2009

How Brain Records Memories

It may be possible to "read" a person's memories just by looking at brain activity, according to research carried out by Wellcome Trust scientists. In a study published in the journal Current Biology, they show that our memories are recorded in regular patterns, a finding which challenges current scientific thinking.

Demis Hassabis and Professor Eleanor Maguire at the Wellcome Trust Centre for Neuroimaging at UCL (University College London) have previously studied the role of a small area of the brain known as the hippocampus which is crucial for navigation, memory recall and imagining future events. Now, the researchers have shown how the hippocampus records memory.

When we move around, nerve cells (neurons) known as "place cells", which are located in the hippocampus, activate to tell us where we are. Hassabis, Maguire and colleagues used an fMRI scanner, which measures changes in blood flow within the brain, to examine the activity of these place cells as a volunteer navigated around a virtual reality environment. The data were then analysed by a computer algorithm developed by Demis Hassabis.
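
The study's own decoding algorithm is not described in this article, but the general idea of predicting location from distributed activity patterns can be sketched with a generic classifier on synthetic data; the data, features and classifier choice below are assumptions, not the authors' method.

```python
# Illustrative "mind-reading" decoder: predict which of four locations a
# synthetic activity pattern belongs to. Real studies use fMRI voxel data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_voxels, n_trials, n_locations = 200, 120, 4

# Assume each location evokes a slightly different mean activity pattern.
location_patterns = rng.normal(0, 1, (n_locations, n_voxels))
labels = rng.integers(0, n_locations, n_trials)
activity = location_patterns[labels] + rng.normal(0, 2.0, (n_trials, n_voxels))

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, activity, labels, cv=5).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance is 0.25)")
```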

"We asked whether we could see any interesting patterns in the neural activity that could tell us what the participants were thinking, or in this case where they were," explains Professor Maguire, a Wellcome Trust Senior Research Fellow. "Surprisingly, just by looking at the brain data we could predict exactly where they were in the virtual reality environment. In other words, we could 'read' their spatial memories."

Earlier studies in rats have shown that spatial memories – how we remember where we are – are recorded in the hippocampus. However, these animal studies, which measured activity at the level of individual or dozens of neurons at most, implied that there was no structure to the way that these memories are recorded. Hassabis and Maguire's work appears to overturn this school of thought.

“fMRI scanners enable us to see the bigger picture of what is happening in people’s brains,” she says. “By looking at activity over tens of thousands of neurons, we can see that there must be a functional structure – a pattern – to how these memories are encoded. Otherwise, our experiment simply would not have been possible to do.”

Professor Maguire believes that this research opens up a range of possibilities of seeing how actual memories are encoded across the neurons, looking beyond spatial memories to more enriched memories of the past or visualisations of the future.

"Understanding how we as humans record our memories is critical to helping us learn how information is processed in the hippocampus and how our memories are eroded by diseases such as Alzheimer's," added Demis Hassabis.

"It's also a small step towards the idea of mind reading, because just by looking at neural activity, we are able to say what someone is thinking."

Professor Maguire led a study a number of years ago which examined the brains of London taxi drivers, who spend years learning "The Knowledge" (the maze of London streets). She showed that in these cabbies, an area to the rear of the hippocampus was enlarged, suggesting that this was the area involved in learning location and direction. In the new study, Hassabis, Maguire and colleagues found that the patterns relating to spatial memory were located in this same area, suggesting that the rear of the hippocampus plays a key role in representing the layout of spatial environments.

Sunday, March 15, 2009

Research on How to Improve Individual Decisions

Herd mentality. Angry mob. Mass hysteria. As these phrases suggest, we are not always confident that a large group of people will come up with the smartest decisions. So it may be surprising to learn that numerous studies have shown that a crowd of people usually gives more accurate responses to questions compared to a mere individual.

Averaging the responses provided by a group increases accuracy by canceling out a number of errors made across the board (such as over- and under-estimating the answer).

What happens when we are on our own? What if there is no one else around to consult with before making a judgment - how can we be confident that we are giving a good answer? Psychologists Stefan M. Herzog and Ralph Hertwig from the University of Basel wanted to know if individuals could come up with better answers using a technique they designed and called "dialectical bootstrapping."

Dialectical bootstrapping is a method by which an individual mind averages its own conflicting opinions, thus simulating the "wisdom of the crowd." In other words, dialectical bootstrapping enables different opinions to be created and combined in the same mind. For example, in this study, participants were asked to identify the dates of various historical events. After they gave their initial answer, the participants were asked to think of reasons why that answer might be wrong and were then asked to come up with an alternative, second (dialectical) answer.
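
A minimal simulation of the effect, assuming the two estimates carry partly independent errors; all numbers are invented for illustration and are not the study's data.

```python
# "Crowd within" toy model: averaging a first guess with a second, partly
# independent guess tends to reduce error, just as averaging across people does.
import random

random.seed(1)
true_year = 1815            # pretend historical date being estimated
trials = 10_000
err_first = err_avg = 0.0

for _ in range(trials):
    first = true_year + random.gauss(0, 30)     # first estimate
    second = true_year + random.gauss(0, 30)    # dialectical second estimate
    averaged = (first + second) / 2
    err_first += abs(first - true_year)
    err_avg += abs(averaged - true_year)

print("mean error, first guess only:", round(err_first / trials, 1))
print("mean error, averaged guesses:", round(err_avg / trials, 1))
```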

The results, reported in Psychological Science, a journal of the Association for Psychological Science, reveal that the average of the participants' first answer with the second answer was much closer to the correct answer, compared to the original answers on their own. In addition, the dialectical bootstrapping method (that is, thinking about why your own answer might be incorrect and then averaging across estimates) resulted in more accurate answers compared to simply making a second guess without considering why the first answer may be wrong.

These findings suggest that dialectical bootstrapping may be an effective strategy in helping us come up with better answers to many types of problems. The researchers note that while it may be frustrating going back and forth between two different answers, "as dialectical bootstrapping illustrates, being of two minds can also work to one's advantage." They conclude, "Once taught about the tool, people could make use of it to boost accuracy of their estimates across a wide range of domains."

Saturday, March 14, 2009

Costa Rica Invests in Geothermal Power

The government of Costa Rica hopes to increase its power generation by tapping into volcanic hot spots, and to that end it has introduced a controversial bill in Congress that would allow drilling into volcanoes in national parks.

In January, the governmental Costa Rican Institute of Electricity (ICE) announced that it is contracting equipment for the geothermal power station of Las Pailas, on the side of the Rincón de la Vieja volcano, in the northwest province of Guanacaste.

The plant is scheduled to become operational in 2011, adding 35 megawatts to the 163.5 that are already supplied by the five units of the Miravalles volcano power station, in operation since 1994. That same year, a third project, the Borinque, on the northeast side of the Rincón de la Vieja volcano, will be launched.

Geothermal power uses underground steam from volcanic regions. The energy is harnessed by extracting the heat from within the earth’s crust, in the form of a fluid that is used to move the turbines. Two holes are drilled in each case: one is used to draw hot water, and the flow of water is then cooled and re-injected into the other.

In Costa Rica’s case, high temperature wells (150 to 400 degrees Celsius) are used, but there are also medium and low temperature wells.
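
As a rough, back-of-envelope sketch of the heat-extraction idea described above: the flow rate, temperatures and conversion efficiency below are assumptions chosen for illustration, not figures from the ICE projects.

```python
# Crude thermal-to-electric estimate for one geothermal production well.
flow_kg_s = 40                 # assumed mass flow of geothermal fluid, kg/s
cp = 4.2e3                     # specific heat of water, J/(kg*K)
t_well, t_reinject = 250, 90   # assumed well and re-injection temperatures, deg C
efficiency = 0.12              # assumed net heat-to-electricity conversion

thermal_mw = flow_kg_s * cp * (t_well - t_reinject) / 1e6
electric_mw = thermal_mw * efficiency
print(f"~{thermal_mw:.0f} MW thermal, ~{electric_mw:.1f} MW electric per well")
# A 35 MW plant like Las Pailas would therefore draw on a field of several wells.
```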

One of the goals of the ICE is to increase the percentage of geothermal energy that is channelled to the country’s power grid. ICE president Pedro Pablo Quirós told Tierramérica that several sites have been identified in northern Costa Rica, in an area stretching from the Poás volcano to the Nicaraguan border, from which up to 800 megawatts will be generated.

The problem is that these areas identified for geothermal power generation are located in national parks, and thus congressional authorisation is necessary, which explains the bill currently under consideration. Geothermal prospecting is similar to oil prospecting, with drilling usually penetrating 1.7 kilometres deep, but in some cases going down as far as 3.7 kilometres.

The opposition to the project comes from environmental organisations.

The president of the Wildlife Preservation Association (APREFLOFAS), Angeline Marín, told Tierramérica that she was "against the opening up of national parks for any purposes."

Marín believes that by opening the parks up to tourism and putting their habitats at risk, the Ministry of Environment, Energy and Telecommunications has already demonstrated that it is incapable of implementing "precautionary" regulations. APREFLOFAS advocates other forms of power generation, such as solar and wind, "which are less harmful to the environment," she said. Marín fears the effects on wildlife, and suspects that the new power projects are not intended to meet domestic demand, but instead are export-oriented.

Quirós insists that the power generated by these plants will stay in Costa Rica. "We can’t tell the country to stop growing," he said.

The use of existing natural resources and the "definition of regulations to protect the environment" are both essential to growth, he said. He also defended the ICE by pointing out that it is the largest investor in reforestation and that it promotes "environmentally-friendly" projects.

"There wasn’t a single tree" in Miravalles, "and today it is completely reforested," Quirós said.

"If we can’t touch our natural sources, like water or steam, all we have left are nuclear plants," he said.

In his opinion, geothermal power reduces the country’s dependency on imported fuel.

During the dry season, from December through April, the ICE consumes 90 percent of the national gas-fuel bill - some 260 million dollars - to keep its thermal plants running. Quirós claims that this expense will be cut in half as geothermal power generation increases.

Another advantage of geothermal power is that it is continuously generated, as it is not dependent on weather conditions, unlike hydroelectric power, which is stretched to its limits during the driest months, when water reserves are low.

Geologist Eddy Fernández, an expert on geothermal energy, says that it is "the ideal complementary source" for hydroelectric power, which accounts for around 80 percent of the country’s power generation.

There is a risk of pollution from toxic gas leakage, but safe operation can be achieved by re-injecting the gases, Fernández told Tierramérica.

Central America could become a leading geothermal power generator, as it is located in the Circum-Pacific Seismic Belt, an area of high volcanic activity along the Pacific coasts of both Asia and the Americas.

And, Fernández said, Costa Rica must position itself as the subregion’s leading geothermal producer, as "we have been researching this field since the late 1960s."

The North Volcanic Mountain Ridge, in Guanacaste, is the ideal region for geothermal power generation, with its Miravalles, Rincón de la Vieja and Tenorio volcanoes. These are rural areas, and geothermal production would foster "their development, without harming the population," he said.

Wednesday, March 11, 2009

Sports Injuries in 3D

For several years now doctors have been using ultrasound scanning as a tool for diagnosing sports injuries. Advances in this technology are now delivering a clear improvement in imaging quality, which will result not only in better diagnosis, but also in more effective treatment and faster recovery.

This research, led by José Fernando Jiménez Díaz, a specialist in sports medicine at the University of Castilla-La Mancha, analysed the usefulness of these new applications for injuries, particularly those sustained at work or in sport. The underlying technology has already been used for several years in specialist areas such as gynaecology, for the diagnosis and monitoring of pregnancies.
To carry out the assessment, the study, published in the journal Advances in Therapy, compared two portable high-definition ultrasound devices. One offered only the traditional applications, while the other incorporated harmonic imaging, real-time ultrasound, panoramic view, 3D imaging and virtual convex imaging.

Five types of injuries were compared: muscle contusion, intrinsic muscle lesion, patellar tendonitis, calcified patellar tendonitis and partial rupture of the medial ligament of the knee. The results showed that the newly incorporated systems improve the scanning of injured tissues in all types of injury analysed.

"Applications of this technology focus on both the diagnosis and treatment of injuries," Jiménez Díaz explained to SINC. "The new branch of ultrasound scanning, known as intraoperative ultrasound, makes it possible to avoid some of the surgeries that were previously unavoidable when applying ultrasound-guided treatment to the musculoskeletal system."

The promising future of 3-D technology:

While new technological applications have been adopted in major hospitals over the last three to four years, three-dimensional applications in portable or compact devices have only been applied since the beginning of 2007 in the diagnosis of soft tissue injuries (those on the skin, the subcutaneous tissue, the aponeuroses and muscles).

As the researcher indicated to SINC, "the idea behind an improvement in imaging quality is not to give the patient a prettier photo, but rather to improve the scanning of structures, particularly small injuries which are difficult to interpret. This is where the 3-D experience can help achieve optimum injury recovery".

Experts are optimistic about the future of these types of technologies. "The blooming of the ultrasound in diagnosing injuries is yet to come. I hope that applications for scanning structures which we still consider partially blind improve even more. The improvement will enable a safer diagnosis and the application of a more reliable treatment", concluded Jiménez Díaz.

Tuesday, March 10, 2009

Mobility of Birds due to Climate Change

Researchers at the SUNY College of Environmental Science and Forestry (ESF) have documented that a variety of North American bird species are extending their breeding ranges to the north, adding to concerns about climate change, according to a study published by the journal Global Change Biology.

In a study published on the journal’s web site, the SUNY-ESF researchers state the change in the birds’ breeding ranges “provides compelling evidence that climate change is driving range shifts.”

“There are a wide spectrum of changes that are occurring and those changes are occurring in a relatively short amount of time. We’re not talking centuries, we’re talking decades,” said William Porter, an ESF faculty member and director of the college’s Adirondack Ecological Center.

Porter worked on the study with Ph.D. student Benjamin Zuckerberg and AEC staff educator Annie M. Woods.

“The most significant finding is that this is the first time in North America that we’re showing the repeating pattern that’s been shown before in Europe,” Woods said. “It’s the first time we’ve been able to replicate those European findings, using the same kind of study.”

Focusing on 83 species of birds that have traditionally bred in New York state, the researchers compared data collected in the early 1980s with information gathered between 2000 and 2005. They discovered that many species had extended their range boundaries, some by as much as 40 miles.

“They are indeed moving northward in their range boundaries,” Zuckerberg said.

“But the real signal came out with some of the northerly species that are more common in Canada and the northern part of the U.S. Their southern range boundaries are actually moving northward as well, at a much faster clip.”

Among the species moving north are the Nashville warbler, a little bird with a yellow belly and a loudly musical two-part song, and the pine siskin, a common finch that resembles a sparrow. Both birds have traditionally been seen in Northern New York but are showing significant retractions in their southern range boundaries, Zuckerberg said.

Birds moving north from more southern areas include the red-bellied woodpecker, considered the most common woodpecker in the Southeastern United States, and the Carolina wren, whose “teakettle, teakettle, teakettle” song is surprisingly loud for a bird that weighs less than an ounce.

The study compared data collected during the state Department of Environmental Conservation’s Breeding Bird Atlas census, which engaged thousands of citizen volunteers to observe and report the birds they could identify. The first atlas was created between 1980 and 1985; the second was done between 2000 and 2005.
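
One simple way to quantify such a shift from two atlas periods is to compare a northern-boundary statistic between them. The sketch below uses synthetic block latitudes and a 95th-percentile boundary, which are assumptions for illustration rather than the metric the SUNY-ESF team used.

```python
# Illustrative northward range-boundary shift between two atlas periods.
import numpy as np

rng = np.random.default_rng(42)
lat_1980s = rng.normal(42.0, 1.0, 500)   # synthetic latitudes of occupied blocks, 1980-85
lat_2000s = rng.normal(42.4, 1.0, 500)   # same species, 2000-05

def northern_boundary(latitudes, pct=95):
    """Use a high percentile of occupied-block latitude as the 'northern boundary'."""
    return np.percentile(latitudes, pct)

shift_deg = northern_boundary(lat_2000s) - northern_boundary(lat_1980s)
print(f"boundary shift: {shift_deg:.2f} degrees latitude (~{shift_deg * 69:.0f} miles)")
# One degree of latitude is roughly 69 miles.
```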

New York was the first state to complete two breeding bird atlases, Zuckerberg said, making it the only state that is able, at this point, to produce this kind of research.

Zuckerberg said similar changes were found in birds that breed in forests and those that inhabit grasslands, in both insectivores and omnivores, and even in new tropical migrants that are typically seen in Mexico and South America.

“What you begin to see is a systematic pattern of these species moving northward as we would predict with regional warming,” he said.

“New York citizens need to recognize that these changes are occurring,” Porter said. “Whether they are good or bad, whether they should be addressed, whether we should adapt to them, whether we should try to mitigate some of this, those are questions that really, rightfully, belong in the political arena.”

Woods said the innate mobility of birds made them an excellent animal to study in connection with adaptation to climate change.

Monday, March 9, 2009

Nutrition Problems for Many Middle-aged and Older Americans

A study determined that many middle-aged and older Americans are not getting adequate nutrition.

Using data drawn from the Multi-Ethnic Study of Atherosclerosis (MESA), a prospective cohort study designed to investigate the prevalence, correlates and progression of subclinical cardiovascular disease, researchers examined over 6,200 participants from four ethnic groups: Caucasian, African American, Hispanic and Chinese. Dietary intakes were determined from food frequency questionnaires, and respondents were asked to provide the amounts and frequencies of micronutrient consumption using label information from their supplements. These data were used to calculate whether the RDAs or Adequate Intake (AI) levels were being met. The large sample size and multiple ethnic groups in this population gave investigators enough statistical power to examine interactions between supplementation and ethnicity.
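
The kind of check involved can be sketched as follows; the reference values and intakes are placeholders chosen for illustration, not MESA data, and the supplement-only nuance of some ULs is ignored.

```python
# Does total intake (diet + supplements) meet the RDA/AI, and does it exceed the UL?
REFERENCES = {              # nutrient: (RDA_or_AI, UL), both in mg/day (placeholder values)
    "calcium":   (1200, 2500),
    "vitamin_c": (75, 2000),
}

def classify(nutrient, diet_mg, supplement_mg):
    rda_or_ai, ul = REFERENCES[nutrient]
    total = diet_mg + supplement_mg
    return {"total_mg": total, "meets_RDA_AI": total >= rda_or_ai, "exceeds_UL": total > ul}

print(classify("calcium", diet_mg=700, supplement_mg=600))
# -> total 1300 mg: meets the RDA/AI, does not exceed the UL
```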

Over half of the population took supplements, and supplement users were more likely to be older, female, Caucasian and college-educated. Calcium and vitamin C supplements were the most common. Although dietary intake of calcium, magnesium, potassium and vitamin C was similar between supplement users and non-users for both men and women, there were differences in median dietary intake levels between the ethnic groups. Chinese Americans tended to have the lowest dietary intakes, particularly of calcium, for which both Chinese and African Americans had significantly lower intakes than Caucasians and Hispanics.

The study also evaluated differences between multivitamins and high-dose supplements. While high-dose calcium was associated with meeting RDA/AIs for all ethnic groups, some high-dose supplements could also cause users to exceed their Tolerable Upper Intake Levels (ULs). For calcium, 15.0% of high-dose users exceeded the UL compared to 1.9% of multivitamin users and 2.1% of non-users. For magnesium, 35.3% of high-dose supplement users exceeded the UL compared to 0% of both multivitamin users and non-users. In addition, 6.6% high-dose vitamin C users exceeded the UL compared to 0% of both multivitamin users and non-users.

The study also found that potassium intake was very much below the RDA whether supplements were taken or not. This could point to a need to reformulate supplements to deliver higher potassium doses.

Writing in the article, Pamela J. Schreiner, MS, PhD, Professor and Director of Graduate Studies, Division of Epidemiology and Community Health, University of Minnesota, states, "The present study indicates a clear association between meeting RDA/AIs and supplement use for calcium, magnesium and vitamin C. However, even with the assistance of dietary supplements many middle-aged and older Americans are not getting adequate nutrition, and there was no association between supplement use and meeting the AI for potassium. In addition, those taking high-dose vitamin supplements were more likely to exceed the UL for that nutrient. Future studies should explore dietary supplementation along with other methods to improve nutrition in middle-aged and older Americans."

Sunday, March 8, 2009

Designing Cockpit for World's Fastest Car

World land speed challenger Andy Green, OBE visited the University of the West of England (UWE Bristol) on Thursday to try out for the first time a mock-up of the cockpit he will use in his 1000 mph record attempt. The cockpit test rig, designed and built by second-year product design students, will ensure that cockpit components such as chair and controls are in the optimum ergonomic position for the challenge.

UWE is a founder partner of the Bloodhound Project led by Richard Noble, a previous world land speed record holder. Product Design Senior Lecturer David Henshall said, “The challenge for the students was to consider the performance and ergonomics of the driver's position for a unique event that will take the driver across ten miles in 85 seconds.

“The test rig means that fine adjustments to the position and relationship of all components can be measured and fed into a computer, ensuring the cockpit functions as it should do at such high speeds.”

Twenty students designed and built the cockpit test rig as part of their design studio class during a five week project. The students formed teams of five and each team was allocated a particular part of the rig to work on - steering, controls, seating and pedals.

Product Design Senior Lecturer Drew Batchelor said, “The students worked in conjunction with the Bloodhound team and they have done an exceptional job. After an initial briefing from John Piper (JCB Dieselmax Chief Designer), Andy Green and the Bloodhound design team, the student groups then developed concepts in the product design studios. Drawing on ergonomic data and refining their ideas through prototypes, the various individual elements were then assembled to create the test rig unveiled today. Ensuring that all these components worked together to create a cockpit environment that would function safely at 1000 mph was the key challenge for the group."

Student Hywel Vaughan said, “It isn't often that you can go home at the end of the day and say that you have worked on a land speed record attempt vehicle. Everything had to be spot on. Andy Green's eye line needed to be dead on the 4-degree mark. Any lower and he wouldn't be able to see over the front of the car, any higher and it could interfere with the aerodynamics. It was an exciting challenge to build a rig that could deliver that level of accuracy.”

Driver of the 1,000 mph car, Andy Green OBE, said: "There isn't a book to build a car like this and the students can't just look at their dad's car for guidance. The only requirement is to have four wheels. To be faced with a blank sheet of paper is quite frightening. That said, the students at UWE have done incredibly well and it's the support of universities such as The University of the West of England that will make Bloodhound SSC possible.”

The Bloodhound Project was launched at the Science Museum in London in October 2008. Engineers from UWE have already produced a scale model for Bloodhound SSC, the car that aims not just to break the current land speed record but to achieve an astounding land speed of 1000 mph. The Bloodhound Design team is using the specialist facilities at UWE to help realise the formative stages of the project.

Making Plutonium Unsuitable for use in Nuclear Arms

Ben-Gurion University of the Negev engineers have developed a technique to "denature" plutonium created in large nuclear reactors, making it unsuitable for use in nuclear arms. By adding americium-241 (Am-241), an isotope of the synthetic element found in commercial smoke detectors and industrial gauges, the plutonium can be used only for peaceful purposes.

This technique could help "de-claw" more than a dozen countries developing nuclear reactors if the United States, Russia, Germany, France and Japan agree to add the denaturing additive into all plutonium. An article on the technique and findings will appear next month in the Science and Global Security journal.

"When you purchase a nuclear reactor from one of the five countries, it also provides the nuclear fuel for the reactor," explains Prof. Yigal Ronen, of BGU's Department of Nuclear Engineering, who headed the project. "Thus, if the five agree to insert the additive into fuel for countries now developing nuclear power -- such as Bahrain, Egypt, Kuwait, Libya, Malaysia, Namibia, Qatar, Oman, United Arab Emirates, Saudi Arabia and Yemen -- they will have to use it for peaceful purposes rather than warfare."

Ronen originally worked on neptunium-237 for the purpose of denaturing plutonium, but switched to americium, which is meant for pressurized water reactors (PWRs), such as the one being built in Iran.

"Countries that purchase nuclear reactors usually give the spent fuel back to the producer," explains Ronen. "They wouldn't be able to get new plutonium for weapons if it is denatured, but countries that make nuclear fuel could decide not to denature it for themselves."

Nuclear fuel used in nuclear reactors contains two isotopes of uranium. One is fissionable, while the other is not. The unfissionable component undergoes a number of nuclear reactions, turning some of it into plutonium. That plutonium likewise includes fissionable and unfissionable components, and the amount of fissionable plutonium created in nuclear reactors is enough for use in nuclear weapons.

Saturday, March 7, 2009

Hares are being Affected by Climate Change

University of Montana researcher Scott Mills and his students have noticed an exceptional number of white snowshoe hares on brown earth. He contends that climate change and the color mismatch are causing much more hare mortality.

On an unseasonably warm May afternoon, University of Montana wildlife biology Professor Scott Mills treks into the shadowy forests above the Seeley-Swan Valley in pursuit of his quarry. He skirts the rivulets of water melting from snow patches. In one hand he holds an antenna and in the other a receiver that’s picking up signals from a radio-collared snowshoe hare. The beeps increase in volume as he draws nearer. Mills picks his way over downed branches, steps out from behind a western larch and spots the white hare crouched on the bare brown earth.

“That’s just an embarrassing moment for a snowshoe hare to think that it’s invisible when it’s not,” said Mills with a grin, quickly adding that seeing such mismatched colors is becoming all too common and disturbing.

For the past decade, Mills has directed teams of biologists and students to investigate snowshoe hares on more than 35 study sites in Montana, Wyoming and Washington, including just outside UM’s back door near Seeley Lake. His findings have led to improved forest thinning practices that maintain patches of dense trees for hares. He’s delved into population dynamics and genetics of hares in their southern range. His research has turned directly to lynx, too, as a key predator of snowshoe hares and a threatened species.

Increasingly Mills and his students have noted an exceptional number of white hares on brown earth. Radio telemetry data revealed spring and fall to be the most deadly seasons for hares and a bonanza for predators.

That leads Mills to the “sexiest part” of snowshoe hare research – how they respond to climate change. While a warming planet affects all wildlife, a cute white hare has the makings of the next version of the polar bear as poster animal for global warming.

Will hares continue to shift coat colors on cue regardless of the presence or absence of snow? Will this drive them to extinction? Or will they be able to adjust their seasonal pattern in time to fit new conditions?

“Climate trends for mountainous areas clearly show that while snow levels may vary from year to year, the number of days with snow on the ground is decreasing,” Mills said.

Snowshoe hares evolved with plentiful winter snow in the boreal forests that form a swath across Alaska and Canada and dip down into the lower 48 states. In winter, they grow long white guard hairs to match the snow. In summer, they shed white for mostly rusty brown coats to blend with trees and soil. They depend on their cryptic coloration to hide from predators that include lynx, coyotes, foxes, wolves, pine martens and birds of prey. A hare that’s the wrong color stands out like the emperor in his new clothes.

The signal for a hare to shift coat color comes from the pineal gland in the brain that senses changes in daylight length. Shortening days of autumn trigger the coat color change from brown to white. (People also have pineal glands that produce melatonin, the hormone that affects our waking and sleeping patterns and responses to seasonal day lengths.)
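
To see how sharply the day-length cue changes in autumn, here is a standard astronomical approximation of daylight hours from latitude and date, with a purely hypothetical molt-trigger threshold; the hares' actual physiological threshold is not given in the article.

```python
# Approximate daylight hours and a hypothetical photoperiod trigger for the molt.
import math

def day_length_hours(lat_deg, day_of_year):
    """Daylight hours from latitude and day of year (simple solar-declination model)."""
    decl = math.radians(-23.44) * math.cos(2 * math.pi * (day_of_year + 10) / 365)
    x = -math.tan(math.radians(lat_deg)) * math.tan(decl)
    x = max(-1.0, min(1.0, x))               # clamp for polar day/night
    return 24 / math.pi * math.acos(x)

MOLT_TRIGGER_HOURS = 10.5                    # assumption for illustration only
for day in (244, 274, 305, 335):             # roughly Sep 1, Oct 1, Nov 1, Dec 1
    hours = day_length_hours(47.0, day)      # ~latitude of western Montana
    state = "molt to white" if hours < MOLT_TRIGGER_HOURS else "still brown"
    print(day, round(hours, 1), state)
```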

Like most subjects in science, the deeper you delve, the more complexities you find. Mills points out that in the Cascades, some snowshoe hares stay a mottled brown and white year-round. In the Olympic Mountains of Washington, snowshoe hares never turn white. Does this suggest some ability to evolve in response to temperature changes? If so, how quickly?

To find out, Mills will add an intensive genetic component to his fieldwork, teaming with the University of Porto in Portugal, where scientists are sequencing the rabbit genome. Together they will analyze the genetic drivers of coat color change. Mills will start with his core research areas and then expand his studies to compare coat color genetics as well as synchrony of hare cycles in southern versus northern ranges.

Mills isn’t starting from scratch. He and his team have collected genetic samples from thousands of hares, spanning several generations, over the past eight years. On a typical field day, they rise before dawn to check the 80 “have-a-heart” live traps that they’ve baited with alfalfa and apple. The traps are placed in prime snowshoe habitat such as moist forests of larch, lodgepole and Douglas fir with dense brush and overhanging branches.

Finding a hare in a trap calls for prompt action. Mills describes the process that has become routine. First, you put a pillowcase over the entrance to the trap, so the hare will run in. You keep the hare in the pillowcase while you weigh it, add an ear tag and take a tiny plug of tissue from the ear. That tissue contains DNA and is placed in a special vial. You also check the sex and assess the hare’s general health. You might add a radio collar as well, depending on the project. The whole procedure takes a matter of minutes.

Snowshoe hares seemed like a natural choice for study soon after Mills arrived at UM from the University of Idaho in 1995. They’re a local species with excellent opportunities for delving into their ecology and introducing students and the public alike to fieldwork. Hares also are known for a classic predator-prey relationship with the lynx. The two species are so closely associated that they even share a key attribute for winter living – thick furry hind feet for bounding atop snow.

Across Canada, snowshoe hares follow a synchronized population cycle of 10-year highs and lows. Hare numbers in the Yukon can peak at 200 to 300 per square kilometer and then drop to about seven. Lynx follow a cycle that’s just slightly behind the hares. When lynx numbers are down, hares start to go up. The more hares, the better the lynx do until finally the lynx drive the hare populations down again. Mills’ work has proven that those cycles are dampened in the southern range because hares don’t have the same vast, dense boreal forest, thus hares never reach the high peak counts. As their numbers rise, they disperse into habitat openings, where they become easy dinners for waiting predators. In Montana and other parts of the southern range, forests tend to be patchier naturally, with added challenges for hares from logging and thinning.
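
The lagged rise and fall described here is the textbook predator-prey cycle. A minimal Lotka-Volterra sketch with invented parameters (not fitted to any field data) reproduces the qualitative pattern.

```python
# Minimal hare-lynx (Lotka-Volterra) cycle with illustrative parameters.
def simulate(hares=50.0, lynx=5.0, years=60, dt=0.01, r=0.9, a=0.05, b=0.01, m=0.6):
    """r: hare growth, a: predation rate, b: lynx gain per hare eaten, m: lynx mortality."""
    yearly = []
    for step in range(int(years / dt)):
        dh = (r * hares - a * hares * lynx) * dt
        dl = (b * hares * lynx - m * lynx) * dt
        hares, lynx = max(hares + dh, 0.0), max(lynx + dl, 0.0)
        if step % int(1 / dt) == 0:          # record once per simulated year
            yearly.append((round(hares, 1), round(lynx, 1)))
    return yearly

for year, (h, l) in enumerate(simulate()[:20]):
    print(year, h, l)    # hares rise first, lynx follow with a lag, then both fall
```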

Today, as a result of Mills’ studies comparing survival rates in experimentally thinned forests, Plum Creek Timber Co. now leaves patches of unthinned trees to benefit hares, and in turn lynx.
His research has translated directly into useful management, a result that Mills always aims for and advocates in his widely used 2006 textbook, “Conservation of Wildlife Populations: Demography, Genetics, and Management.”

Until now, the lynx-hare relationship has proved Mills’ most high-profile research. After the U.S. Fish and Wildlife Service added lynx as a threatened species in 2000, his phone rang with calls from the National Park Service and timber companies alike on how to manage forests for lynx health. Mills’ subsequent studies led to findings that lynx are highly mobile in their southern range. One cat might travel 1,000 km (620 miles) in a season.

“Conserving where lynx are now is important, but it’s also important to conserve the places in between because lynx may move into those places as well,” Mills said.

Taking the next leap to examine snowshoe hare response to climate change is both a natural progression and an exciting new phase in his long-term research.

“Wildlife will either move, adapt or die in response to climate change,” explains Mills. “The study becomes important because we need to know how much natural selection will help animals deal with climate change that is happening at a very fast rate.”

That knowledge in turn will help managers focus their efforts to save species through such actions as conserving movement corridors from south to north.

“Hares are important because they are prey for almost everything in the forest that eats meat,” Mills said. “Without hares, the ecosystem unravels.”