Monday, December 3, 2007

Exotic scientific discoveries of our era-1

There have been tremendous developments in electronics and communication, resulting in affordable personal computers (PCs), the Internet and mobile phones, which have changed our lives over the last decade. I am presenting to you some other exotic discoveries which have the potential to make sweeping changes in our day-to-day life.

1. Cold Fusion:

On March 23, 1989, Stanley Pons and Martin Fleischmann in the U.S. announced their discovery of "cold fusion." It was the most heavily hyped science story of the decade.

What is cold fusion? To understand this, we should first see what fusion is. Present nuclear reactors produce energy by a process called 'fission', in which atoms of a heavy element like uranium-235 are split, and the small loss of mass in the split is converted into energy according to Einstein's famous relation E = mc². Fission has problems of high radioactivity, radiation release and waste management. Also, it depends on uranium fuel, which is in limited supply.
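Einstein's relation can be seen in action with a quick back-of-the-envelope calculation. The mass defect figure below is an approximate textbook value, used only for illustration:

```python
# E = m * c^2: energy released when a small mass defect is converted to energy.
# Illustrative numbers: a single U-235 fission converts a mass defect of
# roughly 0.2 atomic mass units, which works out to a bit under 200 MeV.

C = 2.998e8          # speed of light, m/s
AMU = 1.6605e-27     # atomic mass unit, kg
EV = 1.602e-19       # joules per electron-volt

mass_defect_kg = 0.2 * AMU              # approximate mass lost per fission
energy_j = mass_defect_kg * C ** 2      # Einstein's relation
energy_mev = energy_j / EV / 1e6

print(f"Energy per fission: {energy_mev:.0f} MeV")
```

A tiny fraction of the fuel's mass thus yields an enormous amount of energy compared with chemical burning, which releases only a few electron-volts per reaction.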

Similar energy production is possible if two light atoms, like hydrogen, are allowed to fuse. But this fusion reaction requires a very high temperature, like that of the sun. Almost four stories high, framed in steel beams and tangled in pipes, conduits, cables, and coils, the Joint European Torus (JET) claims to be the largest fusion power experiment in the world. Located near Oxford, England, JET is a monument to big science, its donut-shaped containment vessel dwarfing maintenance workers who enter it in protective suits. Here in this gleaming nuclear cauldron, deuterium gas is energized with 7 million amperes and heated to 300 million degrees Celsius - more than 10 times hotter than the center of the sun. Under these extreme conditions atomic nuclei collide and fuse, liberating energy that could provide virtually limitless power. For a few magic seconds in 1997, JET managed to return 60 percent of the energy it consumed, but that's the best it's ever done, and it is typical of fusion experiments worldwide. The US Department of Energy has predicted that we'll have to wait another five decades, minimum, before fusion power becomes practical.

Cold fusion experiment showed that nuclear reactions can take place at room temperature. If low-temperature fusion does exist and can be perfected, power generation could be decentralized. Each home could heat itself and produce its own electricity, probably using a form of water as fuel. Even automobiles might be cold fusion powered. Massive generators and ugly power lines could be eliminated, along with imported oil and our contribution to the greenhouse effect. Moreover, according to some experimental data, low-temperature fusion doesn't create significant hazardous radiation or radioactive waste.

But the awed excitement of Stanley Pons and Martin Fleischmann quickly evaporated amid accusations of fraud and incompetence. When it was over, Pons and Fleischmann were humiliated by the scientific establishment; their reputations ruined, they fled from their laboratory and dropped out of sight. "Cold fusion" and "hoax" became synonymous in most people's minds, and today, everyone knows that the idea has been discredited.

Or has it? In fact, despite the scandal, laboratories in at least eight countries are still spending millions on cold fusion research. During the past nine years this work has yielded a huge body of evidence, while remaining virtually unknown - because most academic journals adamantly refuse to publish papers on it. At most, the story of cold fusion represents a colossal conspiracy of denial. At least, it is one of the strangest untold stories in 20th-century science.

I had the opportunity to view a video made by a Canadian television channel on cold fusion, in which Stanley Pons and Martin Fleischmann are shown working, in a basement cellar in France, on a small cold fusion flask that could supply electricity to a house indefinitely. The enormous commercial interest in the results of this experiment has created problems for these scientists.
_______________________________________________________________________________________

Exotic scientific discoveries of our era-2

Posted 8th November 2007 at 09:34 AM by Tamildownunder
High Temperature Superconductivity:

When cooled to extremely low temperatures, certain materials exhibit zero electrical resistance to direct current. They conduct current with no heat loss and are called superconductors. Zero electrical resistance naturally promises better performance for electronic circuits. Superconductivity was discovered by Heike Kamerlingh Onnes in 1911; later, Brian Josephson of the U.K. was awarded the Nobel Prize in Physics for his work on superconducting tunnelling (the Josephson effect).

In 1986, a new class of materials called high-temperature superconductors (HTS) was discovered that made the necessary cooling more cost-effective. The first HTS material was discovered in late 1986, when Müller and Bednorz of IBM's Zurich lab announced a superconducting oxide at 30 kelvin (K). In 1987, Paul Chu of the University of Houston announced the discovery of a compound, Yttrium Barium Copper Oxide (YBCO), which became superconducting at 90 K. The following months saw a race for even higher temperatures that produced bismuth compounds (BSCCO) superconductive up to 110 K and thallium compounds (TBCCO) superconductive up to 127 K.
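The practical significance of these critical temperatures is whether the material can be cooled with cheap liquid nitrogen (boiling point 77 K) instead of expensive liquid helium (4.2 K). A small sketch, using the temperatures quoted above:

```python
# Critical temperatures (K) from the text. Any material with Tc above the
# 77 K boiling point of liquid nitrogen can be cooled with inexpensive LN2
# instead of liquid helium (4.2 K).

LN2_BOILING_K = 77.0

critical_temps = {
    "La-Ba-Cu-O (Mueller & Bednorz, 1986)": 30,
    "YBCO (Chu, 1987)": 90,
    "BSCCO (bismuth compounds)": 110,
    "TBCCO (thallium compounds)": 127,
}

for material, tc in critical_temps.items():
    coolant = "liquid nitrogen" if tc > LN2_BOILING_K else "liquid helium"
    print(f"{material}: Tc = {tc} K -> coolable with {coolant}")
```

This is why the 90 K YBCO announcement caused such excitement: it was the first superconductor on the cheap side of the liquid-nitrogen line.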

Superconductors have many different applications, ranging from levitating trains to ultra-efficient power lines. Maglev research in the 1960s in the United States was short-lived. In the 1970s, Germany and Japan began research and after some failures both nations developed mature technologies in the 1990s. However, superconductor related costs remain a barrier to acceptance.

Magnetic levitation transport, or maglev, is a form of transportation that suspends, guides and propels vehicles (especially trains) using electromagnetic force. This method can be faster than wheeled mass transit systems, potentially reaching velocities comparable to turboprop and jet aircraft (900 km/h, 600 mph). The highest recorded speed of a maglev train is 581 km/h (361 mph), achieved in Japan in 2003, about 6 km/h (4 mph) faster than the conventional TGV speed record.


The picture shows a maglev train in Japan. Many countries, including China, Japan, the U.K. and the U.S.A., have projects to run maglev trains.

Mumbai – Delhi: A maglev line project was presented to India's railway minister Lalu Prasad by an American company; if approved, the line would serve the cities of Mumbai and Delhi. Prime Minister Manmohan Singh said that if the project succeeds, the Indian government would build lines between other cities, and also between the Mumbai city centre and Chhatrapati Shivaji International Airport.

How a maglev train works:

The figure-8 levitation coils are installed on the sidewalls of the guideway. When the on-board superconducting magnets pass at high speed several centimetres below the centre of these coils, an electric current is induced within the coils, which then act temporarily as electromagnets. As a result, forces arise that simultaneously push the superconducting magnets upwards from below and pull them upwards from above, thereby levitating the maglev vehicle.
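The induced current in each coil follows Faraday's law: the faster the flux through the coil changes, the larger the induced EMF. The toy sketch below illustrates only this underlying principle; the Gaussian flux profile and all numbers are invented for illustration and do not model any real maglev system:

```python
# Toy finite-difference sketch of Faraday's law, the principle behind the
# levitation coils: a magnet sweeping past a coil changes the magnetic flux
# through it, inducing an EMF (and hence a current) that opposes the change.
# The flux profile and all numbers here are illustrative only.

import math

def flux(t, peak=1.0, t0=0.5, width=0.05):
    """Flux through the coil (webers) as the magnet passes at time t0 (s)."""
    return peak * math.exp(-((t - t0) / width) ** 2)

dt = 1e-4
times = [i * dt for i in range(int(1.0 / dt))]
# Faraday's law: emf = -dPhi/dt, approximated by a finite difference
emfs = [-(flux(t + dt) - flux(t)) / dt for t in times]

print(f"Peak induced EMF: {max(abs(e) for e in emfs):.1f} V")
```

Narrowing the `width` parameter (a faster pass) raises the peak EMF, which is why the levitation force only becomes effective once the train is moving quickly.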

Other applications of superconductivity:

Some experts have suggested that the new superconducting materials could finally make fusion a practical reality.

In the meantime, weapons laboratories use powerful magnets in research on beam weapons. The U.S. Navy has actively explored the possibility of using a combination of superconducting generators and motors in ships to replace gigantic mechanical drive shafts.

With all its mystery, superconductivity has intimate links to esoteric phenomena at the forefront of basic physics. Detectors using superconductors have extraordinary sensitivity to different kinds of radiation, raising the possibility of new kinds of applications from astronomy to the analysis of brain waves.

"There's a tremendous amount of work to be done, but there's also a tremendous potential market out there for people who can bring this technology into commercial practice," said Donald K. Stevens, head of basic energy sciences research for the U.S. Department of Energy.
________________________________________________________________________________________

Exotic scientific discoveries of our era-3

Posted 19th November 2007 at 11:00 AM by Tamildownunder
Updated 19th November 2007 at 11:42 AM by Tamildownunder
Nano-technology:

The world demand for energy is expected to double to 28 terawatts by the year 2050. Compounding the challenge presented by this projection is the growing need to protect our environment by increasing energy efficiency and through the development of “clean” energy sources. These are indeed global challenges, and their resolution is vital to our energy security. Recent reports on Basic Research Needs to Assure a Secure Energy Future and Basic Research Needs for the Hydrogen Economy have recognized that scientific breakthroughs and truly revolutionary developments are demanded. Within this context, nanoscience and nanotechnology present exciting and requisite approaches to addressing these challenges.

An interagency workshop to identify and articulate the relationship of nanoscale science and technology to the U.S.'s energy future was convened on March 16-18, 2004, in Arlington, Virginia. The meeting was jointly sponsored by the Department of Energy and, through the National Nanotechnology Coordination Office, the other member agencies of the Nanoscale Science, Engineering and Technology Subcommittee of the Committee on Technology, National Science and Technology Council. This report is the outcome of that workshop.

The workshop had 63 invited presenters with 32 from universities, 26 from national laboratories and 5 from industry. This workshop is one in a series intended to provide input from the research community on the next NNI strategic plan, which the NSTC is required to deliver to Congress on the first anniversary of the signing of the 21st Century Nanotechnology R&D Act, Dec. 3, 2003.
At the root of the opportunities provided by nanoscience to impact our energy security is the fact that all the elementary steps of energy conversion (charge transfer, molecular rearrangement, chemical reactions, etc.) take place on the nanoscale. Thus, the development of new nanoscale materials, as well as the methods to characterize, manipulate, and assemble them, creates an entirely new paradigm for developing new and revolutionary energy technologies. The primary outcome of the workshop is the identification of nine research targets in energy-related science and technology in which nanoscience is expected to have the greatest impact:

• Scalable methods to split water with sunlight for hydrogen production
• Highly selective catalysts for clean and energy-efficient manufacturing
• Harvesting of solar energy with 20 percent power efficiency and 100 times lower cost
• Solid-state lighting at 50 percent of the present power consumption
• Super-strong, light-weight materials to improve efficiency of cars, airplanes, etc.
• Reversible hydrogen storage materials operating at ambient temperatures
• Power transmission lines capable of 1 gigawatt transmission
• Low-cost fuel cells, batteries, thermoelectrics, and ultra-capacitors built from nanostructured materials
• Materials synthesis and energy harvesting based on the efficient and selective mechanisms of biology

The report contains descriptions of many examples indicative of outcomes and expected progress in each of these research targets. For successful achievement of these research targets, participants recognized six foundational and vital crosscutting nanoscience research themes:

• Catalysis by nanoscale materials
• Using interfaces to manipulate energy carriers
• Linking structure and function at the nanoscale
• Assembly and architecture of nanoscale structures
• Theory, modeling, and simulation for energy nanoscience
• Scalable synthesis methods

Discovery of carbon nano-tubes:

In 1991, Sumio Iijima of NEC, Japan, observed multiwall nanotubes formed in a carbon arc discharge, and two years later he and Donald Bethune at IBM independently observed single-wall nanotubes – buckytubes. The important observation about these nanotubes is that their physical and electrical properties are drastically different from those of the same element at macroscopic scale. The biggest impact of carbon nanotubes is expected to be in what is called carbon-based electronics, as against the present silicon-based electronics. Carbon-based electronics is expected to achieve very high packing densities of components in integrated circuits and to increase the speed of operation enormously. With silicon, both packing density and speed of operation have reached saturation.

Nanotechnology gives sensitive read-out heads for compact hard disks:

This year's (2007) Nobel Prize in Physics was awarded for the technology that is used to read data on hard disks. It is thanks to this technology that it has been possible to miniaturize hard disks so radically in recent years. Sensitive read-out heads are needed to read data from the compact hard disks used in laptops and some music players, for instance.

In 1988 the Frenchman Albert Fert and the German Peter Grünberg each independently discovered a totally new physical effect – Giant Magnetoresistance or GMR. Very weak magnetic changes give rise to major differences in electrical resistance in a GMR system. A system of this kind is the perfect tool for reading data from hard disks when information registered magnetically has to be converted to electric current. Soon researchers and engineers began work to enable use of the effect in read-out heads. In 1997 the first read-out head based on the GMR effect was launched and this soon became the standard technology. Even the most recent read-out techniques of today are further developments of GMR.

A hard disk stores information, such as music, in the form of microscopically small areas magnetized in different directions. The information is retrieved by a read-out head that scans the disk and registers the magnetic changes. The smaller and more compact the hard disk, the smaller and weaker the individual magnetic areas. More sensitive read-out heads are therefore required if information is to be packed more densely on a hard disk. A read-out head based on the GMR effect can convert very small magnetic changes into differences in electrical resistance, and therefore into changes in the current emitted by the read-out head. The current is the signal from the read-out head, and its different strengths represent ones and zeros.
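The signal chain just described can be sketched in a few lines: a lower resistance over a magnetized region means a higher current, and thresholding that current recovers the bits. The voltages and resistances below are invented purely for illustration:

```python
# Sketch of the GMR read-out idea: the head's resistance drops in a magnetic
# field, so the current through it rises over magnetized regions. Thresholding
# the current recovers the stored ones and zeros. All numbers are illustrative.

V = 0.5                      # bias voltage across the read head, volts
R_HIGH, R_LOW = 100.0, 80.0  # head resistance without / with field, ohms

def read_bit(resistance):
    current_ma = V / resistance * 1000
    threshold = V / ((R_HIGH + R_LOW) / 2) * 1000   # midpoint current
    return 1 if current_ma > threshold else 0

# Resistance seen as the head scans five magnetic domains:
track = [100.0, 80.0, 80.0, 100.0, 80.0]
bits = [read_bit(r) for r in track]
print(bits)   # [0, 1, 1, 0, 1]
```

The point of GMR is that the gap between `R_HIGH` and `R_LOW` is "giant" even for very weak fields, so the two current levels stay distinguishable as the magnetic domains shrink.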

The GMR effect was discovered thanks to new techniques developed during the 1970s to produce very thin layers of different materials. If GMR is to work, structures consisting of layers that are only a few atoms thick have to be produced. For this reason GMR can also be considered one of the first real applications of the promising field of nanotechnology.
____________________________________________________________________________________

Exotic scientific discoveries of our era-4

Posted 27th November 2007 at 07:04 AM by Tamildownunder
Dear ILites,

Those of you who have read my serial story 'Vigyani' will remember the machine I mentioned that produces water from air. The fuel cell is one such machine, and once developed and mass-produced it will be a great boon to India for electricity generation and water production in all villages and remote places. The National Institute of Standards and Technology here in the U.S., where I am working, is engaged in the development of these fuel cells.

In 2003, President Bush announced a program called the Hydrogen Fuel Initiative (HFI) during his State of the Union Address. This initiative, supported by legislation in the Energy Policy Act of 2005 (EPACT 2005) and the Advanced Energy Initiative of 2006, aims to develop hydrogen, fuel cell and infrastructure technologies to make fuel-cell vehicles practical and cost-effective by 2020. The United States has dedicated more than one billion dollars to fuel cell research and development so far.

Sir William Grove invented the first fuel cell in 1839. Grove knew that water could be split into hydrogen and oxygen by sending an electric current through it (a process called electrolysis). He hypothesized that by reversing the procedure you could produce electricity and water. He created a primitive fuel cell and called it a gas voltaic battery. After experimenting with his new invention, Grove proved his hypothesis. Fifty years later, scientists Ludwig Mond and Charles Langer coined the term fuel cell while attempting to build a practical model to produce electricity.

A fuel cell is an electrochemical device that combines hydrogen and oxygen to produce electricity, with water and heat as its by-products. As long as fuel is supplied, the fuel cell will continue to generate power. Since the conversion of the fuel to energy takes place via an electrochemical process, not combustion, the process is clean, quiet and highly efficient – two to three times more efficient than combustion.
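The electrochemistry above fixes the ideal voltage of a single hydrogen cell: it follows from the Gibbs free energy of forming liquid water, with two electrons transferred per hydrogen molecule. Standard textbook values give the familiar 1.23 V figure:

```python
# Ideal (reversible) voltage of a hydrogen fuel cell: E = -dG / (n * F),
# where dG is the Gibbs free energy of forming liquid water and n = 2
# electrons are transferred per H2 molecule.

DELTA_G = -237.1e3   # Gibbs free energy of formation of H2O(l), J/mol
N = 2                # electrons transferred per H2 molecule
F = 96485            # Faraday constant, C/mol

cell_voltage = -DELTA_G / (N * F)
print(f"Ideal cell voltage: {cell_voltage:.2f} V")   # ~1.23 V
```

Real cells deliver somewhat less because of internal losses, which is why practical stacks connect many cells in series to reach useful voltages.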

No other energy generation technology offers the combination of benefits that fuel cells do. In addition to low or zero emissions, benefits include high efficiency and reliability, multi-fuel capability, siting flexibility, durability, scalability and ease of maintenance. Fuel cells operate silently, so they reduce noise pollution as well as air pollution and the waste heat from a fuel cell can be used to provide hot water or space heating for a home or office.

The biggest application of fuel cells will be in automobiles, reducing fuel costs, dependence on oil supplies, and contributions to global warming.
_____________________________________________________________________________________
Exotic scientific discoveries of our era-5

Time standards and GPS:

The Global Positioning System (GPS) is a radionavigation system that is available worldwide. GPS signals are broadcast from a constellation of 24 or more earth orbiting satellites. Because the GPS signals are derived from the atomic frequency standards on board each satellite, they are widely used as a reference for time synchronization and frequency calibration.

The heart of the GPS system is its atomic frequency standards, and the National Institute of Standards and Technology (NIST), where I am working, has developed such standards. Dr. William Phillips of NIST won the Nobel Prize in Physics in 1997 for his work on laser cooling and trapping of atoms, which paved the way for the development of these standards. You may ask what the connection is between these standards and GPS. In GPS, a position is identified from the differences between the times at which signals are sent and received from several of the 24 satellites. The standards ensure the accuracy of the time measurement, and the resolution of the position fix depends on that accuracy. The present standard is a caesium atomic clock, with which a resolution of only about 10 cm is achieved. But there is a newly discovered strontium-based atomic clock which will improve the resolution. To put it in simple terms, imagine you have parked your car in a big parking area and forgotten exactly where: present GPS cannot help, as it cannot resolve the digits on your number plate, but with the strontium clock it would be possible.
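The link between clock accuracy and position resolution is direct: a GPS range is the signal's time of flight multiplied by the speed of light, so every nanosecond of clock error becomes about 30 cm of range error. A quick illustration:

```python
# GPS ranges are (time of flight) x (speed of light), so clock error maps
# directly to position error. This is why better atomic clocks mean finer
# position resolution.

C = 299_792_458  # speed of light, m/s

for clock_error_ns in (100, 10, 1, 0.1):
    position_error_m = C * clock_error_ns * 1e-9
    print(f"{clock_error_ns:>5} ns clock error -> {position_error_m:.3f} m position error")
```

Centimetre-level resolution therefore demands clocks stable to a fraction of a nanosecond over the measurement, which is what the newer optical-lattice clocks aim to provide.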

When people talk about "a GPS," they usually mean a GPS receiver. In the U.S. almost all cabs are fitted with GPS, and many people have installed a GPS receiver in their cars. Recently, when my daughter visited me from Phoenix, she brought her GPS, and with its help she could drive around Washington, D.C., although she had never driven here before.

The Global Positioning System (GPS) is actually a constellation of 27 Earth-orbiting satellites (24 in operation and three extras in case one fails). The U.S. military developed and implemented this satellite network as a military navigation system, but soon opened it up to everybody else.

Each of these 3,000- to 4,000-pound solar-powered satellites circles the globe at an altitude of about 12,000 miles (19,300 km), making two complete orbits every day. The orbits are arranged so that at any time, anywhere on Earth, there are at least four satellites "visible" in the sky.
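The "two orbits a day" figure can be checked from the quoted altitude using Kepler's third law for a circular orbit. With these round numbers the answer comes out a little under 12 hours, consistent with the roughly half-sidereal-day period of the real constellation:

```python
# Kepler's third law check of "two complete orbits every day": for a circular
# orbit, T = 2*pi*sqrt(a^3 / mu). The quoted ~19,300 km altitude gives a
# period of roughly half a day.

import math

MU_EARTH = 3.986e5       # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6371           # mean Earth radius, km
ALTITUDE = 19_300        # orbital altitude quoted in the text, km

a = R_EARTH + ALTITUDE                                  # semi-major axis, km
period_h = 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 3600
print(f"Orbital period: {period_h:.1f} hours")
```

Because the period is tied to the half-day ground-track repeat, each satellite passes over (nearly) the same spots on Earth twice per day, which simplifies planning the constellation's coverage.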

A GPS receiver's job is to locate four or more of these satellites, figure out the distance to each, and use this information to deduce its own location. This operation is based on a simple mathematical principle called trilateration.

Imagine you are somewhere in the United States and you are TOTALLY lost -- for whatever reason, you have absolutely no clue where you are. You find a friendly local and ask, "Where am I?" He says, "You are 625 miles from Boise, Idaho."
This is a nice, hard fact, but it is not particularly useful by itself. You could be anywhere on a circle around Boise that has a radius of 625 miles.

You ask somebody else where you are, and she says, "You are 690 miles from Minneapolis, Minnesota." Now you're getting somewhere. If you combine this information with the Boise information, you have two circles that intersect. You now know that you must be at one of these two intersection points, if you are 625 miles from Boise and 690 miles from Minneapolis.

If a third person tells you that you are 615 miles from Tucson, Arizona, you can eliminate one of the possibilities, because the third circle will only intersect with one of these points. You now know exactly where you are -- Denver, Colorado.


This same concept works in three-dimensional space as well, but you're dealing with spheres instead of circles.
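The circle-intersection reasoning above can be turned into a few lines of algebra: subtracting the circle equations pairwise cancels the squared unknowns and leaves two linear equations in (x, y). The beacon coordinates and distances below are made up for illustration, not real city positions:

```python
# Minimal 2-D trilateration sketch: subtracting the circle equations
# (x - xi)^2 + (y - yi)^2 = ri^2 pairwise gives a linear system A [x, y] = b.
# Beacon positions and distances are invented for illustration.

import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x2), 2 * (y3 - y2)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a11 * a22 - a12 * a21          # nonzero if beacons are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Three beacons and the distances to an unknown position at (3, 4):
true_pos = (3.0, 4.0)
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(true_pos, b) for b in beacons]

print(trilaterate(beacons[0], dists[0], beacons[1], dists[1],
                  beacons[2], dists[2]))   # -> (3.0, 4.0)
```

A GPS receiver does the same thing with spheres in three dimensions, and solves for a fourth unknown (its own clock offset) using a fourth satellite.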
