Tuesday, December 8, 2009

A Report on the ‘Homi Bhabha Birth Centenary Symposium’, Tata Institute of Fundamental Research, Mumbai, 3-5 Dec. 2009

The birth centenary function started with the inaugural address of the former Chairman of the Atomic Energy Commission, Dr. Anil Kakodkar. He linked the growth of capability in nuclear energy to addressing the challenge of climate change. In doing so he reviewed new possibilities in the renewable energy sector, i.e. biofuel, hydrogen fusion and solar energy. He disclosed that all laboratories and institutions of the Department of Atomic Energy will be linked under one parent consortium, to be known as the “Homi Bhabha National Institute”. This institute, he believed, will be responsible for the pursuit of new ideas, speedier transfer of research into applied technology and the development of new scientific frontiers.

Another inaugural lecture, following this, was given by Dr. C. N. R. Rao, Honorary Professor, Jawaharlal Nehru Centre for Advanced Scientific Research. He said that, as far as the development of Indian science is concerned, Dr. Bhabha was the only visionary with unparalleled imagination other than Nehru, and that Bhabha realised the importance of science for the growth and development of India. Hailing Bhabha as a true ‘colossus’, Dr. Rao regretted that the value system in India does not recognise science today. He threw light on how we can excel in science: “India must be a visible power in science, whether in publications, discoveries or technologies and their use in human life. We should be part of mainstream science, by virtue of which we should be publishing 7-10% of the top one percent of quality scientific papers in peer-reviewed journals. Like Bhabha, picking and working on the right problems must be the priority of Indian science.”

The lecture series commemorating Dr. Bhabha started with the 1957 Nobel Laureate Dr. C. N. Yang (honoured for his work on parity violation in weak interactions). He talked about ‘Vector potential to connections on a fibre bundle.’ The vector potential was first used to calculate the magnetic field due to an electric current. Maxwell originally seized upon the idea to mathematically express Faraday’s intuitive ‘electrotonic state.’ In the 20th century, the vector potential evolved into a fundamental quantity, the gauge flexibility of which determines all fundamental interactions. Furthermore, the vector potential is identical to the mathematician’s concept of connections on a fibre bundle.
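For reference, the textbook relations behind Yang’s theme (a standard summary, not taken from the talk itself): the electric and magnetic fields derive from the potentials, and a gauge transformation changes the potentials without changing the fields:

```latex
\mathbf{B} = \nabla \times \mathbf{A}, \qquad
\mathbf{E} = -\nabla \phi - \frac{\partial \mathbf{A}}{\partial t}, \qquad
\mathbf{A} \to \mathbf{A} + \nabla \chi, \quad
\phi \to \phi - \frac{\partial \chi}{\partial t}.
```

It is exactly this gauge freedom that, in the fibre-bundle language, becomes the choice of connection.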

Citing the successful example of the agricultural revolution in India, Dr. M. S. Swaminathan heralded Dr. Bhabha’s vision of basic science transforming the lives of millions of people. He identified four basic reasons for the success of the green revolution: technology, services offered to farmers, public policies of the government, and the enthusiasm shown by farmers, which together achieved landmark production results exceeding those of the preceding 4000 years of our civilisation. Dr. Bhabha presided over the first UN Conference on Peaceful Uses of Atomic Energy in Geneva in 1955 and was instrumental in organising the exhibition ‘Atoms on the Farm’ in Delhi in 1958. Today, our agriculture is crying out for technological upgrading. It will be timely if we harness nuclear tools in areas such as: a) soil health and water management, b) new genetic strains of crops, c) plant protection, d) harnessing tools of microbiology, e) food irradiation and post-harvest technology, f) nuclear energy for agriculture, and g) a whole range of research applications of radioisotopes, such as enhancement of the physiological efficiency of crop plants and farm animals. It is high time we made greater use of the tools of atomic energy in improving the productivity, profitability and sustainability of our major farming systems.

Prof. Knut Urban, Director of the Institute of Solid State Research, Helmholtz Association, Germany, threw light on ‘Feynman’s dream come true: studying the atomic structure of materials by ultra-high-resolution transmission electron microscopy’. Emphasising Dr. Bhabha’s vision in creating great institutions, he underlined inspiration as the principal reason behind quality scientific research.

He continued, “Fifty years ago, Richard Feynman in his famous lecture ‘There’s plenty of room at the bottom’ demanded the development of atom-resolving electron microscopy as one of the key instruments of what has in the meantime become nanoscience. Recently, transmission electron microscopy has indeed been able to take this visionary step forward with the introduction of aberration-corrected electron optics. An entirely new generation of instruments enables studies in condensed matter physics and materials science to be performed at genuine atomic-scale resolution. The accuracy of spatial measurements has reached the picometer range. These are dimensions where many physical effects and functions have their origin. This fulfils the long-standing dream of being able to derive values for atomic-position-dependent parameters of the physical properties of materials by measurements atom by atom. These new possibilities are meeting the growing demand of nanoscience and nanotechnology for the atomic-scale characterisation of materials, nanosynthesised products and devices, and the validation of expected functions. However, understanding the atomic-scale results is generally not straightforward and only possible with extensive quantum-mechanical computer calculations.”

Following the Feynman theme of probing atoms, Prof. W. Mark Rainforth discussed the question: how does structure at the atomic scale affect tonnage quantities of material, and can we extrapolate between the two? He talked about case studies ranging from the structure of nanoscale multilayer coatings and the complex structure of modern high-strength steels to the surface of hot-rolled aluminium.

In his talk on ‘Graphene and beyond’, Dr. C. N. R. Rao said, “Graphene is a fascinating new nanocarbon possessing single, bi- or few layers of carbon atoms forming six-membered rings. Different types of graphene have been investigated by X-ray diffraction, atomic force microscopy, scanning tunnelling microscopy and Raman spectroscopy. The main challenge with graphene crystals, wires, walls and films is that of functionalisation. The extraordinary electronic properties of single- and bi-layer graphenes are indeed most unique and unexpected. Other properties of graphene, such as gas adsorption, magnetic and electrochemical properties and the effects of doping by electrons and holes, are equally noteworthy. Interestingly, molecular charge-transfer also markedly affects the electronic structure and properties of graphene. Many aspects of graphene are yet to be explored, including synthetic strategies which can yield sufficient quantities of graphene with the desired number of layers. While graphene continues to be in the limelight, we are thinking in the direction of graphene analogues of layered inorganic materials.”

Dr. R. Chidambaram, Principal Scientific Adviser to the Govt. of India, spoke on challenging problems in condensed matter and materials physics, ranging from explaining structure-function relationships in crystallised biological macromolecules, to the behaviour of materials under extreme conditions of temperature and pressure, to the design of better electronic and energy materials. He also covered fundamental research being done in the DAE in mathematics, astronomy, biology, particle physics, condensed matter physics, computers and communication, and neutron crystallography, and reviewed the progress of the three-stage nuclear power programme. The DAE has taken the initiative to explore new research areas like nanoelectronics, nanobiotechnology, and solar photovoltaic and solar thermal energy. He expressed his views on the diminishing boundaries between academic and strategic science, and on the DAE’s success in proving that self-directed strategic research originating from fundamental science can have enormous influence on applications in everyday life. He praised the TIFR-CERN collaboration, and described in detail how important it is to have technological foresight: first creating a knowledge base and then investing in applied research. In essence, Dr. Chidambaram consistently appealed to Indian scientists to work on challenging problems, as Bhabha did throughout his life.

Dr. Claude Guet, Director of the Office of the French High Commissioner for Atomic Energy, talked about ‘Basic sciences: issues for sustainable nuclear energy’. He said, “Our aim should be twice the energy with half the carbon. Energy is one of the biggest challenges to be faced in the next decades in order to cope with world population needs, depletion of traditional resources, and obligations to mitigate emissions of greenhouse gases. The strong asset of nuclear energy is to ensure over the long term a substantial fraction of base-load electricity production and thus to efficiently complement future intermittent renewable energy supplies such as solar or wind. However, safety requirements, reduction of proliferation risks, waste management, highly reliable and cost-competitive industrial operation, as well as public acceptance, call for important and long-term efforts in science and technology. Even though there are widespread doubts about the viability of nuclear energy as an option, the nuclear industry’s ability to deal with errors has increased thanks to robust means of stopping the evolution of an accident.” His talk focused on recent advances in understanding and predicting material properties under severe conditions, advanced chemistry for mastering the fuel cycle, nuclear data, large-scale computer simulations, and fast breeder reactors.

Former S&T Minister, GoI, and former Director, TIFR, M. G. K. Menon shared with the audience moments from Dr. Homi Bhabha’s life. He said, “Homi’s life made a fantastic difference to India through his commitment to the growth of modern science, his realisation that science and technology were the key to India’s transformation into a developed country, his commitment to fundamental research, and his resolve to create an ambience in new institutions with highly innovative patterns of administration and management.” Dr. Bhabha also focused on completely new opportunities in space and electronics.

Three academies in India (the Indian National Science Academy, the National Academy of Sciences, India, and the Indian Academy of Sciences) paid tribute to the legacy of Dr. Homi Bhabha.

Continuing the series, Dr. Ramaswamy Raghavan talked about ‘Neutrinos inward bound: a frontier beyond particle physics.’ He said, “Major advances in experimental neutrino science in the past decade have created new paradigms for elementary particle physics with the epochal discovery of the non-zero mass of the neutrino. There are surprising developments in neutrinos ‘inward bound’: the monumental opportunity to look into the centre of the sun using neutrinos and to trace the evolution of the earth and the planets. Recently, resonance reactions of low-energy neutrinos, with extremely precise energies and very large cross-sections, promise surprising new landscapes for neutrino research and for exploring the deepest foundations of fundamental physical theory.”

Prof. Sunil Sinha from the University of California, San Diego talked about the use of coherent X-ray beams in studying the structure and dynamics of condensed matter. He recalled that it was Dr. Bhabha who, by ushering in the age of nuclear power, introduced to the Indian scientific community the use of neutron scattering as a powerful probe of the structure and dynamics of condensed matter. Since then, scattering techniques have rapidly developed in sophistication and capability with new radiation sources. The use of coherent X-ray beams to study condensed matter has attracted increasing attention from researchers with the advent of high-brilliance synchrotron X-ray sources and free electron LASERs. The applications include studying the slow dynamics of condensed matter in real time, obtaining bond-orientational order in glasses and imaging nanostructures in real space. These studies will be greatly facilitated by the powerful new X-ray LASER sources currently being constructed.

Prof. Obaid Siddiqi, founder Director of the TIFR National Centre for Biological Sciences, talked about Dr. Bhabha’s vision of exploring new areas of scientific research and his understanding of the need to create a research environment for different sciences and technologies in one organisation. He said, “Science is at the root of history. There is no fixed way of doing science, although there are some standard methods. The government’s attitude towards science has not changed much since those days. Our politics has become the least scientific. Bhabha showed how we can govern science by cultivating coherence in an organisation. He is known to have been an advocate of the policy of growing science around people, not around buildings and equipment.”

A presentation titled ‘The emotional brain: imprints of life history’ discussed how early childhood experiences shape the neurocircuitry of emotion and determine adult emotional behaviour. The speaker said, “While nurture in early life evokes resilience, early-life stress induces vulnerability to mood disorders like anxiety and depression. Neurocircuits of emotion are thought to be altered and modified based on early-life experience. What was traditionally the realm of psychology has been energised by exciting evidence from neuroscience indicating how life experience results in imprints that are carried at the genetic, molecular, cellular and behavioural levels across the lifespan.”

Newly appointed Atomic Energy Commission chief Dr. S. Banerjee talked about the progress of the Indian nuclear energy programme and its success in yielding new applications for the civilian sector. He said, “The DAE is in the process of perfecting Pressurised Heavy Water Reactors as part of the three-stage nuclear power programme. Closing the uranium fuel cycle for the best use of fissile and fertile material is a significant step in this direction. Reduction of waste also forms an important ingredient of this programme. Apart from a robust strategic programme, the DAE is committed to technology development based on basic research.” He talked in detail about uranium fuel production, heavy water treatment, on-power fuelling and neutron economy, and described India’s indigenous capability in fuel fabrication, sophisticated equipment development, and technology transfer to and from Indian industries. He told the audience that the capital cost of Indian power plants, at around $1700/kWe, is the lowest in the world. He described in detail the roadmap for generation of power from thorium in the third stage of nuclear power production. Considering that 60% of Indians have no access to any form of power, the challenge in front of the Indian nuclear power establishment is huge: the DAE requires a tenfold improvement in generation of nuclear power. He also highlighted other achievements of the DAE like the Bhabhatron, desalination plants, Arihant (nuclear submarine), a proton linac, a new university named after Dr. Bhabha, collaboration with the National Institute of Science Education and Research, Bhubaneswar, and the Giant Metrewave Radio Telescope. In his opinion, this quality of research must expand to other areas so as to reach solutions early, with fuels that are low-emission and eco-friendly.

Dr. Mustansir Barma, Director, TIFR, shared the current and historical contributions of TIFR. Programmes and institutions like SAMEER, C-DOT, NCST, C-DAC, IUCAA, IUAC, DAE, CBS and PRL all originated in TIFR. He highlighted the path-breaking work on pulsars and dwarf galaxies done recently with the GMRT, and TIFR’s contributions to air defence, the army radio network and the dedicated design of microprogrammed computation. TIFR will be exploring new areas like soft matter, biosciences, materials science, optics, nanosciences and high-energy infrared LASERs.

Former Director General of CERN and 1984 Physics Nobel Laureate Carlo Rubbia talked about ‘An undetectable universe’ of dark matter and dark energy. He said, “Luminous matter accounts for only a tiny fraction of the total mass density of the universe and only a tenth of the ordinary matter (baryons). The bulk of matter in the universe is invisible and dark, and therefore only indirectly observable. The gravitational evidence for such a profound conclusion has been mounting for almost two decades: there is much more matter than there are baryons, and thus baryonic matter, i.e. ordinary matter, is not the dominant form of matter in the universe.

Particle physics provides an attractive solution to the non-baryonic dark matter problem: relic elementary particles left over from the Big Bang. Long-lived or stable particles with very weak interactions can remain from the earliest moments of particle democracy in sufficient numbers to account for a significant fraction of the critical density. The experimental search for such new forms of matter outside the Standard Model is an extremely exciting programme.” Evidence for dark matter comes from galactic rotation curves, but all the evidence is limited to gravitational effects. In particle physics, supersymmetry (often abbreviated SUSY) is a symmetry that relates elementary particles of one spin to partners that differ by half a unit of spin, known as superpartners. In a theory with unbroken supersymmetry, for every type of boson there exists a corresponding type of fermion with the same mass and internal quantum numbers, and vice versa. Supersymmetric particles are increasingly considered a candidate source of non-baryonic dark matter.
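The rotation-curve evidence can be made concrete with a back-of-the-envelope calculation: for a circular orbit, v² = GM(r)/r, so an observed flat rotation curve forces the enclosed mass to grow linearly with radius, far beyond where the visible matter ends (a sketch with illustrative numbers, not material from the talk):

```python
# Enclosed mass implied by a circular orbit: v^2 = G*M(r)/r  =>  M(r) = v^2 * r / G
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
KPC = 3.086e19         # kiloparsec in metres

def enclosed_mass(v_kms, r_kpc):
    """Mass (in solar masses) enclosed within radius r for orbital speed v."""
    v = v_kms * 1e3
    r = r_kpc * KPC
    return v**2 * r / G / M_SUN

# A flat curve (v constant at ~220 km/s, typical of a spiral galaxy)
# means M(r) keeps growing with r: double the radius, double the mass.
print(f"M(<10 kpc) ~ {enclosed_mass(220, 10):.2e} solar masses")
print(f"M(<20 kpc) ~ {enclosed_mass(220, 20):.2e} solar masses")
```

If luminous matter alone set the mass, v would instead fall off as 1/√r beyond the visible disk; the flat curve is the gravitational signature of the dark component.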

Prof. Peter Littlewood from Cambridge University recalled Dr. Bhabha’s days at Cambridge. He said that Bhabha had a tremendous ability to persuade stalwarts in science and government, owing to his immense passion for his work. Prof. Littlewood hoped that India can make significant contributions in solar cells, electrical storage, refrigeration and lighting; each of these requires breakthroughs in materials science and condensed matter physics. He further added that what matters is not how much Big Science is being done but the quality of coordination between Little and Big Science. His lecture concentrated on ‘Bose-Einstein condensation of polaritons’. He said, “Macroscopic phase coherence is one of the most remarkable manifestations of quantum dynamics, yet it seems to be the inevitable ground state of interacting many-body systems. Recently, the classic examples of superfluid helium and conventional superconductors have been joined by exotic and high-temperature superconductors, ultracold atomic gases, both bosonic and fermionic, and more recently systems of excitons, magnons and exciton-photon superpositions called polaritons.”

Former Chairman of the Space Commission Dr. G. Madhavan Nair talked about recent advances in the space sciences. The path-breaking discovery of the presence of water on the lunar surface by Chandrayaan-1 promises to open new frontiers of planetary research and observation of outer space. He also shared new initiatives like Astrosat, a space-based observatory scheduled for launch during 2011, which is expected to contribute further to our understanding of this field. “The future thrust in space sciences will aim at understanding the evolution of planetary systems and galaxies, and exploring the mysteries of the universe.”

The symposium concluded with a talk by Shyam Benegal, who was a Homi Bhabha Fellow during 1970-72. He attributed his devotion to filmmaking to the motivation he received during that fellowship.

Wednesday, November 18, 2009

Indian Academy of Sciences Platinum Jubilee Meet 2009, IISc Bangalore


As per the Academy’s tradition, the first lecture of the meeting was the Presidential Address by D. Balasubramanian. In his talk, ‘When science looks you in the eye’, he discussed a number of novel ways to correct disorders of the eye. Defects such as presbyopia among adults and myopia among children may be treated by replacing the lens with polymer gels which will then be controlled by the ciliary muscles of the eye. Corneal defects can be treated by procedures such as corneal polymer onlays, corneal sculpting (for long term correction), replacement of the entire cornea with biopolymer or co-polymer contact lens, or by reconstruction of outer corneal layer by stem cell methods.

These are important in a country like India, where 140,000 cornea donors are needed but only a few thousand are available. Balasubramanian also talked about methods, such as gene therapy, for correcting retinal defects, and methods for effective drug delivery to the retina, such as nanoparticle-encapsulated delivery of carboplatin in the treatment of retinoblastoma. He also spent some time discussing the use of ‘adaptive glasses’ and photoelectric methods to correct eye disorders. He concluded his speech with questions that are as yet unanswered. Some of them are: (a) Is it possible to deliver genes using nanomaterials? (b) Can tear fluid be used as a diagnostic material? (c) Can reengineered taste bud cells be used to repair the retina? and (d) Is it possible to transplant the entire donor eye?

This was followed by a highly inspiring talk by C. N. R. Rao, ‘Emerging India as a great centre of science.’ He said that the only things he had when he started off on his scientific career were enthusiasm and the inspiration he got by interacting with eminent scientists like Sir C. V. Raman; Indians did not get much money for doing science in those days. He wondered why the quality of science has not improved drastically in recent times though the availability of monetary support has greatly improved. He also discussed a number of problems that scientific progress in India faces, such as the relatively low importance that science is given in society, ‘referee fatigue’ when papers are published in international journals and the low output of research papers and PhD holders from India.


Public Lectures:

a) Nandan Nilekani: 12th Nov. 2009

Unique ID Number

Nandan Nilekani, Chairman, UIDAI, in a lecture ‘Unique Identification Project: Issues and Challenges’, explained the roadmap of the Unique Identification Authority of India. Rising aspirations, the government's renewed emphasis on ‘inclusive policies of economic development’, and the need to access multiple services in the private and public sector have created the urgency for a unique identification number. These services require authentication to fulfil the demands of citizens and customers, and the process of issuing separate cards for each is time- and cost-intensive. This project aims at reducing the duplication of effort in creating multiple databases, and at providing a formal identity to citizens who benefit from the government's welfare schemes. UID will provide a unique 16-digit number to all residents who voluntarily apply for it. However, this number will only be used for verification of the demographic information of the citizen concerned; it will not confer any rights, citizenship or other additional benefits. The system will enable online verification, and the UIDAI will not share the data with anyone else, envisioning a balance between privacy and purpose.

Through the UIDAI, India will thus be the first country to implement a biometric-based unique ID system on such a large scale. In essence, the project aims to enable the government to run direct-benefit programmes more efficiently and to allow departments to coordinate investments and share information. Mr. Nilekani elaborated on technological challenges, viz. the lack of uniformity in biometric modality, the unavailability of biometric patterns in a significant share of the population, institutional cost, the use of distributed computing (along with cloud computing and virtualisation) in database sharing and authentication, optimisation for networks, rural Internet connectivity, security, privacy, scaling, sustainability and adoption of standards.


b) Mark Tully: 13th Nov. 2009

The need for balance in an unbalanced world

His argument centred on promoting more dialogue between the sciences and non-sciences, especially religion, theology, and mystical and traditional knowledge systems. He advocated a greater need for liberal education in order to build bridges between the TWO CULTURES, the sciences and the social sciences. He pointed a finger at the inability of the sciences to appreciate the viewpoint of people at large involved in alternative methods of inquiry like poetry, music and painting, and at the rigidity of scientists, whose authenticated WISDOM relies largely on rational analysis while ignoring popular beliefs and myths.


He asserted that we should give more space, tolerance and a platform for greater dialogue, the spirit reflected in THE ARGUMENTATIVE INDIAN by Amartya Sen. In the Q&A session, one lady asked whether he would preach the same TOLERANCE OF ARGUMENTS to religious fanatics just as he was conveying it to the scientists. Some tangential discussion touched on the writings of Richard Dawkins, the famous biologist who, as Tully put it, does less biology and more criticism of the Catholic Church. Tully was trying to convey that, in this era, religion is the system trying hardest to be in dialogue, rather than a rigid science.



Cold gas at high redshifts: 12th Nov. 2009

R. Srianand from IUCAA, Pune talked about understanding the physical conditions in protogalaxies by analysing cold gas at high redshift. He highlighted a useful method of probing the thermal state of the interstellar medium of galaxies: 21-cm absorption in the spectra of background quasars. Redshift can be understood as an application of the Doppler effect in an expanding universe. Stars form out of cold gas, whose physical conditions are influenced by the local radiation field, the cosmic-ray energy density and photoelectric heating by dust. 21-cm absorbers make it possible to measure the spin temperature of the gas. The challenge in detecting hydrogen through 21-cm absorption using GMRT, GBT and WSRT (all major telescopes) is that in highly excited gas the hydrogen lines are not available for astronomical study. The survey conducted by R. Srianand provides a representative sample of systems to be used in combination with various follow-up observations of a) physical conditions, b) the effect of metallicity, c) the morphology of the absorbing gas, and d) the time evolution of various fundamental constants.
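The 21-cm technique rests on a simple relation: the rest-frame hyperfine frequency of neutral hydrogen (about 1420.4 MHz) is observed lowered by a factor (1+z), which is what lets a survey target absorbers at known quasar redshifts with a radio telescope such as the GMRT (a minimal sketch of the standard relation, not code from the talk):

```python
# Observed frequency of the 21-cm hydrogen line from an absorber at redshift z
F_21CM_MHZ = 1420.405751  # rest-frame HI hyperfine transition frequency, MHz

def observed_frequency(z):
    """Redshifted 21-cm frequency in MHz: f_obs = f_rest / (1 + z)."""
    return F_21CM_MHZ / (1.0 + z)

# An absorber at z = 1.3 must be searched for near 618 MHz,
# close to GMRT's 610 MHz band:
print(f"z = 1.3 -> {observed_frequency(1.3):.1f} MHz")
```

The same relation run in reverse converts a detected absorption frequency back into the redshift, and hence the cosmic epoch, of the intervening gas.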



Raman Spectroscopy: 13th Nov. 2009

The hallmark of the Platinum Jubilee meeting was a symposium on Raman spectroscopy as a tribute to the founder of the Academy, C. V. Raman. Prof. Ajay Sood from IISc presided over the proceedings. The inaugural lecture focused on the versatility of the principle of Raman spectroscopy and its diverse applications in research and life. Prof. Hiro-o Hamaguchi (University of Tokyo) talked about recent developments in Raman spectroscopy enabling in vivo imaging of living cells with high time and space resolution and molecular specificity. He described a Raman spectroscopic ‘signature of life’, in which the metabolic activity of mitochondria is reflected in the spectra, allowing the boundary of a living organism to be accounted for by physics and chemistry. Changes during the process of spontaneous cell death have been traced with excellent molecular specificity by time-resolved Raman imaging. The advantages of this technique, as explained by Prof. Hamaguchi, include knowledge of molecular structure and dynamics, no pre-treatment of samples, LASER excitation for space-resolved measurement, no water interference, and usability with fibre-optic techniques for endoscopy and sensing. Disadvantages include irradiation damage, low sensitivity and fluorescence interference. The major question being addressed by this research is whether we can measure and quantify molecular life so as to apply this knowledge in curing diseases like cancer.


Volker Deckert (Friedrich Schiller University, Germany) talked, in ‘Raman spectroscopy beyond the diffraction limit’, about the evolution of Raman spectroscopy applications alongside the development of the LASER, and about efforts to increase the intensity of a very weak signal. The advent of multichannel detectors and the use of field-enhancing substrates helped to increase spatial information. His talk centred on recent Raman experiments performed with near-field optics. Instead of conventional lenses, light was squeezed through a tiny aperture with dimensions below the Abbe limit. (According to Abbe, a detail with a particular spacing in the specimen is resolved when the numerical aperture of the objective lens is large enough to capture the first-order diffraction pattern produced by the detail at the wavelength employed.) This research was a major breakthrough due to the combination of near-field optical concepts and the field-enhancing properties of plasmonic nanoparticles.
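The Abbe criterion that near-field methods work around can be stated numerically: the smallest resolvable spacing is roughly d = λ/(2·NA), which is why far-field visible-light optics cannot resolve nanometre-scale detail (standard formula with illustrative values, not figures from the talk):

```python
# Abbe diffraction limit: smallest resolvable spacing d = wavelength / (2 * NA)
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Minimum resolvable feature spacing in nanometres."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green laser light through a good oil-immersion objective (NA ~ 1.4):
d = abbe_limit_nm(532, 1.4)
print(f"Abbe limit at 532 nm, NA 1.4: {d:.0f} nm")
```

Even under these favourable conditions the limit sits near 190 nm, orders of magnitude above molecular dimensions; hence the appeal of squeezing light through sub-wavelength apertures and plasmonic tips.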


A. W. Parker (‘Seeing below surfaces: developments in Raman spectroscopy for chemical and medical analysis’) talked about the efficiency of Raman spectroscopy among the different ways of determining molecular structure, such as X-ray, NMR, vibrational spectroscopy and UV. This field was driven rapidly forward by the growth of tunable pulsed (nanosecond and picosecond) LASERs capable of operating across the ultraviolet, visible and near-infrared spectral regions. The Raman effect can be successfully adopted to characterise bone disease in both the mineral and the organic component of the bone matrix. This research can be used to develop an analytical, non-invasive method for assessing the composition of the bone matrix, to support diagnosis and to monitor treatments.


Prof. Siva Umapathy of the LASER Spectroscopy group at IISc talked about the changing paradigm of Raman spectroscopy, passing from physics to biology through chemistry. These trends include nonlinear spectroscopy, Raman loss spectroscopy and biophotonics. The new wave of applications of fluorescence to biology and to materials was discussed, as was time-dependent, especially femtosecond, dynamics. He talked about a new form of spectroscopy which uses the principle of stimulated Raman scattering: here the signals are observed as negative (loss) peaks, on the high-energy side of the excitation wavelength. Ultrafast Raman Loss Spectroscopy (URLS) has the advantages of very intense signals compared to normal stimulated Raman and the ability to record signals even from highly fluorescent samples at femtosecond time resolution. URLS is tipped to be a competitor to Coherent Anti-Stokes Raman Spectroscopy and thus to find applications in biology and medicine in the years to come. Rapid data acquisition, natural fluorescence rejection and experimental ease make it a unique and valuable technique for molecular structure determination.


Multifunctional poly(vinylidene fluoride) using supramolecular interactions (Arun K. Nandi: 14th Nov. 2009)

Achieving enhanced physical and mechanical properties in commercial polymeric materials by blending with other polymers, copolymerising with other monomers, grafting the main chain with suitable polymeric/oligomeric moieties, and making composites with nanofillers (e.g. clay, carbon nanotubes, metal nanoparticles) has been the main thrust of polymer materials research in the past few years. Dr. Nandi reported the use of supramolecular interactions to achieve interesting properties by grafting N,N-dimethylaminoethyl methacrylate (DMAEMA) and n-butyl methacrylate directly from a poly(vinylidene fluoride) (PVDF) backbone in the solution phase by atom transfer radical polymerisation (ATRP). Gel permeation chromatography (GPC), nuclear magnetic resonance (NMR) and polymerisation kinetics studies confirm the ATRP nature of the polymerisation, in which four graft polymers were prepared. A few graft copolymers show super-gluing properties and can carry a weight of 16-18 kg for a sample of 0.015 cubic centimetre volume. Because of its water solubility, the polymer promises great use in biotechnology, nanotechnology, energy research and separation processes.


Efficient graph algorithms: T. Kavitha (IISc): 14th Nov. 2009

Many real-world problems can be posed as problems on graphs; Kavitha used road networks to illustrate this. Efficient graph algorithms are concerned with computing optimum paths among all pairs of nodes, and such graphs may also raise edge-connectivity problems. The focus of this research is keeping the time taken by an algorithm small; to search for efficient algorithms it is also necessary to understand the steps of optimal matching.
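As a minimal illustration of the all-pairs flavour of the problem (my sketch, not Kavitha's algorithm), repeated breadth-first search computes optimum paths between every pair of junctions in an unweighted toy road network:

```python
from collections import deque

def bfs_shortest_paths(adj, source):
    """Shortest hop-counts from `source` in an unweighted graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:          # first visit = shortest path
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def all_pairs_shortest_paths(adj):
    """Run BFS from every node: O(V * (V + E)) overall."""
    return {u: bfs_shortest_paths(adj, u) for u in adj}

# Hypothetical road network: junctions as nodes, roads as edges.
roads = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}
dist = all_pairs_shortest_paths(roads)
print(dist["A"]["E"])  # 3 (A -> B -> D -> E)
```

For weighted roads one would swap BFS for Dijkstra's algorithm; the point is only that the total running time, here O(V·(V+E)), is what "efficient" refers to.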


Interfacial electrochemistry using functionalised surfaces: S Sampath (IISc): 14th Nov. 2009

Molecules are intentionally attached to various surfaces in order to study their properties and subsequently use them for a variety of applications. Prof. Sampath talked about interfacial electrochemistry, i.e. interactions at solid-liquid, solid-solid and liquid-liquid interfaces. He discussed organic thin films and their possible applications in molecular electronics, chemical sensors, fuel cells, corrosion protection and the study of novel surfaces. Organic thin films, by virtue of forming a single molecular layer on the substrate with common orientation, a high degree of order and packing, and their amenability to modification, are strong candidates for such advanced applications; the challenges lie in patterning, robustness and catalytic use. The materials used in this regard are donor-spacer-acceptor monolayers achieved by stepwise assembly, in which molecules are sandwiched between two metal surfaces while orientation-dependent electrocatalysis is applied.



NMR as a probe for strongly correlated electron behavior in mesoscopic devices: Vikram Tripathi, TIFR: 14th Nov. 2009

Semiconductor mesoscopic structures are associated with strongly correlated electron phenomena such as the fractional quantum Hall effect, the Kondo effect and Coulomb blockade. Nuclear magnetic resonance techniques have proved very useful in the study of strongly correlated electron phenomena in bulk systems. Dr. Tripathi showed that, with suitable adaptations, NMR can prove similarly useful for probing electrons in mesoscopic structures. He illustrated the advantages of NMR with respect to transport measurements by considering two examples of strongly correlated behavior in semiconductor mesoscopic structures: (a) the 0.7 conductance anomaly in ballistic quantum wires, and (b) the Kondo lattice scenario in disordered two-dimensional electron gases in heterostructures.




Glycosidase is an enzyme that brings about cleavage of glycosidic bonds and helps in the metabolism of starch, glycolipids and glycoproteins in most living cells. Disorders such as diabetes can be controlled using glycosidase inhibitors as drugs. D. D. Dhavale talked about his work on iminosugars as glycosidase inhibitors and immunomodulatory agents, especially focusing on 1-deoxy-1-hydroxymethyl castanospermine and 1-deoxy-1-epi-hydroxymethyl castanospermine. These non-toxic compounds were synthesized, using traditional methods of organic chemistry, from castanospermine, which occurs naturally in black beans. Apart from being glycosidase inhibitors, they are also immunomodulatory: they show cell-proliferating capacity, reduce levels of interleukin-4 (IL-4), increase levels of interleukin-6 (IL-6) and might trigger an improved antibody response. Dhavale said that similar studies have been carried out in his lab on five-, six- and seven-membered and bicyclic pyrrolizidine, indolizidine, quinolizidine and oxa-azulene iminosugars.






Pramod Aggarwal spoke about the impact of climate change on Indian agriculture and listed some steps that may be taken to protect our farmers’ interests. In the short term, an increase in carbon dioxide concentration might improve the yields of crops such as cereals and pulses. But even a one-degree-centigrade rise in temperature will decrease the yield of crops like wheat, mustard and groundnut by 3% to 7%. He also projected that the productivity of rice, wheat, etc. will drop by 10% to 40% by 2100. Climate change will cause droughts, floods, increased pathogen activity and increased heat stress, all of which will directly affect agriculture. He pointed out that over the last three decades rice yields in the Indo-Gangetic plains and apple yields in Himachal Pradesh have declined due to changing weather. Some of the adaptive strategies he suggested are: assisting farmers to cope with climate change by setting up agro-advisories, insurance policies and seed banks; intensification of food production systems; improved land and water management; introduction of sustainable policies; strengthening research to develop resistant varieties; resource conservation; and the sharing and evaluation of traditional wisdom.






Rudiger Wehner talked about the remarkable navigational capability of the ant Cataglyphis bicolor, which helps it return in a more or less straight line to its nest once it finds food. This 10 mg desert ant with a 0.1 mg brain solves the complex computational tasks required to achieve this feat relying mainly on visual cues. Wehner, in his talk ‘A neuroethologist’s look into the cockpit of an insect navigator’, said that the ant uses the polarized light of the sun as a compass and makes a ‘cognitive map’ of the landmarks around it. He also explained how his group studied the navigational capabilities of this ant through behavioural and neurophysiological studies coupled with computer and robotic simulations.






R. Gadagkar, in his talk ‘Communication of social status in a primitively eusocial wasp’, drew lessons from various studies done in his lab on the wasp Ropalidia marginata. He explained how a queen rubs a non-volatile pheromone, secreted in her Dufour’s gland, on the ‘floor’ of the nest to maintain her reproductive monopoly. He also talked about the intriguing behavior of the wasps, which accept the successor to the queen’s ‘throne’ without challenging it at all. We still do not know where the successor’s ‘prestige’ comes from, nor can we predict which wasp will succeed the queen. But we do know now that the wasps know who the heir designate is!






Every other cardiac patient in the world will be an Indian in two years’ time. Efficient diagnosis of heart disease, hence, is very important. Explosive detection too has become essential in today’s world. V. Ramgopal Rao, in his talk ‘Polymer-based sensor systems for healthcare and homeland security applications’, talked about the sensory applications of micro-cantilever platforms made of polymer materials such as SU-8. Heart diseases can be diagnosed based on vibrations of the cantilevers when antibodies coated on them react with antigens. Explosive detection is more complex as explosives have very low vapour pressure. Cantilevers coated with materials that have an affinity to explosives can be used. However, sensitivity of the cantilevers is a problem that needs to be resolved.
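The mass-sensing principle behind such cantilever diagnostics can be illustrated with the standard small-shift approximation for a resonator, Δf ≈ -(f0/2)(Δm/m0). This is textbook resonator physics, not a formula quoted in the talk, and the numbers below are hypothetical:

```python
# Mass-loading of a resonant cantilever (standard small-shift
# approximation): an added mass dm lowers the resonant frequency f0
# of a cantilever of effective mass m0 by roughly
#     df ≈ -(f0 / 2) * (dm / m0)

def frequency_shift(f0_hz, m0_kg, dm_kg):
    """Approximate change in resonant frequency due to added mass."""
    return -(f0_hz / 2) * (dm_kg / m0_kg)

# Hypothetical numbers: a 10 kHz cantilever of 1 ng effective mass
# capturing 1 pg of bound antigen.
print(frequency_shift(10_000, 1e-12, 1e-15))  # ≈ -5.0 Hz
```

The practical sensitivity problem the speaker mentioned is visible here: picogram-scale binding events produce only hertz-scale shifts, which must be resolved against noise.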






A. Bhardwaj spoke about the basic design of the Sub-keV Atom Reflecting Analyzer (SARA), an experiment onboard Chandrayaan-1, and its recent findings. SARA consists of three parts: the Chandrayaan-1 Energetic Neutrals Analyzer (CENA), the Solar Wind Monitor (SWIM) and a Digital Processing Unit (DPU). CENA was designed to face the lunar surface, while SWIM was placed at right angles to it, facing the sun. CENA was used to observe energetic neutral atoms (ENAs) in the 10 eV – 3 keV energy range, and SWIM to measure ions in the ~10 eV – 15 keV range. The main objective of SARA was to study the interaction of solar wind ions with the lunar surface by measuring ENAs and ions.


The major findings of the experiment are: (a) solar wind ions are scattered from the lunar surface after impact; (b) ~20% of the impinging solar wind protons are reflected back as ENAs; (c) the moon is a strong source of ENAs; (d) detection of lunar night-side ions; and (e) the discovery of a mini-magnetosphere on the moon near the Gerasimovich crater on 17 June 2009. Bhardwaj concluded by saying that lunar surface interactions are much more complex than previously known, and the microphysics of plasma-surface interaction is poorly understood.






In his talk ‘Prey-predator response: current research and paradigm shift’, Joydev Chattopadhyay examined a new solution for the classical ecological problem, ‘the paradox of the plankton’. The problem concerns the driving force behind the ever-changing species abundance in plankton communities and the resulting non-equilibrium. Based on field experiments and mathematical modeling, he sought to provide a plausible answer to this 48-year-old problem. In a system of Artemia (a predator), Chaetoceros (non-toxic prey) and Microcystis (toxic prey), for example, Artemia consumes Chaetoceros in the first few days. As the population of the non-toxic prey diminishes, Artemia begins to consume Microcystis. This causes a reduction in the Artemia population, allowing the Chaetoceros population to increase again. Such a cycle of events allows the coexistence of these species in nature. Chattopadhyay suggested that a paradigm shift may be required to solve such problems, involving the derivation of new functional responses (FRs) based on various phenological events, rather than depending on already existing FRs.






One of the important tasks of polymer materials research has been to develop commercially viable materials with better physical and mechanical properties, through methods such as copolymerizing with other monomers, grafting the main chain with another polymeric or oligomeric moiety, and so on. Arun K. Nandi, in his talk ‘Multifunctional poly(vinylidene fluoride) using supramolecular interactions’, described the development and properties of a material obtained by grafting N,N-dimethylaminoethyl methacrylate (DMAEMA) and n-butyl methacrylate directly from a poly(vinylidene fluoride) (PVDF) backbone in solution, by atom transfer radical polymerization (ATRP). Four graft polymers, varying in graft density according to the number of hours of polymerization, were prepared and named PD-6, PD-12, PD-18 and PD-24 respectively. Compared to PVDF, these polymers show induced solubility in water, a melting point lower by 5 to 6 degrees centigrade, 45 times higher tensile strength, a 1970% increase in toughness, a 5-fold increase in dielectric constant and better gluing properties. These properties suggest their applicability in biotechnology, nanotechnology, energy research, etc.







In the context of rising global temperatures due to climate change, the development of heat-tolerant rice varieties becomes important, especially for rice-dependent countries like India. Anil Grover, in his talk ‘Molecular components involved in mounting response to high temperature stress in rice’, talked about a class of proteins called heat-shock proteins (HSPs), with particular reference to HSP100. HSPs are produced in all organisms in response to heat (or any other stress). Through their chaperone action, they remove denatured proteins and also help to restore the native conformation of proteins. HSP100 is particularly important as it is produced very quickly, and in large quantities, in response to stress. The gene for HSP100 has a 2 kb promoter region, an essential component.


Regulation of HSP100 in rice involves the action of cytoplasmic, mitochondrial and chloroplast associated ClpB proteins too. The function of the latter two ClpB proteins is not fully understood as yet. ClpB proteins act in association with small HSPs (sHSP). Of the 40 HSP20 genes identified in rice, 23 were found to constitute sHSPs. Further work is aimed at building a comprehensive model of rice HSP gene expression, which would aid the development of heat tolerant varieties.





Navroz Dubash of the Centre for Policy Research, in his talk ‘What should be India’s strategy in climate negotiations?’, spoke about the international and national debates on climate change. The international debate ranges from ‘who should decrease emissions, when and by how much’ to ‘what should be India’s role in climate change mitigation’. Indians agree that being labeled a ‘major emitter’ is unfair, as we contribute only 4% of global emissions (though that makes us the fourth largest emitter) and our per capita emissions are 70% below the world average. Our efforts to mitigate climate change are not being appreciated, and we already carry the burden of still being a developing nation. Dubash said there are three varying views in India: (a) the ‘growth-first stonewallers’ believe that it is our turn to become a developed nation and that international commitments must be stonewalled; (b) the ‘progressive realists’ believe that it is unfair to have international commitments imposed on us, and that the world is using India as an excuse, though they suggest that co-benefits need to be explored; and (c) the ‘progressive internationalists’ agree that India is being used as an excuse, but believe that we need to develop an effective climate regime as it is impossible to delink domestic and global issues. Dubash suggested that ‘the two progressive groups have to join forces for a renewed Indian climate politics’.






Malaria is a major disease in tropical countries like India. Estimates show that there are around 500 million cases of malaria worldwide, resulting in 1 million deaths annually. This warrants the development of a vaccine against Plasmodium. Chetan E. Chitnis presented a new approach to vaccine development in his talk ‘Rational design of a malaria vaccine’. Plasmodium has a number of polymorphic strains, which has made vaccine development difficult. Chitnis concentrated on the possibility of developing a vaccine against the highly conserved binding site of the pathogen’s Duffy binding protein (DBP), which binds to the Duffy antigen receptor for chemokines (DARC) on RBCs. This would inhibit RBC invasion by Plasmodium merozoites, and might be an effective way of protecting people in regions where malaria is endemic.

--

====================================================================

Sunday, November 8, 2009

The World of Imagination Set Free before the Future Shock of Inevitable Breakthrough Prediction!

Sir Karl Popper famously said, “By making their interpretations and prophecies sufficiently vague... and in order to escape falsification, astrologers destroyed the testability of the theory.” (1) In reply, the legendary Thomas Kuhn says, “The history of astrology during the centuries when it was intellectually reputable records many predictions that categorically failed. Not even astrology’s most convinced and vehement exponent doubted the recurrence of such failures. Astrology cannot be barred from the sciences because of the form in which its predictions are cast.” (2)

As soon as we try to understand the role and significance of prediction, we are talking about a definite objective, a goal to be followed in a certain direction in times ahead. And mind you, when the goal is as illuminated and shining as a coveted prize like the Nobel, then we should really listen to Salvador Luria, the 1969 Nobel Laureate in Physiology or Medicine. He says, “The goal in science should be to find out things, not to win a prize... Yet, in conversations with some of my younger colleagues, I get a sense that it has become a goal, and that is not good. I think it would be better if there were no prizes.”

As P. Balaram pointed out in his recent editorial: “Most professional scientometric analysts have little feeling for science itself, hoping that insights might emerge from impersonal quantitative methods. In recent times practicing scientists, often physicists, have entered the field of scientometrics, bringing with them increasingly complex ways of analyzing the exploding volume of scientific literature.”(3) These lines, and the talk ‘Reinventing the Research University in India’ that he delivered at Jawaharlal Nehru University on 31st August 2009, ignited my curiosity about the ability of the scientific mind to speculate about the future, both about discoveries/inventions and about prizes.(4)

Thomson Reuters has recently taken up the practice of declaring Citation Laureates, a step in the direction of predicting Nobel Laureates. Notably, Thomson Reuters had predicted Elizabeth H. Blackburn, Carol W. Greider and Jack W. Szostak as probable winners of the 2009 Nobel Prize for their discovery “of how chromosomes are protected by telomeres and the enzyme telomerase”. To validate the claim, Thomson Reuters said that “citations are lagging indicators as to research but leading indicators as to the peer system and prizes.” They owe their methodology to the cumulative work of Eugene Garfield, who argues that “Nobel Laureates publish five times the average number of papers, but their work is cited 30 to 50 times the average, contributing to a high h-index among the peer community.” (5)
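Garfield's point about citation multiples is usually summarised by the h-index: the largest h such that a researcher has h papers cited at least h times each. A minimal sketch of the computation, with made-up citation counts:

```python
def h_index(citations):
    """h-index: largest h such that the researcher has h papers
    with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:   # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h

# Hypothetical per-paper citation counts for one researcher.
print(h_index([30, 12, 8, 6, 5, 3, 1]))  # 5
```

Five papers have at least five citations each, but no six papers have six; hence h = 5.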


How do scientific breakthroughs happen?
Before ascertaining the actual correctness of a forecast or prediction, we should know the distinct features by which we assess a “breakthrough” among discoveries and inventions, so as to qualify it as pioneering work awaiting global recognition. The possibilities regarding breakthroughs are explained by Martino(6) as:
a) Breakthroughs should never cause surprise, because they are inevitable.
b) Some breakthroughs are surprising, but these are of no value to the forecaster.
c) While breakthroughs are not inevitable, they should be surprising only to those who are not watching for them.


So, if we read the above three possibilities in the context of the work done by Moed, visualising new scenarios in prediction-making becomes less difficult. Moed says, “The use of citation analysis in research evaluation is more appropriate [when] it is formal, open, scholarly founded, supplemented with expert knowledge, carried out in a clear policy context with clear objectives, stimulating users to explicitly state basic notions of scholarly quality, and enlightening rather than formulaic.” (7)


I think the predictability of discovery/invention is an issue of larger significance than that of the award itself. The degree of evitability or inevitability of a discovery/invention matters in this world of “Big Science, competition and national innovation systems”. Martino (ibid, p. 211) makes a strong case for the inevitability of inventions, and at the same time explains the hindrances to innovation caused by the compartmentalisation of science and by restricted access to scholarly literature. He says, “...compartmentalisation of knowledge causes unnecessary duplication of work, i.e. it forces multiple invention by stifling the normal diffusion of innovation... [improving] information storage and retrieval systems for scientific and technical information [is needed because] much work is repeated simply because it is buried in the literature and is not available to the people who need it.”


Martino then explains the roadblocks in assessing signals that provide information about forthcoming events. Two hurdles are described: a) people in general do not recognise the possibility of signals, and b) signals are buried in noise. Further, this analysis emphasises the role of ‘pattern formation’. A pattern is formed by our attention to each signal (research result), which becomes meaningful when placed in the context of the preceding one. Even at this level, two kinds of errors may occur: a) failure to recognise that two or more research works are interrelated, and b) failure to connect two research areas into the overall pattern of a wider research front.


In Kuhn’s words, “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like. Much of the success of the enterprise derives from the community’s willingness to defend that assumption, if necessary at considerable cost. Normal science, for example, often suppresses fundamental novelties because they are necessarily subversive of its basic commitments.” (8)
According to May, “Although the history of science is full of revolutions, scientists do not expect them or look for them (by definition, because a revolution involves the overthrow of the paradigm that represents the scientist’s concept of ‘truth’).” (9)

He continues: “Standard methods for forecasting future developments in technology cannot predict the impact of major breakthroughs or ‘revolutions’ in basic science that might lead to radically new technology and a fundamental change in the way wars are fought. While such revolutions cannot, by definition, be predicted in any detail, it is possible to identify many of the broad factors that are involved in turning scientific breakthroughs into feasible technology.”


Eventually, our knowledge of the future, even if it depends on quantitative techniques of forecasting, is bounded by the famous lines attributed to Einstein: “Not everything that counts can be counted, and not everything that can be counted counts.”

References:
1)Karl Popper, Conjectures and Refutations, 1963

2)Kuhn, Thomas, ‘Logic of Discovery or Psychology of Research?’ (1970), in
Criticism and the Growth of Knowledge, eds. Imre Lakatos and Alan Musgrave

3)Balaram, P., Current Science, Vol. 96, No. 10, 25 May 2009

4)David Pendlebury, Discover the Power of Quantitative Analysis, The Art &
Science of Identifying Future Nobel Laureates, Presentation made at DST,
New Delhi on 5th Nov. 2009

5)Eugene Garfield, Identifying Nobel Class Scientists and the
uncertainties thereof, Thomson ISI, European Conference on Scientific
Publication in Medicine and Biomedicine, Sweden, 2006

6)Joseph P. Martino, Technological Forecasting for Decision Making,
American Elsevier Publishing Company, New York 1972

7)Henk F. Moed (July 2005), Citation Analysis in Research Evaluation,
Springer

8)Kuhn T S. The structure of scientific revolutions. Third edition,
University of Chicago Press (1996)

9)May, Andrew (2001), Science forecasting: predicting the unpredictable,
Journal of Defense Science

The Einstein quotation comes from a sign that hung in Albert Einstein’s
office at the Institute for Advanced Study in Princeton.


-------------------------------------------------------------------------------------

Monday, November 2, 2009

New Hopes towards Greater Research Collaboration through progress in the Worldwide LHC Computing Grid

“Is the Internet looking towards the future through Grids having silver clouds of transformation?”


In the much-awaited moments of November 2009, the Large Hadron Collider (LHC) at CERN (the European Organization for Nuclear Research) succeeded in channeling a proton beam in its accelerator after a one-year gap in operations. In parallel came excited discussions about the enormous progress being made on the potential of the Worldwide LHC Computing Grid (WLCG) to transform the Internet into a more accessible, fast medium enabling voluminous data transfer and data sharing for collaborative exchanges of information, leading towards greater capabilities in scientific cooperation across borders.


Grid computing revolutionizes the way scientists share and analyse data by enabling researchers to share computer power and data storage over the Internet. Grid projects already help researchers search for new wheat genes, predict storms, or simulate the Sun’s interior. The 7000-odd physicists working on experiments at the Large Hadron Collider will rely entirely on grid computing, specifically on the Worldwide LHC Computing Grid, to connect them with LHC data. The computing centres providing resources for WLCG are embedded in different operational Grid organisations, in particular EGEE (Enabling Grids for E-SciencE) and OSG (the Open Science Grid), but also several national and regional Grid structures such as GridPP in the UK, INFN Grid in Italy and NorduGrid in the Nordic region.


The Enabling Grids for E-sciencE (EGEE) project is funded by the European Commission and aims to integrate current national, regional and thematic Grid efforts, in order to create a seamless Grid infrastructure available to scientists 24 hours-a-day, for the support of scientific research. LCG and EGEE are tightly coupled and provide complementary functions. OSG (Open Science Grid) is a U.S. distributed computing infrastructure for large-scale scientific research, built and operated by a consortium of universities, national laboratories, scientific collaborations and software developers. The OSG integrates computing and storage resources from more than 50 sites in the United States, Asia and South America. The OSG is supported by the U.S. National Science Foundation and Department of Energy's Office of Science.


The Globus Alliance involves several universities and research laboratories conducting research and development to create fundamental Grid technologies and produce open-source software. The WLCG project is actively involved in the support of Globus and uses the Globus-based Virtual Data Toolkit (VDT) as part of the project middleware.


During the development of the LHC Computing Grid, many additional benefits of a distributed system became apparent: (Courtesy-CERN) Multiple copies of data can be kept in different sites, ensuring access for all scientists involved, independent of geographical location. It allows optimum use of spare capacity for multiple computer centres, making it more efficient. Having computer centres in multiple time zones eases round-the-clock monitoring and the availability of expert support. There are no single points of failure. The cost of maintenance and upgrades is distributed, since individual institutes fund local computing resources and retain responsibility for these, while still contributing to the global goal. Independently managed resources have encouraged novel approaches to computing and analysis. So-called “brain drain”, where researchers are forced to leave their country to access resources, is reduced when resources are available from their desktop. The system can be easily reconfigured to face new challenges, making it able to dynamically evolve throughout the life of the LHC, growing in capacity to meet the rising demands as more data is collected each year. It provides considerable flexibility in deciding how and where to provide future computing resources. Also, it allows community to take new advantage of new technologies that may appear and that offer improved usability, cost effectiveness and energy efficiency.


The widely reported news that “the Grid will revolutionize the Internet” has been clarified by CERN itself. They say: “Grid computing, like the World Wide Web, is an application of the Internet. When the LHC turns on, data will be transferred from CERN to 11 large computing centres around the world at rates of up to 10 gigabits per second. Those large centres will then send and receive data from 200 smaller centres worldwide. All this data transfer will take place over the Internet. Dedicated fibre-optic links are used between CERN and the large centres; the smaller centres connect together through research networks and sometimes the standard public Internet.” (1)


Commenting further on speculation about an unprecedented increase in the Internet’s sharing and downloading capacity, CERN explains: “First, in order to get such data-transfer rates, individuals would have to do what the large particle physics computing centres have done, and set up (or lease) a dedicated fibre-optic link between their home and the source of their data. Second, today’s grid computing technologies and projects are geared toward research and businesses with highly specific needs, such as vast amounts of data to process and analyse within large, worldwide collaborations. While other computer users may benefit from grid computing through better weather prediction or more effective medications, they may not be logging onto a computing grid anytime soon. (Something called ‘cloud computing’, where your programs are run in a central location rather than on your own computer, may also be on the horizon.)” (ibid)


But scientists and engineers are looking ahead to the Grid not only for sharing documents and MP3 files, but also for connecting PCs with sensors, telescopes and tidal-wave simulators. Though the task of standardizing everything from system templates to the definitions of various resources is a mammoth one, the Global Grid Forum (GGF) can look to the early days of the Web for guidance. The Grid its organizers are building is a new kind of Internet, only this time the creators have a better knowledge of where the bottlenecks will be. Computers on the grid can also transmit data at lightning speed, allowing researchers facing heavy processing tasks to call on the assistance of thousands of other computers around the world. The aim is to eliminate the problem experienced by internet users who ask their machine to handle too much information. The real goal of the grid is, however, to work with the LHC in tracking down nature’s most elusive particle, the Higgs boson. Predicted in theory but never yet found, the Higgs is supposed to be what gives matter mass. The latest spin-off from CERN (the particle physics centre that created the web), the grid could also provide the kind of power needed to transmit holographic images, allow instant online gaming with hundreds of thousands of players, and offer high-definition video telephony for the price of a local call.


Research Collaboration
The WLCG project is also following developments in industry, in particular through CERN openlab, where leading IT companies are testing and validating cutting-edge Grid technologies using the LCG environment. The CERN openlab is a collaboration between CERN and industrial partners to study and develop data-intensive solutions to be used by the worldwide community of scientists working at the next-generation Large Hadron Collider. These experiments will generate enormous amounts of data - 15 million gigabytes a year - and will require a globally distributed Grid of over 150 computing centres to store and analyse the data, with a computing capacity of more than 100,000 of today’s cores.
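As a back-of-envelope check (my arithmetic, not CERN's), the quoted 15 million gigabytes a year corresponds to a fairly modest sustained average rate; the peak rates quoted elsewhere in the article are, of course, far higher:

```python
# Back-of-envelope: what sustained rate does 15 million GB/year imply?
data_per_year_gb = 15_000_000          # 15 PB, as quoted
seconds_per_year = 365 * 24 * 3600     # ignoring leap years

avg_rate_gb_per_s = data_per_year_gb / seconds_per_year
avg_rate_gbit_per_s = avg_rate_gb_per_s * 8

print(f"{avg_rate_gb_per_s:.2f} GB/s")      # 0.48 GB/s
print(f"{avg_rate_gbit_per_s:.1f} Gbit/s")  # 3.8 Gbit/s
```

So the yearly volume averages out to under 4 Gbit/s, well within the 10 Gbit/s dedicated links mentioned above; the engineering challenge is the bursty, globally distributed nature of the traffic, not the average.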




There is no question that scientific research over the past twenty years has undergone a transformation. This transformation has occurred as a result of new technologies leading to new methods of working, which have accelerated the pace of discovery and knowledge accumulation not only in the natural sciences but also in the social sciences, arts and humanities. Research today is often critically dependent on computation and data handling. The practice has become known under various terms such as e-Science, e-Research and cyberscience. Irrespective of the name, many researchers acknowledge that the use of computational methods and data handling is central to their work.



Advances in scientific and other knowledge have generated vast amounts of data, which need to be managed for analysis, storage and preservation for future re-use. Larger-scale science, enabled by the Internet and other information and communication technologies (ICTs), scientific instrumentation and the automation of research processes, has resulted in the emergence of new research paradigms that are often summarised as ‘data-rich science’. A feature of this new kind of research is an unprecedented increase in complexity: in the sophistication of the research methods used, in the scale of the phenomena considered, and in the granularity of investigation. (2)





e-Research
e-Research involves the use of computer-enabled methods to achieve new, better, faster or more efficient research and innovation in any discipline. It draws on developments in computing science, computation, automation and digital communications. Such computer-enabled methods are invaluable within this context of rapid change, accumulation of knowledge and increased collaboration. They can be used by the researcher throughout the research cycle, from research design, data collection and analysis to the dissemination of results. This is unlike other technological “equipment”, which often proves useful only at certain stages of research. Researchers from all disciplines can benefit from the use of e-Research approaches, from the physical sciences to the arts, humanities and social sciences.




e-Research Technologies Supporting Collaboration
e-Research technologies support the research collaborations described above by introducing a model for resource sharing based on the notions of “resources” accessed through “services”. Resources can be computational resources such as high-performance computers, storage resources such as storage resource brokers or repositories, datasets held by data archives, or even remote instruments such as radio telescopes. To make resources available to collaborating researchers, their owners expose services with a well-described interface specifying the operations that can be performed on or with a resource, e.g., submitting a compute job or accessing a set of data.
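The resource/service model above can be sketched in a few lines. The class and method names here are illustrative only and do not correspond to any real Grid middleware API: the point is that collaborators never touch the resource directly, only the operations the service's interface exposes.

```python
# A hypothetical sketch of the resource/service model: a resource (a toy
# compute cluster) is shared only through a service with a well-described
# interface (submit a job, fetch its result).

class ComputeResource:
    """The owner's resource; not accessed directly by collaborators."""
    def run(self, task):
        return f"result-of-{task}"

class ComputeService:
    """The service interface: the only operations collaborators may use."""
    def __init__(self, resource):
        self._resource = resource
        self._jobs = {}
        self._next_id = 0

    def submit_job(self, task):
        """Submit a compute job; returns a job identifier."""
        self._next_id += 1
        self._jobs[self._next_id] = self._resource.run(task)
        return self._next_id

    def get_result(self, job_id):
        """Retrieve the output of a previously submitted job."""
        return self._jobs[job_id]

service = ComputeService(ComputeResource())
job = service.submit_job("simulation-42")
print(service.get_result(job))
```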




Computer-enabled methods of collaboration for research take many forms, including use of video conferencing, wikis, social networking websites and distributed computing itself. For example, researchers might use Access Grid for video conferencing to hold virtual meetings to discuss their projects. Access Grid and virtual research environments provide simultaneous viewing of participating groups as well as software to allow participants to interact with data on-screen. Wikis have also become a valuable collaborative tool. This is perhaps best demonstrated by the OpenWetWare website, which promotes the sharing of information between researchers working in biology, biomedical research and bioengineering using the concept of a virtual Lab Notebook. This allows researchers to publish research protocols and document experiments. It also provides information about laboratories and research groups around the world as well as courses and events of interest to the community.



Social networking sites have also been used or created for research purposes. The myExperiment social website is becoming an indispensable collaboration tool for sharing scientific workflows and building communities. Such sharing cuts down on the repetition of research work, saving time and effort and leading to advances and innovation more rapidly than if researchers worked alone, without access to comparable work. Other social networking sites such as Facebook have been adopted by researchers, and extensions have been built to allow them to be used as portals to access research information. For example, content in the ICEAGE Digital Library can be accessed within Facebook. (ibid)




The Role of Databases for LHC Data Processing and Analysis
Database services are required by the experiments’ online systems, for most if not all aspects of offline processing, for simulation activities, and for analysis. Specific examples include the PVSS Supervisory Control and Data Acquisition (SCADA) system, detector conditions (e.g. COOL), alignment and geometry applications, Grid Data Management services (LCG File Catalog, File Transfer Service), Storage Management services (e.g. CASTOR + SRM), and Grid infrastructure and operations tools (GridView, SAM, Dashboards, VOMS). (3)
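The central idea behind a detector-conditions database such as COOL is that each piece of conditions data (a calibration constant, say) is stored with an interval of validity, and a lookup returns whichever value was valid at a given time. The sketch below illustrates only that concept; it is not the COOL API, and all names and numbers are invented.

```python
# A conceptual sketch (not the COOL API) of conditions data stored with
# intervals of validity: lookup(t) returns the payload valid at time t.

import bisect

class ConditionsDB:
    def __init__(self):
        self._entries = []   # sorted list of (since, until, payload)

    def store(self, since, until, payload):
        bisect.insort(self._entries, (since, until, payload))

    def lookup(self, time):
        """Return the payload whose validity interval contains `time`."""
        for since, until, payload in self._entries:
            if since <= time < until:
                return payload
        raise KeyError(f"no condition valid at t={time}")

db = ConditionsDB()
db.store(0, 100, {"pedestal": 3.2})    # calibration valid for runs 0-99
db.store(100, 200, {"pedestal": 3.5})  # recalibrated for runs 100-199
print(db.lookup(150)["pedestal"])
```

This validity-interval pattern is what lets offline processing reproduce, long after data-taking, exactly the detector conditions that applied to each event.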



In late spring 2007 four high-energy physics (HEP) laboratories, The European Organization for Nuclear Research (CERN), the Deutsches Elektronen Synchrotron (DESY), the Fermi National Accelerator Laboratory (FNAL) and the Stanford Linear Accelerator Center (SLAC), ran a user poll to analyze the current state of HEP information systems. The goal was to achieve a better understanding of the perceptions, behaviors and wishes of the end users of these information systems. The poll received more than 2100 answers, representing about 10% of the active HEP community worldwide. The poll showed that community-based services dominate this field of research, with the metadata-only search engine SPIRES-HEP being the primary information gateway for most scholars. Users also gave their preferences regarding existing functionalities like access to full text and to citation information, and a list of features that they would like to have in the coming years. The results showed that the scholars attach paramount importance to three axes of excellence: access to full text, depth of coverage and quality of content. (4)



A real challenge facing CERN and the LHC is the question: “Can the concept of cloud computing replace that of grids?” (5) The Worldwide LHC Computing Grid (WLCG) has been established, building on two main production infrastructures: the Open Science Grid (OSG) in the Americas, and the Enabling Grids for E-sciencE (EGEE) Grid in Europe and elsewhere. The machine itself – the Large Hadron Collider (LHC) – is situated some 100 m underground beneath the French-Swiss border near Geneva, Switzerland, and supports four major collaborations and their associated detectors: ATLAS, CMS, ALICE and LHCb.




Running a service where the user expectation is for 24x7 support, with extremely rapid problem determination and resolution targets, is already a challenge. When this is extended to a large number of rather loosely coupled sites, the majority of which support multiple disciplines – often with conflicting requirements but always with local constraints – it becomes a major or even “grand” challenge. That this model works at the scale required by the LHC experiments – literally around the world and around the clock, processing and analyzing the data from the world's largest scientific machine – is a valuable vindication of the Grid computing paradigm.




Currently, adapting an existing application to the Grid environment is a non-trivial exercise that requires an in-depth understanding not only of the Grid computing paradigm but also of the computing model of the application in question. The successful demonstration of a straightforward recipe for moving a wide range of applications – from the simplest to the most demanding – to Cloud environments would be a significant boost for this technology and could open the door to truly ubiquitous computing. (ibid)
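One common way to reduce the cost of moving an application between execution environments is to isolate the environment behind a small backend interface, so the application logic itself stays unchanged whether it runs locally, on a Grid, or in a Cloud. The sketch below is hypothetical – the class names and string results are invented for illustration, not taken from any real middleware.

```python
# A hypothetical sketch: the application (run_analysis) is written against a
# backend interface, so switching environments means swapping the backend,
# not rewriting the application.

from abc import ABC, abstractmethod

class Backend(ABC):
    @abstractmethod
    def execute(self, task: str) -> str:
        ...

class LocalBackend(Backend):
    def execute(self, task):
        return f"local:{task}"

class CloudBackend(Backend):
    """Stand-in for a cloud provider; a real one would call a remote API."""
    def execute(self, task):
        return f"cloud:{task}"

def run_analysis(backend: Backend, dataset: str) -> str:
    # The application logic is identical whichever backend is chosen.
    return backend.execute(f"analyse({dataset})")

print(run_analysis(LocalBackend(), "run-2009"))
print(run_analysis(CloudBackend(), "run-2009"))
```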

-------------------------------------------------------------------------------------------------------------------------------

References:

1) http://public.web.cern.ch/Public/en/Spotlight/SpotlightGridFactsAndFiction-en.html

2) Voss, A., & Vander Meer, E. (2009, September 7). Research in a Connected World. Retrieved from the Connexions Web site: http://cnx.org/content/m20834/1.3/

3) Maria Girone, Distributed Database Services – a Fundamental Component of the WLCG Service for the LHC Experiments – Experience and Outlook, CERN Document Server, European Organization for Nuclear Research (Email: Maria.Girone@cern.ch)

4) R Ivanov and L Raae, INSPIRE: a new scientific information system for HEP, CERN Document Server, European Organization for Nuclear Research (E-mail: Radoslav.Ivanov@cern.ch, Lars.Christian.Raae@cern.ch)

5) J.D. Shiers, Can Clouds Replace Grids? A Real-Life Exabyte-Scale Test-Case, CERN Document Server, European Organisation for Nuclear Research (CERN) (e-mail: Jamie.Shiers@cern.ch)

------------------------------------------------------------------------------------------------------------------------------------