Wednesday, November 18, 2009

Indian Academy of Sciences Platinum Jubilee Meeting 2009, IISc Bangalore


As per the Academy’s tradition, the first lecture of the meeting was the Presidential Address by D. Balasubramanian. In his talk, ‘When science looks you in the eye’, he discussed a number of novel ways to correct disorders of the eye. Defects such as presbyopia among adults and myopia among children may be treated by replacing the lens with polymer gels which will then be controlled by the ciliary muscles of the eye. Corneal defects can be treated by procedures such as corneal polymer onlays, corneal sculpting (for long term correction), replacement of the entire cornea with biopolymer or co-polymer contact lens, or by reconstruction of outer corneal layer by stem cell methods.

These are important in a country like India where 140,000 cornea donors are needed but only a few thousand are available. Balasubramanian also talked about methods, such as gene therapy, for correcting retinal defects, and methods for effective drug delivery to the retina, such as nanoparticle-encapsulated delivery of carboplatin in the treatment of retinoblastoma. He also spent some time discussing the use of ‘adaptive glasses’ and photoelectric methods to correct eye disorders. He concluded his speech with questions that are as yet unanswered. Some of them are: (a) Is it possible to deliver genes using nanomaterials? (b) Can tear fluid be used as a diagnostic material? (c) Can reengineered taste bud cells be used to repair the retina? and (d) Is it possible to transplant the entire donor eye?







This was followed by a highly inspiring talk by C. N. R. Rao, ‘Emerging India as a great centre of science.’ He said that the only things he had when he started off on his scientific career were enthusiasm and the inspiration he got by interacting with eminent scientists like Sir C. V. Raman; Indians did not get much money for doing science in those days. He wondered why the quality of science has not improved drastically in recent times though the availability of monetary support has greatly improved. He also discussed a number of problems that scientific progress in India faces, such as the relatively low importance that science is given in society, ‘referee fatigue’ when papers are published in international journals and the low output of research papers and PhD holders from India.


Public Lectures:

a) Nandan Nilekani: 12th Nov. 2009

Unique ID Number

As Chairman of the UIDAI, Nilekani, in a lecture titled ‘Unique Identification Project: Issues and Challenges’, explained the roadmap of the Unique Identification Authority of India. Rising aspirations, the government's renewed emphasis on ‘inclusive policies of economic development’ and the need to access multiple services in the private and public sectors have made a unique identification number urgent. These multiple services require authentication to meet the demands of citizens and customers, and the process of issuing separate cards for each is time- and cost-intensive. The project aims to reduce the duplication of effort involved in creating multiple databases and to provide a formal identity to citizens who benefit from the government's welfare schemes. UID will provide a unique 16-digit number to all citizens who voluntarily apply for it. However, this number will only be used to verify the demographic information of the person concerned; it will not confer any rights, citizenship or other additional benefits. The system will enable online verification, and the UIDAI will not share the data with anyone else, envisioning a balance between privacy and purpose.

Through the UIDAI, India will thus be the first country to implement a biometrics-based unique ID system on such a large scale. In essence, the project aims to enable the government to deliver direct-benefit programmes more efficiently and to allow departments to coordinate investments and share information. Mr Nilekani elaborated on the technological challenges, viz. the lack of uniformity in biometric modalities, the unavailability of usable biometric patterns in a significant part of the population, institutional cost, the use of distributed computing (along with cloud computing and virtualisation) for database sharing and authentication, optimisation for networks, rural Internet connectivity, security, privacy, scaling, sustainability and the adoption of standards.


b) Mark Tully: 13th Nov. 2009

The need for balance in an unbalanced world

His argument centered on promoting more dialogue between the sciences and the non-sciences, especially religion, theology, mysticism and traditional knowledge systems. He advocated a more liberal education in order to build bridges between the ‘two cultures’ of the sciences and the social sciences. He pointed a finger at the inability of the sciences to appreciate the viewpoint of people at large engaged in alternative modes of inquiry such as poetry, music and painting, and at the rigidity of scientists whose authenticated ‘wisdom’ relies largely on rational analysis while ignoring popular beliefs and myths.


He asserted that we should give more space, tolerance and a platform for greater dialogue, a spirit reflected in Amartya Sen's The Argumentative Indian. In the Q&A session one lady asked whether he would preach the same tolerance of argument to religious fanatics just as he was conveying it to the scientists. Some tangential discussion followed about the writings of Richard Dawkins, the famous biologist who, as Tully put it, now does less biology and more criticism of the Catholic Church. Tully's point was that, in this era, religion is the system trying hardest to engage in dialogue, rather than a rigid science.



Cold gas at high redshifts: 12th Nov. 2009

R. Srianand from IUCAA, Pune talked about understanding the physical conditions in protogalaxies by analysing cold gas at high redshifts. He highlighted a particularly useful probe of the thermal state of the interstellar medium of such galaxies: 21-cm absorption in the spectra of background quasars. Because of cosmic expansion, this absorption is observed redshifted, in a manner analogous to a Doppler shift, so the observed frequency encodes the redshift of the absorbing gas. Stars form out of cold gas whose physical conditions are governed by the local radiation field, the cosmic-ray energy density and photoelectric heating by dust, and 21-cm absorbers allow the spin temperature of this gas to be measured. A challenge in such measurements with the GMRT, GBT and WSRT (all major radio telescopes) is that hydrogen atoms in highly excited (warm) states produce little absorption, so suitable hydrogen lines are often not available for astronomical study. The survey conducted by Srianand provides a representative sample of systems that can be used, in combination with various follow-up observations, to study (a) physical conditions, (b) the effect of metallicity, (c) the morphology of the absorbing gas and (d) the time evolution of various fundamental constants.
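For orientation, the standard relations behind such 21-cm absorption studies (textbook formulae, not figures from the talk) are:

```latex
% Redshift moves the 1420.4 MHz hyperfine line of atomic hydrogen to lower observed
% frequencies, and the integrated optical depth ties the HI column density to the
% spin temperature T_s (assuming the absorber fully covers the background quasar).
z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{rest}}}{\lambda_{\mathrm{rest}}}, \qquad
\nu_{\mathrm{obs}} = \frac{1420.4~\mathrm{MHz}}{1+z}, \qquad
N(\mathrm{HI}) \simeq 1.823\times10^{18}\, T_s \int \tau(v)\,\mathrm{d}v \;\; \mathrm{cm^{-2}}
```

(with T_s in kelvin and velocity in km/s). The weaker the absorption for a given HI column, the higher the inferred spin temperature, which is how these observations constrain the thermal state of the gas.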



Raman Spectroscopy: 13th Nov. 2009

A hallmark of the Platinum Jubilee meeting was the symposium on Raman spectroscopy, held as a tribute to the founder of the Academy, C. V. Raman. Prof. Ajay Sood of IISc presided over the proceedings. The inaugural lecture focused on the versatility of the principle of Raman spectroscopy and its diverse applications in research and in everyday life. Prof. Hiro-o Hamaguchi (University of Tokyo) spoke about recent developments in Raman spectroscopy that enable in vivo imaging of living cells with high temporal, spatial and molecular specificity. He described a Raman spectroscopic signature of life: the metabolic activity of mitochondria is directly reflected in the spectra, so that, to this extent, the spatial boundary of a living organism can be accounted for by physics and chemistry. Changes during spontaneous cell death have been traced with excellent molecular specificity by time-resolved Raman imaging. The advantages of the technique, as explained by Prof. Hamaguchi, include access to molecular structure and dynamics, no pre-treatment of samples, laser excitation that permits space-resolved measurement, no interference from water, and compatibility with fibre-optic endoscopy and sensing. The disadvantages include irradiation damage, low sensitivity and fluorescence interference. The major question being addressed by this research is whether we can measure and quantify ‘molecular life’ so as to apply this knowledge to treating diseases such as cancer.


Volker Deckert (Friedrich-Schiller University, Germany), speaking on ‘Raman spectroscopy beyond the diffraction limit’, traced the evolution of Raman applications alongside the development of the laser and the efforts to boost a very weak signal. The advent of multichannel detectors and the use of field-enhancing substrates helped to increase sensitivity and spatial information. His talk centred on recent Raman experiments performed with near-field geometries/optics: instead of conventional lenses, light is squeezed through a tiny aperture whose dimensions are below the Abbe limit. (According to Abbe, a detail with a particular spacing in the specimen is resolved when the numerical aperture of the objective lens is large enough to capture the first-order diffraction pattern produced by the detail at the wavelength employed.) The major breakthrough in this research came from combining near-field optical concepts with the field-enhancing properties of plasmonic nanoparticles.
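In quantitative terms, the Abbe criterion corresponds to a minimum resolvable spacing of roughly (illustrative numbers, not values from the talk):

```latex
% Abbe diffraction limit for wavelength \lambda and numerical aperture NA = n sin(theta).
d \approx \frac{\lambda}{2\,\mathrm{NA}}, \qquad
\text{e.g. } \lambda = 532~\mathrm{nm},\ \mathrm{NA} = 1.4 \;\Rightarrow\; d \approx 190~\mathrm{nm}
```

which is why sub-wavelength apertures and plasmonic field enhancement are needed to push Raman imaging to the nanometre scale.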


A. W. Parker (‘Seeing below surfaces: developments in Raman spectroscopy for chemical and medical analysis’) discussed the place of Raman spectroscopy among the different ways of determining molecular structure, such as X-ray methods, NMR, vibrational spectroscopy and UV spectroscopy. The field has been driven rapidly forward by the growth of tunable pulsed (nanosecond and picosecond) lasers capable of operating across the ultraviolet, visible and near-infrared spectral regions. The Raman effect can be successfully adopted to characterise bone disease in both the mineral and the organic components of the bone matrix. This research can be used to develop an analytical, non-invasive method for assessing the composition of the bone matrix, supporting diagnosis and the monitoring of treatments.


Prof. Siva Umapathy of the laser spectroscopy group at IISc talked about the changing paradigm of Raman spectroscopy as it has passed from physics through chemistry to biology. These trends include nonlinear spectroscopy, Raman loss spectroscopy and biophotonics, along with a new wave of applications of fluorescence to biology and to materials, and time-dependent, especially femtosecond, dynamics. He described a new form of spectroscopy that uses the principle of stimulated Raman scattering: the signals are observed as negative (loss) peaks on the high-energy side of the excitation wavelength. Ultrafast Raman Loss Spectroscopy (URLS) has the advantage of very intense signals compared with normal stimulated Raman scattering, and the ability to record signals even from highly fluorescent samples at femtosecond time resolution. URLS is tipped to be a competitor to Coherent Anti-Stokes Raman Spectroscopy and should find applications in biology and medicine in the years to come. Rapid data acquisition, natural fluorescence rejection and experimental ease make it a unique and valuable tool for molecular structure determination.
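To make the ‘high-energy side’ remark concrete, the textbook frequency relations (not specific to the URLS implementation described) are:

```latex
% Stokes and anti-Stokes Raman bands for excitation frequency \nu_L and vibrational
% frequency \nu_vib; in a loss-type (inverse Raman) measurement the signal appears
% as a dip on the higher-energy, anti-Stokes-like side of the excitation.
\nu_{\mathrm{Stokes}} = \nu_L - \nu_{\mathrm{vib}}, \qquad
\nu_{\mathrm{anti\text{-}Stokes}} = \nu_L + \nu_{\mathrm{vib}}
```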


Multifunctional poly(vinylidene fluoride) using supramolecular interactions (Arun K. Nandi: 14th Nov. 2009)

Achieving enhanced physical and mechanical properties in commercial polymeric materials by blending with other polymers, copolymerising with other monomers, grafting the main chain with a suitable polymeric/oligomeric moiety and making composites with nanofillers (e.g. clay, carbon nanotubes, metal nanoparticles) has been the main thrust of polymer materials research in the past few years. Dr Nandi reported the use of supramolecular interactions to achieve interesting properties by grafting N,N-dimethylaminoethyl methacrylate (DMAEMA) and n-butyl methacrylate directly from the poly(vinylidene fluoride) (PVDF) backbone in the solution phase by atom transfer radical polymerisation (ATRP). Gel permeation chromatography (GPC), nuclear magnetic resonance (NMR) and polymerisation kinetics studies confirm the ATRP nature of the polymerisation, and four graft polymers were prepared. Some of the graft copolymers show super-gluing properties and can carry a weight of 16-18 kg for a sample of 0.015 cubic centimetre volume. Because of its water solubility, the polymer promises great use in biotechnology, nanotechnology, energy research and separation processes.


Efficient graph algorithms: T. Kavitha (IISc): 14th Nov. 2009

Many real-world problems can be posed as problems on graphs; Kavitha used road networks as an example to motivate efficient graph algorithms. Such algorithms are concerned, for instance, with computing optimal paths between all pairs of nodes, and these graphs may also raise edge-connectivity questions. The focus of this research is to keep the time taken by the algorithm small, and the search for efficient algorithms also requires understanding the steps involved in problems such as optimal matching.
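As a minimal illustration of the kind of computation involved, here is a textbook all-pairs shortest-path sketch in Python on a tiny made-up road network (purely illustrative; this is not Kavitha's algorithm, which concerns faster and more specialised techniques):

```python
# Floyd-Warshall all-pairs shortest paths on a tiny "road network".
# O(n^3) time: fine for a toy example, far too slow for real road networks,
# which is exactly why faster specialised algorithms matter.
INF = float("inf")

def all_pairs_shortest_paths(n, edges):
    """n junctions (0..n-1); edges is a list of (u, v, length) undirected roads."""
    dist = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        dist[u][v] = min(dist[u][v], w)
        dist[v][u] = min(dist[v][u], w)
    for k in range(n):              # allow junction k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Four junctions joined by roads with made-up lengths (in km).
roads = [(0, 1, 4), (1, 2, 3), (2, 3, 5), (0, 3, 10)]
print(all_pairs_shortest_paths(4, roads))
```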


Interfacial electrochemistry using functionalised surfaces: S. Sampath (IISc): 14th Nov. 2009

Molecules are intentionally attached to various surfaces to study their properties and subsequently to use them for a variety of applications. Prof. Sampath talked about interfacial electrochemistry, i.e. interactions at solid-liquid, solid-solid and liquid-liquid interfaces. He talked about organic thin films and their possible applications in molecular electronics, chemical sensors, fuel cells, corrosion protection and the study of novel surfaces. Organic thin films, being single molecular layers on a substrate with a common orientation, a high degree of order and packing, and ready amenability to modification, are strong candidates for such advanced applications. The challenges here lie in complicated patterning, robustness and catalytic use. The materials used in this regard are donor-spacer-acceptor monolayers built by stepwise assembly; in the reactions, the molecules are sandwiched between two metal surfaces while orientation-dependent electrocatalysis is applied.



NMR as a probe for strongly correlated electron behavior in mesoscopic devices: Vikram Tripathi, TIFR: 14th Nov. 2009

Semiconductor mesoscopic structures are associated with strongly correlated electron phenomena such as the fractional quantum Hall effect, the Kondo effect and Coulomb blockade. Nuclear magnetic resonance techniques have proved very useful in the study of strongly correlated electron phenomena in bulk systems. Dr. Tripathi showed that, with suitable adaptations, NMR can prove similarly useful for probing electrons in mesoscopic structures. He illustrated the advantages of NMR with respect to transport measurements by considering two examples of strongly correlated behavior in semiconductor mesoscopic structures: a) the 0.7 conductance anomaly in ballistic quantum wires, and b) the Kondo lattice scenario in disordered two-dimensional electron gases in heterostructures.




Glycosidase is an enzyme that brings about cleavage of glycosidic bonds and helps in the metabolism of starch, glycolipids and glycoproteins in most living cells. Disorders such as diabetes can be controlled using glycosidase inhibitors as drugs. D. D. Dhavale talked about his work on iminosugars as glycosidase inhibitors and immunomodulatory agents, especially focusing on 1-deoxy-1-hydroxymethyl castanospermine and 1-deoxy-1-epi-hydroxymethyl castanospermine. These non-toxic compounds were synthesized from castanospermine, which occurs naturally in black beans, using traditional methods of organic chemistry. Apart from being glycosidase inhibitors, they are also immunomodulatory: they show cell-proliferating capacity, the ability to reduce levels of interleukin-4 (IL-4) and increase levels of interleukin-6 (IL-6), and might trigger an improved antibody response. Dhavale said that similar studies have been carried out in his lab on five-, six- and seven-membered and bicyclic pyrrolizine, indolizine, quinolizine and ozaazulene iminosugars.






Pramod Aggarwal spoke about the impact of climate change on Indian agriculture and also listed some steps that may be taken to protect our farmers’ interests. In the short term, increase in carbon dioxide concentration might improve yields of crops such as cereals and pulses. But even a one degree centigrade rise in temperature will decrease the yield of crops like wheat, mustard and groundnut by 3% to 7%. He also projected that the productivity of rice, wheat etc. will drop by 10% to 40% by 2100. Climate change will cause droughts, floods, increased activity of pathogens and increased heat stress all of which will directly affect agriculture. He pointed out that over the last three decades rice yields in the Indo-Gangetic plains and apple yields in Himachal Pradesh have shown a decline due to changing weather. Some of the adaptive strategies he suggested are assisting farmers to cope with climate change by setting up agro-advisories, insurance policies and seed banks, intensification of food production systems, improved land and water management, introduction of sustainable policies, strengthening research to develop resistant varieties, resource conservation, and sharing and evaluation of traditional wisdom.






Rudiger Wehner talked about the remarkable navigational capability of the ant, Cataglyphis bicolor, that helps it return in a more or less straight line to its nest once it finds food. This 10 mg desert ant with a 0.1 mg brain solves complex computational tasks required to achieve this feat relying mainly on visual cues. Wehner, in his talk ‘A neuroethologist’s look into the cockpit of an insect navigator’, said that the ant uses polarized light of the sun as a compass, and makes a ‘cognitive map’ of the landmarks around it. He also explained how they studied the navigational capabilities of this ant through behavioural and neurophysical studies coupled with computer and robotic simulations.






R. Gadagkar, in his talk ‘Communication of social status in a primitively eusocial wasp’, drew lessons from various studies done in his lab on the wasp Ropalidia marginata. He explained how a queen rubs a non-volatile pheromone, secreted in her Dufour's gland, on the ‘floor’ of the nest to maintain her reproductive monopoly. He also talked about the intriguing behavior of the wasps, which accept the successor to the queen's ‘throne’ without challenging it at all. We still do not know where the successor's ‘prestige’ comes from, nor can we predict which wasp will be the successor to the queen. But we do know now that the wasps know who the heir designate is!






Every other cardiac patient in the world will be an Indian in two years’ time. Efficient diagnosis of heart disease, hence, is very important. Explosive detection too has become essential in today’s world. V. Ramgopal Rao, in his talk ‘Polymer-based sensor systems for healthcare and homeland security applications’, talked about the sensory applications of micro-cantilever platforms made of polymer materials such as SU-8. Heart diseases can be diagnosed based on vibrations of the cantilevers when antibodies coated on them react with antigens. Explosive detection is more complex as explosives have very low vapour pressure. Cantilevers coated with materials that have an affinity to explosives can be used. However, sensitivity of the cantilevers is a problem that needs to be resolved.
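One common readout mode for such resonant cantilever sensors is a shift in resonance frequency as bound antigen adds mass; the standard small-shift relation below is included for orientation (textbook physics, with the mode of operation assumed rather than stated in the talk):

```latex
% Resonance frequency of a cantilever with effective mass m_eff and stiffness k,
% and the fractional frequency shift produced by a small added mass \Delta m.
f_0 = \frac{1}{2\pi}\sqrt{\frac{k}{m_{\mathrm{eff}}}}, \qquad
\frac{\Delta f}{f_0} \approx -\frac{1}{2}\,\frac{\Delta m}{m_{\mathrm{eff}}} \quad (\Delta m \ll m_{\mathrm{eff}})
```

The smaller the cantilever's effective mass, the larger the shift per unit of captured analyte, which is why sensitivity hinges on device geometry and readout noise.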






A. Bhardwaj spoke about the basic design of the Sub-keV Atom Reflecting Analyzer (SARA), an experiment onboard Chandrayaan I, and its recent findings. SARA consists of three parts – the Chandrayaan I Energetic Neutrals Analyzer (CENA), the Solar Wind Monitor (SWIM) and a Digital Processing Unit (DPU). CENA was designed to face the lunar surface, and SWIM was placed at right angles to it, facing the sun. CENA was used to observe energetic neutral atoms (ENAs) in the energy range of about 10 eV to 3 keV, and SWIM was used to measure ions in the ~10 eV to 15 keV range. The main objective of SARA was to study the interaction of solar wind ions with the lunar surface by measuring ENAs and ions.


The major findings of the experiment are: (a) solar wind ions are scattered from the lunar surface after impact; (b) ~20% of the impinging solar wind protons are reflected back as ENAs; (c) the moon is a strong source of ENAs; (d) the detection of lunar night-side ions; and (e) the discovery of a mini-magnetosphere on the moon near the Gerasimovich crater on 17 June 2009. Bhardwaj concluded by saying that lunar surface interactions are much more complex than previously understood, and the microphysics of plasma-surface interaction is poorly understood.






In his talk ‘Prey-predator response: current research and paradigm shift’, Joydev Chattopadhyay examined a new solution to the classical ecological problem known as ‘the paradox of the plankton’. The problem concerns the driving force behind the ever-changing species abundance in plankton communities and the resulting non-equilibrium. Based on field experiments and mathematical modeling, he sought to provide a plausible answer to this 48-year-old problem. In a system of Artemia (a predator), Chaetoceros (non-toxic prey) and Microcystis (toxic prey), for example, Artemia consumes Chaetoceros in the first few days. As the population of the non-toxic prey diminishes, Artemia begins to consume Microcystis. This causes a reduction in the Artemia population, allowing the Chaetoceros population to increase again. Such a cycle of events allows the coexistence of these species in nature. Chattopadhyay suggested that a paradigm shift may be required to solve such problems, involving the derivation of new functional responses (FRs) based on various phenological events, rather than depending on already existing FRs.
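The cycle described above is easy to sketch with a toy model. The following is a deliberately generic one-predator/two-prey system with a simple prey-switching term, written in Python; the equations and parameters are illustrative assumptions and are not Chattopadhyay's actual functional responses:

```python
# Toy one-predator / two-prey model (illustrative only; not the speaker's model).
# The predator prefers the non-toxic prey; as that prey becomes scarce the predator
# switches to the toxic prey, which harms it, letting the non-toxic prey recover.

def step(n1, n2, p, dt=0.01):
    # n1: non-toxic prey, n2: toxic prey, p: predator (arbitrary units)
    pref = n1 / (n1 + n2 + 1e-9)          # fraction of effort spent on non-toxic prey
    eat1 = 1.2 * pref * n1 * p            # predation on non-toxic prey
    eat2 = 0.6 * (1 - pref) * n2 * p      # predation on toxic prey
    dn1 = 1.0 * n1 * (1 - n1) - eat1      # logistic growth minus predation
    dn2 = 0.8 * n2 * (1 - n2) - eat2
    dp = 0.5 * eat1 - 0.4 * eat2 - 0.2 * p    # toxic prey reduces predator growth
    return n1 + dt * dn1, n2 + dt * dn2, p + dt * dp

n1, n2, p = 0.6, 0.6, 0.2
for t in range(20001):
    if t % 4000 == 0:
        print(f"t={t * 0.01:6.1f}  non-toxic={n1:.2f}  toxic={n2:.2f}  predator={p:.2f}")
    n1, n2, p = step(n1, n2, p)
```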






One of the important tasks of polymer materials research has been to develop commercially viable materials with better physical and mechanical properties, through various methods such as copolymerizing with other monomers, grafting the main chain with another polymeric or oligomeric moiety, etc. Arun K. Nandi, in his talk ‘Multifunctional poly(vinylidene fluoride) using supramolecular interactions’, described the development and properties of a material obtained by grafting N,N-dimethylaminoethyl methacrylate (DMAEMA) and n-butyl methacrylate directly from the poly(vinylidene fluoride) (PVDF) backbone in solution, by atom transfer radical polymerization (ATRP). Four graft polymers, varying in graft density according to the number of hours of polymerization, were prepared and named PD-6, PD-12, PD-18 and PD-24 respectively. Some of the properties of these polymers include induced solubility in water, a decrease in melting point by 5 to 6 degrees centigrade, 45 times higher tensile strength, a 1970% increase in toughness, a 5-fold increase in dielectric constant and better gluing properties when compared to PVDF. These properties suggest their applicability in biotechnology, nanotechnology, energy research, etc.







In the context of rising global temperatures due to climate change, the development of heat-tolerant rice varieties becomes important, especially for rice-dependent countries like India. Anil Grover, in his speech ‘Molecular components involved in mounting response to high temperature stress in rice’, talked about a class of proteins called heat-shock proteins (HSPs), with particular reference to HSP100. HSPs are produced in all organisms in response to heat (or any other stress). Owing to their chaperone action, they remove denatured proteins and also help to restore the native conformation of proteins. HSP100 is particularly important as it is produced very quickly, and in large quantities, in response to stress. The gene for HSP100 has a 2-kb promoter region, which is an essential component.


Regulation of HSP100 in rice involves the action of cytoplasmic, mitochondrial and chloroplast associated ClpB proteins too. The function of the latter two ClpB proteins is not fully understood as yet. ClpB proteins act in association with small HSPs (sHSP). Of the 40 HSP20 genes identified in rice, 23 were found to constitute sHSPs. Further work is aimed at building a comprehensive model of rice HSP gene expression, which would aid the development of heat tolerant varieties.





Navroz Dubash, from the Centre for Policy Research, in his talk ‘What should be India's strategy in climate negotiations?’, spoke about international and national debates on climate change. The international debate ranges from ‘who should decrease emissions, when and by how much’ to ‘what should be India's role in climate change mitigation’. Indians agree that being labelled a ‘major emitter’ is unfair, as we contribute only 4% of global emissions (though that makes us the fourth largest emitter) and our per capita emissions are 70% below the world average. Our efforts to mitigate climate change are not being appreciated, and we already have the burden of still being a developing nation. Dubash said there are three differing views in India: (a) the ‘growth-first stonewallers’ believe that it is our turn to become a developed nation and that international commitments must be stonewalled; (b) the ‘progressive realists’ believe that it is unfair to have international commitments imposed on us and that the world is using India as an excuse, though they suggest that co-benefits need to be explored; and (c) the ‘progressive internationalists’ agree that India is being used as an excuse, but believe that we need to develop an effective climate regime as it is impossible to delink domestic and global issues. Dubash suggested that ‘the two progressive groups have to join forces for a renewed Indian climate politics’.






Malaria is a major disease in tropical countries like India. Estimates show that there are around 500 million cases of malaria worldwide, resulting in 1 million deaths annually. This warrants the development of a vaccine against Plasmodium. Chetan E. Chitnis presented a new approach to vaccine development in his talk, ‘Rational design of a malaria vaccine.’ Plasmodium has a number of polymorphic strains, which has made vaccine development difficult. Chitnis concentrated on the possibility of developing a vaccine against the highly conserved binding site of the Duffy binding protein (DBP) of the pathogen, which binds to the Duffy antigen receptor for chemokines (DARC) on RBCs. This would inhibit RBC invasion by Plasmodium merozoites, and might be an effective way of protecting people in regions where malaria is endemic.

--

====================================================================

Sunday, November 8, 2009

The World of Imagination Set Free before the Future Shock of Inevitable Breakthrough Prediction!

Sir Karl Popper famously said, “By making their interpretations and prophecies sufficiently vague... and in order to escape falsification, astrologers destroyed the testability of the theory.”(1) In reply, the legendary Thomas Kuhn says, “The history of astrology during the centuries when it was intellectually reputable records many predictions that categorically failed. Not even astrology's most convinced and vehement exponents doubted the recurrence of such failures. Astrology cannot be barred from the sciences because of the form in which its predictions are cast.”(2)

As soon as we try to understand the role and significance of prediction, we are talking about a definite objective, a goal to be pursued in a certain direction in the times ahead. And mind you, when the goal shines as brightly as a coveted prize like the Nobel, we should really listen to Salvador Luria, Nobel Laureate in Physiology or Medicine in 1969. He says, “The goal in science should be to find out things - not to win a prize… Yet, in conversations with some of my younger colleagues, I get a sense that it has become a goal, and that is not good. I think it would be better if there were no prizes.”

As P. Balaram pointed out in his recent editorial: “Most professional scientometric analysts have little feeling for science itself, hoping that insights might emerge from impersonal quantitative methods. In recent times practicing scientists, often physicists, have entered the field of scientometrics, bringing with them increasingly complex ways of analyzing the exploding volume of scientific literature.”(3) These lines, and the talk on ‘Reinventing the Research University in India’ that he delivered at Jawaharlal Nehru University on 31st August 2009, ignited my curiosity about the ability of the scientific mind to speculate about the future, both about discoveries/inventions and about prizes.(4)

Thomson Reuters has recently begun the practice of declaring Citation Laureates, a step in the direction of predicting Nobel Laureates. Remarkably, Thomson Reuters had predicted Elizabeth H. Blackburn, Carol W. Greider and Jack W. Szostak as probable winners of the 2009 Nobel Prize for their discovery “of how chromosomes are protected by telomeres and the enzyme telomerase”. To support their claim, Thomson Reuters said that “citations are lagging indicators as to research but leading indicators as to the peer system, prizes.” They owe their methodology to the cumulative work of Eugene Garfield, who argues that “Nobel Laureates publish five times the average number of papers, while their work is cited 30 to 50 times the average, contributing to a high h-index among the peer community.”(5)
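Since the argument leans on citation counts and the h-index, here is a minimal sketch of how an h-index is computed; the definition is the standard one, and the citation counts below are made up:

```python
# h-index: the largest h such that the author has h papers with at least h citations each.

def h_index(citations):
    citations = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(citations, start=1):
        if c >= rank:
            h = rank        # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Made-up citation counts for two hypothetical authors.
print(h_index([50, 40, 30, 8, 5, 4, 2]))                       # -> 5
print(h_index([900, 450, 300, 120, 80, 60, 45, 33, 20, 12]))   # -> 10
```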


How do scientific breakthroughs happen?
Before ascertaining the actual correctness of a forecast or prediction, we should know the distinct features by which we assess a ‘breakthrough’ within the array of discoveries and inventions, so as to qualify it as pioneering work awaiting global recognition. The possibilities concerning breakthroughs are explained by Martino(6) as:
a) A breakthrough should never cause surprise because it is inevitable.
b) Some breakthroughs are surprising, but they are of no value to the forecaster.
c) While breakthroughs are not inevitable, they should be surprising only to those who are not watching for them.


So, if we read the above three possibilities in the context of the work done by Moed, visualising new scenarios in prediction-making becomes less difficult. Moed says, “The use of citation analysis in research evaluation is more appropriate (when) the more it is formal, open, scholarly founded, supplemented with expert knowledge, carried out in a clear policy context with clear objectives, stimulating users to explicitly state basic notions of scholarly quality and enlightening rather than formulaic.”(7)


I think the predictability of discovery/invention is an issue of larger significance than the award itself. The degree of evitability or inevitability of a discovery/invention matters in this world of ‘Big Science, competition and national innovation systems’. Martino (ibid., p. 211) makes a strong case for the inevitability of inventions and at the same time explains the hindrances to innovation caused by the compartmentalisation of science and by restricted access to the scholarly literature. He says, “...compartmentalisation of knowledge causes unnecessary duplication of work, i.e. it forces multiple invention through stifling the normal diffusion of innovation... improving information storage and retrieval systems for scientific and technical information [matters because] much work is repeated simply because it is buried in the literature and is not available to the people who need it.”


Martino then explains the roadblocks to assessing signals that provide information about forthcoming events. Two hurdles are described: a) people in general do not recognise the possibility of signals, and b) signals are buried in noise. The analysis further emphasises the role of ‘pattern formation’: a pattern is formed when our attention to each signal (a piece of research) becomes meaningful in the context of the preceding ones. Two kinds of errors may occur at this level: a) failure to recognise that two or more pieces of research are interrelated, and b) failure to connect two research areas to form the overall pattern of a wider research front.


In Kuhn's words, “Normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like. Much of the success of the enterprise derives from the community's willingness to defend that assumption, if necessary at considerable cost. Normal science, for example, often suppresses fundamental novelties because they are necessarily subversive of its basic commitments.”(8)
According to May, “Although the history of science is full of revolutions, scientists do not expect them or look for them (by definition, because a revolution involves the overthrow of the paradigm that represents the scientist's concept of ‘truth’).”

He continues: “Standard methods for forecasting future developments in technology cannot predict the impact of major breakthroughs or ‘revolutions’ in basic science that might lead to radically new technology and a fundamental change in the way wars are fought. While such revolutions cannot, by definition, be predicted in any detail, it is possible to identify many of the broad factors that are involved in turning scientific breakthroughs into feasible technology.”(9)


Eventually, our knowledge about the future, even if it depends on quantitative techniques of forecasting, is bounded by Einstein's famous lines: “Not everything that counts can be counted, and not everything that can be counted counts.”

References:
1) Karl Popper, Conjectures and Refutations, 1963

2) Kuhn, Thomas, ‘Logic of Discovery or Psychology of Research?’ (1997), in Criticism and the Growth of Knowledge, eds. Imre Lakatos and Alan Musgrave

3) Balaram, P., Current Science, Vol. 96, No. 10, 25 May 2009

4) David Pendlebury, ‘Discover the Power of Quantitative Analysis: The Art & Science of Identifying Future Nobel Laureates’, presentation made at DST, New Delhi, 5th Nov. 2009

5) Eugene Garfield, ‘Identifying Nobel Class Scientists and the Uncertainties Thereof’, Thomson ISI, European Conference on Scientific Publication in Medicine and Biomedicine, Sweden, 2006

6) Joseph P. Martino, Technological Forecasting for Decision Making, American Elsevier Publishing Company, New York, 1972

7) Henk F. Moed, Citation Analysis in Research Evaluation, Springer, July 2005

8) Kuhn, T. S., The Structure of Scientific Revolutions, third edition, University of Chicago Press, 1996

9) May, Andrew (2001), ‘Science Forecasting: Predicting the Unpredictable’, Journal of Defense Science. (The Einstein quotation is from a sign hanging in his office at the Institute for Advanced Study, Princeton.)


-------------------------------------------------------------------------------------

Monday, November 2, 2009

New Hopes towards Greater Research Collaboration through progress in the Worldwide LHC Computing Grid

“Is the Internet looking towards the future through Grids with silver clouds of transformation?”


In the much-awaited moments of November 2009, the Large Hadron Collider (LHC) at CERN (the European Organization for Nuclear Research) succeeded in channeling a proton beam in its accelerator after a one-year gap in its operations. In parallel came excited discussions about the enormous progress being made on the potential of the Worldwide LHC Computing Grid (WLCG) to transform the Internet into a more accessible and faster medium, enabling voluminous data transfer and data sharing for collaborative exchanges of information and leading towards greater capabilities for scientific cooperation across borders.


Grid computing revolutionizes the way scientists share and analyse data by enabling researchers to share computing power and data storage over the Internet. Grid projects already help researchers search for new wheat genes, predict storms, or simulate the Sun's interior. The 7000-odd physicists working on experiments at the Large Hadron Collider will rely entirely on grid computing, specifically on the Worldwide LHC Computing Grid, to connect them with LHC data. The computing centres providing resources for the WLCG are embedded in different operational Grid organisations, in particular EGEE (Enabling Grids for E-sciencE) and OSG (the Open Science Grid), but also several national and regional Grid structures such as GridPP in the UK, INFN Grid in Italy and NorduGrid in the Nordic region.


The Enabling Grids for E-sciencE (EGEE) project is funded by the European Commission and aims to integrate current national, regional and thematic Grid efforts in order to create a seamless Grid infrastructure, available to scientists 24 hours a day, for the support of scientific research. LCG and EGEE are tightly coupled and provide complementary functions. OSG (the Open Science Grid) is a U.S. distributed computing infrastructure for large-scale scientific research, built and operated by a consortium of universities, national laboratories, scientific collaborations and software developers. The OSG integrates computing and storage resources from more than 50 sites in the United States, Asia and South America. The OSG is supported by the U.S. National Science Foundation and the Department of Energy's Office of Science.


The Globus Alliance involves several universities and research laboratories conducting research and development to create fundamental Grid technologies and produce open-source software. The WLCG project is actively involved in the support of Globus and uses the Globus-based Virtual Data Toolkit (VDT) as part of the project middleware.


During the development of the LHC Computing Grid, many additional benefits of a distributed system became apparent (courtesy: CERN). Multiple copies of data can be kept at different sites, ensuring access for all scientists involved, independent of geographical location. It allows optimum use of spare capacity across multiple computer centres, making the system more efficient. Having computer centres in multiple time zones eases round-the-clock monitoring and the availability of expert support. There are no single points of failure. The cost of maintenance and upgrades is distributed, since individual institutes fund local computing resources and retain responsibility for these, while still contributing to the global goal. Independently managed resources have encouraged novel approaches to computing and analysis. So-called “brain drain”, where researchers are forced to leave their country to access resources, is reduced when resources are available from their desktop. The system can be easily reconfigured to face new challenges, making it able to evolve dynamically throughout the life of the LHC, growing in capacity to meet the rising demands as more data is collected each year. It provides considerable flexibility in deciding how and where to provide future computing resources. It also allows the community to take advantage of new technologies that may appear and that offer improved usability, cost effectiveness and energy efficiency.


The widely reported news that “the Grid will revolutionise the Internet” has been clarified by CERN itself. They say: “Grid computing, like the World Wide Web, is an application of the Internet. When the LHC turns on, data will be transferred from CERN to 11 large computing centres around the world at rates of up to 10 gigabits per second. Those large centres will then send and receive data from 200 smaller centres worldwide. All this data transfer will take place over the Internet. Dedicated fibre-optic links are used between CERN and the large centres; the smaller centres connect together through research networks and sometimes the standard public Internet.” (1)


Commenting on the speculation about an unprecedented increase in the Internet's capacity for sharing and downloading, CERN explains: “First, in order to get such data-transfer rates, individuals would have to do what the large particle physics computing centres have done, and set up (or lease) a dedicated fibre-optic link between their home and the source of their data. Second, today's grid computing technologies and projects are geared toward research and businesses with highly specific needs, such as vast amounts of data to process and analyse within large, worldwide collaborations. While other computer users may benefit from grid computing through better weather prediction or more effective medications, they may not be logging onto a computing grid anytime soon. (Something called ‘cloud computing’, where your programs are run in a central location rather than on your own computer, may also be on the horizon.)” (ibid)


But scientists and engineers are looking to the Grid not only for sharing documents and MP3 files, but also for connecting PCs with sensors, telescopes and tidal-wave simulators. Though the task of standardising everything from system templates to the definitions of various resources is a mammoth one, the Global Grid Forum (GGF) can look to the early days of the Web for guidance. The Grid that its organisers are building is a new kind of Internet, only this time its creators have a better knowledge of where the bottlenecks will be. Computers on the grid can also transmit data at lightning speed, which will allow researchers facing heavy processing tasks to call on the assistance of thousands of other computers around the world. The aim is to eliminate the problem experienced by Internet users who ask their machine to handle too much information. The real goal of the grid is, however, to work with the LHC in tracking down nature's most elusive particle, the Higgs boson. Predicted in theory but never yet found, the Higgs is supposed to be what gives matter mass. The latest spin-off from CERN (the particle physics centre that created the web), the grid could also provide the kind of power needed to transmit holographic images, allow instant online gaming with hundreds of thousands of players, and offer high-definition video telephony for the price of a local call.


Research Collaboration
The WLCG project is also following developments in industry, in particular through CERN openlab, where leading IT companies are testing and validating cutting-edge Grid technologies using the LCG environment. The CERN openlab is a collaboration between CERN and industrial partners to study and develop data-intensive solutions to be used by the worldwide community of scientists working at the next-generation Large Hadron Collider. These experiments will generate enormous amounts of data - 15 million gigabytes a year - and will require a globally distributed Grid of over 150 computing centres to store and analyse the data, with a computing capacity of more than 100,000 of today’s cores.
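As a rough back-of-envelope check of what “15 million gigabytes a year” implies as a sustained data rate (my own arithmetic, not a CERN figure):

```python
# 15 million GB/year expressed as an average sustained data rate (rough estimate).
seconds_per_year = 365 * 24 * 3600             # ~3.15e7 s
gb_per_second = 15_000_000 / seconds_per_year  # ~0.48 GB/s
print(f"~{gb_per_second:.2f} GB/s, i.e. about {gb_per_second * 8:.1f} Gb/s on average")
```

which makes clear why the dedicated multi-gigabit links to the large centres mentioned above are needed, even before peak rates and reprocessing are considered.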




There is no question that scientific research over the past twenty years has undergone a transformation. This transformation has occurred as new technologies have led to new methods of working, which have accelerated the pace of discovery and knowledge accumulation not only in the natural sciences but also in the social sciences, arts and humanities. Research today is often critically dependent on computation and data handling. The practice has become known under various terms such as e-Science, e-Research and cyberscience. Irrespective of the name, many researchers acknowledge that the use of computational methods and data handling is central to their work.



Advances in scientific and other knowledge have generated vast amounts of data, which need to be managed for analysis, storage and preservation for future re-use. Larger-scale science enabled by the Internet and other information and communication technologies (ICTs), scientific instrumentation and the automation of research processes has resulted in the emergence of new research paradigms that are often summarised as ‘data-rich science’. A feature of this new kind of research is an unprecedented increase in complexity, in terms of the sophistication of the research methods used, the scale of the phenomena considered and the granularity of investigation. (2)





e-Research
e-Research involves the use of computer-enabled methods to achieve new, better, faster or more efficient research and innovation in any discipline. It draws on developments in computing science, computation, automation and digital communications. Such computer-enabled methods are invaluable within this context of rapid change, accumulation of knowledge and increased collaboration. They can be used by the researcher throughout the research cycle, from research design, data collection, and analysis to the dissemination of results. This is unlike other technological "equipment" which often only proves useful at certain stages of research. Researchers from all disciplines can benefit from the use of e-Research approaches, from the physical sciences to arts and humanities and the social sciences.




e-Research Technologies Supporting Collaboration
e-Research technologies support the research collaborations described above by introducing a model for resource sharing based on the notions of “resources” that are accessed through “services”. Resources can be computational resources such as high-performance computers, storage resources such as storage resource brokers or repositories, datasets held by data archives or even remote instruments such as radio telescopes. In order to make resources available to collaborating researchers, their owners provide services that provide a well-described interface specifying the operations that can be performed on or with a resource, e.g., submitting a compute job or accessing a set of data.
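To make the ‘resources behind services’ idea concrete, here is a deliberately generic Python sketch. The class and method names are hypothetical illustrations of the pattern, not an actual Grid middleware API such as Globus or the VDT:

```python
# A simplified "resource behind a service" sketch (hypothetical API, not real Grid
# middleware): the resource owner exposes a narrow, well-described interface
# (submit a job, query its status) rather than direct access to the machines.
from abc import ABC, abstractmethod

class ComputeService(ABC):
    @abstractmethod
    def submit_job(self, executable: str, inputs: list) -> str:
        """Submit a job and return a job identifier."""

    @abstractmethod
    def status(self, job_id: str) -> str:
        """Return e.g. 'queued', 'running' or 'done'."""

class LocalClusterService(ComputeService):
    def __init__(self):
        self._jobs = {}

    def submit_job(self, executable, inputs):
        job_id = f"job-{len(self._jobs) + 1}"
        self._jobs[job_id] = "queued"   # a real service would hand off to a scheduler
        return job_id

    def status(self, job_id):
        return self._jobs.get(job_id, "unknown")

service = LocalClusterService()
jid = service.submit_job("analyse_events", ["run42.dat"])
print(jid, service.status(jid))
```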




Computer-enabled methods of collaboration for research take many forms, including use of video conferencing, wikis, social networking websites and distributed computing itself. For example, researchers might use Access Grid for video conferencing to hold virtual meetings to discuss their projects. Access Grid and virtual research environments provide simultaneous viewing of participating groups as well as software to allow participants to interact with data on-screen. Wikis have also become a valuable collaborative tool. This is perhaps best demonstrated by the OpenWetWare website, which promotes the sharing of information between researchers working in biology, biomedical research and bioengineering using the concept of a virtual Lab Notebook. This allows researchers to publish research protocols and document experiments. It also provides information about laboratories and research groups around the world as well as courses and events of interest to the community.



Social networking sites have been used or created for research purposes. The myExperiment social website is becoming an indispensable collaboration tool for sharing scientific workflows and building communities. Such sharing cuts down on the repetition of research work, saving time and effort and leading to advances and innovation more rapidly than if researchers were working on their own, without access to similar work for comparison with their own. Other social networking sites such as Facebook have been adopted by researchers, and extensions have been built to allow them to be used as portals for accessing research information. For example, content in the ICEAGE Digital Library can be accessed within Facebook. (ibid)




The Role of Databases for LHC Data Processing and Analysis:
Database services are required by the experiments' online systems and for most, if not all, aspects of offline processing, simulation and analysis. Specific examples include the PVSS Supervisory Control and Data Acquisition (SCADA) system; detector conditions (e.g. COOL), alignment and geometry applications; Grid data management (LCG File Catalog, File Transfer Service) and storage management (e.g. CASTOR + SRM) services; and Grid infrastructure and operations tools (GridView, SAM, Dashboards, VOMS). (3)



In late spring 2007, four high-energy physics (HEP) laboratories – the European Organization for Nuclear Research (CERN), the Deutsches Elektronen-Synchrotron (DESY), the Fermi National Accelerator Laboratory (FNAL) and the Stanford Linear Accelerator Center (SLAC) – ran a user poll to analyse the current state of HEP information systems. The goal was to achieve a better understanding of the perceptions, behaviors and wishes of the end users of these information systems. The poll received more than 2100 answers, representing about 10% of the active HEP community worldwide. It showed that community-based services dominate this field of research, with the metadata-only search engine SPIRES-HEP being the primary information gateway for most scholars. Users also gave their preferences regarding existing functionalities, such as access to full text and to citation information, and a list of features that they would like to have in the coming years. The results showed that scholars attach paramount importance to three axes of excellence: access to full text, depth of coverage and quality of content. (4)



A real and challenging question being faced by LHC-CERN is: “Can the concept of cloud computing replace that of grids?”(5) In order to process and analyse the data from the world's largest scientific machine, a worldwide grid service – the Worldwide LHC Computing Grid (WLCG) – has been established, building on two main production infrastructures: those of the Open Science Grid (OSG) in the Americas, and the Enabling Grids for E-sciencE (EGEE) Grid in Europe and elsewhere. The machine itself – the Large Hadron Collider (LHC) – is situated some 100 m underground beneath the French-Swiss border near Geneva, Switzerland, and supports four major collaborations and their associated detectors: ATLAS, CMS, ALICE and LHCb.




Running a service where the user expectation is for 24x7 support, with extremely rapid problem-determination and resolution targets, is already a challenge. When this is extended to a large number of rather loosely coupled sites, the majority of which support multiple disciplines – often with conflicting requirements but always with local constraints – it becomes a major, or even “grand”, challenge. That this model works at the scale required by the LHC experiments – literally around the world and around the clock – is a valuable vindication of the Grid computing paradigm.




Currently, adapting an existing application to the Grid environment is a non-trivial exercise that requires an in-depth understanding not only of the Grid computing paradigm but also of the computing model of the application in question. A successful demonstration of a straightforward recipe for moving a wide range of applications – from the simplest to the most demanding – to Cloud environments would be a significant boost for this technology and could open the door to truly ubiquitous computing. (ibid)

-------------------------------------------------------------------------------------------------------------------------------

References:

1) http://public.web.cern.ch/Public/en/Spotlight/SpotlightGridFactsAndFiction-en.html

2) Voss, A., & Vander Meer, E. (2009, September 7). Research in a Connected World. Retrieved from the Connexions Web site: http://cnx.org/content/m20834/1.3/

3) Maria Girone, Distributed Database Services – a Fundamental Component of the WLCG Service for the LHC Experiments – Experience and Outlook, CERN Document Server, European Organization for Nuclear Research (e-mail: Maria.Girone@cern.ch)

4) R. Ivanov and L. Raae, INSPIRE: a new scientific information system for HEP, CERN Document Server, European Organization for Nuclear Research (e-mail: Radoslav.Ivanov@cern.ch, Lars.Christian.Raae@cern.ch)

5) J. D. Shiers, Can Clouds Replace Grids? A Real-Life Exabyte-Scale Test-Case, CERN Document Server, European Organisation for Nuclear Research (CERN) (e-mail: Jamie.Shiers@cern.ch)

------------------------------------------------------------------------------------------------------------------------------------