990 results for Coherent backscattering
Abstract:
Recent studies of the current state of rural education and training (RET) systems in sub-Saharan Africa have assessed their ability to provide for the learning needs essential for more knowledgeable and productive small-scale rural households. These are most necessary if the endemic causes of rural poverty (poor nutrition, lack of sustainable livelihoods, etc.) are to be overcome. A brief historical background and analysis of the major current constraints to improvement in the sector are discussed. Paramount among those factors leading to its present 'malaise' is the lack of a whole-systems perspective and the absence of any coherent policy framework in most countries. There is evidence of some recent innovations, both in the public sector and through the work of non-governmental organisations (NGOs), civil society organisations (CSOs) and other private bodies. These provide hope of a new sense of direction that could lead towards meaningful 'revitalisation' of the sector. A suggested framework offers 10 key steps which, it is argued, could largely be achieved with modest internal resources and very little external support, provided that the necessary leadership and managerial capacities are in place. (C) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Lactase persistence (LP) is common among people of European ancestry, but with the exception of some African, Middle Eastern and southern Asian groups, is rare or absent elsewhere in the world. Lactase gene haplotype conservation around a polymorphism strongly associated with LP in Europeans (-13,910 C/T) indicates that the derived allele is recent in origin and has been subject to strong positive selection. Furthermore, ancient DNA work has shown that the -13,910*T (derived) allele was very rare or absent in early Neolithic central Europeans. It is unlikely that LP would provide a selective advantage without a supply of fresh milk, and this has led to a gene-culture coevolutionary model where lactase persistence is only favoured in cultures practicing dairying, and dairying is more favoured in lactase persistent populations. We have developed a flexible demic computer simulation model to explore the spread of lactase persistence, dairying, other subsistence practices and unlinked genetic markers across the geographic space of Europe and western Asia. Using data on -13,910*T allele frequency and farming arrival dates across Europe, and approximate Bayesian computation to estimate parameters of interest, we infer that the -13,910*T allele first underwent selection among dairying farmers around 7,500 years ago in a region between the central Balkans and central Europe, possibly in association with the dissemination of the Neolithic Linearbandkeramik culture over Central Europe. Furthermore, our results suggest that natural selection favouring a lactase persistence allele was not higher in northern latitudes through an increased requirement for dietary vitamin D. Our results provide a coherent and spatially explicit picture of the coevolution of lactase persistence and dairying in Europe.
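The inference step described above can be illustrated with a minimal ABC rejection sampler. This is a sketch only: the toy one-locus selection model and the priors below are illustrative stand-ins for the authors' spatially explicit demic simulation, not their actual model.

```python
import random

def simulate_allele_frequency(selection_coeff, start_gen, n_generations=300,
                              init_freq=0.001):
    """Toy deterministic selection model: logistic growth of the favoured
    allele once selection begins (a stand-in for a full demic simulation)."""
    p = init_freq
    for gen in range(n_generations):
        if gen >= start_gen:
            p = p * (1 + selection_coeff) / (1 + selection_coeff * p)
    return p

def abc_rejection(observed_freq, n_draws=10000, tolerance=0.02):
    """ABC rejection: draw parameters from the priors, simulate, and keep
    draws whose simulated outcome lies within `tolerance` of the data."""
    accepted = []
    for _ in range(n_draws):
        s = random.uniform(0.0, 0.1)      # prior on the selection coefficient
        start = random.randrange(0, 150)  # prior on the onset generation
        if abs(simulate_allele_frequency(s, start) - observed_freq) < tolerance:
            accepted.append((s, start))
    return accepted  # empirical sample from the approximate posterior

posterior = abc_rejection(observed_freq=0.6)
```

The accepted parameter pairs approximate the joint posterior; in the study itself the summary statistics are allele frequencies and farming arrival dates rather than a single frequency.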
Abstract:
The Species 2000 & ITIS Catalogue of Life is planned to become a comprehensive catalogue of all known species of organisms on Earth by the year 2011. Rapid progress has been made recently and this, the sixth edition of the Annual Checklist, contains 884,552 species, approximately half of all known organisms. The present Catalogue is compiled with sectors provided by 37 taxonomic databases from around the world. Many of these contain taxonomic data and opinions from extensive networks of specialists, so that the complete work contains contributions from an estimated 2,000-3,000 specialists from throughout the taxonomic profession. The work of the Species 2000 and ITIS teams is to peer review databases, select appropriate sectors and to integrate the sectors into a single coherent catalogue with a single hierarchical classification. It is planned in future to introduce alternative taxonomic treatments and alternative hierarchies, but an important feature is that for those users who wish to use it, a single preferred catalogue, based on peer reviews, will continue to be provided.
Abstract:
About 5.5% of all UK hemophilia B patients have the base substitution IVS 5+13 A-->G as the only change in their factor (F)IX gene (F9). This generates a novel donor splice site which fits the consensus better than the normal intron 5 donor splice. Use of the novel splice site should result in a missense mutation followed by the abnormal addition of four amino acids to the patients' FIX. In order to explain the prevalence of this mutation, its genealogical history is examined. Analysis of restriction fragment length polymorphism in the 21 reference UK individuals (from different families) with the above mutation showed identical haplotypes in 19, while two differed from the rest and from each other. In order to investigate the history of the mutation and to verify that it had occurred independently more than once, the sequence variation in 1.5-kb segments scattered over a 13-Mb region including F9 was examined in 18 patients and 15 controls. This variation was then analyzed with a recently developed Bayesian approach that reconstructs the genealogy of the gene investigated while providing evidence of independent mutations that contribute disconnected branches to the genealogical tree. The method also provides minimum estimates of the age of the mutation inherited by the members of coherent trees. This revealed that 17 of the 18 mutant genes descend from a founder who probably lived 450 years ago, while one patient carries an independent mutation. The independent recurrence of the IVS5+13 A-->G mutation strongly supports the conclusion that it is the cause of these patients' mild hemophilia.
Abstract:
Polycrystalline LiH was studied in situ using diffuse reflectance infrared Fourier transform (DRIFT) spectroscopy to investigate the effect water vapour has on the rate of production of the corrosion products, particularly LiOH. The reaction rate of the formation of surface LiOH was monitored by measurement of the hydroxyl (OH) band at 3676 cm(-1). The initial hydrolysis rate of LiH exposed to water vapour at 50% relative humidity was found to be almost two times faster than LiH exposed to water vapour at 2% relative humidity. The hydrolysis rate was shown to be initially very rapid followed by a much slower, almost linear rate. The change in hydrolysis rate was attributed to the formation of a coherent layer of LiOH on the LiH surface. Exposure to lower levels of water vapour appeared to result in the formation of a more coherent corrosion product, resulting in effective passivation of the surface to further attack from water. Crown Copyright (c) 2007 Published by Elsevier B.V. All rights reserved.
Abstract:
Theoretical understanding of the implementation and use of innovations within construction contexts is discussed and developed. It is argued that both the rhetoric of the 'improvement agenda' within construction and theories of innovation fail to account for the complex contexts and disparate perspectives which characterize construction work. To address this, the concept of relative boundedness is offered. Relatively unbounded innovation is characterized by a lack of a coherent central driving force or mediator with the ability to reconcile potential conflicts and overcome resistance to implementation. This is a situation not exclusive to, but certainly indicative of, much construction project work. Drawing on empirical material from the implementation of new design and coordination technologies on a large construction project, the concept is developed, concentrating on the negotiations and translations implementation mobilized. An actor-network theory (ANT) approach is adopted, which emphasizes the roles that both human actors and non-human agents play in the performance and outcomes of these interactions. Three aspects of how relative boundedness is constituted and affected are described; through the robustness of existing practices and expectations, through the delegation of interests on to technological artefacts and through the mobilization of actors and artefacts to constrain and limit the scope of negotiations over new technology implementation.
Abstract:
A whole life-cycle information management vision is proposed, and the organizational requirements for realizing this scenario are investigated. Preliminary interviews with construction professionals are reported. Discontinuities in information transfer throughout the life-cycle of built environments result from a lack of coordination and from multiple data collection/storage practices. A more coherent history of these activities can improve the work practices of various teams by augmenting decision-making processes and creating organizational learning opportunities. There is therefore a need to unify these fragmented bits of data into a meaningful, semantically rich and standardized information repository for the built environment. The proposed vision utilizes embedded technologies and distributed building information models. Two diverse construction project types (large one-off design, small repetitive design) are investigated for the applicability of the vision. A functional prototype software/hardware system demonstrating the practical use of this vision is developed and discussed. Plans for case studies validating the proposed model at a large PFI hospital and at housing association projects are discussed.
Abstract:
Climate change is one of the major challenges facing economic systems at the start of the 21st century. Reducing greenhouse gas emissions will require both restructuring the energy supply system (production) and addressing the efficiency and sufficiency of the social uses of energy (consumption). The energy production system is a complicated supply network of interlinked sectors with 'knock-on' effects throughout the economy. End use energy consumption is governed by complex sets of interdependent cultural, social, psychological and economic variables driven by shifts in consumer preference and technological development trajectories. To date, few models have been developed for exploring alternative joint energy production-consumption systems. The aim of this work is to propose one such model. This is achieved in a methodologically coherent manner through integration of qualitative input-output models of production with Bayesian belief network models of consumption at the point of final demand. The resulting integrated framework can be applied either (relatively) quickly and qualitatively to explore alternative energy scenarios, or as a fully developed quantitative model to derive or assess specific energy policy options. The qualitative applications are explored here.
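The production side of such a joint model can be sketched with a tiny two-sector Leontief input-output calculation. The coefficients and demand figures below are hypothetical, and the Bayesian-belief-network consumption side is represented here only by the exogenous final-demand vector it would supply.

```python
def leontief_output(a11, a12, a21, a22, d1, d2):
    """Solve x = (I - A)^-1 d for a 2x2 technical-coefficient matrix A,
    giving the gross sector outputs needed to meet final demand d."""
    det = (1 - a11) * (1 - a22) - a12 * a21
    x1 = ((1 - a22) * d1 + a12 * d2) / det
    x2 = (a21 * d1 + (1 - a11) * d2) / det
    return x1, x2

# Hypothetical coefficients: sector 1 = energy supply, sector 2 = rest of
# economy; (d1, d2) is a final-demand scenario from the consumption model.
outputs = leontief_output(0.1, 0.2, 0.3, 0.1, 10.0, 10.0)
```

The 'knock-on' effects mentioned in the abstract appear as gross outputs exceeding final demand: each unit delivered to consumers drags additional intermediate production through the inter-sector links.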
Abstract:
The rationale for this review is to provide a coherent formulation of the cognitive neurochemistry of nicotine, with the aim of suggesting research and clinical applications. The first part is a comprehensive review of the empirical studies of the enhancing effects of nicotine on information processing, especially those on attentional and mnemonic processing. Then, these studies are put in the context of recent studies on the neurochemistry of nicotine and cholinergic drugs in general. They suggest a positive effect of nicotine on processes acting on encoded material during the post-acquisition phase, the process of consolidation. Thus, nicotinic receptors appear to contribute to mnemonic processing by modulating the excitability of neurons in the hippocampal formation to enable associative processing.
Abstract:
Given the paucity of research in this area, the primary aim of this study was to explore how parents of infants with unclear sex at birth made sense of 'intersex'. Qualitative methods (semi-structured interviews, interpretative phenomenological analysis) were used with 10 parents to generate pertinent themes and provide ideas for further research. Our analysis highlights the fundamental shock engendered by the uncertain sex status of children, and documents parental struggles to negotiate a coherent sex identity for their children. Findings are discussed in light of the rigid two-sex system which pervades medicine and everyday life, and we argue that greater understanding of the complexity of sex and gender is required in order to facilitate better service provision and, ultimately, greater informed consent and parental participation regarding decisions about their children's status.
Abstract:
The transreal numbers are a total number system in which every arithmetical operation is well defined everywhere. This has many benefits over the real numbers as a basis for computation and, possibly, for physical theories. We define the topology of the transreal numbers and show that it gives a more coherent interpretation of two's complement arithmetic than the conventional integer model. Trans-two's-complement arithmetic handles the infinities and 0/0 more coherently, and with very much less circuitry, than floating-point arithmetic. This reduction in circuitry is especially beneficial in parallel computers, such as the Perspex machine, and the increase in functionality makes Digital Signal Processing chips better suited to general computation.
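The totality of transreal division can be sketched in a few lines. This uses Python floats, where IEEE infinities stand in for the transreal ±∞ and NaN loosely approximates nullity; the analogy is imperfect, since transreal nullity equals itself while IEEE NaN does not.

```python
INF = float('inf')
NULLITY = float('nan')  # stand-in for transreal nullity (Phi)

def transreal_div(a, b):
    """Total division in the transreal style: every input produces a
    result, so 1/0, -1/0 and 0/0 raise no exception."""
    if b == 0:
        if a > 0:
            return INF       # positive / 0 = +infinity
        if a < 0:
            return -INF      # negative / 0 = -infinity
        return NULLITY       # 0 / 0 = nullity
    return a / b             # ordinary real division otherwise
```

Because every case returns a value, code built on such operations needs no divide-by-zero error paths, which is the circuit-level simplification the abstract alludes to.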
Abstract:
Requirements analysis focuses on stakeholders' concerns and their influence on e-government systems. These concerns are often complex and conflicting, which raises a number of questions for requirements analysis: how are the requirements relevant to stakeholders? What are their needs? How can conflicts among the different stakeholders be resolved? And how can coherent requirements be methodically produced? This paper describes the problem articulation method in organizational semiotics, which can be used to conduct such complex requirements analysis. The outcomes of the analysis enable e-government systems development and management to meet users' needs. A case study of Yantai Citizen Card is chosen to illustrate a process of analysing stakeholders in the lifecycle of requirements analysis.
Abstract:
Information provision to address changing requirements is best supported by content management. Current information technology enables information to be stored in, and provided from, various distributed sources. Identifying and retrieving relevant information requires effective mechanisms for information discovery and assembly. This paper presents a method that enables the design of such mechanisms, with a set of techniques for articulating and profiling users' requirements, formulating information provision specifications, realising management of information content in repositories, and facilitating response to the user's requirements dynamically during the process of knowledge construction. These functions are represented in an ontology which integrates the capability of the mechanisms. The ontological modelling in this paper adopts semiotics principles with embedded norms to ensure a coherent course of action represented in these mechanisms. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Although extensively studied within the lidar community, the multiple scattering phenomenon has always been considered a rare curiosity by radar meteorologists. Until a few years ago its appearance had only been associated with two- or three-body-scattering features (e.g. hail flares and mirror images) involving highly reflective surfaces. Recent atmospheric research aimed at better understanding of the water cycle and the role played by clouds and precipitation in affecting the Earth's climate has driven the deployment of high frequency radars in space. Examples are the TRMM 13.5 GHz, the CloudSat 94 GHz, the upcoming EarthCARE 94 GHz, and the GPM dual 13-35 GHz radars. These systems are able to detect the vertical distribution of hydrometeors and thus provide crucial feedback for radiation and climate studies. The shift towards higher frequencies increases the sensitivity to hydrometeors, improves the spatial resolution and reduces the size and weight of the radar systems. On the other hand, higher frequency radars are affected by stronger extinction, especially in the presence of large precipitating particles (e.g. raindrops or hail particles), which may eventually drive the signal below the minimum detection threshold. In such circumstances the interpretation of the radar equation via the single scattering approximation may be problematic. Errors will be large when the radiation emitted from the radar, after interacting more than once with the medium, still contributes substantially to the received power. This is the case if the transport mean free path becomes comparable with the instrument footprint (determined by the antenna beam-width and the platform altitude). This situation resembles what has already been experienced in lidar observations, but with a predominance of wide- versus small-angle scattering events.
At millimeter wavelengths, hydrometeors diffuse radiation rather isotropically compared to the visible or near infrared region, where scattering is predominantly in the forward direction. A complete understanding of radiation transport modeling and data analysis methods under wide-angle multiple scattering conditions is mandatory for a correct interpretation of echoes observed by space-borne millimeter radars. This paper reviews the status of research in this field. Different numerical techniques currently implemented to account for higher order scattering are reviewed and their weaknesses and strengths highlighted. Examples of simulated radar backscattering profiles are provided, with particular emphasis given to situations in which the multiple scattering contributions become comparable with or overwhelm the single scattering signal. We show evidence of multiple scattering effects in airborne and CloudSat observations, i.e. unique signatures which cannot be explained by single scattering theory. Ideas on how to identify and tackle the multiple scattering effects are discussed. Finally, perspectives and suggestions for future work are outlined. This work represents a reference guide for studies focused on modeling radiation transport and on interpreting data from high frequency space-borne radar systems that probe highly opaque scattering media such as thick ice clouds or precipitating clouds.
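The footprint criterion described above (multiple scattering matters when the transport mean free path becomes comparable to the instrument footprint) can be turned into a back-of-envelope check. The CloudSat-like altitude and beamwidth below are approximate, and the extinction and asymmetry values are purely illustrative, not taken from the paper.

```python
import math

def radar_footprint(altitude_m, beamwidth_deg):
    """Approximate ground footprint diameter from platform altitude and
    antenna 3-dB beamwidth (small-angle approximation)."""
    return altitude_m * math.radians(beamwidth_deg)

def multiple_scattering_likely(extinction_per_km, asymmetry_g, footprint_m):
    """Flag the regime where the transport mean free path
    1 / (sigma_ext * (1 - g)) is no larger than the footprint."""
    transport_mfp_m = 1000.0 / (extinction_per_km * (1.0 - asymmetry_g))
    return transport_mfp_m <= footprint_m, transport_mfp_m

# ~705 km orbit, ~0.12 deg beamwidth (CloudSat-like); illustrative cloud:
# extinction 5 km^-1, asymmetry parameter g = 0.6.
fp = radar_footprint(705e3, 0.12)
flag, mfp = multiple_scattering_likely(5.0, 0.6, fp)
```

With these numbers the transport mean free path (about 500 m) is well inside the roughly 1.5 km footprint, so the single scattering approximation would be suspect, which is exactly the regime the review addresses.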
Abstract:
The relationship between tropical convection, surface fluxes, and sea surface temperature (SST) on intraseasonal timescales has been examined as part of an investigation of the possibility that the intraseasonal oscillation is a coupled atmosphere–ocean phenomenon. The unique feature of this study is that 15 yr of data and the whole region from the Indian Ocean to the Pacific Ocean have been analyzed using lag-correlation analysis and compositing techniques. A coherent relationship between convection, surface fluxes, and SST has been found on intraseasonal timescales in the Indian Ocean, Maritime Continent, and west Pacific regions of the Tropics. Prior to the maximum in convection, there are positive shortwave and latent heat flux anomalies into the surface, followed by warm SST anomalies about 10 days before the convective maximum. Coincident with the convective maximum, there is a minimum in the shortwave flux, followed by a cooling due to increased evaporation associated with enhanced westerly wind stress, leading to negative SST anomalies about 10 days after the convection. The relationships are robust from year to year, including both phases of the El Niño–Southern Oscillation (ENSO), although the eastward extent of the region over which the relationship holds varies with the phase of ENSO, consistent with the variations in the eastward extent of the warm pool and westerly winds. The spatial scale of the anomalies is about 60° longitude, consistent with the scale of the intraseasonal oscillation. The spatial and temporal characteristics of the surface flux and SST perturbations are consistent with the surface flux variations forcing the ocean, and the magnitudes of the anomalies are consistent with mixed-layer depths appropriate to the Indian Ocean and west Pacific.
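The lag-correlation analysis underlying these results can be sketched as follows. The function below is a generic Pearson correlation at a lag, not the authors' compositing pipeline, and the sinusoidal test series used to exercise it are synthetic.

```python
import math

def lag_correlation(x, y, lag):
    """Pearson correlation between x(t) and y(t + lag). A positive lag
    means y follows x, as with SST anomalies peaking ~10 days after the
    convective maximum."""
    if lag >= 0:
        xs, ys = x[:len(x) - lag], y[lag:]
    else:
        xs, ys = x[-lag:], y[:len(y) + lag]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)
```

Scanning the lag that maximizes this correlation between a convection index and SST anomalies is what locates the roughly 10-day lead and lag structure reported in the abstract.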