26 results for "Not available"

in CentAUR: Central Archive, University of Reading - UK


Relevance: 60.00%

Abstract:

Jupiter’s magnetosphere acts as a point source of near-relativistic electrons within the heliosphere. In this study, three solar cycles of Jovian electron data in near-Earth space are examined. Jovian electron intensity is found to peak for an ideal Parker spiral connection, but with considerable spread about this point. Assuming the peak in Jovian electron counts indicates the best magnetic connection to Jupiter, we find a clear trend for fast and slow solar wind to be over- and under-wound with respect to the ideal Parker spiral, respectively. This is shown to be well explained in terms of solar wind stream interactions. Thus, modulation of Jovian electrons by corotating interaction regions (CIRs) may primarily be the result of changing magnetic connection, rather than CIRs acting as barriers to cross-field diffusion. Because Jovian electrons remotely sense magnetic connectivity with Jupiter’s magnetosphere, we suggest that they provide a means to validate solar wind models between 1 and 5 AU, even when suitable in situ solar wind observations are not available. Furthermore, using Jovian electron observations as probes of heliospheric magnetic topology could provide insight into heliospheric magnetic field braiding and turbulence, as well as any systematic under-winding of the heliospheric magnetic field relative to the Parker spiral from footpoint motion of the magnetic field.
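The dependence of the nominal connection on wind speed is easiest to picture with the ideal Parker geometry itself. The sketch below is not from the paper: it simply evaluates the standard Parker spiral relation, tan ψ = Ω r / V_sw, for nominal slow and fast wind speeds; the rotation rate, the wind speeds, and Jupiter's heliocentric distance of 5.2 AU are assumed illustrative values.

```python
import numpy as np

OMEGA_SUN = 2.7e-6      # solar (synodic) rotation rate [rad/s] -- assumed value
AU = 1.496e11           # astronomical unit [m]

def parker_spiral_angle(r_au, v_sw_kms):
    """Angle between the ideal Parker spiral field and the radial direction at r [deg]."""
    return np.degrees(np.arctan(OMEGA_SUN * r_au * AU / (v_sw_kms * 1e3)))

def footpoint_longitude_offset(r_au, v_sw_kms):
    """Longitude swept by the spiral between the Sun and heliocentric distance r [deg]."""
    return np.degrees(OMEGA_SUN * r_au * AU / (v_sw_kms * 1e3))

for v in (400.0, 750.0):   # nominal slow / fast solar wind speeds [km/s]
    winding = footpoint_longitude_offset(5.2, v) - footpoint_longitude_offset(1.0, v)
    print(f"V = {v:.0f} km/s: spiral angle at 1 AU = {parker_spiral_angle(1.0, v):5.1f} deg, "
          f"winding between 1 and 5.2 AU = {winding:6.1f} deg")
```

Changing the wind speed from 400 to 750 km/s roughly halves the longitudinal winding between 1 and 5.2 AU, which is why fast/slow streams shift the apparent best connection to Jupiter.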

Relevance: 60.00%

Abstract:

A new snow-soil-vegetation-atmosphere transfer (Snow-SVAT) scheme, which simulates the accumulation and ablation of the snow cover beneath a forest canopy, is presented. The model was formulated by coupling a canopy optical and thermal radiation model to a physically-based multi-layer snow model. This canopy radiation model is physically-based yet requires few parameters, so it can be used when extensive in-situ field measurements are not available. Other forest effects such as the reduction of wind speed, interception of snow on the canopy and the deposition of litter were incorporated within this combined model, SNOWCAN, which was tested with data taken as part of the Boreal Ecosystem-Atmosphere Study (BOREAS) international collaborative experiment. Snow depths beneath four different canopy types and at an open site were simulated. Agreement between observed and simulated snow depths was generally good, with correlation coefficients ranging between r^2 = 0.94 and r^2 = 0.98 for all sites where automatic measurements were available. However, the simulated date of total snowpack ablation generally occurred later than the observed date. A comparison between simulated solar radiation and limited measurements of sub-canopy radiation at one site indicates that the model simulates the sub-canopy downwelling solar radiation early in the season to within measurement uncertainty.

Relevance: 60.00%

Abstract:

The eukaryotic genome is a mosaic of eubacterial and archaeal genes in addition to those unique to itself. The mosaic may have arisen as the result of two prokaryotes merging their genomes, or from genes acquired from an endosymbiont of eubacterial origin. A third possibility is that the eukaryotic genome arose from successive events of lateral gene transfer over long periods of time. This theory does not exclude the endosymbiont, but questions whether it is necessary to explain the peculiar set of eukaryotic genes. We use phylogenetic studies and reconstructions of ancestral first appearances of genes on the prokaryotic phylogeny to assess evidence for the lateral gene transfer scenario. We find that phylogenies advanced to support fusion can also arise from a succession of lateral gene transfer events. Our reconstructions of ancestral first appearances of genes reveal that the various genes that make up the eukaryotic mosaic arose at different times and in diverse lineages on the prokaryotic tree, and were not available in a single lineage. Successive events of lateral gene transfer can explain the unusual mosaic structure of the eukaryotic genome, with its content linked to the immediate adaptive value of the genes it acquired. Progress in understanding eukaryotes may come from identifying ancestral features such as the eukaryotic spliceosome that could explain why this lineage invaded, or created, the eukaryotic niche.

Relevance: 60.00%

Abstract:

Matrix-assisted laser desorption/ionization (MALDI) is a key technique in mass spectrometry (MS)-based proteomics. MALDI MS is extremely sensitive, easy to apply, and relatively tolerant to contaminants. Its high-speed data acquisition and large-scale, off-line sample preparation have made it once again the focus for high-throughput proteomic analyses. These and other unique properties of MALDI offer new possibilities in applications such as rapid molecular profiling and imaging by MS. Proteomics and its employment in Systems Biology and other areas that require sensitive and high-throughput bioanalytical techniques greatly depend on these methodologies. This chapter provides a basic introduction to the MALDI methodology and its general application in proteomic research. It describes the basic MALDI sample preparation steps and two easy-to-follow examples for protein identification, including extensive notes on these topics with practical tips that are often not available in Subheadings 2 and 3 of research articles.

Relevance: 60.00%

Abstract:

Importance measures in reliability engineering are used to identify weak areas of a system and signify the roles of components in either causing or contributing to proper functioning of the system. Traditional importance measures for multistate systems mainly concern reliability importance of an individual component and seldom consider the utility performance of the systems. This paper extends the joint importance concepts of two components from the binary system case to the multistate system case. A joint structural importance and a joint reliability importance are defined on the basis of the performance utility of the system. The joint structural importance measures the relationship of two components when the reliabilities of components are not available. The joint reliability importance is inferred when the reliabilities of the components are given. The properties of the importance measures are also investigated. A case study for an offshore electrical power generation system is given.
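For orientation, the binary-system joint reliability importance that the paper generalises is conventionally defined as JRI(i, j) = R(1_i, 1_j) − R(1_i, 0_j) − R(0_i, 1_j) + R(0_i, 0_j). The Python sketch below shows only that binary baseline; the function names and the 2-out-of-3 example are illustrative assumptions and not taken from the paper, which works with multistate systems and performance utility.

```python
import itertools

def system_reliability(p, structure):
    """Reliability of a coherent binary system.
    p: component reliabilities; structure: maps a 0/1 state tuple to system state 0/1."""
    total = 0.0
    for states in itertools.product((0, 1), repeat=len(p)):
        prob = 1.0
        for pi, s in zip(p, states):
            prob *= pi if s == 1 else (1.0 - pi)
        total += prob * structure(states)
    return total

def joint_reliability_importance(p, structure, i, j):
    """Binary-case JRI(i, j) = R(1_i,1_j) - R(1_i,0_j) - R(0_i,1_j) + R(0_i,0_j)."""
    def fixed(si, sj):
        q = list(p)
        q[i], q[j] = si, sj          # condition components i, j on being up/down
        return system_reliability(q, structure)
    return fixed(1, 1) - fixed(1, 0) - fixed(0, 1) + fixed(0, 0)

# Illustrative 2-out-of-3 system: components 0 and 1 act as partial substitutes,
# so their joint reliability importance is negative (-0.4 here).
two_of_three = lambda s: 1 if sum(s) >= 2 else 0
print(joint_reliability_importance([0.9, 0.8, 0.7], two_of_three, 0, 1))
```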

Relevance: 60.00%

Abstract:

This study examined the opinions of in-service and prospective chemistry teachers about the importance of using molecular and crystal models in secondary-level school practice, and investigated some of the reasons for their (non-)usage. The majority of participants stated that the use of models plays an important role in chemistry education and that they would use them more often if the circumstances were more favourable. Many teachers claimed that three-dimensional (3D) models are still not available in sufficient numbers at their schools; they also pointed to the lack of available computer facilities during chemistry lessons. The research revealed that, besides the inadequate material circumstances, fewer than one third of participants are able to use simple (freeware) computer programs for drawing molecular structures and presenting them in virtual space; however, both groups of teachers expressed a willingness to improve their knowledge in the subject area. The investigation points to several actions that could be undertaken to improve the current situation.

Relevance: 60.00%

Abstract:

G-protein-coupled receptors are desensitized by a two-step process. In a first step, G-protein-coupled receptor kinases (GRKs) phosphorylate agonist-activated receptors that subsequently bind to a second class of proteins, the arrestins. GRKs can be classified into three subfamilies, which have been implicated in various diseases. The physiological role(s) of GRKs have been difficult to study as selective inhibitors are not available. We have used SELEX (systematic evolution of ligands by exponential enrichment) to develop RNA aptamers that potently and selectively inhibit GRK2. This process has yielded an aptamer, C13, which bound to GRK2 with a high affinity and inhibited GRK2-catalyzed rhodopsin phosphorylation with an IC50 of 4.1 nM. Phosphorylation of rhodopsin catalyzed by GRK5 was also inhibited, albeit with 20-fold lower potency (IC50 of 79 nM). Furthermore, C13 reveals significant specificity, since almost no inhibitory activity was detectable testing it against a panel of 14 other kinases. The aptamer is two orders of magnitude more potent than the best GRK2 inhibitors described previously and shows high selectivity for the GRK family of protein kinases.

Relevance: 60.00%

Abstract:

Binocular disparity, blur, and proximal cues drive convergence and accommodation. Disparity is considered to be the main vergence cue and blur the main accommodation cue. We have developed a remote haploscopic photorefractor to measure simultaneous vergence and accommodation objectively in a wide range of participants of all ages while they fixate targets at between 0.3 and 2 m. By separating the three main near cues, we can explore their relative weighting in three-, two-, one-, and zero-cue conditions. Disparity can be manipulated by remote occlusion; blur cues by using either a Gabor patch or a detailed picture target; and looming cues by either scaling or not scaling target size with distance. In normal orthophoric, emmetropic, symptom-free, naive, visually mature participants, disparity was by far the most significant cue to both vergence and accommodation. Accommodation responses dropped dramatically if disparity was not available. Blur only had a clinically significant effect when disparity was absent. Proximity had very little effect. There was considerable interparticipant variation. We predict that the relative weighting of near cue use is likely to vary between clinical groups, and we present some individual cases as examples. We are using this naturalistic tool to research strabismus, vergence and accommodation development, and emmetropization.

Relevance: 60.00%

Abstract:

Normally wind measurements from Doppler radars rely on the presence of rain. During fine weather, insects become a potential radar target for wind measurement. However, it is difficult to separate ground clutter and insect echoes when spectral or polarimetric methods are not available. Archived reflectivity and velocity data from repeated scans provide alternative methods. The probability of detection (POD) method, which maps areas with a persistent signal as ground clutter, is ineffective when most scans also contain persistent insect echoes. We developed a clutter detection method which maps the standard deviation of velocity (SDV) over a large number of scans, and can differentiate insects and ground clutter close to the radar. Beyond the range of persistent insect echoes, the POD method more thoroughly removes ground clutter. A new, pseudo-probability clutter map was created by combining the POD and SDV maps. The new map optimised ground clutter detection without removing insect echoes.
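As a rough illustration of the idea (the abstract does not give the exact weighting used to combine the two maps), the sketch below builds a POD map from the persistence of reflectivity, an SDV map from the velocity spread across scans, and blends them into a pseudo-probability of clutter. The thresholds, the multiplicative blend, and the array shapes are all assumed for the example.

```python
import numpy as np

def clutter_probability(refl_stack, vel_stack, refl_thresh=0.0, sdv_thresh=1.0):
    """Pseudo-probability clutter map from stacks of scans (n_scans, n_range, n_azimuth).

    POD: fraction of scans in which reflectivity exceeds refl_thresh (persistent echo).
    SDV: standard deviation of radial velocity over scans; ground clutter has near-zero
         velocity spread, whereas insect echoes do not.
    The multiplicative combination below is illustrative only.
    """
    pod = np.mean(refl_stack > refl_thresh, axis=0)        # persistent-echo fraction per gate
    sdv = np.nanstd(vel_stack, axis=0)                     # velocity spread per gate [m/s]
    low_sdv = np.clip(1.0 - sdv / sdv_thresh, 0.0, 1.0)    # 1 where the echo is static
    return pod * low_sdv                                   # high only for persistent AND static echoes

# Usage with synthetic shapes: 200 scans on a 100 x 360 polar grid
refl = np.random.randn(200, 100, 360)
vel = np.random.randn(200, 100, 360)
clutter_map = clutter_probability(refl, vel)
```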

Relevance: 60.00%

Abstract:

The simulation and development work undertaken to produce a signal equaliser used to improve the data rates from oil well logging instruments is presented. The instruments are lowered into the drill bore hole suspended by a cable which has poor electrical characteristics. The equaliser described in the paper corrects for the distortions introduced by the cable (dispersion and attenuation), with the result that the instrument can send data at 100 kbit/s down its own 12 km suspension cable. The use of simulation techniques and tools was invaluable in generating a model of the distortions and proved useful when site testing was not available.
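The abstract does not describe the equaliser's internal design, but the general principle of correcting cable attenuation and dispersion can be sketched as frequency-domain channel inversion. The generic zero-forcing example below assumes the cable frequency response is known or measured; the toy cable model and signal are assumptions, and this is not the equaliser developed in the paper.

```python
import numpy as np

def zero_forcing_equalise(received, cable_freq_response, eps=1e-3):
    """Frequency-domain equalisation of a distorted signal.

    Divides the received spectrum by the (assumed known) cable frequency response,
    with a small regularisation term so that noise is not amplified where the cable
    attenuation is severe. Generic sketch only.
    """
    R = np.fft.rfft(received)
    H = cable_freq_response                     # same length as R
    E = np.conj(H) / (np.abs(H) ** 2 + eps)     # regularised inverse of the channel
    return np.fft.irfft(R * E, n=len(received))

# Usage with a toy "cable": attenuation and group delay growing with frequency
n = 4096
tx = np.sign(np.random.randn(n))                # crude 2-level data signal
f = np.fft.rfftfreq(n)
H = np.exp(-8.0 * f) * np.exp(-1j * 40.0 * f)   # toy attenuation + dispersion
rx = np.fft.irfft(np.fft.rfft(tx) * H, n=n)     # distorted signal at the surface
recovered = zero_forcing_equalise(rx, H)
```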

Relevance: 60.00%

Abstract:

There is intense scientific and public interest in the Intergovernmental Panel on Climate Change (IPCC) projections of sea level for the twenty-first century and beyond. The Fourth Assessment Report (AR4) projections, obtained by applying standard methods to the results of the World Climate Research Programme Coupled Model Experiment, include estimates of ocean thermal expansion, the melting of glaciers and ice caps (G&ICs), increased melting of the Greenland Ice Sheet, and increased precipitation over Greenland and Antarctica, partially offsetting other contributions. The AR4 recognized the potential for a rapid dynamic ice sheet response, but robust methods for quantifying it were not available. Illustrative scenarios suggested additional sea level rise on the order of 10 to 20 cm or more, giving a wide range in the globally averaged projections of about 20 to 80 cm by 2100. Currently, sea level is rising at a rate near the upper end of these projections. Since publication of the AR4 in 2007, biases in historical ocean temperature observations have been identified and significantly reduced, resulting in improved estimates of ocean thermal expansion. Models that include all climate forcings are in good agreement with these improved observations and indicate the importance of stratospheric aerosol loadings from volcanic eruptions. Estimates of the volumes of G&ICs and their contributions to sea level rise have improved. Results from recent (but possibly incomplete) efforts to develop improved ice sheet models should be available for the 2013 IPCC projections. Improved understanding of sea level rise is paving the way for using observations to constrain projections. Understanding of the regional variations in sea level change as a result of changes in ocean properties, wind-stress patterns, and heat and freshwater inputs into the ocean is improving. Recently, estimates of sea level changes resulting from changes in Earth's gravitational field and the solid Earth response to changes in surface loading have been included in regional projections. While potentially valuable, semi-empirical models have important limitations, and their projections should be treated with caution.

Relevance: 60.00%

Abstract:

Linear models of bidirectional reflectance distribution are useful tools for understanding the angular variability of surface reflectance as observed by medium-resolution sensors such as the Moderate Resolution Imaging Spectroradiometer. These models are operationally used to normalize data to common view and illumination geometries and to calculate integral quantities such as albedo. Currently, to compensate for noise in observed reflectance, these models are inverted against data collected during some temporal window for which the model parameters are assumed to be constant. Despite this, the retrieved parameters are often noisy for regions where sufficient observations are not available. This paper demonstrates the use of Lagrangian multipliers to allow arbitrarily large windows and, at the same time, to produce individual parameter sets for each day, even for regions where only sparse observations are available.
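The paper's specific constraints are not spelled out in the abstract, but the role of Lagrange multipliers in inverting a linear reflectance model can be illustrated with equality-constrained least squares solved through the KKT system. The kernel matrix, the single constraint, and the toy numbers below are all assumed for illustration and are not the paper's formulation.

```python
import numpy as np

def constrained_lsq(K, y, C, d):
    """Solve min_b ||K b - y||^2 subject to C b = d via Lagrange multipliers.

    Setting the gradient of the Lagrangian to zero gives the KKT system below
    (the factor of 2 from the gradient is absorbed into lambda). Generic sketch;
    the per-day BRDF parameter constraints used in the paper are not reproduced.
    """
    n, m = K.shape[1], C.shape[0]
    kkt = np.block([[K.T @ K, C.T],
                    [C, np.zeros((m, m))]])
    rhs = np.concatenate([K.T @ y, d])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n], sol[n:]          # parameters b, multipliers lambda

# Toy usage: 3 kernel weights fitted to 12 noisy observations, with one linear constraint
rng = np.random.default_rng(0)
K = rng.normal(size=(12, 3))          # 12 observations x 3 kernels (illustrative)
y = K @ np.array([0.3, 0.1, 0.05]) + 0.01 * rng.normal(size=12)
C = np.array([[0.0, 1.0, -1.0]])      # e.g. tie two weights together (illustrative)
d = np.array([0.0])
b, lam = constrained_lsq(K, y, C, d)
```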

Relevance: 60.00%

Abstract:

A precipitation downscaling method is presented using precipitation from a general circulation model (GCM) as predictor. The method extends a previous method from monthly to daily temporal resolution. The simplest form of the method corrects for biases in wet-day frequency and intensity. A more sophisticated variant also takes account of flow-dependent biases in the GCM. The method is flexible and simple to implement. It is proposed here as a correction of GCM output for applications where sophisticated methods are not available, or as a benchmark for the evaluation of other downscaling methods. Applied to output from reanalyses (ECMWF, NCEP) in the region of the European Alps, the method is capable of reducing large biases in the precipitation frequency distribution, even for high quantiles. The two variants exhibit similar performance, but the ideal choice of method can depend on the GCM/reanalysis, and it is recommended to test the methods in each case. Limitations of the method are found in small areas with unresolved topographic detail that influences higher-order statistics (e.g. high quantiles). When used as a benchmark for three regional climate models (RCMs), the corrected reanalysis and the RCMs perform similarly in many regions, but the added value of the latter is evident for high quantiles in some small regions.
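The simplest variant described, correcting wet-day frequency and intensity, can be sketched as follows. The wet-day threshold, the use of a quantile to match frequencies, and the mean-intensity scaling are a standard formulation assumed here for illustration, not necessarily the paper's exact algorithm.

```python
import numpy as np

def correct_frequency_intensity(gcm_precip, obs_precip, obs_wet_threshold=0.1):
    """Simple wet-day frequency and intensity correction of daily GCM precipitation.

    1. Choose a GCM wet-day threshold so that the GCM wet-day frequency matches
       the observed frequency (wet day: > obs_wet_threshold, e.g. 0.1 mm/day).
    2. Scale GCM wet-day amounts so that the mean wet-day intensity matches
       observations. Illustrative sketch of the simplest variant only.
    """
    gcm = np.asarray(gcm_precip, dtype=float)
    obs = np.asarray(obs_precip, dtype=float)

    obs_wet_freq = np.mean(obs > obs_wet_threshold)
    # GCM threshold exceeded exactly as often as the observations are "wet"
    gcm_thresh = np.quantile(gcm, 1.0 - obs_wet_freq)

    corrected = np.where(gcm > gcm_thresh, gcm, 0.0)        # fix wet-day frequency
    obs_wet_mean = obs[obs > obs_wet_threshold].mean()
    gcm_wet_mean = corrected[corrected > 0].mean()
    return corrected * (obs_wet_mean / gcm_wet_mean)        # fix mean wet-day intensity
```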

Relevance: 60.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what resolution, both horizontal and vertical, in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
