78 results for RANK


Relevance: 10.00%

Publisher:

Abstract:

The proteome of Salmonella enterica serovar Typhimurium was characterized by 2-dimensional HPLC mass spectrometry to provide a platform for subsequent proteomic investigations of low-level multiple antibiotic resistance (MAR). Bacteria (2.15 ± 0.23 × 10^10 cfu; mean ± s.d.) were harvested from liquid culture and proteins differentially fractionated, on the basis of solubility, into preparations representative of the cytosol, cell envelope and outer membrane proteins (OMPs). These preparations were digested with trypsin and the peptides separated into fractions (n = 20) by strong cation exchange chromatography (SCX). Tryptic peptides in each SCX fraction were further separated by reversed-phase chromatography and detected by mass spectrometry. Peptides were assigned to proteins and consensus rank listings compiled using SEQUEST. A total of 816 ± 11 individual proteins were identified, including 371 ± 33, 565 ± 15 and 262 ± 5 from the cytosolic, cell envelope and OMP preparations, respectively. A significant correlation was observed (r^2 = 0.62 ± 0.10; P < 0.0001) between consensus rank positions for duplicate cell preparations, and an average of 74 ± 5% of proteins were common to both replicates. A total of 34 outer membrane proteins were detected, 20 of these from the OMP preparation. A range of proteins (n = 20) previously associated with the mar locus in E. coli was also found, including the key MAR effectors AcrA, TolC and OmpF.
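
The replicate comparison described above can be sketched as follows: correlate consensus rank positions for proteins identified in duplicate preparations and compute the fraction common to both. This is a minimal Python sketch, not the SEQUEST workflow itself; the protein names and rank positions are hypothetical.

    from scipy import stats

    # Hypothetical consensus rank positions from two duplicate preparations.
    rep1 = {"AcrA": 1, "TolC": 2, "OmpF": 3, "OmpA": 4, "GroEL": 5, "DnaK": 6}
    rep2 = {"TolC": 1, "AcrA": 2, "OmpA": 3, "LamB": 4, "OmpF": 5}

    common = sorted(set(rep1) & set(rep2))
    x = [rep1[p] for p in common]
    y = [rep2[p] for p in common]

    r, p_value = stats.pearsonr(x, y)        # correlation of rank positions
    shared = 100 * len(common) / len(set(rep1) | set(rep2))
    print(f"r^2 = {r**2:.2f}; {shared:.0f}% of proteins common to both")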

Relevance: 10.00%

Publisher:

Abstract:

There have been limited recent advances in understanding of what influences uptake of innovations, despite the current international focus on smallholder agriculture as a means of achieving food security and rural development. This paper provides a rigorous study of factors influencing adoption by smallholders in central Mexico and builds on its findings to identify a broad approach to significantly improve research on, and understanding of, factors influencing adoption by smallholders in developing countries. Small-scale dairy systems play an important role in providing income, employment and nutrition in the highlands of central Mexico. A wide variety of practices and technologies have been promoted by government public services to increase milk production and economic efficiency, but there have been very low levels of uptake of most innovations, with the exception of improving grassland through the introduction of grass varieties together with management practices. A detailed study was conducted with 80 farmers already using this innovation to better understand the process of adoption and to identify socioeconomic and farm variables, cognitive (beliefs) and social–psychological (social norms) factors associated with farmers' use of improved grassland. The Theory of Reasoned Action (TRA) was used as the theoretical framework, and Spearman rank-order correlation was used to analyse the data. Most farmers (92.5%) revealed a strong intention to continue to use improved grassland (which requires active management and investment of resources) for the next 12 months, whereas 7.5% of farmers were undecided and showed weak intention; the latter were farmers whose main income was from non-farm activities and farmers who had only recently started using improved grassland. Despite farmers' long experience of using improved grassland (a mean of 18 years), their intentions to continue using it were influenced almost as much by salient referents (mainly male relatives) as by their own attitudes. The hitherto unnoticed longevity of the role social referents play in adoption decisions is an important finding and has implications for further research and for the design of extension approaches. The study demonstrates the value and importance of using TRA or Theory of Planned Behaviour (TPB) approaches to understand cognitive (beliefs) and social–psychological (social norms) factors in the study of adoption. However, other factors influencing adoption processes need to be included to provide a fuller understanding. An approach is proposed that would enable this, support the development of findings more generalisable than those from location-specific case studies, and contribute to broader conceptualisation.
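
For concreteness, a Spearman rank-order correlation of the kind used in the analysis can be computed as below; the 7-point attitude and intention scores are invented for illustration, not drawn from the study.

    from scipy import stats

    # Hypothetical scores for ten farmers: attitude towards improved
    # grassland and intention to continue using it for the next 12 months.
    attitude  = [7, 6, 5, 7, 4, 6, 3, 7, 5, 6]
    intention = [7, 7, 5, 6, 4, 6, 2, 7, 4, 6]

    rho, p_value = stats.spearmanr(attitude, intention)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")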

Relevance: 10.00%

Publisher:

Abstract:

We present a description of the theoretical framework and "best practice" for using the paleo-climate model component of the Coupled Model Intercomparison Project (Phase 5) (CMIP5) to constrain future projections of climate using the same models. The constraints arise from measures of skill in hindcasting paleo-climate changes from the present over 3 periods: the Last Glacial Maximum (LGM) (21 thousand years before present, ka), the mid-Holocene (MH) (6 ka) and the Last Millennium (LM) (850–1850 CE). The skill measures may be used to validate robust patterns of climate change across scenarios or to distinguish between models that have differing outcomes in future scenarios. We find that the multi-model ensemble of paleo-simulations is adequate for addressing at least some of these issues. For example, selected benchmarks for the LGM and MH are correlated with the rank of future projections of precipitation/temperature or sea ice extent, indicating that models that produce the best agreement with paleoclimate information give demonstrably different future results than the rest of the models. We also find that some comparisons, for instance those associated with model variability, are strongly dependent on uncertain forcing timeseries or show time-dependent behaviour, making direct inferences for the future problematic. Overall, we demonstrate that there is a strong potential for the paleo-climate simulations to help inform the future projections, and we urge all the modeling groups to complete this subset of the CMIP5 runs.
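
The kind of benchmark-based discrimination described above can be sketched as follows: rank models by hindcast skill and compare the projections of the most skilful subset with the rest. The skill scores and projected changes below are hypothetical stand-ins, not CMIP5 values.

    import numpy as np

    # Hypothetical paleo hindcast skill and projected 21st-century change
    # for a six-model ensemble.
    skill      = np.array([0.82, 0.55, 0.71, 0.40, 0.66, 0.78])
    projection = np.array([-2.1, -0.8, -1.7, -0.5, -1.2, -1.9])

    order = np.argsort(skill)
    best, rest = order[-3:], order[:-3]      # three most skilful vs the rest
    print(projection[best].mean(), projection[rest].mean())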

Relevance: 10.00%

Publisher:

Abstract:

Possible changes in the frequency and intensity of windstorms under future climate conditions during the 21st century are investigated based on an ECHAM5 GCM multi-scenario ensemble. The intensity of a storm is quantified by the associated estimated loss, derived using an empirical model. The geographical focus is 'Core Europe', which comprises countries of Western Europe. Possible changes in losses are analysed by comparing ECHAM5 GCM data for recent (20C, 1960 to 2000) and future climate conditions (B1, A1B, A2; 2060 to 2100), each with 3 ensemble members. Changes are quantified using both rank statistics and return periods (RPs) estimated by fitting an extreme value distribution to potential storm losses using the peaks-over-threshold method. The estimated losses for ECHAM5 20C and reanalysis events show similar statistical features in terms of return periods. Under future climate conditions, all climate scenarios show an increase in both frequency and magnitude of potential losses caused by windstorms for Core Europe. Future losses that are double the highest ECHAM5 20C loss are identified for some countries. While positive changes in ranking are significant for many countries and multiple scenarios, significantly shorter RPs are mostly found under the A2 scenario for return levels corresponding to 20 yr losses or less. The emergence time of the statistically significant changes in loss varies from 2027 to 2100. These results imply an increased risk of occurrence of windstorm-associated losses, which can be largely attributed to changes in the meteorological severity of the events. Additionally, factors such as changes in the cyclone paths and in the location of the wind signatures relative to highly populated areas are also important in explaining the changes in estimated losses.
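
A minimal sketch of the peaks-over-threshold approach to return levels is given below, assuming a generalised Pareto distribution for threshold exceedances. The loss series is synthetic and the threshold choice is arbitrary; it only illustrates the shape of the calculation, not the paper's loss model.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    losses = rng.gumbel(loc=1.0, scale=0.5, size=41 * 3)   # synthetic annual losses

    u = np.quantile(losses, 0.9)                  # peaks-over-threshold cutoff
    excess = losses[losses > u] - u
    c, _, scale = stats.genpareto.fit(excess, floc=0.0)

    lam = len(excess) / 41.0                      # exceedance rate per year
    T = 20.0                                      # target return period (years)
    level = u + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), c, scale=scale)
    print(f"{T:.0f}-yr return level: {level:.2f}")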

Relevance: 10.00%

Publisher:

Abstract:

The complexity of current and emerging high performance architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven performance modelling approach is outlined that is appropriate for modern multicore architectures. The approach is demonstrated by constructing a model of a simple shallow water code on a Cray XE6 system, from application-specific benchmarks that illustrate precisely how architectural characteristics impact performance. The model is found to recreate observed scaling behaviour up to 16K cores, and is used to predict optimal rank-core affinity strategies, exemplifying the type of problem such a model can be used for.
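
The flavour of a benchmark-driven model can be conveyed by a toy example: a per-core compute term that would come from a serial benchmark, plus a halo-exchange communication term that would come from a message-passing microbenchmark. All constants below are hypothetical stand-ins for measured benchmark values, not parameters from the paper.

    # Toy benchmark-driven model: runtime = compute / cores + halo exchange.
    def model_runtime(cores, t_serial=120.0, latency=2.0e-6, t_byte=1.0e-9,
                      domain=4096, fields=4, word=8):
        compute = t_serial / cores                        # serial time, divided
        halo_bytes = word * fields * domain / cores**0.5  # 2-D decomposition halo per rank
        comm = latency + t_byte * halo_bytes              # per-step exchange cost
        return compute + comm

    for p in (16, 256, 4096, 16384):
        print(p, round(model_runtime(p), 6))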

Relevance: 10.00%

Publisher:

Abstract:

Before the advent of genome-wide association studies (GWASs), hundreds of candidate genes for obesity-susceptibility had been identified through a variety of approaches. We examined whether those obesity candidate genes are enriched for associations with body mass index (BMI) compared with non-candidate genes by using data from a large-scale GWAS. A thorough literature search identified 547 candidate genes for obesity-susceptibility based on evidence from animal studies, Mendelian syndromes, linkage studies, genetic association studies and expression studies. Genomic regions were defined to include the genes ±10 kb of flanking sequence around candidate and non-candidate genes. We used summary statistics publicly available from the discovery stage of the genome-wide meta-analysis for BMI performed by the genetic investigation of anthropometric traits consortium in 123 564 individuals. Hypergeometric, rank tail-strength and gene-set enrichment analysis tests were used to test for the enrichment of association in candidate compared with non-candidate genes. The hypergeometric test of enrichment was not significant at the 5% P-value quantile (P = 0.35), but was nominally significant at the 25% quantile (P = 0.015). The rank tail-strength and gene-set enrichment tests were nominally significant for the full set of genes and borderline significant for the subset without SNPs at P < 10^-7. Taken together, the observed evidence for enrichment suggests that the candidate gene approach retains some value. However, the degree of enrichment is small despite the extensive number of candidate genes and the large sample size. Studies that focus on candidate genes have only slightly increased chances of detecting associations, and are likely to miss many true effects in non-candidate genes, at least for obesity-related traits.
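
The hypergeometric enrichment test amounts to the calculation sketched below; the counts are hypothetical and only illustrate the test's shape, not the study's data.

    from scipy import stats

    # Hypothetical counts: M genes in total, n of them candidates, N genes
    # falling in the top P-value quantile, k of those being candidates.
    M, n, N, k = 20000, 547, 1000, 35

    p_enrich = stats.hypergeom.sf(k - 1, M, n, N)   # P(X >= k) under no enrichment
    print(f"enrichment P = {p_enrich:.3f}")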

Relevance: 10.00%

Publisher:

Abstract:

We present a new Bayesian econometric specification for a hypothetical Discrete Choice Experiment (DCE) incorporating respondent ranking information about attribute importance. Our results indicate that a DCE debriefing question that asks respondents to rank the importance of attributes helps to explain the resulting choices. We also examine how the mode of survey delivery (online and mail) impacts model performance, finding that results are not substantively affected by the mode of survey delivery. We conclude that the ranking data is a complementary source of information about respondent utility functions within hypothetical DCEs.

Relevance: 10.00%

Publisher:

Abstract:

A continuum model describing sea ice as a layer of granulated thick ice, consisting of many rigid, brittle floes, intersected by long and narrow regions of thinner ice, known as leads, is developed. We consider the evolution of mesoscale leads, formed under extension, whose lengths span many floes, so that the surrounding ice is treated as a granular plastic. The leads are sufficiently small with respect to basin scales of sea ice deformation that they may be modelled using a continuum approach. The model includes evolution equations for the orientational distribution of leads, their thickness and width, expressed through second-rank tensors and terms requiring closures. The closing assumptions are constructed for the case of negligibly small lead ice thickness and the canonical deformation types of pure and simple shear, pure divergence and pure convergence. We present a new continuum-scale sea ice rheology that depends upon the isotropic, material rheology of sea ice, the orientational distribution of lead properties and the thick ice thickness. A new model of lead and thick ice interaction is presented that successfully describes a number of effects: (i) because of its brittle nature, thick ice does not thin under extension, and (ii) treating the thick sea ice as a granular material determines finite lead opening under pure shear, when granular dilation is unimportant.
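
As an illustration of a second-rank tensor description of lead orientation, the snippet below builds a 2 x 2 structure tensor as the mean outer product of lead normal vectors. The angles are invented and the construction is a generic sketch, not the paper's full set of evolution equations.

    import numpy as np

    # Structure tensor A = <n (outer) n> for hypothetical lead normals.
    angles = np.deg2rad([10.0, 15.0, 80.0, 95.0, 12.0])
    normals = np.stack([np.cos(angles), np.sin(angles)], axis=1)

    A = np.einsum("ki,kj->ij", normals, normals) / len(angles)
    print(A)    # trace(A) = 1; the leading eigenvector gives the dominant orientation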

Relevance: 10.00%

Publisher:

Abstract:

We develop the essential ingredients of a new, continuum and anisotropic model of sea-ice dynamics designed for eventual use in climate simulation. These ingredients are a constitutive law for sea-ice stress, relating stress to the material properties of sea ice and to internal variables describing the sea-ice state, and equations describing the evolution of these variables. The sea-ice cover is treated as a densely flawed two-dimensional continuum consisting of a uniform field of thick ice that is uniformly permeated with narrow linear regions of thinner ice called leads. Lead orientation, thickness and width distributions are described by second-rank tensor internal variables: the structure, thickness and width tensors, whose dynamics are governed by corresponding evolution equations accounting for processes such as new lead generation and rotation as the ice cover deforms. These evolution equations contain contractions of higher-order tensor expressions that require closures. We develop a sea-ice stress constitutive law that relates sea-ice stress to the structure tensor, thickness tensor and strain rate. For the special case of empty leads (containing no ice), linear closures are adopted and we present calculations for simple shear, convergence and divergence.

Relevance: 10.00%

Publisher:

Abstract:

A potential problem with the Ensemble Kalman Filter is the implicit Gaussian assumption at analysis times. Here we explore the performance of a recently proposed fully nonlinear particle filter, in which the Gaussian assumption is not made, on a high-dimensional but simplified ocean model. The model simulates the evolution of the vorticity field in time, described by the barotropic vorticity equation, in a highly nonlinear flow regime. While common knowledge is that particle filters are inefficient and need large numbers of model runs to avoid degeneracy, the newly developed particle filter needs only of the order of 10-100 particles on large-scale problems. The crucial new ingredient is that the proposal density can be used not only to ensure that all particles end up in high-probability regions of state space as defined by the observations, but also to ensure that most of the particles have similar weights. Using identical-twin experiments we found that the ensemble mean follows the truth reliably, and the difference from the truth is captured by the ensemble spread. A rank histogram is used to show that the truth run is indistinguishable from any of the particles, demonstrating the statistical consistency of the method.
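
The rank histogram diagnostic can be sketched in a few lines: at each verification time the truth is ranked within the ensemble, and a flat histogram indicates the truth is statistically indistinguishable from the particles. The synthetic data below simply assume truth and ensemble share a distribution.

    import numpy as np

    rng = np.random.default_rng(1)
    n_times, n_particles = 1000, 24

    ensemble = rng.normal(size=(n_times, n_particles))   # synthetic ensemble
    truth = rng.normal(size=n_times)                     # same distribution

    ranks = (ensemble < truth[:, None]).sum(axis=1)      # rank of truth, 0..n_particles
    print(np.bincount(ranks, minlength=n_particles + 1)) # roughly flat if consistent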

Relevance: 10.00%

Publisher:

Abstract:

A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is consequently exploited for computational efficiency in both model parameter estimation and model predictions. The objective of either maximizing the marginal likelihood or the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf is used as the respective cost function. For each cost function, an efficient coordinate descent algorithm is proposed to estimate the kernel parameters using a one-dimensional derivative-free search, and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
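
The computational pay-off of a low-rank covariance can be seen in the following sketch, where a rank-m kernel K = P P^T lets the GP posterior mean be computed via the Woodbury identity in O(n m^2) rather than O(n^3). The feature map and data are hypothetical; this is a generic low-rank GP sketch, not the paper's kernel.

    import numpy as np

    rng = np.random.default_rng(2)
    n, m, noise = 500, 20, 0.1

    X = rng.uniform(-3.0, 3.0, size=(n, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=noise, size=n)

    centres = np.linspace(-3.0, 3.0, m)
    P = np.exp(-0.5 * (X - centres) ** 2)     # n x m features; K = P P^T has rank m

    # Woodbury: (P P^T + s I)^-1 y = (y - P (s I_m + P^T P)^-1 P^T y) / s
    s = noise ** 2
    alpha = (y - P @ np.linalg.solve(s * np.eye(m) + P.T @ P, P.T @ y)) / s
    mean = P @ (P.T @ alpha)                  # GP posterior mean at the inputs
    print(np.abs(mean - np.sin(X[:, 0])).mean())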

Relevance: 10.00%

Publisher:

Abstract:

Position in the social hierarchy can influence brain dopamine function and cocaine reinforcement in nonhuman primates during early cocaine exposure. With prolonged exposure, however, initial differences in rates of cocaine self-administration between dominant and subordinate monkeys dissipate. The present studies used a choice procedure to assess the relative reinforcing strength of cocaine in group-housed male cynomolgus monkeys with extensive cocaine self-administration histories. Responding was maintained under a concurrent fixed-ratio 50 schedule of food and cocaine (0.003-0.1 mg/kg per injection) presentation. Responding on the cocaine-associated lever increased as a function of cocaine dose in all monkeys. Although response distribution was similar across social rank when saline or relatively low or high cocaine doses were the alternative to food, planned t tests indicated that cocaine choice was significantly greater in subordinate monkeys when choice was between an intermediate dose (0.01 mg/kg) and food. When a between-session progressive-ratio procedure was used to increase response requirements for the preferred reinforcer (either cocaine or food), choice of that reinforcer decreased in all monkeys. The average response requirement that produced a shift in response allocation from the cocaine-associated lever to the food-associated lever was higher in subordinates across cocaine doses, an effect that trended toward significance (p = 0.053). These data indicate that despite an extensive history of cocaine self-administration, most subordinate monkeys were more sensitive to the relative reinforcing strength of cocaine than dominant monkeys.

Relevance: 10.00%

Publisher:

Abstract:

The performance of rank-dependent preference functionals under risk is comprehensively evaluated using Bayesian model averaging. Model comparisons are made at three levels of heterogeneity plus three ways of linking deterministic and stochastic models: the differences in utilities, the differences in certainty equivalents, and contextual utility. Overall, the "best model", which is conditional on the form of heterogeneity, is a form of Rank Dependent Utility or Prospect Theory that captures the majority of behaviour at both the representative-agent and individual level. However, the curvature of the probability weighting function for many individuals is S-shaped, or ostensibly concave or convex, rather than the inverse S-shape commonly employed. Also, contextual utility is broadly supported across all levels of heterogeneity. Finally, the Priority Heuristic model, previously examined within a deterministic setting, is estimated within a stochastic framework; allowing for endogenous thresholds does improve model performance, although it does not compete well with the other specifications considered.
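
A minimal rank-dependent utility evaluation, using an inverse S-shaped probability weighting function of the kind discussed above (here the Tversky-Kahneman form with a hypothetical parameter value), is sketched below.

    import numpy as np

    def w(p, gamma=0.61):
        # Inverse S-shaped probability weighting (Tversky-Kahneman form).
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    def rdu(outcomes, probs, u=np.sqrt):
        order = np.argsort(outcomes)[::-1]                  # rank outcomes best-first
        x = np.asarray(outcomes, float)[order]
        p = np.asarray(probs, float)[order]
        cum = np.cumsum(p)
        dw = w(cum) - w(np.concatenate(([0.0], cum[:-1])))  # decision weights
        return float(np.sum(dw * u(x)))

    print(rdu([100.0, 0.0], [0.1, 0.9]))   # exceeds the EU value 0.1 * sqrt(100)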

Relevance: 10.00%

Publisher:

Abstract:

In cooperative communication networks, owing to the nodes' arbitrary geographical locations and individual oscillators, the system is fundamentally asynchronous. Such a timing mismatch may cause rank deficiency of conventional space-time codes and, thus, performance degradation. One efficient way to overcome this issue is delay-tolerant space-time codes (DT-STCs). Existing DT-STCs are designed assuming that the transmitter has no knowledge of the channels. In this paper, we show how the performance of DT-STCs can be improved by utilizing some feedback information. A general framework for designing DT-STCs with limited feedback is first proposed, allowing for flexible system parameters such as the number of transmit/receive antennas, modulated symbols, and the length of codewords. Then, a new design method is proposed that combines Lloyd's algorithm and the stochastic gradient-descent algorithm to obtain an optimal codebook of STCs, particularly for systems with a linear minimum-mean-square-error receiver. Finally, simulation results confirm the performance of the newly designed DT-STCs with limited feedback.
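
The Lloyd's-algorithm component of such a codebook design alternates between nearest-codeword assignment and centroid updates, as sketched below. The training vectors are synthetic, and the paper's actual distortion measure (a pairwise-error criterion for the space-time codes) is replaced here by plain Euclidean distance for illustration.

    import numpy as np

    rng = np.random.default_rng(3)
    train = rng.normal(size=(1000, 4))              # synthetic training vectors
    codebook = train[rng.choice(len(train), 8, replace=False)]

    for _ in range(20):                             # Lloyd iterations
        d = np.linalg.norm(train[:, None] - codebook[None], axis=2)
        nearest = d.argmin(axis=1)                  # assign to nearest codeword
        for k in range(len(codebook)):
            members = train[nearest == k]
            if len(members):
                codebook[k] = members.mean(axis=0)  # centroid update

    print(codebook[0])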

Relevance: 10.00%

Publisher:

Abstract:

In general, particle filters need large numbers of model runs in order to avoid filter degeneracy in high-dimensional systems. The recently proposed, fully nonlinear equivalent-weights particle filter overcomes this requirement by replacing the standard model transition density with two different proposal transition densities. The first proposal density is used to relax all particles towards the high-probability regions of state space as defined by the observations. The crucial second proposal density is then used to ensure that the majority of particles have equivalent weights at observation time. Here, the performance of the scheme in a high-dimensional (65 500-dimensional) simplified ocean model is explored. The success of the equivalent-weights particle filter in matching the true model state is shown using the mean of just 32 particles in twin experiments. It is of particular significance that this remains true even as the number and spatial variability of the observations are changed. The results from rank histograms are less easy to interpret and can be influenced considerably by the parameter values used. This article also explores the sensitivity of the performance of the scheme to the chosen parameter values and the effect of using different model error parameters in the truth compared with the ensemble model runs.