46 results for Rank and file unionism
Abstract:
A common method for testing preference for objects is to determine which of a pair of objects is approached first in a paired-choice paradigm. In comparison, many studies of preference for environmental enrichment (EE) devices have used paradigms in which total time spent with each of a pair of objects is used to determine preference. While each of these paradigms gives a specific measure of the preference for one object in comparison to another, neither method allows comparisons between multiple objects simultaneously. Since it is possible that several EE objects would be placed in a cage together to improve animal welfare, it is important to determine measures for rats' preferences in conditions that mimic this potential home cage environment. While it would be predicted that each type of measure would produce similar rankings of objects, this has never been tested empirically. In this study, we compared two paradigms: EE objects were either presented in pairs (paired-choice comparison) or four objects were presented simultaneously (simultaneous presentation comparison). We used frequency of first interaction and time spent with each object to rank the objects in the paired-choice experiment, and time spent with each object to rank the objects in the simultaneous presentation experiment. We also considered the behaviours elicited by the objects to determine if these might be contributing to object preference. We demonstrated that object ranking based on time spent with objects from the paired-choice experiment predicted object ranking in the simultaneous presentation experiment. Additionally, we confirmed that behaviours elicited were an important determinant of time spent with an object. This provides convergent evidence that both paired choice and simultaneous comparisons provide valid measures of preference for EE objects in rats.
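A minimal sketch (invented object names and interaction times, not the study's data) of how the two time-based rankings described above can be derived and checked against each other:

```python
# Hypothetical illustration of ranking EE objects by total interaction
# time in the two paradigms described above; all numbers are invented.
objects = ["chew_block", "tunnel", "ball", "platform"]

# Paired-choice paradigm: seconds spent with each object in every pairing.
paired_times = {
    ("chew_block", "tunnel"): (120, 340),
    ("chew_block", "ball"): (150, 90),
    ("chew_block", "platform"): (110, 200),
    ("tunnel", "ball"): (310, 80),
    ("tunnel", "platform"): (290, 180),
    ("ball", "platform"): (70, 210),
}

# Simultaneous paradigm: seconds spent with each of the four objects together.
simultaneous_times = {"chew_block": 95, "tunnel": 260, "ball": 40, "platform": 150}

# Rank by total time accumulated across all pairings.
totals = {o: 0 for o in objects}
for (a, b), (ta, tb) in paired_times.items():
    totals[a] += ta
    totals[b] += tb

paired_rank = sorted(objects, key=totals.get, reverse=True)
simul_rank = sorted(objects, key=simultaneous_times.get, reverse=True)

print("paired-choice ranking:", paired_rank)
print("simultaneous ranking: ", simul_rank)
print("rankings agree:", paired_rank == simul_rank)
```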
Abstract:
Adult male hooded Lister rats were fed either a diet containing 150 µg/g soya phytoestrogens or a soya-free diet for 18 days. This concentration of phytoestrogens should have been sufficient to occupy the oestrogen-beta, but not the oestrogen-alpha, receptors. Using in situ hybridisation, significant reductions were found in brain-derived neurotrophic factor (BDNF) mRNA expression in the CA3 and CA4 regions of the hippocampus and in the cerebral cortex in the rats fed the diet containing phytoestrogens, compared with those on the soya-free diet. No changes in glutamic acid decarboxylase-67 or glial fibrillary acidic protein mRNA were found. This suggests a role for oestrogen-beta receptors in regulating BDNF mRNA expression.
Abstract:
The Rank Forum on Vitamin D was held on 2nd and 3rd July 2009 at the University of Surrey, Guildford, UK. The workshop consisted of a series of scene-setting presentations to address the current issues and challenges concerning vitamin D and health, and included an open discussion focusing on the identification of the concentrations of serum 25-hydroxyvitamin D (25(OH)D) (a marker of vitamin D status) that may be regarded as optimal, and the implications this process may have in the setting of future dietary reference values for vitamin D in the UK. The Forum agreed that it is desirable for all of the population to have a serum 25(OH)D concentration above 25 nmol/l, but discussed some uncertainty about the strength of evidence for the need to aim for substantially higher concentrations (25(OH)D concentrations > 75 nmol/l). Any discussion of ‘optimal’ concentration of serum 25(OH)D needs to define ‘optimal’ with care, since it is important to consider the normal distribution of requirements and the vitamin D needs for a wide range of outcomes. Current UK reference values concentrate on the requirements of particular subgroups of the population; this differs from the approaches used in other European countries, where a wider range of age groups tends to be covered. With the re-emergence of rickets and the public health burden of low vitamin D status already apparent, there is a need for urgent action from policy makers and risk managers. The Forum highlighted concerns regarding the failure to implement existing strategies in the UK for achieving current vitamin D recommendations.
Abstract:
It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure–function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This ‘Cartesian’ description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise ‘blueprint’ of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of ‘fundamental’, measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense. If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the ‘computational neuroanatomy’ strategy for neuroscience databases.
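To make the contrast between the levels of description concrete, here is a toy sketch (all distributions and parameter values are invented; this is not the L-NEURON or ARBORVITAE algorithm) of a generator that grows a ‘Cartesian’ cylinder list from a handful of scalar parameters:

```python
# Toy contrast between the 'Cartesian' cylinder description of a dendrite
# and an algorithmic generator driven by a few scalar parameters.
# All distributions and values are invented for illustration only.
import random
from dataclasses import dataclass

@dataclass
class Cylinder:
    ident: int       # compartment id
    parent: int      # id of the parent cylinder (-1 for the root)
    x: float         # spatial coordinates of the distal end (um)
    y: float
    z: float
    diameter: float  # um

def generate_dendrite(branch_prob=0.4, taper=0.85, mean_len=12.0, max_depth=6):
    """Grow a dendritic tree by recursively sampling segment lengths
    and branching decisions from simple invented distributions."""
    tree = [Cylinder(0, -1, 0.0, 0.0, 0.0, 2.0)]

    def grow(parent, depth):
        if depth > max_depth:
            return
        n_children = 2 if random.random() < branch_prob else 1
        for _ in range(n_children):
            p = tree[parent]
            length = random.expovariate(1.0 / mean_len)
            child = Cylinder(len(tree), parent,
                             p.x + random.uniform(-length, length),
                             p.y + random.uniform(-length, length),
                             p.z + length,
                             p.diameter * taper)
            tree.append(child)
            grow(child.ident, depth + 1)

    grow(0, 1)
    return tree

random.seed(1)
virtual = generate_dendrite()
# The 'Cartesian' description: one fully explicit record per cylinder.
for c in virtual[:3]:
    print(c)
print(f"... {len(virtual)} cylinders generated from 4 scalar parameters")
```

The four scalar parameters play the role of the measured ‘fundamental’ parameters: re-running the generator yields statistically similar virtual analogues, which is the data amplification described above.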
Abstract:
A key strategy for improving the skill of quantitative precipitation predictions, as well as of hazardous weather such as severe thunderstorms and flash floods, is to exploit observations of convective activity (e.g. from radar). In this paper, a convection-permitting ensemble prediction system (EPS) based on a 1.5 km grid-length version of the Met Office Unified Model is presented, aimed at the problem of forecasting localized weather events with relatively short predictability time scales. Particular attention is given to the impact of using predicted observations of radar-derived precipitation intensity in the ensemble transform Kalman filter (ETKF) used within the EPS. Our initial results, based on a 24-member ensemble of forecasts for two summer case studies, show that the convective-scale EPS produces fairly reliable forecasts of temperature, horizontal winds and relative humidity at 1 h lead time, as is evident from inspection of rank histograms. On the other hand, the rank histograms also suggest that the EPS generates too much spread for forecasts of (i) surface pressure and (ii) surface precipitation intensity. This may indicate that (i) the surface pressure observation-error standard deviation used to generate the surface pressure rank histograms is too large, and that (ii) is the result of non-Gaussian precipitation observation errors. However, further investigation is needed to better understand these findings. Finally, the inclusion of predicted observations of precipitation from radar in the 24-member EPS considered here does not appear to improve the forecast skill at 1 h lead time.
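For readers unfamiliar with the diagnostic used above, a minimal sketch (synthetic Gaussian data, not the EPS output) of how a rank histogram is built from an ensemble and its verifying observations:

```python
# Minimal construction of a rank histogram, the diagnostic used above
# to assess ensemble spread. Synthetic Gaussian data, not EPS output.
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_members = 5000, 24

# Perfect-model setup: verification and members drawn from one
# distribution, so the histogram should be approximately flat.
ensemble = rng.normal(size=(n_cases, n_members))
verification = rng.normal(size=n_cases)

# Rank of the verification within each ordered ensemble (0..n_members);
# too much spread shows up as a dome-shaped histogram, too little as a
# U-shaped one.
ranks = (ensemble < verification[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=n_members + 1)

print(hist / n_cases)  # each of the 25 bins should be near 1/25 = 0.04
```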
Abstract:
As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and the quantification of misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low level details of differing file formats or the physical location of the data. Scientific and operational benefits to this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
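As a sketch of the kind of standards-based access the portal relies on (the endpoint URL and layer name below are placeholders, not the actual ECOOP services), a WMS GetMap request can be assembled from the standard query parameters alone:

```python
# Assemble an OGC WMS 1.3.0 GetMap request from standard parameters.
# The endpoint and layer name are hypothetical placeholders, not the
# actual ECOOP services.
from urllib.parse import urlencode

endpoint = "https://example.org/ecoop/wms"  # placeholder endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_surface_temperature",  # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "45.0,-15.0,65.0,10.0",       # lat,lon axis order in WMS 1.3.0
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}

print(endpoint + "?" + urlencode(params))
```

Because every compliant server accepts the same parameters, a portal can treat all of its model feeds identically; only the endpoint and layer names differ.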
Abstract:
Much consideration is rightly given to the design of metadata models to describe data. At the other end of the data-delivery spectrum, much thought has also been given to the design of geospatial delivery interfaces, such as the Open Geospatial Consortium standards Web Coverage Service (WCS), Web Map Service (WMS) and Web Feature Service (WFS). Our recent experience with the Climate Science Modelling Language shows that an implementation gap exists where many challenges remain unsolved. Bridging this gap requires transposing information and data from one world view of geospatial climate data to another. Some of the issues include: the loss of information in mapping to a common information model, the need to create ‘views’ onto file-based storage, and the need to map onto an appropriate delivery interface (as with the choice between WFS and WCS for feature types with coverage-valued properties). Here we summarise the approaches we have taken in facing up to these problems.
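One way to illustrate the ‘views onto file-based storage’ problem (using the xarray library as a stand-in; the file pattern and variable name are hypothetical) is to aggregate a collection of per-timestep files into a single logical dataset that a delivery interface could then expose:

```python
# Present a collection of per-timestep NetCDF files as one logical
# 'view', one of the gaps discussed above between file-based storage
# and a delivery interface. File pattern and variable are hypothetical.
import xarray as xr

# Each file holds one timestep; the view concatenates them on demand.
view = xr.open_mfdataset("model_output_*.nc", combine="by_coords")

# A delivery service (e.g. WCS) could then subset the logical coverage
# without the client knowing how the data are split across files.
subset = view["temperature"].sel(time="2009-07-02", method="nearest")
print(subset)
```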
Abstract:
Purpose – The purpose of this research was twofold: first, to investigate the views of occupiers in a typical UK city on the importance of various sustainability issues, their perceived impact of different sustainability drivers, and their willingness to pay; second, to examine the environmental and social performance of existing buildings in that city. Design/methodology/approach – The research focuses on buildings of 10,000 ft² or more that have been constructed in the Bristol city-region in the UK over the past 50 years. The buildings in the sample are located in the city centre and in out-of-town business parks. A questionnaire survey investigated the views of occupiers, and follow-up interviews looked more closely at the sustainability performance of the existing stock. Findings – The findings indicate that, as far as occupiers are concerned, the strongest drivers are consumer demand and staff demand. Green features of a building appear to rank low in the overall building selection preference structure, yet a willingness to pay a premium for green features was indicated. The interviews uncovered barriers to progress as well as initiatives to reduce both energy consumption and the environmental impact of office space. Practical implications – The paper identifies progress and issues which could form obstacles to improving the environmental performance of office buildings. It is argued that there is a need to focus on energy efficiency. Originality/value – This paper explores the linkage between the perception and use of office space by occupants and how this affects the environmental performance of this space.
Abstract:
A characterization of observability for linear time-varying descriptor systems $E(t)\dot{x}(t) + F(t)x(t) = B(t)u(t)$, $y(t) = C(t)x(t)$ was recently developed. Neither $E$ nor $C$ was required to have constant rank. This paper defines a dual system, and a type of controllability, so that observability of the original system is equivalent to controllability of the dual system. Criteria for observability and controllability are given in terms of arrays of derivatives of the original coefficients. In addition, the duality results of this paper lead to an improvement on a previous fundamental structure result for solvable systems of the form $E(t)\dot{x}(t) + F(t)x(t) = f(t)$.
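For concreteness, the system in display form, together with an adjoint-style pairing of the kind such duals conventionally follow (an illustrative convention only; the paper's precise dual construction is not given in the abstract):

```latex
% The descriptor system discussed above, in display form.
\[
E(t)\dot{x}(t) + F(t)x(t) = B(t)u(t), \qquad y(t) = C(t)x(t).
\]
% An adjoint-style pairing of the kind such duals follow (illustrative
% convention only; the paper's precise construction is not reproduced
% in the abstract): inputs enter through C^T, outputs leave through B^T.
\[
\frac{d}{dt}\bigl(E^{T}(t)z(t)\bigr) - F^{T}(t)z(t) = C^{T}(t)v(t),
\qquad w(t) = B^{T}(t)z(t).
\]
```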
Abstract:
Reliable techniques for screening large numbers of plants for root traits are still being developed, but include aeroponic, hydroponic and agar plate systems. Coupled with digital cameras and image analysis software, these systems permit the rapid measurement of root numbers, length and diameter in moderate (typically <1000) numbers of plants. Usually such systems are employed with relatively small seedlings, and information is recorded in 2D. Recent developments in X-ray microtomography have facilitated 3D non-invasive measurement of small root systems grown in solid media, allowing angular distributions to be obtained in addition to numbers and length. However, because of the time taken to scan samples, only a small number can be screened (typically <10 per day, not including analysis time of the large spatial datasets generated) and, depending on sample size, limited resolution may mean that fine roots remain unresolved. Although agar plates allow differences between lines and genotypes to be discerned in young seedlings, the rank order may not be the same when the same materials are grown in solid media. For example, root length of dwarfing wheat (Triticum aestivum L.) lines grown on agar plates was increased by ~40% relative to wild-type and semi-dwarfing lines, but in a sandy loam soil under well-watered conditions it was decreased by 24-33%. Such differences in ranking suggest that significant soil environment-genotype interactions are occurring. Developments in instruments and software mean that a combination of high-throughput simple screens and more in-depth examination of root-soil interactions is becoming viable.
Abstract:
The Commission has proposed that a revised version of the present regime of direct payments should be rolled forward into the post-2013 CAP. There would be a limited redistribution of funds between Member States. Thirty per cent of the budget would be allocated to a new greening component, which would be problematic in the WTO. Non-active farmers would not qualify for aid; and payments would be capped. Special schemes would be introduced for small farmers, for young new entrants, and for disadvantaged regions.
Abstract:
A nitric oxide synthase (NOS)-like activity has been demonstrated in human red blood cells (RBCs), but doubts about its functional significance, isoform identity and disease relevance remain. Using flow cytometry in combination with the NO-imaging probe DAF-FM, we find that all blood cells form NO intracellularly, with a rank order of monocytes > neutrophils > lymphocytes > RBCs > platelets. The observation of a NO-related fluorescence within RBCs was unexpected given the abundance of the NO-scavenger oxyhemoglobin. Constitutive normoxic NO formation was abolished by NOS inhibition and intracellular NO scavenging, confirmed by laser-scanning microscopy and unequivocally validated by detection of the DAF-FM reaction product with NO using HPLC and LC-MS/MS. Employing immunoprecipitation, ESI-MS/MS-based peptide sequencing and enzymatic assay, we further demonstrate that human RBCs contain an endothelial NOS (eNOS) that converts L-[3H]arginine to L-[3H]citrulline in a Ca2+/calmodulin-dependent fashion. Moreover, in patients with coronary artery disease, red cell eNOS expression and activity are both lower than in age-matched healthy individuals and correlate with the degree of endothelial dysfunction. Thus, human RBCs constitutively produce NO under normoxic conditions via an active eNOS isoform, the activity of which is compromised in patients with coronary artery disease.
Abstract:
The application of forecast ensembles to probabilistic weather prediction has spurred considerable interest in their evaluation. Such ensembles are commonly interpreted as Monte Carlo ensembles meaning that the ensemble members are perceived as random draws from a distribution. Under this interpretation, a reasonable property to ask for is statistical consistency, which demands that the ensemble members and the verification behave like draws from the same distribution. A widely used technique to assess statistical consistency of a historical dataset is the rank histogram, which uses as a criterion the number of times that the verification falls between pairs of members of the ordered ensemble. Ensemble evaluation is rendered more specific by stratification, which means that ensembles that satisfy a certain condition (e.g., a certain meteorological regime) are evaluated separately. Fundamental relationships between Monte Carlo ensembles, their rank histograms, and random sampling from the probability simplex according to the Dirichlet distribution are pointed out. Furthermore, the possible benefits and complications of ensemble stratification are discussed. The main conclusion is that a stratified Monte Carlo ensemble might appear inconsistent with the verification even though the original (unstratified) ensemble is consistent. The apparent inconsistency is merely a result of stratification. Stratified rank histograms are thus not necessarily flat. This result is demonstrated by perfect ensemble simulations and supplemented by mathematical arguments. Possible methods to avoid or remove artifacts that stratification induces in the rank histogram are suggested.
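A minimal simulation (synthetic data, and an invented stratification rule) of the main point above, that stratifying a perfectly consistent ensemble can by itself produce non-flat rank histograms:

```python
# Demonstrate that stratifying a statistically consistent ensemble can
# by itself produce non-flat rank histograms. Synthetic data; the
# stratification rule (split on the ensemble mean) is an invented example.
import numpy as np

rng = np.random.default_rng(42)
n_cases, n_members = 20000, 10

# Perfect ensemble: members and verification are draws from the same
# distribution, so the unstratified rank histogram is flat.
ensemble = rng.normal(size=(n_cases, n_members))
verification = rng.normal(size=n_cases)

ranks = (ensemble < verification[:, None]).sum(axis=1)

def rank_hist(r):
    return np.bincount(r, minlength=n_members + 1) / len(r)

print("all cases:        ", np.round(rank_hist(ranks), 3))

# Stratify on a condition computed from the ensemble itself: in the
# high-mean stratum the members happen to sit high, so the verification
# tends to rank low, and the stratified histograms slope even though
# the ensemble is consistent.
stratum = ensemble.mean(axis=1) > 0
print("high-mean stratum:", np.round(rank_hist(ranks[stratum]), 3))
print("low-mean stratum: ", np.round(rank_hist(ranks[~stratum]), 3))
```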