137 results for quantifying heteroskedasticity


Relevance:

10.00%

Publisher:

Abstract:

This paper presents a reappraisal of the blood clotting response (BCR) tests for anticoagulant rodenticides, and proposes a standardised methodology for identifying and quantifying physiological resistance in populations of rodent species. The standardisation is based on the International Normalised Ratio, which is standardised against a WHO international reference preparation of thromboplastin, and allows comparison of data obtained using different thromboplastin reagents. The methodology is statistically sound, being based on the 50% response, and has been validated against the Norway rat (Rattus norvegicus) and the house mouse (Mus domesticus). Susceptibility baseline data are presented for warfarin, diphacinone, chlorophacinone and coumatetralyl against the Norway rat, and for bromadiolone, difenacoum, difethialone, flocoumafen and brodifacoum against the Norway rat and the house mouse. A 'test dose' of twice the ED50 can be used for initial identification of resistance, and will provide a similar level of information to previously published methods. Higher multiples of the ED50 can be used to assess the resistance factor, and to predict the likely impact on field control.
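
As a rough illustration of the quantities described above, the sketch below computes a 2× ED50 'test dose' and a resistance factor as the ratio of a test population's ED50 to the susceptible baseline; the function names and the mg/kg values are invented for demonstration and are not the paper's baseline data.

```python
# Hypothetical illustration of the ED50-based quantities described above.
# Values are invented for demonstration; the paper's actual baseline data differ.

def test_dose(ed50_susceptible: float, multiple: float = 2.0) -> float:
    """Discriminating 'test dose' as a multiple of the susceptible-strain ED50 (mg/kg)."""
    return multiple * ed50_susceptible

def resistance_factor(ed50_test_population: float, ed50_susceptible: float) -> float:
    """Ratio of the test population's ED50 to the susceptible baseline ED50."""
    return ed50_test_population / ed50_susceptible

if __name__ == "__main__":
    baseline_ed50 = 1.5   # mg/kg, invented susceptible baseline
    field_ed50 = 9.0      # mg/kg, invented resistant population
    print(f"Test dose:         {test_dose(baseline_ed50):.1f} mg/kg")
    print(f"Resistance factor: {resistance_factor(field_ed50, baseline_ed50):.1f}x")
```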

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Chronic fatigue syndrome (CFS) is an increasing medical phenomenon of unknown aetiology leading to high levels of chronic morbidity. Of the many hypotheses that purport to explain this disease, immune system activation, as a central feature, has remained prominent but unsubstantiated. Supporting this, a number of important cytokines have previously been shown to be over-expressed in disease subjects. The diagnosis of CFS is highly problematic since no biological markers specific to this disease have been identified. The discovery of genes relating to this condition is an important goal in seeking to correctly categorize and understand this complex syndrome. OBJECTIVE: The aim of this study was to screen for changes in gene expression in the lymphocytes of CFS patients. METHODS: 'Differential Display' is a method for comparing mRNA populations for the induction or suppression of genes. In this technique, mRNA populations from control and test subjects can be 'displayed' by gel electrophoresis and screened for differing banding patterns. These differences are indicative of altered gene expression between samples, and the genes that correspond to these bands can be cloned and identified. Differential display has been used to compare expression levels between four control subjects and seven CFS patients. RESULTS: Twelve short expressed sequence tags have been identified that were over-expressed in lymphocytes from CFS patients. Two of these correspond to cathepsin C and MAIL1 - genes known to be upregulated in activated lymphocytes. The expression levels of seven of the differentially displayed sequences have been verified by quantifying the relative levels of these transcripts using TaqMan quantitative PCR. CONCLUSION: Taken as a whole, the identification of novel gene tags up-regulated in CFS patients adds weight to the idea that CFS is a disease characterized by subtle changes in the immune system.
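
The abstract does not give the qPCR calculation used; as an illustration only, the sketch below applies the standard 2^-ΔΔCt relative-quantification formula to hypothetical threshold-cycle (Ct) values, which is one common way of expressing TaqMan results.

```python
# Illustrative relative quantification by the standard 2^-ΔΔCt method.
# The abstract does not specify the exact calculation; Ct values below are invented.

def fold_change(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Fold change of a target transcript in cases vs controls, normalised to a reference gene."""
    delta_case = ct_target_case - ct_ref_case
    delta_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(delta_case - delta_ctrl)

print(fold_change(24.1, 18.0, 26.5, 18.2))  # > 1 indicates over-expression in cases
```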

Relevance:

10.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to focus on the Fédération Internationale des Ingénieurs-Conseils (FIDIC) White Book standard form of building contract. It tracks the changes to this contract over its four editions, and seeks to identify their underlying causes. Design/methodology/approach – The changes made to the White Book are quantified using a specific type of quantitative content analysis. The amended clauses are then examined to understand the nature of the changes made. Findings – The length of the contract increased by 34 per cent between 1990 and 2006. A large proportion of the overall increase can be attributed to the clauses dealing with “conflict of interest/corruption” and “dispute resolution”. In both instances, the FIDIC drafting committees have responded to international developments to discourage corruption, and to encourage the use of alternative dispute resolution. Between 1998 and 2006, the average length of the sentences increased slightly, raising the question of whether long sentences are easily understood by users of contracts. Research limitations/implications – Quantification of text appears to be particularly useful for the analysis of documents which are regularly updated because changes can be clearly identified and the length of sentences can be determined, leading to conclusions about the readability of the text. However, caution is needed because changes of great relevance can be made to contract clauses without actually affecting their length. Practical implications – The paper will be instructive for contract drafters and informative for users of FIDIC's White Book. Originality/value – Quantifying text has rarely been used for standard-form contracts in the field of construction.
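
As a hedged sketch of this kind of quantitative content analysis, the snippet below counts words and average sentence length for two hypothetical clause versions; the counting rules and the clause wording are illustrative assumptions, not the paper's coding scheme or actual FIDIC text.

```python
# Minimal sketch of quantitative content analysis of contract text: word counts per
# clause and mean sentence length. A simple regex word count and sentence split are
# used; the paper's actual counting rules are not specified in the abstract.
import re

def word_count(text: str) -> int:
    return len(re.findall(r"\b\w+\b", text))

def mean_sentence_length(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return word_count(text) / len(sentences) if sentences else 0.0

clause_1990 = "The Consultant shall exercise reasonable skill, care and diligence."
clause_2006 = ("The Consultant shall exercise reasonable skill, care and diligence "
               "in the performance of his obligations under the Agreement.")
growth = 100 * (word_count(clause_2006) - word_count(clause_1990)) / word_count(clause_1990)
print(f"Clause length growth: {growth:.0f}%; mean sentence length (2006 version): "
      f"{mean_sentence_length(clause_2006):.1f} words")
```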

Relevance:

10.00%

Publisher:

Abstract:

Aims: To develop a quantitative equation [prebiotic index (PI)] to aid the analysis of prebiotic fermentation of commercially available and novel prebiotic carbohydrates in vitro, using previously published fermentation data. Methods: The PI equation is based on the changes in key bacterial groups during fermentation. The bacterial groups incorporated into this PI equation were bifidobacteria, lactobacilli, clostridia and bacteroides. The changes in these bacterial groups from previous studies were entered into the PI equation in order to determine a quantitative PI score. PI scores were then compared with the qualitative conclusions made in these publications. In general the PI scores agreed with the qualitative conclusions drawn and provided a quantitative measure. Conclusions: The PI allows the magnitude of prebiotic effects to be quantified rather than evaluations being solely qualitative. Significance and Impact of the Study: The PI equation may be of great use in quantifying prebiotic effects in vitro. It is expected that this will facilitate more rational food product development and the development of more potent prebiotics with activity at lower doses.
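
The abstract does not reproduce the PI equation itself; the sketch below shows one plausible form consistent with the description, in which growth of bifidobacteria and lactobacilli scores positively and growth of clostridia and bacteroides negatively, each relative to total bacterial growth. The published equation may differ in detail, and the counts used are invented.

```python
# A hedged sketch of a prebiotic index of the general form described above: increases in
# bifidobacteria and lactobacilli score positively, increases in clostridia and
# bacteroides negatively, each relative to total bacterial growth during fermentation.
# The exact published equation may weight or define the terms differently.

def prebiotic_index(counts_0, counts_t):
    """counts_0 / counts_t: dicts of viable counts (cells/mL) at inoculation and at sample time."""
    total_0 = sum(counts_0.values())
    total_t = sum(counts_t.values())
    def rel(group):  # group growth relative to total bacterial growth
        return (counts_t[group] / counts_0[group]) / (total_t / total_0)
    return rel("bifidobacteria") + rel("lactobacilli") - rel("clostridia") - rel("bacteroides")

before = {"bifidobacteria": 1.0e8, "lactobacilli": 1.6e7, "clostridia": 3.2e7, "bacteroides": 2.0e8}
after  = {"bifidobacteria": 1.3e9, "lactobacilli": 4.0e7, "clostridia": 2.5e7, "bacteroides": 1.6e8}
print(f"PI = {prebiotic_index(before, after):.2f}")  # > 0 suggests a prebiotic effect
```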

Relevance:

10.00%

Publisher:

Abstract:

Quantitative analysis by mass spectrometry (MS) is a major challenge in proteomics as the correlation between analyte concentration and signal intensity is often poor due to varying ionisation efficiencies in the presence of molecular competitors. However, relative quantitation methods that utilise differential stable isotope labelling and mass spectrometric detection are available. Many drawbacks inherent to chemical labelling methods (ICAT, iTRAQ) can be overcome by metabolic labelling with amino acids containing stable isotopes (e.g. 13C and/or 15N) in methods such as Stable Isotope Labelling with Amino acids in Cell culture (SILAC). SILAC has also been used for labelling of proteins in plant cell cultures (1) but is not suitable for whole plant labelling. Plants are usually autotrophic (fixing carbon from atmospheric CO2) and, thus, labelling with carbon isotopes becomes impractical. In addition, SILAC is expensive. Recently, Arabidopsis cell cultures were labelled with 15N in a medium containing nitrate as sole nitrogen source. This was shown to be suitable for quantifying proteins and nitrogen-containing metabolites from this cell culture (2,3). Labelling whole plants, however, offers the advantage of studying quantitatively the response to stimulation or disease of a whole multicellular organism or multi-organism systems at the molecular level. Furthermore, plant metabolism enables the use of inexpensive labelling media without introducing additional stress to the organism. And finally, hydroponics is ideal to undertake metabolic labelling under extremely well-controlled conditions. We demonstrate the suitability of metabolic 15N hydroponic isotope labelling of entire plants (HILEP) for relative quantitative proteomic analysis by mass spectrometry. To evaluate this methodology, Arabidopsis plants were grown hydroponically in 14N and 15N media and subjected to oxidative stress.
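
A minimal sketch of how HILEP-style relative quantitation might proceed is given below: the ratio of light (14N) to heavy (15N) peptide peak intensities estimates the relative protein abundance in the two mixed samples. The intensities and the simple mean over peptides are illustrative assumptions, not the authors' processing pipeline.

```python
# Minimal sketch of relative quantitation from a 14N/15N HILEP pair: the ratio of light
# to heavy peptide peak intensities estimates the relative abundance of the parent
# protein in the two mixed samples. Intensities below are invented.

def relative_abundance(intensity_14n: float, intensity_15n: float) -> float:
    """Light/heavy intensity ratio for one peptide pair."""
    return intensity_14n / intensity_15n

peptide_pairs = [(4.2e6, 2.1e6), (9.8e5, 4.6e5), (1.3e6, 7.1e5)]  # (14N, 15N) intensities
ratios = [relative_abundance(light, heavy) for light, heavy in peptide_pairs]
protein_ratio = sum(ratios) / len(ratios)  # simple mean over the protein's peptides
print(f"Estimated 14N:15N protein ratio = {protein_ratio:.2f}")
```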

Relevance:

10.00%

Publisher:

Abstract:

A theoretical framework is developed for the evolution of baroclinic waves with latent heat release parameterized in terms of vertical velocity. Both wave–conditional instability of the second kind (CISK) and large-scale rain approaches are included. The new quasigeostrophic framework covers evolution from general initial conditions on zonal flows with vertical shear, planetary vorticity gradient, a lower boundary, and a tropopause. The formulation is given completely in terms of potential vorticity, enabling the partition of perturbations into Rossby wave components, just as for the dry problem. Both modal and nonmodal development can be understood to a good approximation in terms of propagation and interaction between these components alone. The key change with moisture is that growing normal modes are described in terms of four counterpropagating Rossby wave (CRW) components rather than two. Moist CRWs exist above and below the maximum in latent heating, in addition to the upper- and lower-level CRWs of dry theory. Four classifications of baroclinic development are defined by quantifying the strength of interaction between the four components and identifying the dominant pairs, which range from essentially dry instability to instability in the limit of strong heating far from boundaries, with type-C cyclogenesis and diabatic Rossby waves being intermediate types. General initial conditions must also include passively advected residual PV, as in the dry problem.

Relevance:

10.00%

Publisher:

Abstract:

The urban heat island (UHI) is a well-known effect of urbanisation and is particularly important in world megacities. Overheating in such cities is expected to be exacerbated in the future as a result of further urban growth and climate change. Demonstrating and quantifying the impact of individual design interventions on the UHI is currently difficult using available software tools. The tools developed in the LUCID (‘The Development of a Local Urban Climate Model and its Application to the Intelligent Design of Cities’) research project will enable the related impacts to be better understood, quantified and addressed. This article summarises the relevant literature and reports on the ongoing work of the project.

Relevance:

10.00%

Publisher:

Abstract:

A significant challenge in the prediction of climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified reflecting differing performance in different situations. Our aim was to quantify uncertainties associated with the application of 2 complementary modelling techniques. Generalised linear mixed models (GLMM) and generalised additive mixed models (GAMM) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After having covaried out other factors, average variation in predicted values of Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition and the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m2 unit area) but also highly uncertain. Peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, Brecon Beacons and the western Lake District.
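
As a hedged illustration of the variance-partitioning idea, the sketch below propagates parameter uncertainty from a fitted model by drawing coefficients from the estimated covariance matrix and compares that spread with the variation in mean predictions across sites; a simple ordinary least squares model stands in for the GLMM/GAMM, and all data are simulated.

```python
# Hedged sketch of one way to attribute prediction variance to niche-model parameter
# uncertainty: sample coefficients from the fitted covariance matrix and compare the
# spread of predictions across draws with the spread across sites. A plain OLS model
# stands in for the paper's GLMM/GAMM; this is not the study's code or data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
# invented covariates (e.g. climate, N deposition) and a Sphagnum-cover-like response
X = sm.add_constant(rng.normal(size=(200, 2)))
y = X @ np.array([0.2, -0.5, 0.3]) + rng.normal(scale=0.4, size=200)
fit = sm.OLS(y, X).fit()

draws = rng.multivariate_normal(fit.params, fit.cov_params(), size=1000)
preds = draws @ X.T                       # (draw, site) predictions
var_params = preds.var(axis=0).mean()     # parameter-uncertainty component
var_sites = preds.mean(axis=0).var()      # between-site component
total = var_params + var_sites
print(f"parameter uncertainty: {100 * var_params / total:.0f}%, "
      f"site variation: {100 * var_sites / total:.0f}%")
```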

Relevance:

10.00%

Publisher:

Abstract:

Global temperatures are expected to rise by between 1.1 and 6.4°C this century, depending, to a large extent, on the amount of carbon we emit to the atmosphere from now onwards. This warming is expected to have very negative effects on many peoples and ecosystems and, therefore, minimising our carbon emissions is a priority. Buildings are estimated to be responsible for around 50% of carbon emissions in the UK. Potential reductions involve both operational emissions, produced during use, and embodied emissions, produced during manufacture of materials and components, and during construction, refurbishments and demolition. To date, the major effort has focused on reducing the apparently larger operational element, which is more readily quantifiable and for which reduction measures are relatively straightforward to identify and implement. Various studies have compared the magnitude of embodied and operational emissions, but have shown considerable variation in the relative values. This illustrates the difficulties in quantifying embodied emissions, as doing so requires a detailed knowledge of the processes involved in the different life cycle phases, and requires the use of consistent system boundaries. However, other studies have established the interaction between operational and embodied emissions, which demonstrates the importance of considering both elements together in order to maximise potential reductions. This is borne out in statements from both the Intergovernmental Panel on Climate Change and The Low Carbon Construction Innovation and Growth Team of the UK Government. In terms of meeting the 2020 and 2050 timeframes for carbon reductions it appears to be equally, if not more, important to consider early embodied carbon reductions, rather than just future operational reductions. Future decarbonisation of energy supply and more efficient lighting and M&E equipment installed in future refits is likely to significantly reduce operational emissions, lending further weight to this argument. A method of discounting to evaluate the present value of future carbon emissions would allow more realistic comparisons to be made on the relative importance of the embodied and operational elements. This paper describes the results of case studies on carbon emissions over the whole lifecycle of three buildings in the UK, compares four available software packages for determining embodied carbon and suggests a method of carbon discounting to obtain present values for future emissions. These form the initial stages of a research project aimed at producing information on embodied carbon for different types of building, components and forms of construction, in a simplified form, which can be readily used by building designers in optimising building design in terms of minimising overall carbon emissions. Keywords: Embodied carbon; carbon emission; building; operational carbon.
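
The discounting method itself is not specified in the abstract; the sketch below shows a minimal present-value calculation of the kind proposed, in which future annual operational emissions are discounted so they can be compared with up-front embodied carbon. The discount rate and emission figures are illustrative assumptions.

```python
# Sketch of carbon discounting: future annual emissions are converted to a present value
# so embodied (year-0) and operational (future) carbon can be compared on one basis.
# The discount rate, building life and emission figures below are illustrative only.

def present_value_carbon(annual_emissions, rate=0.03):
    """Present value of a list of annual emissions (kgCO2e), index 0 = year of construction."""
    return sum(e / (1 + rate) ** t for t, e in enumerate(annual_emissions))

embodied_now = 500_000.0          # kgCO2e emitted up front (invented)
operational = [20_000.0] * 60     # kgCO2e per year over an assumed 60-year life
print(f"Embodied (PV):    {embodied_now:,.0f} kgCO2e")
print(f"Operational (PV): {present_value_carbon(operational):,.0f} kgCO2e")
```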

Relevance:

10.00%

Publisher:

Abstract:

Energetic constraints on precipitation are useful for understanding the response of the hydrological cycle to ongoing climate change, its response to possible geoengineering schemes, and the limits on precipitation in very warm climates of the past. Much recent progress has been made in quantifying the different forcings and feedbacks on precipitation and in understanding how the transient responses of precipitation and temperature might differ qualitatively. Here, we introduce the basic ideas and review recent progress. We also examine the extent to which energetic constraints on precipitation may be viewed as radiative constraints and the extent to which they are confirmed by available observations. Challenges remain, including the need to better demonstrate the link between energetics and precipitation in observations and to better understand energetic constraints on precipitation at sub-global length scales.
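
A back-of-envelope version of the energetic constraint is sketched below: in a steady state, latent heating from global-mean precipitation (L·P) roughly balances net atmospheric radiative cooling minus the surface sensible heat flux. The numbers used are round illustrative values, not observational estimates.

```python
# Back-of-envelope illustration of the energetic constraint on precipitation: in a steady
# state the latent heating L*P roughly balances the net radiative cooling of the
# atmosphere minus the surface sensible heat flux. Global-mean numbers are illustrative.

L_V = 2.5e6            # latent heat of vaporisation, J/kg
RHO_W = 1000.0         # density of liquid water, kg/m3
SECONDS_PER_DAY = 86400.0

atmos_radiative_cooling = 105.0   # W/m2 (illustrative)
sensible_heat_flux = 20.0         # W/m2 (illustrative)

latent_heating = atmos_radiative_cooling - sensible_heat_flux   # W/m2
precip_rate = latent_heating / (L_V * RHO_W)                    # m/s of liquid water
print(f"Implied precipitation ~ {precip_rate * SECONDS_PER_DAY * 1000:.1f} mm/day")
```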

Relevance:

10.00%

Publisher:

Abstract:

There is intense scientific and public interest in the Intergovernmental Panel on Climate Change (IPCC) projections of sea level for the twenty-first century and beyond. The Fourth Assessment Report (AR4) projections, obtained by applying standard methods to the results of the World Climate Research Programme Coupled Model Experiment, include estimates of ocean thermal expansion, the melting of glaciers and ice caps (G&ICs), increased melting of the Greenland Ice Sheet, and increased precipitation over Greenland and Antarctica, partially offsetting other contributions. The AR4 recognized the potential for a rapid dynamic ice sheet response but robust methods for quantifying it were not available. Illustrative scenarios suggested additional sea level rise on the order of 10 to 20 cm or more, giving a wide range in the global averaged projections of about 20 to 80 cm by 2100. Currently, sea level is rising at a rate near the upper end of these projections. Since publication of the AR4 in 2007, biases in historical ocean temperature observations have been identified and significantly reduced, resulting in improved estimates of ocean thermal expansion. Models that include all climate forcings are in good agreement with these improved observations and indicate the importance of stratospheric aerosol loadings from volcanic eruptions. Estimates of the volumes of G&ICs and their contributions to sea level rise have improved. Results from recent (but possibly incomplete) efforts to develop improved ice sheet models should be available for the 2013 IPCC projections. Improved understanding of sea level rise is paving the way for using observations to constrain projections. Understanding of the regional variations in sea level change as a result of changes in ocean properties, wind-stress patterns, and heat and freshwater inputs into the ocean is improving. Recently, estimates of sea level changes resulting from changes in Earth's gravitational field and the solid Earth response to changes in surface loading have been included in regional projections. While potentially valuable, semi-empirical models have important limitations, and their projections should be treated with caution.

Relevance:

10.00%

Publisher:

Abstract:

The adaptive thermal comfort theory considers people as active rather than passive recipients in response to ambient physical thermal stimuli, in contrast with conventional, heat-balance-based, thermal comfort theory. Occupants actively interact with the environments they occupy by means of utilizing adaptations in terms of physiological, behavioural and psychological dimensions to achieve ‘real world’ thermal comfort. This paper introduces a method of quantifying the physiological, behavioural and psychological portions of the adaptation process by using the analytic hierarchy process (AHP) based on the case studies conducted in the UK and China. Apart from three categories of adaptations which are viewed as criteria, six possible alternatives are considered: physiological indices/health status, the indoor environment, the outdoor environment, personal physical factors, environmental control and thermal expectation. With the AHP technique, all the above-mentioned criteria, factors and corresponding elements are arranged in a hierarchy tree and quantified by using a series of pair-wise judgements. A sensitivity analysis is carried out to improve the quality of these results. The proposed quantitative weighting method provides researchers with opportunities to better understand the adaptive mechanisms and reveal the significance of each category for the achievement of adaptive thermal comfort.
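
A minimal sketch of the AHP step is given below: a pairwise-comparison matrix over the three adaptation categories is reduced to priority weights via the principal eigenvector, and a consistency ratio checks the coherence of the judgements. The comparison values are invented, not the study's survey data.

```python
# Sketch of the AHP weighting step: a pairwise-comparison matrix is reduced to priority
# weights (principal eigenvector) and a consistency ratio checks the judgements.
# The comparison values below are invented, not the study's data.
import numpy as np

def ahp_weights(A: np.ndarray):
    """Principal-eigenvector priority weights and consistency ratio for a pairwise matrix."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
    return w, ci / ri                             # weights, consistency ratio (< 0.1 acceptable)

# Invented judgements: physiological vs behavioural vs psychological adaptation
A = np.array([[1.0, 1 / 3, 1 / 2],
              [3.0, 1.0,   2.0],
              [2.0, 1 / 2, 1.0]])
weights, cr = ahp_weights(A)
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```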

Relevance:

10.00%

Publisher:

Abstract:

Canopy leaf area index (LAI), defined as the single-sided leaf area per unit ground area, is a quantitative measure of canopy foliar area. LAI is a controlling biophysical property of vegetation function, and quantifying LAI is thus vital for understanding energy, carbon and water fluxes between the land surface and the atmosphere. LAI is routinely available from Earth Observation (EO) instruments such as MODIS. However, EO-derived estimates of LAI require validation before they are utilised by the ecosystem modelling community. Previous validation work on the MODIS collection 4 (c4) product suggested considerable error especially in forested biomes, and as a result significant modification of the MODIS LAI algorithm has been made for the most recent collection 5 (c5). As a result of these changes the current MODIS LAI product has not been widely validated. We present a validation of the MODIS c5 LAI product over a 121 km2 area of mixed coniferous forest in Oregon, USA, based on detailed ground measurements which we have upscaled using high resolution EO data. Our analysis suggests that c5 shows a much more realistic temporal LAI dynamic than c4 for the site we examined. We find improved spatial consistency between the MODIS c5 LAI product and upscaled in situ measurements. However, results also suggest that the c5 LAI product underestimates the upper range of upscaled in situ LAI measurements.
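
As an illustration of the kind of comparison such a validation relies on, the sketch below computes bias and RMSE between MODIS LAI retrievals and upscaled in situ LAI for matching pixels; the values are invented, and a negative bias corresponds to the underestimation noted above.

```python
# Sketch of a validation comparison: bias and RMSE between satellite LAI retrievals and
# upscaled in situ LAI over matching pixels. All values below are invented.
import math

modis_lai   = [3.1, 4.0, 2.6, 5.2, 4.8, 3.9]   # satellite retrievals (invented)
in_situ_lai = [3.5, 4.6, 2.9, 6.1, 5.4, 4.2]   # upscaled ground measurements (invented)

errors = [m - o for m, o in zip(modis_lai, in_situ_lai)]
bias = sum(errors) / len(errors)
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"bias = {bias:.2f} LAI units, RMSE = {rmse:.2f}")  # negative bias = underestimation
```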

Relevance:

10.00%

Publisher:

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots) would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
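
A minimal Metropolis sketch in the spirit of the algorithms compared in REFLEX is given below: one parameter of a toy respiration-style flux model is estimated against noisy synthetic observations, and a 90% credible interval is reported. The toy model, prior bound and noise level are illustrative assumptions, not the REFLEX carbon model.

```python
# Minimal Metropolis sketch: estimate one parameter of a toy flux model against noisy
# synthetic observations. The toy model and settings are illustrative only.
import math
import random

def toy_nee(basal_resp, t):
    """Toy 'model': respiration-dominated NEE as a function of temperature t (degC)."""
    return basal_resp * math.exp(0.07 * t)

temps = [5, 10, 15, 20, 25]
obs = [toy_nee(2.0, t) + random.gauss(0, 0.3) for t in temps]   # synthetic truth = 2.0
sigma = 0.3

def log_likelihood(p):
    return -sum((toy_nee(p, t) - o) ** 2 for t, o in zip(temps, obs)) / (2 * sigma ** 2)

current, samples = 1.0, []
for _ in range(20000):
    proposal = current + random.gauss(0, 0.1)
    if proposal > 0:  # flat positive prior
        delta = log_likelihood(proposal) - log_likelihood(current)
        if random.random() < math.exp(min(0.0, delta)):
            current = proposal
    samples.append(current)

post = sorted(samples[5000:])  # discard burn-in
print(f"median = {post[len(post) // 2]:.2f}, 90% CI = "
      f"[{post[int(0.05 * len(post))]:.2f}, {post[int(0.95 * len(post))]:.2f}]")
```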

Relevance:

10.00%

Publisher:

Abstract:

There has been considerable interest in the climate impact of trends in stratospheric water vapor (SWV). However, the representation of the radiative properties of water vapor under stratospheric conditions remains poorly constrained across different radiation codes. This study examines the sensitivity of a detailed line-by-line (LBL) code, a Malkmus narrow-band model and two broadband GCM radiation codes to a uniform perturbation in SWV in the longwave spectral region. The choice of sampling rate in wave number space (Δν) in the LBL code is shown to be important for calculations of the instantaneous change in heating rate (ΔQ) and the instantaneous longwave radiative forcing (ΔFtrop). ΔQ varies by up to 50% for values of Δν spanning 5 orders of magnitude, and ΔFtrop varies by up to 10%. In the three less detailed codes, ΔQ differs by up to 45% at 100 hPa and 50% at 1 hPa compared to a LBL calculation. This causes differences of up to 70% in the equilibrium fixed dynamical heating temperature change due to the SWV perturbation. The stratosphere-adjusted radiative forcing differs by up to 96% across the less detailed codes. The results highlight an important source of uncertainty in quantifying and modeling the links between SWV trends and climate.
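
A toy illustration of the Δν sensitivity is sketched below: the band-averaged transmission of a single Lorentz line changes with the wavenumber sampling step, analogous to the sampling-rate dependence reported for the LBL calculations. The line parameters and absorber amount are arbitrary.

```python
# Toy illustration of sensitivity to the wavenumber sampling step: the transmission of a
# single Lorentz line averaged over a 10 cm-1 interval changes as the sampling is coarsened.
# Line strength, half-width and absorber amount are arbitrary.
import math

def lorentz_k(nu, nu0=5.0, strength=1.0, half_width=0.05):
    """Lorentz line absorption coefficient at wavenumber nu (cm-1)."""
    return strength * half_width / (math.pi * ((nu - nu0) ** 2 + half_width ** 2))

def band_transmission(dnu, absorber_amount=0.1, band=(0.0, 10.0)):
    """Band-mean transmission sampled at midpoints of intervals of width dnu (cm-1)."""
    n = int((band[1] - band[0]) / dnu)
    ts = [math.exp(-absorber_amount * lorentz_k(band[0] + (i + 0.5) * dnu)) for i in range(n)]
    return sum(ts) / n

for dnu in (1.0, 0.1, 0.01, 0.001):
    print(f"dnu = {dnu:6.3f} cm-1 -> band transmission = {band_transmission(dnu):.4f}")
```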