15 results for 1333

in CentAUR: Central Archive, University of Reading - UK


Relevance:

10.00%

Publisher:

Relevance:

10.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
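
The accumulation step described in the abstract can be sketched in a few lines. This is not the paper's Fortran or REML code; it only illustrates how variance components from a nested survey sum, from the shortest lag upward, into a rough variogram. The stage lags and component values below are hypothetical.

```python
# Sketch: build a rough variogram by accumulating variance components
# from a nested, multi-stage survey (illustrative values, not real data).
import itertools

# Separating distances for each stage, shortest first (geometric progression).
lags = [1.0, 3.0, 9.0, 27.0]
# Estimated components of variance for each stage (e.g. from hierarchical ANOVA).
components = [0.8, 1.5, 2.1, 0.6]

# Accumulating the components from the shortest lag upward gives the
# semivariance estimate at each lag: a rough variogram for modest effort.
rough_variogram = list(itertools.accumulate(components))

for h, gamma in zip(lags, rough_variogram):
    print(f"lag {h:5.1f}: semivariance {gamma:.2f}")
```

The resulting estimates are monotone non-decreasing by construction, which is why this cheap design yields a usable first picture of the spatial structure.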

Relevance:

10.00%

Publisher:

Abstract:

Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting bloom occurrence in lakes and rivers. In this paper existing key models of cyanobacteria are reviewed, evaluated and classified. Two major groups emerge: deterministic mathematical and artificial neural network models. Mathematical models can be further subcategorized into those concerned with impounded water bodies and those concerned with rivers. Most existing models focus on a single aspect, such as growth or transport mechanisms, but a few models couple both.
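
A minimal sketch, in the spirit of the deterministic models reviewed but not any specific one, of what "coupling growth and transport" means: a single well-mixed water body in which biomass grows logistically and is lost by through-flow. All parameter values are hypothetical.

```python
# Minimal deterministic cyanobacteria model coupling a growth term and a
# transport (flushing) term; forward-Euler integration. Hypothetical values.

mu = 0.3        # specific growth rate (per day)
K = 100.0       # carrying capacity (mg chl-a / m^3)
flush = 0.05    # flushing rate Q/V, i.e. the transport loss (per day)
dt = 0.1        # time step (days)

b = 1.0         # initial biomass
for step in range(int(60 / dt)):           # simulate 60 days
    growth = mu * b * (1 - b / K)          # logistic growth term
    transport = flush * b                  # loss by through-flow
    b += dt * (growth - transport)

print(f"biomass after 60 days: {b:.1f}")
```

With these values the biomass settles near the reduced equilibrium K * (1 - flush / mu), showing how even a simple transport term lowers the bloom ceiling set by growth alone.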

Relevance:

10.00%

Publisher:

Abstract:

Mediterranean ecosystems rival tropical ecosystems in terms of plant biodiversity. The Mediterranean Basin (MB) itself hosts 25 000 plant species, half of which are endemic. This rich biodiversity and the complex biogeographical and political issues make conservation a difficult task in the region. Species, habitat, ecosystem and landscape approaches have been used to identify conservation targets at various scales: European, national, regional and local. Conservation decisions require adequate information at the species, community and habitat level. Nevertheless, despite recent efforts, this information is still incomplete and fragmented, and varies from one country to another. This paper reviews the biogeographic data, the problems arising from current conservation efforts, and methods for conservation assessment and prioritization using GIS. GIS has an important role to play in managing spatial and attribute information on the ecosystems of the MB and in facilitating interactions with existing databases. Where limited information is available, GIS can be used for prediction when directly or indirectly linked to externally built models. As well as being a predictive tool, GIS now incorporates spatial techniques that can improve the level of information, such as fuzzy logic and geostatistics, and techniques that provide insight into landscape change, such as 3D visualization. Where resources are limited, it can assist in identifying sites of conservation priority or in resolving environmental conflicts through scenario building. Although not a panacea, GIS is an invaluable tool for improving the understanding of Mediterranean ecosystems and their dynamics, and for practical management in a region under increasing pressure from human impact.

Relevance:

10.00%

Publisher:

Abstract:

Glycoxidation and lipoxidation reactions contribute to the chemical modification of proteins during the Maillard reaction. Reactive oxygen species, produced during the oxidation of sugars and lipids in these processes, irreversibly oxidize proteins. Methionine is particularly susceptible to oxidation, yielding the oxidation product methionine sulfoxide (MetSO). Here we describe a method for the analysis of MetSO using proteomic techniques. Using these techniques, we measured MetSO formation on the model protein RNase during aerobic incubations with glucose and arachidonate. We also evaluated the susceptibility of MetSO to reduction by NaBH4, a commonly used reductant in the analysis of Maillard reaction products.

Relevance:

10.00%

Publisher:

Abstract:

Proteomic tools, in particular mass spectrometry (MS), have advanced significantly in recent years, and the identification of proteins within complex mixtures is now a routine procedure. Quantitative methods of analysis are less well advanced and continue to develop. These include the use of stable isotope ratio approaches, isotopically labeled peptide standards, and nonlabeling methods. This paper summarizes the use of MS as a proteomics tool to identify and semiquantify proteins and their modified forms by using examples of relevance to the Maillard reaction. Finally, some challenges for the future are presented.

Relevance:

10.00%

Publisher:

Abstract:

The self-assembly of tripeptides based on the RGD cell adhesion motif is investigated. Two tripeptides containing the Fmoc [N-(fluorenyl)-9-methoxycarbonyl] aromatic unit were synthesized: Fmoc-RGD and a control peptide containing a scrambled sequence, Fmoc-GRD. The Fmoc unit is used to control self-assembly via aromatic stacking interactions. The self-assembly and hydrogelation properties of the two Fmoc-tripeptides are compared. Both form well-defined amyloid fibrils (as shown by cryo-TEM and SAXS) with β-sheet features in their circular dichroism and FTIR spectra. Both peptides form self-supporting hydrogels, the dynamic shear modulus of which was measured. Preliminary cell culture experiments reveal that Fmoc-RGD, but not Fmoc-GRD, can be used as a support for bovine fibroblasts, consistent with the incorporation of the cell adhesion motif in the former peptide.

Relevance:

10.00%

Publisher:

Abstract:

This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
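
Optimal Brain Damage (LeCun et al.) ranks each weight w_k by the saliency s_k = h_kk * w_k^2 / 2, where h_kk is the corresponding diagonal entry of the Hessian of the loss, and removes the least salient weights. The sketch below illustrates only that ranking step, with hypothetical weights and an assumed, already-estimated Hessian diagonal; it is not the paper's model.

```python
# Sketch of Optimal Brain Damage (OBD) saliency ranking for pruning.
# Weights and Hessian diagonal are hypothetical illustration values.

weights = [0.9, -0.05, 0.4, 0.01, -0.7]
hessian_diag = [0.5, 0.6, 0.2, 0.8, 0.3]   # assumed already estimated

# OBD saliency: second-order estimate of the loss increase if w_k is removed.
saliency = [h * w ** 2 / 2 for w, h in zip(weights, hessian_diag)]

# Prune the n weights with the smallest saliency by setting them to zero.
n_prune = 2
prune_idx = sorted(range(len(weights)), key=lambda k: saliency[k])[:n_prune]
pruned = [0.0 if k in prune_idx else w for k, w in enumerate(weights)]

print("saliencies:", [round(s, 5) for s in saliency])
print("pruned weights:", pruned)
```

Note that the smallest weights need not be the least salient: a small weight with large curvature can matter more than a larger weight on a flat direction, which is the point of using the Hessian rather than magnitude alone.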

Relevance:

10.00%

Publisher:

Abstract:

Aim: To analyse the influence of serving method on compliance with and consumption of nutritional supplement drinks in older adults with cognitive impairment. Background: Oral nutritional supplement drinks have positive benefits in improving the nutritional status of undernourished elderly people, leading to weight gain. However, consumption of these drinks is low, which limits their effectiveness. Design: This study was a non-blinded randomised controlled trial in which participants consumed nutritional supplement drinks either from a glass/beaker or through a straw inserted directly into the container. Method: Participants with longstanding cognitive impairment were recruited from nursing homes (n=31) and hospitals (n=14) and randomised to a serving method. Nursing and care staff were instructed to give the supplement drinks three times per day on alternate days over a week by the allocated serving method. The researcher weighed the amount of supplement drink remaining after consumption. Data were collected over 12 months in 2011-2012. Results: 45 people participated in this study, with a mean age of 86.7 (SD 7.5) years. After randomisation there was no significant difference between the baseline characteristics of the two groups. Participants randomised to consume nutritional drinks from a glass/beaker drank significantly more than those who consumed them via a straw inserted directly into the container. However, supplements allocated to be given in a glass/beaker were more frequently omitted. Conclusion: Nutritional supplement drinks should be given in a glass or beaker to people with dementia who are able to feed themselves, if staffing resources allow (NIHR CSP ref 31101).

Relevance:

10.00%

Publisher:

Abstract:

A realistic representation of North Atlantic tropical cyclone tracks is crucial because it allows, for example, potential changes in US landfalling systems to be explained. Here we present a tentative study examining the ability of recent climate models to represent North Atlantic tropical cyclone tracks. Tracks from two types of climate models are evaluated: explicit tracks are obtained from tropical cyclones simulated in regional or global climate models with moderate to high horizontal resolution (1° to 0.25°), and downscaled tracks are obtained using a downscaling technique with large-scale environmental fields from a subset of these models. For both configurations, tracks are objectively separated into four groups using a cluster technique, leading to a zonal and a meridional separation of the tracks. The meridional separation largely captures the separation between deep tropical and sub-tropical, hybrid or baroclinic cyclones, while the zonal separation segregates Gulf of Mexico and Cape Verde storms. The properties of the tracks' seasonality, intensity and power dissipation index in each cluster are documented for both configurations. Our results show that except for the seasonality, the downscaled tracks better capture the observed characteristics of the clusters. We also use three different idealized scenarios to examine the possible future changes of tropical cyclone tracks under 1) warming sea surface temperature, 2) increasing carbon dioxide, and 3) a combination of the two. The response to each scenario is highly variable depending on the simulation considered. Finally, we examine the role of each cluster in these future changes and find no preponderant contribution of any single cluster over the others.
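
The paper's cluster technique for separating tracks is more elaborate than this, but the idea can be sketched: reduce each track to a summary feature (here its mean longitude/latitude) and group the features. Below, a plain k-means with k=2 recovers a zonal separation between synthetic Gulf of Mexico and Cape Verde mean-track positions; all coordinates are invented for illustration.

```python
# Sketch: zonal separation of cyclone tracks via k-means on mean positions.
# Synthetic data; not the clustering method or tracks used in the paper.

def kmeans(points, k, iters=20):
    centroids = points[:k]                        # simple initialisation
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                          # assign to nearest centroid
            j = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                            + (p[1] - centroids[c][1]) ** 2)
            groups[j].append(p)
        centroids = [                             # recompute centroids
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else centroids[j]
            for j, g in enumerate(groups)
        ]
    return centroids, groups

# Hypothetical mean-track positions (lon, lat).
tracks = [(-92.0, 25.0), (-90.5, 24.0), (-94.0, 26.5),   # Gulf of Mexico
          (-30.0, 14.0), (-28.5, 13.0), (-33.0, 15.5)]   # Cape Verde
centroids, groups = kmeans(tracks, k=2)
print("group sizes:", [len(g) for g in groups])
```

With four clusters and richer track features, the same idea yields the zonal plus meridional separation described above.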

Relevance:

10.00%

Publisher:

Abstract:

This paper finds preference reversals in measurements of ambiguity aversion, even if psychological and informational circumstances are kept constant. The reversals are of a fundamentally different nature than the reversals found before because they cannot be explained by context-dependent weightings of attributes. We offer an explanation based on Sugden's random-reference theory, with different elicitation methods generating different random reference points. Measurements of ambiguity aversion that use willingness to pay are then confounded by loss aversion and hence overestimate ambiguity aversion.