Abstract:
As the ideal method of assessing the nutritive value of a feedstuff, namely offering it to the appropriate class of animal and recording the production response obtained, is neither practical nor cost-effective, a range of feed evaluation techniques has been developed. Each of these balances some degree of compromise with the practical situation against data generation. However, because animal-feed interactions affect nutritive value over and above feed composition, the target animal remains the ultimate arbiter of nutritional value. In this review, current in vitro feed evaluation techniques are examined according to their degree of animal-feed interaction. Chemical analysis provides absolute values and therefore differs from the majority of in vitro methods, which simply rank feeds. With no host animal involvement, however, estimates of nutritional value must be inferred by statistical association. In addition, given the costs involved, the practical value of many of the analyses conducted should be reviewed. The in sacco technique has contributed substantially both to the understanding of rumen microbial degradative processes and to the rapid evaluation of feeds, especially in developing countries. However, the numerous shortfalls of the technique (common to many in vitro methods), the desire to eliminate the use of surgically modified animals for routine feed evaluation, and parallel improvements in in vitro techniques mean that it will increasingly be replaced. The majority of in vitro systems use substrate disappearance to assess degradation; this, however, provides no information on the quantity of derived end-products available to the host animal. As measurement of volatile fatty acids or microbial biomass production greatly increases analytical costs, fermentation gas release, a simple and non-destructive measurement, has been used as an alternative.
However, as gas release alone is of little use, gas-based systems in which degradation and fermentation gas release are measured simultaneously are attracting considerable interest. Alternative microbial inocula are being considered, as is the potential of multi-enzyme systems for examining degradation dynamics. It is concluded that while chemical analysis will continue to form an indispensable part of feed evaluation, enhanced use will be made of increasingly complex in vitro systems. It is vital, however, that the function and limitations of each methodology are fully understood and that the temptation to over-interpret the data is avoided, so that appropriate conclusions are drawn. With careful selection and correct application, in vitro systems offer powerful research tools with which to evaluate feedstuffs. © 2003 Elsevier B.V. All rights reserved.
Recent developments in genetic data analysis: what can they tell us about human demographic history?
Abstract:
Over the last decade, a number of new methods of population genetic analysis based on likelihood have been introduced. This review describes and explains the general statistical techniques that have recently been used, and discusses the underlying population genetic models. Experimental papers that use these methods to infer human demographic and phylogeographic history are reviewed. It appears that the use of likelihood has hitherto had little impact in the field of human population genetics, which is still primarily driven by more traditional approaches. However, with the current uncertainty about the effects of natural selection, population structure and ascertainment of single-nucleotide polymorphism markers, it is suggested that likelihood-based methods may have a greater impact in the future.
Abstract:
We have developed a new, simple method for the transport, storage, and analysis of genetic material from the corals Agaricia agaricites, Dendrogyra cylindrica, Eusmilia ancora, Meandrina meandrites, Montastrea annularis, Porites astreoides, Porites furcata, Porites porites, and Siderastrea siderea at room temperature. All species yielded sufficient DNA from a single FTA® card (19 µg-43 ng) for subsequent PCR amplification of both coral and zooxanthellar DNA. The D1 and D2 variable regions of the large subunit rRNA gene (LSUrDNA) were amplified from the DNA of P. furcata and S. siderea by PCR. Electrophoresis yielded two major DNA bands: an 800-base pair (bp) DNA, which represented the coral ribosomal RNA (rRNA) gene, and a 600-bp DNA, which represented the zooxanthellar srRNA gene. Extraction of DNA from the bands yielded between 290 µg total DNA (S. siderea coral DNA) and 9 µg total DNA (P. furcata zooxanthellar DNA). The ability to transport and store genetic material from scleractinian corals without resort to laboratory facilities in the field allows for the molecular study of a far wider range and variety of coral sites than has been studied to date. © 2003 Elsevier Science B.V. All rights reserved.
Abstract:
The hydrothermal reactions of Ni(NO3)2·6H2O, disodium fumarate (fum) and 1,2-bis(4-pyridyl)ethane (bpe)/1,3-bis(4-pyridyl)propane (bpp) in aqueous-methanol medium yield one 3-D and one 2-D metal-organic hybrid material, [Ni(fum)(bpe)] (1) and [Ni(fum)(bpp)(H2O)] (2), respectively. Complex 1 possesses an unprecedented structure, the first example of an "unusual mode" of a five-fold distorted interpenetrated network with metal-ligand linkages in which the four six-membered windows in each adamantane-type cage are different. The structural characterization of complex 2 reveals a buckled sheet in which the nickel ions adopt a distorted octahedral geometry, with two carboxylate groups, one acting as a bis-chelate and the other as a bis-monodentate ligand. The metal ion completes its coordination sphere with one water molecule and two bpp nitrogens in cis positions. Variable-temperature magnetic measurements on complexes 1 and 2 reveal very weak antiferromagnetic intramolecular interactions and/or single-ion zero-field splitting (D) of isolated Ni(II) ions in both compounds. Experimentally, the J parameters of the two complexes are comparable and very small. Considering the zero-field splitting of Ni(II), the calculated D values agree with values reported in the literature for Ni(II) ions. Complex 3, [{Co(phen)}2(fum)2] (phen = 1,10-phenanthroline), is obtained by diffusing a methanolic solution of 1,10-phenanthroline onto an aqueous layer of disodium fumarate and Co(NO3)2·6H2O. It consists of dimeric Co(II)(phen) units doubly bridged by carboxylate groups in a distorted syn-syn fashion. The fumarate anions act as bis-chelates to form corrugated sheets. The 2D layer has a (4,4) topology, with the nodes represented by the centres of the dimers. The magnetic data were fitted ignoring the very weak coupling through the fumarate pathway and using a dimer model.
Abstract:
The issue of levels of participation in post-compulsory education has been emphasised by the current policy initiatives to increase the age to which some form of participation is compulsory. One of the acknowledged weaknesses of research in the field of children's intentions with regard to participation is the lack of longitudinal data. This paper offers a longitudinal analysis using the Youth Survey from the British Household Panel Survey. The results show that most children can express intentions with regard to future participation very early in their secondary school careers and that these intentions are good predictors of actual behaviour five years later. Intentions to stay on are more consistent than intentions to leave and most children who finally leave at 16 have at some point said they want to remain in education post-16. The strongest association with participation levels is attainment at GCSE. However, there are also influences of gender and parental background and these remain, even after attainment is held constant. The results show the value of focusing on intentions for participation at a very early stage of children's school careers and also the importance of current attempts to reform curriculum and assessment for the 14-19 age group.
Abstract:
In this paper, we address issues in the segmentation of remotely sensed LIDAR (LIght Detection And Ranging) data. The LIDAR data, captured by an airborne laser scanner, contain 2.5-dimensional (2.5D) terrain surface height information, e.g. houses, vegetation, flat fields, rivers, basins, etc. Our aim in this paper is to segment ground (flat field) from non-ground (houses and high vegetation) in hilly urban areas. By projecting the 2.5D data onto a surface, we obtain a texture map as a grey-level image. Gabor wavelet filters are applied to this image to generate Gabor wavelet features. These features are then grouped into various windows, and a combination of their first- and second-order statistics is used as a measure to determine surface properties. The test results show that ground areas can successfully be segmented from LIDAR data. Most buildings and high vegetation can be detected. In addition, the Gabor wavelet transform can partially remove hill or slope effects in the original data by tuning the Gabor parameters.
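The pipeline this abstract outlines (Gabor filtering of a height-derived grey-level image, then first- and second-order statistics over windows) can be sketched in a few lines of NumPy. Everything below is a minimal illustrative sketch, not the authors' implementation: the kernel parameters, window size, variance threshold, and all function names are assumptions.

```python
import numpy as np

def gabor_kernel(size=15, wavelength=6.0, theta=0.0, sigma=3.0):
    """Real part of a 2-D Gabor filter (parameter values are illustrative)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def window_stats(response, win=8):
    """First- and second-order statistics over non-overlapping windows."""
    h, w = response.shape
    h, w = h - h % win, w - w % win
    blocks = response[:h, :w].reshape(h // win, win, w // win, win)
    return blocks.mean(axis=(1, 3)), blocks.std(axis=(1, 3))

def segment_ground(height_map, std_threshold=0.5, win=8):
    """Label a window as ground (True) where Gabor response variation is low."""
    k = gabor_kernel()
    pad = np.zeros_like(height_map, dtype=float)
    pad[:k.shape[0], :k.shape[1]] = k
    # Circular convolution of the height image with the Gabor kernel via FFT.
    response = np.real(np.fft.ifft2(np.fft.fft2(height_map) * np.fft.fft2(pad)))
    _, stds = window_stats(response, win)
    return stds < std_threshold
```

A flat field gives a near-constant filter response (low windowed standard deviation), while buildings and high vegetation produce strong local variation, which is the property the abstract exploits.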
Abstract:
A cross-platform field campaign, OP3, was conducted in the state of Sabah in Malaysian Borneo between April and July of 2008. Among the suite of observations recorded, the campaign included measurements of NOx and O3 – crucial outputs of any model chemistry mechanism. We describe the measurements of these species made from both the ground site and aircraft. We then use the output from two resolutions of the chemistry transport model p-TOMCAT to illustrate the ability of a global model chemical mechanism to capture the chemistry at the rainforest site. The basic model performance is good for NOx and poor for ozone. A box model containing the same chemical mechanism is used to explore the results of the global model in more depth and make comparisons between the two. Without some parameterization of the nighttime boundary layer – free troposphere mixing (i.e. the use of a dilution parameter), the box model does not reproduce the observations, pointing to the importance of adequately representing physical processes for comparisons with surface measurements. We conclude with a discussion of box model budget calculations of chemical reaction fluxes, deposition and mixing, and compare these results to output from p-TOMCAT. These show the same chemical mechanism behaves similarly in both models, but that emissions and advection play particularly strong roles in influencing the comparison to surface measurements.
Abstract:
Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
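The ordinary-kriging step at the heart of such geostatistical interpolation can be illustrated with a minimal NumPy sketch. This is a generic textbook formulation, not code from the article; the exponential variogram, its parameter values, and the function names are all illustrative assumptions.

```python
import numpy as np

def exp_variogram(h, nugget=0.1, sill=1.0, range_=50.0):
    """Exponential variogram model; parameter values are illustrative."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / range_))

def ordinary_krige(xy, z, target, variogram=exp_variogram):
    """Ordinary kriging estimate and variance at one target location.

    xy: (n, 2) gauge coordinates; z: (n,) observed values; target: (2,) point.
    """
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Kriging system with a Lagrange multiplier enforcing unit weight sum.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    np.fill_diagonal(A[:n, :n], 0.0)   # gamma(0) = 0
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z
    variance = w @ b                   # kriging variance, incl. multiplier term
    return estimate, variance
```

The returned variance is what makes kriging attractive for rain gauge data: it quantifies the interpolation uncertainty mentioned in the abstract, alongside the areal estimate itself.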
Abstract:
A first step in interpreting the wide variation in trace gas concentrations measured over time at a given site is to classify the data according to the prevailing weather conditions. In order to classify measurements made during two intensive field campaigns at Mace Head, on the west coast of Ireland, an objective method of assigning data to different weather types has been developed. Air-mass back trajectories calculated using winds from ECMWF analyses, arriving at the site in 1995–1997, were allocated to clusters based on a statistical analysis of the latitude, longitude and pressure of the trajectory at 12 h intervals over 5 days. The robustness of the analysis was assessed by using an ensemble of back trajectories calculated for four points around Mace Head. Separate analyses were made for each of the 3 years, and for four 3-month periods. The use of these clusters in classifying ground-based ozone measurements at Mace Head is described, including the need to exclude data which have been influenced by local perturbations to the regional flow pattern, for example, by sea breezes. Even with a limited data set, based on 2 months of intensive field measurements in 1996 and 1997, there are statistically significant differences in ozone concentrations in air from the different clusters. The limitations of this type of analysis for classification and interpretation of ground-based chemistry measurements are discussed.
Abstract:
The overall operation and internal complexity of a particular production machinery can be depicted in terms of clusters of multidimensional points which describe the process states, the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, to illustrate how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The cluster algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis is seen to provide a promising step ahead in the field of multivariable control of manufacturing systems.
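The mean-tracking idea (start a candidate centre at a data point and repeatedly move it to the mean of its neighbourhood until it stops moving) can be sketched in a mean-shift style. This is not the published mean-tracking algorithm itself, only a simplified sketch of the underlying principle; the radius, merge tolerance, and function names are assumptions.

```python
import numpy as np

def track_mean(data, start, radius=1.0, tol=1e-4, max_iter=100):
    """Move a candidate centre to the mean of points within `radius` until stable."""
    centre = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        near = data[np.linalg.norm(data - centre, axis=1) < radius]
        if len(near) == 0:
            break
        new = near.mean(axis=0)
        if np.linalg.norm(new - centre) < tol:
            break
        centre = new
    return centre

def find_clusters(data, radius=1.0, merge_tol=0.5):
    """Run mean tracking from every point, merging centres that coincide."""
    centres = []
    for p in data:
        c = track_mean(data, p, radius)
        if not any(np.linalg.norm(c - q) < merge_tol for q in centres):
            centres.append(c)
    return np.array(centres)
```

In the machinery setting each row of `data` would be one multidimensional process state, and the recovered centres would correspond to regions of good or poor machine behaviour.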
Abstract:
Retinal blurring resulting from the human eye's depth of focus has been shown to assist visual perception. Infinite focal depth within stereoscopically displayed virtual environments may cause undesirable effects; for instance, objects positioned at a distance in front of or behind the observer's fixation point will be perceived in sharp focus with large disparities, thereby causing diplopia. Although published research on the incorporation of synthetically generated depth of field (DoF) suggests that it might enhance perceived image quality, no quantitative evidence of perceptual performance gains exists. This may be due to the difficulty of generating synthetic DoF dynamically, with focal distance actively linked to fixation distance. In this paper, such a system is described. A desktop stereographic display is used to project a virtual scene in which synthetically generated DoF is actively controlled from vergence-derived distance. A performance evaluation experiment was undertaken in which subjects carried out observations in a spatially complex virtual environment consisting of components interconnected by pipes on a distractive background; each subject was tasked with making an observation based on the connectivity of the components. The effects of focal depth variation under static and actively controlled focal distance conditions were investigated. The results and analysis presented show that performance gains may be achieved by the addition of synthetic DoF. The merits of applying synthetic DoF are discussed.
Abstract:
Classical risk assessment approaches for animal diseases consider the probability of release, the probability of exposure, and the consequences of a hazard affecting a livestock population. Once a pathogen enters domestic livestock, potential risks of exposure and infection, both to animals and to people, extend through a chain of economic activities related to producing, buying and selling animals and products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems and to devise effective and efficient measures to manage disease risks for a country or region, the entire value chain and the related markets for animals and products need to be analysed, so that practical and cost-effective risk management options can be agreed with the actors and players along those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which appears to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.
Abstract:
This paper describes the novel use of cluster analysis in the field of industrial process control. The severe multivariable process problems encountered in manufacturing have often led to machine shutdowns, where corrective actions are required before operation can resume. Production faults caused by processes running in less efficient regions may be prevented or diagnosed using reasoning based on cluster analysis. Indeed, the internal complexity of production machinery may be depicted as clusters of multidimensional data points which characterise the manufacturing process. The application of a mean-tracking cluster algorithm (developed at Reading) to field data acquired from high-speed machinery is discussed. The objective of the application is to illustrate how machine behaviour can be studied, and in particular how regions of erroneous and stable running behaviour can be identified.
Abstract:
The ASTER Global Digital Elevation Model (GDEM) has made elevation data at 30 m spatial resolution freely available, enabling reinvestigation of morphometric relationships derived from limited field data using much larger sample sizes. These data are used to analyse a range of morphometric relationships derived for dunes (between dune height, spacing, and equivalent sand thickness) in the Namib Sand Sea, which was chosen because there are a number of extant studies that could be used for comparison with the results. The relative accuracy of GDEM for capturing dune height and shape was tested against multiple individual ASTER DEM scenes and against field surveys, highlighting the smoothing of the dune crest and resultant underestimation of dune height, and the omission of the smallest dunes, because of the 30 m sampling of ASTER DEM products. It is demonstrated that morphometric relationships derived from GDEM data are broadly comparable with relationships derived by previous methods, across a range of different dune types. The data confirm patterns of dune height, spacing and equivalent sand thickness mapped previously in the Namib Sand Sea, but add new detail to these patterns.
Abstract:
Following earlier work looking at the overall career difficulties and low economic rewards faced by graduates in creative disciplines, the paper takes a closer look at the different career patterns and economic performance of "Bohemian" graduates across different creative disciplines. While it is widely acknowledged in the literature that careers in the creative field tend to be unstructured, often relying on part-time work and low wages, our knowledge of how these characteristics differ across the creative industries and occupational sectors is very limited. The paper explores the different trajectories and career patterns experienced by graduates in different creative disciplinary fields and their ability to enter creative occupations. Data from the Higher Education Statistical Agency (HESA) are presented, articulating a complex picture of the reality of finding a creative occupation for creative graduates. While students of some disciplines struggle to find full-time work in the creative economy, for others full-time occupation is the norm. Geography also plays a crucial role in offering graduates opportunities in creative occupations and higher salaries. The findings are contextualised in the New Labour cultural policy framework, and conclusions are drawn on whether the creative industries policy construct has hidden a very problematic reality of winners and losers in the creative economy.