1000 results for data-basemanagement


Relevance:

20.00%

Publisher:

Abstract:

During recent decades there has been a global shift in forest management from a focus solely on timber management to ecosystem management that endorses all aspects of forest functions: ecological, economic and social. This has resulted in a paradigm shift from sustained yield to a sustained diversity of values, goods and benefits obtained at the same time, introducing new temporal and spatial scales into forest resource management. The purpose of the present dissertation was to develop methods that would enable spatial and temporal scales to be introduced into the storage, processing, access and utilisation of forest resource data. The methods developed are based on a conceptual view of a forest as a hierarchically nested collection of objects that can have a dynamically changing set of attributes. The temporal aspect of the methods consists of lifetime management for the objects and their attributes and of a temporal succession linking the objects together. Development of the forest resource data processing method concentrated on the extensibility and configurability of the data content and model calculations, allowing a diverse set of processing operations to be executed within the same framework. The contribution of this dissertation to the utilisation of multi-scale forest resource data lies in the development of a reference data generation method to support forest inventory methods in approaching single-tree resolution.
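The object model described here — hierarchically nested forest objects with dynamically changing attributes, explicit lifetimes and succession links — could be sketched roughly as below. All class, attribute and object names are hypothetical illustrations, not the dissertation's actual design:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ForestObject:
    """A node in the hierarchy (e.g. region > stand > tree) with
    time-stamped attributes and an explicit lifetime."""
    name: str
    valid_from: int                     # year the object came into existence
    valid_to: Optional[int] = None      # None = still alive
    attributes: dict = field(default_factory=dict)  # attr -> list of (year, value)
    children: list = field(default_factory=list)
    successor: Optional["ForestObject"] = None      # temporal succession link

    def set_attribute(self, name, year, value):
        self.attributes.setdefault(name, []).append((year, value))

    def attribute_at(self, name, year):
        """Latest recorded value at or before `year`, or None."""
        history = [v for (y, v) in sorted(self.attributes.get(name, [])) if y <= year]
        return history[-1] if history else None

    def alive_at(self, year):
        return self.valid_from <= year and (self.valid_to is None or year < self.valid_to)

# a stand that was clear-cut in 2005 and succeeded by a new stand
old_stand = ForestObject("stand-17", valid_from=1960, valid_to=2005)
old_stand.set_attribute("volume_m3_per_ha", 1990, 180.0)
old_stand.set_attribute("volume_m3_per_ha", 2004, 240.0)
new_stand = ForestObject("stand-17b", valid_from=2005)
old_stand.successor = new_stand
```

Queries at any temporal scale then reduce to `attribute_at` and `alive_at` lookups over the object hierarchy.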


Patterns of movement in aquatic animals reflect ecologically important behaviours. Cyclical changes in the abiotic environment influence these movements, but when multiple processes occur simultaneously, identifying which is responsible for the observed movement can be complex. Here we used acoustic telemetry and signal processing to define the abiotic processes responsible for movement patterns in freshwater whiprays (Himantura dalyensis). Acoustic transmitters were implanted into the whiprays and their movements detected over 12 months by an array of passive acoustic receivers, deployed throughout 64 km of the Wenlock River, Qld, Australia. The time of an individual's arrival and departure from each receiver detection field was used to estimate whipray location continuously throughout the study. This created a linear-movement-waveform for each whipray and signal processing revealed periodic components within the waveform. Correlation of movement periodograms with those from abiotic processes categorically illustrated that the diel cycle dominated the pattern of whipray movement during the wet season, whereas tidal and lunar cycles dominated during the dry season. The study methodology represents a valuable tool for objectively defining the relationship between abiotic processes and the movement patterns of free-ranging aquatic animals and is particularly expedient when periods of no detection exist within the animal location data.
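The signal-processing step — finding periodic components in a linear-movement waveform and matching them to abiotic cycles — can be illustrated with a toy example. This sketch assumes regularly sampled synthetic data and a plain DFT; the study's actual receiver data contain detection gaps, for which methods suited to irregular sampling (e.g. a Lomb-Scargle periodogram) would be needed:

```python
import numpy as np

def dominant_period_hours(signal, dt_hours):
    """Return the period (h) of the strongest spectral component,
    ignoring the zero-frequency (mean) term."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=dt_hours)  # cycles per hour
    peak = np.argmax(spectrum[1:]) + 1                # skip the DC bin
    return 1.0 / freqs[peak]

# synthetic movement waveform: a 24 h (diel) cycle plus a weaker
# 12.4 h (semidiurnal tidal) component and noise
rng = np.random.default_rng(0)
t = np.arange(0, 30 * 24, 0.5)                        # 30 days sampled every 30 min
movement = (2.0 * np.sin(2 * np.pi * t / 24.0)
            + 0.5 * np.sin(2 * np.pi * t / 12.4)
            + 0.2 * rng.standard_normal(t.size))

print(dominant_period_hours(movement, dt_hours=0.5))  # ~24
```

Correlating such periodograms against those of the diel, tidal and lunar cycles is what lets the dominant abiotic driver be identified objectively.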


Abstract of Macbeth, G. M., Broderick, D., Buckworth, R. & Ovenden, J. R. (in press, Feb 2013). Linkage disequilibrium estimation of effective population size with immigrants from divergent populations: a case study on Spanish mackerel (Scomberomorus commerson). G3: Genes, Genomes and Genetics. Estimates of genetic effective population size (Ne) using molecular markers are a potentially useful tool for the management of species ranging from endangered to commercially harvested. However, pitfalls are predicted when the effective size is large, as estimates then require large numbers of samples from wild populations for statistical validity. Our simulations showed that linkage disequilibrium estimates of Ne up to 10,000 with finite confidence limits can be achieved with sample sizes around 5000. This was deduced from empirical allele frequencies of seven polymorphic microsatellite loci in a commercially harvested fisheries species, the narrow-barred Spanish mackerel (Scomberomorus commerson). As expected, the smallest standard deviation of Ne estimates occurred when low-frequency alleles were excluded. Additional simulations indicated that the linkage disequilibrium method was sensitive to small numbers of genotypes from cryptic species or conspecific immigrants. A correspondence analysis algorithm was developed to detect and remove outlier genotypes that could have been inadvertently sampled from cryptic species or from non-breeding immigrants from genetically separate populations. Simulations demonstrated the value of this approach in the Spanish mackerel data. When putative immigrants were removed from the empirical data, 95% of the Ne estimates from jackknife resampling were above 24,000.
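The jackknife resampling mentioned at the end can be sketched generically. This is not the authors' implementation: the per-locus Ne values and the harmonic-mean combination below are invented purely to show the delete-one mechanics:

```python
def jackknife(values, estimator):
    """Delete-one jackknife: recompute `estimator` with each
    observation left out, returning the pseudo-estimates."""
    return [estimator(values[:i] + values[i + 1:]) for i in range(len(values))]

def harmonic_mean(xs):
    return len(xs) / sum(1.0 / x for x in xs)

# toy per-locus Ne estimates (illustrative numbers only)
ne_by_locus = [18000.0, 26000.0, 31000.0, 22000.0, 40000.0, 27000.0, 24000.0]

pseudo = jackknife(ne_by_locus, harmonic_mean)
print(min(pseudo), max(pseudo))
```

The spread of the pseudo-estimates is what yields confidence statements such as "95% of the Ne estimates were above 24,000".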


Tetrapeptide sequences of the type Z-Pro-Y-X were obtained from the crystal structure data on 34 globular proteins and used in an analysis of the positional preferences of the individual amino acid residues in the β-turn conformation. The effect of fixing proline as the second-position residue in the tetrapeptide sequence was studied by comparing the resulting positional preferences with the corresponding data obtained by Chou and Fasman using the Z-R-Y-X sequence, where no particular residue was fixed in any of the four positions. While several amino acid residues with very high or very low preferences for specific positions were, in general, found to be common to both the Z-Pro-Y-X and Z-R-Y-X sequences, many significant differences were found between the two sets of data; these are attributed to specific interactions arising from the presence of the proline residue.
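The positional-preference statistic is, in Chou-Fasman style, the frequency of a residue at a given turn position divided by its overall frequency. A toy computation on invented sequences (not the 34-protein dataset) makes the arithmetic concrete:

```python
from collections import Counter

def positional_preferences(turn_sequences, background_counts):
    """Preference of residue r at turn position i: its frequency at that
    position divided by its overall background frequency."""
    total_bg = sum(background_counts.values())
    n_seqs = len(turn_sequences)
    prefs = {}
    for i in range(4):                              # tetrapeptide positions 1-4
        pos_counts = Counter(seq[i] for seq in turn_sequences)
        for r, c in pos_counts.items():
            f_pos = c / n_seqs
            f_bg = background_counts[r] / total_bg
            prefs[(r, i + 1)] = f_pos / f_bg
    return prefs

turns = ["NPGW", "DPSG", "NPDG", "APGK"]            # hypothetical Z-Pro-Y-X turns
background = Counter("NPGWDPSGNPDGAPGK")            # residue pool for the toy example
prefs = positional_preferences(turns, background)
print(prefs[("P", 2)])                              # → 4.0
```

Because Pro is fixed at position 2 in every sequence, its preference there is maximal (1.0 divided by its background frequency of 0.25), which is exactly the bias the comparison with the unconstrained Z-R-Y-X data has to account for.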


Data flow computers are high-speed machines in which an instruction is executed as soon as all its operands are available. This paper describes the EXtended MANchester (EXMAN) data flow computer, which incorporates three major extensions to the basic Manchester machine: a multiple matching units scheme, an efficient implementation of the array data structure, and a facility to concurrently execute reentrant routines. A simulator for the EXMAN computer has been coded in the discrete event simulation language SIMULA 67 on the DEC 1090 system. Performance analysis studies have been conducted on the simulated EXMAN computer to study the effectiveness of the proposed extensions. The performance experiments were carried out using three sample problems: matrix multiplication, Bresenham's line drawing algorithm, and the polygon scan-conversion algorithm.
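The core dataflow firing rule — an instruction executes as soon as all its operands have arrived — can be sketched in a few lines. This is an illustrative token-passing simulator, not the EXMAN design or its SIMULA 67 code:

```python
from collections import deque

class Node:
    """A dataflow instruction: fires when all operands are present."""
    def __init__(self, op, n_inputs, consumers):
        self.op = op                   # callable applied to the operands
        self.n_inputs = n_inputs
        self.consumers = consumers     # list of (node, input_slot) for the result
        self.slots = {}

    def receive(self, slot, value, ready):
        self.slots[slot] = value
        if len(self.slots) == self.n_inputs:   # firing rule: all operands present
            ready.append(self)

def run(graph_inputs, sink):
    """Drive tokens through the graph until the sink node produces a value."""
    ready = deque()
    for node, slot, value in graph_inputs:
        node.receive(slot, value, ready)
    result = {}
    while ready:
        node = ready.popleft()
        value = node.op(*[node.slots[i] for i in range(node.n_inputs)])
        if node is sink:
            result["out"] = value
        for consumer, slot in node.consumers:
            consumer.receive(slot, value, ready)
    return result["out"]

# (a + b) * (c + d) as a two-level dataflow graph
mul = Node(lambda x, y: x * y, 2, [])
add1 = Node(lambda x, y: x + y, 2, [(mul, 0)])
add2 = Node(lambda x, y: x + y, 2, [(mul, 1)])
print(run([(add1, 0, 1), (add1, 1, 2), (add2, 0, 3), (add2, 1, 4)], mul))  # → 21
```

The two additions here are independent and could fire concurrently, which is precisely what a multiple matching units scheme exploits.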


This thesis examines the feasibility of a forest inventory method based on two-phase sampling for estimating forest attributes at the stand or substand level for forest management purposes. The method is based on multi-source forest inventory combining auxiliary data, consisting of remote sensing imagery or other geographic information, with field measurements. Auxiliary data are utilized as first-phase data covering all inventory units. Various methods were examined for improving the accuracy of the forest estimates: pre-processing of auxiliary data in the form of correcting the spectral properties of aerial imagery (I), selection of aerial image features for estimating forest attributes (II), comparison of spatial units for extracting image features in a remote sensing aided forest inventory utilizing very high resolution imagery (III), and combination of several data sources with different weighting procedures in estimating forest attributes (IV, V).

Correction of the spectral properties of aerial images proved to be a straightforward and advantageous method for improving the correlation between the image features and the measured forest attributes. Testing different image features that can be extracted from aerial photographs (and other very high resolution images) showed that the images contain a wealth of relevant information that can be extracted only by utilizing the spatial organization of the image pixel values. Furthermore, careful selection of image features for the inventory task generally gives better results than inputting all extractable features into the estimation procedure. When the spatial units for extracting very high resolution image features were examined, an approach based on image segmentation generally showed advantages over a traditional sample-plot-based approach. Combining several data sources resulted in more accurate estimates than any of the individual data sources alone; the best combined estimate can be derived by weighting the estimates produced by the individual data sources by the inverse values of their mean square errors. Although the plot-level estimation accuracy in a two-phase sampling inventory can be improved in many ways, the accuracy of forest estimates based mainly on single-view satellite and aerial imagery remains a relatively poor basis for making stand-level management decisions.
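The inverse-mean-square-error weighting described here amounts to the following; the estimates, source names and MSE values are invented for illustration:

```python
def combine_estimates(estimates, mses):
    """Weight each data source's estimate by the inverse of its mean
    square error; the weights are normalised to sum to one."""
    weights = [1.0 / m for m in mses]
    return sum(w * e for w, e in zip(weights, estimates)) / sum(weights)

# hypothetical stand-volume estimates (m3/ha) from three data sources
sources = [210.0, 195.0, 230.0]   # e.g. aerial image, satellite image, old inventory
mses = [400.0, 900.0, 1600.0]     # their (assumed known) mean square errors
print(combine_estimates(sources, mses))   # ~209, closest to the most precise source
```

The combined value is pulled towards the source with the smallest MSE, which is why combining sources beats any single source alone when the error levels differ.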


An ecological risk assessment of the East Coast Otter Trawl Fishery in the Great Barrier Reef Region was undertaken in 2010 and 2011. It assessed the risks posed by this fishery to achieving fishery-related and broader ecological objectives of both the Queensland and Australian governments, including risks to the values and integrity of the Great Barrier Reef World Heritage Area. The risks assessed included direct and indirect effects on the species caught in the fishery as well as on the structure and functioning of the ecosystem. This ecosystem-based approach included an assessment of the impacts on harvested species, by-catch, species of conservation concern, marine habitats, species assemblages and ecosystem processes. The assessment took into account current management arrangements and fishing practices at the time of the assessment. The main findings of the assessment were:

- Current risk levels from trawling activities are generally low.
- Some risks from trawling remain.
- Risks from trawling have reduced in the Great Barrier Reef Region.
- Trawl fishing effort is a key driver of ecological risk.
- Zoning has been important in reducing risks.
- Reducing identified unacceptable risks requires a range of management responses.
- The commercial fishing industry is supportive and being proactive.
- Further reductions in trawl by-catch, high compliance with rules and accurate information from ongoing risk monitoring are important.
- Trawl fishing is just one of the sources of risk to the Great Barrier Reef.


The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour intensive as an on-farm method and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier model was chosen for the task. The data were divided into two parts: 5,074 measurements from 37 cows were used to train the model, and its ability to detect lameness was evaluated on a validation dataset of 4,868 measurements from 36 cows. The model was able to classify 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
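A probabilistic neural network is essentially a Parzen-window (Gaussian kernel) classifier: each class's score is the average of kernels centred on its training patterns, and the class with the highest score wins. The sketch below uses made-up leg-load features, not the thesis's actual model, features or data:

```python
import math

def pnn_classify(x, classes, sigma=1.0):
    """PNN decision: average Gaussian kernel response per class, argmax wins."""
    def kernel_mean(patterns):
        return sum(
            math.exp(-sum((a - b) ** 2 for a, b in zip(x, p)) / (2 * sigma ** 2))
            for p in patterns
        ) / len(patterns)
    scores = {label: kernel_mean(patterns) for label, patterns in classes.items()}
    return max(scores, key=scores.get)

# toy training patterns: (rear-leg load asymmetry, kicks per milking) -- invented
training = {
    "sound": [(0.05, 0.0), (0.10, 1.0), (0.08, 0.0)],
    "lame":  [(0.45, 3.0), (0.55, 4.0), (0.40, 5.0)],
}
print(pnn_classify((0.50, 4.0), training))   # → lame
```

Because the "network" stores all training patterns, training is instantaneous; the cost is paid at classification time, which is acceptable when measurements arrive only a few times per day per cow.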


As for other complex diseases, linkage analyses of schizophrenia (SZ) have produced evidence for numerous chromosomal regions, with inconsistent results reported across studies. The presence of locus heterogeneity appears likely and may reduce the power of linkage analyses if homogeneity is assumed. In addition, when multiple heterogeneous datasets are pooled, inter-sample variation in the proportion of linked families (alpha) may diminish the power of the pooled sample to detect susceptibility loci, in spite of the larger sample size obtained. We compare the significance of linkage findings obtained using allele-sharing LOD scores (LOD(exp))-which assume homogeneity-and heterogeneity LOD scores (HLOD) in European American and African American NIMH SZ families. We also pool these two samples and evaluate the relative power of the LOD(exp) and two different heterogeneity statistics. One of these (HLOD-P) estimates the heterogeneity parameter alpha only in aggregate data, while the second (HLOD-S) determines alpha separately for each sample. In separate and combined data, we show consistently improved performance of HLOD scores over LOD(exp). Notably, genome-wide significant evidence for linkage is obtained at chromosome 10p in the European American sample using a recessive HLOD score. When the two samples are combined, linkage at the 10p locus also achieves genome-wide significance under HLOD-S, but not HLOD-P. Using HLOD-S, improved evidence for linkage was also obtained for a previously reported region on chromosome 15q. In linkage analyses of complex disease, power may be maximised by routinely modelling locus heterogeneity within individual datasets, even when multiple datasets are combined to form larger samples.
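Under Smith's admixture model, the heterogeneity LOD maximises over the proportion α of linked families as well as the recombination fraction θ. A sketch of the standard form for F families follows; the study's exact parameterisation may differ:

```latex
\mathrm{HLOD} = \max_{\alpha,\,\theta} \; \sum_{i=1}^{F}
  \log_{10}\!\left[ \alpha\,\frac{L_i(\theta)}{L_i(\tfrac{1}{2})} + (1-\alpha) \right]
```

HLOD-P estimates a single α across the pooled families, whereas HLOD-S maximises a separate α within each sample before summing the scores, which is why HLOD-S can retain linkage signal when the linked proportion differs between samples.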


For zygosity diagnosis in the absence of genotypic data, or in the recruitment phase of a twin study where only single twins from same-sex pairs are being screened, or to provide a test for sample duplication leading to the false identification of a dizygotic pair as monozygotic, the appropriate analysis of respondents' answers to questions about zygosity is critical. Using data from a young adult Australian twin cohort (N = 2094 complete pairs and 519 singleton twins from same-sex pairs with complete responses to all zygosity items), we show that application of latent class analysis (LCA), fitting a 2-class model, yields results that show good concordance with traditional methods of zygosity diagnosis, but with certain important advantages. These include the ability, in many cases, to assign zygosity with specified probability on the basis of responses of a single informant (advantageous when one zygosity type is being oversampled); and the ability to quantify the probability of misassignment of zygosity, allowing prioritization of cases for genotyping as well as identification of cases of probable laboratory error. Out of 242 twins (from 121 like-sex pairs) where genotypic data were available for zygosity confirmation, only a single case was identified of incorrect zygosity assignment by the latent class algorithm. Zygosity assignment for that single case was identified by the LCA as uncertain (probability of being a monozygotic twin only 76%), and the co-twin's responses clearly identified the pair as dizygotic (probability of being dizygotic 100%). In the absence of genotypic data, or as a safeguard against sample duplication, application of LCA for zygosity assignment or confirmation is strongly recommended.
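The 2-class LCA assigns zygosity via the posterior probability of class membership given a twin's item responses, assuming local independence of items within a class. The item-endorsement probabilities and priors below are invented for illustration, not the fitted values from the Australian cohort:

```python
from math import prod

def posterior_mz(responses, priors, probs):
    """Posterior probability of class 'MZ' in a 2-class latent class model
    with binary items (1 = endorsed), assuming local independence."""
    def likelihood(cls):
        return priors[cls] * prod(
            probs[cls][i] if r == 1 else 1 - probs[cls][i]
            for i, r in enumerate(responses)
        )
    l_mz, l_dz = likelihood("MZ"), likelihood("DZ")
    return l_mz / (l_mz + l_dz)

# hypothetical endorsement probabilities for three zygosity questions,
# e.g. "were you as alike as two peas in a pod?"
priors = {"MZ": 0.45, "DZ": 0.55}
probs = {"MZ": [0.95, 0.90, 0.85], "DZ": [0.10, 0.15, 0.20]}
print(round(posterior_mz([1, 1, 1], priors, probs), 3))
```

A posterior near 0.5 flags exactly the uncertain assignments (like the 76% case above) that should be prioritised for genotyping.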


The A-DNA pattern, obtained using a flat plate camera, was indexed by Fuller et al. on the basis of a C-face-centred monoclinic cell with a = 22.24 Å, b = 40.62 Å, c = 28.15 Å and β = 97.0°. A precession photograph of A-DNA, which gives an undistorted picture of the lattice, showed that the unit cell parameters given by Fuller et al. were not quite correct. The precession photograph showed a strong meridional reflection (R = 0.00 Å−1) on the 11th layer line, but the occurrence of this reflection could not be explained on the basis of the cell parameters given by Fuller et al.; using those parameters, the reflection that comes closest to the meridian on the 11th layer line is at R = 0.025 Å−1. However, a simple interchange of the a and b values accounted for the meridional reflection on the 11th layer line. The corrected cell parameters, refined against 28 strong spots, are a = 40.75 Å, b = 22.07 Å, c = 28.16 Å and β = 97.5°. In the new unit cell of A-DNA the packing arrangement of the two molecules is different from that in the old one. Nonetheless, our earlier contention is reaffirmed that both right- and left-handed A-DNA are stereochemically allowed and consistent with the observed fibre pattern.


A study was performed to investigate the value of near infrared reflectance spectroscopy (NIRS) as an alternative to analytical techniques for identifying QTL associated with feed quality traits. Milled samples from an F6-derived recombinant inbred Tallon/Scarlett population were incubated in the rumen of fistulated cattle, recovered, washed and dried to determine the in-situ dry matter digestibility (DMD). Both pre- and post-digestion samples were analysed using NIRS to quantify key quality components relating to acid detergent fibre, starch and protein. These phenotypic data were used to identify trait-associated QTL and compare them to previously identified QTL. Although a number of genetic correlations were identified between the phenotypic data sets, the correlation of most interest was between DMD and starch digested (r = -0.382). The significance of this genetic correlation was that the NIRS data set identified a putative QTL on chromosome 7H (LOD = 3.3) associated with starch digested. A QTL for DMD occurred in the same region of chromosome 7H, with flanking markers fAG/CAT63 and bPb-0758. The significant correlation and the identification of this putative QTL highlight the potential of technologies such as NIRS in QTL analysis.


Multi-species fisheries are complex to manage, and the ability to develop an appropriate governance structure is often seriously impeded because trading off between sustainability objectives at the species level, economic objectives at the fleet level and social objectives at the community scale is complex. Many of these fisheries also tend to have a mix of information, with stock assessments available for some species and almost no information on other species. The fleets themselves comprise fishers ranging from small family enterprises to large vertically integrated businesses. The Queensland trawl fishery in Australia is used as a case study for this kind of fishery. It has the added complexity that a large part of the fishery is within a World Heritage Area, the Great Barrier Reef Marine Park, which is managed by an agency of the Australian Commonwealth Government, whereas the fishery itself is managed by the Queensland State Government. A stakeholder elicitation process was used to develop social, governance, economic and ecological objectives, and then to weight the relative importance of these. An expert group was used to develop different governance strawmen (or management strategies), and these were assessed by a group of industry stakeholders and experts using multi-criteria decision analysis techniques against the different objectives. One strawman clearly provided the best overall set of outcomes given the multiple objectives, but was not optimal in terms of every objective, demonstrating that even the "best" strawman may be less than perfect.
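The abstract does not say which multi-criteria decision analysis technique was used; a weighted-sum aggregation is the simplest member of that family and illustrates how weighted objectives rank strawmen. All weights and scores below are invented:

```python
def rank_strawmen(scores, weights):
    """Weighted-sum multi-criteria score per strawman; higher is better.
    Scores are assumed already normalised to a common 0-1 scale."""
    total_w = sum(weights.values())
    overall = {
        name: sum(weights[c] * s[c] for c in weights) / total_w
        for name, s in scores.items()
    }
    return sorted(overall.items(), key=lambda kv: kv[1], reverse=True)

# invented objective weights (from a stakeholder elicitation) and
# performance scores for three candidate strawmen
weights = {"ecological": 0.35, "economic": 0.30, "social": 0.20, "governance": 0.15}
scores = {
    "A": {"ecological": 0.9, "economic": 0.5, "social": 0.6, "governance": 0.7},
    "B": {"ecological": 0.6, "economic": 0.9, "social": 0.5, "governance": 0.6},
    "C": {"ecological": 0.7, "economic": 0.7, "social": 0.8, "governance": 0.8},
}
print(rank_strawmen(scores, weights))
```

Note that the top-ranked strawman here (C) is not the best on any single criterion — the same "best overall but perfect on none" outcome the study reports.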


In multi-vehicle motorcycle crashes, the motorcycle rider is less likely to be at-fault but more commonly severely injured than the other road user. Therefore, not surprisingly, crashes in which motorcycle riders are at-fault and particularly the injuries to the other road users in these crashes have received little research attention. This paper aims to address this gap in the literature by investigating the factors influencing the severity of injury to other road users in motorcyclist-at-fault crashes. Five years of data from Queensland, Australia, were obtained from a database of claims against the compulsory third party (CTP) injury insurance of the at-fault motorcyclists. Analysis of the data using an ordered probit model shows higher injury severity for crashes involving young (under 25) and older (60+) at-fault motorcyclists. Among the not at-fault road users, the young, old, and males were found to be more severely injured than others. Injuries to vehicle occupants were less severe than those to pillions. Crashes that occurred between vehicles traveling in opposite directions resulted in more severe injuries than those involving vehicles traveling in the same direction. While most existing studies have analyzed police reported crash data, this study used CTP insurance data. Comparison of results indicates the potential of using CTP insurance data as an alternative to police reported crash data for gaining a better understanding of risk factors for motorcycle crashes and injury severity.
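In an ordered probit model, a latent severity y* = x'β + ε (ε ~ N(0,1)) falls between estimated cutpoints that partition it into ordered categories. The sketch below computes category probabilities for one observation; the cutpoints, the x'β value and the severity labels are hypothetical, not the study's fitted model:

```python
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def ordered_probit_probs(xb, cutpoints):
    """P(y = j) = Phi(tau_j - x'beta) - Phi(tau_{j-1} - x'beta),
    with tau_0 = -inf and tau_J = +inf."""
    taus = [float("-inf")] + list(cutpoints) + [float("inf")]
    return [norm_cdf(t_hi - xb) - norm_cdf(t_lo - xb)
            for t_lo, t_hi in zip(taus, taus[1:])]

# hypothetical cutpoints separating minor / moderate / serious / fatal injury
cutpoints = [-0.5, 0.6, 1.8]
p = ordered_probit_probs(xb=0.9, cutpoints=cutpoints)   # x'beta for one crash
print([round(q, 3) for q in p])
```

Covariates such as rider age or direction of travel shift x'β, moving probability mass towards higher or lower severity categories, which is how the risk factors reported above are read off the model.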