15 results for methods: data analysis

in eResearch Archive - Queensland Department of Agriculture


Relevance: 100.00%

Abstract:

Variety selection in perennial pasture crops involves identifying best varieties from data collected from multiple harvest times in field trials. For accurate selection, the statistical methods for analysing such data need to account for the spatial and temporal correlation typically present. This paper provides an approach for analysing multi-harvest data from variety selection trials in which there may be a large number of harvest times. Methods are presented for modelling the variety by harvest effects while accounting for the spatial and temporal correlation between observations. These methods provide an improvement in model fit compared to separate analyses for each harvest, and provide insight into variety by harvest interactions. The approach is illustrated using two traits from a lucerne variety selection trial. The proposed method provides variety predictions allowing for the natural sources of variation and correlation in multi-harvest data.
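The temporal correlation the abstract refers to is often modelled with an autoregressive AR(1) structure, in which correlation between two harvests decays with their separation in time. A minimal numeric sketch (illustrative only; the paper's full mixed model also handles spatial correlation and variety-by-harvest effects):

```python
import numpy as np

def ar1_corr(n_harvests, rho):
    """AR(1) correlation matrix over harvest times:
    corr(harvest i, harvest j) = rho ** |i - j|."""
    idx = np.arange(n_harvests)
    return rho ** np.abs(idx[:, None] - idx[None, :])

# correlation of harvest 1 with harvests 1..4, assuming rho = 0.6
R = ar1_corr(4, 0.6)
print(R[0])
```

Plugging such a matrix into the residual covariance of a mixed model is the standard way multi-harvest correlation is accommodated.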

Relevance: 100.00%

Abstract:

The use of near infrared (NIR) hyperspectral imaging and hyperspectral image analysis for distinguishing between hard, intermediate and soft maize kernels from inbred lines was evaluated. NIR hyperspectral images of two sets (12 and 24 kernels) of whole maize kernels were acquired using a Spectral Dimensions MatrixNIR camera with a spectral range of 960-1662 nm and a sisuChema SWIR (short wave infrared) hyperspectral pushbroom imaging system with a spectral range of 1000-2498 nm. Exploratory principal component analysis (PCA) was used on absorbance images to remove background, bad pixels and shading. On the cleaned images, PCA could be used effectively to find histological classes including glassy (hard) and floury (soft) endosperm. PCA illustrated a distinct difference between glassy and floury endosperm along principal component (PC) three on the MatrixNIR and PC two on the sisuChema, with two distinguishable clusters. Subsequently, partial least squares discriminant analysis (PLS-DA) was applied to build a classification model. The PLS-DA model from the MatrixNIR image (12 kernels) resulted in a root mean square error of prediction (RMSEP) of 0.18. This was repeated on the MatrixNIR image of the 24 kernels, which also resulted in an RMSEP of 0.18. The sisuChema image yielded an RMSEP of 0.29. The reproducible results obtained with the different data sets indicate that the method proposed in this paper has real potential for future classification uses.
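The core of the PCA step, projecting mean-centred spectra onto directions of maximal variance, can be sketched with synthetic data. The "spectra" below are fabricated stand-ins for the NIR absorbance pixels, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic "spectra": 20 kernels x 50 wavelengths, two endosperm classes
# separated by a broad absorbance offset (a stand-in for glassy vs floury)
base = np.sin(np.linspace(0, 3, 50))
glassy = base + 0.5 + 0.02 * rng.standard_normal((10, 50))
floury = base - 0.5 + 0.02 * rng.standard_normal((10, 50))
X = np.vstack([glassy, floury])

# PCA via SVD on the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s          # PC scores for each kernel
pc1 = scores[:, 0]      # the class offset dominates, so PC1 separates classes

# class means land on opposite sides of zero along PC1
print(pc1[:10].mean(), pc1[10:].mean())
```

In the paper the separating direction was PC3 (MatrixNIR) or PC2 (sisuChema) because real images contain larger nuisance sources of variance; the projection itself is the same operation.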

Relevance: 100.00%

Abstract:

To facilitate marketing and export, the Australian macadamia industry requires accurate crop forecasts. Each year, two levels of crop predictions are produced for this industry. The first is an overall longer-term forecast based on tree census data of growers in the Australian Macadamia Society (AMS). This data set currently accounts for around 70% of total production, and is supplemented by our best estimates of non-AMS orchards. Given these total tree numbers, average yields per tree are needed to complete the long-term forecasts. Yields from regional variety trials were initially used, but were found to be consistently higher than the average yields that growers were obtaining. Hence, a statistical model was developed using growers' historical yields, also taken from the AMS database. This model accounted for the effects of tree age, variety, year, region and tree spacing, and explained 65% of the total variation in the yield per tree data. The second level of crop prediction is an annual climate adjustment of these overall long-term estimates, taking into account the expected effects on production of the previous year's climate. This adjustment is based on relative historical yields, measured as the percentage deviance between expected and actual production. The dominant climatic variables are observed temperature, evaporation, solar radiation and modelled water stress. Initially, a number of alternate statistical models showed good agreement within the historical data, with jack-knife cross-validation R2 values of 96% or better. However, forecasts varied quite widely between these alternate models. Exploratory multivariate analyses and nearest-neighbour methods were used to investigate these differences. For 2001-2003, the overall forecasts were in the right direction (when compared with the long-term expected values), but were over-estimates. In 2004 the forecast was well under the observed production, and in 2005 the revised models produced a forecast within 5.1% of the actual production. Over the first five years of forecasting, the absolute deviance for the climate-adjustment models averaged 10.1%, just outside the targeted objective of 10%.
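The climate adjustment is driven by the percentage deviance between expected and actual production, and the skill summary is a mean absolute deviance. The production figures below are hypothetical, purely to show the arithmetic:

```python
def percent_deviance(expected, actual):
    """Percentage deviance of actual production from the expected value."""
    return 100.0 * (actual - expected) / expected

# hypothetical (expected, actual) production pairs for three forecast years
pairs = [(40.0, 36.0), (35.0, 38.5), (42.0, 40.0)]
devs = [percent_deviance(e, a) for e, a in pairs]

# mean absolute deviance, the summary statistic quoted in the abstract
mean_abs_dev = sum(abs(d) for d in devs) / len(devs)
print(round(mean_abs_dev, 2))
```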

Relevance: 100.00%

Abstract:

Understanding how aquatic species grow is fundamental in fisheries because stock assessment often relies on growth dependent statistical models. Length-frequency-based methods become important when more applicable data for growth model estimation are either not available or very expensive. In this article, we develop a new framework for growth estimation from length-frequency data using a generalized von Bertalanffy growth model (VBGM) framework that allows for time-dependent covariates to be incorporated. A finite mixture of normal distributions is used to model the length-frequency cohorts of each month with the means constrained to follow a VBGM. The variances of the finite mixture components are constrained to be a function of mean length, reducing the number of parameters and allowing for an estimate of the variance at any length. To optimize the likelihood, we use a minorization–maximization (MM) algorithm with a Nelder–Mead sub-step. This work was motivated by the decline in catches of the blue swimmer crab (BSC) (Portunus armatus) off the east coast of Queensland, Australia. We test the method with a simulation study and then apply it to the BSC fishery data.
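A minimal sketch of the constrained mixture described above: component means lie on the von Bertalanffy curve and component standard deviations are proportional to mean length. Parameter values are invented for illustration; the paper additionally incorporates time-dependent covariates and fits by an MM algorithm with a Nelder-Mead sub-step:

```python
import numpy as np

def vbgm_length(age, L_inf, K, t0):
    """von Bertalanffy mean length-at-age: L_inf * (1 - exp(-K * (age - t0)))."""
    return L_inf * (1.0 - np.exp(-K * (age - t0)))

def mixture_density(x, ages, weights, L_inf, K, t0, cv):
    """Finite normal mixture for a length-frequency sample: one component per
    age cohort, means constrained to the VBGM curve and standard deviations
    proportional to mean length (sd = cv * mean)."""
    means = vbgm_length(ages, L_inf, K, t0)
    sds = cv * means
    dens = np.zeros_like(x, dtype=float)
    for w, m, s in zip(weights, means, sds):
        dens += w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return dens

ages = np.array([1.0, 2.0, 3.0])          # hypothetical cohort ages (years)
x = np.linspace(5, 60, 200)               # length grid (cm)
d = mixture_density(x, ages, np.array([0.5, 0.3, 0.2]), 60.0, 0.8, 0.0, 0.1)
```

Because sd = cv * mean, the whole mixture needs only one variance parameter however many cohorts are present, which is exactly the parameter reduction the abstract describes.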

Relevance: 90.00%

Abstract:

Many statistical forecast systems are available to interested users. In order to be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanisms and their statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of ‘quality’. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what ‘quality’ entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values) that can be obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS) or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting some arbitrarily chosen significance level such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (such as KW and KS) and skill scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is solely based on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
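For three groups (k - 1 = 2 degrees of freedom) the chi-square survival function has the closed form exp(-H/2), so a Kruskal-Wallis p-value can be computed without any statistics library. The rainfall samples below are hypothetical; the paper uses historical station records grouped by SOI phase:

```python
import math

def kruskal_wallis(groups):
    """Kruskal-Wallis H statistic (no tie correction) for a list of samples:
    H = 12 / (N (N + 1)) * sum(R_i^2 / n_i) - 3 (N + 1)."""
    data = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(data)
    rank_sums = [0.0] * len(groups)
    sizes = [len(g) for g in groups]
    for rank, (_, gi) in enumerate(data, start=1):
        rank_sums[gi] += rank
    return 12.0 / (n * (n + 1)) * sum(
        r * r / s for r, s in zip(rank_sums, sizes)) - 3 * (n + 1)

# three hypothetical seasonal rainfall samples, one per SOI phase (mm)
phase_a = [12, 18, 25, 30]
phase_b = [40, 45, 52, 60]
phase_c = [15, 22, 28, 35]
H = kruskal_wallis([phase_a, phase_b, phase_c])
# chi-square survival function with 2 degrees of freedom: p = exp(-H/2)
p_value = math.exp(-H / 2.0)
print(round(H, 3), round(p_value, 4))
```

Reporting the p-value itself, rather than a pass/fail at p < 0.05, is precisely what the framework above argues for.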

Relevance: 90.00%

Abstract:

The majority of Australian weeds are exotic plant species that were intentionally introduced for a variety of horticultural and agricultural purposes. A border weed risk assessment system (WRA) was implemented in 1997 in order to reduce the high economic costs and massive environmental damage associated with introducing serious weeds. We review the behaviour of this system with regard to eight years of data collected from the assessment of species proposed for importation or held within genetic resource centres in Australia. From a taxonomic perspective, species from the Chenopodiaceae and Poaceae were most likely to be rejected and those from the Arecaceae and Flacourtiaceae were most likely to be accepted. Dendrogram analysis and classification and regression tree (TREE) models were also used to analyse the data. The latter revealed that a small subset of the 35 variables assessed was highly associated with the outcome of the original assessment. The TREE model examining all of the data contained just five variables: unintentional human dispersal, congeneric weed, weed elsewhere, tolerates or benefits from mutilation, cultivation or fire, and reproduction by vegetative propagation. It gave the same outcome as the full WRA model for 71% of species. Weed elsewhere was not the first splitting variable in this model, indicating that the WRA has a capacity for capturing species that have no history of weediness. A reduced TREE model (in which human-mediated variables had been removed) contained four variables: broad climate suitability, reproduction in less than or equal to 1 year, self-fertilisation, and tolerates and benefits from mutilation, cultivation or fire. It yielded the same outcome as the full WRA model for 65% of species. Data inconsistencies and the relative importance of questions are discussed, with some recommendations made for improving the use of the system.
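The split-selection step at the heart of a classification and regression tree is a Gini-impurity calculation: choose the variable whose binary split most reduces impurity. A compact sketch on toy species records (the data, and the restriction to two of the WRA variables, are hypothetical, not the published tree):

```python
def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_binary_split(records, label_key, feature_keys):
    """Pick the binary feature whose split most reduces Gini impurity,
    i.e. the first splitting variable a CART-style TREE model would choose."""
    labels = [r[label_key] for r in records]
    parent = gini(labels)
    best = None
    for f in feature_keys:
        yes = [r[label_key] for r in records if r[f]]
        no = [r[label_key] for r in records if not r[f]]
        child = (len(yes) * gini(yes) + len(no) * gini(no)) / len(records)
        gain = parent - child
        if best is None or gain > best[1]:
            best = (f, gain)
    return best

# toy species records (hypothetical data, for illustration only)
recs = [
    {"weed_elsewhere": True,  "congeneric_weed": True,  "outcome": "reject"},
    {"weed_elsewhere": True,  "congeneric_weed": True,  "outcome": "reject"},
    {"weed_elsewhere": False, "congeneric_weed": True,  "outcome": "reject"},
    {"weed_elsewhere": False, "congeneric_weed": False, "outcome": "accept"},
    {"weed_elsewhere": False, "congeneric_weed": False, "outcome": "accept"},
    {"weed_elsewhere": True,  "congeneric_weed": True,  "outcome": "reject"},
]
feature, gain = best_binary_split(recs, "outcome",
                                  ["weed_elsewhere", "congeneric_weed"])
print(feature, round(gain, 4))
```

In this toy data "congeneric weed" splits the outcomes perfectly, so it wins the first split even though "weed elsewhere" is also informative, mirroring the abstract's observation that weed elsewhere need not come first.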

Relevance: 90.00%

Abstract:

Instantaneous natural mortality rates and a nonparametric hunting mortality function are estimated from a multiple-year tagging experiment with arbitrary, time-dependent fishing or hunting mortality. Our theory allows animals to be tagged over a range of times in each year, and to take time to mix into the population. Animals are recovered by hunting or fishing, and death events from natural causes occur but are not observed. We combine a long-standing approach based on yearly totals, described by Brownie et al. (1985, Statistical Inference from Band Recovery Data: A Handbook, Second edition, United States Fish and Wildlife Service, Washington, Resource Publication, 156), with an exact-time-of-recovery approach originated by Hearn, Sandland and Hampton (1987, Journal du Conseil International pour l'Exploration de la Mer, 43, 107-117), who modeled times at liberty without regard to time of tagging. Our model allows for exact times of release and recovery, incomplete reporting of recoveries, and potential tag shedding. We apply our methods to data on the heavily exploited southern bluefin tuna (Thunnus maccoyii).

Relevance: 90.00%

Abstract:

The emerging carbon economy will have a major impact on grazing businesses because of significant livestock methane and land-use change emissions. Livestock methane emissions alone account for approximately 11% of Australia's reported greenhouse gas emissions. Grazing businesses need to develop an understanding of their greenhouse gas impact and be able to assess the impact of alternative management options. This paper attempts to generate a greenhouse gas budget for two scenarios using a spreadsheet model. The first scenario was based on one land type, '20-year-old brigalow regrowth', in the brigalow bioregion of southern-central Queensland. The 50-year analysis demonstrated the substantially different greenhouse gas outcomes and livestock carrying capacity for three alternative regrowth management options: retain regrowth (sequester 71.5 t carbon dioxide equivalents per hectare, CO2-e/ha), clear all regrowth (emit 42.8 t CO2-e/ha) and clear regrowth strips (emit 5.8 t CO2-e/ha). The second scenario was based on a 'remnant eucalypt savanna-woodland' land type in the Einasleigh Uplands bioregion of north Queensland. The four alternative vegetation management options were: retain current woodland structure (emit 7.4 t CO2-e/ha), allow woodland to thicken increasing tree basal area (sequester 20.7 t CO2-e/ha), thin trees less than 10 cm diameter (emit 8.9 t CO2-e/ha), and thin trees <20 cm diameter (emit 12.4 t CO2-e/ha). Significant assumptions were required to complete the budgets due to gaps in current knowledge on the response of woody vegetation, soil carbon and non-CO2 soil emissions to management options and land type at the property scale. The analyses indicate that there is scope for grazing businesses to choose alternative management options to influence their greenhouse gas budget. However, a key assumption is that accumulation of carbon or avoidance of emissions somewhere on a grazing business (e.g. in woody vegetation or soil) will be recognised as an offset for emissions elsewhere in the business (e.g. livestock methane). This issue will be a challenge for livestock industries and policy makers to work through in the coming years.
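The budget arithmetic reduces to summing signed CO2-e fluxes per option. The vegetation figures below come from the first scenario above; the livestock term is a hypothetical placeholder, since the paper derives it from carrying capacity:

```python
# 50-year vegetation fluxes (t CO2-e/ha) from the brigalow regrowth scenario;
# negative = sequestration (removal), positive = emission
VEGETATION = {
    "retain regrowth": -71.5,
    "clear all regrowth": +42.8,
    "clear regrowth strips": +5.8,
}
livestock_emissions = 30.0  # hypothetical t CO2-e/ha over the 50 years

def net_budget(option):
    """Net greenhouse gas budget = vegetation flux + livestock emissions."""
    return VEGETATION[option] + livestock_emissions

for opt in VEGETATION:
    print(opt, round(net_budget(opt), 1))
```

Under this toy livestock term, only the retain-regrowth option yields a net-negative budget, which is the kind of offset trade-off the abstract's key assumption concerns.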

Relevance: 90.00%

Abstract:

Marker ordering during linkage map construction is a critical component of QTL mapping research. In recent years, high-throughput genotyping methods have become widely used, and these methods may generate hundreds of markers for a single mapping population. This poses problems for linkage analysis software because the number of possible marker orders increases exponentially as the number of markers increases. In this paper, we tested the accuracy of linkage analyses on simulated recombinant inbred line data using the commonly used Map Manager QTX (Manly et al. 2001: Mammalian Genome 12, 930-932) software and RECORD (Van Os et al. 2005: Theoretical and Applied Genetics 112, 30-40). Accuracy was measured by calculating two scores: % correct marker positions, and a novel, weighted rank-based score derived from the sum of absolute values of true minus observed marker ranks divided by the total number of markers. The accuracy of maps generated using Map Manager QTX was considerably lower than those generated using RECORD. Differences in linkage maps were often observed when marker ordering was performed several times using the identical dataset. In order to test the effect of reducing marker numbers on the stability of marker order, we pruned marker datasets focusing on regions consisting of tightly linked clusters of markers, which included redundant markers. Marker pruning improved the accuracy and stability of linkage maps because a single unambiguous marker order was produced that was consistent across replications of analysis. Marker pruning was also applied to a real barley mapping population and QTL analysis was performed using different map versions produced by the different programs. While some QTLs were identified with both map versions, there were large differences in QTL mapping results. Differences included maximum LOD and R2 values at QTL peaks and map positions, thus highlighting the importance of marker order for QTL mapping.
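The rank-based accuracy score is described as the sum of absolute differences between true and observed marker ranks, divided by the number of markers. The sketch below implements that plain description (the paper's additional weighting details are not reproduced here):

```python
def rank_deviation_score(true_order, observed_order):
    """Rank-based map-accuracy score: sum of |true rank - observed rank| over
    all markers, divided by the number of markers (0 = perfect order)."""
    obs_rank = {marker: i for i, marker in enumerate(observed_order)}
    total = sum(abs(i - obs_rank[m]) for i, m in enumerate(true_order))
    return total / len(true_order)

true = ["m1", "m2", "m3", "m4", "m5"]
print(rank_deviation_score(true, true))
print(rank_deviation_score(true, ["m2", "m1", "m3", "m5", "m4"]))
```

Two adjacent swaps in a five-marker map contribute four rank units in total, giving a score of 0.8 in the second call.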

Relevance: 90.00%

Abstract:

For essential elements, such as copper (Cu) and zinc (Zn), the bioavailability in biosolids is important from a nutrient release and a potential contamination perspective. Most ecotoxicity studies are done using metal salts and it has been argued that the bioavailability of metals in biosolids can be different to that of metal salts. We compared the bioavailability of Cu and Zn in biosolids with those of metal salts in the same soils using twelve Australian field trials. Three different measures of bioavailability were assessed: soil solution extraction, CaCl2 extractable fractions and plant uptake. The results showed that bioavailability for Zn was similar in biosolid and salt treatments. For Cu, the results were inconclusive due to strong Cu homeostasis in plants and dissolved organic matter interference in extractable measures. We therefore recommend using isotope dilution methods to assess differences in Cu availability between biosolid and salt treatments.

Relevance: 90.00%

Abstract:

It is essential to provide experimental evidence and reliable predictions of the effects of water stress on crop production in drier, less predictable environments. A field experiment undertaken in southeast Queensland, Australia with three water regimes (fully irrigated, rainfed, and irrigated until late canopy expansion followed by rainfed) was used to compare effects of water stress on crop production in two maize (Zea mays L.) cultivars (Pioneer 34N43 and Pioneer 31H50). Water stress affected growth and yield more in Pioneer 34N43 than in Pioneer 31H50. The APSIM-Maize crop model, after having been calibrated for the two cultivars, was used to simulate maize growth and development under water stress. The predictions of leaf area index (LAI) dynamics, biomass growth and grain yield under the rainfed and the irrigated-then-rainfed treatments were reasonable, indicating that the stress indices used by APSIM-Maize produced appropriate adjustments to crop growth and development in response to water stress. This study shows that Pioneer 31H50 is less sensitive to water stress and thus a preferred cultivar in dryland conditions, and that it is feasible to provide sound predictions and risk assessment for crop production in drier, more variable conditions using the APSIM-Maize model.

Relevance: 90.00%

Abstract:

Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t ha-1 nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield and cause environmental degradation through soil acidification and off-site contamination. The study on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t ha-1 from individual trees) that exceeded the economic threshold of 2.8 t ha-1 were achieved. The yield response was mainly determined by canopy size as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with highest NIS yield. This N timing also reduced late season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g m-2 of canopy surface area (equating to 210 kg N ha-1 for mature-size trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile was recorded to 0.9 m. This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
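The link between the canopy-area N rate and the per-hectare rate is a unit conversion. The implied canopy surface area per hectare below is back-calculated from the two figures quoted above; it is not stated in the abstract:

```python
def n_rate_per_ha(rate_g_per_m2_canopy, canopy_area_m2_per_ha):
    """Convert an N rate per m2 of canopy surface area to kg N per hectare."""
    return rate_g_per_m2_canopy * canopy_area_m2_per_ha / 1000.0

# canopy surface area per hectare implied by the quoted equivalence
# (17 g N per m2 of canopy ~ 210 kg N per ha for mature trees)
implied_canopy_m2_per_ha = 210.0 * 1000.0 / 17.0
print(round(implied_canopy_m2_per_ha))
```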

Relevance: 90.00%

Abstract:

Standardised time series of fishery catch rates require collations of fishing power data on vessel characteristics. Linear mixed models were used to quantify fishing power trends and study the effect of missing data encountered when relying on commercial logbooks. For this, Australian eastern king prawn (Melicertus plebejus) harvests were analysed with historical (from vessel surveys) and current (from commercial logbooks) vessel data. Between 1989 and 2010, fishing power increased up to 76%. To date, both forward-filling and, alternatively, omitting records with missing vessel information from commercial logbooks produce broadly similar fishing power increases and standardised catch rates, due to the strong influence of years with complete vessel data (16 out of 23 years of data). However, if gaps in vessel information had not originated randomly and skippers from the most efficient vessels were the most diligent at filling in logbooks, considerable errors would be introduced. Also, the buffering effect of complete years would be short lived as years with missing data accumulate. Given ongoing changes in fleet profile with high-catching vessels fishing proportionately more of the fleet’s effort, compliance with logbook completion, or alternatively ongoing vessel gear surveys, is required for generating accurate estimates of fishing power and standardised catch rates.
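Forward-filling is the simpler of the two gap-handling options compared above: carry the last reported vessel attribute over missing logbook records. A stdlib-only sketch (the field and values are hypothetical):

```python
def forward_fill(records):
    """Forward-fill missing (None) vessel attributes across logbook records;
    leading gaps with no prior value remain None."""
    out, last = [], None
    for r in records:
        if r is not None:
            last = r
        out.append(last)
    return out

# hypothetical engine-power entries (kW) across six logbook records
engine_kw = [None, 250, None, None, 300, None]
print(forward_fill(engine_kw))
```

As the abstract notes, this is only safe while gaps arise at random; if the most efficient vessels were also the most diligent reporters, filled values would be systematically biased.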

Relevance: 90.00%

Abstract:

Patterns of movement in aquatic animals reflect ecologically important behaviours. Cyclical changes in the abiotic environment influence these movements, but when multiple processes occur simultaneously, identifying which is responsible for the observed movement can be complex. Here we used acoustic telemetry and signal processing to define the abiotic processes responsible for movement patterns in freshwater whiprays (Himantura dalyensis). Acoustic transmitters were implanted into the whiprays and their movements detected over 12 months by an array of passive acoustic receivers, deployed throughout 64 km of the Wenlock River, Qld, Australia. The time of an individual's arrival and departure from each receiver detection field was used to estimate whipray location continuously throughout the study. This created a linear-movement-waveform for each whipray and signal processing revealed periodic components within the waveform. Correlation of movement periodograms with those from abiotic processes categorically illustrated that the diel cycle dominated the pattern of whipray movement during the wet season, whereas tidal and lunar cycles dominated during the dry season. The study methodology represents a valuable tool for objectively defining the relationship between abiotic processes and the movement patterns of free-ranging aquatic animals and is particularly expedient when periods of no detection exist within the animal location data.
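The periodogram step can be sketched with an FFT: a synthetic hourly "linear-movement waveform" with a 24 h (diel) component shows its spectral peak at a 24-hour period. The signal below is simulated; the study derived its waveform from acoustic detections:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic movement waveform: 60 days of hourly positions dominated by a
# diel (24 h) cycle plus noise
hours = np.arange(24 * 60)
waveform = np.sin(2 * np.pi * hours / 24.0) + 0.3 * rng.standard_normal(hours.size)

# periodogram of the mean-centred waveform
spec = np.abs(np.fft.rfft(waveform - waveform.mean())) ** 2
freqs = np.fft.rfftfreq(hours.size, d=1.0)   # cycles per hour
peak_period_h = 1.0 / freqs[np.argmax(spec)]
print(peak_period_h)
```

Correlating such periodograms against those of the tidal, lunar and diel cycles is the matching step the abstract describes for attributing movement to an abiotic driver.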

Relevance: 90.00%

Abstract:

Background: The development of a horse vaccine against Hendra virus has been hailed as a good example of a One Health approach to the control of human disease. Although there is little doubt that this is true, it is clear from the underwhelming uptake of the vaccine by horse owners to date (approximately 10%) that realisation of a One Health approach requires more than just a scientific solution. As emerging infectious diseases may often be linked to the development and implementation of novel vaccines, this presentation will discuss factors influencing their uptake, using Hendra virus in Australia as a case study. Methods: This presentation will draw on data collected from the Horse owners and Hendra virus: A Longitudinal cohort study To Evaluate Risk (HHALTER) study. The HHALTER study is a mixed methods research study comprising a two-year survey-based longitudinal cohort study and a qualitative interview study with horse owners in Australia. The HHALTER study has investigated and tracked changes in a broad range of issues around early uptake of vaccination, horse owner uptake of other recommended disease risk mitigation strategies, and attitudes to government policy and disease response. Interviews provide further insights into attitudes towards risk and decision-making in relation to vaccine uptake. A combination of quantitative and qualitative data analysis will be reported. Results: Data collected from more than 1100 horse owners shortly after vaccine introduction indicated that vaccine uptake and intention to vaccinate were associated with a number of risk perception factors and financial cost factors. In addition, concerns about side effects and veterinarians refusing to treat unvaccinated horses were linked to uptake. Across the study period vaccine uptake in the study cohort increased to more than 50%; however, concerns around side effects, equine performance and breeding impacts, delays to full vaccine approvals, and attempts to mandate vaccination by horse associations and event organisers have all impacted acceptance. Conclusion: Despite being provided with a safe and effective vaccine for Hendra virus that can protect horses and break the transmission cycle of the virus to humans, Australian horse owners have been reluctant to commit to it. General issues pertinent to novel vaccines, combined with challenges in the implementation of the vaccine, have led to issues of mistrust and misconception with some horse owners. Moreover, factors such as cost, booster dose schedules, complexities around perceived risk, and ulterior motives attributed to veterinarians have only served to polarise attitudes to vaccine acceptance.