Abstract:
The Brigalow Belt bioregion of southern and central Queensland supports a large percentage of northern Australia's sown pastures and beef herd. The Brigalow soils were widely thought to have adequate phosphorus (P) for cropping, sown pastures and grazing animals, which has led to almost no use of P fertiliser on sown pastures. The majority of pastures established in the region were sown with tropical grasses only (i.e. no legumes were sown). Under grass-only pastures, nitrogen (N) mineralisation rates decline with time since establishment as N is 'tied up' in soil organic matter. This process leads to a significant decline in pasture and animal productivity and is commonly called 'pasture rundown'. Incorporating pasture legumes has been identified as the best long-term solution to improve the productivity of rundown sown grass pastures. Pasture legumes require adequate P to grow well and to fix the large amounts of N needed to increase the productivity of rundown sown grass pastures. Producers and farm advisors have traditionally thought that P fertiliser is not cost-effective for legume-based improved pastures growing in inland areas of Queensland, despite there being little, if any, data on production responses or their economic outcomes. Recent studies show large and increasing areas of low plant-available soil P and large responses by pasture legumes to P fertiliser on Brigalow soils. The economic analysis in this scoping study indicates potential returns of 9–15% on extra funds invested from the application of P fertiliser when establishing legumes into grass pastures on low-P soils (i.e. soils below the critical P requirement of the legume grown). Higher returns of 12–24% may be possible when adding P fertiliser to already established grass/legume pastures on such soils. As these results suggest potential for significant returns from applying P fertiliser on legume pastures, it is recommended that research be conducted to better quantify the impacts of P fertiliser on productivity and profit. Research priorities include: quantifying the animal production and economic impact of fertilising legume-based pastures in the sub-tropics for currently used legumes; quantifying the comparative P requirements and responses of available legume varieties; understanding clay soil responses to applied P fertiliser; testing the P status of herds grazing in the Brigalow Belt; and quantifying the extent of other nutrient deficiencies (e.g. sulphur and potassium) for legume-based pastures. Development and extension activities are required to demonstrate the commercial impacts of applying P fertiliser to legume-based pastures.
Abstract:
Recolonisation of soil by macrofauna (especially ants, termites and earthworms) in rehabilitated open-cut mine sites is inevitable and, in terms of habitat restoration and function, typically of great value. In these highly disturbed landscapes, soil invertebrates play a major role in soil development (macropore configuration, nutrient cycling, bioturbation, etc.) and can influence hydrological processes such as infiltration, seepage, runoff generation and soil erosion. Understanding and quantifying these ecosystem processes is important in rehabilitation design, establishment and subsequent management to ensure progress towards the desired end goal, especially in waste cover systems designed to prevent water reaching and transporting underlying hazardous waste materials. However, the soil macrofauna is typically overlooked during hydrological modelling, possibly due to uncertainty about the extent of its influence, and this can lead to failure of waste cover systems or rehabilitation activities. We propose that scientific experiments under controlled conditions and field trials on post-mining lands are required to quantify (i) macrofauna–soil structure interactions, (ii) the functional dynamics of macrofauna taxa, and (iii) their effects on macrofauna and soil development over time. Such knowledge would provide crucial information for soil water models, which would increase confidence in mine waste cover design recommendations and eventually lead to a higher likelihood of rehabilitation success on open-cut mining land.
Abstract:
Objective Foodborne illnesses in Australia, including salmonellosis, are estimated to cost over $A1.25 billion annually. The weather has been identified as influential on salmonellosis incidence, as cases increase during summer; however, time series modelling of salmonellosis is challenging because outbreaks cause strong autocorrelation. This study assesses whether a switching model is an improved method of estimating weather–salmonellosis associations. Design We analysed weather and salmonellosis in South-East Queensland between 2004 and 2013 using two common regression models and a switching model, each with 21-day lags for temperature and precipitation. Results The switching model best fit the data, as judged by its substantial improvement in deviance information criterion over the regression models, its less autocorrelated residuals and its control of seasonality. The switching model estimated that a 5°C increase in mean temperature and a 10 mm increase in precipitation were associated with increases in salmonellosis cases of 45.4% (95% CrI 40.4%, 50.5%) and 24.1% (95% CrI 17.0%, 31.6%), respectively. Conclusions Switching models improve on traditional time series models in quantifying weather–salmonellosis associations. A better understanding of how temperature and precipitation influence salmonellosis may identify where interventions can be made to lower the health and economic costs of salmonellosis.
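As a concrete illustration of the lagged-regressor structure described above, the sketch below fits a Poisson GLM with 21-day rolling means of temperature and precipitation to synthetic daily case counts. It is a minimal stand-in for the study's Bayesian switching model, not the authors' code; all data, coefficients and the statsmodels-based workflow are illustrative assumptions.

```python
# Sketch: lagged weather-disease regression (a simple stand-in for the
# paper's Bayesian switching model). All data here are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 3650  # roughly ten years of daily observations
temp = 20 + 8 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
precip = rng.gamma(shape=0.3, scale=10, size=n)

lag = 21  # 21-day lags, as in the study design
df = pd.DataFrame({"temp": temp, "precip": precip})
df["temp_lag"] = df["temp"].rolling(lag).mean()
df["precip_lag"] = df["precip"].rolling(lag).mean()
# Synthetic salmonellosis counts driven by the lagged weather
df["cases"] = rng.poisson(np.exp(0.5 + 0.07 * df["temp_lag"].fillna(20)
                                 + 0.02 * df["precip_lag"].fillna(3)))
df = df.dropna()

X = sm.add_constant(df[["temp_lag", "precip_lag"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
# Multiplicative effect of a 5 degC rise in mean temperature on cases
print(f"+5 degC multiplier: {np.exp(5 * fit.params['temp_lag']):.2f}")
```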
Abstract:
Identifying and quantifying urban growth, and knowing the rate and trends of that growth, would help in regional planning for better infrastructure provision in an environmentally sound way. This requires analysis of spatial and temporal data, which helps in quantifying the trends of growth on a spatial scale. Emerging technologies such as remote sensing and Geographic Information Systems (GIS), along with the Global Positioning System (GPS), help in this regard. Remote sensing aids in the collection of temporal data and GIS helps in spatial analysis. This paper focuses on the analysis of urban growth patterns, in the form of either radial or linear sprawl, along the Bangalore–Mysore highway. Various GIS base layers, such as built-up areas along the highway, the road network, village boundaries, etc., were generated using collateral data such as the Survey of India toposheet. Further, this analysis was complemented with the computation of Shannon's entropy, which helped in identifying the prevalent sprawl zone and rate of growth, and in delineating potential sprawl locations. The computation of Shannon's entropy also helped in delineating regions with dispersed and compact growth. This study reveals that the Bangalore North and South taluks contributed mainly to the sprawl, with a 559% increase in built-up area over a period of 28 years and a high degree of dispersion. The Mysore and Srirangapatna region showed a 128% change in built-up area and a high potential for sprawl, with slightly high dispersion. The degree of sprawl was found to be directly proportional to the distances from the cities.
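Shannon's entropy, as used in sprawl studies of this kind, is straightforward to compute from built-up proportions across zones. The sketch below is a minimal illustration with hypothetical zone areas; the normalisation by log(n) follows the common convention in which values near 1 indicate dispersed (sprawling) growth.

```python
# Sketch: Shannon's entropy as a sprawl indicator. The built-up
# areas per zone below are illustrative, not the study's data.
import math

def sprawl_entropy(builtup_by_zone):
    """Relative Shannon entropy of built-up area across n zones.
    Values near 0 indicate compact growth; values near 1 indicate
    dispersed growth (sprawl)."""
    total = sum(builtup_by_zone)
    p = [b / total for b in builtup_by_zone if b > 0]
    h = -sum(pi * math.log(pi) for pi in p)
    return h / math.log(len(builtup_by_zone))  # normalise by log(n)

# Hypothetical built-up areas (ha) in buffer zones along a highway
compact = [90, 4, 3, 2, 1]
dispersed = [22, 19, 20, 21, 18]
print(sprawl_entropy(compact))    # ~0.28: compact growth
print(sprawl_entropy(dispersed))  # ~1.00: dispersed growth (sprawl)
```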
Abstract:
Quantifying fluxes of nitrous oxide (N2O), a potent greenhouse gas, from soils is necessary to improve our knowledge of terrestrial N2O losses. Developing universal sampling frequencies for calculating annual N2O fluxes is difficult, as fluxes are renowned for their high temporal variability. We demonstrate that daily sampling was largely required to achieve annual N2O fluxes within 10% of the best estimate for 28 annual datasets collected from three continents: Australia, Europe and Asia. Decreasing the regularity of measurements either under- or overestimated annual N2O fluxes, with a maximum overestimation of 935%. Measurement frequency could be lowered using a sampling strategy based on environmental factors known to affect temporal variability, but sampling was still required more than once a week. Consequently, uncertainty in current global terrestrial N2O budgets associated with the upscaling of field-based datasets can be decreased significantly using adequate sampling frequencies.
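The effect of sampling frequency on annual flux estimates can be illustrated with a simple subsampling experiment. The sketch below uses a synthetic daily flux series with episodic emission peaks (illustrative only, not the study's data) and shows how less frequent sampling under- or overestimates the annual total depending on whether emission events are missed or over-weighted.

```python
# Sketch: how sampling frequency biases annual N2O flux estimates.
# Synthetic daily fluxes with episodic emission peaks (illustrative).
import numpy as np

rng = np.random.default_rng(1)
days = 365
base = rng.lognormal(mean=0.0, sigma=0.5, size=days)  # background flux
peaks = np.zeros(days)
peaks[rng.choice(days, 10, replace=False)] = rng.uniform(20, 50, 10)
daily_flux = base + peaks  # e.g. g N2O-N per ha per day

best_estimate = daily_flux.sum()  # "true" annual flux, daily sampling

def annual_estimate(flux, interval):
    """Annual flux from samples taken every `interval` days, assuming
    each measurement represents the whole interval (common upscaling)."""
    samples = flux[::interval]
    return samples.mean() * len(flux)

for interval in (1, 7, 14, 28):
    est = annual_estimate(daily_flux, interval)
    print(f"every {interval:2d} d: {100 * (est / best_estimate - 1):+6.1f}% bias")
```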
Abstract:
Microbial activity in soils is the main source of nitrous oxide (N2O) to the atmosphere. Nitrous oxide is a strong greenhouse gas in the troposphere and participates in ozone-destructive reactions in the stratosphere. The constant increase in its atmospheric concentration, as well as uncertainties in the known sources and sinks of N2O, underline the need to better understand the processes and pathways of N2O in terrestrial ecosystems. This study aimed at quantifying N2O emissions from soils in northern Europe and at investigating the processes and pathways of N2O from agricultural and forest ecosystems. Emissions were measured in forest ecosystems, agricultural soils and a landfill, using the soil gradient, chamber and eddy covariance methods. The processes responsible for N2O production, and the pathways of N2O from the soil to the atmosphere, were studied in the laboratory and in the field. These ecosystems were chosen for their potential importance to the national and global budgets of N2O. Laboratory experiments with boreal agricultural soils revealed that N2O production increases drastically with soil moisture content, and that the contribution of the nitrification and denitrification processes to N2O emissions depends on soil type. A laboratory study with beech (Fagus sylvatica) seedlings demonstrated that trees can serve as conduits for N2O from the soil to the atmosphere. If this mechanism is important in forest ecosystems, current emission estimates from forest soils may underestimate the total N2O emissions from forest ecosystems. Further field and laboratory studies are needed to evaluate the importance of this mechanism in forest ecosystems. The emissions of N2O from northern forest ecosystems and a municipal landfill were highly variable in time and space. The emissions of N2O from boreal upland forest soil were among the smallest reported in the world. Despite the low emission rates, the soil gradient method revealed a clear seasonal variation in N2O production. The organic topsoil was responsible for most of the N2O production and consumption in this forest soil. Emissions from the municipal landfill were one to two orders of magnitude higher than those from agricultural soils, which are the most important source of N2O to the atmosphere. Due to their small areal coverage, however, landfills contribute only minimally to national N2O emissions in Finland. The eddy covariance technique was demonstrated to be useful for measuring ecosystem-scale emissions of N2O in forest and landfill ecosystems. Overall, more measurements and integration between different measurement techniques are needed to capture the large variability in N2O emissions from natural and managed northern ecosystems.
Abstract:
Site-specific geotechnical data are always random and variable in space. In the present study, a procedure for quantifying the variability in geotechnical characterization and design parameters is discussed using site-specific cone tip resistance data (qc) obtained from the static cone penetration test (SCPT). The parameters for spatial variability modeling of geotechnical parameters, i.e. (i) the existing trend function in the in situ qc data; (ii) second-moment statistics, i.e. the mean, variance and auto-correlation structure of the soil strength and stiffness parameters; and (iii) inputs from the spatial correlation analysis, are utilized in numerical modeling procedures using the finite difference numerical code FLAC 5.0. The influence of considering spatially variable soil parameters on reliability-based geotechnical design is studied for two cases: (a) the bearing capacity analysis of a shallow foundation resting on a clayey soil, and (b) the analysis of the stability and deformation pattern of a cohesive-frictional soil slope. The study highlights the procedure for conducting a site-specific study using field test data such as SCPT in geotechnical analysis and demonstrates that a few additional computations involving soil variability provide better insight into the role of variability in design.
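The second-moment workflow the abstract describes (detrend the qc profile, then estimate the mean, variance and autocorrelation of the residuals) can be sketched as follows. The synthetic qc profile and the AR(1) residual model are illustrative assumptions, not the study's data or its FLAC 5.0 procedure.

```python
# Sketch: second-moment statistics of cone tip resistance (qc) with
# depth. A synthetic qc profile stands in for SCPT data; a linear
# trend is removed and the residual autocorrelation is estimated.
import numpy as np

rng = np.random.default_rng(2)
depth = np.arange(0.0, 20.0, 0.1)   # m
trend = 1.5 + 0.4 * depth            # MPa, linear trend with depth
# Correlated residuals via a simple AR(1) process (illustrative)
rho = 0.9
eps = np.zeros(depth.size)
for i in range(1, depth.size):
    eps[i] = rho * eps[i - 1] + rng.normal(0, 0.3)
qc = trend + eps

# (i) detrend: fit and subtract a linear trend function
coef = np.polyfit(depth, qc, 1)
resid = qc - np.polyval(coef, depth)

# (ii) second-moment statistics of the residuals
print("mean:", resid.mean(), " variance:", resid.var())

# (iii) sample autocorrelation at lag k (0.1 m depth increments)
def autocorr(x, k):
    return np.corrcoef(x[:-k], x[k:])[0, 1]

for k in (1, 5, 10, 20):
    print(f"lag {0.1 * k:.1f} m: rho = {autocorr(resid, k):.2f}")
```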
Abstract:
Efficient and effective growth factor (GF) delivery is an ongoing challenge for tissue regeneration therapies. The accurate quantification of complex molecules such as GFs, encapsulated in polymeric delivery devices, is equally critical and just as complex as achieving efficient delivery of active GFs. In this study, GFs relevant to bone tissue formation, vascular endothelial growth factor (VEGF) and bone morphogenetic protein 7 (BMP-7), were encapsulated, using the technique of electrospraying, into poly(lactic-co-glycolic acid) microparticles that contained poly(ethylene glycol) and trehalose to assist GF bioactivity. Typical quantification procedures, such as extraction and release assays using saline buffer, generated a significant degree of GF interactions, which impaired accurate assessment by enzyme-linked immunosorbent assay (ELISA). When both dry BMP-7 and VEGF were processed with chloroform, as is the case during the electrospraying process, reduced concentrations of the GFs were detected by ELISA; however, the biological effect on myoblast cells (C2C12) or endothelial cells (HUVECs) was unaffected. When electrosprayed particles containing BMP-7 were cultured with preosteoblasts (MC3T3-E1), significant cell differentiation into osteoblasts was observed up to 3 weeks in culture, as assessed by measuring alkaline phosphatase. In conclusion, this study showed how electrosprayed microparticles ensured efficient delivery of fully active GFs relevant to bone tissue engineering. Critically, it also highlights major discrepancies in quantifying GFs in polymeric microparticle systems when comparing ELISA with cell-based assays.
Abstract:
Accurately quantifying total greenhouse gas emissions (e.g. methane) from natural systems such as lakes, reservoirs and wetlands requires the spatiotemporal measurement of both diffusive and ebullitive (bubbling) emissions. Traditional manual measurement techniques provide only a limited, localised assessment of methane flux, often introducing significant errors when extrapolated to the whole of the system. In this paper, we directly address these sampling limitations and present a novel multi-robot boat system configured to measure the spatiotemporal release of methane to the atmosphere across inland waterways. The system, consisting of multiple networked Autonomous Surface Vehicles (ASVs) and capable of persistent operation, enables scientists to remotely evaluate the performance of sampling and modelling algorithms for real-world process quantification over extended periods of time. This paper provides an overview of the multi-robot sampling system, including the vehicle and gas sampling unit design. Experimental results demonstrate the system's ability to autonomously navigate and implement an exploratory sampling algorithm to measure methane emissions on two inland reservoirs.
Abstract:
Researchers are usually assessed from a researcher-centric perspective, by quantifying a researcher's contribution to the field; citation and publication counts are typical examples. We propose a student-centric measure to assess researchers on their mentoring abilities. Our approach quantifies the benefits bestowed by researchers upon their students by characterizing the publication dynamics of research advisor-student interactions in author collaboration networks. We show that our measures could help aspiring students identify research advisors with proven mentoring skills. Our measures also help stratify researchers who hold similar ranks under typical indices such as publication and citation counts, while being independent of their direct influences.
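The abstract does not spell out the measure itself, but a student-centric indicator of this general flavour can be sketched as below: for each advisor, compare students' independent output with their output while co-authoring with the advisor. The records, field names and scoring rule are hypothetical illustrations only, not the paper's measure.

```python
# Sketch of a hypothetical student-centric mentoring indicator.
# For each advisor, compare students' publication counts during and
# after the advising period, on a toy advisor-student record.
from collections import defaultdict

# (advisor, student, pubs_with_advisor, pubs_after_graduation)
records = [
    ("smith", "a", 4, 9),
    ("smith", "b", 3, 7),
    ("jones", "c", 6, 2),
    ("jones", "d", 5, 1),
]

def mentoring_score(records):
    """Mean ratio of a student's independent output to their output
    while co-authoring with the advisor; values above 1 suggest
    students keep publishing after leaving the group."""
    by_advisor = defaultdict(list)
    for advisor, _, with_adv, after in records:
        by_advisor[advisor].append(after / max(with_adv, 1))
    return {a: sum(r) / len(r) for a, r in by_advisor.items()}

print(mentoring_score(records))  # {'smith': ~2.3, 'jones': ~0.27}
```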
Abstract:
The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects and, in a case study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL) and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a probability of 0.93 of a substantially greater improvement in running economy, and a probability greater than 0.96 that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. These conclusions are consistent with those obtained using the 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
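Once posterior draws are available from MCMC, magnitude-based probability statements like those quoted above reduce to simple tail proportions of the posterior. The sketch below demonstrates this step on synthetic draws; the effect distribution and the 1% "smallest substantial effect" threshold are assumptions for illustration, not the study's values.

```python
# Sketch: turning posterior samples into magnitude-based probability
# statements. The draws below are synthetic stand-ins for MCMC output.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic posterior draws of the LHTL-vs-IHE difference in
# hemoglobin-mass change (%); mean and sd are illustrative only
delta = rng.normal(loc=3.0, scale=1.6, size=20_000)

threshold = 1.0  # smallest substantial effect (%), an assumption
p_substantial_increase = np.mean(delta > threshold)
p_trivial = np.mean(np.abs(delta) <= threshold)
p_substantial_decrease = np.mean(delta < -threshold)

print(f"P(substantial increase) = {p_substantial_increase:.2f}")
print(f"P(trivial effect)       = {p_trivial:.2f}")
print(f"P(substantial decrease) = {p_substantial_decrease:.2f}")
```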
Abstract:
New stars in galaxies form in dense molecular clouds of the interstellar medium. Measuring how mass is distributed in these clouds is of crucial importance for current theories of star formation, because several open issues in those theories, such as the strength of the different mechanisms regulating star formation and the origin of stellar masses, can be addressed using detailed information on cloud structure. Unfortunately, quantifying the mass distribution in molecular clouds accurately, over a wide spatial and dynamical range, is a fundamental problem in modern astrophysics. This thesis presents studies examining the structure of dense molecular clouds and the distribution of mass in them, with emphasis on nearby clouds that are sites of low-mass star formation. In particular, this thesis concentrates on investigating mass distributions using the near-infrared dust extinction mapping technique. In this technique, the gas column densities towards molecular clouds are determined by examining radiation from stars that shine through the clouds. In addition, the thesis examines the feasibility of using a similar technique to derive the masses of molecular clouds in nearby external galaxies. The papers presented in this thesis demonstrate how the near-infrared dust extinction mapping technique can be used to extract detailed information on the mass distribution in nearby molecular clouds. Furthermore, such information is used to examine characteristics crucial for star formation in the clouds. Regarding the use of the extinction mapping technique in nearby galaxies, the papers of this thesis show that deriving the masses of molecular clouds with the technique suffers from strong biases. However, it is shown that some structural properties can still be examined with the technique.
Abstract:
A better understanding of stock price changes is important in guiding many economic activities. Since prices often do not change without good reason, searching for related explanatory variables has attracted many enthusiasts. This book seeks answers from prices per se by relating price changes to their conditional moments. This is based on the belief that prices are the product of a complex psychological and economic process, and that their conditional moments derive ultimately from these psychological and economic shocks. Utilizing information about conditional moments hence makes this an attractive alternative to using other selective financial variables in explaining price changes. The first paper examines the relation between the conditional mean and the conditional variance using information about moments in three types of conditional distributions; it finds that the significance of the estimated mean-variance ratio can be affected by the assumed distributions and by time variations in skewness. The second paper decomposes conditional industry volatility into a concurrent market component and an industry-specific component; it finds that market volatility is on average responsible for a rather small share of total industry volatility (6 to 9 percent in the UK and 2 to 3 percent in Germany). The third paper looks at the heteroskedasticity in stock returns through an ARCH process supplemented with a set of conditioning information variables; it finds that stock returns exhibit several forms of heteroskedasticity, including deterministic changes in variances due to seasonal factors, random adjustments in variances due to market and macro factors, and ARCH processes with past information. The fourth paper examines the role of higher moments, especially skewness and kurtosis, in determining expected returns; it finds that total skewness and total kurtosis are more relevant non-beta risk measures and that they are costly to diversify away, due either to the possible elimination of their desirable parts or to the unsustainability of diversification strategies based on them.
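For readers unfamiliar with the ARCH family used in the third paper, the sketch below simulates a basic ARCH(1) process, in which the conditional variance is sigma_t^2 = omega + alpha * r_{t-1}^2, and checks for the resulting volatility clustering. Parameters are illustrative; the book's model additionally conditions on information variables.

```python
# Sketch: an ARCH(1) process, the simplest member of the
# heteroskedasticity family the third paper builds on.
import numpy as np

rng = np.random.default_rng(4)
n, omega, alpha = 1000, 0.1, 0.6  # sigma_t^2 = omega + alpha * r_{t-1}^2
r = np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha))  # unconditional variance
for t in range(1, n):
    sigma2[t] = omega + alpha * r[t - 1] ** 2   # conditional variance
    r[t] = np.sqrt(sigma2[t]) * rng.normal()    # return shock

# Volatility clustering shows up as autocorrelation in squared returns
sq = r ** 2
lag1 = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print(f"lag-1 autocorrelation of squared returns: {lag1:.2f}")
```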
Abstract:
Questions of the small size of non-industrial private forest (NIPF) holdings in Finland are considered and factors affecting their partitioning are analyzed. This work arises out of Finnish forest policy statements, in which the small average size of holdings has been seen to have a negative influence on the economics of forestry. A survey of the literature indicates that the size of holdings is an important factor determining the costs of logging and silvicultural operations, while its influence on the timber supply is slight. The empirical data are based on a sample of 314 holdings collected by interviewing forest owners in the years 1980-86. In 1990-91 the same holdings were resurveyed by means of a postal inquiry, partly supplemented by interviews with forest owners. The principal objective in compiling the data was to assist in quantifying the ownership factors that influence partitioning among different kinds of NIPF holdings. Thus the mechanism of partitioning was described and a maximum likelihood logistic regression model was constructed using seven independent holding and ownership variables. One out of four holdings had undergone partitioning in conjunction with a change in ownership: one fifth among family-owned holdings and nearly a half among jointly owned holdings. The results of the logistic regression model indicate, for instance, that the odds of partitioning are about three times greater for jointly owned holdings than for family-owned ones. The probabilities of partitioning were also estimated, and the impact of the independent dichotomous variables on the probability of partitioning ranged between 0.02 and 0.10. The low value of the Hosmer-Lemeshow test statistic indicates a good fit of the model, and the rate of correct classification was estimated to be 88 per cent with a cutoff point of 0.5. The average size of holdings undergoing ownership changes decreased from 29.9 ha to 28.7 ha over the approximate interval 1983-90. In addition, the transition probability matrix showed that the shifts towards smaller size categories mostly occurred within the small size categories of less than 20 ha. The results of the study can be used in considering the effects of the small size of holdings on forestry, and when the aim is to influence partitioning through forest or rural policy.
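A minimal sketch of the kind of logistic model described above follows. The intercept, the area coefficient and the inputs are hypothetical; only the jointly-owned odds ratio of about three echoes the abstract's reported result.

```python
# Sketch: a logistic model of the probability that a holding is
# partitioned at ownership change, in the spirit of the study's
# maximum-likelihood logistic regression. Coefficients are illustrative.
import numpy as np

def partition_probability(jointly_owned, area_ha,
                          b0=-2.0, b_joint=np.log(3.0), b_area=-0.01):
    """P(partitioning) from a hypothetical fitted logistic model.
    b_joint = ln(3) encodes the roughly threefold odds reported for
    jointly owned holdings relative to family-owned ones."""
    logit = b0 + b_joint * jointly_owned + b_area * area_ha
    return 1.0 / (1.0 + np.exp(-logit))

print(partition_probability(jointly_owned=0, area_ha=30))  # family owned
print(partition_probability(jointly_owned=1, area_ha=30))  # jointly owned
```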
Abstract:
Results are reported of comparative measurements made in 14 HV (high-voltage) laboratories in ten different countries. The theory of the proposed methods of characterizing the dynamic behavior is given, and the parameters to be used are discussed. Comparative measurements made using 95 systems based on 53 dividers are analyzed. This analysis shows that many of the systems now in use, even though they fulfil the basic response requirements of the standards, do not meet the accuracy requirements. Because no transfer measurements were made between laboratories, there is no way to detect similar errors in both the system under test and the reference system; hence, the situation may be worse than reported. This has led to the recommendation that comparative measurements should be the main route for quantifying industrial impulse measuring systems.