Abstract:
Methane (CH4) is an important greenhouse gas with a global warming potential (GWP) 25 times greater than that of carbon dioxide (CO2), and it can be produced or consumed in soils depending on environmental conditions and other factors. Biochar application to soils has been shown to reduce CH4 emissions and to increase CH4 consumption. However, the effects of rice husk biochar (RB) have not been thoroughly investigated. Two 60-day laboratory incubation experiments were conducted to investigate the effects of amending two soil types with RB, raw mill mud (MM) and composted mill mud (CM) on soil CH4 consumption and emissions. Soil cores were incubated in 1 L glass jars, and gas samples were analysed for CH4 using gas chromatography. Average CH4 consumption rates varied from -0.06 to -0.68 g CH4-C/ha/d in sandy loam soil and -0.59 to -1.00 g CH4-C/ha/d in clay soil. Application of RB resulted in CH4 uptake of -0.52 to -0.55 g CH4-C/ha/d in sandy loam and -0.76 to -0.91 g CH4-C/ha/d in clay soil. Addition of MM showed low CH4 emissions or consumption at 60% water-filled pore space (WFPS) in both soils. However, at high water contents (>75% WFPS) the application of MM produced high rates of CH4 emissions, which were significantly suppressed when RB was added. Cumulative emissions of the MM treatment were 108.9 g CH4-C/ha at 75% WFPS and 11,459.3 g CH4-C/ha at 90% WFPS in sandy loam soil over a period of 60 days. RB can increase CH4 uptake under low soil water content (SWC) and decrease CH4 emissions under anaerobic conditions. CM showed greater potential to reduce CH4 emissions than MM.
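The abstract reports fluxes as grams of CH4-C, which can be converted to a CO2-equivalent mass using the GWP of 25 it cites. A minimal sketch (the function name and unit choices are my own, not the paper's):

```python
# Converting a cumulative CH4-C flux into CO2-equivalent mass, using the
# GWP of 25 stated in the abstract. The 16/12 factor converts the mass
# of carbon to the mass of CH4 (molar masses 16 g/mol CH4, 12 g/mol C).
GWP_CH4 = 25.0
M_CH4_PER_C = 16.0 / 12.0     # g CH4 per g CH4-C

def ch4c_to_co2e_kg(ch4_c_grams):
    """CO2-equivalent mass (kg) from a CH4-C mass (g)."""
    return ch4_c_grams * M_CH4_PER_C * GWP_CH4 / 1000.0

# Cumulative MM-treatment emission at 90% WFPS from the abstract.
print(round(ch4c_to_co2e_kg(11459.3), 1))  # ≈ 382.0 kg CO2-e/ha
```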
Abstract:
The carousel wind tunnel (CWT) can be a significant tool for determining the nature and magnitude of interparticle forces at the threshold of motion. By altering particle and drum surface electrical properties and/or by applying an electric potential difference across the inner and outer drums, it should be possible to separate electrostatic effects from other forces of cohesion. Besides particle trajectory and bedform analyses, suggestions for research include particle aggregation in zero- and sub-gravity environments, the effect of the suspension-saltation ratio on soil abrasion, and the effects of shear and shear-free turbulence on particle aggregation as applied to the evolution of the solar nebula.
Abstract:
Numerous crops grow in sugar regions that have the potential to increase the amount of biomass available to a small bagasse-based pulp factory. Arundo donax and sorghum offer unique advantages to farmers compared to other agricultural crops. Sorghum bicolor requires only one-third of the water of sugarcane. Arundo donax is a very high-yield crop that can also grow with little water, and it has the further advantage of being highly stress tolerant, making it suitable for land that is unsuited to other crops. Pulps produced from these crops were benchmarked against sugarcane bagasse pulp. Arundo, sorghum and bagasse were pulped using KOH and anthraquinone to 20 Kappa number so as to produce a bleachable pulp. The unbleached sorghum pulp had better tensile strength properties than the unbleached Arundo pulp (43.8 Nm/g compared to 21.4 Nm/g), and the bleached sorghum pulp tensile strength was similar to bagasse (28.4 Nm/g). At 20 Kappa number, sorghum pulp had an acceptable yield for a non-wood fibre (45% cf. 55% for bagasse), whereas Arundo donax pulp had low tensile strength and relatively low yield (38.7%), even for an agricultural fibre, and required severe cooking conditions to achieve delignification similar to sugarcane bagasse or sorghum. Sorghum and Arundo donax produced thicker handsheets than bagasse (>160 μm cf. 122 μm for bagasse). In preliminary experiments sorghum and bagasse responded slightly better to Totally Chlorine Free bleaching (QPP), although none achieved a satisfactory brightness level and more optimisation is needed.
Abstract:
The texture of agricultural crops changes during harvesting, post-harvest handling and processing due to different loading processes. The sources of loading that deform agricultural crop tissues include impact, compression and tension. Scanning electron microscopy (SEM) is a common way of analysing cellular changes in materials before and after these loading operations. This paper examines the structural changes of pumpkin peel and flesh tissues under mechanical loading. Compression and indentation tests were performed on peel and flesh samples. Samples were then fixed and dehydrated in order to capture the cellular changes under SEM. The results were compared with images of normal peel and flesh tissues. The findings suggest that normal flesh tissue had larger cells, while the cells in the peel were smaller. Structural damage was clearly observed in the tissue structure after compression and indentation. However, the damage resulting from the flat-end indenter was much more severe than that from the spherical-end indenter or the compression test. An integrated deformed tissue layer was observed in compressed tissue, while the indentation tests produced a deformed area under the indenter and left the rest of the tissue unharmed. There was an obvious broken layer of cells on the walls of the hole after the flat-end indentations, whereas the spherical indenter created a squashed layer all around the hole. Furthermore, the influence of loading was lower on peel samples than on flesh samples. The experiments have shown that the rate of damage to tissue under a constant rate of loading is highly dependent on the shape of the equipment. This fact, together with the observed structural changes after loading, underlines the significance of designing post-harvest equipment to reduce the rate of damage to agricultural crop tissues.
Abstract:
Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set so as to maximise the determinant of the variance-covariance matrix of the predictions.
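The simplest augmentation strategy described here (add, one at a time, the candidate point with maximum prediction variance) can be sketched with a toy Gaussian-process surrogate. The kernel, length-scale and candidate grid below are illustrative assumptions, not the paper's fire model:

```python
import numpy as np

# Squared-exponential kernel, the standard surrogate choice in
# computer experiments (length-scale is an assumed value).
def kernel(a, b, ls=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_variance(X, Xcand, jitter=1e-8):
    """GP predictive variance at candidate points, prior variance 1."""
    K = kernel(X, X) + jitter * np.eye(len(X))
    k = kernel(X, Xcand)
    Kinv = np.linalg.inv(K)
    # diag(k^T K^{-1} k) without forming the full matrix.
    return 1.0 - np.einsum('ij,ik,kj->j', k, Kinv, k)

# Start from a small design on [0, 1] and augment it one run at a
# time at the candidate with maximum prediction variance.
X = np.array([0.1, 0.5, 0.9])
cand = np.linspace(0.0, 1.0, 101)
for _ in range(5):
    var = gp_variance(X, cand)
    X = np.append(X, cand[np.argmax(var)])

print(np.round(np.sort(X), 2))
```

The selected points spread out to fill the gaps in the design, since predictive variance is largest far from existing runs; a candidate already in the design has near-zero variance and is never re-selected.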
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct, so complex computer models, or codes, are used instead to investigate many scientific phenomena; their use leads to the study of computer experiments. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, the question of how many computer runs are needed and how they should be augmented is studied, and attention is given to the case when the response is a function over time.
Abstract:
This research project explores how interdisciplinary art practices can provide ways for questioning and envisaging alternative modes of coexistence between humans and the non-humans who together make up the environment. As a practice-led project, it combines a body of creative work (50%) and this exegesis (50%). My interdisciplinary artistic practice appropriates methods and processes from science and engineering and merges them into artistic contexts for critical and poetic ends. By blending pseudo-scientific experimentation with creative strategies like visual fiction, humour, absurd public performance and scripted audience participation, my work engages with a range of debates around ecology. This exegesis details the interplay between critical theory relating to these debates, the work of other creative practitioners and my own evolving artistic practice. Through utilising methods and processes drawn from my prior career in water engineering, I present an interdisciplinary synthesis that seeks to promote improved understandings of the causes and consequences of our ecological actions and inactions.
Abstract:
A procedure for the evaluation of multiple scattering contributions is described for deep inelastic neutron scattering (DINS) studies using an inverse-geometry time-of-flight spectrometer. The accuracy of a Monte Carlo code, DINSMS, used to calculate the multiple scattering, is tested by comparison with analytic expressions and with experimental data collected from polythene, polycrystalline graphite and tin samples. It is shown that the Monte Carlo code gives an accurate representation of the measured data and can therefore be used to correct DINS data reliably.
Abstract:
Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency code and/or phase GPS measurements are linearly combined to eliminate the effects of ionospheric delays in various positioning and analysis tasks. This typical treatment has limitations in processing signals at three or more frequencies from more than one system and can hardly be adapted to cope with the booming variety of receivers and signals. In this contribution, a generalized positioning model that is navigation-system independent and carrier-number unrelated is proposed, suitable for both single- and multi-site data processing. For the synchronization of different signals, uncalibrated signal delays (USD) are defined more generally to compensate for the signal-specific offsets in code and phase signals respectively. In addition, the ionospheric delays are included in the parameterization with elaborate consideration. Based on an analysis of the algebraic structures, this generalized positioning model is further refined with a set of proper constraints to regularize the datum deficiency of the observation equation system. With this new model, uncalibrated signal delays and ionospheric delays are derived for both GPS and BeiDou with a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated with 37 stations evenly distributed in China for GPS with a consistency of about 0.3 cycles. Additional experiments concerning the performance of this novel model in point positioning with mixed frequencies of mixed constellations are analyzed, in which the USD parameters are fixed to our generated values.
The results are evaluated in terms of both positioning accuracy and convergence time.
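For context, the "traditional" dual-frequency treatment that the abstract contrasts with its generalized model is the first-order ionosphere-free linear combination. A minimal sketch with illustrative (not real) pseudorange numbers:

```python
# Dual-frequency ionosphere-free code combination for GPS L1/L2:
# IF = (f1^2 * P1 - f2^2 * P2) / (f1^2 - f2^2).
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def iono_free(p1, p2, f1=F1, f2=F2):
    """First-order ionosphere-free combination of two code observables."""
    g = (f1 ** 2) / (f1 ** 2 - f2 ** 2)
    return g * p1 - (g - 1.0) * p2

# The first-order ionospheric delay scales with 1/f^2, so a slant delay
# of I on L1 appears as I * (f1/f2)^2 on L2 and cancels in the
# combination. Values below are hypothetical, for illustration only.
rho = 22_345_678.901           # geometric range + clock terms, metres
I = 4.2                        # hypothetical L1 ionospheric delay, metres
p1 = rho + I
p2 = rho + I * (F1 / F2) ** 2
print(iono_free(p1, p2) - rho)  # ~0: ionosphere eliminated
```

The price of this elimination, as the abstract notes, is that the combination is tied to one frequency pair per system, which does not generalize cleanly to triple-frequency, multi-constellation processing.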
Abstract:
In this paper we describe cooperative control algorithms for robots and sensor nodes in an underwater environment. Cooperative navigation is defined as the ability of a coupled system of autonomous robots to pool their resources to achieve long-distance navigation and a larger controllability space. Other types of useful cooperation in underwater environments include: exchange of information such as data download and retasking; cooperative localization and tracking; and physical connection (docking) for tasks such as deployment of underwater sensor networks, collection of nodes and rescue of damaged robots. We present experimental results obtained with an underwater system that consists of two very different robots and a number of sensor network modules. We present the hardware and software architecture of this underwater system. We then describe various interactions between the robots and sensor nodes and between the two robots, including cooperative navigation. Finally, we describe our experiments with this underwater system and present data.
Abstract:
The research addresses how an understanding of the fundamentals of economics will better inform general journalists who report on issues or events affecting rural and regional Australia. The research draws on the practice-based experience of the author, formal economics studies, interviews with news editors from Australian television news organisations, and interviews with leading economists. A guidebook has also been written to help journalists apply economic theories to their reporting. The guidebook enables reporters to think strategically and consider the 'big picture' when they inform society about policies, commodity trade, the environment, or any issues involving rural and regional Australia.
Abstract:
This article examines manual textual categorisation by human coders, with the hypothesis that the law of total probability may be violated for difficult categories. An empirical evaluation was conducted to compare a one-step categorisation task with a two-step categorisation task using crowdsourcing. It was found that the law of total probability was violated. Both quantum and classical probabilistic interpretations of this violation are presented. Further studies are required to resolve whether quantum models are more appropriate for this task.
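The violation being tested can be stated numerically: the one-step probability P(A) is compared with the two-step total-probability prediction P(A|B)P(B) + P(A|¬B)P(¬B). The proportions below are hypothetical illustrations, not the paper's data:

```python
# Hypothetical coder-judgement proportions illustrating how a violation
# of the law of total probability is measured.
p_A_onestep = 0.62             # P(A) when coders categorise directly

# Two-step task: first judge B vs not-B, then judge A.
p_B = 0.40                     # P(B)
p_A_given_B = 0.75             # P(A | B)
p_A_given_notB = 0.45          # P(A | not B)

# The law of total probability predicts:
p_A_total = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)

# The gap is the "interference" term that quantum models introduce
# to account for the discrepancy; classically it should be zero.
interference = p_A_onestep - p_A_total
print(round(p_A_total, 3), round(interference, 3))
```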
Abstract:
A new cold-formed and resistance welded section known as the Hollow Flange Beam (HFB) has been developed recently in Australia. In contrast to the common lateral torsional buckling mode of I-beams, this unique section comprising two stiff triangular flanges and a slender web is susceptible to a lateral distortional buckling mode of failure involving lateral deflection, twist and cross-section change due to web distortion. This lateral distortional buckling behaviour has been shown to cause significant reduction of the available flexural strength of HFBs. An investigation using finite element analyses and large-scale experiments was carried out into the use of transverse web plate stiffeners to improve the lateral buckling capacity of HFBs. This paper presents the details of the experimental investigation, the results, and the final stiffener arrangement, whereas the details of the finite element analyses are presented in a companion paper at this conference.
Abstract:
The hollow flange beam (HFB) is a new cold-formed and resistance-welded section developed in Australia. Due to its unique geometry comprising two stiff triangular flanges and a slender web, the HFB is susceptible to a lateral-distortional buckling mode of failure involving web distortion. Investigation using finite-element analyses showed that the use of transverse web plate stiffeners effectively eliminated lateral-distortional buckling of HFBs and thus any associated reduction in flexural capacity. A detailed experimental investigation was then carried out to validate the results from the finite-element analysis and to improve the stiffener configuration further. This led to the development of a special stiffener that is screw-fastened to the flanges on alternate sides of the web. This paper presents the details of the experimental investigations, the results, and the final stiffener arrangement, whereas the details of the finite-element analyses are presented in a companion paper.