884 results for "fully automated"
Abstract:
Global cereal production will need to increase by 50% to 70% to feed a world population of about 9 billion by 2050. This intensification is forecast to occur mostly in subtropical regions, where warm and humid conditions can promote high N2O losses from cropped soils. To secure high crop production without exacerbating N2O emissions, new nitrogen (N) fertiliser management strategies are necessary. This one-year study evaluated the efficacy of a nitrification inhibitor (3,4-dimethylpyrazole phosphate, DMPP) and different N fertiliser rates to reduce N2O emissions in a wheat–maize rotation in subtropical Australia. Annual N2O emissions were monitored using a fully automated greenhouse gas measuring system. Four treatments were fertilised with different rates of urea: a control (40 kg-N ha−1 year−1), a conventional N fertiliser rate adjusted for estimated residual soil N (120 kg-N ha−1 year−1), a conventional N fertiliser rate (240 kg-N ha−1 year−1) and the same conventional rate (240 kg-N ha−1 year−1) with the nitrification inhibitor DMPP applied at top dressing. The maize season was by far the main contributor to annual N2O emissions due to the high soil moisture and temperature conditions, as well as the elevated N rates applied. Annual N2O emissions in the four treatments amounted to 0.49, 0.84, 2.02 and 0.74 kg N2O–N ha−1 year−1, respectively, and corresponded to emission factors of 0.29%, 0.39%, 0.69% and 0.16% of total N applied. Halving the annual conventional N fertiliser rate in the adjusted N treatment led to N2O emissions comparable to the DMPP treatment but substantially penalised maize yield. The application of DMPP produced a significant reduction in N2O emissions only in the maize season. The use of DMPP with urea at the conventional N rate reduced annual N2O emissions by more than 60% but did not affect crop yields.
The results of this study indicate that: (i) future strategies aimed at securing subtropical cereal production without increasing N2O emissions should focus on the fertilisation of the summer crop; (ii) adjusting conventional N fertiliser rates for estimated residual soil N is an effective practice to reduce N2O emissions but can lead to substantial yield losses if the residual soil N is not assessed correctly; (iii) the application of DMPP is a feasible strategy to reduce annual N2O emissions from subtropical wheat–maize rotations. However, at the N rates tested in this study DMPP urea did not increase crop yields, making it impossible to recoup the extra costs associated with this fertiliser. The findings of this study will help farmers and policy makers define effective fertilisation strategies that reduce N2O emissions from subtropical cereal cropping systems while maintaining high crop productivity. More research is needed to assess whether DMPP urea allows conventional N fertiliser rates to be reduced, which would lower fertilisation costs and further abate fertiliser-induced N2O emissions.
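The emission factors reported above can be illustrated with the standard difference method: emissions in excess of the control, divided by the extra N applied. A minimal sketch using the annual totals from the abstract; note the abstract expresses its factors as a share of total N applied, so the study's exact baseline calculation may differ:

```python
# Fertiliser-induced N2O emission factor via the difference method:
# EF (%) = (E_fertilised - E_control) / (N_fertilised - N_control) * 100
def emission_factor(e_fert, e_ctrl, n_fert, n_ctrl):
    """Inputs in kg N ha-1 yr-1; returns EF as % of extra N applied."""
    return (e_fert - e_ctrl) / (n_fert - n_ctrl) * 100.0

# Annual emissions from the abstract (kg N2O-N ha-1 yr-1), compared
# against the 40 kg-N control (0.49 kg N2O-N):
treatments = {
    "adjusted_120N": (0.84, 120),
    "conventional_240N": (2.02, 240),
    "conventional_240N_DMPP": (0.74, 240),
}
for name, (e, n) in treatments.items():
    print(name, round(emission_factor(e, 0.49, n, 40), 2))
```

The DMPP treatment yields the lowest difference-method factor, consistent with the ranking of the factors reported in the abstract.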
Abstract:
Vegetable cropping systems are often characterised by high inputs of nitrogen fertiliser, and elevated emissions of nitrous oxide (N2O) can be expected as a consequence. To mitigate N2O emissions from fertilised agricultural fields, the use of nitrification inhibitors in combination with ammonium-based fertilisers has been promoted. However, no data are currently available on the use of nitrification inhibitors in sub-tropical vegetable systems. A field experiment was conducted to investigate the effect of the nitrification inhibitor 3,4-dimethylpyrazole phosphate (DMPP) on N2O emissions and yield from broccoli production in sub-tropical Australia. Soil N2O fluxes were monitored continuously (3 h sampling frequency) with fully automated, pneumatically operated measuring chambers linked to a sampling control system and a gas chromatograph. Cumulative N2O emissions over the 5-month observation period amounted to 298 g-N/ha, 324 g-N/ha, 411 g-N/ha and 463 g-N/ha in the conventional fertiliser (CONV), the DMPP treatment (DMPP), the DMPP treatment with a 10% reduced fertiliser rate (DMPP-red) and the zero fertiliser (0N) treatments, respectively. The temporal variation of N2O fluxes showed only low emissions over the broccoli cropping phase, but significantly elevated emissions were observed in all treatments after broccoli residues were incorporated into the soil. Overall, 70–90% of the total emissions occurred in this 5-week fallow phase. There was a significant inhibition effect of DMPP on N2O emissions and soil mineral N content over the broccoli cropping phase, where the application of DMPP reduced N2O emissions by 75% compared to the standard practice. However, there was no statistical difference between the treatments during the fallow phase or when the whole season was considered.
This study shows that DMPP has the potential to reduce N2O emissions from intensive vegetable systems, but also highlights the importance of post-harvest emissions from incorporated vegetable residues. N2O mitigation strategies in vegetable systems need to target these post-harvest emissions and a better evaluation of the effect of nitrification inhibitors over the fallow phase is needed.
Abstract:
Alternative sources of N are required to bolster subtropical cereal production without increasing N2O emissions from these agro-ecosystems. The reintroduction of legumes in cereal cropping systems is a possible strategy to reduce synthetic N inputs but elevated N2O losses have sometimes been observed after the incorporation of legume residues. However, the magnitude of these losses is highly dependent on local conditions and very little data are available for subtropical regions. The aim of this study was to assess whether, under subtropical conditions, the N mineralised from legume residues can substantially decrease the synthetic N input required by the subsequent cereal crop and reduce overall N2O emissions during the cereal cropping phase. Using a fully automated measuring system, N2O emissions were monitored in a cereal crop (sorghum) following a legume pasture and compared to the same crop in rotation with a grass pasture. Each crop rotation included a nil and a fertilised treatment to assess the N availability of the residues. The incorporation of legumes provided enough readily available N to effectively support crop development but the low labile C left by these residues is likely to have limited denitrification and therefore N2O emissions. As a result, N2O emissions intensities (kg N2O-N yield−1 ha−1) were considerably lower in the legume histories than in the grass. Overall, these findings indicate that the C supplied by the crop residue can be more important than the soil NO3− content in stimulating denitrification and that introducing a legume pasture in a subtropical cereal cropping system is a sustainable practice from both environmental and agronomic perspectives.
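Yield-scaled emission intensity, the metric used above, is simply cumulative N2O-N emissions divided by crop yield, so a rotation can lower intensity either by emitting less or by yielding more. A minimal sketch with hypothetical numbers (the abstract gives no raw values):

```python
def emission_intensity(n2o_kg_ha, yield_t_ha):
    """Yield-scaled N2O emissions: kg N2O-N per tonne of grain."""
    return n2o_kg_ha / yield_t_ha

# Hypothetical illustration (not the study's data): identical seasonal
# emissions, but a higher sorghum yield after the legume history,
# gives a lower emission intensity for the legume rotation.
print(emission_intensity(1.2, 8.0))   # legume history
print(emission_intensity(1.2, 5.0))   # grass history
```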
Abstract:
Urbanization is becoming increasingly important in terms of climate change and ecosystem functionality worldwide. We are only beginning to understand how the processes of urbanization influence ecosystem dynamics and how peri-urban environments contribute to climate change. Brisbane in South East Queensland (SEQ) currently has the most extensive urban sprawl of all Australian cities. This leads to substantial land use changes in urban and peri-urban environments, and the subsequent gaseous emissions from soils are to date neglected in IPCC climate change estimations. This research examines how land use change affects methane (CH4) and nitrous oxide (N2O) fluxes from peri-urban soils and consequently influences the Global Warming Potential (GWP) of rural ecosystems in agricultural use undergoing urbanization. Manual and fully automated static chamber measurements were therefore used to determine soil gas fluxes over a full year, including an intensive 80-day sampling campaign after land use change. Turf grass, as the major peri-urban land cover, increased the GWP by 415 kg CO2-e ha−1 over the first 80 days after conversion from a well-established pasture. This results principally from an increase in daily average N2O emissions from 0.5 g N2O ha−1 d−1 in the pasture to 18.3 g N2O ha−1 d−1 in the turf grass, due to fertilizer application during conversion. Compared to the native dry sclerophyll eucalypt forest, turf grass establishment increases the GWP by another 30 kg CO2-e ha−1. The results presented in this study clearly indicate the substantial impact of urbanization on soil-atmosphere gas exchange in the form of non-CO2 greenhouse gas emissions, particularly after turf grass establishment.
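The link between the flux increase and the GWP figure can be sketched by converting the extra N2O into CO2-equivalents. The 100-year GWP factor below (298) is an assumption, as the abstract does not state which factor the study used, and the reported 415 kg CO2-e also reflects CH4 fluxes:

```python
# Convert a daily N2O flux difference into cumulative CO2-equivalents.
GWP_N2O = 298  # kg CO2-e per kg N2O; assumed 100-year factor, not from the study

def co2e_kg_per_ha(flux_g_per_ha_day, days):
    """Cumulative CO2-e (kg/ha) from a daily N2O flux in g N2O/ha/d."""
    return flux_g_per_ha_day * days / 1000.0 * GWP_N2O

extra_flux = 18.3 - 0.5          # turf grass minus pasture, g N2O/ha/d
print(round(co2e_kg_per_ha(extra_flux, 80)))   # ~424 kg CO2-e over 80 days
```

The result is of the same order as the 415 kg CO2-e ha−1 reported, consistent with N2O dominating the GWP increase.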
Abstract:
Existing business process drift detection methods do not work with event streams. As such, they are designed to detect inter-trace drifts only, i.e. drifts that occur between complete process executions (traces), as recorded in event logs. However, process drift may also occur during the execution of a process, and may impact ongoing executions. Existing methods either do not detect such intra-trace drifts, or detect them with a long delay. Moreover, they do not perform well with unpredictable processes, i.e. processes whose logs exhibit a high ratio of distinct executions to total executions. We address these two issues by proposing a fully automated and scalable method for online detection of process drift from event streams. We perform statistical tests over distributions of behavioral relations between events, as observed in two adjacent windows of adaptive size, sliding along with the stream. An extensive evaluation on synthetic and real-life logs shows that our method is fast and accurate in the detection of typical change patterns, and performs significantly better than the state of the art.
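The core of such window-based drift detection is a statistical test comparing the distributions of behavioural relations observed in two adjacent windows. A minimal fixed-window sketch (the method above uses adaptive window sizes and its own relation encoding; the relation labels here are illustrative):

```python
from collections import Counter

def chi_square_two_samples(win_a, win_b):
    """Chi-square statistic comparing the frequency distributions of
    behavioural relations seen in two adjacent windows of a stream."""
    labels = set(win_a) | set(win_b)
    ca, cb = Counter(win_a), Counter(win_b)
    na, nb = len(win_a), len(win_b)
    stat = 0.0
    for lab in labels:
        total = ca[lab] + cb[lab]
        for count, n in ((ca[lab], na), (cb[lab], nb)):
            expected = total * n / (na + nb)   # expected count under "no drift"
            if expected > 0:
                stat += (count - expected) ** 2 / expected
    return stat

# Identical windows give a statistic of 0; a changed relation mix
# inflates the statistic, signalling a candidate drift point.
window = ["a>b"] * 50 + ["b>c"] * 50
print(chi_square_two_samples(window, list(window)))          # 0.0
print(chi_square_two_samples(window, ["a>b"] * 10 + ["c>b"] * 90) > 10.0)
```

In a streaming setting the windows slide with the event stream and the statistic is compared against a significance threshold at each step.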
Abstract:
A compact, high brightness 13.56 MHz inductively coupled plasma ion source without any axial or radial multicusp magnetic fields is designed for the production of a focused ion beam. An argon ion current of density more than 30 mA/cm(2) at 4 kV potential is extracted from this ion source and is characterized by measuring the ion energy spread and brightness. The ion energy spread is measured by a variable-focusing retarding field energy analyzer that minimizes the errors due to divergence of the ion beam inside the analyzer. The brightness of the ion beam is determined from the emittance measured by a fully automated and locally developed electrostatic sweep scanner. By optimizing various ion source parameters such as RF power, gas pressure and Faraday shield, ion beams with an energy spread of less than 5 eV and a brightness of 7100 Am(-2)sr(-1)eV(-1) have been produced. Here, we briefly report the details of the ion source and the measurement and optimization of the energy spread and brightness of the ion beam.
Abstract:
Mathematical modelling plays a vital role in the design, planning and operation of flexible manufacturing systems (FMSs). In this paper, attention is focused on stochastic modelling of FMSs using Markov chains, queueing networks, and stochastic Petri nets. We bring out the role of these modelling tools in FMS performance evaluation through several illustrative examples and provide a critical comparative evaluation. We also include a discussion on the modelling of deadlocks which constitute an important source of performance degradation in fully automated FMSs.
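As a textbook illustration of the queueing-network style of FMS model discussed above (not a model from the paper), a single machine with Poisson part arrivals and exponential service times can be treated as an M/M/1 station:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics for an M/M/1 station (rates in parts/hour).
    Requires arrival_rate < service_rate for stability."""
    rho = arrival_rate / service_rate      # machine utilisation
    l = rho / (1 - rho)                    # mean number of parts in system
    w = l / arrival_rate                   # mean time in system (Little's law)
    return rho, l, w

# Parts arrive at 4/h and the machine processes 5/h:
rho, l, w = mm1_metrics(4.0, 5.0)
print(rho, l, w)   # utilisation 0.8, ~4 parts in system, ~1 h in system
```

Networks of such stations, Markov chains over machine states, and Petri nets with stochastic firing times generalise this kind of calculation to whole FMSs.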
Abstract:
MATLAB is an array language, initially popular for rapid prototyping, but is now being increasingly used to develop production code for numerical and scientific applications. Typical MATLAB programs have abundant data parallelism. These programs also have control flow dominated scalar regions that have an impact on the program's execution time. Today's computer systems have tremendous computing power in the form of traditional CPU cores and throughput oriented accelerators such as graphics processing units (GPUs). Thus, an approach that maps the control flow dominated regions to the CPU and the data parallel regions to the GPU can significantly improve program performance. In this paper, we present the design and implementation of MEGHA, a compiler that automatically compiles MATLAB programs to enable synergistic execution on heterogeneous processors. Our solution is fully automated and does not require programmer input for identifying data parallel regions. We propose a set of compiler optimizations tailored for MATLAB. Our compiler identifies data parallel regions of the program and composes them into kernels. The problem of combining statements into kernels is formulated as a constrained graph clustering problem. Heuristics are presented to map identified kernels to either the CPU or GPU so that kernel execution on the CPU and the GPU happens synergistically and the amount of data transfer needed is minimized. In order to ensure required data movement for dependencies across basic blocks, we propose a data flow analysis and edge splitting strategy. Thus our compiler automatically handles composition of kernels, mapping of kernels to CPU and GPU, scheduling and insertion of required data transfer. The proposed compiler was implemented and experimental evaluation using a set of MATLAB benchmarks shows that our approach achieves a geometric mean speedup of 19.8X for data parallel benchmarks over native execution of MATLAB.
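The geometric mean is the standard way to aggregate per-benchmark speedups like the 19.8X figure above, because it is not skewed by a single large ratio. A minimal sketch with hypothetical speedups (the paper's per-benchmark numbers are not given in this abstract):

```python
import math

def geometric_mean(xs):
    """Geometric mean of a list of positive ratios (e.g. speedups)."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Hypothetical speedups for three benchmarks:
print(geometric_mean([4.0, 16.0, 64.0]))   # 16.0, not the arithmetic 28.0
```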
Abstract:
In order to meet the ever-growing demand for the prediction of oceanographic parameters in the Indian Ocean for a variety of applications, the Indian National Centre for Ocean Information Services (INCOIS) has recently set up an operational ocean forecast system, viz. the Indian Ocean Forecast System (INDOFOS). This fully automated system, based on a state-of-the-art ocean general circulation model, issues six-hourly forecasts of the sea-surface temperature, surface currents and depths of the mixed layer and the thermocline up to five days of lead time. A brief account of INDOFOS and a statistical validation of the forecasts of these parameters using in situ and remote sensing data are presented in this article. The accuracy of the sea-surface temperature forecasts by the system is high in the Bay of Bengal and the Arabian Sea, whereas it is moderate in the equatorial Indian Ocean. On the other hand, the accuracies of the thermocline depth, isothermal layer depth and surface current forecasts are higher near the equatorial region and relatively lower in the Bay of Bengal.
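Statistical validation of such forecasts against in situ observations typically rests on error metrics like the root-mean-square error. A minimal sketch with hypothetical sea-surface temperature values (the abstract reports no raw data, and the study's exact metrics are not stated):

```python
import math

def rmse(forecast, observed):
    """Root-mean-square error between forecasts and observations."""
    return math.sqrt(
        sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(observed)
    )

# Hypothetical SST forecasts vs. buoy observations (deg C):
print(rmse([28.1, 28.4, 27.9], [28.0, 28.6, 28.1]))   # ~0.17 deg C
```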
Abstract:
Operator perception influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and to identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1 +/- 1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and mean manual measurements were 4-5 mm closer to the ground truth in the second phase; and (3) automated measurements were more precise, accurate and less sensitive to different presets compared to manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.
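One common way to quantify inter-observer variability, as in the millimetre figures above, is the spread of each observer's mean measurement. A minimal sketch; the definition and the head-circumference readings below are illustrative, not taken from the study:

```python
import statistics

def inter_observer_variability(per_user_measurements):
    """Sample standard deviation (mm) of each observer's mean measurement.
    One common definition; the paper's exact metric is not given in the
    abstract."""
    user_means = [statistics.mean(m) for m in per_user_measurements]
    return statistics.stdev(user_means)

# Hypothetical head-circumference readings (mm) from four novice users:
users = [[195.0, 196.5], [199.0, 201.0], [203.5, 204.0], [197.0, 198.5]]
print(round(inter_observer_variability(users), 2))   # ~3.43 mm
```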
Abstract:
Elettra is one of the first 3rd-generation storage rings, recently upgraded to routinely operate in top-up mode at both 2.0 and 2.4 GeV. The facility hosts four dedicated beamlines for crystallography: two open to users and two under construction, expected to be ready for public use in 2015. In service since 1994, XRD1 is a general-purpose diffraction beamline. The light source for this wide (4-21 keV) energy range beamline is a permanent magnet wiggler. XRD1 covers experiments ranging from grazing incidence X-ray diffraction to macromolecular crystallography, and from industrial applications of powder diffraction to X-ray phasing with long wavelengths. The bending magnet powder diffraction beamline MCX has been open to users since 2009, with a focus on microstructural investigations and studies under non-ambient conditions. A superconducting wiggler delivers a high photon flux to a new fully automated beamline dedicated to macromolecular crystallography and to a branch beamline hosting a high-pressure powder X-ray diffraction station (both currently under construction). Users of the latter experimental station will have access to a specialized sample preparation laboratory, shared with the SISSI infrared beamline. A high-throughput crystallization platform equipped with an imaging system for remote viewing, evaluation and scoring of macromolecular crystallization experiments has also been established and is open to the user community.
Abstract:
Breast cancer is one of the leading causes of cancer-related deaths in women, and early detection is crucial for reducing mortality rates. In this paper, we present a novel and fully automated approach based on tissue transition analysis for lesion detection in breast ultrasound images. Every candidate pixel is classified as belonging to the lesion boundary, lesion interior or normal tissue based on its descriptor value. The tissue transitions are modeled using a Markov chain to estimate the likelihood of a candidate lesion region. Experimental evaluation on a clinical dataset of 135 images shows that the proposed approach can achieve high sensitivity (95%) with a modest three false positives per image. The approach achieves very similar results (94% for 3 false positives) on a completely different clinical dataset of 159 images without retraining, highlighting the robustness of the approach.
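Scoring a sequence of tissue labels with a Markov chain, as in the model described above, amounts to multiplying transition probabilities along the sequence. A minimal sketch; the states follow the abstract, but the probabilities and sequences are illustrative, not the paper's trained model:

```python
import math

# Transition probabilities P(next | current); each row sums to 1.
# Illustrative values, not learned from clinical data.
TRANS = {
    "normal":   {"normal": 0.8, "boundary": 0.2, "interior": 0.0},
    "boundary": {"normal": 0.1, "boundary": 0.3, "interior": 0.6},
    "interior": {"normal": 0.0, "boundary": 0.2, "interior": 0.8},
}

def log_likelihood(sequence):
    """Log-likelihood of a tissue-label sequence under the chain."""
    ll = 0.0
    for cur, nxt in zip(sequence, sequence[1:]):
        p = TRANS[cur][nxt]
        if p == 0.0:
            return float("-inf")   # anatomically impossible transition
        ll += math.log(p)
    return ll

# A plausible lesion profile (normal -> boundary -> interior -> ... )
# scores higher than one that jumps straight from normal tissue to
# lesion interior, which this chain forbids:
plausible = ["normal", "boundary", "interior", "interior", "boundary", "normal"]
implausible = ["normal", "interior", "normal", "interior"]
print(log_likelihood(plausible) > log_likelihood(implausible))   # True
```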
Abstract:
Subtle concurrency errors in multithreaded libraries that arise because of incorrect or inadequate synchronization are often difficult to pinpoint precisely using only static techniques. On the other hand, the effectiveness of dynamic race detectors is critically dependent on multithreaded test suites whose execution can be used to identify and trigger races. Usually, such multithreaded tests need to invoke a specific combination of methods with objects involved in the invocations being shared appropriately to expose a race. Without a priori knowledge of the race, construction of such tests can be challenging. In this paper, we present a lightweight and scalable technique for synthesizing precisely these kinds of tests. Given a multithreaded library and a sequential test suite, we describe a fully automated analysis that examines sequential execution traces, and produces as its output a concurrent client program that drives shared objects via library method calls to states conducive to triggering a race. Experimental results on a variety of well-tested Java libraries yield 101 synthesized multithreaded tests in less than four minutes. Analyzing the execution of these tests using an off-the-shelf race detector reveals 187 harmful races, including several previously unreported ones.
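The paper targets Java libraries; as a language-neutral illustration, the shape of the concurrent client such a synthesis produces is two threads driving a shared object through unsynchronized library calls. A Python analogue with an illustrative stand-in class, not the paper's tooling:

```python
import threading

class Counter:
    """Stand-in for an unsynchronized library class under test."""
    def __init__(self):
        self.value = 0
    def increment(self):
        v = self.value       # read...
        self.value = v + 1   # ...then write: a lost update if interleaved

def synthesized_test():
    """Shape of a synthesized concurrent client: two threads race on the
    same shared object via the library's public methods. A dynamic race
    detector observing this execution can flag the unsynchronized field."""
    shared = Counter()
    def client():
        for _ in range(100_000):
            shared.increment()
    t1 = threading.Thread(target=client)
    t2 = threading.Thread(target=client)
    t1.start(); t2.start(); t1.join(); t2.join()
    return shared.value      # may be < 200000 when the race manifests

print(synthesized_test() <= 200_000)   # True; equality only without lost updates
```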