141 results for Sugarcane - Tillage operations


Relevância: 20.00%

Resumo:

Rodenticide use in agriculture can lead to secondary poisoning of avian predators. The Australian sugarcane industry currently has two rodenticides, Racumin® and Rattoff®, available for in-crop use but, like many agricultural industries, it lacks an ecologically based method of determining the secondary poisoning risk these rodenticides pose to avian predators. The material presented in this thesis addresses this by: (a) determining where predator/prey interactions take place in sugar-producing districts; (b) quantifying the amount of rodenticide available to avian predators and the probability of encounter; and (c) developing a stochastic model that allows secondary poisoning risk to be investigated under various rodenticide application scenarios. Results demonstrate that predator/prey interactions are highly constrained by environmental structure. Rodents preferred crops that provided high levels of canopy cover, and therefore protection from predators, and made little use of open-canopy areas. In contrast, raptors over-utilised areas with low canopy cover and low rodent densities, but which provided high accessibility to prey. Given this pattern of habitat use, and given that industry baiting protocols preclude rodenticide application in open-canopy crops, secondary poisoning can only occur if poisoned rodents leave closed-canopy crops and become available for predation in open-canopy areas. Results further demonstrate that after in-crop rodenticide application, only a small proportion of the rodents available in open areas are poisoned, and that these rodents carry low levels of toxicant. Coupled with the low level of rodenticide use in the sugar industry, raptors' high toxic thresholds to these toxicants and the low probability of encountering poisoned rodents, these results indicate that the risk of secondary poisoning events is minimal.
A stochastic model was developed to investigate the effect of manipulating factors that might influence secondary poisoning hazard in a sugarcane agro-ecosystem. These simulations further suggest that in all but extreme scenarios, the risk of secondary poisoning is also minimal. Collectively, these studies demonstrate that secondary poisoning of avian predators associated with the use of the currently available rodenticides in Australian sugar producing districts is minimal. Further, the ecologically-based method of assessing secondary poisoning risk developed in this thesis has broader applications in other agricultural systems where rodenticide use may pose risks to avian predators.
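The stochastic risk model described above can be sketched as a simple Monte Carlo simulation. All parameter values below (daily encounter probability, residue load per rodent, lethal threshold) are illustrative placeholders, not figures from the thesis:

```python
import random

def secondary_poisoning_risk(n_raptors=5000, days=30, p_poisoned_prey=0.02,
                             residue_mg=0.5, lethal_dose_mg=10.0, seed=42):
    """Monte Carlo estimate of the fraction of raptors that accumulate a
    lethal rodenticide dose over a baiting period. Parameters are
    hypothetical; a real model would also include toxicant decay and
    raptor foraging behaviour."""
    rng = random.Random(seed)
    deaths = 0
    for _ in range(n_raptors):
        dose = 0.0
        for _ in range(days):
            # One prey item per day; rarely it is a poisoned rodent
            # carrying a low residue load.
            if rng.random() < p_poisoned_prey:
                dose += residue_mg
        if dose >= lethal_dose_mg:
            deaths += 1
    return deaths / n_raptors
```

Under these placeholder values the estimated risk is effectively zero; only extreme parameter combinations (far higher encounter rates or residue loads) drive it upward, mirroring the thesis conclusion.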


Six Sigma provides a framework for quality improvement and business excellence. Introduced in manufacturing in the 1980s, the concept has gained popularity in service organizations. After initial success in healthcare and banking, Six Sigma has gradually gained traction in other service industries, including hotels and lodging. Starwood Hotels and Resorts was the first hospitality giant to embrace Six Sigma: in 2001, Starwood adopted the method to develop innovative, customer-focused solutions and to transfer those solutions throughout the global organization. To analyze Starwood's use of Six Sigma, the authors collected data from articles, interviews, presentations and speeches published in magazines, newspapers and websites, corroborating information across these sources and drawing inferences from them. Financial metrics can gauge the success of Six Sigma in any organization, and the research uncovered no shortage of examples of Starwood successes resulting from Six Sigma project metrics.


This paper reviews aspects of calcium phosphate chemistry, since the phosphate content of juice is an important parameter in all sugar juice clarification systems. It uses basic concepts to explain the observed differences in clarification performance obtained with various liming techniques. The paper also examines the current colorimetric method for determining phosphate in sugar juice, in which a phosphomolybdate blue complex, formed on addition of a dye, is measured at 660 nm. Unfortunately, at this wavelength colour arising from within the juice interferes with the measurement, resulting in underestimation of the soluble inorganic phosphate content of the juice. It is suggested that phosphate analysis be conducted at the higher wavelength of 875 nm, where the interference of juice colour is minimised.
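The quantification step behind such a colorimetric method can be illustrated with a least-squares Beer-Lambert calibration line fitted to standards read at 875 nm. The concentrations and absorbance readings below are hypothetical, not data from the paper:

```python
def linear_calibration(concentrations, absorbances):
    """Least-squares slope and intercept for an absorbance-vs-concentration
    calibration line (Beer-Lambert behaviour assumed)."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(absorbances) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(concentrations, absorbances))
    sxx = sum((x - mx) ** 2 for x in concentrations)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical phosphate standards (mg/L) and their absorbance at 875 nm.
standards = [0.0, 5.0, 10.0, 20.0]
readings = [0.00, 0.12, 0.24, 0.48]
m, b = linear_calibration(standards, readings)

# A juice sample reading 0.30 at 875 nm maps back to a concentration.
sample_conc = (0.30 - b) / m
```

Interference at 660 nm would shift the sample reading relative to this line, which is exactly why the longer wavelength is preferred.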


Long-term loss of soil C stocks under conventional tillage (CT) and accrual of soil C following adoption of no-tillage (NT) have been well documented. No-tillage use is spreading, but it is common to till occasionally within a no-till regime or to alternate regularly between till and no-till practices within a rotation of different crops. Short-term studies indicate that substantial amounts of C can be lost from the soil immediately following a tillage event, but few field studies have investigated the impact of infrequent tillage on soil C stocks. How much of the C sequestered under no-tillage is likely to be lost if the soil is tilled? What are the longer-term impacts of continued infrequent tillage? If producers are to be compensated for sequestering C in soil following adoption of conservation tillage practices, the impacts of infrequent tillage need to be quantified. A few studies have examined the short-term impacts of tillage on soil C, and several have investigated the impacts of adopting continuous no-tillage. We present: (1) results from a modeling study carried out to address these questions more broadly than the published literature allows; (2) a review of the literature examining the short-term impacts of tillage on soil C; (3) a review of published studies on the physical impacts of tillage; and (4) a synthesis of these components assessing how infrequent tillage impacts soil C stocks and how changes in tillage frequency could impact soil C stocks and C sequestration. Results indicate that soil C declines significantly following even one tillage event (1-11% of soil C lost). Longer-term losses increase as tillage frequency increases. Model analyses indicate that cultivating and ripping are less disruptive than moldboard plowing: soil C for those treatments averages just 6% less than under continuous NT, compared to 27% less for CT. Most (80%) of the soil C gains of NT can be realized by coupling NT with biannual cultivating or ripping.
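A toy bookkeeping model illustrates the kind of trajectory such analyses compare: steady C accrual under no-till, punctuated by a fractional loss in each year a tillage event occurs. The accrual rate and loss fraction here are illustrative placeholders, not the calibrated values behind the percentages above:

```python
def soil_c_trajectory(years, c0=40.0, nt_gain=0.4, till_years=(),
                      till_loss_frac=0.05):
    """Annual soil C stock (Mg C/ha): linear accrual under no-till, with a
    fixed fractional loss in any year containing a tillage event.
    All parameter values are hypothetical."""
    c = c0
    series = [c]
    for year in range(1, years + 1):
        c += nt_gain                 # no-till accrual
        if year in till_years:
            c -= c * till_loss_frac  # pulse loss from the tillage event
        series.append(c)
    return series

continuous_nt = soil_c_trajectory(20)
biennial_till = soil_c_trajectory(20, till_years=set(range(2, 21, 2)))
```

Comparing the end points of the two runs shows how tillage frequency erodes the no-till gain, the comparison the modeling study makes with far more realistic process detail.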


No-tillage (NT) management has been promoted as a practice capable of offsetting greenhouse gas (GHG) emissions because of its ability to sequester carbon in soils. However, true mitigation is only possible if the overall impact of NT adoption reduces the net global warming potential (GWP) determined by the fluxes of the three major biogenic GHGs (CO2, N2O, and CH4). We compiled all available data comparing soil-derived GHG emissions between conventionally tilled (CT) and NT systems for humid and dry temperate climates. Newly converted NT systems increase GWP relative to CT practices in both humid and dry climate regimes, and longer-term adoption (>10 years) significantly reduces GWP only in humid climates. Mean cumulative GWP over a 20-year period is also reduced under continuous NT in dry areas, but with a high degree of uncertainty. Emissions of N2O drive much of the trend in net GWP, suggesting that improved nitrogen management is essential to realize the full global-warming-mitigation benefit of carbon storage in the soil. Our results indicate a strong time dependency in the GHG mitigation potential of NT agriculture, demonstrating that GHG mitigation by adoption of NT is much more variable and complex than previously considered, and policy plans to reduce global warming through this land management practice need further scrutiny to ensure success.
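The net-GWP accounting described here follows directly from published 100-year GWP factors (IPCC AR4: CO2 = 1, CH4 = 25, N2O = 298, by mass). The flux values in the example are hypothetical:

```python
# IPCC AR4 100-year global warming potentials, CO2-equivalents by mass.
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def net_gwp(annual_fluxes_kg_per_ha):
    """Net global warming potential (kg CO2-eq/ha/yr) from soil GHG fluxes.
    Positive flux = emission to the atmosphere; negative = net uptake."""
    return sum(GWP100[gas] * flux
               for gas, flux in annual_fluxes_kg_per_ha.items())

# Hypothetical NT plot: soil C gain (negative CO2 flux) partly offset by
# higher N2O emissions -- the pattern the compilation highlights.
example = net_gwp({"CO2": -1100.0, "CH4": -1.0, "N2O": 2.5})
```

The large N2O factor is why even modest increases in N2O flux can cancel much of the CO2 drawn down as soil carbon.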


Computer simulation has been widely accepted as an essential tool for the analysis of many engineering systems, and is nowadays perceived to be the most readily available and feasible means of evaluating operations in real railway systems. Based on practical experience and theoretical models developed in various applications, this paper describes the design of a general-purpose simulation system for train operations. Its prime objective is to provide a single comprehensive computer-aided engineering tool for most studies on railway operations, so that various aspects of railway systems with different operating characteristics can be investigated and analysed in depth. The system consists of three levels of simulation. The first is a single-train simulator that calculates the running time of a train between specific points under different track geometry and traction conditions. The second is a dual-train simulator that finds the minimum headway between two trains under different movement constraints, such as signalling systems. The third is a whole-system multi-train simulator that simulates the real operation of a railway system according to a practical or planned train schedule or headway, and produces an overall evaluation of system performance.
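The first level — running time between two points — can be sketched as a simple accelerate/cruise/brake time-stepping loop. A real single-train simulator would also model traction-effort curves, gradients and curvature; this is a minimal illustration with assumed constant rates:

```python
def running_time(distance_m, v_max, accel, decel, dt=0.1):
    """Seconds for a stop-to-stop run: accelerate at `accel` (m/s^2) up to
    the line speed `v_max` (m/s), cruise, then brake at `decel` to rest."""
    t = v = x = 0.0
    while x < distance_m:
        remaining = distance_m - x
        if remaining <= v * v / (2.0 * decel):
            v = max(v - decel * dt, 0.0)    # braking phase
        elif v < v_max:
            v = min(v + accel * dt, v_max)  # acceleration phase
        x += v * dt
        t += dt
    return t
```

For example, a 2 km section with a 25 m/s line speed and 1 m/s^2 acceleration and braking rates gives a running time of roughly 105 s.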


Several key issues need to be resolved before an efficient and reproducible Agrobacterium-mediated sugarcane transformation method can be developed for a wider range of sugarcane cultivars. These include loss of morphogenetic potential in sugarcane cells after Agrobacterium-mediated transformation, effect of exposure to abiotic stresses during in vitro selection, and most importantly the hypersensitive cell death response of sugarcane (and other nonhost plants) to Agrobacterium tumefaciens. Eight sugarcane cultivars (Q117, Q151, Q177, Q200, Q208, KQ228, QS94-2329, and QS94-2174) were evaluated for loss of morphogenetic potential in response to the age of the culture, exposure to Agrobacterium strains, and exposure to abiotic stresses during selection. Corresponding changes in the polyamine profiles of these cultures were also assessed. Strategies were then designed to minimize the negative effects of these factors on the cell survival and callus proliferation following Agrobacterium-mediated transformation. Some of these strategies, including the use of cell death protector genes and regulation of intracellular polyamine levels, will be discussed.


Operations management is an area concerned with the production of goods and services, ensuring that business operations are efficient in utilizing resources and effective in meeting customer requirements. It deals with the design and management of products, processes, services and supply chains, and considers the acquisition, development, and effective and efficient utilization of resources. Unlike other engineering subjects, the content of these units can be very wide. It is therefore necessary to cover the content most relevant to contemporary industry, and to understand which engineering management skills are critical for engineers working in contemporary organisations. Most operations management books cover traditional Operations Management techniques. For example, 'inventory management' is an important topic: all OM books deal with effective methods of inventory management, yet the new trend in OM is just-in-time (JIT) delivery, or minimization of inventory. It is therefore important to decide whether to emphasise keeping inventory (as most books suggest) or minimizing it. Similarly, for OM decisions such as forecasting, optimization and linear programming, most organisations nowadays use software, so it is important to determine whether some of this software should be introduced in tutorial/lab classes, and if so, which software. It is established in the teaching and learning literature that there must be strong alignment between unit objectives, assessment and learning activities to engage students in learning, and that engaging students is vital for learning. However, engineering units (more specifically, operations management) are quite different from other majors: alignment between objectives, assessment and learning activities alone cannot guarantee student engagement.
Unit content must be practice-oriented, and the skills developed should be those demanded by industry. The present active-learning research, using a multi-method research approach, redesigned the operations management content based on the latest developments in the engineering management area and the needs of Australian industries. The redesigned unit has significantly improved student engagement and learning. It was found that students engage in learning when they find that the content helps develop skills necessary in their professional lives.
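As a concrete instance of the traditional inventory content discussed above, the economic order quantity (EOQ) model balances ordering and holding costs, Q* = sqrt(2DS/H); JIT, by contrast, pushes order sizes toward zero inventory. The demand and cost figures below are hypothetical:

```python
from math import sqrt

def eoq(annual_demand, order_cost, holding_cost_per_unit_year):
    """Economic order quantity: the order size minimising the sum of
    annual ordering and holding costs, Q* = sqrt(2DS/H)."""
    return sqrt(2.0 * annual_demand * order_cost / holding_cost_per_unit_year)

# Hypothetical figures: 12,000 units/yr demand, $50 per order,
# $2.40 per unit per year to hold.
q_star = eoq(12000, 50.0, 2.40)
```

Note that quadrupling the holding cost halves the optimal order size, the lever JIT thinking pushes to its limit.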


This paper presents a method for calculating the in-bucket payload volume on a dragline, for the purpose of estimating the material's bulk density in real time. Knowledge of the bulk density can provide instant feedback to mine planning and scheduling to improve blasting, and in turn provide a more uniform bulk density across the excavation site. Furthermore, costs and emissions in dragline operation, maintenance and downstream material processing can be reduced. The main challenge is to determine an accurate position and orientation of the bucket under the constraint of real-time performance. The proposed solution uses a range, bearing and tilt sensor to locate and scan the bucket between the lift and dump stages of the dragline cycle. Various scanning strategies are investigated for their benefits in this real-time application. The bucket is segmented from the scene using cluster analysis, while the pose of the bucket is calculated using the iterative closest point (ICP) algorithm. Payload points are segmented from the bucket by a fixed-distance neighbour clustering method, which preserves boundary points and excludes low-density clusters introduced by overhead chains and the spreader bar. A height grid is then used to represent the payload, from which the volume can be calculated by summing over the grid cells. We show volumes calculated on a scaled system with an accuracy greater than 95 per cent.
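The final grid-summation step can be sketched as follows, assuming the payload points have already been segmented and expressed with heights relative to the bucket floor (a simplification of the method described):

```python
def payload_volume(points, cell=0.25):
    """Height-grid volume estimate: bin (x, y, z) points into square cells,
    keep the maximum height per cell, and sum height * cell area."""
    heights = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        heights[key] = max(heights.get(key, 0.0), z)
    return sum(h * cell * cell for h in heights.values())

# Synthetic payload of uniform 2 m height over a 1 m x 1 m footprint:
# 16 cells of 0.25 m x 0.25 m, so the estimate is 16 * 0.0625 * 2 = 2 m^3.
pts = [(i * 0.25, j * 0.25, 2.0) for i in range(4) for j in range(4)]
volume = payload_volume(pts)
```

Taking the per-cell maximum makes the estimate robust to uneven point density from the scanning pattern, at the cost of a slight upward bias on sloped surfaces.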


In the face of increasing concern over global warming and climate change, interest in the utilization of solar energy for building operations is growing rapidly. In this entry, the importance of using renewable energy in building operations is first introduced. This is followed by a general overview of the energy from the sun and the methods of utilizing solar energy. Possible applications of solar energy in building operations are then discussed, including the use of solar energy in the forms of daylighting, hot water heating, space heating and cooling, and building-integrated photovoltaics.


Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance, with capacity predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration; to be calibrated using data acquired at those locations; and to be validated with data acquired at the same sites, so that their outputs are truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used, and the models needed to be adaptable to variable operating conditions so that they might be applied, where possible, to other similar systems and facilities. It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for their calibration and validation across a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented, and the driver behaviour applied to the models was observed in Brisbane; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all observed manoeuvres were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream (the kerb lane traffic) exercises only a limited priority over the minor stream (the on-ramp traffic), and theory was established to account for this behaviour. Kerb lane drivers were also found to change to the median lane where possible to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate which excludes lane changers. Cowan's M3 model was calibrated for both streams, with on-ramp and total upstream flow required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections, rather than signalised intersections, are located upstream of on-ramps, and smaller still when ramp metering is installed; smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then complement existing macroscopic models to assess performance and provide further insight into the nature of operations.
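Cowan's M3 headway model referred to above has a standard closed form: a proportion alpha of vehicles travel freely with shifted-exponential headways, and the remainder are bunched at the minimum headway delta. A minimal sketch, with hypothetical flow and parameter values in the example:

```python
from math import exp

def m3_headway_survival(t, flow, alpha, delta=1.0):
    """Cowan M3 model: probability that a major-stream headway exceeds t
    seconds. flow: veh/s (delta * flow must be < 1); alpha: proportion of
    free (unbunched) vehicles; delta: minimum headway (s). For t < delta
    every headway exceeds t."""
    if t < delta:
        return 1.0
    lam = alpha * flow / (1.0 - delta * flow)  # decay rate of free headways
    return alpha * exp(-lam * (t - delta))

# Hypothetical kerb-lane flow of 1800 veh/h (0.5 veh/s), 70% free vehicles:
# the chance a merging driver finds a gap longer than a 4 s critical gap.
p_gap = m3_headway_survival(4.0, 0.5, 0.7)
```

Calibrating alpha against flow is exactly the "proportion of headways greater than 1 s" relationship discussed above, which the study found differs between signalised and unsignalised upstream intersections.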