56 results for Minimum tillage


Relevance:

20.00%

Publisher:

Abstract:

In the real world there are many problems in networks of networks (NoNs) that can be abstracted as a so-called minimum interconnection cut problem, which is fundamentally different from the classical minimum cut problems of graph theory. It is therefore desirable to have an efficient and effective algorithm for the minimum interconnection cut problem. In this paper we formulate the problem in graph-theoretic terms, transform it into a multi-objective, multi-constraint combinatorial optimization problem, and propose a hybrid genetic algorithm (HGA) to solve it. The HGA is a penalty-based genetic algorithm (GA) that incorporates an effective heuristic procedure to locally optimize the individuals in the GA population. The HGA has been implemented and evaluated experimentally, and the results show that it is both effective and efficient.
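
As a rough illustration of the penalty-based GA idea only (not the paper's actual formulation), the sketch below evolves 0/1 chromosomes that select interconnection edges to cut, penalizes solutions that violate a toy budget constraint, and applies a simple greedy repair step in place of the paper's heuristic local optimization. All edge data, costs and parameters are hypothetical.

```python
import random

# Toy instance (hypothetical): interconnection edges between two networks and
# the cost of cutting each one. A chromosome is a 0/1 vector selecting which
# interconnections to cut.
INTER_EDGES = [(0, 5), (1, 5), (1, 6), (2, 7), (3, 7), (4, 8)]
EDGE_COST = [3, 2, 4, 1, 2, 5]     # cost of cutting each interconnection
MAX_BUDGET = 8                     # illustrative constraint on total cut cost
PENALTY = 100.0                    # penalty weight for constraint violation

def fitness(chrom):
    """Penalty-based fitness: toy objective (number of interconnections cut)
    minus a penalty proportional to any budget violation."""
    cut_cost = sum(c for bit, c in zip(chrom, EDGE_COST) if bit)
    violation = max(0, cut_cost - MAX_BUDGET)
    return sum(chrom) - PENALTY * violation

def local_improve(chrom):
    """Simple greedy repair standing in for the paper's heuristic local
    optimisation: un-cut the cheapest selected edge until the budget holds."""
    while sum(c for bit, c in zip(chrom, EDGE_COST) if bit) > MAX_BUDGET:
        cheapest = min((i for i, bit in enumerate(chrom) if bit),
                       key=lambda i: EDGE_COST[i])
        chrom[cheapest] = 0
    return chrom

def evolve(pop_size=20, generations=50):
    pop = [[random.randint(0, 1) for _ in INTER_EDGES] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [local_improve(c) for c in pop]
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            point = random.randrange(1, len(a))          # one-point crossover
            child = a[:point] + b[point:]
            if random.random() < 0.1:                    # bit-flip mutation
                j = random.randrange(len(child))
                child[j] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```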

Relevance:

20.00%

Publisher:

Abstract:

Diagnostics is based on the characterization of mechanical system condition and allows early detection of possible faults. Signal processing is a widely used approach in diagnostics, since it allows the state of the system to be characterized directly. Several types of advanced signal processing techniques have been proposed in recent decades and added to more conventional ones, yet seldom are these techniques able to handle non-stationary operation. Diagnostics of roller bearings is no exception. In this paper, a new vibration signal processing tool, able to perform roller bearing diagnostics under any working condition and noise level, is developed on the basis of two data-adaptive techniques, Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), coupled by means of the mathematics related to the Hilbert transform. The effectiveness of the new signal processing tool is proven using experimental data measured on a test rig that employs high-power, industrial-size components.
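
As a small illustration of the Hilbert-transform step that such tools typically rely on (a sketch only, not the paper's full EMD/MED pipeline), the following snippet computes the amplitude envelope of a synthetic bearing-like signal and inspects its spectrum; the signal, sampling rate and fault frequency are assumed values.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic, purely illustrative signal: a structural resonance modulated at a
# bearing fault frequency, plus noise.
fs = 20_000                                     # sampling frequency [Hz]
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 3_000 * t)         # resonance excited by impacts
fault = 1 + 0.5 * np.sin(2 * np.pi * 97 * t)    # modulation at a fault frequency
x = fault * carrier + 0.1 * np.random.randn(t.size)

analytic = hilbert(x)                           # analytic signal via Hilbert transform
envelope = np.abs(analytic)                     # amplitude envelope
env_spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope frequency: %.1f Hz" % freqs[np.argmax(env_spectrum)])
```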

Relevance:

20.00%

Publisher:

Abstract:

Impaction bone grafting for reconstitution of bone stock in revision hip surgery has been used for nearly 30 years. We used this technique, in combination with a cemented acetabular component, in the acetabula of 304 hips in 292 patients revised for aseptic loosening between 1995 and 2001. The only additional supports used were stainless steel meshes placed against the medial wall or laterally around the acetabular rim to contain the graft. All Paprosky grades of defect were included. Clinical and radiographic outcomes were collected in surviving patients at a minimum of 10 years following the index operation. Mean follow-up was 12.4 years (SD 1.5; range 10.0-16.0). Kaplan-Meier survivorship with revision for aseptic loosening as the endpoint was 85.9% (95% CI 81.0 to 90.8%) at 13.5 years. Clinical scores for pain relief remained satisfactory, and there was no difference in clinical scores between cups that appeared stable and those that appeared loose radiographically.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: This study explores recent claims that humans exhibit a minimum cost of transport (CoTmin) for running that occurs at an intermediate speed, and assesses individual physiological, gait and training characteristics. Methods: Twelve healthy participants with varying levels of fitness and running experience ran on a treadmill at six self-selected speeds in a discontinuous protocol over three sessions. Running speed (km·h⁻¹), V̇O2 (mL·kg⁻¹·km⁻¹), CoT (kcal·km⁻¹), heart rate (beats·min⁻¹) and cadence (steps·min⁻¹) were measured continuously. V̇O2max was measured in a fourth testing session. The occurrence of a CoTmin was investigated and its presence or absence examined with respect to fitness, gait and training characteristics. Results: Five participants showed a clear CoTmin at an intermediate speed and a statistically significant (p < 0.05) quadratic CoT-speed function, while the other participants did not show such evidence. Participants were then categorized and compared with respect to the strength of evidence for a CoTmin (ClearCoTmin and NoCoTmin). The ClearCoTmin group displayed a significantly higher correlation between speed and cadence and more endurance training and exercise sessions per week than the NoCoTmin group, as well as a marginally non-significant but higher aerobic capacity. Some runners still showed a CoTmin at an intermediate speed even after subtraction of resting energy expenditure. Conclusion: The findings confirm the existence of an optimal speed for human running in some but not all participants. Those exhibiting a CoTmin undertook a higher volume of running, ran with a cadence that was more consistently modulated with speed, and tended to be aerobically fitter. The ability to minimise the energetic cost of transport appears not to be a ubiquitous feature of human running but may emerge in some individuals with extensive running experience.
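
To make the CoTmin test concrete, the sketch below fits a quadratic CoT-speed function for a single hypothetical participant and checks whether its minimum falls at an intermediate speed, mirroring the analysis described above; all numbers are invented for illustration.

```python
import numpy as np

# Illustrative only (not the study's data): fit a quadratic to cost of
# transport versus running speed and locate the speed of minimum cost.
speed = np.array([8.0, 9.5, 11.0, 12.5, 14.0, 15.5])    # km/h, self-selected speeds
cot = np.array([72.0, 70.1, 69.0, 68.8, 69.6, 71.2])    # kcal/km, invented values

a, b, c = np.polyfit(speed, cot, 2)                     # CoT = a*v**2 + b*v + c
v_opt = -b / (2 * a)                                    # vertex of the parabola
print(f"estimated optimal speed: {v_opt:.1f} km/h")

# A clear CoTmin requires a convex fit (a > 0) with its minimum falling
# strictly inside the range of tested speeds.
print("clear CoTmin:", a > 0 and speed.min() < v_opt < speed.max())
```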

Relevance:

20.00%

Publisher:

Abstract:

Government contracts for services typically include terms requiring contractors to comply with minimum labour standards laws. Procurement contract clauses specify reporting procedures and sanctions for non-compliance, implying that government contracting agencies will monitor and enforce minimum labour standards within contract performance management. In this article, the case of school cleaners employed under New South Wales government contracts between 2010 and 2011 is the vehicle for exploring the effectiveness of these protective clauses. We find that the inclusion of these protective clauses in procurement contracts is unnecessary in the Australian context, and any expectations that government contracting agencies will monitor and enforce labour standards are misleading. At best, the clauses are rhetoric, and at worst, they are a distraction for parties with enforcement powers.

Relevance:

20.00%

Publisher:

Abstract:

This research investigated the use of DNA fingerprinting to characterise the bacterium Streptococcus pneumoniae, or pneumococcus, and hence gain insight into the development of new vaccines or antibiotics. Different bacterial DNA fingerprinting methods were studied, and a novel method was developed and validated that characterises the different cell coatings that pneumococci produce. This method was used to study the epidemiology of pneumococci in Queensland before and after the introduction of the current pneumococcal vaccine. The study demonstrated that pneumococcal disease is highly prevalent in children under four years of age, that the bacterium can 'switch' its cell coating to evade the vaccine, and that some DNA fingerprinting methods are more discriminatory than others. This has an impact on understanding which strains are more prone to cause invasive disease. The research findings have been published in high-impact, internationally refereed journals.

Relevance:

20.00%

Publisher:

Abstract:

The effects of tillage practices and methods of chemical application on atrazine and alachlor losses through run-off were evaluated for five treatments: conservation (untilled) and surface (US), disk and surface, plow and surface, disk and preplant-incorporated, and plow and preplant-incorporated. A rainfall simulator was used to create 63.5 mm h⁻¹ of rainfall for 60 min and 127 mm h⁻¹ for 15 min. Rainfall simulation occurred 24-36 h after chemical application. There was no significant difference in run-off volume among the treatments, but the untilled treatment significantly reduced erosion loss. The untilled treatments had the highest herbicide concentrations, and the disk treatments were higher than the plow treatments. The surface treatments showed higher concentrations than the incorporated treatments, and the concentration of herbicides in the water decreased with time. Among the experimental sites, the one with sandy loam soil produced the greatest losses, in terms of both run-off volume and herbicide loss. The US treatments had the highest losses and the herbicide incorporation treatments had smaller losses through run-off, as the residue cover was effective in preventing herbicide losses. Incorporation may therefore be a favorable method of herbicide application for reducing herbicide losses through run-off.

Relevance:

20.00%

Publisher:

Abstract:

The two-year trial of the Queensland minimum passing distance (MPD) road rule began on 7 April 2014. The rule requires motor vehicles to give cyclists a minimum lateral passing distance of one metre when overtaking in a speed zone of 60 km/h or less, and 1.5 metres when the speed limit is greater than 60 km/h. This document summarises the evaluation of the effectiveness of the new rule in terms of its: 1. practical implementation; 2. impact on road users' attitudes and perceptions; and 3. road safety benefits. The Centre for Accident Research and Road Safety – Queensland (CARRS-Q) developed the evaluation framework (Haworth, Schramm, Kiata-Holland, Vallmuur, Watson & Debnath, 2014) for the Queensland Department of Transport and Main Roads (TMR) and was later commissioned to undertake the evaluation. The evaluation included the following components:
• Review of correspondence received by TMR;
• Interviews and focus groups with Queensland Police Service (QPS) officers;
• Road user survey;
• Observational study; and
• Crash, injury and infringement data analysis.

Relevance:

10.00%

Publisher:

Abstract:

In this study we examined the impact of weather variability and tides on the transmission of Barmah Forest virus (BFV) disease and developed a weather-based forecasting model for BFV disease in the Gladstone region, Australia. We used seasonal autoregressive integrated moving-average (SARIMA) models to determine the contribution of weather variables to BFV transmission after the time-series data for the response and explanatory variables were made stationary through seasonal differencing. We obtained data on monthly counts of BFV cases, weather variables (e.g., mean minimum and maximum temperature, total rainfall, and mean relative humidity), high and low tides, and population size in the Gladstone region between January 1992 and December 2001 from the Queensland Department of Health, the Australian Bureau of Meteorology, the Queensland Department of Transport, and the Australian Bureau of Statistics, respectively. The SARIMA model shows that the 5-month moving average of minimum temperature (β = 0.15, p < 0.001) was statistically significantly and positively associated with BFV disease, whereas high tide in the current month (β = −1.03, p = 0.04) was statistically significantly and inversely associated with it. No significant association was found for the other variables. These results may be applied to forecast the occurrence of BFV disease and to allocate public health resources for BFV control and prevention.
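
A minimal sketch of this modelling approach using statsmodels is shown below; the file name, column names and model orders are assumptions for illustration, not the study's actual specification.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Assumed monthly series of BFV case counts plus exogenous weather/tide
# variables sharing the same date index; seasonal differencing at lag 12
# (D = 1, s = 12) mirrors the stationarity step mentioned in the abstract.
df = pd.read_csv("bfv_gladstone_monthly.csv",           # hypothetical data file
                 parse_dates=["month"], index_col="month")

y = df["bfv_cases"]                                     # monthly BFV case counts
exog = df[["min_temp_ma5", "high_tide"]]                # e.g. 5-month MA of min temp, high tide

model = SARIMAX(y, exog=exog,
                order=(1, 0, 1),
                seasonal_order=(0, 1, 1, 12))
result = model.fit(disp=False)
print(result.summary())                                 # coefficients and p-values

# Forecasting ahead requires (projected) future values of the exogenous
# variables; the last three observed rows are used here only as placeholders.
forecast = result.get_forecast(steps=3, exog=exog.iloc[-3:])
print(forecast.predicted_mean)
```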

Relevance:

10.00%

Publisher:

Abstract:

Spaces without northerly orientations have an impact on the ‘energy behaviour’ of a building. This paper outlines the possible energy savings and improved performance achieved by different zenithal solar passive strategies (skylights, roof monitors and clerestory roof windows) and element arrangements across the roof in the cold to temperate climate zones typical of central and central-southern Argentina. Analyses were undertaken considering the daylighting, thermal and ventilation performance of the different strategies. The results indicate that heating, ventilation and lighting loads in spaces without an equator-facing facade can be significantly reduced by implementing solar passive strategies. In thermal terms, the solar saving fractions achieved by the different strategies averaged 43.16% for clerestories, 41.4% for roof monitors and 38.86% for skylights, for a glass area of 9% of the floor area. The results also indicate average illuminance levels above 500 lux for the different clerestory and monitor arrangements, uniformity ratios of 0.66-0.82 for the most distributed arrangements, and daylighting factors between 11.78% and 20.30% for clear sky conditions, depending on the strategy. In addition, minimum air change rates of 4 were reached under the most extreme conditions.

Relevance:

10.00%

Publisher:

Abstract:

A novel technique was used to measure emission factors for commonly used commercial aircraft, including a range of Boeing and Airbus airframes, under real-world conditions. Engine exhaust emission factors for particles, in terms of particle number and mass (PM2.5), along with those for CO2 and NOx, were measured for over 280 individual aircraft during the various modes of the landing/take-off (LTO) cycle. Results from this study show that particle number and NOx emission factors are dependent on aircraft engine thrust level. Minimum and maximum emission factors for particle number, PM2.5 and NOx emissions were found to be in the ranges 4.16×10¹⁵-5.42×10¹⁶ kg⁻¹, 0.03-0.72 g·kg⁻¹ and 3.25-37.94 g·kg⁻¹, respectively, for all measured airframes and LTO cycle modes. Number size distributions of emitted particles for the naturally diluted aircraft plumes in each mode of the LTO cycle showed that particles were predominantly 4 to 100 nm in diameter in all cases. In general, the size distributions exhibited similar modality during all phases of the LTO cycle. A very distinct nucleation mode was observed in all particle size distributions, except for taxiing and landing of A320 aircraft, and accumulation modes were also observed in all particle size distributions. Analysis of aircraft engine emissions during the LTO cycle showed that aircraft thrust level is considerably higher during taxiing than idling, suggesting that International Civil Aviation Organization (ICAO) standards need to be modified, as the thrust levels for taxi and idle are currently considered to be the same (7% of total thrust) [1].

Relevance:

10.00%

Publisher:

Abstract:

Changes in fluidization behaviour were characterised for parallelepiped particles with three aspect ratios (1:1, 2:1 and 3:1) and for spherical particles. All drying experiments were conducted at 50 °C and 15% RH using a heat pump dehumidifier system. Fluidization experiments were undertaken for bed heights of 100, 80, 60 and 40 mm and at 10 moisture content levels. Due to irregularities in shape, the minimum fluidization velocity of the parallelepiped particulates (potato) could not be fitted to any empirical model, so a generalized equation was used to predict the minimum fluidization velocity. A modified quasi-stationary method (MQSM) is proposed to describe the drying kinetics of parallelepiped particulates at 30 °C, 40 °C and 50 °C, which dry mostly in the falling-rate period in a batch-type fluid bed dryer.
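
The abstract does not specify which generalized equation was applied; as a hedged illustration only, the sketch below uses the widely cited Wen and Yu correlation, which estimates minimum fluidization velocity from the Archimedes number. All particle and gas properties are assumed values.

```python
import math

def u_mf_wen_yu(d_p, rho_p, rho_g, mu_g, g=9.81):
    """Minimum fluidization velocity [m/s] via the Wen & Yu (1966) correlation."""
    ar = rho_g * (rho_p - rho_g) * g * d_p**3 / mu_g**2      # Archimedes number
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7          # Reynolds number at U_mf
    return re_mf * mu_g / (rho_g * d_p)

# Example with assumed properties: ~5 mm equivalent-diameter potato particle
# fluidized in warm air.
print(u_mf_wen_yu(d_p=5e-3, rho_p=1080.0, rho_g=1.09, mu_g=1.95e-5))
```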

Relevance:

10.00%

Publisher:

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Because a large number of Web services are available, finding an appropriate Web service that matches the user's requirements is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery and match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, as well as their input and output parameters, can lead to more accurate Web service discovery, and appropriate linking of individual matched services should then fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery.

A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web service description language documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms that could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In this link-analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path with the minimum traversal cost. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user.

In order to evaluate the performance of the proposed method, extensive experimentation was performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 Web services found in phase I for linking. Empirical results also confirm that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
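
As a rough illustration of the link-analysis phase only (not the thesis's implementation), the sketch below models a handful of hypothetical services as graph nodes and runs the Floyd-Warshall all-pairs shortest-path algorithm to find a minimum-cost composition path; all service names and edge costs are invented.

```python
from itertools import product

# Hypothetical services as graph nodes; edge weights stand in for the cost of
# linking one service's output to another's input.
INF = float("inf")
services = ["BookFlight", "ConvertCurrency", "BookHotel", "PlanTrip"]
n = len(services)

dist = [[INF] * n for _ in range(n)]
for i in range(n):
    dist[i][i] = 0.0
# Directed edges: (from, to, cost) representing feasible input/output links.
for u, v, w in [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 4.0), (2, 3, 1.0)]:
    dist[u][v] = min(dist[u][v], w)

# Floyd-Warshall: relax every pair (i, j) through every intermediate node k.
for k, i, j in product(range(n), repeat=3):
    if dist[i][k] + dist[k][j] < dist[i][j]:
        dist[i][j] = dist[i][k] + dist[k][j]

src, dst = services.index("BookFlight"), services.index("PlanTrip")
print(f"minimum composition cost {services[src]} -> {services[dst]}: {dist[src][dst]}")
```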

Relevance:

10.00%

Publisher:

Abstract:

Some Engineering Faculties are turning to the problem-based learning (PBL) paradigm to engender the necessary skills and competence in their graduates. Since, at the same time, some Faculties are moving towards distance education, questions are being asked about the effectiveness of PBL for technical fields such as Engineering when delivered in virtual space. This paper outlines an investigation of how student attributes affect the learning experience in PBL courses offered in virtual space. A frequency distribution was superimposed on the outcome space of a phenomenographic study of a suitable PBL course to investigate the effect of different student attributes on the learning experience. It was found that the quality, quantity and style of facilitator interaction had the greatest impact on the student learning experience. This highlights the need to establish consistent student interaction plans and to set, and ensure compliance with, minimum standards for facilitation and student interactions.

Relevance:

10.00%

Publisher:

Abstract:

The use of polycaprolactone (PCL) as a biomaterial, especially in the fields of drug delivery and tissue engineering, has enjoyed significant growth. Understanding how such a device or scaffold eventually degrades in vivo is paramount as the defect site regenerates and remodels. Degradation studies of three-dimensional PCL and PCL-based composite scaffolds were conducted in vitro (in phosphate buffered saline) and in vivo (rabbit model). Results up to 6 months are reported. All samples recorded virtually no molecular weight changes after 6 months, with a maximum mass loss of only about 7% from the PCL-composite scaffolds degraded in vivo, and a minimum of 1% from PCL scaffolds. Overall, crystallinity increased slightly because of the effects of polymer recrystallization. This was also a contributory factor for the observed stiffness increment in some of the samples, while only the PCL-composite scaffold registered a decrease. Histological examination of the in vivo samples revealed good biocompatibility, with no adverse host tissue reactions up to 6 months. Preliminary results of medical-grade PCL scaffolds, which were implanted for 2 years in a critical-sized rabbit calvarial defect site, are also reported here and support our scaffold design goal for gradual and late molecular weight decreases combined with excellent long-term biocompatibility and bone regeneration. (C) 2008 Wiley Periodicals, Inc. J Biomed Mater Res 90A: 906-919, 2009