933 results for Application techniques


Relevance: 60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 60.00%

Publisher:

Abstract:

Registration is a necessarily sophisticated evaluation process applied to vertebrate pesticide products. Although conducted to minimize potential impacts on public health, the environment and food production, the all-encompassing process of registration can stifle innovation. Vertebrate pesticides are rarely used to control pest animals in food crops. In contrast to agrochemicals, relatively small amounts of vertebrate pesticides are used (50.1%), usually in solid or paste baits, and generally by discrete application methods rather than by broad-scale spray applications. We present a hierarchy, or sliding scale, of typical data requirements relative to application techniques, to help clarify an evolving science-based approach that focuses on requiring data to address key scientific questions while allowing waivers where additional data would have minor value. Such an approach will facilitate the development and delivery of increasingly humane, species-targeted, low-residue pesticides in the New World, along with the phasing out of less desirable chemicals that continue to be used for lack of alternatives.

Relevance: 60.00%

Publisher:

Abstract:

Natural stones have been widely used in construction since antiquity. Building materials undergo decay processes due to mechanical, chemical, physical and biological causes that can act together, so an interdisciplinary approach is required to understand the interaction between the stone and the surrounding environment. The use of buildings, inadequate restoration activities and anthropogenic weathering factors in general may contribute to this degradation. For these reasons, new technologies and techniques have been developed and introduced in the restoration field over the last few decades. Consolidants are widely used in the restoration and conservation of cultural heritage to improve internal cohesion and to reduce the weathering rate of building materials. Defining the penetration depth of a consolidant is important for determining its efficacy. Impregnation depends mainly on the microstructure of the stone (i.e. its porosity) and on the properties of the product itself. In this study, tetraethoxysilane (TEOS) applied to globigerina limestone samples was chosen as the object of investigation. After hydrolysis and condensation, TEOS deposits silica gel inside the pores, improving the cohesion of the grains. X-ray computed tomography was used to characterize the internal structure of the limestone samples, treated and untreated with a TEOS-based consolidant. The aim of this work is to investigate the penetration depth and the distribution of the TEOS inside the pore network, using both traditional approaches and advanced X-ray tomographic techniques, the latter allowing three-dimensional visualization of the material's interior. Fluid transport properties and porosity were studied both at the macroscopic scale, by means of capillary uptake tests and radiography, and at the microscopic scale, by X-ray Tomographic Microscopy (XTM).
This makes it possible to identify changes in porosity, by comparing images acquired before and after treatment, and to locate the consolidant inside the stone. Tests were initially run at the University of Bologna, where characterization of the stone was carried out. The research then continued in Switzerland: X-ray tomography and radiography were performed at Empa, the Swiss Federal Laboratories for Materials Science and Technology, while XTM measurements with synchrotron radiation were run at the Paul Scherrer Institute in Villigen.
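The capillary uptake test mentioned above is conventionally reduced to a single sorptivity coefficient: in a Washburn-type absorption regime, cumulative uptake per unit area grows linearly with the square root of time. A minimal sketch of that reduction, using hypothetical measurement values (not the study's data) and assuming numpy is available:

```python
import numpy as np

# Hypothetical capillary-uptake measurements for one limestone sample:
# cumulative absorbed mass per unit inflow area (kg/m^2) at each time (s).
t = np.array([60.0, 240.0, 540.0, 960.0, 1500.0])   # seconds
uptake = np.array([0.62, 1.21, 1.83, 2.44, 3.05])    # kg/m^2

# Washburn-type behaviour: uptake ~ S * sqrt(t), with sorptivity S.
sqrt_t = np.sqrt(t)

# Least-squares fit through the origin: S = sum(x*y) / sum(x*x).
S = float(np.dot(sqrt_t, uptake) / np.dot(sqrt_t, sqrt_t))

print(f"sorptivity S = {S:.4f} kg/(m^2 s^0.5)")
```

Comparing S for treated and untreated samples is one simple way to quantify the consolidant's effect on capillary transport: a lower sorptivity after treatment is consistent with pore filling by the deposited silica gel.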

Relevance: 60.00%

Publisher:

Abstract:

The outcome and morbidity of prostate cancer treatment by radiation therapy depend on the balance between tumour control and normal tissue damage. Recent technological advances have made it possible to reduce the amount of normal tissue included in target treatment volumes. This diminishes morbidity and provides an opportunity for dose escalation, increasing tumour control rates. The new application techniques are discussed along with their integration into treatment concepts. Although there are no randomised studies to provide evidence of increased survival, the available evidence supports the hypothesis that the introduction of novel radiation techniques achieves survival rates equivalent to surgical series, with sufficient safety.

Relevance: 60.00%

Publisher:

Abstract:

The indications for direct resin composite restorations have been extended by the development of modern resin materials with improved properties. However, handling resin composite material remains difficult, especially in large restorations, and a functional, individual occlusion is hard to reconstruct with direct application techniques. The aim of the present publication is to introduce a new "stamp" technique for placing large composite restorations. The procedure is presented for three typical indications: a large single-tooth restoration, occlusal rehabilitation of an occlusal surface compromised by erosion, and a direct fibre-reinforced fixed partial denture. A step-by-step description of the technique and clinical figures illustrate the method. Large single-tooth restorations can be built up with individual two-piece silicone stamps. Large occlusal abrasive and/or erosive defects can be restored by copying the dental technician's wax-up using the "stamp" technique. Even fibre-reinforced resin-bonded fixed partial dentures can be formed with this intraoral technique, with greater precision and in a shorter treatment time. The presented "stamp" technique facilitates the placement of large composite restorations and can be recommended for clinical use.

Relevance: 60.00%

Publisher:

Abstract:

The aim of this work was to study the toxic activity of laurel (Laurus nobilis L.) essential oil and of cineole, a cyclic monoterpene considered a pure allelochemical, against Brevicoryne brassicae L. on cabbage (Brassica oleracea var. capitata L.). The essential oil concentrations tested were 1, 1.5, 2 and 3%, and for cineole 0.5, 1.5 and 2.5%, both formulated in aqueous solution with 2% polyethylene glycol oleate as an emulsifier. Two application techniques were used: impregnated papers and direct spraying. Mortality percentage was evaluated 24 hours after treatment. The results were analysed by two-way ANOVA and Tukey's test. Significant differences were found between the products evaluated at all concentrations tested; the application techniques showed no significant differences. The highest mortality was 52% for laurel essential oil and 27.5% for cineole, at the highest concentrations in both cases. We conclude that the products tested could be a tool in Integrated Pest Management for aphid control in horticultural crops.
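The study's two-way ANOVA (factors: product and application technique) can be sketched for a balanced design as follows. The mortality values below are hypothetical stand-ins, not the reported data, and numpy is assumed available:

```python
import numpy as np

# Hypothetical balanced mortality data (%): factor A = product
# (laurel oil vs cineole), factor B = application technique
# (impregnated paper vs direct spray), n = 3 replicates per cell.
# Array shape: (a, b, n).
y = np.array([
    [[50.0, 54.0, 52.0],   # laurel oil, impregnated paper
     [48.0, 53.0, 51.0]],  # laurel oil, direct spray
    [[26.0, 29.0, 27.5],   # cineole,    impregnated paper
     [25.0, 28.0, 27.0]],  # cineole,    direct spray
])
a, b, n = y.shape

grand = y.mean()
mean_A = y.mean(axis=(1, 2))          # per-product means
mean_B = y.mean(axis=(0, 2))          # per-technique means
mean_AB = y.mean(axis=2)              # per-cell means

# Sums of squares for a balanced two-way design with interaction.
ss_A = b * n * np.sum((mean_A - grand) ** 2)
ss_B = a * n * np.sum((mean_B - grand) ** 2)
ss_AB = n * np.sum((mean_AB - mean_A[:, None] - mean_B[None, :] + grand) ** 2)
ss_E = np.sum((y - mean_AB[:, :, None]) ** 2)
ss_T = np.sum((y - grand) ** 2)

# Mean squares and F ratios against the error term.
ms_E = ss_E / (a * b * (n - 1))
F_A = (ss_A / (a - 1)) / ms_E
F_B = (ss_B / (b - 1)) / ms_E
print(f"F(product) = {F_A:.1f}, F(technique) = {F_B:.1f}")
```

With this toy data the product factor dominates and the technique factor is weak, mirroring the qualitative pattern reported above (significant product effect, no significant technique effect).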

Relevance: 40.00%

Publisher:

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing for outlier removal, calculation of moving averages, and data summarisation and abstraction. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population; subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads).
Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used to evaluate models generated with different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to give slightly increased accuracy at the cost of markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution; this can be addressed by considering the AUC or Kappa statistic, as well as by evaluating subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with time-segmented summary data (dataset F) at 9.8 and raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data alone, model complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated, non-redundant variables from the time-series-only datasets, but models derived from these subsets consist of a single leaf. MR values are consistent with the class distribution in the subset folds evaluated in n-fold cross-validation. For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09.
The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR, at 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant, while adding time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules that are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables over risk factors alone parallels recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when both time-series and risk factor variables were used as model input.
In the absence of risk factor input, using time-series variables after outlier removal, and time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
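Two of the evaluation measures above, misclassification rate and the Kappa statistic, can be computed directly from a confusion matrix. A minimal sketch with hypothetical counts chosen to mimic the unbalanced class distribution discussed in the abstract:

```python
# Hypothetical binary confusion matrix for a CVD classifier:
# rows = actual (negative, positive), columns = predicted.
tn, fp = 850, 30
fn, tp = 70, 50
total = tn + fp + fn + tp

# Misclassification rate: fraction of wrong predictions.
mr = (fp + fn) / total

# Observed agreement (accuracy).
p_o = (tn + tp) / total

# Chance agreement: product of the marginals for each class, summed.
p_neg = ((tn + fp) / total) * ((tn + fn) / total)
p_pos = ((fn + tp) / total) * ((fp + tp) / total)
p_e = p_neg + p_pos

# Cohen's kappa corrects observed accuracy for chance agreement,
# which matters when one class heavily outnumbers the other.
kappa = (p_o - p_e) / (1 - p_e)
print(f"MR = {mr:.3f}, kappa = {kappa:.3f}")
```

Here the accuracy looks strong (MR 0.1, in the range reported above), yet kappa is modest, illustrating why the thesis pairs MR with Kappa and AUC rather than relying on MR alone.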

Relevance: 40.00%

Publisher:

Abstract:

The main aim of this thesis is to analyse and optimise a public hospital Emergency Department. The Emergency Department (ED) is a complex system with limited resources and a high demand for those resources. Adding to the complexity is the stochastic nature of almost every element and characteristic in the ED. Interaction with other functional areas also complicates the system, as these areas have a huge impact on the ED while the ED is powerless to change them. It is therefore imperative that Operations Research (OR) be applied to the ED to improve performance within the constraints of the system. The main characteristics of the system to optimise included tardiness, adherence to waiting-time targets, access block and length of stay. A validated and verified simulation model was built to model the real-life system. This enabled detailed analysis of resources and flow without disruption to the actual ED, and allowed a wide range of ED policies and resource configurations to be investigated. Of particular interest were the number and type of beds in the ED and the shift times of physicians. Notably, neither of these resources works in isolation, and optimisation of the system requires both to be investigated in tandem. The ED was likened to a flow shop scheduling problem, with patients and beds being synonymous with the jobs and machines typically found in manufacturing problems. This enabled an analytic scheduling approach. Constructive heuristics were developed to reactively schedule the system in real time, and these improved the performance of the system. Metaheuristics that optimised the system were also developed and analysed. An innovative hybrid Simulated Annealing and Tabu Search algorithm was developed that outperformed both simulated annealing and tabu search by combining some of their features: it achieves a more optimal solution, and does so in a shorter time.
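The hybrid-metaheuristic idea can be illustrated on a toy problem. The sketch below is not the thesis's algorithm or data: it applies simulated-annealing acceptance plus a short tabu list of recent swap moves (with an aspiration rule) to a small, hypothetical single-machine total-tardiness instance:

```python
import math
import random

# Hypothetical jobs: (processing_time, due_date).
JOBS = [(4, 5), (3, 6), (7, 8), (2, 4), (5, 12), (6, 10)]

def total_tardiness(seq):
    """Objective: sum of max(0, completion - due_date) over the sequence."""
    t, cost = 0, 0
    for j in seq:
        p, d = JOBS[j]
        t += p
        cost += max(0, t - d)
    return cost

def hybrid_sa_tabu(iters=3000, t0=10.0, alpha=0.999, tabu_len=15, seed=1):
    rng = random.Random(seed)
    cur = list(range(len(JOBS)))
    best = cur[:]
    temp = t0
    tabu = []                               # recently applied swap moves
    for _ in range(iters):
        i, j = rng.sample(range(len(JOBS)), 2)
        move = (min(i, j), max(i, j))
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        cand_cost = total_tardiness(cand)
        # Tabu feature: skip recently used swaps, unless the candidate
        # beats the best solution found so far (aspiration criterion).
        if move in tabu and cand_cost >= total_tardiness(best):
            continue
        # SA feature: always accept improvements; accept worsening
        # moves with Boltzmann probability exp(-delta / temp).
        delta = cand_cost - total_tardiness(cur)
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            cur = cand
            tabu.append(move)
            if len(tabu) > tabu_len:
                tabu.pop(0)                 # fixed-length tabu memory
            if cand_cost < total_tardiness(best):
                best = cur[:]
        temp *= alpha                       # geometric cooling schedule
    return best, total_tardiness(best)

schedule, cost = hybrid_sa_tabu()
print(schedule, cost)
```

The hybrid combines SA's temperature-controlled diversification with tabu search's short-term memory, the same pairing of features the thesis credits for outperforming either metaheuristic alone.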

Relevance: 40.00%

Publisher:

Abstract:

Background: Access to cardiac services is essential for appropriate implementation of evidence-based therapies to improve outcomes. The Cardiac Accessibility and Remoteness Index for Australia (Cardiac ARIA) aimed to derive an objective, geographic measure reflecting access to cardiac services. Methods: An expert panel defined an evidence-based clinical pathway. Using Geographic Information Systems (GIS), a numeric/alphabetic index was developed at two points along the continuum of care. The acute category (numeric) measured the time from the emergency call to arrival at an appropriate medical facility via road ambulance. The aftercare category (alphabetic) measured access to four basic services (family doctor, pharmacy, cardiac rehabilitation, and pathology services) when a patient returned to their community. Results: The numeric index ranged from 1 (access to a principal referral centre with a cardiac catheterization service within 1 hour) to 8 (no ambulance service, more than 3 hours to a medical facility, air transport required). The alphabetic index ranged from A (all four services available within a 1-hour drive time) to E (no services available within 1 hour). 13.9 million Australians (71%) resided within Cardiac ARIA 1A locations (hospital with cardiac catheterization laboratory and all aftercare within 1 hour). Those outside Cardiac ARIA 1A locations were over-represented by people aged over 65 years (32%) and Indigenous people (60%). Conclusion: The Cardiac ARIA index demonstrated substantial inequity in access to cardiac services in Australia. The methodology can be used to inform cardiology health service planning and could be applied to other common disease states in other regions of the world.
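The two-part index described above lends itself to a simple lookup scheme. In the sketch below, only the endpoint definitions (numeric 1 and 8, alphabetic A and E) follow the abstract; the intermediate banding, function names and example drive times are hypothetical:

```python
# Illustrative sketch of the two Cardiac ARIA categories.
# Only the endpoints (1, 8, A, E) follow the abstract; the
# intermediate thresholds here are hypothetical.

AFTERCARE = ("family doctor", "pharmacy", "cardiac rehabilitation",
             "pathology services")

def acute_category(hours_to_facility, has_cath_lab, has_ambulance=True):
    """Numeric index: 1 (cath-lab centre within 1 h) .. 8 (worst access)."""
    if not has_ambulance or hours_to_facility > 3:
        return 8
    if has_cath_lab and hours_to_facility <= 1:
        return 1
    # Hypothetical intermediate banding by drive time.
    return min(7, 2 + int(hours_to_facility * 2))

def aftercare_category(drive_hours):
    """Alpha index: A (all 4 services within 1 h) .. E (none within 1 h)."""
    within = sum(1 for s in AFTERCARE if drive_hours.get(s, 99) <= 1)
    return "EDCBA"[within]        # 4 services -> 'A', 0 -> 'E'

# Example community: all four aftercare services within a 1-hour drive.
loc = {"family doctor": 0.2, "pharmacy": 0.2,
       "cardiac rehabilitation": 0.5, "pathology services": 0.4}
print(acute_category(0.5, has_cath_lab=True), aftercare_category(loc))
```

A community scoring 1 on the acute scale and A on the aftercare scale corresponds to the "Cardiac ARIA 1A" locations in which 71% of Australians were found to reside.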

Relevance: 40.00%

Publisher:

Abstract:

Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on such data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) to PNSD data, in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on a Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. Each cluster was then associated with its possible source based on the results of this parameterisation, together with the characteristics of the cluster. The performance of the four clustering techniques was compared using the Dunn index and silhouette width validation values; K-means was found to perform best, with five clusters being optimal, and five clusters were accordingly extracted from the data. The diurnal occurrence of each cluster was used, together with other air quality parameters, temporal trends and the physical properties of each cluster, to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data.
These two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
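Of the techniques compared above, K-means and the silhouette width are straightforward to sketch. The following self-contained illustration runs on synthetic spectra (not the Brisbane measurements), assumes numpy, and uses a greedy farthest-point initialisation to keep the toy run deterministic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for particle number size distributions: three
# "source" shapes (Gaussian bumps over 16 size bins) plus noise.
bins = np.linspace(0.0, 1.0, 16)
shapes = [np.exp(-((bins - c) ** 2) / 0.02) for c in (0.2, 0.5, 0.8)]
X = np.vstack([s + rng.normal(0.0, 0.05, bins.size)
               for s in shapes for _ in range(20)])

def init_centres(X, k):
    # Greedy farthest-point initialisation (deterministic).
    centres = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centres], axis=0)
        centres.append(X[int(d.argmax())])
    return np.array(centres)

def kmeans(X, k, iters=100):
    centres = init_centres(X, k)
    for _ in range(iters):
        d = ((X[:, None, :] - centres[None]) ** 2).sum(-1)
        labels = d.argmin(1)                 # assign to nearest centre
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centres[j] for j in range(k)])
        if np.allclose(new, centres):        # converged
            break
        centres = new
    return labels, centres

def mean_silhouette(X, labels):
    D = np.sqrt(((X[:, None, :] - X[None]) ** 2).sum(-1))
    scores = []
    for i, li in enumerate(labels):
        same = labels == li
        a = D[i, same].sum() / max(1, same.sum() - 1)   # mean intra-cluster
        b = min(D[i, labels == lj].mean()               # nearest other cluster
                for lj in set(labels) if lj != li)
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

labels, centres = kmeans(X, 3)
sil = mean_silhouette(X, labels)
print("mean silhouette width:", round(sil, 3))
```

Evaluating such validation scores across candidate techniques and cluster counts is the same model-selection logic the study used to arrive at K-means with five clusters.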