410 results for PROCESSING TECHNIQUE


Relevance:

20.00%

Publisher:

Abstract:

Use of focus groups as a technique of inquiry is gaining attention in the area of health-care research. This paper will report on the technique of focus group interviewing to investigate the role of the infection control practitioner. Infection control is examined as a specialty area of health-care practice that has received little research attention to date. Additionally, it is an area of practice that is expanding in response to social, economic and microbiological forces. The focus group technique in this study helped a group of infection control practitioners from urban, regional and rural areas throughout Queensland identify and categorise their daily work activities. The outcomes of this process were then analysed to identify the growth in breadth and complexity of the role of the infection control practitioner in the contemporary health-care environment. Findings indicate that the role of the infection control practitioner in Australia has undergone changes consistent with and reflecting changing models of health-care delivery.

Relevance:

20.00%

Publisher:

Abstract:

Systematic studies that evaluate the quality of decision-making processes are relatively rare. Using the literature on decision quality, this research develops a framework to assess the quality of decision-making processes for resolving boundary conflicts in the Philippines. The evaluation framework breaks the decision-making process down into three components (the decision procedure, the decision method, and the decision unit) and is applied to two ex-post cases (one resolved and one unresolved) and one ex-ante case. The evaluation results from the resolved and unresolved cases show that the choice of decision method plays a minor role in resolving boundary conflicts, whereas the choice of decision procedure is more influential. In the end, a decision unit can choose a simple method to resolve the conflict. The ex-ante case is a follow-up intended to resolve the unresolved case through a changed decision-making process, in which the associated decision unit plans to apply the spatial multi-criteria evaluation (SMCE) tool as the decision method. The evaluation results from the ex-ante case confirm that SMCE has the potential to enhance decision quality because: a) it provides a high-quality decision method in this changed process, and b) the weaknesses associated with the decision unit and the decision procedure in the unresolved case were found to be eliminated in this process.
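
The SMCE tool mentioned above is conventionally a weighted linear combination of standardised criterion maps. The sketch below illustrates that combination only; the criterion maps and weights are hypothetical placeholders, not data from the Philippine cases.

```python
import numpy as np

def smce(criteria, weights):
    """Weighted linear combination for spatial multi-criteria evaluation.
    criteria: (n_criteria, rows, cols) maps standardised to [0, 1];
    weights: (n_criteria,) summing to 1. Returns a suitability map."""
    criteria = np.asarray(criteria, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return np.tensordot(weights, criteria, axes=1)

# Two 2x2 hypothetical criterion maps (e.g. claim strength, accessibility).
crit = [[[0.9, 0.2], [0.4, 0.6]],
        [[0.1, 0.8], [0.5, 0.5]]]
suit = smce(crit, [0.6, 0.4])
best = np.unravel_index(np.argmax(suit), suit.shape)  # most suitable cell
```

In a real application each cell would be a map location and the weights would come from stakeholder elicitation; the decision unit then ranks alternatives by the resulting suitability surface.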

Relevance:

20.00%

Publisher:

Abstract:

Capacity probability models of generating units are commonly used in many power system reliability studies at hierarchical level one (HLI). Analytical modelling of a generating system with many units, or with generating units that have many derated states, can result in an extensive number of states in the capacity model. Limitations on the available memory and computational time of present computer facilities can pose difficulties for assessing such systems in many studies. A clustering procedure using the nearest centroid sorting method was previously applied to the IEEE-RTS load model and proved very effective in producing a highly similar model with substantially fewer states. This paper presents an extended application of the clustering method that includes the capacity probability representation. A series of sensitivity studies is illustrated using the IEEE-RTS generating system and load models. The loss of load and loss of energy expectations (LOLE, LOEE) are used as indicators to evaluate the application.
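
The state-reduction idea can be sketched as a one-dimensional nearest-centroid clustering of a (capacity, probability) outage table. This is an illustrative reconstruction, not the paper's exact procedure: the eight-state table below is made up, and real capacity models would use the IEEE-RTS unit data.

```python
import numpy as np

def cluster_capacity_states(capacity, prob, k, iters=50):
    """Merge (capacity, probability) states into k clustered states using
    nearest-centroid assignment with probability-weighted centroids."""
    capacity = np.asarray(capacity, dtype=float)
    prob = np.asarray(prob, dtype=float)
    # seed centroids spread across the capacity range
    centroids = np.linspace(capacity.min(), capacity.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(capacity[:, None] - centroids[None, :]),
                           axis=1)
        for j in range(k):
            m = labels == j
            if m.any():
                # probability-weighted centroid preserves expected capacity
                centroids[j] = np.average(capacity[m], weights=prob[m])
    cluster_prob = np.array([prob[labels == j].sum() for j in range(k)])
    return centroids, cluster_prob

cap = np.array([0, 100, 150, 200, 250, 300, 350, 400])   # MW (hypothetical)
p = np.array([0.01, 0.02, 0.03, 0.04, 0.1, 0.2, 0.3, 0.3])
c, cp = cluster_capacity_states(cap, p, k=3)
```

Because each centroid is the probability-weighted mean of its members, the reduced model keeps the total probability and the expected available capacity of the original table, while LOLE/LOEE computed on it are approximations whose accuracy the paper's sensitivity studies examine.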

Relevance:

20.00%

Publisher:

Abstract:

Multilevel converters, because of their ability to generate high-quality output voltage, are used in several applications. Various modulation and control techniques have been introduced to control the output voltage of multilevel converters, such as space vector modulation and harmonic elimination (HE) methods. Multilevel converters may have a DC link with equal or unequal DC voltages. In this study a new technique based on the HE method is proposed for multilevel converters with unequal DC link voltages. The DC link voltage levels are treated as additional variables for the HE method, and the voltage levels are defined based on the HE results. Increasing the number of voltage levels can reduce the lower-order harmonic content because more variables become available. In comparison to previous methods, this new technique improves output voltage quality by reducing its total harmonic distortion, which must be taken into consideration for applications such as uninterruptible power supplies, motor drive systems and piezoelectric transducer excitation. In order to verify the proposed modulation technique, MATLAB simulations and experimental tests are carried out for a single-phase four-level diode-clamped converter.
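
HE methods rest on the Fourier series of the quarter-wave-symmetric staircase output: with equal step heights V and switching angles theta_i, the nth odd harmonic has amplitude b_n = (4V / n*pi) * sum_i cos(n*theta_i), so angles are chosen to zero the sums for the targeted harmonics while the n = 1 sum sets the fundamental. The sketch below only verifies this identity numerically; the three angles are illustrative, not a solved HE set, and the equal-voltage case is assumed (the paper generalises to unequal DC link voltages).

```python
import numpy as np

def staircase(x, thetas, V=1.0):
    """Quarter-wave-symmetric multilevel staircase: the level steps up by V
    at each switching angle theta_i within the first quarter cycle."""
    x = np.asarray(x) % (2 * np.pi)
    sign = np.where(x < np.pi, 1.0, -1.0)
    xf = np.where(x < np.pi, x, x - np.pi)            # fold to [0, pi)
    xq = np.where(xf <= np.pi / 2, xf, np.pi - xf)    # fold to [0, pi/2]
    counts = (np.asarray(thetas)[None, :] < xq[:, None]).sum(axis=1)
    return sign * V * counts

thetas = np.array([0.2, 0.5, 0.9])      # illustrative angles in radians
M = 200000
x = (np.arange(M) + 0.5) * 2 * np.pi / M
wave = staircase(x, thetas)

def b_numeric(n):
    """nth sine Fourier coefficient by direct numerical integration."""
    return (wave * np.sin(n * x)).sum() * (2 * np.pi / M) / np.pi

def b_analytic(n):
    """Closed form used by the HE equations."""
    return 4.0 / (n * np.pi) * np.cos(n * thetas).sum()
```

In an actual HE design the transcendental system sum_i cos(n*theta_i) = 0 (for each eliminated n, with the n = 1 sum fixed by the modulation index) is solved numerically for the angles; with unequal DC voltages, each cosine term carries its own voltage weight, which is the extra freedom the paper exploits.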

Relevance:

20.00%

Publisher:

Abstract:

In the field of face recognition, Sparse Representation (SR) has received considerable attention during the past few years. Most of the relevant literature focuses on holistic descriptors in closed-set identification applications. The underlying assumption in SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the more challenging face verification scenario, where an algorithm is required to determine if two faces (where one or both have not been seen before) belong to the same person. In this paper, we first discuss why previous attempts with SR might not be applicable to verification problems. We then propose an alternative approach to face verification via SR. Specifically, we propose to use explicit SR encoding on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which are then concatenated to form an overall face descriptor. Due to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, we evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN), and an implicit probabilistic technique based on Gaussian Mixture Models. Thorough experiments on AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the proposed local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, in both verification and closed-set identification problems. The experiments also show that l1-minimisation based encoding has a considerably higher computational cost than the other techniques, but leads to higher recognition rates.
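
The patch-encode / average-pool / concatenate pipeline can be sketched as follows. This is a minimal reconstruction under stated assumptions: the dictionary is random rather than learned, a basic ISTA solver stands in for the paper's l1-minimisation (the SANN and GMM variants are omitted), and patch size, region grid and image size are arbitrary choices for illustration.

```python
import numpy as np

def ista(D, x, lam=0.1, iters=100):
    """l1 sparse coding, min ||D a - x||^2 + lam ||a||_1, via ISTA."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz-related step scale
    a = np.zeros(D.shape[1])
    for _ in range(iters):
        g = a + D.T @ (x - D @ a) / L        # gradient step
        a = np.sign(g) * np.maximum(np.abs(g) - lam / (2 * L), 0.0)
    return a

def face_descriptor(img, D, patch=8, regions=(2, 2)):
    """Encode local patches, average-pool codes per region, concatenate."""
    H, W = img.shape
    rh, rw = H // regions[0], W // regions[1]
    pooled = []
    for ri in range(regions[0]):
        for rj in range(regions[1]):
            codes = []
            for i in range(ri * rh, (ri + 1) * rh - patch + 1, patch):
                for j in range(rj * rw, (rj + 1) * rw - patch + 1, patch):
                    p = img[i:i + patch, j:j + patch].ravel()
                    p = (p - p.mean()) / (p.std() + 1e-8)  # normalise patch
                    codes.append(ista(D, p))
            pooled.append(np.mean(codes, axis=0))          # average pooling
    return np.concatenate(pooled)

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 32))                # hypothetical 32-atom dictionary
D /= np.linalg.norm(D, axis=0)
img = rng.normal(size=(32, 32))              # stand-in for a face image
desc = face_descriptor(img, D)               # 4 regions x 32 atoms = 128-dim
```

Averaging inside each region is what discards intra-region spatial layout, which is the source of the misalignment robustness claimed above; two faces would then be compared by a distance between their descriptors.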

Relevance:

20.00%

Publisher:

Abstract:

Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably with known levels at which it is significantly different from zero and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze evoked EEG responses to flash visual stimuli presented to the left and right eyes separately. The EEG electrode array is segmented based on bicoherence evolution with time using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
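
The ensemble bicoherence estimator behind this methodology can be sketched directly. The code below is an illustrative reconstruction, not the paper's implementation: it uses synthetic windowed trials in place of stimulus-locked EEG ensembles, a Hanning window, and FFT-bin bifrequencies; a quadratically phase-coupled triple yields bicoherence near one, while randomising the third phase destroys it.

```python
import numpy as np

def bicoherence(trials, f1, f2):
    """Bicoherence at bifrequency (f1, f2) over an ensemble of short-time
    windows. trials: (n_trials, n_samples); f1, f2: FFT bin indices."""
    X = np.fft.fft(trials * np.hanning(trials.shape[1]), axis=1)
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = np.abs(np.mean(triple))
    den = np.sqrt(np.mean(np.abs(X[:, f1] * X[:, f2]) ** 2) *
                  np.mean(np.abs(X[:, f1 + f2]) ** 2))
    return num / den

rng = np.random.default_rng(1)
n_trials, N = 200, 256
t = np.arange(N)
f1, f2 = 8, 20                              # bin indices, illustrative only
ph1 = rng.uniform(0, 2 * np.pi, (n_trials, 1))
ph2 = rng.uniform(0, 2 * np.pi, (n_trials, 1))
ph3 = rng.uniform(0, 2 * np.pi, (n_trials, 1))

base = (np.cos(2 * np.pi * f1 * t / N + ph1) +
        np.cos(2 * np.pi * f2 * t / N + ph2))
# phase-coupled: the component at f1+f2 carries the sum of the phases
coupled = base + np.cos(2 * np.pi * (f1 + f2) * t / N + ph1 + ph2)
# uncoupled: independent random phase at f1+f2
uncoupled = base + np.cos(2 * np.pi * (f1 + f2) * t / N + ph3)

b_c = bicoherence(coupled, f1, f2)          # near 1
b_u = bicoherence(uncoupled, f1, f2)        # near 0
```

In the paper's setting, each "trial" is a short window at a fixed offset from the stimulus, and tracking the estimator over offsets gives the time-varying bicoherence used to segment the electrode array.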

Relevance:

20.00%

Publisher:

Abstract:

Automatic pain monitoring has the potential to greatly improve patient diagnosis and outcomes by providing a continuous objective measure. One of the most promising methods is to do this via automatically detecting facial expressions. However, current approaches have failed due to their inability to: 1) integrate the rigid and non-rigid head motion into a single feature representation, and 2) incorporate the salient temporal patterns into the classification stage. In this paper, we tackle the first problem by developing a “histogram of facial action units” representation using Active Appearance Model (AAM) face features, and then utilize a Hidden Conditional Random Field (HCRF) to overcome the second issue. We show that both of these methods improve the performance on the task of pain detection at the sequence level compared to current state-of-the-art methods on the UNBC-McMaster Shoulder Pain Archive.
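
A histogram-of-action-units sequence representation can be sketched in a few lines, assuming an upstream AAM-based detector supplies per-frame AU activation scores. The AU scores below are synthetic placeholders and the 0.5 firing threshold is a hypothetical choice, not a detail from the paper.

```python
import numpy as np

def au_histogram(frame_aus, thresh=0.5):
    """Sequence-level histogram of facial action units.
    frame_aus: (n_frames, n_aus) activation scores in [0, 1].
    Returns, for each AU, the fraction of frames in which it fires."""
    active = np.asarray(frame_aus) >= thresh
    return active.mean(axis=0)

# Four synthetic frames scored for three hypothetical AUs.
seq = np.array([[0.9, 0.1, 0.7],
                [0.8, 0.2, 0.1],
                [0.6, 0.9, 0.2],
                [0.7, 0.1, 0.9]])
hist = au_histogram(seq)
```

Pooling frame activations into one fixed-length vector is what lets a sequence classifier (an HCRF in the paper) be fed variable-length videos; the temporal ordering that the histogram discards is exactly what the HCRF stage reintroduces.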

Relevance:

20.00%

Publisher:

Abstract:

We sought to determine the impact of electrospinning parameters on a trustworthy criterion that could evidently improve the maximum applicability of fibrous scaffolds for tissue regeneration. We used an image analysis technique to elucidate the web permeability index (WPI) by modeling the formation of electrospun scaffolds. Poly(3-hydroxybutyrate) (P3HB) scaffolds were fabricated according to predetermined conditions of levels in a Taguchi orthogonal design. The material parameters were the polymer concentration, conductivity, and volatility of the solution. The processing parameters were the applied voltage and nozzle-to-collector distance. As a general trend in the WPI values, the pore interconnectivity decreased when the polymer concentration or the applied voltage was increased. The quality of the jet instability altered the pore numbers, areas, and other structural characteristics, all of which determined the scaffold porosity and aperture interconnectivity. An initial drastic increase was observed in the WPI values because of the chain entanglement phenomenon above a 6 wt % P3HB content. Although the solution mixture significantly (p < 0.05) changed the scaffold architectural characteristics as a function of the solution viscosity and surface tension, it had a minor impact on the WPI values. The solution mixture ranked third in significance, and the nozzle-to-collector distance was found to be the least important factor.

Relevance:

20.00%

Publisher:

Abstract:

Production of nanofibrous polyacrylonitrile/calcium carbonate (PAN/CaCO3) nanocomposite web was carried out through a solution electrospinning process. Pore-generating nanoparticles were leached from the PAN matrices in a hydrochloric acid bath with the purpose of producing an ultimate nanoporous structure. The possible interaction between CaCO3 nanoparticles and PAN functional groups was investigated. The atomic absorption method was used to measure the amount of extracted CaCO3 nanoparticles. Morphological observation showed nanofibers of 270–720 nm in diameter containing nanopores of 50–130 nm. Monitoring the governing parameters statistically, it was found that the amount of extraction (ε) of CaCO3 increased when the web surface area (a) was broadened, according to a simple scaling law (ε = 3.18a^0.4). The leaching process was maximized in the presence of 5% v/v of acid in the extraction bath and 5 wt % of CaCO3 in the polymer solution. The combined effects of extraction time and temperature showed exponential growth, with a favorable extremum at 50°C for 72 h. The concentration of dimethylformamide as the solvent had no significant impact on the extraction level.
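
The reported scaling law is a simple power law and can be evaluated directly; note that units follow the paper's own convention for ε and a, and the sample area value below is purely illustrative.

```python
def extraction(a, k=3.18, n=0.4):
    """Amount of extracted CaCO3 per the reported scaling law eps = k * a**n;
    a is the web surface area in the paper's units (example value arbitrary)."""
    return k * a ** n

eps = extraction(16.0)   # extraction predicted for an illustrative area of 16
```

The sublinear exponent (0.4) means extraction grows with surface area but with diminishing returns, consistent with the leaching being surface-mediated.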

Relevance:

20.00%

Publisher:

Abstract:

Several approaches have been introduced in the literature for active noise control (ANC) systems. Since the FxLMS algorithm appears to be the best choice as a controller filter, researchers tend to improve the performance of ANC systems by enhancing and modifying this algorithm. This paper proposes a new version of the FxLMS algorithm. In many ANC applications an online secondary path modelling method that uses white noise as a training signal is required to ensure convergence of the system. This paper also proposes a new approach for online secondary path modelling in feedforward ANC systems. The proposed algorithm stops the injection of white noise at the optimum point and reactivates the injection during operation, if needed, to maintain the performance of the system. The new version of the FxLMS algorithm, together with the avoidance of continuous white noise injection, makes the system more desirable and improves the noise attenuation performance. Comparative simulation results indicate the effectiveness of the proposed approach.
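
The baseline FxLMS loop that the paper modifies can be sketched as follows. This is a minimal simulation under stated assumptions: the primary and secondary paths are short hypothetical FIR filters, and the secondary-path estimate is taken as perfect, whereas obtaining that estimate online (with intermittent rather than continuous white-noise injection) is precisely the paper's contribution.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([0.8, 0.4, 0.2])        # primary path (hypothetical FIR)
S = np.array([0.9, 0.3])             # secondary path (hypothetical FIR)
S_hat = S.copy()                     # assumed-known secondary path estimate

n_taps, mu, N = 8, 0.01, 5000
w = np.zeros(n_taps)                 # adaptive controller weights
x = rng.standard_normal(N)           # reference noise signal
d = np.convolve(x, P)[:N]            # disturbance at the error microphone
xf = np.convolve(x, S_hat)[:N]       # filtered-x reference
e = np.zeros(N)
ypast = np.zeros(len(S))             # recent anti-noise samples

for n in range(N):
    xvec = x[max(0, n - n_taps + 1):n + 1][::-1]
    xvec = np.pad(xvec, (0, n_taps - len(xvec)))
    ypast = np.roll(ypast, 1)
    ypast[0] = w @ xvec                          # anti-noise sample
    e[n] = d[n] - S @ ypast                      # residual at the error mic
    xfvec = xf[max(0, n - n_taps + 1):n + 1][::-1]
    xfvec = np.pad(xfvec, (0, n_taps - len(xfvec)))
    w += mu * e[n] * xfvec                       # FxLMS weight update
```

Filtering the reference through the secondary-path estimate before the LMS update is what keeps the adaptation stable despite the delay and colouring introduced by S; with a mismatched or missing estimate the loop can diverge, which is why online secondary-path modelling matters.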

Relevance:

20.00%

Publisher:

Abstract:

We applied a texture-based flow visualisation technique to a numerical hydrodynamic model of the Pumicestone Passage in southeast Queensland, Australia. The quality of the visualisations produced by our flow visualisation tool is compared with animations generated using more traditional drogue release plots and velocity contour and vector techniques. The texture-based method is found to be far more effective in visualising advective flow within the model domain. In some instances, it also makes it easier for the researcher to identify specific hydrodynamic features within the complex flow regimes of this shallow tidal barrier estuary compared with the direct and geometry-based methods.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a unified view of the relationship between (1) quantity and (2) price generating mechanisms in estimating individual prime construction costs/prices. A brief review of quantity generating techniques is provided, with particular emphasis on experientially based assumptive approaches, and this is compared with the level of pricing data available for the quantities generated in terms of the reliability of the ensuing prime cost estimates. It is argued that there is a tradeoff between the reliability of quantity items and the reliability of rates. Thus it is shown that the level of quantity generation is optimised by maximising the joint reliability function of the quantity items and their associated rates. Some thoughts on how this joint reliability function can be evaluated and quantified follow. The application of these ideas is described within the overall strategy of the estimator's decision, "Which estimating technique shall I use for a given level of contract information?", and a case is made for the computer generation of estimates by several methods, with an indication of the reliability of each estimate, the ultimate choice of estimate being left to the estimator concerned. Finally, the potential for the development of automatic estimating systems within this framework is examined.
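
The tradeoff argument can be made concrete with a small numerical sketch: quantity reliability rises with the level of measurement detail while rate reliability falls, and the optimal level maximises their product. The two reliability curves below are hypothetical illustrations, not functions estimated in the paper.

```python
import numpy as np

levels = np.arange(1, 11)                  # coarse (1) .. detailed (10)
r_quantity = 1 - np.exp(-0.5 * levels)     # quantity reliability: improves
r_rates = np.exp(-0.15 * (levels - 1))     # rate reliability: degrades
joint = r_quantity * r_rates               # joint reliability function
best = int(levels[np.argmax(joint)])       # level of detail to adopt
```

With these particular curves the product peaks at an intermediate level of detail, which is the paper's point: neither the coarsest single-rate estimate nor the fullest bill of quantities maximises the reliability of the resulting prime cost estimate.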

Relevance:

20.00%

Publisher:

Abstract:

Two approaches are described, which aid the selection of the most appropriate procurement arrangements for a building project. The first is a multi-attribute technique based on the National Economic Development Office procurement path decision chart. A small study is described in which the utility factors involved were weighted by averaging the scores of five 'experts' for three hypothetical building projects. A concordance analysis is used to provide some evidence of any abnormal data sources. When applied to the study data, one of the experts was seen to be atypical. The second approach is by means of discriminant analysis. This was found to provide reasonably consistent predictions through three discriminant functions. The analysis also showed the quality criteria to have no significant impact on the decision process. Both approaches provided identical and intuitively correct answers in the study described. Some concluding remarks are made on the potential of discriminant analysis for future research and development in procurement selection techniques.

Relevance:

20.00%

Publisher:

Abstract:

Vision-based SLAM is mostly a solved problem provided that clear, sharp images can be obtained. However, in outdoor environments a number of factors such as rough terrain, high speeds and hardware limitations can result in these conditions not being met. High-speed transit on rough terrain can lead to image blur and under/over exposure, problems that cannot easily be dealt with using low-cost hardware. Furthermore, recently there has been a growth in interest in lifelong autonomy for robots, which brings with it the challenge, in outdoor environments, of dealing with a moving sun and a lack of constant artificial lighting. In this paper, we present a lightweight approach to visual localization and visual odometry that addresses the challenges posed by perceptual change and low-cost cameras. The approach combines low-resolution imagery with the SLAM algorithm RatSLAM. We test the system using a cheap consumer camera mounted on a small vehicle in a mixed urban and vegetated environment, at times ranging from dawn to dusk and in conditions ranging from sunny weather to rain. We first show that the system is able to provide reliable mapping and recall over the course of the day and incrementally incorporate new visual scenes from different times into an existing map. We then restrict the system to only learning visual scenes at one time of day, and show that the system is still able to localize and map at other times of day. The results demonstrate the viability of the approach in situations where image quality is poor and environmental or hardware factors preclude the use of visual features.
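
The core of feature-free, low-resolution matching of this kind can be sketched as downsampling each camera frame to a tiny normalised signature and recalling the stored view with the lowest sum of absolute differences. This is an illustrative reconstruction in the spirit of RatSLAM view templates, not the paper's code: the images are synthetic, and the 8x8 signature size and the illumination-change model are arbitrary choices.

```python
import numpy as np

def to_signature(img, size=(8, 8)):
    """Downsample to a tiny patch and normalise away brightness/contrast,
    similar in spirit to RatSLAM-style low-resolution view templates."""
    H, W = img.shape
    bh, bw = H // size[0], W // size[1]
    small = img[:bh * size[0], :bw * size[1]]
    small = small.reshape(size[0], bh, size[1], bw).mean(axis=(1, 3))
    return (small - small.mean()) / (small.std() + 1e-8)

def best_match(query, templates):
    """Index of the stored template with the lowest sum of absolute
    differences to the query signature."""
    sads = [np.abs(query - t).sum() for t in templates]
    return int(np.argmin(sads))

rng = np.random.default_rng(2)
scenes = [rng.normal(size=(64, 64)) for _ in range(5)]   # synthetic views
templates = [to_signature(s) for s in scenes]
# revisit scene 3 under a mild illumination change plus sensor noise
revisit = scenes[3] * 1.3 + 0.2 + 0.05 * rng.normal(size=(64, 64))
match = best_match(to_signature(revisit), templates)
```

Because the signature subtracts the mean and divides by the standard deviation, affine illumination changes cancel out, and the heavy downsampling averages away blur and pixel noise; this is what lets whole-image matching keep working where sparse feature detectors fail.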