900 results for work system method


Relevance:

30.00%

Publisher:

Abstract:

This research project examines the application of the Suzuki Actor Training Method (the Suzuki Method) within the work of Tadashi Suzuki's company in Japan, the Shizuoka Performing Arts Complex (SPAC), within the work of Brisbane theatre company Frank:Austral Asian Performance Ensemble (Frank:AAPE), and as related to the development of the theatre performance Surfacing. These three theatrical contexts have been studied from the viewpoint of a "participant-observer": the researcher has trained in the Suzuki Method with Frank:AAPE and SPAC, performed with Frank:AAPE, and was the solo performer and collaborative developer in the performance Surfacing (directed by Leah Mercer). Observations of these three groups are based on a phenomenological definition of the "integrated actor": an actor who is able to achieve a totality or unity between the body and the mind, and between the body and the voice, through a powerful sense of intention. The term "integrated actor" has been informed by the philosophy of Merleau-Ponty and his concept of the "lived body". Three main hypotheses are presented in this study: that the Suzuki Method focuses on actors learning through their body; that the Suzuki Method presents a holistic approach to the body and the voice; and that the Suzuki Method develops actors with a strong sense of intention. These three aspects of the Suzuki Method are explored in relation to the stylistic features of the work of SPAC, Frank:AAPE and the performance Surfacing.

Relevance:

30.00%

Publisher:

Abstract:

This thesis discusses various aspects of the integrity monitoring of GPS applied to civil aircraft navigation in different phases of flight. These flight phases include en route, terminal, non-precision approach and precision approach. The thesis covers four major topics: the probability problem of the GPS navigation service, risk analysis of aircraft precision approach and landing, theoretical analysis of Receiver Autonomous Integrity Monitoring (RAIM) techniques and RAIM availability, and GPS integrity monitoring at a ground reference station. Particular attention is paid to the mathematical aspects of the GPS integrity monitoring system. The research has been built upon the stringent integrity requirements defined by the civil aviation community, and concentrates on investigating the capability and performance of practical integrity monitoring systems with rigorous mathematical and statistical concepts and approaches. The major contributions of this research are:
• Rigorous integrity and continuity risk analysis for aircraft precision approach. Based on the joint probability density function of the contributing components, the integrity and continuity risks of aircraft precision approach with DGPS were computed. This advances the conventional method of allocating the risk probability.
• A theoretical study of RAIM test power. This is the first theoretical study of RAIM test power based on probability and statistical theory, and it results in a new set of RAIM criteria.
• Development of a GPS integrity monitoring and DGPS quality control system based on a GPS reference station. A prototype GPS integrity monitoring and DGPS correction prediction system has been developed and tested, based on the AUSNAV GPS base station on the roof of the QUT ITE Building.
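For orientation, snapshot RAIM fault detection is commonly formulated as a chi-square test on the least-squares pseudorange residuals. The sketch below is the generic textbook test, not the thesis's new criteria; the noise level and false-alarm probability are illustrative assumptions.

```python
import numpy as np
from scipy.stats import chi2

def raim_test(G, rho, sigma=1.0, p_fa=1e-5):
    """Snapshot RAIM fault detection via the least-squares residual method.

    G     : (m, 4) linearized geometry matrix (m satellites, 4 unknowns)
    rho   : (m,) pseudorange residuals about the linearization point
    sigma : assumed pseudorange noise standard deviation (metres)
    p_fa  : allowed false-alarm probability (a continuity requirement)
    """
    m, n = G.shape
    x_hat, *_ = np.linalg.lstsq(G, rho, rcond=None)  # position/clock estimate
    v = rho - G @ x_hat                              # post-fit residuals
    T = float(v @ v) / sigma**2                      # chi-square under no fault
    threshold = chi2.ppf(1.0 - p_fa, df=m - n)
    return T > threshold                             # True flags a possible fault
```

Redundancy is essential here: with m satellites the test statistic has m - 4 degrees of freedom, so at least five satellites are needed before any fault can be detected at all.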

Relevance:

30.00%

Publisher:

Abstract:

Experts in injection molding often refer to previous solutions to find a mold design similar to the current mold, and use previous successful molding process parameters, with intuitive adjustment and modification, as a start for the new molding application. This approach saves a substantial amount of time and cost in the experiment-based corrective actions that are otherwise required to reach optimum molding conditions. A Case-Based Reasoning (CBR) system can perform the same task by retrieving a similar case from the case library and using modification rules to adapt its solution to the new case. A CBR system can therefore simulate human expertise in injection molding process design. This research is aimed at developing an interactive Hybrid Expert System (HES) to reduce the expert dependency needed on the production floor. The HES comprises CBR, flow analysis, post-processor and troubleshooting subsystems. It can provide the first set of operating parameters needed to achieve moldability and produce moldings free of stress cracks and warpage. In this work the C++ programming language is used to implement the expert system. The Case-Based Reasoning subsystem is constructed to derive the optimum magnitudes of the process parameters in the cavity. Toward this end the Flow Analysis subsystem is employed to calculate the pressure drop and temperature difference in the feed system, to determine the required magnitude of the parameters at the nozzle. The Post-Processor converts the molding parameters to machine setting parameters. The parameters designed by the HES are implemented on the injection molding machine. In the presence of any molding defect, a troubleshooting subsystem can determine which combination of process parameters must be changed during the process to deal with possible variations. Constraints on the application of this HES are as follows:
- flow length (L): 40 mm < L < 100 mm;
- flow thickness (Th): 1 mm < Th < 4 mm;
- flow type: unidirectional flow;
- material types: High Impact Polystyrene (HIPS) and Acrylic.
In order to test the HES, experiments were conducted and satisfactory results were obtained.
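The retrieval step of a CBR subsystem like this one is, at its core, weighted nearest-neighbour matching over case attributes. The sketch below is a hypothetical illustration only: the attribute names, weights and case values are invented, and the normalization ranges are taken from the stated flow length and thickness constraints, not from the thesis's case library.

```python
# Hypothetical case library: each case pairs a problem description with
# the molding parameters that solved it (values invented for illustration).
case_library = [
    ({"flow_length": 60.0, "thickness": 2.0, "material": "HIPS"},
     {"melt_temp_C": 220.0, "injection_pressure_MPa": 90.0}),
    ({"flow_length": 90.0, "thickness": 3.5, "material": "Acrylic"},
     {"melt_temp_C": 240.0, "injection_pressure_MPa": 110.0}),
]

WEIGHTS = {"flow_length": 0.5, "thickness": 0.3, "material": 0.2}

def similarity(query, case):
    # Numeric attributes normalized by the HES working ranges
    # (40-100 mm flow length, 1-4 mm thickness); the symbolic
    # material attribute is scored by exact match.
    s = WEIGHTS["flow_length"] * (1 - abs(query["flow_length"] - case["flow_length"]) / 60.0)
    s += WEIGHTS["thickness"] * (1 - abs(query["thickness"] - case["thickness"]) / 3.0)
    s += WEIGHTS["material"] * (1.0 if query["material"] == case["material"] else 0.0)
    return s

def retrieve(query):
    # Return the solution of the most similar stored case; adaptation
    # rules would then adjust these parameters to the new problem.
    problem, solution = max(case_library, key=lambda c: similarity(query, c[0]))
    return solution

params = retrieve({"flow_length": 70.0, "thickness": 2.5, "material": "HIPS"})
```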

Relevance:

30.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission and storage costs. This decomposition structure is based on an analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In the lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
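The generalized Gaussian fit mentioned above is often done by matching moments of the subband coefficients. The sketch below uses the standard moment-ratio method rather than the thesis's least-squares estimator, so it illustrates the modelling step without reproducing it.

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_shape(coeffs):
    """Estimate the generalized Gaussian shape parameter of a set of
    wavelet coefficients by matching the ratio E[|x|]^2 / E[x^2]."""
    x = np.asarray(coeffs, dtype=float)
    r = np.mean(np.abs(x)) ** 2 / np.mean(x ** 2)
    # Theoretical value of the same ratio as a function of the shape nu
    rho = lambda nu: gamma(2.0 / nu) ** 2 / (gamma(1.0 / nu) * gamma(3.0 / nu))
    # rho is monotonic in nu, so a bracketing root-finder suffices
    return brentq(lambda nu: rho(nu) - r, 0.05, 10.0)

# Sanity check: Laplacian data (true shape = 1) should estimate near 1
print(ggd_shape(np.random.laplace(size=100_000)))
```

A shape parameter of 2 recovers the Gaussian and 1 the Laplacian; wavelet detail subbands of natural images typically fit values of 1 or below, which is what motivates a distribution-aware quantizer.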

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a study of the mechanical properties of thin films. The main aim was to determine the properties of sol-gel derived coatings. These films are used in a range of different applications and are known to be quite porous. Very little work has been carried out in this area, and in order to study the mechanical properties of sol-gel films, some of the work was carried out on magnetron sputtered metal coatings to validate the techniques developed in this work. The main part of the work has concentrated on the development of various bending techniques to study the elastic modulus of thin films, including both small-scale three-point bending and a novel bi-axial bending technique based on a disk resting on three supporting balls. The bending techniques involve a load being applied to the sample under test and the bending response to this force being recorded. These experiments were carried out using an ultra-micro indentation system with very sensitive force and depth recording capabilities. By analysing the resulting forces and deflections using existing theories of elasticity, the elastic modulus may be determined. In addition to the bi-axial bending study, a finite element analysis of the stress distribution in a disk during bending was carried out. The results from the bi-axial bending tests of the magnetron sputtered films were confirmed by ultra-micro indentation tests, giving information on the hardness and elastic modulus of the films. It was found that while the three-point bending method gave acceptable results for uncoated steel substrates, it was very susceptible to slight deformations of the substrate. Improvements were made by more careful preparation of the substrates in order to avoid deformation; however, the technique still failed to give reasonable results for coated specimens. In contrast, bi-axial bending gave very reliable results even for very thin films, and this technique was also found to be useful for determining the properties of sol-gel coatings. In addition, an ultra-micro indentation study of the hardness and elastic modulus of sol-gel films was conducted. This study included conventionally fired films as well as films ion implanted at a range of doses. The indentation tests showed that for implantation of H+ ions at doses exceeding 3×10^16 ions/cm², the mechanical properties closely resembled those of films conventionally fired at 450°C.
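For the uncoated substrates, the three-point bending analysis comes down to standard small-deflection beam theory. The sketch below is that textbook relation for a rectangular cross-section, not the thesis's analysis of coated specimens, and the example numbers are invented.

```python
def elastic_modulus_3pb(load, deflection, span, width, thickness):
    """Elastic modulus (Pa) of a rectangular beam in three-point bending.

    From delta = F L^3 / (48 E I) with I = w t^3 / 12, so
    E = F L^3 / (4 w t^3 delta). Load in N, lengths in m.
    In practice load/deflection is taken as the slope of the
    force-displacement curve recorded by the indentation system.
    """
    return load * span**3 / (4.0 * width * thickness**3 * deflection)

# Illustrative values: 50 mN load, 0.8 um deflection, 10 x 5 x 0.5 mm beam
E = elastic_modulus_3pb(50e-3, 0.8e-6, 10e-3, 5e-3, 0.5e-3)
```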

Relevance:

30.00%

Publisher:

Abstract:

Obesity represents a major health, social and economic burden to many developing and Westernized communities, with the prevalence increasing at a rate exceeding almost all other medical conditions. Despite major recent advances in our understanding of adipose tissue metabolism and dynamics, we still have limited insight into the regulation of adipose tissue mass in humans. Any significant increase in adipose tissue mass requires proliferation and differentiation of precursor cells (preadipocytes) present in the stromo-vascular compartment of adipose tissue. These processes are very complex, and an increasing number of growth factors and hormones have been shown to modulate the expression of genes involved in preadipocyte proliferation and differentiation. A number of transcription factors, including the C/EBP family and PPARγ, have been identified as integral to adipose tissue development and preadipocyte differentiation. Together PPARγ and C/EBPα regulate important events in the activation and maintenance of the terminally differentiated phenotype. The ability of PPARγ to increase transcription through its DNA recognition site is dependent on the binding of ligands. This suggests that an endogenous PPARγ ligand may be an important regulator of adipogenesis. Adipose tissue functions both as the major site of energy storage in the body and as an endocrine organ synthesizing and secreting a number of important molecules involved in the regulation of energy balance. For optimum functioning, therefore, adipose tissue requires extensive vascularization, and previous studies have shown that growth of adipose tissue is preceded by the development of a microvascular network. This suggests that paracrine interactions between constituent cells in adipose tissue may be involved in both new capillary formation and fat cell growth. To address this hypothesis, the work in this project was aimed at (a) further developing a method for inducing preadipocyte differentiation in subcultured human cells; (b) establishing a method for simultaneous isolation and separate culture of both preadipocytes and microvascular endothelial cells from the same adipose tissue biopsies; (c) determining, using conditioned medium and co-culture techniques, whether endothelial cell-derived factors influence the proliferation and/or differentiation of human preadipocytes; and (d) commencing characterization of factors that may be responsible for any observed paracrine effects on aspects of human adipogenesis. The major findings of these studies were as follows: (A) Inclusion of either linoleic acid (a long-chain fatty acid reported to be a naturally occurring ligand for PPARγ) or Rosiglitazone (a member of the thiazolidinedione class of insulin-sensitizing drugs and a synthetic PPARγ ligand) in differentiation medium had markedly different effects on preadipocyte differentiation. These studies showed that human preadipocytes have the potential to accumulate triacylglycerol irrespective of their stage of biochemical differentiation, and that thiazolidinediones and fatty acids may exert their adipogenic and lipogenic effects via different biochemical pathways. It was concluded that Rosiglitazone is a more potent inducer of human preadipocyte differentiation than linoleic acid. (B) A method for isolation and culture of both endothelial cells and preadipocytes from the same adipose tissue biopsy was developed.
Adipose-derived microvascular endothelial cells were found to produce factor/s which enhance both proliferation and differentiation of human preadipocytes. (C) The adipogenic effects of microvascular endothelial cells can be mimicked by exposure of preadipocytes to members of the Fibroblast Growth Factor family, specifically β-ECGF and FGF-1. (D) Co-culture of human preadipocytes with endothelial cells, or exposure of preadipocytes to either β-ECGF or FGF-1, was found to 'prime' human preadipocytes, during their proliferative phase of growth, for thiazolidinedione-induced differentiation. (E) FGF-1 was not found to be acting as a ligand for PPARγ in this system. Findings from this project represent a significant step forward in our understanding of factors involved in the growth of human adipose tissue and may lead to the development of therapeutic strategies aimed at modifying the process. Such strategies would have potential clinical utility in the treatment of obesity and obesity-related disorders such as Type II Diabetes.

Relevance:

30.00%

Publisher:

Abstract:

The numerical modelling of electromagnetic waves has been the focus of much past research. Specific applications of electromagnetic wave scattering include microwave heating and radar communication systems. The equations that govern the fundamental behaviour of electromagnetic wave propagation in waveguides and cavities are Maxwell's equations. In the literature, a number of methods have been employed to solve these equations. Of these, the classical Finite-Difference Time-Domain (FDTD) scheme, which uses a staggered time and space discretisation, is the best known and most widely used. However, it is complicated to implement this method on an irregular computational domain using an unstructured mesh. In this work, a coupled method is introduced for the solution of Maxwell's equations. It is proposed that the free-space component of the solution be computed in the time domain, whilst the load is resolved using the frequency-dependent electric field Helmholtz equation. This methodology results in a time-frequency domain hybrid scheme. For the Helmholtz equation, boundary conditions are generated from the time-dependent free-space solutions. The boundary information is mapped into the frequency domain using the Discrete Fourier Transform. The solution for the electric field components is obtained by solving a sparse complex system of linear equations. The hybrid method has been tested for both waveguide and cavity configurations. Numerical tests performed on waveguides and cavities with inhomogeneous lossy materials highlight the accuracy and computational efficiency of the newly proposed hybrid computational electromagnetic strategy.
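For orientation, the staggered FDTD (Yee) update that such a hybrid scheme retains for the free-space region can be written in a few lines. This is a minimal 1D vacuum example in normalized units, not the thesis's coupled solver.

```python
import numpy as np

nx, nt = 200, 500
ez = np.zeros(nx)        # electric field at integer grid points
hy = np.zeros(nx - 1)    # magnetic field staggered half a cell away
S = 0.5                  # Courant number dt/dx; must be <= 1 in 1D

for n in range(nt):
    hy += S * (ez[1:] - ez[:-1])            # half-step H update
    ez[1:-1] += S * (hy[1:] - hy[:-1])      # half-step-later E update
    ez[nx // 4] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source
```

In the hybrid method the field time histories recorded on the load boundary would then be mapped into the frequency domain (e.g. with numpy.fft.fft) to supply boundary conditions for the Helmholtz solve.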

Relevance:

30.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments; the results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence indicate its potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral;
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
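The two-component decomposition that underlies DEXA can be stated compactly: the measured log-attenuations at two energies are linear in the areal densities of the two basis materials, giving a 2×2 system. The mass attenuation coefficients below are placeholder values for illustration, not measured data.

```python
import numpy as np

# Rows: low and high energy; columns: bone mineral, soft tissue.
# Placeholder mass attenuation coefficients (cm^2/g).
mu = np.array([[0.60, 0.25],
               [0.30, 0.20]])

def decompose(log_atten_low, log_atten_high):
    """Solve ln(I0/I) = mu_bone * t_bone + mu_soft * t_soft at two
    energies for the areal densities t (g/cm^2) of the two components."""
    y = np.array([log_atten_low, log_atten_high])
    return np.linalg.solve(mu, y)

t_bone, t_soft = decompose(1.2, 0.7)
```

The DPA(+) idea extends this to three unknowns (bone mineral, fat, lean tissue) by adding the measured path length along the ray as a third equation, which is why a third component becomes recoverable.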

Relevance:

30.00%

Publisher:

Abstract:

In the field of tissue engineering, new polymers are needed to fabricate scaffolds with specific properties depending on the targeted tissue. This work aimed at designing and developing a 3D scaffold with variable mechanical strength, a fully interconnected porous network, and controllable hydrophilicity and degradability. For this, a desktop-robot-based melt-extrusion rapid prototyping technique was applied to a novel tri-block co-polymer, namely poly(ethylene glycol)-block-poly(ε-caprolactone)-block-poly(DL-lactide), PEG-PCL-P(DL)LA. This co-polymer was melted by electrical heating and directly extruded using computer-controlled rapid prototyping, by means of compressed purified air, to build porous scaffolds. Various lay-down patterns (0/30/60/90/120/150°, 0/45/90/135°, 0/60/120° and 0/90°) were produced by appropriate positioning of the robotic control system. Scanning electron microscopy and micro-computed tomography were used to show that the 3D scaffold architectures were honeycomb-like, with completely interconnected and controlled channel characteristics. Compression tests were performed, and the data obtained agreed well with the typical behaviour of a porous material undergoing deformation. Preliminary cell response to the as-fabricated scaffolds has been studied with primary human fibroblasts. The results demonstrated the suitability of the process and the cell biocompatibility of the polymer, two important properties among the many required for effective clinical use and efficient tissue-engineering scaffolding.

Relevance:

30.00%

Publisher:

Abstract:

The flying capacitor multilevel inverter (FCMLI) is a multiple voltage level inverter topology intended for high-power and high-voltage operation at low distortion. It uses capacitors, called flying capacitors, to clamp the voltage across the power semiconductor devices. A method for controlling the FCMLI is proposed which ensures that the flying capacitor voltages remain nearly constant through preferential charging and discharging of these capacitors. A static synchronous compensator (STATCOM) and a static synchronous series compensator (SSSC) based on five-level flying capacitor inverters are proposed. Control schemes for both FACTS controllers are developed and verified in terms of voltage control, power flow control, and power oscillation damping when installed in a single-machine infinite bus (SMIB) system. Simulation studies are performed using PSCAD/EMTDC to validate the efficacy of the control scheme and the FCMLI-based flexible alternating current transmission system (FACTS) controllers.
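The preferential charging/discharging idea can be sketched as a search over redundant switch states: every state with the same number of closed upper switches gives the same nominal output level, but routes the load current through different flying capacitors. The sketch below is schematic (sign conventions and per-phase details simplified), not the thesis's controller.

```python
from itertools import product

def pick_state(level, vc, vc_nom, i_out):
    """Pick, among the redundant states of an n-cell flying-capacitor leg,
    the one that best drives the capacitor voltages toward nominal.

    level  : desired output level = number of closed upper switches (0..n)
    vc     : measured flying-capacitor voltages [V_1 .. V_{n-1}]
    vc_nom : their nominal values (k * Vdc / n)
    i_out  : load current; its sign decides charging vs discharging
    """
    n = len(vc) + 1
    best_state, best_score = None, None
    for s in product((0, 1), repeat=n):
        if sum(s) != level:
            continue                     # not a redundant state for this level
        score = 0.0
        for k in range(n - 1):
            i_ck = (s[k + 1] - s[k]) * i_out     # schematic capacitor current
            score += i_ck * (vc_nom[k] - vc[k])  # > 0 if it shrinks the error
        if best_score is None or score > best_score:
            best_state, best_score = s, score
    return best_state
```

Choosing the state each switching period in this way is what keeps the flying-capacitor voltages nearly constant without a separate balancing circuit.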

Relevance:

30.00%

Publisher:

Abstract:

Establishing a nationwide Electronic Health Record (EHR) system has become a primary objective for many countries around the world, including Australia, in order to improve the quality of healthcare while at the same time decreasing its cost. Doing so will require federating the large number of patient data repositories currently in use throughout the country. However, implementation of EHR systems is being hindered by several obstacles, among them concerns about data privacy and trustworthiness. Current IT solutions fail to satisfy patients' privacy desires and do not provide a trustworthiness measure for medical data. This thesis starts with the observation that existing EHR system proposals suffer from six serious shortcomings that affect patients' privacy and safety, and medical practitioners' trust in EHR data: accuracy and privacy concerns over linking patients' existing medical records; the inability of patients to control who accesses their private data; the inability to protect against inferences about patients' sensitive data; the lack of a mechanism for evaluating the trustworthiness of medical data; and the failure of current healthcare workflow processes to capture and enforce patients' privacy desires. Following an action research method, this thesis addresses the above shortcomings by firstly proposing an architecture for linking electronic medical records in an accurate and private way, where patients are given control over what information can be revealed about them. This is accomplished by extending the structure and protocols introduced in federated identity management to link a patient's EHR to his existing medical records using pseudonym identifiers. Secondly, a privacy-aware access control model is developed to satisfy patients' privacy requirements. The model is developed by integrating three standard access control models in a way that gives patients access control over their private data and ensures that legitimate uses of EHRs are not hindered. Thirdly, a probabilistic approach for detecting and restricting inference channels resulting from publicly available medical data is developed, to guard against indirect accesses to a patient's private data. This approach is based upon a Bayesian network and the causal probabilistic relations that exist between medical data fields. The resulting definitions and algorithms show how an inference channel can be detected and restricted to satisfy patients' expressed privacy goals. Fourthly, a medical data trustworthiness assessment model is developed to evaluate the quality of medical data by assessing the trustworthiness of its sources (e.g. a healthcare provider or medical practitioner). In this model, Beta and Dirichlet reputation systems are used to collect reputation scores about medical data sources, and these are used to compute the trustworthiness of medical data via subjective logic. Finally, an extension is made to healthcare workflow management processes to capture and enforce patients' privacy policies. This is accomplished by developing a conceptual model that introduces new workflow notions to make the workflow management system aware of a patient's privacy requirements. These extensions are then implemented in the YAWL workflow management system.
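The Beta reputation component of such a trustworthiness model can be summarized in one formula: the expected trustworthiness of a source after r positive and s negative ratings. The sketch below is the generic Jøsang-style Beta reputation expectation, not the thesis's full subjective-logic computation.

```python
def beta_reputation(r, s, base_rate=0.5, prior_weight=2.0):
    """Expected trustworthiness of a medical data source given r positive
    and s negative past ratings, under a Beta reputation system:
    E = (r + base_rate * W) / (r + s + W), with W the prior weight."""
    return (r + base_rate * prior_weight) / (r + s + prior_weight)

# A provider with 18 positive and 2 negative ratings scores 19/22 ~ 0.86,
# while an unrated provider falls back to the base rate of 0.5.
score = beta_reputation(18, 2)
```

The Dirichlet system generalizes the same expectation to more than two rating levels, which allows graded rather than binary feedback about data sources.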

Relevance:

30.00%

Publisher:

Abstract:

Introduction: The purpose of this study was to assess the capacity of a written intervention, in this case a patient information brochure, to improve patient satisfaction during an Emergency Department (ED) visit. For the purpose of measuring the effect of the intervention, the ED journey was conceptualised as a series of distinct areas of service comprising waiting time, service by the triage nurse, care from doctors and nurses, and information giving. Background of study: Research into patient satisfaction has become a widespread activity endorsed by both governments and hospital administrations. The literature on ED patient satisfaction has consistently indicated three primary areas of patient dissatisfaction: waiting time, nursing care and communication. Recent developments in the literature on patient satisfaction studies, however, have highlighted the relationship between patients' expectations of a service encounter and their consequent assessment of the experience as dissatisfying or satisfying. Disconfirmation theory posits that the degree to which expectations are confirmed will affect subsequent levels of satisfaction. The conceptual framework utilised in this study is Coye's (2004) model of disconfirmation. Coye, while reiterating that satisfaction is a consequence of the degree to which expectations are either confirmed or disconfirmed, also posits that expectations can be modified by interventions. Coye's work conceptualises these interventions as intra-encounter experiences (cues) which function to adjust expectations. Coye suggests some cues are unintended and may have a negative impact, which also reinforces the value of planned cues intended to meet or exceed consumer expectations. Consequently the brochure can be characterised as a potentially positive cue, encouraging patients to understand processes and orienting them in what can be a confronting environment. Only a limited number of studies have examined the effect of written interventions within an ED. No studies could be located which have tested the effect of ED interventions using a conceptual framework that relates satisfaction with services to the degree to which expectations are confirmed or disconfirmed. Method: Two studies were conducted. Study One used qualitative methods to explore patients' expectations of the ED from the perspective of both patients and health care professionals. Study One was used in part to direct the development of the intervention (brochure) in Study Two. The brochure was an intervention designed to modify patients' expectations, thus increasing their satisfaction with the provision of ED service. As there were no existing tools to measure ED patients' expectations and satisfaction, a new tool was also developed based on the findings of Study One and the literature. Study Two used a non-randomised, quasi-experimental approach with a non-equivalent, post-test only comparison group design to investigate the effect of the patient education brochure (Stommel and Wills, 2004). The brochure was disseminated to one of two study groups (the intervention group). The effect of the brochure was assessed by comparing the data obtained from the intervention and control groups, each consisting of 150 participants. It was expected that any differences in the relevant domains selected for examination would indicate the effect of the brochure on expectations and potentially on satisfaction.
Results: Study One revealed several areas of common ground between patients and nurses in terms of relevant content for the written intervention, including the need for information on the triage system and waiting times. Areas of difference were also found, with patients emphasising communication issues, whereas focus group members expressed concern that patients were often unable to assimilate verbal information. The findings suggested the potential utility of written material to reinforce verbal communication, particularly in terms of the triage process and other ED protocols. This material was synthesised within the final version of the written intervention. Overall the results of Study Two indicated no significant differences between the two groups. The intervention group did, however, contain a significant number of participants who viewed the brochure as having changed their expectations. The effect of the brochure may have been obscured by a lack of parity between the two groups, as the control group presented with statistically significantly higher levels of acuity and experienced significantly shorter waiting times. In terms of disconfirmation theory this would suggest expectations that had been met or exceeded. The results confirmed the correlation of expectations with satisfaction. Several domains also indicated age as a significant predictor, with older patients tending to score higher satisfaction results. Other significant predictors of satisfaction established were waiting time and care from nurses, reinforcing the combination of efficient service and positive interpersonal experiences as being valued by patients. Conclusions: Information presented in written form appears to benefit a significant number of ED users in terms of orientation and explaining systems and procedures. The degree to which these effects may interact with other dimensions of satisfaction, however, is likely to be limited. Waiting time and interpersonal behaviours from staff also provide influential cues in determining satisfaction. Written material is likely to be one element in a series of coordinated strategies to improve patient satisfaction during periods of peak demand.

Relevance:

30.00%

Publisher:

Abstract:

Camera calibration information is required in order for multiple-camera networks to deliver more than the sum of many single-camera systems. Methods exist for manually calibrating cameras with high accuracy. Manually calibrating networks with many cameras is, however, time consuming, expensive and impractical for networks that undergo frequent change. For this reason, automatic calibration techniques have been vigorously researched in recent years. Fully automatic calibration methods depend on the ability to automatically find point correspondences between overlapping views. In typical camera networks, cameras are placed far apart to maximise coverage; this is referred to as a wide baseline scenario. Finding sufficient correspondences for camera calibration in wide baseline scenarios presents a significant challenge. This thesis focuses on developing more effective and efficient techniques for finding correspondences in uncalibrated, wide baseline, multiple-camera scenarios. The project consists of two major areas of work. The first is the development of more effective and efficient view-covariant local feature extractors. The second involves finding methods to extract scene information from the information contained in a limited set of matched affine features. Several novel affine adaptation techniques for salient features have been developed. A method is presented for efficiently computing the discrete scale space primal sketch of local image features, and a scale selection method was implemented that makes use of the primal sketch. The primal sketch-based scale selection method has several advantages over existing methods: it allows greater freedom in how the scale space is sampled, enables more accurate scale selection, is more effective at combining different functions for spatial position and scale selection, and leads to greater computational efficiency. Existing affine adaptation methods make use of the second moment matrix to estimate the local affine shape of local image features. In this thesis, it is shown that the Hessian matrix can be used in a similar way to estimate local feature shape. The Hessian matrix is effective for estimating the shape of blob-like structures, but is less effective for corner structures. It is simpler to compute than the second moment matrix, leading to a significant reduction in computational cost. A wide baseline dense correspondence extraction system, called WiDense, is presented in this thesis. It allows the extraction of large numbers of additional accurate correspondences, given only a few initial putative correspondences. It consists of the following algorithms: an affine region alignment algorithm that ensures accurate alignment between matched features; a method for extracting more matches in the vicinity of a matched pair of affine features, using the alignment information contained in the match; and an algorithm for extracting large numbers of highly accurate point correspondences from an aligned pair of feature regions. Experiments show that the correspondences generated by the WiDense system improve the success rate of computing the epipolar geometry of very widely separated views. This new method is successful in many cases where the features produced by the best wide baseline matching algorithms are insufficient for computing the scene geometry.
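The Hessian-based shape estimation mentioned above amounts to reading off second derivatives of the smoothed image at a feature point. The sketch below uses Gaussian derivative filters from scipy; the scale value is illustrative, and computing whole filtered images just to sample one point is done for clarity, not efficiency.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_at(img, x, y, sigma=2.0):
    """2x2 Hessian of image intensity at pixel (x, y), from Gaussian
    second derivatives at scale sigma (img indexed as img[row, col])."""
    Lxx = gaussian_filter(img, sigma, order=(0, 2))[y, x]  # d2/dx2
    Lyy = gaussian_filter(img, sigma, order=(2, 0))[y, x]  # d2/dy2
    Lxy = gaussian_filter(img, sigma, order=(1, 1))[y, x]  # d2/dxdy
    return np.array([[Lxx, Lxy],
                     [Lxy, Lyy]])

# For a blob-like feature det(H) > 0, and the eigenvectors/eigenvalues of H
# give the orientation and elongation used to seed affine adaptation,
# playing the role the second moment matrix plays in existing methods.
```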