50 results for Minimum Entropy Deconvolution


Relevance: 20.00%

Publisher:

Abstract:

Background The preservation of meniscal tissue is important to protect joint surfaces. Purpose We have an aggressive approach to meniscal repair, including repairing tears other than those classically suited to repair. Here we present the medium- to long-term outcome of meniscal repair (inside-out) in elite athletes. Study Design Case series; Level of evidence, 4. Methods Forty-two elite athletes underwent 45 meniscal repairs. All repairs were performed using an arthroscopically assisted inside-out technique. Eighty-three percent of these athletes had ACL reconstruction at the same time. Patients returned a completed questionnaire (including Lysholm and International Knee Documentation Committee [IKDC] scores). Mean follow-up was 8.5 years. Failure was defined by patients developing symptoms of joint line pain and/or locking or swelling requiring repeat arthroscopy and partial meniscectomy. Results The average Lysholm and subjective IKDC scores were 89.6 and 85.4, respectively. Eighty-one percent of patients returned to their main sport, and most to a similar level, at a mean time of 10.4 months after repair, reflecting the high level of ACL reconstruction in this group. We identified 11 definite failures, 10 medial and 1 lateral meniscus, that required excision; this represents a 24% failure rate. We identified 1 further patient with a possible failed repair, giving a worst-case failure rate of 26.7% at a mean of 42 months after surgery. However, 7 of these failures were associated with a further injury. Therefore, the atraumatic failure rate was 11%. Age, size, and location of the tears were not associated with a higher failure rate. Medial meniscal repairs were significantly more likely to fail than lateral meniscal repairs, with failure rates of 36.4% and 5.6%, respectively (P < .05). Conclusion Meniscal repair and healing are possible, and most elite athletes can return to their preinjury level of activity.

Relevance: 20.00%

Publisher:

Abstract:

Uninhabited aerial vehicles (UAVs) are a cutting-edge technology at the forefront of aviation/aerospace research and development worldwide. Many consider their current military and defence applications as just a token of their enormous potential. Unlocking and fully exploiting this potential will see UAVs in a multitude of civilian applications and routinely operating alongside piloted aircraft. The key to realising the full potential of UAVs lies in addressing a host of regulatory, public relations, and technological challenges never encountered before. Aircraft collision avoidance is considered to be one of the most important issues to be addressed, given its safety-critical nature. The collision avoidance problem can be roughly organised into three areas: 1) Sense; 2) Detect; and 3) Avoid. Sensing is concerned with obtaining accurate and reliable information about other aircraft in the air; detection involves identifying potential collision threats based on available information; avoidance deals with the formulation and execution of appropriate manoeuvres to maintain safe separation. This thesis tackles the detection aspect of collision avoidance, via the development of a target detection algorithm that is capable of real-time operation onboard a UAV platform. One of the key challenges of the detection problem is the need to provide early warning. This translates to detecting potential threats whilst they are still far away, when their presence is likely to be obscured and hidden by noise. Another important consideration is the choice of sensors to capture target information, which has implications for the design and practical implementation of the detection algorithm.
The main contributions of the thesis are: 1) the proposal of a dim target detection algorithm combining image morphology and hidden Markov model (HMM) filtering approaches; 2) the novel use of relative entropy rate (RER) concepts for HMM filter design; 3) the characterisation of algorithm detection performance based on simulated data as well as real in-flight target image data; and 4) the demonstration of the proposed algorithm's capacity for real-time target detection. We also consider the extension of HMM filtering techniques and the application of RER concepts for target heading angle estimation. In this thesis we propose a computer-vision based detection solution, due to the commercial-off-the-shelf (COTS) availability of camera hardware and the hardware's relatively low cost, power, and size requirements. The proposed target detection algorithm adopts a two-stage processing paradigm that begins with an image enhancement pre-processing stage followed by a track-before-detect (TBD) temporal processing stage that has been shown to be effective in dim target detection. We compare the performance of two candidate morphological filters for the image pre-processing stage, and propose a multiple hidden Markov model (MHMM) filter for the TBD temporal processing stage. The role of the morphological pre-processing stage is to exploit the spatial features of potential collision threats, while the MHMM filter serves to exploit the temporal characteristics or dynamics. The problem of optimising our proposed MHMM filter has been examined in detail. Our investigation has produced a novel design process for the MHMM filter that exploits information theory and entropy related concepts. The filter design process is posed as a mini-max optimisation problem based on a joint RER cost criterion. 
We provide proof that this joint RER cost criterion provides a bound on the conditional mean estimate (CME) performance of our MHMM filter, and this in turn establishes a strong theoretical basis connecting our filter design process to filter performance. Through this connection we can intelligently compare and optimise candidate filter models at the design stage, rather than having to resort to time-consuming Monte Carlo simulations to gauge the relative performance of candidate designs. Moreover, the underlying entropy concepts are not constrained to any particular model type. This suggests that the RER concepts established here may be generalised to provide a useful design criterion for multiple model filtering approaches outside the class of HMM filters. In this thesis we also evaluate the performance of our proposed target detection algorithm under realistic operating conditions, and give consideration to the practical deployment of the detection algorithm onboard a UAV platform. Two fixed-wing UAVs were engaged to recreate various collision-course scenarios to capture highly realistic vision (from an onboard camera perspective) of the moments leading up to a collision. Based on this collected data, our proposed detection approach was able to detect targets out to distances ranging from about 400 m to 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to advance warning ahead of impact that approaches the 12.5-second response time recommended for human pilots. Furthermore, readily available graphics processing unit (GPU) based hardware is exploited for its parallel computing capabilities to demonstrate the practical feasibility of the proposed target detection algorithm. A prototype hardware-in-the-loop system has been found to be capable of achieving data processing rates sufficient for real-time operation. There is also scope for further improvement in performance through code optimisations.
Overall, our proposed image-based target detection algorithm offers UAVs a cost-effective real-time target detection capability that is a step forward in addressing the collision avoidance issue that is currently one of the most significant obstacles preventing widespread civilian applications of uninhabited aircraft. We also highlight that the algorithm development process has led to the discovery of a powerful multiple HMM filtering approach and a novel RER-based multiple filter design process. The utility of our multiple HMM filtering approach and RER concepts, however, extends beyond the target detection problem. This is demonstrated by our application of HMM filters and RER concepts to a heading angle estimation problem.
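The two-stage paradigm described above (spatial morphological enhancement followed by temporal track-before-detect accumulation) can be illustrated with a minimal sketch. The grey-scale close-minus-open filter and the per-pixel score accumulator below are simple illustrative stand-ins for, not reproductions of, the thesis's candidate morphological filters and MHMM filter:

```python
# Illustrative two-stage dim-target detection sketch: a morphological
# close-minus-open enhancement stage followed by a naive temporal
# score-accumulation stage (stand-in for HMM track-before-detect).

def erode(img, r=1):
    h, w = len(img), len(img[0])
    return [[min(img[y][x]
                 for y in range(max(0, i - r), min(h, i + r + 1))
                 for x in range(max(0, j - r), min(w, j + r + 1)))
             for j in range(w)] for i in range(h)]

def dilate(img, r=1):
    h, w = len(img), len(img[0])
    return [[max(img[y][x]
                 for y in range(max(0, i - r), min(h, i + r + 1))
                 for x in range(max(0, j - r), min(w, j + r + 1)))
             for j in range(w)] for i in range(h)]

def close_minus_open(img):
    # Closing minus opening emphasises small bright features
    # against a smooth background.
    closing = erode(dilate(img))
    opening = dilate(erode(img))
    return [[c - o for c, o in zip(crow, orow)]
            for crow, orow in zip(closing, opening)]

def detect(frames, threshold):
    # Stage 2 (temporal): accumulate enhanced evidence per pixel across
    # frames and declare detections where the score crosses the threshold.
    h, w = len(frames[0]), len(frames[0][0])
    score = [[0.0] * w for _ in range(h)]
    for frame in frames:
        enhanced = close_minus_open(frame)
        for i in range(h):
            for j in range(w):
                score[i][j] += enhanced[i][j]
    return [(i, j) for i in range(h) for j in range(w)
            if score[i][j] >= threshold]
```

Accumulating over frames is what lets a target too dim to declare in any single frame emerge above the threshold, which is the essence of track-before-detect.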

Relevance: 20.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project these onto the lattice outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
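The generalized Gaussian modelling of wavelet coefficients mentioned above can be sketched with a simpler, well-known stand-in for the thesis's least squares estimator: moment matching on the ratio (E|X|)^2 / E[X^2], which is a monotone function of the shape parameter and so can be inverted by bisection. This is an illustrative alternative, not the thesis's algorithm:

```python
import math
import random

def ggd_ratio(nu):
    # Theoretical value of (E|X|)^2 / E[X^2] for a generalized Gaussian
    # with shape parameter nu (nu = 2 is Gaussian, nu = 1 is Laplacian).
    return math.gamma(2.0 / nu) ** 2 / (math.gamma(1.0 / nu) * math.gamma(3.0 / nu))

def estimate_shape(samples, lo=0.1, hi=10.0, iters=60):
    # Match the empirical moment ratio to the theoretical one.
    m1 = sum(abs(x) for x in samples) / len(samples)
    m2 = sum(x * x for x in samples) / len(samples)
    target = m1 * m1 / m2
    # ggd_ratio is monotone increasing in nu, so bisection converges.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check on synthetic Gaussian data (true shape parameter = 2).
random.seed(1)
nu_hat = estimate_shape([random.gauss(0.0, 1.0) for _ in range(20000)])
```

Once the shape parameter is fitted per subband, the distribution model can drive the bit-allocation and quantizer design stages described in the abstract.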

Relevance: 20.00%

Publisher:

Abstract:

This paper argues for a model of adaptive design for sustainable architecture within a framework of entropy evolution. The spectrum of sustainable architecture consists of the efficient use of energy and material resources over the life-cycle of buildings, the active involvement of occupants in micro-climate control within buildings, and the natural environment as the physical context. The interactions amongst all these parameters compose a complex system of sustainable architectural design, for which conventional linear and fragmented design technologies are insufficient to indicate holistic and ongoing environmental performance. The latest interpretation of the Second Law of Thermodynamics states a microscopic formulation of the entropy evolution of complex open systems. It provides a design framework for complex open systems in which an adaptive system evolves to optimise building environmental performance. The paper concludes that adaptive modelling in entropy evolution is a design alternative for sustainable architecture.

Relevance: 20.00%

Publisher:

Abstract:

Among the many factors that influence enforcement agencies, this article examines the role of the institutional location (and independence) of agencies, and an incumbent government's ideology. It is argued that institutional location affects the level of political influence on the agency's operations, while government ideology affects its willingness to resource enforcement agencies and approve regulatory activities. Evidence from the agency regulating minimum labour standards in the Australian federal industrial relations jurisdiction (currently the Fair Work Ombudsman) highlights two divergences from the regulatory enforcement literature generally. First, notions of independence from political interference offered by institutional location are more illusory than real and, second, political need motivates political action to a greater extent than political ideology.

Relevance: 20.00%

Publisher:

Abstract:

The complex transition from convict to free labour influenced state intervention in the employment relationship, and initiated the first minimum labour standards in Australia in 1828. Since then, two principal sets of tensions have affected the enforcement of such standards: tensions between government and employers, and tensions between the major political parties over industrial and economic issues. This article argues that these tensions have resulted in a sustained legacy affecting minimum labour standards’ enforcement in Australia. The article outlines broad historical developments and contexts of minimum labour standards’ enforcement in Australia since 1828, with more contemporary exploration focusing specifically on enforcement practices and policies in the Australian federal industrial relations jurisdiction. Current enforcement practices are an outcome of this volatile history, and past influences remain strong.

Relevance: 20.00%

Publisher:

Abstract:

This paper establishes practical stability results for an important range of approximate discrete-time filtering problems involving mismatch between the true system and the approximating filter model. Under a local consistency assumption, the practical stability established is in the sense of an asymptotic bound on the amount of bias introduced by the model approximation. Significantly, these practical stability results do not require the approximating model to be of the same model type as the true system. Our analysis applies to a wide range of estimation problems and justifies the common practice of approximating intractable infinite-dimensional nonlinear filters by simpler, computationally tractable filters.

Relevance: 20.00%

Publisher:

Abstract:

Hybrid system representations have been applied to many challenging modeling situations. In these hybrid system representations, a mixture of continuous and discrete states is used to capture the dominating behavioural features of a nonlinear, possibly uncertain, model under approximation. Unfortunately, the problem of how best to design a suitable hybrid system model has not yet been fully addressed. This paper proposes a new joint state-measurement relative entropy rate based approach for this design purpose. Design examples and simulation studies are presented which highlight the benefits of our proposed design approach.
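The basic quantity behind this design criterion, the relative entropy rate between two finite-state Markov chains, can be sketched as a stationary-weighted average of per-state KL divergences between transition rows. This illustrates the underlying rate only, not the paper's full joint state-measurement criterion:

```python
import math

def stationary(P, iters=200):
    # Power iteration for the stationary distribution of transition matrix P.
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def relative_entropy_rate(P, Q):
    # RER(P || Q) = sum_i pi_i * sum_j P[i][j] * log(P[i][j] / Q[i][j]):
    # the KL divergence of each row of P from the matching row of Q,
    # averaged under P's stationary distribution.
    pi = stationary(P)
    n = len(P)
    return sum(pi[i] * sum(P[i][j] * math.log(P[i][j] / Q[i][j])
                           for j in range(n) if P[i][j] > 0)
               for i in range(n))
```

A candidate approximating model Q with a smaller rate against the true model P is, in this information-theoretic sense, the closer approximation.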

Relevance: 20.00%

Publisher:

Abstract:

We consider a robust filtering problem for uncertain discrete-time, homogeneous, first-order, finite-state hidden Markov models (HMMs). The class of uncertain HMMs considered is described by a conditional relative entropy constraint on measures perturbed from a nominal regular conditional probability distribution given the previous posterior state distribution and the latest measurement. Under this class of perturbations, a robust infinite horizon filtering problem is first formulated as a constrained optimization problem before being transformed via variational results into an unconstrained optimization problem; the latter can be elegantly solved using risk-sensitive, information-state based filtering.
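For reference, the nominal (non-robust) HMM information-state recursion that the conditional relative entropy constraint perturbs can be written in a few lines. The two-state transition and observation matrices below are illustrative placeholders, not values from the paper:

```python
def hmm_filter_step(prior, A, B, y):
    # One step of the nominal HMM information-state recursion:
    # predict the state distribution with transition matrix A, correct
    # with the likelihood B[state][y] of measurement y, then normalise.
    n = len(prior)
    predicted = [sum(prior[i] * A[i][j] for i in range(n)) for j in range(n)]
    unnorm = [B[j][y] * predicted[j] for j in range(n)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Illustrative two-state example: sticky chain, informative observations.
posterior = hmm_filter_step([0.5, 0.5],
                            [[0.95, 0.05], [0.05, 0.95]],
                            [[0.8, 0.2], [0.2, 0.8]], 0)
```

The robust formulation in the abstract replaces this single nominal conditional distribution with a worst case over an entropy ball of perturbed measures around it.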

Relevance: 20.00%

Publisher:

Abstract:

Regulatory commentators have identified the need for more responsive regulation to allow enforcement agencies to respond to different types and degrees of non-compliance. One tool considered to support responsive enforcement is the Enforceable Undertaking (EU). EUs are used extensively by Australian regulators in decisions that forego litigation in exchange for offenders promising to (amongst other things) correct behaviour and comply in the future. This arguably allows regulatory agencies greater flexibility in how they obtain compliance with regulations. EUs became an additional enforcement tool for the Fair Work Ombudsman (FWO) under the Fair Work Act 2009. This paper is a preliminary exploration of the comparative use of EUs by the Australian Competition and Consumer Commission and the FWO to assess their effectiveness for the minimum labour standards' environment.

Relevance: 20.00%

Publisher:

Abstract:

In this work we present an optimized fuzzy visual servoing system for obstacle avoidance using an unmanned aerial vehicle. The cross-entropy theory is used to optimise the gains of our controllers. The optimization process was performed using the ROS-Gazebo 3D simulation with purposeful extensions developed for our experiments. Visual servoing is achieved through an image processing front-end that uses the Camshift algorithm to detect and track objects in the scene. Experimental flight trials using a small quadrotor were performed to validate the parameters estimated from simulation. The integration of cross-entropy methods is a straightforward way to estimate optimal gains, achieving excellent results when tested in real flights.
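The cross-entropy optimisation idea can be sketched for a single scalar gain. The quadratic surrogate cost below is a toy stand-in for the flight-performance cost the real system evaluates in ROS-Gazebo simulation; all names and values here are illustrative:

```python
import random

def cross_entropy_optimise(cost, mu, sigma, n=50, elite=10, iters=30, seed=0):
    # Cross-entropy method over one scalar gain: sample candidates from a
    # Gaussian, keep the lowest-cost "elite" subset, and refit the sampling
    # distribution to the elites; the distribution concentrates on good gains.
    rng = random.Random(seed)
    for _ in range(iters):
        samples = [rng.gauss(mu, sigma) for _ in range(n)]
        elites = sorted(samples, key=cost)[:elite]
        mu = sum(elites) / elite
        sigma = (sum((x - mu) ** 2 for x in elites) / elite) ** 0.5 + 1e-6
    return mu

# Toy surrogate for a servoing cost whose best gain is 2.5.
best = cross_entropy_optimise(lambda k: (k - 2.5) ** 2, mu=0.0, sigma=2.0)
```

The same loop extends to vectors of controller gains by sampling each coordinate from its own Gaussian, which is the typical way such gain-tuning problems are posed.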

Relevance: 20.00%

Publisher:

Abstract:

The quick detection of abrupt (unknown) parameter changes in an observed hidden Markov model (HMM) is important in several applications. Motivated by the recent application of relative entropy concepts in the robust sequential change detection problem (and the related model selection problem), this paper proposes a sequential unknown change detection algorithm based on a relative entropy based HMM parameter estimator. Our proposed approach is able to overcome the lack of knowledge of post-change parameters, and is illustrated to have similar performance to the popular cumulative sum (CUSUM) algorithm (which requires knowledge of the post-change parameter values) when examined, on both simulated and real data, in a vision-based aircraft manoeuvre detection problem.
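For comparison, the classical CUSUM recursion mentioned above, which assumes the post-change parameter is known, can be sketched for a unit-variance Gaussian mean shift; the data sequence below is an illustrative toy example, not the paper's flight data:

```python
def cusum(samples, pre_mean, post_mean, threshold):
    # Classical CUSUM for a known mean shift in unit-variance Gaussian data:
    # accumulate the log-likelihood ratio increment, clip at zero, and raise
    # an alarm when the running statistic crosses the threshold.
    s = 0.0
    for t, y in enumerate(samples):
        # LLR increment for N(post_mean, 1) vs N(pre_mean, 1) observations.
        s = max(0.0, s + (post_mean - pre_mean) * (y - (pre_mean + post_mean) / 2.0))
        if s >= threshold:
            return t  # index at which the alarm is raised
    return None

# Noise-free toy sequence: the mean shifts from 0 to 1 at index 20.
data = [0.0] * 20 + [1.0] * 20
alarm = cusum(data, pre_mean=0.0, post_mean=1.0, threshold=3.0)
```

The dependence on `post_mean` is exactly the knowledge the paper's relative entropy based estimator seeks to avoid when the post-change parameters are unknown.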