140 results for High-frequency induction


Relevance: 80.00%

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating-average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
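
As an illustration of the coefficient-modelling step described above, the sketch below fits a generalized Gaussian to one subband's wavelet coefficients. It solves a nonlinear equation in the shape parameter via the classic moment-ratio estimator; this is a minimal stand-in, not the thesis's exact least-squares formulation, and the function names and bracketing interval are illustrative.

```python
# Sketch: generalized Gaussian fit to wavelet subband coefficients
# (moment-ratio estimator; assumes the ratio falls inside the bracket).
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def fit_generalized_gaussian(coeffs):
    """Estimate shape (beta) and scale (alpha) of a zero-mean GGD."""
    coeffs = np.asarray(coeffs, dtype=float)
    m1 = np.mean(np.abs(coeffs))     # first absolute moment
    m2 = np.mean(coeffs ** 2)        # second moment
    rho = m1 / np.sqrt(m2)           # moment ratio, monotone in beta

    def moment_ratio(beta):
        return gamma(2.0 / beta) / np.sqrt(gamma(1.0 / beta) * gamma(3.0 / beta))

    beta = brentq(lambda b: moment_ratio(b) - rho, 0.05, 10.0)
    alpha = np.sqrt(m2 * gamma(1.0 / beta) / gamma(3.0 / beta))
    return beta, alpha
```

The fitted shape and scale can then drive the bit allocation and the choice of lattice truncation level and scaling factor for each subband.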

Relevance: 80.00%

Abstract:

Assessment of the condition of connectors in the overhead electricity network has traditionally relied on the heat dissipation or voltage drop from the existing load current (50 Hz) as a measurable parameter to differentiate between satisfactory and failing connectors. This research has developed a technique that does not rely on the 50 Hz current, and a prototype connector tester has been built. The system injects a high-frequency signal into the section of line under test and measures the resistive voltage drop and the current at the test frequency to yield the resistance in micro-ohms. From this resistance value, a decision can be made as to whether a connector is satisfactory or approaching failure. Determining the resistive voltage drop in the presence of a large induced voltage was achieved by the innovative approach of using a representative sample of the magnetic flux producing the induced voltage as the phase-angle reference for the signal processing, rather than the phase angle of the current, which can be affected by the presence of nearby metal objects. Laboratory evaluation of the connector tester has validated the measurement technique: the magnitude of the load current (50 Hz) has minimal effect on the measurement accuracy. The remaining development steps towards a production instrument are the addition of a suitable battery-based power supply and isolated communications (probably radio), together with refinement of the printed circuit board design and software.
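
A minimal sketch of the phase-referencing idea follows, with assumed channel names and demodulation details (this is not the prototype's firmware). The three sampled channels are demodulated at the test frequency, and the measured voltage is rotated by the phase of the flux sample so the resistive drop falls on one quadrature axis.

```python
# Sketch: lock-in style extraction of the resistive voltage drop using
# a flux-derived phase reference, as the abstract describes in outline.
import numpy as np

def resistance_micro_ohms(v, i, flux_ref, fs, f_test):
    """v, i, flux_ref: voltage, current and flux-sample channels,
    sampled at fs Hz; f_test: injected test frequency in Hz."""
    t = np.arange(len(v)) / fs
    c = np.cos(2 * np.pi * f_test * t)
    s = np.sin(2 * np.pi * f_test * t)

    def demod(x):
        # Complex amplitude of x at f_test (quadrature demodulation).
        return 2 * np.mean(x * c) - 2j * np.mean(x * s)

    V, I, F = demod(v), demod(i), demod(flux_ref)
    # Rotate so the flux reference lies on the real axis. Assuming the
    # reference is in phase with the induced (inductive) voltage, the
    # resistive drop appears on the imaginary axis; a fixed 90-degree
    # offset applies instead if the coil output tracks the flux itself.
    V_rot = V * np.exp(-1j * np.angle(F))
    return 1e6 * abs(V_rot.imag) / abs(I)
```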

Relevance: 80.00%

Abstract:

Basic competencies in assessing and treating substance use disorders should be core to the training of any clinical psychologist, because of the high frequency of risky or problematic substance use in the community and its high co-occurrence with other problems. Skills in establishing trust and a therapeutic alliance are particularly important in addiction, given the stigma and potential for legal sanctions that surround it. The knowledge and skills of all clinical practitioners should be sufficient to allow valid screening and diagnosis of substance use disorders, accurate estimation of consumption and a basic functional analysis. Practitioners should also be able to undertake brief interventions, including motivational interviews, and appropriately apply generic interventions such as problem solving or goal setting to addiction. Furthermore, clinical psychologists should have an understanding of the nature, evidence base and indications for biochemical assays, pharmacotherapies and other medical treatments, and the ways these can be integrated with psychological practice. Specialists in addiction should have more sophisticated competencies in each of these areas. They need to have a detailed understanding of current addiction theories and basic and applied research, be able to undertake and report on a detailed psychological assessment, and display expert competence in addiction treatment. These skills should include an ability to assess and manage complex or co-occurring problems, to adapt interventions to the needs of different groups, and to assist people who have not responded to basic treatments. They should also be able to provide consultation to others, undertake evaluations of their practice, and monitor and evaluate emerging research data in the field.

Relevance: 80.00%

Abstract:

Uncooperative iris identification systems at a distance and on the move often suffer from poor resolution and poor focus of the captured iris images. The lack of pixel resolution and well-focused images significantly degrades iris recognition performance. This paper proposes a new approach that incorporates a focus score into a reconstruction-based super-resolution process to generate a high-resolution iris image from a low-resolution, focus-inconsistent video sequence of an eye. A reconstruction-based technique is used that can incorporate middle- and high-frequency components from multiple low-resolution frames into one desired super-resolved frame without introducing false high-frequency components. A new focus assessment approach is proposed for uncooperative iris recognition at a distance and on the move to improve performance under variations in lighting, size and occlusion. A novel fusion scheme is then proposed to incorporate the proposed focus score into the super-resolution process. Experiments conducted on the Multiple Biometric Grand Challenge portal database show that the proposed approach achieves an EER of 2.1%, outperforming the existing state-of-the-art averaging signal-level fusion approach by 19.2% and the robust mean super-resolution approach by 8.7%.
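
A minimal sketch of focus-aware fusion under simple assumptions (frames already registered and upsampled to a common grid; Laplacian energy as the sharpness measure; a plain weighted average rather than the paper's fusion scheme):

```python
# Sketch: weighting registered frames by a focus score before fusing,
# so well-focused frames dominate the super-resolved estimate.
import numpy as np
from scipy import ndimage

def focus_score(frame):
    """Simple sharpness measure: energy of the Laplacian."""
    lap = ndimage.laplace(frame.astype(float))
    return float(np.mean(lap ** 2))

def focus_weighted_fusion(frames):
    """Fuse registered frames (2-D arrays of equal shape)."""
    scores = np.array([focus_score(f) for f in frames])
    weights = scores / scores.sum()
    return sum(w * f.astype(float) for w, f in zip(weights, frames))
```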

Relevance: 80.00%

Abstract:

Most information retrieval (IR) models treat the presence of a term within a document as an indication that the document is somehow "about" that term; they do not take into account when a term is explicitly negated. Medical data, by its nature, contains a high frequency of negated terms, e.g. "review of systems showed no chest pain or shortness of breath". This paper presents a study of the effects of negation on information retrieval. We present a number of experiments to determine whether negation has a significant negative effect on IR performance and whether language models that take negation into account might improve performance. We use a collection of real medical records as our test corpus. Our findings are that negation has some effect on system performance, but this will likely be confined to domains such as medical data where negation is prevalent.
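
A toy sketch of one way an index could account for negation: a NegEx-style trigger-and-scope heuristic, not the language models studied in the paper; the trigger list and scope length are illustrative.

```python
# Sketch: terms inside a short window after a negation trigger are
# indexed with a "neg_" prefix, so "no chest pain" no longer matches
# a query for "chest pain".
NEG_TRIGGERS = {"no", "not", "denies", "without", "negative"}
SCOPE = 4  # tokens after a trigger treated as negated

def negation_aware_tokens(text):
    out, neg_left = [], 0
    for tok in text.lower().split():
        if tok in NEG_TRIGGERS:
            neg_left = SCOPE
            continue
        out.append("neg_" + tok if neg_left > 0 else tok)
        neg_left = max(0, neg_left - 1)
    return out

# negation_aware_tokens("showed no chest pain or shortness of breath")
# -> ['showed', 'neg_chest', 'neg_pain', 'neg_or', 'neg_shortness',
#     'of', 'breath']
```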

Relevance: 80.00%

Abstract:

Acoustic emission (AE) is the phenomenon whereby high-frequency stress waves are generated by the rapid release of energy within a material from sources such as crack initiation or growth. The AE technique involves recording these stress waves by means of sensors placed on the surface and subsequently analysing the recorded signals to gather information such as the nature and location of the source. It is one of several diagnostic techniques currently used for structural health monitoring (SHM) of civil infrastructure such as bridges. Its advantages include the ability to provide continuous in-situ monitoring and high sensitivity to crack activity. But several challenges still exist. Due to the high sampling rate required for data capture, a large amount of data is generated during AE testing. This is further complicated by the presence of a number of spurious sources that can produce AE signals which can mask the desired signals. Hence, an effective data analysis strategy is needed to achieve source discrimination; this also becomes important for long-term monitoring applications in order to avoid massive data overload. Analysis of the frequency content of recorded AE signals, together with the use of pattern recognition algorithms, is among the more advanced and promising data analysis approaches for source discrimination. This paper explores the use of various signal processing tools for the analysis of experimental data, with the overall aim of finding an improved method for source identification and discrimination, with particular focus on the monitoring of steel bridges.
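
The sketch below illustrates the kind of frequency-content analysis plus pattern recognition mentioned above, under illustrative assumptions (two spectral features per burst and k-means clustering; the paper's actual tools may differ):

```python
# Sketch: spectral features per recorded AE burst, then unsupervised
# clustering to separate candidate source classes.
import numpy as np
from scipy.cluster.vq import kmeans2, whiten

def spectral_features(burst, fs):
    """Peak frequency and spectral centroid (Hz) of one AE burst."""
    windowed = burst * np.hanning(len(burst))
    spec = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(burst), d=1.0 / fs)
    return freqs[np.argmax(spec)], np.sum(freqs * spec) / np.sum(spec)

def discriminate(bursts, fs, n_sources=2):
    """Cluster bursts into n_sources candidate source classes."""
    feats = whiten(np.array([spectral_features(b, fs) for b in bursts]))
    _, labels = kmeans2(feats, n_sources, seed=0)
    return labels
```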

Relevance: 80.00%

Abstract:

This paper assesses the capacity of high-frequency ultrasonic waves to detect changes in the proteoglycan (PG) content of articular cartilage. Fifty cartilage-on-bone samples were exposed to ultrasonic waves via an ultrasound transducer at a frequency of 20 MHz. Histology and ImageJ processing were conducted to determine the PG content of the specimens. The ratios of the reflected signals from the surface and the osteochondral junction (OCJ) were determined from the experimental data. The initial results show an inconsistency in the capacity of ultrasound to distinguish samples with severe proteoglycan loss (i.e. >90% PG loss) from normal intact samples. This lack of clear distinction was also observed for samples with less than 60% depletion, while there was a clear differentiation between normal intact samples and those with 55-70% PG loss.
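
A minimal sketch of the reflection-ratio computation, with assumed time gates bracketing the two echoes (the study's exact signal processing is not specified in the abstract):

```python
# Sketch: ratio of the OCJ echo amplitude to the surface echo
# amplitude in one 20 MHz A-scan, using RMS within fixed time gates.
import numpy as np

def echo_ratio(ascan, fs, surface_gate, ocj_gate):
    """ascan: 1-D RF signal sampled at fs Hz; each gate is a
    (start_s, end_s) window around one echo."""
    def gate_rms(gate):
        i0, i1 = int(gate[0] * fs), int(gate[1] * fs)
        seg = np.asarray(ascan[i0:i1], dtype=float)
        return float(np.sqrt(np.mean(seg ** 2)))
    return gate_rms(ocj_gate) / gate_rms(surface_gate)
```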

Relevance: 80.00%

Abstract:

The outcome of interspecific hybridization between native and invasive species depends on the relative frequencies of the parental taxa and the viability of hybrid progeny. We investigated individual- and population-level consequences of hybridization between the Australian native Senecio pinnatifolius and the exotic S. madagascariensis with AFLP markers, and used this information to simulate the expected outcome of hybridization. A high frequency (range 8.3-75.6%) of hybrids was detected in open-pollinated seeds of both species, but mature hybrids were absent from sympatric populations, indicating that sympatric populations represent tension zones. A hybridization advantage was observed for S. madagascariensis, where significantly more progeny than expected were sired based on the proportional representation of the two species in sympatric populations. Simulations indicated S. pinnatifolius could be replaced in sympatric populations if hybridization was density dependent. For this native-exotic pair, prezygotic isolating barriers are weak, but low hybrid viability maintains a strong postzygotic barrier to introgression. Due to asymmetric hybridization, S. pinnatifolius appears under threat from demographic swamping, and local extinction is possible where it occurs in sympatry with S. madagascariensis.
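
A toy generational sketch of demographic swamping under stated assumptions (inviable hybrids, frequency-proportional hybridization, and a siring advantage for the exotic); this is an illustration, not the paper's simulation model:

```python
# Sketch: native frequency declines when its seed is disproportionately
# sired by the exotic and hybrid progeny are inviable.
def simulate_swamping(p0=0.5, h=0.3, advantage=2.0, generations=50):
    """p0: initial native frequency; h: baseline hybridization rate
    (hybrid seed frequencies of 8.3-75.6% were observed in the study);
    advantage: multiplier on the exotic's siring of native ovules."""
    p, history = p0, [p0]
    for _ in range(generations):
        q = 1.0 - p
        lost_native = min(1.0, advantage * h * q)  # native seed wasted on hybrids
        lost_exotic = min(1.0, h * p)              # exotic seed wasted on hybrids
        native_seed = p * (1.0 - lost_native)
        exotic_seed = q * (1.0 - lost_exotic)
        total = native_seed + exotic_seed
        p = native_seed / total if total > 0 else 0.0
        history.append(p)
    return history
```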

Relevance: 80.00%

Abstract:

The purpose of this study was to compare electrical muscle stimulation (EMS) and maximal voluntary (VOL) isometric contractions of the elbow flexors for changes in biceps brachii muscle oxygenation (tissue oxygenation index, TOI) and haemodynamics (total haemoglobin volume, tHb = oxygenated Hb + deoxygenated Hb) determined by near-infrared spectroscopy (NIRS). The biceps brachii muscle of 10 healthy men (23–39 years) was electrically stimulated at high frequency (75 Hz) via surface electrodes to evoke 50 intermittent (4-s contraction, 15-s relaxation) isometric contractions at the maximum tolerated current level (EMS session). The contralateral arm performed 50 intermittent (4-s contraction, 15-s relaxation) maximal voluntary isometric contractions (VOL session), in a counterbalanced order separated by 2–3 weeks. Results indicated that although the torque produced during EMS was approximately 50% of VOL (P<0.05), there was no significant difference in the changes in TOI amplitude or TOI slope between EMS and VOL over the 50 contractions. However, the TOI amplitude divided by peak torque was approximately 50% lower for EMS than VOL (P<0.05), which indicates EMS was less efficient than VOL. This is likely because of differences in the muscles involved in force production between the conditions. The mean decrease in tHb amplitude during the contraction phases was significantly (P<0.05) greater for EMS than VOL from the 10th contraction onwards, suggesting that muscle blood volume was lower in EMS than VOL. It is concluded that the local oxygen demand of the biceps brachii sampled by NIRS is similar between VOL and EMS.

Relevance: 80.00%

Abstract:

Acoustic emission (AE) is the phenomenon whereby high-frequency stress waves are generated by the rapid release of energy within a material from sources such as crack initiation or growth. The AE technique involves recording these stress waves by means of sensors placed on the surface and subsequently analysing the recorded signals to gather information such as the nature and location of the source. AE is one of several non-destructive testing (NDT) techniques currently used for structural health monitoring (SHM) of civil, mechanical and aerospace structures. Its advantages include the ability to provide continuous in-situ monitoring and high sensitivity to crack activity. Despite these advantages, several challenges still exist in the successful application of AE monitoring. Accurate localization of AE sources, discrimination between genuine AE sources and spurious noise sources, and damage quantification for severity assessment are some of the important issues in AE testing and are discussed in this paper. Various data analysis and processing approaches are applied to address these issues.
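
On the localization issue, the sketch below shows the classic two-sensor linear location formula, one of the simplest approaches in AE testing; the sensor spacing, wave speed and example values are illustrative:

```python
# Sketch: locate an AE source on the line between two sensors from the
# arrival-time difference, given the wave speed in the material.
def linear_location(dt, sensor_spacing, wave_speed):
    """dt = t_sensor1 - t_sensor2 (s); returns the distance (m) of the
    source from sensor 1."""
    x = 0.5 * (sensor_spacing + wave_speed * dt)
    if not 0.0 <= x <= sensor_spacing:
        raise ValueError("source outside the sensor pair; check inputs")
    return x

# e.g. sensors 2 m apart on steel (v ~ 5000 m/s), dt = -100 us:
# linear_location(-100e-6, 2.0, 5000.0) -> 0.75 (m from sensor 1)
```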

Relevance: 80.00%

Abstract:

This paper describes and evaluates the novel utility of network methods for understanding human interpersonal interactions within social neurobiological systems such as sports teams. We show how collective system networks are supported by the sum of interpersonal interactions that emerge from the activity of system agents (such as players in a sports team). To test this idea we trialled the methodology in analyses of intra-team collective behaviours in the team sport of water polo. We observed that the number of interactions between team members produced varied intra-team coordination patterns of play, differentiating between successful and unsuccessful performance outcomes. Future research on small-world network methodologies needs to formalize measures of node connections in analyses of collective behaviours in sports teams, to verify whether a high frequency of interactions between players is needed to achieve competitive performance outcomes.
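
A minimal sketch of how such interaction counts can be formalized, using a hypothetical pass-count matrix (not the study's data) and two simple network measures:

```python
# Sketch: a weighted adjacency matrix of intra-team interactions and
# simple per-player and team-level measures derived from it.
import numpy as np

def network_measures(passes):
    """passes[i][j]: passes from player i to player j (zero diagonal)."""
    A = np.asarray(passes, dtype=float)
    interactions = A + A.T                 # undirected interaction counts
    degree = interactions.sum(axis=1)      # each player's connectedness
    n = A.shape[0]
    # Fraction of player pairs with at least one interaction.
    density = (interactions > 0).sum() / (n * (n - 1))
    return degree, density
```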

Relevance: 80.00%

Abstract:

In this work a novel hybrid approach is presented that uses a combination of time-domain and frequency-domain solution strategies to predict the power distribution within a lossy medium loaded within a waveguide. The problem of determining the electromagnetic fields evolving within the waveguide and the lossy medium is decoupled into two components: one for computing the fields in the waveguide, including a coarse representation of the medium (the exterior problem), and one for a detailed resolution of the lossy medium (the interior problem). A previously documented cell-centred Maxwell's equations numerical solver can be used to resolve the exterior problem accurately in the time domain. Thereafter the discrete Fourier transform can be applied to the computed field data around the interface of the medium to estimate the frequency-domain boundary condition information that is needed for closure of the interior problem. Since only the electric fields are required to compute the power distribution generated within the lossy medium, the interior problem can be resolved efficiently using the Helmholtz equation. A consistent cell-centred finite-volume method is then used to discretise this equation on a fine mesh, and the underlying large, sparse, complex matrix system is solved for the required electric field using the Krylov-subspace-based GMRES iterative solver. It is shown that the hybrid solution methodology works well when a single frequency is considered in the evaluation of the Helmholtz equation in a single-mode waveguide. A restriction of the scheme is that the material needs to be sufficiently lossy, so that any waves penetrating the material are absorbed.
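
A minimal sketch of the interior solve in one dimension, under simplifying assumptions: a finite-difference stand-in for the paper's cell-centred finite-volume scheme, with Dirichlet boundary values taken from the exterior DFT step.

```python
# Sketch: E'' + k0^2 * eps_r(x) * E = 0 on n interior nodes, assembled
# as a sparse complex system and solved with GMRES.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres

def helmholtz_interior_1d(n, length, k0, eps_r, e_left, e_right):
    """eps_r: complex relative permittivity per node (lossy medium);
    e_left, e_right: boundary field values from the exterior solution."""
    h = length / (n + 1)
    main = -2.0 / h**2 + k0**2 * np.asarray(eps_r, dtype=complex)
    off = np.ones(n - 1) / h**2
    A = sp.diags([off, main, off], [-1, 0, 1], format="csr")
    b = np.zeros(n, dtype=complex)
    b[0] -= e_left / h**2     # fold known boundary values into the RHS
    b[-1] -= e_right / h**2
    e, info = gmres(A, b)     # Krylov solve of the sparse complex system
    if info != 0:
        raise RuntimeError("GMRES failed to converge")
    return e
```

The dissipated power density then follows from the imaginary part of the permittivity, proportional to ω·ε0·ε''·|E|²/2 at each node.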

Relevance: 80.00%

Abstract:

Planar magnetic elements are becoming a replacement for their conventional rivals. Among the reasons supporting their application is their smaller size: taking up less volume in the electronic package is a critical advantage from the manufacturing point of view. The planar structure uses PCB copper tracks to form the desired windings. The windings on each PCB layer can be connected to other winding layers in various ways to produce series or parallel connections, and can be applied coreless or with a core, depending on the application in switched-mode power supplies (SMPS). The planar shape of the tracks increases the effective conduction area of the windings, yielding more inductance than conventional windings with similar copper loss. A problem arising from the planar structure of magnetic inductors is the leakage current between the layers generated by a pulse-width-modulated voltage across the inductor. This current depends on the capacitive coupling between the layers, which in turn depends on the physical parameters of the planar scheme. In order to reduce the electrical power dissipation due to this leakage current and the associated electromagnetic interference (EMI), reconsideration of the planar structure may be effective. The aim of this research is to address the problem of capacitive coupling between planar layers and to find a better structure for the planar inductor that offers less total capacitive coupling and thus less thermal dissipation from leakage currents. Several simulations of various planar structures have been carried out using finite element methods (FEM). Laboratory prototypes of these structures were built to the same specifications as the simulation cases. The capacitive couplings of the samples were determined with a spectrum analyser, and the measurements verified the simulation results.
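
As a back-of-envelope companion to the FEM simulations, a parallel-plate sketch of the layer-to-layer coupling capacitance; the permittivity, overlap area and spacing values are assumptions, not measurements from this work:

```python
# Sketch: static estimate of the coupling capacitance between two
# overlapping winding layers (parallel-plate approximation).
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def interlayer_capacitance(overlap_area_m2, spacing_m, eps_r=4.4):
    """eps_r ~ 4.4 is a typical FR-4 value (an assumption here)."""
    return EPS0 * eps_r * overlap_area_m2 / spacing_m

# e.g. 20 mm x 20 mm track overlap across 0.3 mm of prepreg:
# interlayer_capacitance(400e-6, 0.3e-3) -> ~5.2e-11 F (about 52 pF)
```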

Relevance: 80.00%

Abstract:

Flood flow in urbanised areas constitutes a major hazard to the population and infrastructure, as seen during the summer 2010-2011 floods in Queensland (Australia). Flood flows in urban environments have been studied only relatively recently, and no previous study has considered the impact of turbulence in the flow. During the 12-13 January 2011 flood of the Brisbane River, turbulence measurements were conducted in an inundated urban environment in Gardens Point Road, next to Brisbane's central business district (CBD), at relatively high frequency (50 Hz). The properties of the sediment flood deposits were characterised, and the acoustic Doppler velocimeter unit was calibrated to obtain both instantaneous velocity components and suspended sediment concentration in the same sampling volume with the same temporal resolution. While the flow motion in Gardens Point Road was subcritical, the water elevations and velocities fluctuated with a distinctive period between 50 and 80 s. These low-frequency fluctuations were linked with local topographic effects: a local choke induced by an upstream constriction between stairwells caused slow oscillations with a period close to the natural sloshing period of the car park. The instantaneous velocity data were analysed using a triple decomposition, and the same triple decomposition was applied to the water depth, velocity flux, suspended sediment concentration and suspended sediment flux data. The velocity fluctuation data showed a large energy component in the slow fluctuation range. For the first two tests at z = 0.35 m, the turbulence data suggested some isotropy; at z = 0.083 m, on the other hand, the findings indicated some flow anisotropy. The suspended sediment concentration (SSC) data presented a general trend of increasing SSC with decreasing water depth. During one test (T4), some long-period oscillations were observed with a period of about 18 minutes; the cause of these oscillations remains unknown to the authors. The last test (T5) took place in very shallow water with high suspended sediment concentrations, and it is suggested that the flow in the car park was disconnected from the main channel. Overall the flow conditions at the sampling sites corresponded to a specific momentum between 0.2 and 0.4 m², which would be near the upper end of the scale for safe evacuation of individuals in flooded areas. But the authors do not believe that evacuation of individuals in Gardens Point Road would have been safe, because of the intense water surges and flow turbulence. More generally, any criterion for safe evacuation based solely upon the flow velocity, water depth or specific momentum cannot account for the hazards caused by flow turbulence, water depth fluctuations and water surges.
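
A minimal sketch of the triple decomposition applied to the velocity records, using a moving-average filter as the scale separator (the paper's exact filter is not given in the abstract) and an assumed cutoff between the 50-80 s surge period and the turbulent scales:

```python
# Sketch: u(t) = U + u_slow(t) + u'(t), i.e. time-averaged component,
# slow (surge) fluctuation and residual turbulence.
import numpy as np

def triple_decompose(u, fs, cutoff_s=10.0):
    """u: velocity samples at fs Hz (50 Hz in the study); cutoff_s:
    averaging window separating slow surges from turbulence."""
    u = np.asarray(u, dtype=float)
    U = u.mean()
    win = max(1, int(cutoff_s * fs))
    kernel = np.ones(win) / win
    u_slow = np.convolve(u - U, kernel, mode="same")
    u_turb = u - U - u_slow
    return U, u_slow, u_turb
```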