527 results for SNR maximisation


Relevance:

10.00%

Publisher:

Abstract:

Advances in three-dimensional (3D) electron microscopy (EM) and image processing are providing considerable improvements in the resolution of subcellular volumes, macromolecular assemblies and individual proteins. However, the recovery of high-frequency information from biological samples is hindered by specimen sensitivity to beam damage. Low-dose electron cryo-microscopy conditions afford reduced beam damage but typically yield images with reduced contrast and low signal-to-noise ratios (SNRs). Here, we describe the properties of a new discriminative bilateral (DBL) filter that is based upon the bilateral filter implementation of Jiang et al. (Jiang, W., Baker, M.L., Wu, Q., Bajaj, C., Chiu, W., 2003. Applications of a bilateral denoising filter in biological electron microscopy. J. Struct. Biol. 128, 82-97.). In contrast to the latter, the DBL filter can distinguish between object edges and high-frequency noise pixels through the use of an additional photometric exclusion function. As a result, high-frequency noise pixels are smoothed, yet object edge detail is preserved. In the present study, we show that the DBL filter effectively reduces noise in low-SNR single particle data as well as in cellular tomograms of stained plastic sections. The properties of the DBL filter are discussed in terms of its usefulness for single particle analysis and for pre-processing cellular tomograms ahead of image segmentation. (c) 2006 Elsevier Inc. All rights reserved.
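As a rough illustration of the filtering idea, the sketch below combines a standard bilateral filter with a hypothetical photometric exclusion step: a pixel that deviates strongly from its local median is referenced to the median instead of to its own value, so isolated noise pixels are averaged away while genuine step edges keep their full range weight. This is a minimal stand-in for the DBL filter, not the authors' implementation; the `excl` threshold and the median-based exclusion rule are assumptions.

```python
import numpy as np

def bilateral_dbl(img, radius=2, sigma_s=1.5, sigma_r=0.2, excl=0.5):
    """Bilateral filter with a simple photometric exclusion term.
    excl: deviations from the local median larger than this are treated
    as noise and smoothed rather than preserved (assumed rule)."""
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    pad = np.pad(img.astype(float), radius, mode='reflect')
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    g_s = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))   # spatial kernel
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
            centre = pad[i + radius, j + radius]
            med = np.median(win)
            # photometric exclusion: an isolated outlier is referenced to
            # the local median, so it is averaged away; edge pixels agree
            # with their local median and are preserved as usual
            ref = med if abs(centre - med) > excl else centre
            g_r = np.exp(-((win - ref)**2) / (2 * sigma_r**2))  # range kernel
            wgt = g_s * g_r
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out
```

On a step-edge image with a single hot pixel, this smooths the outlier toward the background while keeping the edge contrast essentially intact.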

Abstract:

We propose a novel interpretation and usage of neural networks (NNs) in modeling physiological signals, which are allowed to be nonlinear and/or nonstationary. The method consists of training an NN for the k-step prediction of a physiological signal, and then examining the connection-weight space (CWS) of the NN to extract information about the signal generator mechanism. We define a novel feature, Normalized Vector Separation (gamma(ij)), to measure the separation of two arbitrary states i and j in the CWS and use it to track the state changes of the generating system. The performance of the method is examined via synthetic signals and clinical EEG. Synthetic data indicate that gamma(ij) can track the system down to an SNR of 3.5 dB. Clinical data obtained from three patients undergoing carotid endarterectomy showed that the EEG could be modeled (within a root-mean-squared error of 0.01) by the proposed method, and that the blood perfusion state of the brain could be monitored via gamma(ij), with small NNs having no more than 21 connection weights altogether.
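The weight-space idea can be sketched with a linear k-step predictor standing in for the small NN (a simplifying assumption); `vector_separation` below is a plausible normalisation of the distance between two weight vectors, not the paper's exact definition of gamma(ij).

```python
import numpy as np

def fit_predictor(x, order=4, k=1):
    """Least-squares linear k-step-ahead predictor: a linear stand-in for
    the paper's small NN, whose weight vector plays the role of a point
    in connection-weight space."""
    X = np.column_stack([x[i:len(x) - order - k + 1 + i] for i in range(order)])
    y = x[order + k - 1:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def vector_separation(w_i, w_j):
    """Hypothetical normalised separation of two weight vectors,
    standing in for the paper's gamma(ij) feature."""
    return np.linalg.norm(w_i - w_j) / (np.linalg.norm(w_i) + np.linalg.norm(w_j))
```

Two segments generated by the same process yield nearby weight vectors (small separation), while a change in the generating system moves the weight vector and increases the separation.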

Abstract:

In this paper the performance of a multiple-input multiple-output (MIMO) wireless communication system operating in an indoor environment, featuring both line-of-sight (LOS) and non-line-of-sight (NLOS) signal propagation, is assessed. In the model, the scattering objects are assumed to be uniformly distributed in an area surrounding the transmitting and receiving array antennas. Mutual coupling effects in the arrays are treated in an exact manner; however, interactions with scattering objects are taken into account via a single-bounce approach. Computer simulations of the system capacity are carried out for varying inter-element spacing in the receiving array and for assumed values of the LOS/NLOS power fraction and signal-to-noise ratio (SNR).
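The capacity quantity being simulated is the standard equal-power MIMO formula C = log2 det(I + (SNR/nt) H H^H). The sketch below evaluates it for a Rayleigh (pure NLOS) channel sample and for a rank-one matrix of ones as a crude LOS stand-in; the paper's scattering and mutual-coupling model is far more detailed, so this only illustrates why a rich NLOS channel yields higher capacity than a dominant LOS component.

```python
import numpy as np

def mimo_capacity(H, snr_db):
    """Capacity (bits/s/Hz) of a MIMO channel H with equal power
    allocation across nt transmit antennas."""
    nr, nt = H.shape
    snr = 10 ** (snr_db / 10)
    M = np.eye(nr) + (snr / nt) * H @ H.conj().T
    return np.real(np.log2(np.linalg.det(M)))

rng = np.random.default_rng(1)
nt = nr = 4
# Rayleigh (NLOS) draw vs. a rank-one all-ones LOS matrix (illustrative)
H_nlos = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
H_los = np.ones((nr, nt), dtype=complex)
print(mimo_capacity(H_nlos, 10), mimo_capacity(H_los, 10))
```

The rank-one LOS channel supports only one spatial eigenmode (here exactly log2(1 + 2.5 * 16) bits/s/Hz at 10 dB), while the NLOS draw spreads power over several eigenmodes.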

Abstract:

The results of empirical studies are limited to particular contexts, difficult to generalise, and the studies themselves are expensive to perform. Despite these problems, empirical studies in software engineering can be made effective, and they are important to both researchers and practitioners. The key to their effectiveness lies in maximising the information gained by examining existing studies, conducting power analyses to determine an accurate minimum sample size, and benefiting from previous studies through replication. This approach was applied in a controlled experiment examining the combination of automated static analysis tools and code inspection in the context of verification and validation (V&V) of concurrent Java components. The combination of these V&V technologies was shown to be cost-effective despite the size of the study, which thus contributes to research in V&V technology evaluation.
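The power-analysis step mentioned above can be illustrated with the standard normal-approximation formula for a two-sample comparison. This is a sketch of the general technique, not the specific calculation used in the experiment.

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.8):
    """Normal-approximation minimum sample size per group for a
    two-sample t-test: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2,
    where d is Cohen's standardised effect size."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) / effect_size) ** 2)

# a "medium" effect (d = 0.5) at the conventional 5% level, 80% power
print(sample_size_per_group(0.5))  # 63 per group (normal approximation)
```

The exact t-based answer is slightly larger (the approximation ignores the degrees-of-freedom correction), but the formula shows how small effect sizes drive the required sample size up quadratically.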

Abstract:

Accurate interpretation of distortion product otoacoustic emission (DPOAE) data cannot be made without realizing the effects of non-pathological factors on DPOAEs. The present study aimed to examine the effects of ear asymmetry, gender and handedness on DPOAEs obtained from school children. One thousand and three children (528 boys and 475 girls) with a mean age of 6.2 years (SD = 0.4, range = 5.2-7.9 years) were tested in a quiet room at their schools using the GSI-60 DPOAE system. The stimuli consisted of two pure tones of different frequencies, f1 and f2, presented at 65 and 55 dB SPL respectively. A DP-gram was obtained for each ear with f2 varying from 1.1 to 6.0 kHz and the f2/f1 ratio kept at 1.21. The signal-to-noise ratios (SNRs) (DPOAE amplitude minus the mean noise floor) at the tested frequencies 1.1, 1.5, 1.9, 2.4, 3.0, 3.8, 4.8 and 6.0 kHz were measured. The results revealed a small but significant difference in SNR between ears, with right ears showing a higher mean SNR than left ears at 1.9, 3.0, 3.8 and 6.0 kHz. At these frequencies, the difference in mean SNR between ears was less than 1 dB. A significant gender effect was also found, with girls exhibiting a higher SNR than boys at 3.8, 4.8 and 6.0 kHz; the difference in mean SNR due to the gender effect was about 1 to 2 dB at these frequencies. The results indicated no significant difference in mean SNR between left-handed and right-handed children at any tested frequency. In conclusion, these non-pathological characteristics of DPOAEs should be considered in the interpretation of DPOAE results for school children.

Abstract:

This paper derives the performance union bound of space-time trellis codes in an orthogonal frequency division multiplexing system (STTC-OFDM) over quasi-static frequency-selective fading channels based on the distance spectrum technique. The distance spectrum is the enumeration of the codeword difference measures and their multiplicities, obtained by exhaustive search through all possible error event paths. The exhaustive search approach can be used for low-memory-order STTCs with small frame sizes. However, with moderate memory order and moderate frame size, the computational cost of exhaustive search increases exponentially and may become impractical for high-memory-order STTCs. This calls for advanced computational techniques such as Genetic Algorithms (GAs). In this paper, a GA with a sharing-function method is used to locate the multiple solutions of the distance spectrum for high-memory-order STTCs. Simulations evaluate the performance union bound and compare the complexity of the non-GA-aided and GA-aided distance spectrum techniques. They show that the union bound gives a close performance measure at high signal-to-noise ratio (SNR), and that the GA-based distance spectrum technique with the sharing function requires much less computational time than the exhaustive search approach while retaining satisfactory accuracy.
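Fitness sharing is the mechanism that lets a single GA population settle on several distance-spectrum solutions at once instead of collapsing onto one. A minimal Goldberg-style sketch (on a generic 1-D population, not the STTC search space) is:

```python
import numpy as np

def shared_fitness(pop, raw_fitness, sigma_share=1.0, alpha=1.0):
    """Goldberg-style fitness sharing: each individual's raw fitness is
    divided by its niche count, so crowded regions of the search space
    are penalised and the population spreads over multiple optima."""
    pop = np.asarray(pop, dtype=float)
    d = np.abs(pop[:, None] - pop[None, :])              # pairwise distances
    sh = np.where(d < sigma_share, 1 - (d / sigma_share) ** alpha, 0.0)
    niche = sh.sum(axis=1)                               # niche counts (>= 1)
    return raw_fitness / niche
```

Two individuals sitting in the same niche split their fitness, while an equally fit individual in an unoccupied niche keeps its full value, which is exactly the pressure needed to keep multiple solutions alive.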

Abstract:

We analyse how the Generative Topographic Mapping (GTM) can be modified to cope with missing values in the training data. Our approach is based on an Expectation-Maximisation (EM) method which estimates the parameters of the mixture components and at the same time deals with the missing values. We incorporate this algorithm into a hierarchical GTM. We verify the method on a toy data set (using a single GTM) and a realistic data set (using a hierarchical GTM). The results show that our algorithm can help to construct informative visualisation plots, even when some of the training points are corrupted with missing values.
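The core EM idea for missing values can be shown on the simplest possible case: a single bivariate Gaussian where some second coordinates are missing. The E-step replaces each missing value by its conditional expectation given the observed coordinate, and the conditional variance is carried into the covariance update. This is a toy stand-in for what the paper does inside each GTM mixture component, not the paper's algorithm.

```python
import numpy as np

def em_gaussian_missing(X, n_iter=50):
    """EM-style moment fitting of a bivariate Gaussian when some second
    coordinates are NaN: missing entries are imputed with their
    conditional mean, and the conditional variance is added back so the
    variance of the imputed dimension is not underestimated."""
    X = X.copy()
    miss = np.isnan(X[:, 1])
    X[miss, 1] = np.nanmean(X[:, 1])           # crude initialisation
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        S = np.cov(X.T, bias=True)
        # E-step: conditional mean/variance of x2 given observed x1
        cond_var = S[1, 1] - S[0, 1] ** 2 / S[0, 0]
        X[miss, 1] = mu[1] + S[0, 1] / S[0, 0] * (X[miss, 0] - mu[0])
        # M-step: refit moments, restoring the conditional variance
        mu = X.mean(axis=0)
        S = np.cov(X.T, bias=True)
        S[1, 1] += cond_var * miss.mean()
    return mu, S
```

With correlated data and ~30% of the second coordinates deleted, the fitted mean and cross-covariance stay close to the complete-data values.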

Abstract:

The efficiency literature, using both parametric and non-parametric methods, has focused mainly on cost efficiency analysis rather than on profit efficiency. In for-profit organisations, however, the measurement of profit efficiency and its decomposition into technical and allocative efficiency is particularly relevant. In this paper a newly developed method is used to measure profit efficiency and to identify the sources of any shortfall in profitability (technical and/or allocative inefficiency). The method is applied to a set of Portuguese bank branches, first assuming a long-run and then a short-run profit maximisation objective. In the long run, most of the scope for profit improvement of bank branches lies in becoming more allocatively efficient; in the short run, most of the profit gain can be realised through higher technical efficiency. © 2003 Elsevier B.V. All rights reserved.

Abstract:

The sudden loss of plasma magnetic confinement, known as a disruption, is one of the major issues in a nuclear fusion machine such as JET (Joint European Torus). Disruptions pose very serious problems for the safety of the machine: the energy stored in the plasma is released to the machine structure in a few milliseconds, resulting in forces that at JET reach several meganewtons. The problem is even more severe in a nuclear fusion power station, where the forces would be of the order of one hundred meganewtons. The events that occur during a disruption are still not well understood, even though some mechanisms that can lead to a disruption have been identified and can be used to predict them. Unfortunately, it is always a combination of these events that generates a disruption, so simple algorithms cannot be used to predict it. This thesis analyses the possibility of using neural network algorithms to predict plasma disruptions in real time. This involves the determination of plasma parameters every few milliseconds. A plasma boundary reconstruction algorithm, XLOC, capable of determining the plasma/wall distance every 2 milliseconds, has been developed in collaboration with Dr. D. O'Brien and Dr. J. Ellis. The XLOC output has been used to develop a multilayer perceptron network that determines plasma parameters such as the internal inductance (li) and the safety factor (q), with which a machine operational space has been experimentally defined. If the limits of this operational space are breached, the disruption probability increases considerably. Another approach to predicting disruptions is to use neural network classification methods to define the JET operational space. Two methods have been studied. The first uses a multilayer perceptron network with a softmax activation function for the output layer, which can classify the input patterns into various classes.
In this case the plasma input patterns have been divided into disrupting and safe patterns, making it possible to assign a disruption probability to every plasma input pattern. The second method determines the novelty of an input pattern by estimating the probability density of successful plasma patterns that have been run at JET. The density is represented as a mixture distribution, and its parameters are determined using the Expectation-Maximisation method. If the dataset used to determine the distribution parameters covers the machine operational space sufficiently well, then the patterns flagged as novel can be regarded as patterns belonging to a disrupting plasma. Together with these methods, a network has been designed to predict the vertical forces that a disruption can cause, so that overly dangerous plasma configurations are not run. This network can be run before the pulse using the pre-programmed plasma configuration, or on-line, becoming a tool that allows dangerous plasma configurations to be stopped. All these methods have been implemented in real time on a dual Pentium Pro based machine. The Disruption Prediction and Prevention System has shown that internal plasma parameters can be determined on-line with good accuracy. The disruption detection algorithms also showed promising results, considering that JET is an experimental machine where new plasma configurations are constantly tested in an attempt to improve its performance.
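The novelty-detection idea can be reduced to a few lines: fit a density to the "safe" operational patterns, pick a low-density threshold, and flag anything below it as novel (hence potentially disrupting). The thesis fits a mixture distribution by EM; the sketch below uses a single multivariate Gaussian so the idea stays visible, which is a deliberate simplification.

```python
import numpy as np

def fit_density(safe):
    """Fit a single multivariate Gaussian to 'safe' plasma patterns and
    return an (unnormalised) log-density function. A mixture fitted by
    EM, as in the thesis, would replace this single component."""
    mu = safe.mean(axis=0)
    S = np.cov(safe.T) + 1e-6 * np.eye(safe.shape[1])   # regularised covariance
    Sinv = np.linalg.inv(S)
    def log_density(x):
        d = x - mu
        return -0.5 * (d @ Sinv * d).sum(axis=1)        # up to a constant
    return log_density

rng = np.random.default_rng(2)
safe = rng.normal(0, 1, size=(500, 3))                  # safe operational patterns
logp = fit_density(safe)
threshold = np.quantile(logp(safe), 0.01)               # ~1% false-alarm rate
novel = np.array([[6.0, 6.0, 6.0]])                     # far outside operational space
print(logp(novel) < threshold)                          # flagged as novel
```

Anything the density model has never seen falls far below the threshold, which is exactly how a pattern outside the learned operational space gets flagged.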

Abstract:

This work sets out to evaluate the potential benefits and pitfalls in using a priori information to help solve the Magnetoencephalographic (MEG) inverse problem. In chapter one the forward problem in MEG is introduced, together with a scheme that demonstrates how a priori information can be incorporated into the inverse problem. Chapter two contains a literature review of techniques currently used to solve the inverse problem. Emphasis is put on the kind of a priori information that is used by each of these techniques and the ease with which additional constraints can be applied. The formalism of the FOCUSS algorithm is shown to allow for the incorporation of a priori information in an insightful and straightforward manner. Chapter three describes how anatomical constraints, in the form of a realistically shaped source space, can be extracted from a subject's Magnetic Resonance Image (MRI). The use of such constraints relies on accurate co-registration of the MEG and MRI co-ordinate systems. Variations of the two main co-registration approaches, based on fiducial markers or on surface matching, are described, and the accuracy and robustness of a surface matching algorithm are evaluated. Figures of merit introduced in chapter four are shown to give insight into the limitations of a typical measurement set-up and the potential value of a priori information. It is shown in chapter five that constrained dipole fitting and FOCUSS outperform unconstrained dipole fitting when data with low SNR are used; however, errors in the constraints can reduce this advantage. Finally, it is demonstrated in chapter six that the results of different localisation techniques give corroborative evidence about the location and activation sequence of the human visual cortical areas underlying the first 125 ms of the visual magnetic evoked response recorded with a whole-head neuromagnetometer.
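The FOCUSS recursion referred to above is compact enough to sketch: for an underdetermined system L x = b, each iterate reweights the minimum-norm solution by the previous estimate, W_k = diag(x_{k-1}), x_k = W_k (L W_k)^+ b, which drives the solution towards a sparse (focal) source distribution. A priori information enters by folding extra weights into W. This is a generic textbook form of the algorithm, not the thesis's MEG-specific implementation.

```python
import numpy as np

def focuss(L, b, n_iter=30, eps=1e-10):
    """Basic FOCUSS recursion for an underdetermined system L x = b.
    Reweighting by the previous estimate concentrates the solution on a
    few components; a priori (e.g. anatomical) weights could simply be
    multiplied into W."""
    x = np.ones(L.shape[1])                     # uninformative start
    for _ in range(n_iter):
        W = np.diag(x)
        x = W @ np.linalg.pinv(L @ W) @ b       # x_k = W (L W)^+ b
        x[np.abs(x) < eps] = 0.0                # prune vanished components
    return x
```

Starting from a dense minimum-norm estimate, the iterates converge to a basic (at most m-nonzero) solution that still satisfies the measurements.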

Abstract:

This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation for it was the lack of viable approaches to analysing High Throughput Screening datasets, which may include thousands of high-dimensional data points. High Throughput Screening (HTS) is an important tool in the pharmaceutical industry for discovering leads which can be optimised and further developed into candidate drugs. Since the development of new robotic technologies, the ability to test the activities of compounds has increased considerably in recent years. Traditional methods, looking at tables and graphical plots to analyse relationships between measured activities and the structure of compounds, are not feasible when facing a large HTS dataset. Instead, data visualisation provides a method for analysing such large datasets, especially those with high dimensions. So far, a few visualisation techniques for drug design have been developed, but most of them can only cope with a few properties of compounds at a time. We believe that a latent trait model (LTM) with a non-linear mapping from the latent space to the data space is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can infer the distribution of the data from magnification factor and curvature plots. Rather than obtaining the useful information from a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found.
The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive, top-down fashion: the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail, and at each stage of hierarchical LTM construction the EM algorithm alternates between the E-step and M-step. Another problem that can occur when visualising a large data set is that there may be significant overlaps between data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which helps the user to decide the optimal structure of the model. In this thesis we also demonstrate the applicability of the hierarchy of latent trait models in the field of document data mining.

Abstract:

We report a novel real-time homodyne coherent receiver based on a DPSK optical-electrical-optical (OEO) regenerator, used to extract a carrier from carrier-less phase-modulated signals via feed-forward modulation stripping. The performance of this non-DSP-based coherent receiver was evaluated for 10.66 Gbit/s BPSK signals. Self-homodyne coherent detection and homodyne detection with an injection-locked local oscillator (LO) laser were demonstrated. The performance was evaluated by measuring the electrical signal-to-noise ratio (SNR) and recording the eye diagrams. Using injection-locking for the LO improves the performance and enables homodyne detection with optical injection-locking to operate with carrier-less BPSK signals without the need for polarization-multiplexed pilot tones.
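The principle behind modulation stripping for BPSK is that squaring the field doubles the phase, so the 0/pi data phase becomes 0/2pi and vanishes, leaving a pure tone at twice the carrier from which a carrier can be recovered. The toy sampled-signal demonstration below makes this visible in a spectrum; the paper does the stripping optically in an OEO regenerator, not in DSP, and the one-symbol-per-sample model is an illustrative simplification.

```python
import numpy as np

# BPSK at carrier fc with random 0/pi data phase (toy sampled model)
fs, fc, n = 8000.0, 250.0, 4096
t = np.arange(n) / fs
rng = np.random.default_rng(3)
bits = rng.integers(0, 2, size=n)                   # one symbol per sample (toy)
s = np.cos(2 * np.pi * fc * t + np.pi * bits)       # phase 0 or pi

# squaring strips the modulation: cos^2 = 0.5 + 0.5*cos(2*w*t + 2*pi*bits),
# and 2*pi*bits is a multiple of 2*pi, i.e. no data phase remains
spec = np.abs(np.fft.rfft(s ** 2))
freqs = np.fft.rfftfreq(n, 1 / fs)
peak = freqs[np.argmax(spec[1:]) + 1]               # skip the DC term
print(peak)                                          # tone at 2*fc = 500 Hz
```

The dominant non-DC spectral line sits exactly at 2*fc regardless of the bit pattern, which is what a carrier-recovery stage can lock to.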

Abstract:

In this paper, a congestion control mechanism is presented for multiservice wireless OFDMA networks. The revenue rate and the user SNRs are used to partition the bandwidth in accordance with a complete partitioning structure. Moreover, through the use of our scheme the QoS of ongoing connections can be satisfied. Results show that the revenue rate plays an important role in prioritizing the different services. © 2013 Springer Science+Business Media New York.
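A complete partitioning structure dedicates a fixed slice of the bandwidth to each service class, so classes never compete for the same channels. The sketch below allocates channels proportionally to each class's revenue rate with largest-remainder rounding; this is a hypothetical simplification of the paper's scheme, which also factors in the user SNRs.

```python
def partition_bandwidth(total_channels, revenue_rates):
    """Complete-partitioning sketch: each service class gets a dedicated
    share of channels proportional to its revenue rate, with leftover
    channels handed to the largest fractional remainders."""
    total_rate = sum(revenue_rates)
    shares = [total_channels * r / total_rate for r in revenue_rates]
    alloc = [int(s) for s in shares]
    leftovers = sorted(range(len(shares)),
                       key=lambda i: shares[i] - alloc[i], reverse=True)
    for i in leftovers[: total_channels - sum(alloc)]:
        alloc[i] += 1
    return alloc

print(partition_bandwidth(10, [5, 3, 2]))  # [5, 3, 2]
```

Higher-revenue services receive proportionally larger dedicated partitions, which is the prioritisation effect the results refer to.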

Abstract:

Visual field assessment is a core component of glaucoma diagnosis and monitoring, and the Standard Automated Perimetry (SAP) test is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment and has many pitfalls, it remains in constant use for diagnosing visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were not very informative and needed further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when used for objective assessment of the visual field in glaucoma patients, compared to the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field (HFA) 24-2 tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol, the Hemifield Sector Analysis (HSA) protocol, and analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference in mean signal-to-noise ratio (SNR) between the three groups (ANOVA p<0.001 with a 95% CI).
The differences between superior and inferior hemifields were statistically significant in all 11/11 sectors in the glaucoma patient group (t-test p<0.001), partially significant, in 5/11 sectors, in the glaucoma suspect group (t-test p<0.01), and not significant between most sectors in the normal group (only 1/11 was significant, t-test p<0.9). The sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, while for glaucoma suspects they were 89% and 79%. DISCUSSION: The results showed that the new analysis protocol was able to confirm already existing field defects detected by standard HFA and was able to differentiate between the three study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.

Abstract:

Objective: The purpose of this study was to examine the effectiveness of a new analysis method of mfVEP objective perimetry in the early detection of glaucomatous visual field defects, compared to the gold standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey Field Analyzer 24-2 visual field tests and a single mfVEP test in one session. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (analysis of variance, P<0.001 with a 95% confidence interval: 2.82, 2.89 for the normal group; 2.25, 2.29 for the glaucoma suspect group; 1.67, 1.73 for the glaucoma group). The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma patient group (t-test P<0.001) and in 5/11 pairs in the glaucoma suspect group (t-test P<0.01), while in the normal group only 1/11 pairs was statistically significant (t-test P<0.9). The sensitivity and specificity of the hemifield sector analysis protocol in detecting glaucoma were 97% and 86% respectively, and 89% and 79% in glaucoma suspects. These results showed that the new analysis protocol was able to confirm existing visual field defects detected by standard perimetry, to differentiate between the three study groups with a clear distinction between normal patients and those with suspected glaucoma, and to detect early visual field changes not detected by standard perimetry.
In addition, the distinction between normal and glaucoma patients was especially clear and significant using this analysis. Conclusion: The new hemifield sector analysis protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. This protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous visual field loss. The intersector analysis protocol can detect early field changes not detected by the standard Humphrey Field Analyzer test. © 2013 Mousa et al, publisher and licensee Dove Medical Press Ltd.
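For readers less familiar with the reported diagnostic metrics, sensitivity and specificity are simple ratios of the confusion-matrix counts. The counts below are illustrative only, chosen to approximately reproduce the 97%/86% figures for the glaucoma group (36 glaucoma eyes, 38 control eyes); they are not taken from the paper.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of diseased eyes flagged.
    Specificity = TN / (TN + FP): fraction of healthy eyes passed."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts: 35/36 glaucoma eyes flagged, 33/38 controls passed
sens, spec = sensitivity_specificity(tp=35, fn=1, tn=33, fp=5)
print(round(sens, 2), round(spec, 2))  # 0.97 0.87
```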