123 results for Gaussian extended cubature formula
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
It is shown that the Mel'nikov-Meshkov formalism for bridging the very low damping (VLD) and intermediate-to-high damping (IHD) Kramers escape rates as a function of the dissipation parameter for mechanical particles may be extended to the rotational Brownian motion of the magnetic dipole moments of single-domain ferromagnetic particles in nonaxially symmetric magnetocrystalline anisotropy potentials, so that both damping regimes occur. The procedure is illustrated by considering the particular nonaxially symmetric problem of superparamagnetic particles possessing uniaxial anisotropy subject to an external uniform field applied at an angle to the easy axis of magnetization. Here the Mel'nikov-Meshkov treatment is found to be in good agreement with an exact calculation of the smallest eigenvalue of Brown's Fokker-Planck equation, provided the external field is large enough to ensure a significant departure from axial symmetry, so that the VLD and IHD formulas for the escape rates of magnetic dipoles in nonaxially symmetric potentials are valid.
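The VLD-IHD bridging described here is conventionally written as the IHD escape rate multiplied by a depopulation factor. A standard form of the Mel'nikov-Meshkov turnover formula (quoted from the general Kramers-turnover literature, not reproduced from this abstract; here δ denotes the dimensionless dissipation parameter times the barrier action) is:

```latex
\Gamma \approx A(\delta)\,\Gamma_{\mathrm{IHD}},
\qquad
A(\delta) = \exp\!\left[\frac{1}{\pi}\int_{0}^{\infty}
\frac{\ln\!\bigl(1 - e^{-\delta(\lambda^{2}+1/4)}\bigr)}{\lambda^{2}+1/4}\,
\mathrm{d}\lambda\right],
```

so that A(δ) → 1 in the IHD limit (large δ) and A(δ) ∝ δ in the VLD limit, interpolating smoothly between the two damping regimes.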
Abstract:
This paper proposes a novel image denoising technique based on the normal inverse Gaussian (NIG) density model, using an extended non-negative sparse coding (NNSC) algorithm that we have previously proposed. This algorithm converges to feature basis vectors that exhibit locality and orientation in the spatial and frequency domains. Here, we demonstrate that the NIG density provides a very good fit to the non-negative sparse data. In the denoising process, noise can be reduced successfully by exploiting an NIG-based maximum a posteriori (MAP) estimator of an image corrupted by additive Gaussian noise. This shrinkage technique, also referred to as the NNSC shrinkage technique, is self-adaptive to the statistical properties of the image data. The denoising method is evaluated using the normalized signal-to-noise ratio (SNR). Experimental results show that the NNSC shrinkage approach is indeed efficient and effective in denoising. In addition, we compare the effectiveness of the NNSC shrinkage method with standard sparse coding shrinkage, wavelet-based shrinkage and the Wiener filter. The simulation results show that our method outperforms the three denoising approaches mentioned above.
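As a minimal sketch of the MAP shrinkage principle this abstract relies on (with a Laplace prior standing in for the NIG density, since the NIG shrinkage rule has no simple closed form; the function name and parameters are hypothetical illustrations, not the paper's method):

```python
import numpy as np

def map_shrink(y, sigma2, lam):
    """MAP estimate of a sparse coefficient x from noisy y = x + n,
    with n ~ N(0, sigma2) and a Laplace prior p(x) ~ exp(-lam*|x|).
    Minimising (y - x)**2 / (2*sigma2) + lam*|x| gives soft thresholding."""
    t = lam * sigma2  # shrinkage threshold set by noise level and prior
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

# small coefficients (mostly noise) are zeroed; large ones are kept, shrunk by t
y = np.array([-3.0, -0.2, 0.05, 0.4, 2.5])
print(map_shrink(y, sigma2=0.25, lam=2.0))  # → [-2.5  0.  0.  0.  2. ]
```

An NIG prior replaces the fixed threshold with a data-adaptive shrinkage curve, which is what makes the published method self-adaptive to image statistics.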
Abstract:
We show that homodyne measurements can be used to demonstrate violations of Bell's inequality with Gaussian states, when the local rotations used for these types of tests are implemented using nonlinear unitary operations. We reveal that the local structure of the Gaussian state under scrutiny is crucial to the performance of the test. The effects of finite detection efficiency are thoroughly studied and shown to only mildly affect the observation of Bell violations. We speculate that our approach may be extended to other applications, such as entanglement distillation, where local operations are necessary elements besides quantum entanglement.
Abstract:
Objective To evaluate participants' perceptions of the impact on them of an additional six months' training beyond the standard 12 month general practice vocational training scheme. Design Qualitative study using focus groups. Setting General practice vocational training in Northern Ireland. Participants 13 general practitioner registrars, six of whom participated in the additional six months' training, and four trainers involved in the additional six months' training. Main outcome measures Participants' views about their experiences in 18 month and 12 month courses. Results Participants reported that the 12 month course was generally positive but was too pressurised and focused on examinations, and also that it had a negative impact on self care. The nature of the learning and assessment was reported to have left participants feeling averse to further continuing education and lacking in confidence. In contrast, the extended six month component was reported to have restimulated learning by focusing more on patient care and promoting self directed learning. It developed confidence, promoted teamwork, and gave experience of two practice contexts, and was reported as valuable by both ex-registrars and trainers. However, both the 12 and 18 month courses left participants feeling underprepared for practice management and self care. Conclusions 12 months' training in general practice does not provide doctors with the necessary competencies and confidence to enter independent practice. The extended period was reported to promote greater professional development, critical evaluation skills, and orientation to lifelong learning but does not fill all the gaps.
Abstract:
This paper theoretically analyses the recently proposed "Extended Partial Least Squares" (EPLS) algorithm. After pointing out some conceptual deficiencies, a revised algorithm is introduced that covers the middle ground between Partial Least Squares and Principal Component Analysis. It maximises a covariance criterion between a cause and an effect variable set (partial least squares) and allows a complete reconstruction of the recorded data (principal component analysis). The new and conceptually simpler EPLS algorithm has been applied successfully in detecting and diagnosing various fault conditions, where the original EPLS algorithm offered only fault detection.
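A minimal numerical sketch of the middle ground the revised algorithm occupies (an illustrative reconstruction, not the published EPLS: one covariance-maximising PLS direction between cause and effect blocks, followed by PCA of the residual so the cause data can be reconstructed completely; all variable names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))                                 # cause variables
Y = X @ rng.standard_normal((4, 2)) + 0.1 * rng.standard_normal((100, 2))  # effects

# PLS step: weight w maximising cov(X w, Y) is the dominant
# left singular vector of the cross-covariance matrix X^T Y
w = np.linalg.svd(X.T @ Y)[0][:, 0]
t = X @ w                        # score carrying the cause-effect covariance
X_hat = np.outer(t, w)           # rank-1 reconstruction from the PLS score

# PCA step on the residual: the remaining principal components
# make the reconstruction of X complete, as in PCA
R = X - X_hat
U, s, Vt = np.linalg.svd(R, full_matrices=False)
X_full = X_hat + U @ np.diag(s) @ Vt

print(np.allclose(X_full, X))    # complete reconstruction of the recorded data
```

The design point is that a pure PLS model discards the residual R (losing reconstruction), while pure PCA ignores Y (losing the cause-effect criterion); keeping both terms gives the hybrid behaviour the paper describes.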
Abstract:
High-precision correlation of palaeoclimatic and palaeoenvironmental records is crucial for testing hypotheses of synchronous change. Although radiocarbon is the traditional method for dating late Quaternary sedimentary sequences, particularly during the last glacial–interglacial transition (LGIT; 15–9 ka), there are inherent problems with the method, particularly during periods of climate change, which are often accompanied by major perturbations in atmospheric radiocarbon content. An alternative method is the use of tephras that act as time-parallel marker horizons. Within Europe, numerous volcanic centres are known to have erupted during the LGIT, providing considerable potential for high-precision correlation independent of past radiocarbon fluctuations. Here we report the first identification of the Vedde Ash and Askja Tephra in Ireland, significantly extending the known provenance of these events. We have also identified two new horizons (the Roddans Port Tephras A and B) and tentatively recognise an additional horizon from Vallensgård Mose (Denmark) that provide crucial additional chronological control for the LGIT. Two phases of the Laacher See Tephra (LST) are reported, the lower Laacher See Tephra (LLST) and probably the C2 phase of the Middle Laacher See Tephra (MLST-C2), indicating a more northeasterly distribution of this fan than previously reported.
Abstract:
Measures of entanglement, fidelity, and purity are basic yardsticks in quantum-information processing. We propose how to implement these measures using linear devices and homodyne detectors for continuous-variable Gaussian states. In particular, the test of entanglement becomes simple with some prior knowledge that is relevant to current experiments.
Abstract:
We investigate the nonclassicality of a photon-subtracted Gaussian field, which was produced in a recent experiment, using the negativity of the Wigner function and the nonexistence of a well-behaved positive P function. We obtain the condition for negativity of the Wigner function in the case of a mixed Gaussian incoming field, threshold photodetection and inefficient homodyne measurement. We show how similar the photon-subtracted state is to a superposition of coherent states.
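The Wigner negativity at the phase-space origin can be checked numerically for the ideal case of a single-photon-subtracted squeezed vacuum (an illustrative sketch under perfect detection, unlike the mixed, inefficient scenario treated in the paper), using the parity relation W(0,0) = (2/π)⟨(−1)^n⟩:

```python
import numpy as np
from math import factorial, pi, tanh, cosh, sqrt

r, N = 0.5, 60                      # squeezing parameter, Fock-space cutoff
c = np.zeros(N)                     # squeezed vacuum: only even Fock amplitudes
for n in range(N // 2):
    c[2 * n] = ((-tanh(r)) ** n * sqrt(factorial(2 * n))
                / (2 ** n * factorial(n) * sqrt(cosh(r))))

# subtract one photon: a|n> = sqrt(n)|n-1>, then renormalise
d = np.array([sqrt(n + 1) * c[n + 1] for n in range(N - 1)])
d /= np.linalg.norm(d)

# Wigner function at the origin from the mean parity of the state
W0 = (2 / pi) * sum((-1) ** n * d[n] ** 2 for n in range(N - 1))
print(W0)  # → -2/pi ≈ -0.6366: maximally negative, a clear nonclassical signature
```

Subtraction leaves only odd Fock components, so the parity is exactly −1; mixedness and detection inefficiency wash this value towards zero, which is why the paper's negativity condition is nontrivial.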
Abstract:
Some non-classical properties, such as squeezing, sub-Poissonian photon statistics or oscillations in photon-number distributions, may survive longer in a phase-sensitive environment than in a phase-insensitive one. We examine whether entanglement, which is an inter-mode non-classical feature, can also survive longer in a phase-sensitive environment. In contrast to the single-mode case, we find that making the environment phase-sensitive does not help prolong the inter-mode non-classical feature, i.e. entanglement.
Abstract:
Course scheduling consists of assigning lecture events to a limited set of specific timeslots and rooms. The objective is to satisfy as many soft constraints as possible while maintaining a feasible timetable. The most successful techniques to date require a compute-intensive examination of the solution neighbourhood to direct searches towards an optimum solution. Although they may require fewer neighbourhood moves than more exhaustive techniques to obtain comparable results, they can take considerably longer to achieve success. This paper introduces an extended version of the Great Deluge Algorithm for the course timetabling problem which, while avoiding the problem of becoming trapped in local optima, uses simple neighbourhood search heuristics to obtain solutions in a relatively short amount of time. The paper presents results on a standard set of benchmark datasets, beating over half of the currently published best results, in some cases by up to 60%.
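The mechanism described above can be sketched on a toy problem (an illustrative sketch only, not the authors' implementation; the re-flooding rule and all names here are hypothetical stand-ins for the published extension):

```python
import random

def extended_great_deluge(cost, neighbour, x0, iters=2000, seed=1):
    """Great Deluge with a re-flooding extension: accept a candidate if it
    improves on the current solution OR lies below a falling water level B;
    when the search stagnates, raise B back above the best cost so the
    search can escape local optima without heavy neighbourhood analysis."""
    rng = random.Random(seed)
    x = best = x0
    B = cost(x0)                        # initial water level
    decay = B / iters                   # linear decay of the level
    stagnant = 0
    for _ in range(iters):
        y = neighbour(x, rng)
        if cost(y) <= cost(x) or cost(y) <= B:
            x = y
            if cost(x) < cost(best):
                best, stagnant = x, 0
        stagnant += 1
        B -= decay
        if stagnant > iters // 20:      # no improvement for a while:
            B = cost(best) + 1.0        # re-flood above the best cost
            stagnant = 0
    return best

# toy continuous problem: minimise f with small random moves
f = lambda x: (x - 3.0) ** 2
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
best = extended_great_deluge(f, step, x0=10.0)
print(round(best, 2))  # converges close to the minimum at 3.0
```

The appeal of the method is exactly what the abstract claims: each iteration needs only one neighbour evaluation and one comparison against B, so it is far cheaper per move than techniques that examine the whole neighbourhood.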