924 results for probabilistic Hough transform
Abstract:
2000 Mathematics Subject Classification: Primary 60J45, 60J50, 35Cxx; Secondary 31Cxx.
Abstract:
The focus of this thesis is the extension of topographic visualisation mappings to allow for the incorporation of uncertainty. Few visualisation algorithms in the literature are capable of mapping uncertain data, and fewer still are able to represent observation uncertainties in visualisations. Modifications are therefore made to NeuroScale, Locally Linear Embedding, Isomap and Laplacian Eigenmaps to incorporate uncertainty in the observation and visualisation spaces. The proposed mappings are called Normally-distributed NeuroScale (N-NS), T-distributed NeuroScale (T-NS), Probabilistic LLE (PLLE), Probabilistic Isomap (PIso) and Probabilistic Weighted Neighbourhood Mapping (PWNM). These algorithms generate a probabilistic visualisation space in which each latent visualised point is transformed to a multivariate Gaussian or T-distribution using a feed-forward RBF network. Two types of uncertainty are then characterised, depending on the data and the mapping procedure: data-dependent uncertainty is the inherent observation uncertainty, whereas mapping uncertainty is defined by the Fisher information of a visualised distribution. The latter indicates how well the data have been interpolated, offering a level of ‘surprise’ for each observation. These new probabilistic mappings are tested on three datasets of vectorial observations and three datasets of real-world time series observations for anomaly detection. In order to visualise the time series data, a method for analysing observed signals and noise distributions, Residual Modelling, is introduced. The performance of the new algorithms on the tested datasets is compared qualitatively with the latent space generated by the Gaussian Process Latent Variable Model (GPLVM), and a quantitative comparison using existing evaluation measures from the literature allows the performance of each mapping function to be assessed. Finally, the mapping uncertainty measure is combined with NeuroScale to build a deep learning classifier, the Cascading RBF. This new structure is tested on the MNIST dataset, achieving world-record performance whilst avoiding the flaws seen in other deep learning machines.
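As a rough sketch of the machinery described above (the helper names, sizes and noise level are our illustrative assumptions, not the thesis code), a feed-forward RBF projection producing a Gaussian per visualised point, with the Fisher information acting as the ‘surprise’ measure, might look like:

```python
import numpy as np

def rbf_features(X, centres, width):
    # Gaussian RBF activations: squared distance of each point to each centre
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                         # toy 5-D observations
centres = X[rng.choice(100, size=10, replace=False)]  # 10 RBF centres
Phi = rbf_features(X, centres, width=1.0)

W = 0.1 * rng.normal(size=(10, 2))  # output weights (trained in practice)
latent_mean = Phi @ W               # 2-D visualised means

# Treating each visualised point as N(mu, sigma2 * I), the Fisher
# information about mu is I / sigma2: a large predictive variance means
# low information, flagging a poorly interpolated, 'surprising' point.
sigma2 = 0.5
fisher_info = np.full(len(X), 1.0 / sigma2)
```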
Abstract:
Cloud computing is a new technological paradigm offering computing infrastructure, software and platforms as a pay-as-you-go, subscription-based service. Many potential customers of cloud services require essential cost assessments to be undertaken before transitioning to the cloud. Current assessment techniques are imprecise, as they rely on simplified specifications of resource requirements that fail to account for probabilistic variations in usage. In this paper, we address these problems and propose a new probabilistic pattern modelling (PPM) approach to cloud costing and resource usage verification. Our approach is based on a concise expression of probabilistic resource usage patterns translated to Markov decision processes (MDPs). Key costing and usage queries are identified, expressed in a probabilistic variant of temporal logic, and calculated to a high degree of precision using quantitative verification techniques. The PPM cost assessment approach has been implemented as a Java library and validated with a case study and scalability experiments.
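The core quantity behind such verification can be sketched in a few lines: a toy MDP over hypothetical usage states, with value iteration computing the minimal expected cumulative cost to reach completion, the kind of number a PCTL-style reward query such as Rmin=? [F "done"] yields. The states, probabilities and costs below are invented for illustration, and this Python sketch stands in for, rather than reproduces, the paper's Java implementation.

```python
import numpy as np

# Hypothetical 3-state usage model: 0 = idle, 1 = busy, 2 = done.
# P[s, a, s'] are transition probabilities for actions 0 = keep, 1 = scale-up;
# c[s, a] are per-step costs (e.g. billed resource units).
P = np.array([
    [[0.6, 0.4, 0.0], [0.1, 0.7, 0.2]],   # from idle
    [[0.3, 0.5, 0.2], [0.0, 0.6, 0.4]],   # from busy
    [[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]],   # done is absorbing
])
c = np.array([[1.0, 3.0], [2.0, 5.0], [0.0, 0.0]])

# Value iteration for the minimal expected cumulative cost until "done".
V = np.zeros(3)
for _ in range(10_000):
    Q = c + P @ V            # Q[s, a] = c[s, a] + sum_s' P[s, a, s'] * V[s']
    V_new = Q.min(axis=1)
    V_new[2] = 0.0           # no further cost once done
    if np.abs(V_new - V).max() < 1e-12:
        break
    V = V_new

print(V)  # minimal expected costs from idle and busy
```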
Abstract:
The integrability of the nonlinear Schrödinger equation (NLSE) by the inverse scattering transform, shown in a seminal work [1], gave an interesting opportunity to treat the corresponding nonlinear channel similarly to a linear one by using the nonlinear Fourier transform. The integrability of the NLSE underlies the old idea of eigenvalue communications [2], which was resurrected in recent works [3-7]. In [6, 7] a new method for coherent optical transmission employing the continuous nonlinear spectral data, nonlinear inverse synthesis, was introduced. It assumes the modulation and detection of data directly using the continuous part of the nonlinear spectrum associated with an integrable transmission channel (the NLSE in the case considered). Although such a transmission method is inherently free from nonlinear impairments, the noisy signal corruptions arising from amplifier spontaneous emission inevitably degrade the optical system performance. We study the properties of the noise-corrupted channel model in the nonlinear spectral domain attributed to the NLSE. We derive the general stochastic equations governing the signal evolution inside the nonlinear spectral domain and elucidate the properties of the emerging nonlinear spectral noise using well-established methods of perturbation theory based on the inverse scattering transform [8]. It is shown that in the presence of small noise the communication channel in the nonlinear domain is an additive Gaussian channel with memory and a signal-dependent correlation matrix. We demonstrate that the effective spectral noise acquires "colouring": its autocorrelation function becomes slowly decaying and non-diagonal as a function of "frequencies", and the noise loses its circular symmetry, becoming elliptically polarized. We then derive a lower bound for the spectral efficiency of such a channel. Our main result is that by using the nonlinear spectral techniques one can significantly increase the achievable spectral efficiency compared to currently available methods [9]. REFERENCES 1. Zakharov, V. E. and A. B. Shabat, Sov. Phys. JETP, Vol. 34, 62-69, 1972. 2. Hasegawa, A. and T. Nyu, J. Lightwave Technol., Vol. 11, 395-399, 1993. 3. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4312-4328, 2014. 4. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4329-4345, 2014. 5. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4346-4369, 2014. 6. Prilepsky, J. E., S. A. Derevyanko, K. J. Blow, I. Gabitov, and S. K. Turitsyn, Phys. Rev. Lett., Vol. 113, 013901, 2014. 7. Le, S. T., J. E. Prilepsky, and S. K. Turitsyn, Opt. Express, Vol. 22, 26720-26741, 2014. 8. Kaup, D. J. and A. C. Newell, Proc. R. Soc. Lond. A, Vol. 361, 413-446, 1978. 9. Essiambre, R.-J., G. Kramer, P. J. Winzer, G. J. Foschini, and B. Goebel, J. Lightwave Technol., Vol. 28, 662-701, 2010.
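For reference, the channel model under discussion is the noise-perturbed NLSE; in one standard normalised form (our notation, not necessarily the authors'),

$$ i\,\frac{\partial q}{\partial z} + \frac{1}{2}\,\frac{\partial^2 q}{\partial t^2} + |q|^2 q = i\,\eta(t, z), $$

where $q(t,z)$ is the complex field envelope, $z$ the propagation distance, $t$ the retarded time, and $\eta$ an additive white Gaussian noise term modelling amplifier spontaneous emission; setting $\eta = 0$ recovers the integrable equation solved in [1].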
Abstract:
The nonlinear Fourier transform, also known as eigenvalue communications, is a transmission and signal processing technique that makes positive use of the nonlinear properties of fibre channels. I will discuss recent progress in this field.
Abstract:
In this work we introduce the periodic nonlinear Fourier transform (PNFT) and propose a proof-of-concept communication system based on it, using a simple waveform with a known nonlinear spectrum (NS). We study the performance of the transmission system based on PNFT processing, addressing the bit-error rate (BER) as a function of propagation distance, and show the benefits of this approach. By analysing our simulation results for the system with lumped amplification, we demonstrate the decent potential of the new processing method.
Abstract:
The traditional use of global and centralised control methods fails for large, complex, noisy and highly connected systems, which typify many real-world industrial and commercial systems. This paper provides an efficient bottom-up design of distributed control in which many simple components communicate and cooperate to achieve a joint system goal. Each component acts individually so as to maximise personal utility whilst obtaining probabilistic information on the global system merely through local message-passing. This yields an implicitly scalable and collective control strategy for complex dynamical systems, without the problems of global centralised control. Robustness is addressed by employing a fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively and opens a systematic, rich way to information sharing. This paper opens the foreseen direction and inspects the proposed design on a linearised version of a coupled map lattice with spatiotemporal chaos. A version close to linear-quadratic design gives an initial insight into possible behaviours of such networks.
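To make the testbed concrete, here is a minimal sketch of a standard diffusively coupled map lattice exhibiting spatiotemporal chaos (the paper studies a linearised version; the local map, coupling strength and lattice size below are our assumptions):

```python
import numpy as np

def logistic(x, a=1.7):
    # Chaotic local dynamics at each lattice site
    return 1.0 - a * x ** 2

def cml_step(x, eps=0.3):
    # Diffusive coupling to nearest neighbours on a ring
    fx = logistic(x)
    return (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=64)   # 64 coupled sites
for _ in range(200):
    x = cml_step(x)                   # iterate the lattice dynamics
```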
Abstract:
Reliability of power converters is of crucial importance in switched reluctance motor drives used for safety-critical applications. Open-circuit faults in power converters cause the motor to run in unbalanced states and, if left untreated, lead to damage to the motor and power modules, and can even cause a catastrophic failure of the whole drive system. This study focuses on using a single current sensor to detect open-circuit faults accurately. An asymmetrical half-bridge converter is considered, and single-phase-open and two-phase-open faults are analysed. Three different bus positions are defined. On the basis of a fast Fourier transform algorithm with Blackman window interpolation, the bus current spectra before and after open-circuit faults are analysed in detail. The fault characteristics are extracted accurately by normalising the phase fundamental frequency component and the double phase fundamental frequency component, and the fault characteristics of the three bus detection schemes are compared. The open-circuit faults can be located by relating the bus current to the rotor position. The effectiveness of the proposed diagnosis method is validated by simulation results and experimental tests.
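A hedged sketch of the spectral step on a toy bus-current signal (the sampling rate, fundamental frequency and amplitudes are invented; the paper's Blackman-window interpolation details are not reproduced):

```python
import numpy as np

fs = 10_000                       # sampling rate, Hz (assumed)
f1 = 50.0                         # phase fundamental frequency, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)

# Toy bus current with DC, fundamental and double-fundamental components,
# standing in for the pre-/post-fault spectra analysed in the paper.
i_bus = 1.0 + 0.4 * np.sin(2 * np.pi * f1 * t) + 0.2 * np.sin(2 * np.pi * 2 * f1 * t)

w = np.blackman(len(i_bus))                   # window to curb spectral leakage
spec = np.abs(np.fft.rfft(i_bus * w))
freqs = np.fft.rfftfreq(len(i_bus), 1 / fs)

h0 = spec[0]                                  # DC bin
h1 = spec[np.argmin(np.abs(freqs - f1))]      # fundamental bin
h2 = spec[np.argmin(np.abs(freqs - 2 * f1))]  # double-fundamental bin
print(h1 / h0, h2 / h0)                       # normalised fault signatures
```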
Abstract:
Rationing occurs when the demand for a certain good exceeds its supply. In such situations a rationing method has to be specified in order to determine the allocation of the scarce good to the agents. Moulin (1999) introduced the notion of probabilistic rationing methods for the discrete framework. In this paper we establish a link between classical and probabilistic rationing methods. In particular, we assign to any given classical rationing method a probabilistic rationing method with minimal variance among those probabilistic rationing methods that result in the same expected distributions as the given classical rationing method.
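A toy instance of the minimal-variance construction (our illustration, not the paper's): let a supply of 2 indivisible units be rationed between two agents who each demand 2, and suppose the classical method awards the expected distribution $(1,1)$. Both the degenerate lottery on the allocation $(1,1)$ and the lottery giving $(2,0)$ or $(0,2)$ with probability $\tfrac{1}{2}$ each have expectation $(1,1)$, but the per-agent variances are $0$ and $1$ respectively, so the minimal-variance probabilistic method selects the degenerate lottery.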
Abstract:
In everyday life we encounter countless situations in which the demand for a good exceeds the available supply: compensation claims, the claims of a bankrupt firm's creditors, the queue of patients waiting for an organ transplant, and so on. In such situations the scarce quantity is distributed among the agents according to some procedure. It is customary to distinguish deterministic from stochastic allocation procedures, although in many cases only deterministic procedures are applied. For reasons of fairness, however, stochastic allocation procedures are also used, as the United States Army did, for example, when withdrawing its soldiers stationed abroad after the end of the Second World War and when selecting draftees during the Vietnam War. / === / We investigated the minimal variance methods introduced in Tasnádi [6] with respect to seven popular axioms. We proved that if a deterministic rationing method satisfies demand monotonicity, resource monotonicity, equal treatment of equals and self-duality, then the minimal variance method associated with it also satisfies these four axioms. Furthermore, we found that consistency, lower composition and upper composition of a deterministic rationing method do not imply consistency, lower composition and upper composition of an associated minimal variance method.
Abstract:
The subject of this dissertation is the nature of the environmental transformations, both symbolic and physical, that took place in Colombia between 1850 and 1930. This period begins with the attempt by the Colombian elite to leave behind colonial ties, overcome economic disorganization, and link Colombia to the international market. These efforts were part of a general project to “civilize” this tropical country. The period closes with the transition toward an industrialization and urbanization process led by the Colombian state during the 1930s. Frequently, environmental studies as an academic field are dominated by biological concerns; however, most environmental thinking accepts the field's interdisciplinary nature, under which not only spatial but also symbolic concerns are key to understanding environmental transformations. This study finds that despite several attempts to transform the Colombian landscape physically, most of the substantive changes were localized and circumscribed to the Andean region; other changes were mainly symbolic. This dissertation thus uses the Amazon as one of several regions that did not experience significant changes in the forest canopy. While highlanders originally dreamed of the Amazon as an untapped El Dorado, their failed attempts to exploit the region caused them to imagine it as a nightmarish “green hell”. The dissertation concentrates on three pairs of concepts: tropicality/civilization, landscape/territory, and symbolic/material changes. It presents both a general vision of Colombia and case studies of three regions: Cundinamarca and the Cauca Valley are used as comparisons with the Amazon region, which is developed at length. Whereas mainstream Colombian histories have fixated either on the Andean highlands or, in a relegated second place, on the Caribbean region, this dissertation attempts to contribute significantly to the historiography of Colombia by focusing on the largely neglected Amazonian region. To understand imageries of Colombia's landscape, the dissertation relies on travel writings, chorographic descriptions and maps. It also makes use of legal documents and other published primary sources, including literary pieces and memoirs.
Abstract:
Until recently, the use of biometrics was restricted to high-security environments and criminal identification applications, for economic and technological reasons. In recent years, however, biometric authentication has become part of people's daily lives. The large-scale use of biometrics has shown that users within a system may have different degrees of accuracy: some people may have trouble authenticating, while others may be particularly vulnerable to imitation. Recent studies have investigated and identified these types of users, giving them the names of animals: Sheep, Goats, Lambs, Wolves, Doves, Chameleons, Worms and Phantoms. The aim of this study is to evaluate the existence of these user types in a fingerprint database and to propose a new way of investigating them, based on the verification performance between subjects' samples. After introducing some basic concepts of biometrics and fingerprints, we present the biometric menagerie and how to evaluate it.
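As a hedged illustration of this style of analysis (the score model and quartile cut-offs are our assumptions, following the usual Doddington-zoo conventions rather than the paper's exact procedure), menagerie types can be labelled from a subject-by-subject match-score matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
# Hypothetical match scores: scores[i, j] compares a probe of subject i
# against the enrolled template of subject j.
scores = rng.normal(0.3, 0.1, size=(n, n))                 # impostor scores
scores[np.diag_indices(n)] = rng.normal(0.8, 0.1, size=n)  # genuine scores

genuine = np.diag(scores).copy()
imp_as_probe = (scores.sum(axis=1) - genuine) / (n - 1)     # i attacking others
imp_as_template = (scores.sum(axis=0) - genuine) / (n - 1)  # others attacking i

goats = np.where(genuine <= np.quantile(genuine, 0.25))[0]                  # hard to match
lambs = np.where(imp_as_template >= np.quantile(imp_as_template, 0.75))[0]  # easy to imitate
wolves = np.where(imp_as_probe >= np.quantile(imp_as_probe, 0.75))[0]       # strong imitators
```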
Abstract:
Formation of hydrates is one of the major flow assurance problems faced by the oil and gas industry. Hydrates tend to form in natural gas pipelines in the presence of water under favourable temperature and pressure conditions, generally low temperatures and correspondingly high pressures. Agglomeration of hydrates can result in blockage of flowlines and equipment, which can be time-consuming to remove from subsea equipment and can cause safety issues. Natural gas pipelines are more susceptible to burst and explosion owing to hydrate plugging. Therefore, a rigorous risk assessment of hydrate formation is required, which assists in preventing hydrate blockage and ensuring equipment integrity. This thesis presents a novel methodology to assess the probability of hydrate formation and a risk-based approach to determine the parameters of winterization schemes to avoid hydrate formation in natural gas pipelines operating in Arctic conditions. It also presents a lab-scale multiphase flow loop to study the effects of geometric and hydrodynamic parameters on hydrate formation and discusses their effects on the multiphase development length of a pipeline. This study therefore contributes substantially to assessing the probability of hydrate formation and to the decision-making process for winterization strategies that prevent hydrate formation in Arctic conditions.
Abstract:
In this study, a multi-model ensemble was implemented and verified, following one of the research priorities of the Subseasonal to Seasonal Prediction Project (S2S). A linear regression was applied to a set of ensemble reforecasts over past dates, produced by the monthly forecasting systems of CNR-ISAC and ECMWF-IFS, each containing one control member and four perturbed members. The variables chosen for the analysis are the geopotential height at 500 hPa, the temperature at 850 hPa and the 2-metre temperature; the spatial grid has a 1° × 1° lat-lon resolution, and the winters from 1990 to 2010 were used. The ERA-Interim reanalyses are used both to fit the regression and to validate the results, by means of non-probabilistic scores such as the root mean square error (RMSE) and the anomaly correlation. Subsequently, Model Output Statistics (MOS) and Direct Model Output (DMO) techniques are applied to the multi-model ensemble to obtain probabilistic forecasts of the weekly mean 2-metre temperature anomalies. The MOS methods used are logistic regression and non-homogeneous Gaussian regression, while the DMO methods are democratic voting and the Tukey plotting position. These techniques are also applied to the individual models, so that comparisons can be made with probabilistic scores such as the ranked probability skill score, the discrete ranked probability skill score and the reliability diagram. Both types of scores show that the multi-model outperforms the individual models. Moreover, the highest values of the probabilistic scores are obtained using a logistic regression on the ensemble mean alone. By applying the regression to datasets of reduced size, we constructed a learning curve showing that increasing the number of dates in the training phase would not produce further improvements.
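As a sketch of the probabilistic scoring used in the comparison (the tercile categories and probabilities are illustrative), the ranked probability score and its skill score against a climatological forecast can be computed as:

```python
import numpy as np

def rps(forecast_probs, obs_category):
    # Ranked probability score over ordered categories: squared distance
    # between forecast and observed cumulative distributions.
    cdf_f = np.cumsum(forecast_probs)
    cdf_o = (np.arange(len(forecast_probs)) >= obs_category).astype(float)
    return np.sum((cdf_f - cdf_o) ** 2)

# Toy tercile forecast (below / near / above normal), observed "above normal"
forecast = np.array([0.2, 0.3, 0.5])
rps_fc = rps(forecast, obs_category=2)
rps_clim = rps(np.array([1/3, 1/3, 1/3]), obs_category=2)  # climatology

rpss = 1.0 - rps_fc / rps_clim   # > 0 means skill over climatology
print(rps_fc, rpss)
```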