958 results for "Probabilistic generalization"


Relevance: 10.00%

Abstract:

A biplot, which is the multivariate generalization of the two-variable scatterplot, can be used to visualize the results of many multivariate techniques, especially those that are based on the singular value decomposition. We consider data sets consisting of continuous-scale measurements, their fuzzy coding and the biplots that visualize them, using a fuzzy version of multiple correspondence analysis. Of special interest is the way quality of fit of the biplot is measured, since it is well-known that regular (i.e., crisp) multiple correspondence analysis seriously under-estimates this measure. We show how the results of fuzzy multiple correspondence analysis can be defuzzified to obtain estimated values of the original data, and prove that this implies an orthogonal decomposition of variance. This permits a measure of fit to be calculated in the familiar form of a percentage of explained variance, which is directly comparable to the corresponding fit measure used in principal component analysis of the original data. The approach is motivated initially by its application to a simulated data set, showing how the fuzzy approach can lead to diagnosing nonlinear relationships, and finally it is applied to a real set of meteorological data.
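For intuition, the fuzzy coding of a continuous variable mentioned above can be sketched with triangular membership functions; the three hinge points below are arbitrary illustrative choices, not the coding scheme of the paper:

```python
def fuzzy_code(x, low, mid, high):
    """Triangular fuzzy coding of a value x into three categories
    whose memberships sum to 1 (low/mid/high are the hinge points)."""
    if x <= low:
        return (1.0, 0.0, 0.0)
    if x >= high:
        return (0.0, 0.0, 1.0)
    if x <= mid:
        t = (x - low) / (mid - low)
        return (1.0 - t, t, 0.0)
    t = (x - mid) / (high - mid)
    return (0.0, 1.0 - t, t)

def defuzzify(memberships, low, mid, high):
    """Reverse the coding: membership-weighted average of the hinge
    points, recovering the original value exactly inside [low, high]."""
    a, b, c = memberships
    return a * low + b * mid + c * high
```

With this coding, defuzzification recovers the original value, which is the property the paper exploits to measure fit against the original data.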

Relevance: 10.00%

Abstract:

The main objective of a Design of Experiments lies essentially in the search for relationships between variables and in the comparison of factor levels, using statistical treatment of the collected data. The use of blocks in experimental design is fundamental, since it reduces or eliminates the variability introduced by factors that may influence the experiment but are of no interest and/or were not explicitly included in the design. In this work we present the results of the study and investigation of Balanced Incomplete Block Designs (BIBD), Balanced Incomplete Block Designs with repeated blocks (BIBDR), and incomplete block designs with blocks of different sizes (VBBD). We explore some properties and construction methods of these designs and illustrate them with examples wherever possible. Building on block designs, we present an application of BIBDR in the field of Education, with the aim of comparing five domains of algebraic thinking in a sample of first-year higher-education students in Cape Verde. The sample data were analysed with the R software, version 2.12.1. We found significant differences between some of the domains of algebraic thinking, namely between the Generalization of Arithmetic and Algebraic Technicism domains and the remaining domains. We recommend choosing a more representative sample composed of students from all higher-education institutions in Cape Verde.
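Admissibility of a BIBD parameter set rests on two standard counting identities, bk = vr and λ(v − 1) = r(k − 1); a minimal check (necessary conditions only, not a construction):

```python
def is_admissible_bibd(v, b, r, k, lam):
    """Check the two standard necessary conditions for a
    BIBD(v, b, r, k, lambda): v treatments, b blocks of size k,
    each treatment in r blocks, every pair of treatments together
    in exactly lambda blocks. Counting plots gives b*k == v*r;
    counting pairs through one treatment gives lam*(v-1) == r*(k-1)."""
    return b * k == v * r and lam * (v - 1) == r * (k - 1)
```

For example, the Fano-plane parameters (v, b, r, k, λ) = (7, 7, 3, 3, 1) pass both identities, while (7, 7, 3, 3, 2) fails.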

Relevance: 10.00%

Abstract:

We derive a new inequality for uniform deviations of averages from their means. The inequality is a common generalization of previous results of Vapnik and Chervonenkis (1974) and Pollard (1986). Using the new inequality, we obtain tight bounds for empirical loss minimization learning.
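For context, the classical Vapnik–Chervonenkis inequality that results of this kind refine bounds the uniform deviation of empirical frequencies over a class of events through the shatter coefficient (this is the textbook form, not the new inequality of the abstract):

```latex
\mathbb{P}\left\{ \sup_{A \in \mathcal{A}} \left| \nu_n(A) - P(A) \right| > \varepsilon \right\}
  \le 4\, S_{\mathcal{A}}(2n)\, e^{-n\varepsilon^2 / 8}
```

Here \nu_n(A) is the empirical frequency of the event A over n samples and S_{\mathcal{A}}(2n) is the shatter coefficient of the class \mathcal{A} on 2n points.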

Relevance: 10.00%

Abstract:

The generalization of simple (two-variable) correspondence analysis to more than two categorical variables, commonly referred to as multiple correspondence analysis, is neither obvious nor well-defined. We present two alternative ways of generalizing correspondence analysis, one based on the quantification of the variables and intercorrelation relationships, and the other based on the geometric ideas of simple correspondence analysis. We propose a version of multiple correspondence analysis, with adjusted principal inertias, as the method of choice for the geometric definition, since it contains simple correspondence analysis as an exact special case, which is not the case for the standard generalizations. We also clarify the issue of supplementary point representation and the properties of joint correspondence analysis, a method that visualizes all two-way relationships between the variables. The methodology is illustrated using data on attitudes to science from the International Social Survey Program on Environment in 1993.

Relevance: 10.00%

Abstract:

To provide quantitative support for handwriting evidence evaluation, a new method was developed based on the computation of a likelihood ratio within a Bayesian approach. In the present paper, the methodology is briefly described and applied to data collected in a simulated case involving a threatening letter. Fourier descriptors are used to characterise the shape of loops of handwritten characters "a", and the questioned material is compared: 1) with reference characters "a" of the true writer of the threatening letter, and 2) with characters "a" of a writer who did not write the threatening letter. The findings show that the probabilistic methodology correctly supports either the hypothesis of authorship or the alternative hypothesis. Further developments will enable the handwriting examiner to use this methodology as helpful assistance in assessing the strength of evidence in handwriting casework.
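Fourier descriptors of a closed contour, as used above to characterise loop shapes, are typically the FFT coefficients of the complex sequence x + iy. A minimal sketch (the normalisation choices are illustrative and not necessarily those of the paper):

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs):
    """Shape descriptors of a closed contour (N x 2 array of x, y points)
    as complex Fourier coefficients of x + i*y. Zeroing the DC term
    gives translation invariance; dividing by |c_1| gives scale
    invariance. Returns the first n_coeffs normalized coefficients."""
    z = contour[:, 0] + 1j * contour[:, 1]
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                      # drop DC term: translation invariance
    coeffs = coeffs / np.abs(coeffs[1])  # normalize: scale invariance
    return coeffs[1:n_coeffs + 1]
```

A unit circle sampled uniformly, for instance, yields a first descriptor of magnitude one and essentially zero higher descriptors, which is the expected signature of a perfectly circular loop.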

Relevance: 10.00%

Abstract:

In this paper we use Malliavin calculus techniques to obtain an expression for the short-time behavior of the at-the-money implied volatility skew for a generalization of the Bates model, in which the volatility need not be a diffusion or a Markov process, as the examples in Section 7 show. This expression depends on the derivative of the volatility in the sense of Malliavin calculus.

Relevance: 10.00%

Abstract:

Neural signatures of humans' movement intention can be exploited by future neuroprostheses. We propose a method for detecting self-paced upper limb movement intention from brain signals acquired with both invasive and noninvasive methods. In the first study, with scalp electroencephalography (EEG) signals from healthy controls, we report single-trial detection of movement intention using movement-related potentials (MRPs) in a frequency range between 0.1 and 1 Hz. Movement intention can be detected above chance level (p<0.05) on average 460 ms before movement onset, with a low detection rate during the non-movement-intention period. Using intracranial EEG (iEEG) from one epileptic subject, we detect movement intention as early as 1500 ms before movement onset, with accuracy above 90%, using electrodes implanted in the bilateral supplementary motor area (SMA). The coherent results obtained with noninvasive and invasive methods, and the method's generalization capability across different recording days, strengthen the view that self-paced movement intention can be detected before movement initiation, supporting advances in robot-assisted neurorehabilitation.
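Isolating the 0.1–1 Hz band in which the MRPs above are reported can be sketched with an ideal FFT band-pass. This is an offline illustration only (an online detector needs a causal filter), and the sampling rate in the usage below is an assumed example, not a value from the study:

```python
import numpy as np

def mrp_band(eeg, fs, low=0.1, high=1.0):
    """Keep only the [low, high] Hz band of a 1-D signal by zeroing
    all other FFT bins (ideal, zero-phase, offline band-pass).
    eeg: 1-D array; fs: sampling rate in Hz."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    spec = np.fft.rfft(eeg)
    spec[(freqs < low) | (freqs > high)] = 0.0  # zero out-of-band bins
    return np.fft.irfft(spec, n=len(eeg))
```

Applied to a mixture of a 0.5 Hz component and 10 Hz interference, the filter returns the slow component essentially unchanged while removing the fast one.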

Relevance: 10.00%

Abstract:

Models are presented for the optimal location of hubs in airline networks that take congestion effects into consideration. Hubs, which are the most congested airports, are modeled as M/D/c queuing systems, that is, Poisson arrivals, deterministic service time, and c servers. A formula is derived for the probability of a given number of customers in the system, which is then used to propose a probabilistic constraint. This constraint limits the probability of b airplanes in the queue to be less than a value α. Owing to the computational complexity of the formulation, the model is solved using a meta-heuristic based on tabu search. Computational experience is presented.
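The deterministic-service M/D/c probabilities derived in the paper have no simple textbook closed form, but the analogous exponential-service M/M/c formulas illustrate how such a chance constraint works; the function below is therefore a stand-in for the paper's derivation, not a reproduction of it:

```python
from math import factorial

def mmc_queue_tail(lam, mu, c, b):
    """P(at least b customers waiting in queue) for an M/M/c queue
    with arrival rate lam, service rate mu, and c servers.
    A chance constraint of the abstract's form then reads:
    mmc_queue_tail(lam, mu, c, b) <= alpha."""
    a = lam / mu   # offered load
    rho = a / c    # server utilization, must be < 1 for stability
    assert rho < 1
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    pc = a**c / factorial(c) * p0   # P(exactly c in system)
    # States c+j form a geometric tail with ratio rho, so
    # P(queue length >= b) = pc * rho**b / (1 - rho).
    return pc * rho**b / (1 - rho)
```

With b = 0 this reduces to the Erlang-C probability that an arrival must wait; for lam = 2, mu = 1, c = 3 that probability is 4/9.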

Relevance: 10.00%

Abstract:

We consider adaptive sequential lossy coding of bounded individual sequences when the performance is measured by the sequentially accumulated mean squared distortion. The encoder and the decoder are connected via a noiseless channel of capacity R, and both are assumed to have zero delay. No probabilistic assumptions are made on how the sequence to be encoded is generated. For any bounded sequence of length n, the distortion redundancy is defined as the normalized cumulative distortion of the sequential scheme minus the normalized cumulative distortion of the best scalar quantizer of rate R matched to this particular sequence. We demonstrate the existence of a zero-delay sequential scheme that uses common randomization in the encoder and the decoder such that the normalized maximum distortion redundancy converges to zero at a rate n^{-1/5} log n as the length n of the encoded sequence increases without bound.
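The reference class in the redundancy definition is the scalar quantizer of rate R. A toy fixed uniform quantizer on an assumed interval [-1, 1] shows what that benchmark looks like (the scheme in the abstract is adaptive and randomized, which this sketch is not):

```python
import numpy as np

def scalar_quantize(x, rate, lo=-1.0, hi=1.0):
    """Uniform scalar quantizer of rate `rate` bits on [lo, hi]:
    2**rate cells, each input mapped to the midpoint of its cell,
    so the absolute error is at most half a cell width."""
    levels = 2 ** rate
    step = (hi - lo) / levels
    idx = np.clip(np.floor((x - lo) / step), 0, levels - 1).astype(int)
    return lo + (idx + 0.5) * step
```

At rate R = 3 on [-1, 1] there are 8 reproduction levels and the per-sample absolute error never exceeds the half cell width of 0.125; the cumulative squared error of such a quantizer, matched to the sequence, is the baseline against which the distortion redundancy is measured.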

Relevance: 10.00%

Abstract:

BACKGROUND: "Virtual" autopsy by postmortem computed tomography (PMCT) can replace medical autopsy to a certain extent but has limitations for cardiovascular diseases. These limitations might be overcome by adding multiphase PMCT angiography. OBJECTIVE: To compare virtual autopsy by multiphase PMCT angiography with medical autopsy. DESIGN: Prospective cohort study. (ClinicalTrials.gov: NCT01541995) SETTING: Single-center study at the University Medical Center Hamburg-Eppendorf, Hamburg, Germany, between 1 April 2012 and 31 March 2013. PATIENTS: Hospitalized patients who died unexpectedly or within 48 hours of an event necessitating cardiopulmonary resuscitation. MEASUREMENTS: Diagnoses from clinical records were compared with findings from both types of autopsy. New diagnoses identified by autopsy were classified as major or minor, depending on whether they would have altered clinical management. RESULTS: Of 143 eligible patients, 50 (35%) had virtual and medical autopsy. Virtual autopsy confirmed 93% of all 336 diagnoses identified from antemortem medical records, and medical autopsy confirmed 80%. In addition, virtual and medical autopsy identified 16 new major and 238 new minor diagnoses. Seventy-three of the virtual autopsy diagnoses, including 32 cases of coronary artery stenosis, were identified solely by multiphase PMCT angiography. Of the 114 clinical diagnoses classified as cardiovascular, 110 were confirmed by virtual autopsy and 107 by medical autopsy. In 11 cases, multiphase PMCT angiography showed "unspecific filling defects," which were not reported by medical autopsy. LIMITATION: These results come from a single center with concerted interest and expertise in postmortem imaging; further studies are thus needed for generalization. 
CONCLUSION: In cases of unexpected death, the addition of multiphase PMCT angiography increases the value of virtual autopsy, making it a feasible alternative for quality control and identification of diagnoses traditionally made by medical autopsy. PRIMARY FUNDING SOURCE: University Medical Center Hamburg-Eppendorf.

Relevance: 10.00%

Abstract:

The practice of sedation, including monitoring practice, for digestive endoscopy continues to evolve throughout the world. In many countries, including Switzerland, there is a trend towards increased use of sedation during both routine and advanced endoscopic procedures. Sedation improves patient satisfaction with endoscopy and also improves the quality of the examination. In addition, a trend can be observed towards increasing use of propofol as the preferred sedative drug. Here we review the latest published data from surveys describing sedation and monitoring practice in different countries and compare them with our own data from successive nationwide surveys among Swiss gastroenterologists over a period of 20 years. The development in these socioeconomically very similar Western industrialized countries nevertheless shows some unique and surprising differences. In Germany and Switzerland, propofol use has become increasingly widespread, in Switzerland even to the extent that during the last few years propofol has overtaken benzodiazepine sedation, with an absolute majority of Swiss gastroenterologists using it without the assistance of an anesthesiologist. In addition, the change in Switzerland reflects a successful generalization of nonanesthesiologist-administered propofol (NAAP) sedation from the hospital setting to private practice.

Relevance: 10.00%

Abstract:

We present a rule-based, Huet-style anti-unification algorithm for simply-typed lambda-terms in η-long β-normal form, which computes a least general higher-order pattern generalization. For a pair of arbitrary terms of the same type, such a generalization always exists and is unique modulo α-equivalence and variable renaming. The algorithm computes it in cubic time within linear space. It has been implemented and the code is freely available.
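The first-order analogue conveys the idea of a least general generalization: descend through equal function symbols and introduce fresh variables at disagreements, reusing the same variable for a repeated disagreement pair. A sketch on terms encoded as nested tuples (the abstract's algorithm additionally handles binders and types, which this sketch does not):

```python
def anti_unify(t, s, store):
    """Least general generalization of two first-order terms.
    Terms are strings (constants) or tuples (head, arg1, ...).
    `store` maps each disagreement pair to its fresh variable, so
    repeated disagreements share a variable -- what makes the
    result *least* general rather than a bare variable."""
    if t == s:
        return t
    if isinstance(t, tuple) and isinstance(s, tuple) \
            and len(t) == len(s) and t[0] == s[0]:
        return (t[0],) + tuple(anti_unify(a, b, store)
                               for a, b in zip(t[1:], s[1:]))
    if (t, s) not in store:
        store[(t, s)] = f"X{len(store)}"
    return store[(t, s)]
```

For example, generalizing f(a, a) against f(b, b) yields f(X0, X0), preserving the repeated-argument structure, whereas f(a, b) against f(b, a) yields f(X0, X1).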

Relevance: 10.00%

Abstract:

The development of statistical models for forensic fingerprint identification purposes has been the subject of increasing research attention in recent years. This can be partly seen as a response to a number of commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. In addition, key forensic identification bodies such as ENFSI [1] and IAI [2] have recently endorsed and acknowledged the potential benefits of using statistical models as an important tool in support of the fingerprint identification process within the ACE-V framework. In this paper, we introduce a new Likelihood Ratio (LR) model based on Support Vector Machines (SVMs) trained with features discovered via morphometric and spatial analyses of corresponding minutiae configurations for both match and close non-match populations often found in AFIS candidate lists. Computed LR values are derived from a probabilistic framework based on SVMs that discover the intrinsic spatial differences of match and close non-match populations. Lastly, experimentation performed on a set of over 120,000 publicly available fingerprint images (mostly sourced from the National Institute of Standards and Technology (NIST) datasets) and a distortion set of approximately 40,000 images, is presented, illustrating that the proposed LR model is reliably guiding towards the right proposition in the identification assessment of match and close non-match populations. Results further indicate that the proposed model is a promising tool for fingerprint practitioners to use for analysing the spatial consistency of corresponding minutiae configurations.
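Score-based likelihood ratios of this general kind are often illustrated by fitting densities to match and non-match comparison scores and taking their ratio at the observed score. The Gaussian version below is a deliberately crude stand-in for the SVM-based framework of the paper, with all values illustrative:

```python
import numpy as np

def score_lr(score, match_scores, nonmatch_scores):
    """Toy likelihood ratio for a comparison score: ratio of normal
    densities fitted (by mean and standard deviation) to samples of
    match scores and close non-match scores. LR > 1 supports the
    same-source proposition, LR < 1 the different-source one."""
    def normal_pdf(x, sample):
        mu, sd = np.mean(sample), np.std(sample)
        return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return normal_pdf(score, match_scores) / normal_pdf(score, nonmatch_scores)
```

A score falling in the bulk of the match distribution then yields LR well above 1, and one in the bulk of the non-match distribution yields LR well below 1, mirroring how such a model "guides towards the right proposition".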

Relevance: 10.00%

Abstract:

Counterfeit pharmaceutical products have become a widespread problem in the last decade. Various analytical techniques have been applied to discriminate between genuine and counterfeit products. Among these, near-infrared (NIR) and Raman spectroscopy have provided promising results. The present study offers a methodology for providing more valuable information to organisations engaged in the fight against the counterfeiting of medicines. A database was established by analyzing counterfeits of a particular pharmaceutical product using NIR and Raman spectroscopy. Unsupervised chemometric techniques (i.e., principal component analysis, PCA, and hierarchical cluster analysis, HCA) were implemented to identify the classes within the datasets. Gas chromatography coupled to mass spectrometry (GC-MS) and Fourier transform infrared spectroscopy (FT-IR) were used to determine the number of different chemical profiles among the counterfeits. A comparison with the classes established by NIR and Raman spectroscopy allowed the discriminating power of these techniques to be evaluated. Supervised classifiers (i.e., k-nearest neighbors, partial least squares discriminant analysis, probabilistic neural networks, and counterpropagation artificial neural networks) were applied to the acquired NIR and Raman spectra, and the results were compared with those of the unsupervised classifiers. The strategy retained for routine applications, founded on the classes identified by NIR and Raman spectroscopy, uses a classification algorithm based on distance measures and receiver operating characteristic (ROC) curves. The model is able to compare the spectrum of a new counterfeit with those of previously analyzed products and to determine whether a new specimen belongs to one of the existing classes, consequently allowing a link to be established with other counterfeits in the database.
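The unsupervised PCA step and a distance-to-centroid assignment rule can be sketched in a few lines; the threshold and ROC calibration of the actual strategy are omitted, and all names and data below are illustrative:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project spectra (rows of X) onto their first principal
    components: column-centering followed by SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def nearest_class(spectrum, class_means):
    """Assign a spectrum to the class with the nearest centroid
    (Euclidean distance) -- a crude stand-in for the distance/ROC
    assignment rule of the retained strategy."""
    return min(class_means,
               key=lambda name: np.linalg.norm(spectrum - class_means[name]))
```

In the full strategy, the distance to the winning centroid would additionally be compared against a ROC-calibrated threshold to decide whether the specimen belongs to any known class at all.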

Relevance: 10.00%

Abstract:

Three-dimensional analysis of the entire jump sequence is recommended when studying the kinematics of ski jumping or evaluating performance. Camera-based systems that allow three-dimensional kinematic measurement are complex to set up and require extensive post-processing, usually limiting ski jumping analyses to small numbers of jumps. In this study, a simple method using a wearable inertial sensor-based system is described to measure the orientation of the lower-body segments (sacrum, thighs, shanks) and skis during the entire jump sequence. This new method combines the fusion of inertial signals and biomechanical constraints of ski jumping. Its performance was evaluated in terms of validity and sensitivity to different performances, based on 22 athletes monitored during daily training. The validity of the method was assessed by comparing the inclination of the ski with the slope at the landing point, with a reported error of -0.2±4.8°. Validity was also assessed by comparing characteristic angles obtained with the proposed system against reference values in the literature; the differences were smaller than 6° for 75% of the angles and smaller than 15° for 90% of the angles. Sensitivity to different performances was evaluated by comparing the angles between two groups of athletes with different jump lengths and by assessing the association between angles and jump length. The differences in technique observed between athletes and the associations with jump length agreed with the literature. In conclusion, these results suggest that this system is a promising tool for generalizing three-dimensional kinematic analysis in ski jumping.
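A basic ingredient of inertial orientation estimation is the integration of angular rate into an angle. The single-axis trapezoidal sketch below deliberately ignores the drift correction and biomechanical constraints that the described system fuses in:

```python
import numpy as np

def integrate_gyro(gyro_z, fs, angle0=0.0):
    """Segment inclination (rad) by trapezoidal integration of a
    single-axis angular rate signal (rad/s) sampled at fs Hz.
    Returns one angle per sample, starting from angle0. Pure
    integration drifts with gyro bias, which is why real systems
    add accelerometer or biomechanical corrections."""
    dt = 1.0 / fs
    increments = 0.5 * (gyro_z[1:] + gyro_z[:-1]) * dt
    return angle0 + np.concatenate([[0.0], np.cumsum(increments)])
```

A constant rate of 1 rad/s integrated over 2 seconds, for instance, yields a final angle of 2 rad, one angle sample per gyro sample.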