975 results for F-STATISTICS
Abstract:
The objective of this paper is to introduce a fourth-order cost function of the displaced frame difference (DFD) capable of estimating motion even for small regions or blocks. Using higher than second-order statistics is appropriate when the image sequence is severely corrupted by additive Gaussian noise. Some results are presented and compared to those obtained from the mean kurtosis and the mean square error of the DFD.
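The abstract gives no implementation details; the following minimal Python sketch only illustrates the general idea of block-matching motion estimation with a fourth-order DFD cost. It assumes the cost is simply the empirical fourth moment of the DFD over the block (the paper's exact cumulant-based formulation may differ), and all function and parameter names are illustrative.

```python
import numpy as np

def dfd(prev_block, cur_block):
    """Displaced frame difference between two equally sized blocks."""
    return cur_block.astype(float) - prev_block.astype(float)

def fourth_order_cost(e):
    """Empirical fourth-order moment of the DFD (assumed form of the cost)."""
    return float(np.mean(e ** 4))

def mse_cost(e):
    """Classical mean-square-error cost of the DFD, kept for comparison."""
    return float(np.mean(e ** 2))

def estimate_motion(prev, cur, y, x, bsize=8, search=4, cost=fourth_order_cost):
    """Exhaustive block matching: return the displacement (dy, dx) that
    minimizes the chosen DFD cost within a +/- `search` pixel window."""
    ref = cur[y:y + bsize, x:x + bsize]
    best_cost, best_d = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + bsize > prev.shape[0] or xx + bsize > prev.shape[1]:
                continue
            c = cost(dfd(prev[yy:yy + bsize, xx:xx + bsize], ref))
            if c < best_cost:
                best_cost, best_d = c, (dy, dx)
    return best_d
```

Passing `cost=mse_cost` instead of the fourth-order cost reproduces the classical baseline the paper compares against.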
Abstract:
The topical classification of web portals can be exploited to identify a user's interests by collecting statistics on his or her browsing habits across the different categories. This Master's thesis examines the areas of web applications in which the collected statistics can be used for personalization. The general principles of content personalization, Internet advertising, and information retrieval are explained using mathematical models. In addition, the thesis describes the general characteristics of web portals and the issues involved in collecting the statistics.
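The thesis presents mathematical models rather than code; as a rough illustration of the underlying idea only, the hypothetical Python sketch below derives a normalized per-category interest profile from a user's browsing history. The data model and names are assumptions, not taken from the work.

```python
from collections import Counter

def interest_profile(visited_pages, page_categories):
    """Normalized per-category interest profile from browsing statistics
    (hypothetical data model: a page -> category lookup table)."""
    counts = Counter(page_categories[p] for p in visited_pages if p in page_categories)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()} if total else {}

# Toy usage: such weights could then rank portal content or target advertising.
visits = ["news/politics", "sport/f1", "sport/football", "sport/f1"]
categories = {"news/politics": "news", "sport/f1": "sport", "sport/football": "sport"}
print(interest_profile(visits, categories))  # {'news': 0.25, 'sport': 0.75}
```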
Abstract:
Statistics has become an indispensable tool in biomedical research. Thanks, in particular, to computer science, the researcher has easy access to elementary "classical" procedures. These are often of a "confirmatory" nature: their aim is to test hypotheses (for example, the efficacy of a treatment) formulated prior to experimentation. However, doctors often use them in situations more complex than foreseen, to discover interesting data structures and formulate hypotheses. This inverse process may lead to misuse, which increases the number of "statistically proven" results in medical publications. The help of a professional statistician thus becomes necessary. Moreover, good, simple "exploratory" techniques are now available. In addition, medical data contain quite a high percentage of outliers (data that deviate from the majority). With classical methods it is often very difficult (even for a statistician!) to detect them, and the reliability of the results becomes questionable. New, reliable ("robust") procedures have been the subject of research for the past two decades. Their practical introduction is one of the activities of the Statistics and Data Processing Department of the University of Social and Preventive Medicine, Lausanne.
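No specific methods are named in the abstract; the short Python example below merely illustrates the point about outliers, contrasting the classical mean and standard deviation with the robust median and MAD on made-up data.

```python
import numpy as np

# Classical vs. robust location/scale estimates on a small sample
# containing one gross error (the values are invented for illustration).
data = np.array([4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 41.0])  # last value is an outlier

mean, sd = data.mean(), data.std(ddof=1)           # classical, outlier-sensitive
median = np.median(data)                           # robust location estimate
mad = 1.4826 * np.median(np.abs(data - median))    # robust scale (consistent with sigma for normal data)

print(f"mean = {mean:.2f}, sd = {sd:.2f}")         # pulled far away from the bulk of the data
print(f"median = {median:.2f}, MAD = {mad:.2f}")   # barely affected by the outlier
```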
Abstract:
This paper aims at detecting spatio-temporal clustering in fire sequences using space-time scan statistics, a powerful statistical framework for the analysis of point processes. The methodology is applied to active fires detected by MODIS (Moderate Resolution Imaging Spectroradiometer) in the state of Florida (US) during the period 2003-06. Results of the present study show that statistically significant clusters can be detected and localized in specific areas and periods of the year. Three out of the five most likely clusters detected for the entire study period are localized in the north of the state, and they cover forest areas; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. In order to analyze whether the wildfires recur each year during the same period, the analyses have been performed separately for each of the four years: it emerges that clusters of forest fires are more frequent in hot seasons (spring and summer), while in the southern areas they are widely present during the whole year. The recognition of overdensities of events and the ability to locate them in space and time can help support fire management and focus prevention measures.
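The study uses established space-time scan statistics; the sketch below is a heavily simplified Python version of the core idea only: scanning space-time cylinders and scoring each with a Poisson log-likelihood ratio. It assumes complete spatio-temporal randomness for the expected counts and omits the Monte Carlo significance testing used in practice.

```python
import numpy as np

def poisson_llr(c, e, C):
    """Kulldorff-style log-likelihood ratio for a candidate cluster with
    c observed events, e expected events, and C total events (0 if no excess)."""
    if c <= e or c == 0 or c >= C:
        return 0.0
    return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

def space_time_scan(events, centers, radii, windows, study_area, study_time):
    """Brute-force scan over cylinders (circle in space x interval in time).
    events: array of (x, y, t); returns the most likely cluster and its score."""
    xy, t = events[:, :2], events[:, 2]
    C = len(events)
    best = (0.0, None)
    for cx, cy in centers:
        d2 = ((xy - np.array([cx, cy])) ** 2).sum(axis=1)
        for r in radii:
            in_circle = d2 <= r ** 2
            for t0, t1 in windows:
                c = int((in_circle & (t >= t0) & (t <= t1)).sum())
                # expected count = total events times the cylinder's share of the study volume
                e = C * (np.pi * r ** 2 / study_area) * ((t1 - t0) / study_time)
                llr = poisson_llr(c, e, C)
                if llr > best[0]:
                    best = (llr, (cx, cy, r, t0, t1))
    return best
```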
Abstract:
Decision situations are often characterized by uncertainty: we do not know the values of the different options on all attributes and have to rely on information stored in our memory to decide. Several strategies have been proposed to describe how people make inferences based on knowledge used as cues. The present research shows how the declarative memory of ACT-R models can be populated from internet statistics. This makes it possible to simulate the performance of decision strategies operating on declarative knowledge based on the occurrences and co-occurrences of objects and cues in the environment.
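The paper's actual ACT-R models are not reproduced here; the toy Python sketch below only illustrates the idea of populating declarative memory from occurrence statistics and letting a simple recognition-based strategy operate on it. The hit counts, threshold, and function names are invented for illustration.

```python
import math

# Hypothetical occurrence statistics (e.g. web hit counts) for three objects.
hit_counts = {"Berlin": 2_500_000, "Heidelberg": 180_000, "Paderborn": 40_000}

def base_level_activation(n_occurrences):
    """Simplified ACT-R-style base-level activation: log of occurrence
    frequency (the full ACT-R equation also decays with time since use)."""
    return math.log(n_occurrences)

# Populate a toy declarative memory: one chunk per object, activation from frequency.
memory = {obj: base_level_activation(n) for obj, n in hit_counts.items()}

def recognition_based_choice(a, b, retrieval_threshold=12.0):
    """Pick the object whose chunk clears the retrieval threshold when the
    other one does not; otherwise defer to a knowledge-based strategy."""
    rec_a = memory[a] > retrieval_threshold
    rec_b = memory[b] > retrieval_threshold
    if rec_a and not rec_b:
        return a
    if rec_b and not rec_a:
        return b
    return None  # both or neither retrieved: fall back to cue-based inference

print(recognition_based_choice("Berlin", "Paderborn"))  # 'Berlin'
```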
Abstract:
Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predicted the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement over other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints, and (3) generation of multiple realizations, enabling uncertainty assessment.
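As a rough sketch of the direct sampling idea underlying MPS, without the tomogram conditioning and resolution weighting described in the paper, the Python code below simulates a categorical grid by matching neighbourhood patterns against a training image. All parameters and simplifications are assumptions made for illustration.

```python
import numpy as np

def direct_sampling(ti, shape, n_neighbors=20, threshold=0.1, scan_fraction=0.3, seed=0):
    """Very simplified 2D direct sampling for a categorical training image `ti`:
    follow a random path, and at each node copy the central value of the first
    training-image pattern that matches the already-simulated neighbours well enough."""
    rng = np.random.default_rng(seed)
    sim = np.full(shape, -1, dtype=int)                 # -1 marks unsimulated nodes
    path = rng.permutation(shape[0] * shape[1])         # random simulation path
    ti_h, ti_w = ti.shape
    for idx in path:
        i, j = divmod(int(idx), shape[1])
        known = np.argwhere(sim >= 0)
        if len(known) == 0:
            sim[i, j] = ti[rng.integers(ti_h), rng.integers(ti_w)]
            continue
        # data event: the n closest already-simulated nodes, their lags and facies values
        dist = np.abs(known - (i, j)).sum(axis=1)
        nbr = known[np.argsort(dist)[:n_neighbors]]
        lags = nbr - (i, j)
        vals = sim[nbr[:, 0], nbr[:, 1]]
        # scan random training-image locations for a compatible pattern
        best_val, best_mismatch = None, np.inf
        for _ in range(int(scan_fraction * ti_h * ti_w)):
            ci, cj = rng.integers(ti_h), rng.integers(ti_w)
            pi, pj = ci + lags[:, 0], cj + lags[:, 1]
            inside = (pi >= 0) & (pi < ti_h) & (pj >= 0) & (pj < ti_w)
            if not inside.all():
                continue
            mismatch = float(np.mean(ti[pi, pj] != vals))  # fraction of mismatching facies
            if mismatch < best_mismatch:
                best_val, best_mismatch = int(ti[ci, cj]), mismatch
            if mismatch <= threshold:
                break
        sim[i, j] = best_val if best_val is not None else ti[rng.integers(ti_h), rng.integers(ti_w)]
    return sim
```

Called as, for example, `direct_sampling(channel_ti, (60, 60))` with a small binary channel training image, this returns one unconditional realization; in the paper's workflow the pattern distance additionally accounts for the conditioning tomogram and its spatially varying resolution, and realizations whose simulated traveltimes exceed the expected data error are rejected.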
Abstract:
In this article, the fusion of a stochastic metaheuristic, Simulated Annealing (SA), with classical convergence criteria for Blind Separation of Sources (BSS) is presented. Although BSS by means of various techniques, including ICA, PCA, and neural networks, has been amply discussed in the literature, the possibility of using simulated annealing algorithms has to date not been seriously explored. Based on experimental results, this paper demonstrates the benefits SA can offer in combination with higher-order statistics and mutual information criteria for BSS, such as robustness against local minima and a high degree of flexibility in the energy function.
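A minimal, self-contained Python sketch of the kind of combination described (not the authors' exact energy function): the mixtures are prewhitened so that unmixing reduces to a rotation, the energy is a kurtosis-based higher-order-statistics contrast, and plain simulated annealing searches the rotation angle.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic non-Gaussian sources and a random 2x2 mixture (toy data).
n = 5000
S = np.vstack([np.sign(rng.standard_normal(n)) * rng.exponential(1.0, n),  # super-Gaussian
               rng.uniform(-1.0, 1.0, n)])                                 # sub-Gaussian
A = rng.standard_normal((2, 2))
X = A @ S

# Prewhitening: after this step the unmixing matrix is a pure rotation.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

def energy(theta):
    """Kurtosis-based contrast: negative sum of |excess kurtosis| of the
    rotated outputs (lower energy = more non-Gaussian = better separated)."""
    R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    Y = R @ Z
    kurt = (Y ** 4).mean(axis=1) - 3.0
    return -np.abs(kurt).sum()

# Plain simulated annealing over the rotation angle with geometric cooling.
theta = rng.uniform(0.0, np.pi)
e, T = energy(theta), 1.0
for _ in range(2000):
    cand = theta + rng.normal(0.0, 0.3) * T
    e_cand = energy(cand)
    if e_cand < e or rng.random() < np.exp(-(e_cand - e) / max(T, 1e-9)):
        theta, e = cand, e_cand
    T *= 0.998

R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
Y = R @ Z   # estimated sources, recovered up to permutation, sign and scale
```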
Abstract:
Conference organized by the Escola Politècnica Superior, Universitat de Vic, in collaboration with the Servei d'Estadística of the Universitat Autònoma de Barcelona and CosmoCaixa Barcelona. Held from 18 to 22 June 2012 in Barcelona.