971 results for Autocorrelation (Statistics)
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher's methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
It is common to hear a strange short sentence: «Random is better than...». Why is randomness a good solution to a given engineering problem? There are many possible answers, and all of them are tied to the topic under consideration. In this thesis I discuss two crucial topics that benefit from randomizing some of the waveforms involved in signal manipulation. In particular, the advantages are guaranteed by shaping the second-order statistics of antipodal sequences used in intermediate signal processing stages. The first topic is in the area of analog-to-digital conversion, and it is named Compressive Sensing (CS). CS is a novel paradigm in signal processing that merges signal acquisition and compression, allowing a signal to be acquired directly in compressed form. In this thesis, after an ample description of the CS methodology and its related architectures, I present a new approach that achieves high compression by designing the second-order statistics of a set of additional waveforms involved in the signal acquisition/compression stage. The second topic addressed in this thesis is in the area of communication systems; in particular, I focus on ultra-wideband (UWB) systems. One option for producing and decoding UWB signals is direct-sequence spreading with multiple access based on code division (DS-CDMA). Within this methodology, I address the coexistence of a DS-CDMA system with a narrowband interferer. To do so, I minimize the joint effect of both multiple access interference (MAI) and narrowband interference (NBI) on a simple matched-filter receiver. I show that, when the statistical properties of the spreading sequences are suitably designed, performance improvements are possible with respect to a system exploiting chaos-based sequences that minimize MAI only.
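The acquisition model this abstract describes can be sketched in a few lines. The following is a minimal illustration, not the thesis's actual architecture: it assumes a random antipodal (±1) sensing matrix and a synthetic sparse signal, with all names and dimensions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 64, 5          # signal length, number of measurements, sparsity

# A k-sparse signal: only k of its n entries are nonzero.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

# Antipodal (+1/-1) sensing sequences: acquisition and compression
# happen in a single step, so y has only m << n entries.
A = rng.choice([-1.0, 1.0], size=(m, n))
y = A @ x

print(y.shape)   # the signal is acquired directly in compressed form
```

Recovering x from y then requires a sparse reconstruction algorithm (e.g. basis pursuit), which is outside the scope of this sketch.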
Abstract:
A spatial, electro-optical autocorrelation (EOA) interferometer using the vertically polarized lobes of coherent transition radiation (CTR) has been developed as a single-shot electron bunch length monitor at an optical beam port downstream of the 100 MeV preinjector LINAC of the Swiss Light Source. This EOA monitor combines the advantages of step-scan interferometers (high temporal resolution) [D. Mihalcea et al., Phys. Rev. ST Accel. Beams 9, 082801 (2006) and T. Takahashi and K. Takami, Infrared Phys. Technol. 51, 363 (2008)] and terahertz-gating technologies (fast response) [U. Schmidhammer et al., Appl. Phys. B: Lasers Opt. 94, 95 (2009) and B. Steffen et al., Phys. Rev. ST Accel. Beams 12, 032802 (2009)], providing the possibility to tune the accelerator with an online bunch length diagnostic. While a proof of principle of the spatial interferometer was achieved by step-scan measurements with far-infrared detectors, the single-shot capability of the monitor has been demonstrated by electro-optical correlation of the spatial CTR interference pattern with fairly long (500 ps) neodymium-doped yttrium aluminum garnet (Nd:YAG) laser pulses in a ZnTe crystal. In single-shot operation, variations of the bunch length between 1.5 and 4 ps due to different phase settings of the LINAC bunching cavities have been measured with subpicosecond time resolution.
Abstract:
An introductory course in probability and statistics for third-year and fourth-year electrical engineering students is described. The course is centered around several computer-based projects that are designed to achieve two objectives. First, the projects illustrate the course topics and provide hands-on experience for the students. The second and equally important objective of the projects is to convey the relevance and usefulness of probability and statistics to practical problems that undergraduate students can appreciate. The benefit of this course is to motivate electrical engineering students to excel in the study of probability concepts, instead of viewing the subject as one more course requirement toward graduation. The authors co-teach the course, and MATLAB is used for most of the computer-based projects.
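The course itself uses MATLAB; as a language-neutral illustration of the kind of computer-based project described, here is a hedged Python sketch (not one of the authors' actual projects) that estimates a classic probability by Monte Carlo simulation and lets students compare the estimate with the exact value:

```python
import random

def birthday_match(n_people, trials=20000, seed=1):
    """Monte Carlo estimate of P(at least two people share a birthday)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        days = [rng.randrange(365) for _ in range(n_people)]
        hits += len(set(days)) < n_people   # a repeat shrinks the set
    return hits / trials

est = birthday_match(23)
print(est)   # close to the exact value, about 0.507
```

Projects of this shape pair a simulation with the analytical answer, so students see the theory validated by experiment.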
Einstein's quantum theory of the monatomic ideal gas: non-statistical arguments for a new statistics
Abstract:
Recent reports by the Centers for Disease Control and Prevention have decried the high rate of fetal mortality in the contemporary United States. Much of the data about fetal and infant deaths, as well as other poor pregnancy outcomes, are tabulated and tracked through vital statistics. In this article, I demonstrate how notions of fetal death became increasingly tied to the surveillance of maternal bodies through the tabulating and tracking of vital statistics in the middle part of the twentieth century. Using a historical analysis of the revisions to the United States Standard Certificate of Live Birth, and the United States Standard Report of Fetal Death, I examine how the categories of analysis utilized in these documents become integrally linked to contemporary ideas about fetal and perinatal death, gestational age, and prematurity. While it is evident that there are relationships between maternal behavior and birth outcomes, in this article I interrogate the ways in which the surveillance of maternal bodies through vital statistics has naturalized these relationships. Copyright 2013 Elsevier Ltd. All rights reserved.
Abstract:
Locally affine (polyaffine) image registration methods capture intersubject non-linear deformations with a low number of parameters, while providing an intuitive interpretation for clinicians. Considering the mandible bone, anatomical shape differences can be found at different scales, e.g. left or right side, teeth, etc. Classically, sequential coarse-to-fine registrations are used to handle multiscale deformations; instead, we propose a simultaneous optimization of all scales. To avoid local minima, we incorporate a prior on the polyaffine transformations. This kind of groupwise registration approach is natural in a polyaffine context, if we assume one configuration of regions that describes an entire group of images, with varying transformations for each region. In this paper, we reformulate polyaffine deformations in a generative statistical model, which enables us to incorporate deformation statistics as a prior in a Bayesian setting. We find optimal transformations by optimizing the maximum a posteriori probability. We assume that the polyaffine transformations follow a normal distribution with mean and concentration matrix. Parameters of the prior are estimated from an initial coarse-to-fine registration. Knowing the region structure, we develop a blockwise pseudoinverse to obtain the concentration matrix. To our knowledge, we are the first to introduce simultaneous multiscale optimization through groupwise polyaffine registration. We show results on 42 mandible CT images.
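The maximum a posteriori step described above, a normal prior with mean and concentration matrix combined with a data term, can be sketched on a toy linearized problem. This is an illustrative analogue only, not the paper's actual registration model; all dimensions and names are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearized data term ||D @ theta - b||^2 plus a Gaussian prior
# N(mu, Lambda^{-1}) on the transformation parameters theta.
p = 6                                   # e.g. parameters of one affine region
D = rng.standard_normal((40, p))
b = rng.standard_normal(40)
mu = np.zeros(p)                        # prior mean transformation
Lam = 2.0 * np.eye(p)                   # prior concentration (inverse covariance)

# Maximizing the posterior equals minimizing the penalized least squares,
# which has a closed-form solution.
theta_map = np.linalg.solve(D.T @ D + Lam, D.T @ b + Lam @ mu)

# Without the prior (maximum likelihood), for comparison.
theta_ml = np.linalg.lstsq(D, b, rcond=None)[0]
print(np.linalg.norm(theta_map) < np.linalg.norm(theta_ml))  # prior shrinks toward mu
```

The concentration matrix plays the role of the regularizer: directions the prior is confident about are pulled strongly toward the mean, which is what discourages local minima in the simultaneous multiscale optimization.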
Abstract:
Traditionally, the use of Bayes factors has required the specification of proper prior distributions on model parameters implicit to both null and alternative hypotheses. In this paper, I describe an approach to defining Bayes factors based on modeling test statistics. Because the distributions of test statistics do not depend on unknown model parameters, this approach eliminates the subjectivity normally associated with the definition of Bayes factors. For standard test statistics, including the χ², F, t and z statistics, the values of Bayes factors that result from this approach can be simply expressed in closed form.
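To illustrate the idea of a Bayes factor built from a test statistic alone, here is one common closed form for the z statistic. It assumes a N(0, τ²) prior on the noncentrality under the alternative, so that z ~ N(0, 1 + τ²) marginally under H1 and z ~ N(0, 1) under H0; the exact form used in the paper may differ.

```python
from math import exp, sqrt

def bf_z(z, tau2=1.0):
    """Bayes factor BF10 from a z statistic alone.

    Ratio of the marginal density of z under H1 (z ~ N(0, 1 + tau2),
    after integrating a N(0, tau2) prior on the noncentrality) to its
    density under H0 (z ~ N(0, 1)).
    """
    v = 1.0 + tau2
    return (1.0 / sqrt(v)) * exp(0.5 * z * z * (1.0 - 1.0 / v))

print(bf_z(0.0) < 1.0)   # z = 0 favors the null
print(bf_z(3.0) > 1.0)   # a large z favors the alternative
```

No nuisance parameters appear anywhere: the whole calculation runs off the observed statistic and a single hyperparameter, which is the point of defining Bayes factors on test statistics.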