22 results for "New statistics for monitoring"

at the Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 40.00%

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified within the algebraic structure of A2(P), along with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(Pprior), and weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to the finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and is shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence, and a scale-free understanding of unbiased reasoning.
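For reference, the finite-dimensional (simplex) counterparts mentioned above can be written explicitly; this is the standard formulation from compositional data analysis, not notation taken verbatim from the paper:

$$\operatorname{clr}(\mathbf{x}) = \left(\ln\frac{x_1}{g(\mathbf{x})},\,\dots,\,\ln\frac{x_D}{g(\mathbf{x})}\right), \qquad g(\mathbf{x}) = \Bigl(\prod_{i=1}^{D} x_i\Bigr)^{1/D},$$

$$d_a(\mathbf{x},\mathbf{y}) = \bigl\lVert \operatorname{clr}(\mathbf{x}) - \operatorname{clr}(\mathbf{y}) \bigr\rVert_2 ,$$

so that the Aitchison distance on the simplex is the Euclidean distance between clr images — the pattern that A2(P) extends to arbitrary sample spaces.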

Relevance: 40.00%

Abstract:

Connections between statistics and archaeology have always proved very fruitful. The objective of this paper is to offer an overview of some statistical techniques that have been developed in recent years and that may be of interest to archaeologists in the short run.

Relevance: 40.00%

Abstract:

A national survey designed to estimate a specific population quantity is sometimes also used to estimate that quantity for a small area, such as a province. Budget constraints do not allow a larger sample size for the small area, so other means of improving the estimation have to be devised. We investigate such methods and assess them in a Monte Carlo study. In particular, we explore how a complementary survey can be exploited in small area estimation, using the context of the Spanish Labour Force Survey (EPA) and the Barometer in Spain for our study.
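One standard way to combine a direct small-area estimate with information from a second survey is a precision-weighted composite estimator. The sketch below is a generic illustration of that idea, not the EPA/Barometer methodology evaluated in the paper; the helper name and all numbers are hypothetical.

```python
def composite_estimate(direct, var_direct, aux, var_aux):
    """Precision-weighted combination of a direct small-area estimate
    with an estimate derived from a complementary survey.

    The weight on each estimate is proportional to its precision
    (the inverse of its variance)."""
    w = var_aux / (var_direct + var_aux)  # weight on the direct estimate
    est = w * direct + (1 - w) * aux
    var = (var_direct * var_aux) / (var_direct + var_aux)
    return est, var

# Hypothetical provincial unemployment rate: a noisy direct estimate
# combined with a more precise estimate from a complementary survey.
est, var = composite_estimate(direct=0.21, var_direct=0.004,
                              aux=0.18, var_aux=0.001)
```

The combined variance is always smaller than either input variance, which is the point of borrowing strength from the complementary survey.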

Relevance: 30.00%

Abstract:

Within the last few years a new type of instrument, the Terrestrial Laser Scanner (TLS), has entered the commercial market. These devices make it possible to obtain a completely new type of spatial, three-dimensional data describing the object of interest, and this data requires special treatment. The appearance of this technique has made it possible to monitor the deformation of very large objects, such as the landslides investigated here, at a new level of quality. This change is especially visible in the size and number of details that can be observed with the new method. In this context, the present work is oriented towards the recognition and characterization of the raw data received from TLS instruments, as well as the processing phases, tools and techniques applied to them. The main objectives are the definition and recognition of the problems related to the use of TLS data, the characterization of the quality of a single point generated by TLS, the description and investigation of a TLS processing approach for landslide deformation measurements that yields a 3D deformation characteristic, and finally the validation of the obtained results. These objectives are grounded in bibliographic studies and research work, followed by several experiments that support the conclusions.

Relevance: 30.00%

Abstract:

The statistical analysis of compositional data should be carried out using log-ratios of parts, which are difficult to use correctly in standard statistical packages. For this reason a freeware package, named CoDaPack, was created. This software implements most of the basic statistical methods suitable for compositional data. In this paper we describe the new version of the package, now called CoDaPack3D. It is developed in Visual Basic for Applications (associated with Excel©), Visual Basic and OpenGL, and it is oriented towards users with a minimum knowledge of computers, with the aim of being simple and easy to use. This new version includes new graphical output in 2D and 3D. These outputs can be zoomed and, in 3D, rotated. A customization menu is also included, and outputs can be saved in JPEG format. This version also includes interactive help, and all dialog windows have been improved in order to facilitate their use. To use CoDaPack one opens Excel© and introduces the data in a standard spreadsheet, organized as a matrix where Excel© rows correspond to the observations and columns to the parts. The user executes macros that return numerical or graphical results. There are two kinds of numerical results, new variables and descriptive statistics, and both appear on the same sheet. Graphical output appears in independent windows. In the present version there are 8 menus, with a total of 38 submenus which, after some dialogue, directly call the corresponding macro. The dialogues ask the user to input the variables and any further parameters needed, as well as where to put the results. The web site http://ima.udg.es/CoDaPack contains this freeware package; only Microsoft Excel© under Microsoft Windows© is required to run the software.

Key words: compositional data analysis, software
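CoDaPack itself is an Excel add-in, so the following is only a Python illustration of the data layout it expects (rows as observations, columns as parts) together with the centered log-ratio transform, one of the basic log-ratio methods such packages implement:

```python
import numpy as np

def clr(X):
    """Centered log-ratio transform of a compositional data matrix.

    Rows are observations, columns are parts (the same layout CoDaPack
    expects in a spreadsheet). Each row is log-transformed and centered
    by the log of its geometric mean."""
    X = np.asarray(X, dtype=float)
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

# A tiny 3-part composition (e.g. percentages of a whole).
X = np.array([[10.0, 30.0, 60.0],
              [20.0, 20.0, 60.0]])
Z = clr(X)
# Each clr-transformed row sums to zero by construction.
```

The zero row-sum is the defining constraint of clr coordinates, which is why packages like CoDaPack work with these transformed variables rather than the raw percentages.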

Relevance: 30.00%

Abstract:

Nanomotors are nanoscale devices capable of converting energy into movement and forces. Among them, self-propelled nanomotors offer considerable promise for developing novel bioanalytical and biosensing strategies based on the direct isolation of target biomolecules or on changes in their movement in the presence of target analytes. The main achievements of this project consist of the development of receptor-functionalized nanomotors that offer direct and rapid target detection, isolation and transport from raw biological samples without preparatory and washing steps. For example, microtube engines functionalized with aptamer, antibody, lectin and enzyme receptors were used for the direct isolation of analytes of biomedical interest, including proteins and whole cells, among others. A target protein was also isolated from a complex sample by an antigen-functionalized microengine navigating in the reservoirs of a lab-on-a-chip device. The new nanomotor-based biomarker detection strategy not only offers a highly sensitive, rapid, simple and low-cost alternative for the isolation and transport of target molecules, but also represents a new dimension of analytical information based on motion. The recognition events can be easily visualized with an optical microscope (without any sophisticated analytical instrument) to reveal the presence and concentration of the target. The use of artificial nanomachines has proved useful not only for (bio)recognition and (bio)transport but also for the detection and remediation of environmental contamination. In this context, micromotors modified with a superhydrophobic layer effectively interacted with, captured, transported and removed oil droplets from oil-contaminated samples.
Finally, a unique micromotor-based strategy for water-quality testing was also developed; it mimics live-fish water-quality testing and is based on changes in the propulsion behavior of artificial biocatalytic microswimmers in the presence of aquatic pollutants. The attractive features of the new micromachine-based target isolation and signal transduction protocols developed in this project offer numerous potential applications in biomedical diagnostics, environmental monitoring, and forensic analysis.

Relevance: 30.00%

Abstract:

Background: Non-invasive monitoring of respiratory muscle function is an area of increasing research interest, resulting in the appearance of new monitoring devices, one of these being piezoelectric contact sensors. The present study was designed to test whether piezoelectric contact (non-invasive) sensors could be useful in respiratory monitoring, in particular in measuring the timing of diaphragmatic contraction.

Methods: Experiments were performed in an animal model: three pentobarbital-anesthetized mongrel dogs. The motion of the thoracic cage was acquired by means of a piezoelectric contact sensor placed on the costal wall. This signal was compared with direct measurements of diaphragmatic muscle length made by sonomicrometry. Furthermore, to assess diaphragmatic function, other respiratory signals were acquired: respiratory airflow and transdiaphragmatic pressure. Diaphragm contraction time was estimated with each of these four signals. Using the diaphragm length (DL) signal as reference, the contraction times estimated with the other three signals were compared with the contraction time estimated from it.

Results: The contraction time estimated with the TM (thoracic movement) signal tends to give a reading 0.06 seconds lower than the measure made with the DL signal (-0.21 and 0.00 for the FL (airflow) and DP (transdiaphragmatic pressure) signals, respectively), with a standard deviation of 0.05 seconds (0.08 and 0.06 for the FL and DP signals, respectively). Correlation coefficients indicated a close link between the contraction time estimated with the TM signal and that estimated with the DL signal (a Pearson correlation coefficient of 0.98, a reliability coefficient of 0.95, a slope of 1.01 and a Spearman rank-order coefficient of 0.98). In general, the correlation coefficients and the mean and standard deviation of the differences were better in the inspiratory-load respiratory tests than in the spontaneous ventilation tests.

Conclusion: The technique presented in this work provides a non-invasive method to assess the timing of diaphragmatic contraction in canines, using a piezoelectric contact sensor placed on the costal wall.
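The agreement figures quoted in the Results (bias, standard deviation of the differences, Pearson correlation) can be computed for any pair of timing series. A minimal sketch with made-up numbers, not the study's data:

```python
import numpy as np

def agreement(reference, estimate):
    """Bias (mean difference), SD of the differences, and Pearson
    correlation between a reference timing series and an estimate."""
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    diff = estimate - reference
    r = np.corrcoef(reference, estimate)[0, 1]
    return diff.mean(), diff.std(ddof=1), r

# Hypothetical contraction times in seconds: reference (e.g. a DL-type
# signal) versus a sensor-based estimate (e.g. a TM-type signal).
ref = [1.10, 1.25, 1.05, 1.30, 1.18]
est = [1.04, 1.20, 0.98, 1.25, 1.12]
bias, sd, r = agreement(ref, est)
```

A negative bias with a small SD and a correlation near 1, as in this toy example, is the pattern the study reports for the TM signal against the DL reference.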

Relevance: 30.00%

Abstract:

In this article we present a hybrid approach to the automatic summarization of Spanish medical texts. There are many systems for automatic summarization using statistics or linguistics, but only a few combine both techniques. Our idea is that to produce a good summary we need to use the linguistic aspects of texts, but we should also benefit from the advantages of statistical techniques. We have integrated the Cortex (vector space model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We compared these systems and then integrated them in a hybrid approach. Finally, we applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
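As an illustration of the general idea only (not the fusion rule actually used in the paper), sentence scores from a statistical and a linguistic system can be normalized to a common scale and averaged before selecting the top-ranked sentences:

```python
def hybrid_rank(score_lists, top_k=2):
    """Average min-max-normalized sentence scores from several systems
    and return the indices of the top_k sentences.

    Hypothetical fusion rule for illustration: normalization makes
    scores on different scales comparable before averaging."""
    def normalize(scores):
        lo, hi = min(scores), max(scores)
        return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in scores]

    normed = [normalize(s) for s in score_lists]
    combined = [sum(col) / len(normed) for col in zip(*normed)]
    order = sorted(range(len(combined)), key=combined.__getitem__,
                   reverse=True)
    return order[:top_k]

# Scores for 4 sentences from a statistical and a linguistic system,
# deliberately on different scales.
statistical = [0.9, 0.2, 0.5, 0.4]
linguistic = [10.0, 3.0, 9.0, 1.0]
best = hybrid_rank([statistical, linguistic])
```

Sentence 0 ranks first under both systems, while sentence 2 is promoted by the linguistic scores; a hybrid keeps both, which neither ranking alone guarantees.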

Relevance: 30.00%

Abstract:

This paper deals with the impact of "early" nineteenth-century globalization (c. 1815-1860) on foreign trade in the Southern Cone (SC). Most of the evidence is drawn from bilateral trade between Britain and the SC, at a time when Britain was the main commercial partner of the new republics. The main conclusion is that early globalization had a positive impact on foreign trade in the SC, for three reasons: the SC's terms of trade improved during this period; the SC's per capita consumption of textiles (the main manufacture traded on world markets at that time) increased substantially, at a time when clothing was one of the main items in SC household budgets; and British merchants brought with them capital, shipping and insurance, and facilitated the formation of vast global networks, which further promoted the SC's exports to a wider range of outlets.

Relevance: 30.00%

Abstract:

It is widely accepted in the literature on the classical Cournot oligopoly model that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the equilibrium. In this paper, though, we present a model in which a stable, unique symmetric equilibrium is reached for any number of oligopolists, even though the industry price increases with each new entry. Consequently, the suspicion that non-quasi-competitiveness implies instability in the long run is proved false.

Relevance: 30.00%

Abstract:

The Treatise on Quadrature of Fermat (c. 1659), besides containing the first known proof of the computation of the area under a higher parabola, y = x^(m/n), or under a higher hyperbola, y = x^(-m/n), with the appropriate limits of integration in each case, has a second part which was not understood by Fermat's contemporaries. This second part of the Treatise is obscure and difficult to read, and even the great Huygens described it as 'published with many mistakes and it is so obscure (with proofs redolent of error) that I have been unable to make any sense of it'. Far from the confusion that Huygens attributes to it, in this paper we try to prove that Fermat, in writing the Treatise, had a very clear goal in mind and managed to attain it by means of a simple and original method. Fermat reduced the quadrature of a great number of algebraic curves to the quadrature of known curves: the higher parabolas and hyperbolas of the first part of the paper. Others he reduced to the quadrature of the circle. We shall see how the clever use of two procedures, quite novel at the time, the change of variables and a particular case of the formula of integration by parts, provided Fermat with the tools needed to square very easily curves as well known as the folium of Descartes, the cissoid of Diocles or the witch of Agnesi.
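In modern notation, the quadratures from the first part of the Treatise read (a standard modern restatement, not Fermat's own notation):

$$\int_0^a x^{m/n}\,dx = \frac{n}{m+n}\,a^{(m+n)/n}, \qquad \int_a^{\infty} x^{-m/n}\,dx = \frac{n}{m-n}\,a^{(n-m)/n} \quad (m > n),$$

the second integral converging precisely because the exponent m/n exceeds 1.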

Relevance: 30.00%

Abstract:

The well-known Minkowski ?(x) function is presented as the asymptotic distribution function of an enumeration of the rationals in (0,1] based on their continued fraction representation. Moreover, the singularity of ?(x) is clearly proved in two ways: by exhibiting a set of measure one on which ?′(x) = 0; and again by actually finding a set of measure one which is mapped onto a set of measure zero, and vice versa. These sets are described by means of metrical properties of different systems of real number representation.
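A short sketch of how ?(x) can be evaluated at a rational point from its continued fraction expansion, using the classical series ?([0; a1, a2, ...]) = 2 Σk (−1)^(k+1) 2^−(a1+⋯+ak); the function name is ours:

```python
from fractions import Fraction

def minkowski_q(x):
    """Minkowski's ?(x) for a rational x in (0, 1], computed exactly
    from the continued fraction x = [0; a1, a2, ...] produced by the
    Euclidean algorithm."""
    x = Fraction(x)
    p, q = x.numerator, x.denominator
    total, partial_sum, sign = Fraction(0), 0, 1
    while p:
        a, r = divmod(q, p)       # next partial quotient a_k
        partial_sum += a          # running sum a1 + ... + ak
        total += sign * Fraction(2, 2 ** partial_sum)
        sign = -sign
        q, p = p, r
    return total
```

Since the partial quotients enter only through powers of 2, rationals with long continued fractions are mapped to dyadic rationals with tiny denominators, a first hint of the function's singular behavior.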

Relevance: 30.00%

Abstract:

I discuss the identifiability of a structural New Keynesian Phillips curve when it is embedded in a small scale dynamic stochastic general equilibrium model. Identification problems emerge because not all the structural parameters are recoverable from the semi-structural ones and because the objective functions I consider are poorly behaved. The solution and the moment mappings are responsible for the problems.

Relevance: 30.00%

Abstract:

We continue the development of a method for the selection of a bandwidth or of a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the L1 error of the selected estimate to that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still ensure that, for all densities, the L1 error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel, plus a constant times sqrt(log n / n), where n is the sample size and the constant depends only on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.
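The paper's selection method is data-driven; purely as a baseline illustration of how the L1 error of a kernel estimate depends on the bandwidth, one can compare a Gaussian kernel estimate against a known density (sample size and bandwidth grid are ours):

```python
import numpy as np

def kde(x_eval, data, h):
    """Gaussian kernel density estimate with bandwidth h."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
data = rng.standard_normal(200)            # sample from N(0, 1)
grid = np.linspace(-5.0, 5.0, 1001)
true_density = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)

# L1 error for a few bandwidths, by Riemann sum on the uniform grid:
# too small undersmooths, too large oversmooths.
dx = grid[1] - grid[0]
errors = {h: float(np.abs(kde(grid, data, h) - true_density).sum() * dx)
          for h in (0.05, 0.3, 1.5)}
best_h = min(errors, key=errors.get)
```

In practice the true density is unknown, which is exactly why the density-free guarantees discussed in the abstract matter.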

Relevance: 30.00%

Abstract:

A number of statistical tests for detecting population growth are described. We compared the statistical power of these tests with that of others available in the literature. The tests evaluated fall into three categories: those based on the distribution of mutation frequencies, on the haplotype distribution, and on the mismatch distribution. We found that, for an extensive variety of cases, the most powerful tests for detecting population growth are Fu's FS test and the newly developed R2 test. The behavior of the R2 test is superior for small sample sizes, whereas FS is better for large sample sizes. We also show that some popular statistics based on the mismatch distribution are very conservative.

Key words: population growth, population expansion, coalescent simulations, neutrality tests
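The R2 statistic, as commonly stated (Ramos-Onsins and Rozas), compares per-sequence singleton counts with half the mean number of pairwise differences. A sketch on a 0/1 haplotype matrix, assuming the ancestral state is coded 0 (real data would need alignment and missing-data handling):

```python
from itertools import combinations

def r2_statistic(sequences):
    """R2 = sqrt((1/n) * sum_i (U_i - k/2)^2) / S for aligned binary
    haplotypes, where U_i is the number of singletons carried by
    sequence i, k the mean pairwise difference, and S the number of
    segregating sites. Illustrative sketch of the commonly stated
    formula, not the authors' implementation."""
    n = len(sequences)
    sites = list(zip(*sequences))
    segregating = [s for s in sites if 0 < sum(s) < n]
    S = len(segregating)
    # U_i: derived allele observed exactly once, and in sequence i.
    U = [sum(1 for s in segregating if sum(s) == 1 and s[i] == 1)
         for i in range(n)]
    # k: average number of pairwise differences across all pairs.
    pairs = list(combinations(range(n), 2))
    k = sum(sum(a != b for a, b in zip(sequences[i], sequences[j]))
            for i, j in pairs) / len(pairs)
    return (sum((u - k / 2) ** 2 for u in U) / n) ** 0.5 / S
```

Under population growth, genealogies are star-like and most mutations are singletons, pushing each U_i toward k/2 and hence R2 toward small values.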