164 results for Empirical Testing
Abstract:
This paper explores analytically the contemporary pottery-making community of Pereruela (north-west Spain), which produces cooking pots from a mixture of red clay and kaolin. Analyses by different techniques (XRF, NAA, XRD, SEM and petrography) showed an extremely high variability for cooking ware pottery produced in a single production centre, by the same technology and using local clays. The main source of chemical variation is related to the use of different red clays and the presence of non-normally distributed inclusions of monazite. These two factors induce a high chemical variability, not only in the output of a single production centre but even in the paste of a single pot, to the extent that chemical compositions from one "workshop", or even one "pot", could be classified as having different provenances. The implications for the chemical characterization and provenance studies of archaeological ceramics are addressed.
Abstract:
The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m3) are the most frequent process. Empirical models may provide suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100,000–1:25,000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. The documentation of historical rockfalls through morphological analysis, eyewitness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, comprising about a hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the most widely adopted empirical models (the reach and shadow angle models) and to analyse the influence of the parameters affecting the travel distance (rockfall size, height of fall along the rock cliff and volume of the individual fallen rock block). For predicting travel distances on medium-scale maps, a method based on the "reach probability" concept has been proposed. The accuracy of the results has been tested against the line enveloping the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
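Both the reach and the shadow angle model reduce to simple trigonometry: the farthest block is expected where a line dipping at the characteristic angle from the reference point (release point or talus apex) intersects the ground. A minimal sketch of that geometry, with the angle value purely illustrative:

```python
import math

def angle_model_reach(drop_height_m, angle_deg):
    """Horizontal travel distance predicted by a reach/shadow angle model:
    the farthest block lies where a line dipping at the given angle from
    the reference point meets the ground below."""
    return drop_height_m / math.tan(math.radians(angle_deg))

# Example: a 50 m height difference and a 30 degree characteristic angle
reach_m = angle_model_reach(50.0, 30.0)
```

Shallower characteristic angles give longer predicted runout, which is why the angle calibrated from documented events drives the mapped extent.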
Abstract:
In this paper we report on the growth of thick films of magnetoresistive La2/3Sr1/3MnO3 by using spray and screen printing techniques on various substrates (Al2O3 and ZrO2). The growth conditions are explored in order to optimize the microstructure of the films. The films display a room-temperature magnetoresistance of 0.0012%/Oe in the 1 kOe field region. A magnetic sensor is described and tested.
Abstract:
The present study explores the statistical properties of a randomization test based on the random assignment of the intervention point in a two-phase (AB) single-case design. The focus is on randomization distributions constructed from the values of the test statistic for all possible random assignments and used to obtain p-values. The shape of those distributions is investigated for each specific data division defined by the moment at which the intervention is introduced. Another aim of the study was to test the detection of nonexistent effects (i.e., the production of false alarms) in autocorrelated data series, in which the assumption of exchangeability between observations may be untenable. In this way, it was possible to compare nominal and empirical Type I error rates in order to obtain evidence on the statistical validity of the randomization test for each individual data division. The results suggest that when either of the two phases has considerably fewer measurement times, Type I errors may be too probable and, hence, the decision-making process carried out by applied researchers may be jeopardized.
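As a rough sketch of how such a test operates (not the authors' exact procedure), the intervention point is treated as randomly assigned among all admissible positions, the test statistic is computed for every admissible assignment, and the p-value is the share of assignments whose statistic is at least as extreme as the observed one:

```python
import numpy as np

def ab_randomization_test(y, actual_k, min_phase=3):
    """Two-phase (AB) randomization test: the intervention point k is
    treated as randomly assigned among all admissible positions, and the
    statistic is the absolute difference between phase means."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    ks = range(min_phase, n - min_phase + 1)  # admissible intervention points
    stats = {k: abs(y[k:].mean() - y[:k].mean()) for k in ks}
    observed = stats[actual_k]
    # p-value: share of assignments at least as extreme as the observed one
    p = sum(s >= observed for s in stats.values()) / len(stats)
    return observed, p
```

Note that the smallest attainable p-value is 1 over the number of admissible divisions, which is exactly why short phases (few admissible points) make the test's nominal and empirical error rates diverge.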
Abstract:
The present work focuses on the skew-symmetry index as a measure of social reciprocity. This index is based on the correspondence between the amount of behaviour that each individual addresses to its partners and what it receives from them in return. Although the skew-symmetry index enables researchers to describe social groups, statistical inferential tests are required. The main aim of the present study is to propose an overall statistical technique for testing symmetry in experimental conditions, calculating the skew-symmetry statistic (Φ) at the group level. Sampling distributions for the skew-symmetry statistic have been estimated by means of a Monte Carlo simulation in order to allow researchers to make statistical decisions. Furthermore, this study allows researchers to choose the optimal experimental conditions for carrying out their research, as the power of the statistical test has been estimated. This statistical test could be used in experimental social psychology studies in which researchers may control the group size and the number of interactions within dyads.
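A sketch of the idea (the paper's exact definition of Φ may differ): decompose the sociomatrix into its symmetric and skew-symmetric parts, take the share of total variation carried by the skew-symmetric part, and build the statistic's null distribution by Monte Carlo, splitting each dyad's interactions in either direction with equal probability:

```python
import numpy as np

def skew_symmetry(X):
    """Skew-symmetry index of a sociomatrix X: share of total variation
    carried by the skew-symmetric part K of the decomposition X = S + K.
    0 means full reciprocity; 0.5 means completely unidirectional."""
    X = np.asarray(X, dtype=float)
    K = (X - X.T) / 2.0
    return np.trace(K.T @ K) / np.trace(X.T @ X)

def mc_pvalue(X, n_sim=10_000, seed=0):
    """Monte Carlo test: under the null, each dyad's interactions are
    addressed in either direction with probability one half."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X)
    obs = skew_symmetry(X)
    totals = X + X.T                       # interactions per dyad
    iu = np.triu_indices_from(X, k=1)
    count = 0
    for _ in range(n_sim):
        sim = np.zeros_like(X, dtype=float)
        up = rng.binomial(totals[iu].astype(int), 0.5)
        sim[iu] = up                       # one direction of each dyad
        sim.T[iu] = totals[iu] - up        # the opposite direction
        count += skew_symmetry(sim) >= obs
    return obs, (count + 1) / (n_sim + 1)
```

The same simulation loop, run over a grid of group sizes and dyad interaction counts, is what an a priori power analysis of the kind described above would iterate.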
Abstract:
In the first part of the study, nine estimators of the first-order autoregressive parameter are reviewed and a new estimator is proposed. The relationships and discrepancies between the estimators are discussed in order to achieve a clear differentiation. In the second part of the study, the precision of the estimation of autocorrelation is studied. The performance of the ten lag-one autocorrelation estimators is compared in terms of Mean Square Error (combining bias and variance) using data series generated by Monte Carlo simulation. The results show that there is no single optimal estimator for all conditions, suggesting that the estimator ought to be chosen according to sample size and to the information available about the possible direction of the serial dependence. Additionally, the probability of labelling an actually existing autocorrelation as statistically significant is explored using Monte Carlo sampling. The power estimates obtained are quite similar among the tests associated with the different estimators. These estimates evidence the small probability of detecting autocorrelation in series with fewer than 20 measurement times.
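The comparison framework can be sketched in a few lines (the conventional lag-one estimator shown here is only one of the ten compared in the paper): generate AR(1) series of a given length, apply an estimator to each, and summarize bias and variance jointly as the Mean Square Error around the true parameter:

```python
import numpy as np

def r1_conventional(y):
    """Conventional lag-one autocorrelation estimator."""
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    return float((d[:-1] * d[1:]).sum() / (d * d).sum())

def mse_by_simulation(estimator, phi=0.3, n=20, n_series=2000, seed=0):
    """Monte Carlo Mean Square Error of a lag-one estimator applied to
    AR(1) series of length n with autoregressive parameter phi."""
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(n_series):
        e = rng.standard_normal(n + 50)          # 50 burn-in observations
        y = np.empty(n + 50)
        y[0] = e[0]
        for t in range(1, n + 50):
            y[t] = phi * y[t - 1] + e[t]
        errors.append(estimator(y[50:]) - phi)   # drop the burn-in
    return float(np.mean(np.square(errors)))
```

Running this for several estimators, sample sizes and values of phi reproduces the kind of grid on which the "no single optimal estimator" conclusion rests.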
Abstract:
Purpose: The application of IAS 32 (NIC 32) to cooperatives has generated considerable controversy in recent years. To date, several studies have attempted to anticipate the possible effects of its application. This study analyses the impact of the first application of IAS 32 in the cooperative sector. Design/methodology/approach: A sample of 98 cooperatives was selected, and a comparative analysis of their financial information before and after the application of IAS 32 was carried out in order to determine the existing differences. The Wilcoxon rank-sum test was used to check whether these differences are significant. The Mann-Whitney U test was also used to check for significant differences in the relative impact of applying IAS 32 across several groups of cooperatives. Finally, the effects of applying IAS 32 on the cooperatives' financial and economic position, and on the evolution of their intangible assets, were analysed using financial statement analysis techniques. Findings: The results confirm that the application of IAS 32 causes significant differences in some balance sheet and income statement items, as well as in the ratios analysed. The main differences consist of a reduction in the cooperatives' level of capitalization and an increase in their indebtedness, together with a general worsening of their solvency and financial autonomy ratios. Limitations: It should be borne in mind that the study was carried out with a sample of cooperatives that are required to have their annual accounts audited; the results should therefore be interpreted in the context of large cooperatives. It should also be noted that we performed a comparative analysis of the 2011 and 2010 annual accounts.
This allowed us to identify the differences in the cooperatives' financial information before and after applying IAS 32, although some of these differences could also be caused by other factors, such as the economic situation, changes in the application of accounting standards, etc. Originality/value: We believe this is the right time for this research, since from 2011 all Spanish cooperatives have had to apply the accounting standards adapted to IAS 32. Moreover, to the best of our knowledge, there are no other similar studies based on the annual accounts of cooperatives that have already applied the accounting standards adapted to IAS 32. We believe the results of this research may be useful to different stakeholders. First, so that accounting standard setters can assess the scope of IAS 32 in cooperatives and propose improvements to the content of the standard. Second, so that the cooperatives themselves, as well as federations, confederations and other cooperative bodies, have information on the economic impact of the first application of IAS 32 and can make whatever assessments they consider appropriate. And third, so that financial institutions, auditors, advisers of cooperatives and other stakeholders have information on the changes in the cooperatives' annual accounts and can take them into account when making decisions. Keywords: Cooperatives, equity, share capital, IAS 32, solvency, effects of accounting regulation, financial information, ratios.
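In Python, a comparable workflow is available in scipy (shown here on invented ratio values purely to illustrate the mechanics, not the paper's data; for matched before/after observations on the same cooperatives, scipy's Wilcoxon test is the signed-rank variant, the standard choice for paired samples):

```python
import numpy as np
from scipy import stats

# Hypothetical equity-to-assets ratios for the same cooperatives
# before (2010) and after (2011) applying IAS 32 -- illustrative only.
before = np.array([0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58])
after = np.array([0.35, 0.49, 0.33, 0.54, 0.41, 0.47, 0.40, 0.51])

# Paired before/after comparison (Wilcoxon signed-rank test)
w_stat, w_p = stats.wilcoxon(before, after)

# Relative impact compared between two groups of cooperatives
# (Mann-Whitney U test on the relative changes)
impact_a = (after[:4] - before[:4]) / before[:4]
impact_b = (after[4:] - before[4:]) / before[4:]
u_stat, u_p = stats.mannwhitneyu(impact_a, impact_b)
```

Both tests are rank-based, so they need no normality assumption about the ratios, which is why they suit small accounting samples of this kind.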
Abstract:
Testing whether or not data could have been generated by a family of extreme value copulas is difficult. We generalize a test and prove that it can be applied whatever the alternative hypothesis. We also study the effect of using different extreme value copulas in the context of risk estimation. To measure the risk we use a quantile. Our results are motivated by a bivariate sample of losses from a real database of auto insurance claims. Methods are implemented in R.
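Whatever copula is ultimately selected, the risk figure in this setting is a quantile of the aggregated losses. A minimal illustration of that last step (with simulated lognormal losses and an independence assumption, not one of the paper's fitted extreme value copulas):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative bivariate losses; the paper instead fits extreme value
# copulas to a real auto insurance claims database to capture dependence.
x1 = rng.lognormal(mean=7.0, sigma=1.0, size=100_000)
x2 = rng.lognormal(mean=6.5, sigma=1.2, size=100_000)

total = x1 + x2
var_995 = np.quantile(total, 0.995)  # 99.5% quantile as the risk measure
```

The choice of copula matters precisely because upper-tail dependence between the two loss lines pushes this quantile up relative to the independent case.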
Abstract:
This paper proposes new methodologies for evaluating out-of-sample forecasting performance that are robust to the choice of the estimation window size. The methodologies involve evaluating the predictive ability of forecasting models over a wide range of window sizes. We show that the tests proposed in the literature may lack the power to detect predictive ability and might be subject to data snooping across different window sizes if used repeatedly. An empirical application shows the usefulness of the methodologies for evaluating exchange rate models' forecasting ability.
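A minimal sketch of the underlying exercise (the paper's actual tests are more elaborate): compute the out-of-sample mean squared prediction error of a simple one-regressor model for every window size in a range, so that conclusions do not rest on a single, possibly data-snooped, choice of window:

```python
import numpy as np

def mspe_by_window(y, x, windows):
    """Out-of-sample MSPE of a one-regressor forecasting model under a
    rolling estimation scheme, computed for each window size R."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
    out = {}
    for R in windows:
        sq_errs = []
        for t in range(R, len(y) - 1):
            # OLS fit of y[s+1] on x[s] over the most recent R observations
            b, a = np.polyfit(x[t - R:t], y[t - R + 1:t + 1], 1)
            sq_errs.append((y[t + 1] - (a + b * x[t])) ** 2)  # 1-step error
        out[R] = float(np.mean(sq_errs))
    return out
```

Reporting the whole profile of MSPEs across window sizes, rather than the best one, is what removes the temptation to search over windows until a test rejects.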
Abstract:
This paper examines the statistical analysis of social reciprocity, that is, the balance between addressing and receiving behaviour in social interactions. Specifically, it focuses on the measurement of social reciprocity by means of directionality and skew-symmetry statistics at different levels. Two statistics have been used as overall measures of social reciprocity at the group level: the directional consistency and skew-symmetry statistics. Furthermore, the skew-symmetry statistic allows social researchers to obtain complementary information at the dyadic and individual levels. However, having computed these measures, social researchers may be interested in testing statistical hypotheses regarding social reciprocity. For this reason, a statistical procedure based on Monte Carlo sampling has been developed in order to allow social researchers to describe groups and make statistical decisions.
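The directional consistency index is commonly computed as (H − L)/(H + L), where H and L are the total interactions in each dyad's more and less frequent directions. A small sketch, assuming that convention (the paper's formulation may differ in detail):

```python
import numpy as np

def directional_consistency(X):
    """Directional consistency (DC) index of a sociomatrix: the excess of
    behaviour in each dyad's dominant direction, relative to the total.
    1 means fully unidirectional dyads; 0 means perfectly balanced ones."""
    X = np.asarray(X, dtype=float)
    iu = np.triu_indices_from(X, k=1)
    upper, lower = X[iu], X.T[iu]          # the two directions of each dyad
    H = np.maximum(upper, lower).sum()
    L = np.minimum(upper, lower).sum()
    return (H - L) / (H + L)
```

As with the skew-symmetry statistic, a Monte Carlo reference distribution for DC can be built by splitting each dyad's total interactions at random under the null of reciprocity.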
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved efforts to translate statistical epistasis into biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the squared number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn's disease.
Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) Processor 2352 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn's disease (CD) data.
Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn's disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
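The memory idea behind maxT can be sketched in a few lines (a generic maxT permutation scheme, not MB-MDR itself): per permutation of the trait, only the maximum statistic over all tests is kept, so storage is proportional to the number of permutations rather than to permutations times tests:

```python
import numpy as np

def maxt_adjusted_pvalues(data, labels, stat_fn, n_perm=999, seed=0):
    """Family-wise-error adjusted p-values via maxT: each observed
    statistic is compared with the null distribution of the
    per-permutation *maximum*, so memory does not grow with the
    number of tests being screened."""
    rng = np.random.default_rng(seed)
    observed = stat_fn(data, labels)            # one statistic per test
    max_null = np.empty(n_perm)
    for b in range(n_perm):
        perm_labels = rng.permutation(labels)   # permute the trait
        max_null[b] = stat_fn(data, perm_labels).max()  # keep only the max
    # adjusted p-value: share of permutations whose max beats the observed
    return (1 + (max_null[:, None] >= observed).sum(axis=0)) / (n_perm + 1)

def mean_diff(X, labels):
    """Example per-column statistic: |difference in group means|."""
    return np.abs(X[labels == 1].mean(axis=0) - X[labels == 0].mean(axis=0))
```

With 999 permutations this needs only 999 stored maxima, whether the screen covers a thousand tests or the squared number of SNPs of a genome-wide interaction search.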
Abstract:
In this work we explore the multivariate empirical mode decomposition (EMD) combined with a neural network classifier as a technique for face recognition tasks. Images are simultaneously decomposed by means of EMD, and the distance between the modes of an image and the modes of the representative image of each class is calculated using three different distance measures. A neural network is then trained using 10-fold cross-validation in order to derive a classifier. Preliminary results (over 98% classification rate) are satisfactory and justify a deeper investigation into how to apply mEMD to face recognition.
Abstract:
Artifacts are present in most electroencephalography (EEG) recordings, making it difficult to interpret or analyze the data. In this paper a cleaning procedure based on a multivariate extension of empirical mode decomposition is used to improve the quality of the data. This is achieved by applying the cleaning method to raw EEG data. A synchrony measure is then applied to the raw and the cleaned data in order to compare the improvement in the classification rate. Two classifiers are used: linear discriminant analysis and neural networks. In both cases, the classification rate is improved by about 20%.
Abstract:
This paper develops an approach to rank testing that nests all existing rank tests and simplifies their asymptotics. The approach is based on the fact that implicit in every rank test there are estimators of the null spaces of the matrix in question. The approach yields many new insights about the behavior of rank testing statistics under the null as well as local and global alternatives, in both the standard and the cointegration setting. The approach also suggests many new rank tests based on alternative estimates of the null spaces, as well as the new fixed-b theory. A brief Monte Carlo study illustrates the results.
Abstract:
This study analyzed the interaction between organizational change, cultural values and technological change in the Catalan health system. The study is divided into five parts. The first is a content analysis of health-related websites in Catalonia. The second is a study of Internet use for health-related matters among the general population, patient associations and health professionals, based on an online survey adapted to each of these groups. The third part is a fieldwork study of the pilot programmes carried out by the Catalan Government in several local areas and hospitals to electronically integrate patients' clinical records. The fourth is a study of the organizational implications of introducing information systems into the management of hospitals and primary care centres at the Institut Català de Salut, the main public health provider in Catalonia, based on an online survey and in-depth interviews. The fifth part is a case study of the organizational and social effects of introducing information and communication technologies in one of Catalonia's main hospitals, the Hospital Clínic de Barcelona. The study was carried out between May 2005 and July 2007.