938 results for Test methods
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
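The transform-then-back-transform step this abstract describes can be sketched with the centred log-ratio (clr) transform from Aitchison's framework; the toy death density below is invented for illustration, and the paper itself may use a different log-ratio variant.

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform: maps a positive unit-sum
    composition into unconstrained real space."""
    x = np.asarray(x, dtype=float)
    return np.log(x) - np.log(x).mean()

def clr_inverse(y):
    """Back-transform: exponentiate, then renormalise so that the
    unit-sum constraint is honoured."""
    z = np.exp(np.asarray(y, dtype=float))
    return z / z.sum()

# Toy death density over four age groups (sums to 1)
d = np.array([0.1, 0.2, 0.3, 0.4])
y = clr(d)                # real-valued; ordinary multivariate tools apply
d_back = clr_inverse(y)   # recovers the original composition
```

Because clr coordinates sum to zero, standard multivariate models (such as the Lee-Carter structure mentioned above) can be fitted in the transformed space and the forecasts mapped back to valid compositions.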
Abstract:
In most psychological tests and questionnaires, a test score is obtained by taking the sum of the item scores. In virtually all cases where the test or questionnaire contains multidimensional forced-choice items, this traditional scoring method is also applied. We argue that the summation of scores obtained with multidimensional forced-choice items produces uninterpretable test scores. Therefore, we propose three alternative scoring methods: a weak and a strict rank-preserving scoring method, which both allow an ordinal interpretation of test scores; and a ratio-preserving scoring method, which allows a proportional interpretation of test scores. Each proposed scoring method yields an index for each respondent indicating the degree to which the response pattern is inconsistent. Analysis of real data showed that with respect to rank preservation, the weak and strict rank-preserving methods resulted in lower inconsistency indices than the traditional scoring method; with respect to ratio preservation, the ratio-preserving scoring method resulted in lower inconsistency indices than the traditional scoring method.
Abstract:
Abstract taken from the publication.
Abstract:
Introduction: The Val30Met mutation of the transthyretin (TTR) gene causes familial amyloid polyneuropathy, compromising in its early stages the small nerve fibres (myelinated Aδ and unmyelinated type C) involved in autonomic function, nociception, thermal perception, and sweating. Conventional neurophysiological methods fail to detect these abnormalities, delaying the start of disease-specific treatments. Methods: The main objective was to evaluate quantitative sensory testing (QST) as a method for the early detection of small-fibre abnormalities in Val30Met individuals followed at Hospital Universitario Santa María, Lisbon. Patients were classified into 3 groups according to symptoms and neurological examination. Thresholds for cold perception, heat pain, and vibration were analysed across the groups and compared with healthy controls. Results: There were 18 recordings from healthy controls and 33 from mutation carriers, divided into asymptomatic (24.2%), symptomatic with normal neurological examination (42.4%), and symptomatic with abnormal neurological examination (33.3%). No differences were found between asymptomatic patients and controls. Cold thresholds (p=0.042) and intermediate heat pain thresholds (HP 5) (p=0.007) were elevated in symptomatic Val30Met individuals with a normal examination. In symptomatic patients with examination abnormalities, the interval between heat pain onset and intermediate heat pain (HP 5-0.5) was also altered (p=0.009). Discussion: Cold and heat-pain perception thresholds can detect abnormalities in symptomatic carriers of the TTR Val30Met mutation, including those without objective changes on neurological examination.
Abstract:
Introduction: Schizophrenia is a serious and chronic mental illness that affects the cognitive and social functioning of those who suffer from it. Recent research suggests that social cognition subprocesses, such as Theory of Mind, social perception, and emotional processing, are related to some of the problems that patients show in their social adjustment. Aim: To assess the ability to recognize mental states from facial expressions in schizophrenia patients compared with a control group. Subjects and methods: 17 stable schizophrenia patients with awareness of their illness and 17 healthy people of the same age and sociocultural level took Baron-Cohen's "Reading the Mind in the Eyes" Test, Revised Version. Results: Compared with the control group, subjects with schizophrenia showed much lower scores. Conclusions: It is confirmed that schizophrenia patients are impaired in understanding facial expressions, especially from the eyes. This is typical of the illness, so interventions targeting this deficit are necessary. Furthermore, the inability to recognize emotions, as a domain of social cognition, contributes to the deficit in functional outcome in schizophrenia. Finally, some treatment programs are put forward.
Abstract:
Important assessment instruments exist for measuring motor skills or competencies in children; even so, Colombia lacks studies demonstrating the validity and reliability of a measurement test that would permit a sound judgement of children's motor competencies, bearing in mind that intervention must be based on the rigour demanded by the processes of assessing and evaluating body movement. Objective: This study focused on determining the psychometric properties of the Bruininks-Oseretsky Test of Motor Proficiency, second edition (BOT-2). Materials and methods: An evaluation of diagnostic tests was carried out with 24 apparently healthy children of both sexes, aged 4 to 7 years, living in the cities of Chía and Bogotá. The assessment was performed by 3 expert evaluators; internal consistency was analysed using Cronbach's alpha coefficient, reproducibility was established through the intraclass correlation coefficient (ICC), and concurrent validity was analysed using Pearson's correlation coefficient, with alpha=0.05. Results: High indices of reliability and validity were found for all the tests. Conclusions: The BOT-2 is a valid and reliable instrument that can be used to evaluate and identify the level of development of children's motor competencies.
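Of the three statistics used in this study, Cronbach's alpha is the simplest to compute directly from an item-score matrix; a minimal sketch with invented scores (the data and item count are illustrative, not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, n_items) matrix:
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Invented scores for 5 children on 4 motor-task items
x = [[3, 4, 3, 4],
     [2, 2, 3, 2],
     [4, 4, 4, 5],
     [1, 2, 1, 1],
     [3, 3, 4, 3]]
alpha = cronbach_alpha(x)
```

Values near 1 indicate that the items measure a common construct consistently, which is the sense in which the abstract reports "high indices of reliability".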
Abstract:
Given an observed test statistic and its degrees of freedom, one may compute the observed P value with most statistical packages. It is unknown to what extent test statistics and P values are congruent in published medical papers. Methods: We checked the congruence of statistical results reported in all the papers of volumes 409–412 of Nature (2001) and in a random sample of 63 results from volumes 322–323 of BMJ (2001). We also tested whether the frequencies of the last digit of a sample of 610 test statistics deviated from a uniform distribution (i.e., equally probable digits). Results: 11.6% (21 of 181) and 11.1% (7 of 63) of the statistical results published in Nature and BMJ, respectively, during 2001 were incongruent, probably mostly due to rounding, transcription, or typesetting errors. At least one such error appeared in 38% and 25% of the papers of Nature and BMJ, respectively. In 12% of the cases, the significance level might change by one or more orders of magnitude. The frequencies of the last digit of the statistics deviated from the uniform distribution and suggested digit preference in rounding and reporting. Conclusions: This incongruence of test statistics and P values is another example that statistical practice is generally poor, even in the most renowned scientific journals, and that the quality of papers should be more closely controlled and valued.
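The last-digit check described above can be sketched as a chi-square goodness-of-fit test against a uniform distribution over the digits 0-9; the helper name and the toy inputs below are invented.

```python
from collections import Counter

def last_digit_chi2(reported):
    """Chi-square goodness-of-fit statistic comparing the last digits
    of reported test statistics (given as strings) against a uniform
    distribution: expected count n/10 for each digit 0-9."""
    digits = [s.strip()[-1] for s in reported]
    n = len(digits)
    expected = n / 10
    counts = Counter(digits)
    return sum((counts.get(str(d), 0) - expected) ** 2 / expected
               for d in range(10))

# One of each last digit: perfectly uniform, statistic is 0
uniform = [f"2.{d}" for d in range(10)]
chi2 = last_digit_chi2(uniform)
```

With 9 degrees of freedom, values above roughly 16.9 would reject uniformity at the 5% level, which is the kind of digit-preference signal the study reports.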
Abstract:
The goal of this review is to provide a state-of-the-art survey of sampling and probe methods for the solution of inverse problems. Further, a configuration approach to some of the problems is presented. We study the concepts and analytical results for several recent sampling and probe methods. We give an introduction to the basic idea behind each method using a simple model problem, and then provide a general formulation in terms of particular configurations to study the range of the arguments which are used to set up the method. This provides a novel way to present the algorithms and the analytic arguments for their investigation in a variety of different settings. In detail, we investigate the probe method (Ikehata), the linear sampling method (Colton-Kirsch), the factorization method (Kirsch), the singular sources method (Potthast), the no response test (Luke-Potthast), the range test (Kusiak, Potthast and Sylvester) and the enclosure method (Ikehata) for the solution of inverse acoustic and electromagnetic scattering problems. The main ideas, approaches and convergence results of the methods are presented. For each method, we provide a historical survey of applications to different situations.
Abstract:
In this paper we consider the scattering of a plane acoustic or electromagnetic wave by a one-dimensional, periodic rough surface. We restrict the discussion to the case when the boundary is sound soft in the acoustic case, perfectly reflecting with TE polarization in the EM case, so that the total field vanishes on the boundary. We propose a uniquely solvable first kind integral equation formulation of the problem, which amounts to a requirement that the normal derivative of the Green's representation formula for the total field vanish on a horizontal line below the scattering surface. We then discuss the numerical solution by Galerkin's method of this (ill-posed) integral equation. We point out that, with two particular choices of the trial and test spaces, we recover the so-called SC (spectral-coordinate) and SS (spectral-spectral) numerical schemes of DeSanto et al., Waves Random Media, 8, 315-414 (1998). We next propose a new Galerkin scheme, a modification of the SS method that we term the SS* method, which is an instance of the well-known dual least squares Galerkin method. We show that the SS* method is always well-defined and is optimally convergent as the size of the approximation space increases. Moreover, we make a connection with the classical least squares method, in which the coefficients in the Rayleigh expansion of the solution are determined by enforcing the boundary condition in a least squares sense, pointing out that the linear system to be solved in the SS* method is identical to that in the least squares method. Using this connection we show that (reflecting the ill-posed nature of the integral equation solved) the condition number of the linear system in the SS* and least squares methods approaches infinity as the approximation space increases in size. We also provide theoretical error bounds on the condition number and on the errors induced in the numerical solution computed as a result of ill-conditioning.
Numerical results confirm the convergence of the SS* method and illustrate the ill-conditioning that arises.
Abstract:
Satellite-observed data for flood events have been used to calibrate and validate flood inundation models, providing valuable information on the spatial extent of the flood. Improvements in the resolution of this satellite imagery have enabled indirect remote sensing of water levels by using an underlying LiDAR DEM to extract the water surface elevation at the flood margin. In addition to comparison of the spatial extent, this now allows direct comparison between modelled and observed water surface elevations. Using a 12.5m ERS-1 image of a flood event in 2006 on the River Dee, North Wales, UK, both of these data types are extracted and each is assessed for its value in the calibration of flood inundation models. A LiDAR-guided snake algorithm is used to extract an outline of the flood from the satellite image. From the extracted outline a binary grid of wet/dry cells is created at the same resolution as the model; using this, the spatial extent of the modelled and observed flood can be compared using a measure of fit between the two binary patterns of flooding. Water heights are extracted using points at intervals of approximately 100m along the extracted outline, and Student's t-test is used to compare modelled and observed water surface elevations. A LISFLOOD-FP model of the catchment is set up using LiDAR topographic data resampled to the 12.5m resolution of the satellite image, and calibration of the friction parameter in the model is undertaken using each of the two approaches. Comparison between the two approaches highlights the sensitivity of the spatial measure of fit to uncertainty in the observed data and the potential drawbacks of using the spatial extent when parts of the flood are contained by the topography.
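The abstract does not define its "measure of fit between the two binary patterns of flooding"; a form commonly used in the flood-inundation literature, F = A/(A+B+C), is sketched below as an assumption, with invented toy grids.

```python
import numpy as np

def fit_measure(modelled, observed):
    """F = A / (A + B + C), where A = cells wet in both model and
    observation, B = wet only in the model (false positives), and
    C = wet only in the observation (misses).  F = 1 is perfect
    agreement; cells dry in both grids are ignored."""
    m = np.asarray(modelled, dtype=bool)
    o = np.asarray(observed, dtype=bool)
    a = (m & o).sum()
    b = (m & ~o).sum()
    c = (~m & o).sum()
    return float(a) / float(a + b + c)

# Toy 2x3 binary grids of wet (1) / dry (0) cells
f = fit_measure([[1, 1, 0], [0, 1, 0]],
                [[1, 0, 0], [0, 1, 1]])
```

Ignoring the dry-dry cells is what makes such a measure sensitive to uncertainty at the flood margin, consistent with the sensitivity the abstract reports.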
Abstract:
We propose a novel method for scoring the accuracy of protein binding site predictions – the Binding-site Distance Test (BDT) score. Recently, the Matthews Correlation Coefficient (MCC) has been used to evaluate binding site predictions, both by developers of new methods and by the assessors for the community wide prediction experiment – CASP8. Whilst being a rigorous scoring method, the MCC does not take into account the actual 3D location of the predicted residues from the observed binding site. Thus, an incorrectly predicted site that is nevertheless close to the observed binding site will obtain an identical score to the same number of nonbinding residues predicted at random. The MCC is somewhat affected by the subjectivity of determining observed binding residues and the ambiguity of choosing distance cutoffs. By contrast the BDT method produces continuous scores ranging between 0 and 1, relating to the distance between the predicted and observed residues. Residues predicted close to the binding site will score higher than those more distant, providing a better reflection of the true accuracy of predictions. The CASP8 function predictions were evaluated using both the MCC and BDT methods and the scores were compared. The BDT was found to strongly correlate with the MCC scores whilst also being less susceptible to the subjectivity of defining binding residues. We therefore suggest that this new simple score is a potentially more robust method for future evaluations of protein-ligand binding site predictions.
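The abstract does not give the BDT formula itself; the sketch below only illustrates the general idea of a continuous, distance-weighted score in the spirit of BDT. The decay function, the scale `d0`, and the coordinate inputs are all invented, not the published definition.

```python
import math

def distance_score(predicted, observed, d0=5.0):
    """Continuous score in [0, 1]: each predicted residue contributes
    according to the distance to its nearest observed binding residue,
    decaying with an assumed scale d0 (in angstroms).  predicted and
    observed are lists of 3D coordinate tuples."""
    if not predicted or not observed:
        return 0.0
    total = 0.0
    for p in predicted:
        d = min(math.dist(p, o) for o in observed)
        total += 1.0 / (1.0 + (d / d0) ** 2)
    return total / len(predicted)

# A residue predicted exactly on the observed site scores 1.0;
# a distant one contributes almost nothing.
perfect = distance_score([(0.0, 0.0, 0.0)], [(0.0, 0.0, 0.0)])
```

Unlike a binary correct/incorrect count such as the MCC, a score of this shape rewards near misses, which is the property the abstract argues for.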
Abstract:
A modified chlorophyll fluorescence technique was evaluated as a rapid diagnostic test of the susceptibility of wheat cultivars to chlorotoluron. Two winter wheat cultivars (Maris Huntsman and Mercia) exhibited differential response to the herbicide. All of the parameters of chlorophyll fluorescence examined were strongly influenced by herbicide concentration. Additionally, the procedure adopted here for the examination of winter wheat cultivar sensitivity to herbicide indicated that the area above the fluorescence induction curve and the ratio F-V/F-M are appropriate chlorophyll fluorescence parameters for detection of differential herbicide response between wheat cultivars. The potential use of this technique as an alternative to traditional methods of screening new winter wheat cultivars for their response to photosynthetic inhibitor herbicide is demonstrated here.
Abstract:
The proportional odds model provides a powerful tool for analysing ordered categorical data and setting sample size, although for many clinical trials its validity is questionable. The purpose of this paper is to present a new class of constrained odds models which includes the proportional odds model. The efficient score and Fisher's information are derived from the profile likelihood for the constrained odds model. These results are new even for the special case of proportional odds where the resulting statistics define the Mann-Whitney test. A strategy is described involving selecting one of these models in advance, requiring assumptions as strong as those underlying proportional odds, but allowing a choice of such models. The accuracy of the new procedure and its power are evaluated.
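The Mann-Whitney statistic that the efficient score reduces to under proportional odds can be computed directly; a minimal sketch with invented ordered-category scores for two arms:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic: the number of pairs (xi, yj) with
    xi > yj, counting tied pairs as 1/2 (ties are common with
    ordered categorical data)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Invented ordered-category responses for two treatment arms
u = mann_whitney_u([3, 4, 5], [1, 2, 3])
```

The O(n*m) pair count is fine for a sketch; in practice the statistic is usually computed from ranks.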
Abstract:
This paper considers methods for testing for superiority or non-inferiority in active-control trials with binary data, when the relative treatment effect is expressed as an odds ratio. Three asymptotic tests for the log-odds ratio based on the unconditional binary likelihood are presented, namely the likelihood ratio, Wald and score tests. All three tests can be implemented straightforwardly in standard statistical software packages, as can the corresponding confidence intervals. Simulations indicate that the three alternatives are similar in terms of the Type I error, with values close to the nominal level. However, when the non-inferiority margin becomes large, the score test slightly exceeds the nominal level. In general, the highest power is obtained from the score test, although all three tests are similar and the observed differences in power are not of practical importance. Copyright (C) 2007 John Wiley & Sons, Ltd.
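Of the three asymptotic tests, the Wald test for the log-odds ratio is the easiest to sketch from two independent binomial samples; the counts below are invented, and a zero-cell correction (omitted here) would be needed in practice.

```python
import math

def log_or_wald(x1, n1, x2, n2):
    """Wald statistic for H0: log odds ratio = 0, from two
    independent binomial samples (x successes out of n per arm).
    Assumes no zero cells."""
    log_or = math.log(x1 * (n2 - x2) / (x2 * (n1 - x1)))
    se = math.sqrt(1 / x1 + 1 / (n1 - x1) + 1 / x2 + 1 / (n2 - x2))
    return log_or / se

# Invented counts: 40/50 responders vs 30/50.  For non-inferiority
# with margin delta on the log-odds scale, one would instead test
# (log_or - delta) / se against the normal quantile.
z = log_or_wald(40, 50, 30, 50)
```

The statistic is compared with standard normal quantiles, and inverting it gives the corresponding Wald confidence interval for the log-odds ratio.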
Abstract:
This paper presents a reappraisal of the blood clotting response (BCR) tests for anticoagulant rodenticides, and proposes a standardised methodology for identifying and quantifying physiological resistance in populations of rodent species. The standardisation is based on the International Normalised Ratio, which is standardised against a WHO international reference preparation of thromboplastin, and allows comparison of data obtained using different thromboplastin reagents. The methodology is statistically sound, being based on the 50% response, and has been validated against the Norway rat (Rattus norvegicus) and the house mouse (Mus domesticus). Susceptibility baseline data are presented for warfarin, diphacinone, chlorophacinone and coumatetralyl against the Norway rat, and for bromadiolone, difenacoum, difethialone, flocoumafen and brodifacoum against the Norway rat and the house mouse. A 'test dose' of twice the ED50 can be used for initial identification of resistance, and will provide a similar level of information to previously published methods. Higher multiples of the ED50 can be used to assess the resistance factor, and to predict the likely impact on field control.