975 results for LIKELIHOOD RATIO TEST


Relevance:

30.00%

Publisher:

Abstract:

Coronary artery disease (CAD) is a leading cause of death worldwide. The standard method for evaluating critical partial occlusions is coronary arteriography, a catheterization technique that is invasive, time-consuming, and costly. There are noninvasive approaches for the early detection of CAD. The basis for the noninvasive diagnosis of CAD is a sequential analysis of risk factors together with the results of the treadmill test and myocardial perfusion scintigraphy (MPS). Many investigators have demonstrated that the diagnostic applications of MPS are appropriate for patients who have an intermediate likelihood of disease. Although this information is useful, it is only partially exploited in clinical practice because of the difficulty of properly classifying patients. Since the seminal work of Lotfi Zadeh, fuzzy logic has been applied in numerous areas. In the present study, we proposed and tested a model, based on fuzzy set theory, for selecting patients for MPS. A group of 1053 patients was used to develop the model and another group of 1045 patients was used to test it. Receiver operating characteristic curves were used to compare the performance of the fuzzy model against expert physician opinions, and showed that the performance of the fuzzy model was equal or superior to that of the physicians. We therefore conclude that the fuzzy model could be a useful tool to assist the general practitioner in selecting patients for MPS.
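As a rough illustration of the kind of fuzzy-set classification described above, the sketch below encodes a single input (a patient's pretest probability of CAD) with triangular membership functions for low, intermediate, and high likelihood of disease, and refers patients with sufficiently "intermediate" membership to MPS. The input variable, the breakpoints, and the referral rule are invented for illustration; they are not the published model.

    # Minimal fuzzy-set triage sketch; membership functions are invented.
    def triangular(x, a, b, c):
        """Triangular membership function with feet at a and c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def likelihood_memberships(pretest_prob):
        """Degrees of membership (0-1) in low/intermediate/high likelihood of
        disease for a pretest probability given in percent."""
        return {
            "low": triangular(pretest_prob, -1.0, 0.0, 30.0),
            "intermediate": triangular(pretest_prob, 20.0, 50.0, 80.0),
            "high": triangular(pretest_prob, 70.0, 100.0, 101.0),
        }

    def refer_to_mps(pretest_prob, threshold=0.5):
        """Simple rule: refer to MPS when the 'intermediate' membership,
        the group for which MPS is most appropriate, is high enough."""
        return likelihood_memberships(pretest_prob)["intermediate"] >= threshold

    for p in (10, 35, 50, 85):
        print(p, likelihood_memberships(p), refer_to_mps(p))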

Relevance:

30.00%

Publisher:

Abstract:

In numerous motor tasks, muscles around a joint act coactively to generate opposite torques. A variety of indexes based on electromyography signals have been presented in the literature to quantify muscle coactivation, but it is not known how reliably coactivation can be estimated with such indexes. The goal of this study was to test the reliability of the estimation of muscle coactivation using electromyography. Isometric coactivation was obtained at various muscle activation levels; for this task, any coactivation measurement/index should yield the maximal score (100% coactivation). Two coactivation indexes were applied. In the first, the antagonistic muscle activity (the lower of the electromyographic signals from the two muscles generating opposite joint torques) is divided by the mean of the agonistic and antagonistic muscle activations. In the second, the ratio between antagonistic and agonistic muscle activation is calculated. Moreover, we computed these indexes using different electromyographic amplitude normalization procedures. The first algorithm, with all signals normalized by their respective maxima obtained during maximal voluntary coactivation, generated the index closest to the true value (100%), reaching 92 ± 6%. In contrast, the coactivation index was 82 ± 12% when the second algorithm was applied and the electromyographic signal was not normalized (P < 0.04). The new finding of the present study is that muscle coactivation is estimated more reliably if the EMG signals are normalized by their respective maximal voluntary contractions obtained during maximal coactivation before the antagonistic muscle activity is divided by the mean of the agonistic and antagonistic muscle activations.
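A minimal Python sketch of the two indexes described above; the EMG amplitudes, MVC values, and variable names are assumptions for illustration only.

    def coactivation_index_mean(ago, ant):
        """Index 1: antagonist activity divided by the mean of agonist and
        antagonist activity, expressed as a percentage."""
        return 100.0 * ant / ((ago + ant) / 2.0)

    def coactivation_index_ratio(ago, ant):
        """Index 2: ratio of antagonist to agonist activity, as a percentage."""
        return 100.0 * ant / ago

    # Rectified, smoothed EMG amplitudes (arbitrary units) for two opposing
    # muscles; normalization by each muscle's maximal voluntary contraction
    # (MVC) value is the step the study found to improve reliability.
    emg_a, emg_b = 0.42, 0.38
    mvc_a, mvc_b = 0.50, 0.45
    a_norm, b_norm = emg_a / mvc_a, emg_b / mvc_b
    ago, ant = max(a_norm, b_norm), min(a_norm, b_norm)
    print(coactivation_index_mean(ago, ant), coactivation_index_ratio(ago, ant))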

Relevance:

30.00%

Publisher:

Abstract:

The generalized maximum likelihood method was used to determine binary interaction parameters between carbon dioxide and components of orange essential oil. Vapor-liquid equilibrium was modeled with the Peng-Robinson and Soave-Redlich-Kwong equations, using a methodology proposed in 1979 by Asselineau, Bogdanic and Vidal. Experimental vapor-liquid equilibrium data on binary mixtures formed by carbon dioxide and compounds usually found in orange essential oil were used to test the model. These systems were chosen to demonstrate that the maximum likelihood method produces binary interaction parameters for cubic equations of state capable of satisfactorily describing phase equilibrium, even for a binary such as ethanol/CO2. The results corroborate that both the Peng-Robinson and the Soave-Redlich-Kwong equations can be used to describe the phase equilibrium of systems formed by components of orange essential oil and CO2.
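The abstract does not reproduce the fitting procedure, but the sketch below shows where a binary interaction parameter k_ij enters a cubic equation of state, via the classical van der Waals one-fluid mixing rules used with the Peng-Robinson and Soave-Redlich-Kwong equations. The pure-component parameters and the k_12 value are placeholders, not results from the paper.

    from math import sqrt

    def mixture_parameters(x, a, b, k):
        """x: mole fractions, a/b: pure-component EOS parameters,
        k: symmetric matrix of binary interaction parameters k_ij."""
        n = len(x)
        a_mix = sum(x[i] * x[j] * sqrt(a[i] * a[j]) * (1.0 - k[i][j])
                    for i in range(n) for j in range(n))
        b_mix = sum(x[i] * b[i] for i in range(n))
        return a_mix, b_mix

    # Hypothetical CO2/limonene binary with a fitted k_12 (placeholder values).
    x = [0.7, 0.3]
    a = [0.396, 2.85]        # Pa*m^6/mol^2
    b = [2.67e-5, 1.55e-4]   # m^3/mol
    k = [[0.0, 0.09], [0.09, 0.0]]
    print(mixture_parameters(x, a, b, k))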

Relevance:

30.00%

Publisher:

Abstract:

Affiliation: Claudia Kleinman, Nicolas Rodrigue & Hervé Philippe : Département de biochimie, Faculté de médecine, Université de Montréal

Relevance:

30.00%

Publisher:

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified within the algebraic structure of A2(P) with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(P_prior), and weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P): regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to the finite-dimensional subspaces formed by their posteriors in the dual information space A2(P_prior). The Aitchison norm can be identified with mean Fisher information, and the closing constant is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, so the space A2(P) can be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimating functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
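The A2(P) construction itself is not reproduced here, but the finite-dimensional compositional operations it generalizes are easy to illustrate: the sketch below computes the centered log-ratio transform and the Aitchison distance for simple compositions.

    import numpy as np

    def clr(x):
        """Centered log-ratio transform of a composition (positive parts)."""
        logx = np.log(np.asarray(x, dtype=float))
        return logx - logx.mean()

    def aitchison_distance(x, y):
        """Aitchison distance = Euclidean distance between clr transforms."""
        return float(np.linalg.norm(clr(x) - clr(y)))

    p = [0.2, 0.3, 0.5]
    q = [0.1, 0.4, 0.5]
    print(clr(p), aitchison_distance(p, q))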

Relevance:

30.00%

Publisher:

Abstract:

Objective: Several bioelectrical impedance analysis (BIA) devices have recently been proposed for the rapid estimation of body fat. However, body fat reference values for Colombian children and adolescents have not been published. The aim of this study was to establish BIA-derived body fat percentiles in children and adolescents aged 9 to 17.9 years from Bogotá, Colombia, enrolled in the FUPRECOL study. Methods: Descriptive, cross-sectional study of 2,526 children and 3,324 adolescents aged 9 to 17.9 years attending public schools in Bogotá, Colombia. Body fat percentage was measured with a Tanita® Body Composition Analyzer (model BF-689), by age and sex. Weight, height, waist circumference, and hip circumference were measured, and sexual maturation status was obtained by self-report. Percentiles (P3, P10, P25, P50, P75, P90 and P97) and centile curves were calculated with the LMS method by sex and age, and the observed values were compared with international standards. Results: Body fat percentage values and percentile curves are presented. In most age groups, girls had higher body fat than boys. Subjects whose body fat percentage was above the 90th percentile of the standard normal distribution were considered to be at high cardiovascular risk (boys from 23.4-28.3% and girls from 31.0-34.1%). Overall, our body fat percentages were lower than values reported for Turkey, Germany, Greece, Spain, and the United Kingdom. Conclusions: Age- and sex-specific BIA body fat percentiles are presented that can be used as a reference for assessing nutritional status and predicting cardiovascular risk from an early age.
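A brief sketch of the LMS calculation underlying such centile curves: for fitted L (skewness), M (median), and S (coefficient of variation) values of one age/sex group, a centile and a z-score follow from the formulas below. The L, M, S values in the example are invented placeholders, not results from this study.

    from math import exp, log

    Z = {"P3": -1.881, "P10": -1.282, "P25": -0.674, "P50": 0.0,
         "P75": 0.674, "P90": 1.282, "P97": 1.881}

    def lms_centile(L, M, S, z):
        """Measurement value at normal quantile z: M*(1 + L*S*z)**(1/L) for L != 0."""
        return M * (1.0 + L * S * z) ** (1.0 / L) if L != 0 else M * exp(S * z)

    def lms_zscore(x, L, M, S):
        """Age/sex-standardized z-score: ((x/M)**L - 1)/(L*S) for L != 0."""
        return ((x / M) ** L - 1.0) / (L * S) if L != 0 else log(x / M) / S

    # Invented example: body fat % with L = -0.5, M = 24, S = 0.18.
    for name, z in Z.items():
        print(name, round(lms_centile(-0.5, 24.0, 0.18, z), 1))
    print("z-score of 31%:", round(lms_zscore(31.0, -0.5, 24.0, 0.18), 2))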

Relevance:

30.00%

Publisher:

Abstract:

Objective: To determine the percentile distribution of waist circumference in a school population from Bogotá, Colombia, enrolled in the FUPRECOL study. Methods: Cross-sectional study of 3,005 children and 2,916 adolescents aged 9 to 17.9 years from Bogotá, Colombia. Weight, height, waist circumference, and hip circumference were measured, and sexual maturation status was obtained by self-report. Percentiles (P3, P10, P25, P50, P75, P90 and P97) and centile curves were calculated by sex and age, and the observed waist circumference values were compared with international standards. Results: Of the overall population (n = 5,921), 57.0% were girls (mean age 12.7 ± 2.3 years). In most age groups, the waist circumference of girls was lower than that of boys. The increase between P50 and P97 of waist circumference, by age, was at least 15.7 cm in boys aged 9-9.9 years and 16.0 cm in girls aged 11-11.9 years. When the results of this study were compared, by age group and sex, with international studies of children and adolescents, the P50 was lower than that reported in Peru and England, with the exception of studies from India, Venezuela (Mérida), the United States, and Spain. Conclusions: Age- and sex-specific waist circumference percentiles are presented that can be used as a reference for assessing nutritional status and predicting cardiovascular risk from an early age.

Relevance:

30.00%

Publisher:

Abstract:

The aim was to trace the developmental process of learning to read by means of a diagnostic instrument allowing an error analysis that appeared to isolate, with sufficient clarity, the developmental nature of errors in relation to this learning process. More specific objectives were: to extract from oral reading errors of different quality and quantity at different stages of reading development, to systematize an objective error record, and to contextualize the observation of oral reading as much as possible, bringing it close to classroom reality. Two criteria were taken into account in the selection: subjects and behaviors. Subject selection focused on children with reading difficulties, starting from a total of 41,612 first-grade and 43,036 second-grade Primary Education pupils in the 1995-96 academic year, with pupil/classroom ratios of 22.12 and 21.33 respectively. A non-probabilistic sampling of the 'judgment sample of typical cases' type was used, with 1,102 cases. In the selection of behaviors, the initial sampling criterion was free, since all occurrences, or the whole range of them, throughout the recording were used as behavior sampling units. First, the problem was specified and the test item was developed, attending to contextual and methodological criteria and to criteria for observational sampling. The research had three phases: initial data collection, with recordings and qualitative observation, error categorization, and development of the first error questionnaire; a pilot application of the questionnaire, with recordings, quantitative analysis, and formulation of the individual diagnostic test of reading errors (TIDEL); and, third, the application of the TIDEL. Data analysis followed, with a descriptive analysis and then an analysis and validation based on the evidence, a phase of analysis and interpretation of the error-profile results, and finally the conclusions. The instruments were reading texts, observation sheets for recording results and identification data, surveys to differentiate between texts the subjects actually knew and those that merely sounded familiar, and group discussions to determine the categorization of errors. The most important instrument was the TIDEL (Test Individual de Diagnóstico de Errores en Lectura), which detects reading errors. Various types of errors were collected: lack of word integration, confusion of letters by shape, deformation and change of elements, double/triple consonants and inversions, errors in conventional rules, reiteration-rereading-proactive inhibition, meaningful variants and anticipation, immature pronunciation, and cadence-intonation-marginal gestures, among others. A definite process in reading development can be inferred, in the sense that a progressive decrease in the elementary errors of the 'urgent' profile may facilitate, in a second, 'noteworthy' profile, the first attempts to search for meaning; these become more evident in the 'acceptable' profile, where 'anticipation of meaning' and 'meaningful variant' errors are most likely to appear. Elementary errors and errors produced by idiomatic conventions predominate. In a second stage, the transition and leap to meaning is vital, where anticipatory errors related to the content of the text are made. Thus, the evolution of reading ability would climb two levels of intelligence, going from the combinatorial level (the shape of letters and their sound) to meaning (the semantic level). A later phase of reading would fit with reasoning.

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses a speech perception test and a scoring method used to assess the likelihood of success with mainstreaming.

Relevance:

30.00%

Publisher:

Abstract:

The article considers screening human populations with two screening tests. If either of the two tests is positive, full evaluation of the disease status is undertaken; however, if both diagnostic tests are negative, the disease status remains unknown. This procedure leads to a data constellation in which, for each disease status, the 2 × 2 table associated with the two diagnostic tests used in screening has exactly one empty, unknown cell. To estimate the unobserved cell counts, previous approaches assume independence of the two diagnostic tests and use specific models, including the special mixture model of Walter or unconstrained capture-recapture estimates. Often, as is also demonstrated in this article by means of a simple test, the independence of the two screening tests is not supported by the data. Two new estimators are suggested that allow association between the screening tests, although the form of association must be assumed to be homogeneous over disease status. These estimators are modifications of the simple capture-recapture estimator and are easy to construct. The estimators are investigated for several screening studies with fully evaluated disease status, in which the superior behavior of the new estimators compared to the conventional ones can be shown. Finally, the performance of the new estimators is compared with maximum likelihood estimators, which are more difficult to obtain in these models. The results indicate that the loss of efficiency is minor.
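For orientation, the sketch below shows the conventional independence-based capture-recapture logic that the article modifies: with counts n11 (both tests positive) and n10, n01 (exactly one test positive), the unobserved both-negative cell is estimated as n10*n01/n11. The association-allowing estimators proposed in the article are not reproduced here, and the counts are hypothetical.

    def missing_cell_independent(n11, n10, n01):
        """Estimate of the unobserved both-negative count, assuming the two
        screening tests are independent."""
        return n10 * n01 / n11

    def total_with_disease(n11, n10, n01):
        """Estimated total number of cases, observed plus unobserved."""
        return n11 + n10 + n01 + missing_cell_independent(n11, n10, n01)

    # Hypothetical screening counts for one disease status.
    print(total_with_disease(n11=40, n10=25, n01=15))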

Relevance:

30.00%

Publisher:

Abstract:

A modified chlorophyll fluorescence technique was evaluated as a rapid diagnostic test of the susceptibility of wheat cultivars to chlorotoluron. Two winter wheat cultivars (Maris Huntsman and Mercia) exhibited differential responses to the herbicide. All of the chlorophyll fluorescence parameters examined were strongly influenced by herbicide concentration. Additionally, the procedure adopted here for examining winter wheat cultivar sensitivity to the herbicide indicated that the area above the fluorescence induction curve and the ratio Fv/Fm are appropriate chlorophyll fluorescence parameters for detecting differential herbicide response between wheat cultivars. The potential use of this technique as an alternative to traditional methods of screening new winter wheat cultivars for their response to photosynthesis-inhibiting herbicides is demonstrated here.
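Two of the parameters mentioned are easy to state explicitly: the maximum quantum yield of photosystem II, Fv/Fm = (Fm - F0)/Fm, and the area above the fluorescence induction curve (between the curve and its Fm plateau). The sketch below computes both from a synthetic trace; the numbers are not data from the study.

    def fv_over_fm(f0, fm):
        """Maximum quantum yield of PSII from minimal (F0) and maximal (Fm) fluorescence."""
        return (fm - f0) / fm

    def area_above_curve(times, fluorescence, fm):
        """Trapezoidal area between the induction curve and its Fm plateau."""
        area = 0.0
        for (t0, f_a), (t1, f_b) in zip(zip(times, fluorescence),
                                        zip(times[1:], fluorescence[1:])):
            area += ((fm - f_a) + (fm - f_b)) / 2.0 * (t1 - t0)
        return area

    t = [0.0, 0.1, 0.2, 0.4, 0.8]    # s, synthetic induction trace
    f = [200, 450, 700, 850, 900]    # arbitrary fluorescence units
    print(fv_over_fm(f0=f[0], fm=f[-1]), area_above_curve(t, f, fm=f[-1]))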

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: This contribution provides a unifying concept for meta-analysis that integrates the handling of unobserved heterogeneity, study covariates, publication bias, and study quality. It is important to consider these issues simultaneously to avoid the occurrence of artifacts, and a method for doing so is suggested here. METHODS: The approach is based upon the meta-likelihood in combination with a general linear nonparametric mixed model, which lays the ground for all inferential conclusions suggested here. RESULTS: The concept is illustrated with a meta-analysis investigating the relationship between hormone replacement therapy and breast cancer. The phenomenon of interest has been investigated in many studies over a considerable time, and different results have been reported. In 1992, a meta-analysis by Sillero-Arenas et al. concluded that there was a small but significant overall effect of 1.06 on the relative risk scale. Using the meta-likelihood approach, it is demonstrated here that this meta-analysis is affected by considerable unobserved heterogeneity, and that new methods are available to model this heterogeneity successfully. It is further argued that the available study covariates should be included to explain this heterogeneity in the meta-analysis at hand. CONCLUSIONS: The topic of HRT and breast cancer has again very recently become an issue of public debate, when results of a large trial investigating the health effects of hormone replacement therapy were published indicating an increased risk of breast cancer (risk ratio of 1.26). Using an adequate regression model in the previously published meta-analysis, an adjusted effect estimate of 1.14 can be given, which is considerably higher than the one published in the meta-analysis of Sillero-Arenas et al. In summary, it is hoped that the method suggested here contributes further to good meta-analytic practice in public health and clinical disciplines.
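Unobserved heterogeneity is commonly summarized by a between-study variance tau^2. As a purely illustrative contrast to the nonparametric meta-likelihood mixture model described above, the sketch below pools invented log relative risks with the standard moment-based (DerSimonian-Laird) random-effects estimator; it is not the method of the paper, and the study estimates are not the Sillero-Arenas et al. data.

    from math import exp, log

    def dersimonian_laird(y, v):
        """y: study log relative risks, v: their within-study variances.
        Returns the moment estimate of tau^2 and the pooled relative risk."""
        w = [1.0 / vi for vi in v]
        y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
        denom = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / denom)
        w_re = [1.0 / (vi + tau2) for vi in v]
        y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
        return tau2, exp(y_re)

    rr = [0.95, 1.10, 1.25, 1.02, 1.30]           # invented relative risks
    var = [0.010, 0.020, 0.015, 0.008, 0.025]     # invented variances of log RR
    print(dersimonian_laird([log(r) for r in rr], var))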


Relevance:

30.00%

Publisher:

Abstract:

This paper presents a reappraisal of the blood clotting response (BCR) tests for anticoagulant rodenticides and proposes a standardised methodology for identifying and quantifying physiological resistance in populations of rodent species. The standardisation is based on the International Normalised Ratio, which is calibrated against a WHO international reference preparation of thromboplastin and allows comparison of data obtained with different thromboplastin reagents. The methodology is statistically sound, being based on the 50% response, and has been validated against the Norway rat (Rattus norvegicus) and the house mouse (Mus domesticus). Susceptibility baseline data are presented for warfarin, diphacinone, chlorophacinone and coumatetralyl against the Norway rat, and for bromadiolone, difenacoum, difethialone, flocoumafen and brodifacoum against the Norway rat and the house mouse. A 'test dose' of twice the ED50 can be used for initial identification of resistance and will provide a similar level of information to previously published methods. Higher multiples of the ED50 can be used to assess the resistance factor and to predict the likely impact on field control.
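As an illustration of how a 50% response dose can be obtained from individual blood clotting response outcomes, the sketch below fits a two-parameter logistic model on log dose and derives the ED50 (and the corresponding 'test dose' of twice the ED50). The doses, responses, and model choice are generic assumptions, not the paper's exact procedure.

    import numpy as np
    from scipy.optimize import minimize

    doses = np.array([0.5, 0.5, 1.0, 1.0, 2.0, 2.0, 4.0, 4.0, 8.0, 8.0])  # mg/kg, invented
    resp = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])   # 1 = positive BCR, invented

    def neg_log_lik(params):
        """Negative log-likelihood of a logistic dose-response model on log dose."""
        b0, b1 = params
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(doses))))
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))

    fit = minimize(neg_log_lik, x0=np.array([0.0, 1.0]), method="Nelder-Mead")
    b0, b1 = fit.x
    ed50 = float(np.exp(-b0 / b1))   # dose at which the expected response is 50%
    print("ED50 (mg/kg):", round(ed50, 2), "test dose:", round(2 * ed50, 2))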