219 results for wwPDB validation


Relevance: 20.00%

Abstract:

Validation is the main bottleneck preventing the adoption of many medical image processing algorithms in clinical practice. In the classical approach, a posteriori analysis is performed based on objective metrics. In this work, a different approach based on Petri Nets (PN) is proposed. The basic idea consists in predicting the accuracy that will result from a given processing step based on the characterization of the sources of inaccuracy of the system. Here we propose a proof of concept in the scenario of a diffusion imaging analysis pipeline. A PN is built after the detection of the possible sources of inaccuracy. By integrating the first qualitative insights based on the PN with quantitative measures, it is possible to optimize the PN itself and to predict the inaccuracy of the system in a different setting. Results show that the proposed model provides good prediction performance and suggests the optimal processing approach.
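As a toy illustration of the idea (not the authors' actual model), a Petri net can be encoded as places holding tokens and transitions that fire when all their input places are marked; the place and transition names below are hypothetical:

```python
# Toy Petri net: places hold tokens; a transition fires when every one of
# its input places holds at least one token.

class PetriNet:
    def __init__(self):
        self.marking = {}        # place name -> token count
        self.transitions = {}    # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)
        for place in inputs + outputs:
            self.marking.setdefault(place, 0)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] += 1

# Hypothetical two-step pipeline fragment: acquisition -> preprocessing -> fit.
net = PetriNet()
net.add_transition("preprocess", ["raw_data"], ["clean_data"])
net.add_transition("fit_model", ["clean_data"], ["result"])
net.marking["raw_data"] = 1
net.fire("preprocess")
net.fire("fit_model")
print(net.marking["result"])  # 1
```

In the abstract's setting, additional places marking error sources would feed the processing transitions, so that the reachable markings encode how inaccuracy propagates through the pipeline.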

Relevance: 20.00%

Abstract:

Physicians use clinical scores every day in their practice. These scores are often aids to medical decision making. The steps involved in validating a clinical score, however, are often unfamiliar to physicians. This review recalls the theoretical basis of the validation of a clinical score and proposes practical exercises.

Relevance: 20.00%

Abstract:

This paper presents a validation study on statistical nonsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data where a quantitative validation compares the methods' results with an estimated ground truth from manual segmentations by experts. Validity of the various classification methods in the labeling of the image as well as in the tissue volume is estimated with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that simulated data results can also be extended to real data.
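A common way to quantify agreement with an expert ground truth of the kind described above is an overlap measure such as the Dice coefficient; a minimal sketch on synthetic binary masks (the arrays are illustrative, not the study's data):

```python
import numpy as np

def dice(seg, truth):
    """Dice overlap between two binary masks (1.0 = perfect agreement)."""
    seg, truth = np.asarray(seg, bool), np.asarray(truth, bool)
    return 2.0 * np.logical_and(seg, truth).sum() / (seg.sum() + truth.sum())

# Synthetic 8x8 masks: a ground truth of 16 voxels and a segmentation
# shifted by one row, so that 12 voxels overlap.
truth = np.zeros((8, 8), bool); truth[2:6, 2:6] = True
seg = np.zeros((8, 8), bool); seg[3:7, 2:6] = True
print(dice(seg, truth))  # 0.75
```

The same function applies unchanged to per-tissue labels (GM, WM, CSF) by comparing one binary mask per class.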

Relevance: 20.00%

Abstract:

The aim of our study was to provide an innovative HS-GC/MS method applicable to the routine determination of butane concentration in forensic toxicology laboratories. The main drawback of the GC/MS methods discussed in the literature concerning butane measurement is the absence of a specific butane internal standard for quantification. Because no stable isotope of butane is commercially available, it was essential to develop a new approach based on in situ generation of standards. To avoid handling a stable isotope-labelled gas, we chose to generate in situ an internal labelled standard gas (C4H9D), exploiting the stoichiometric formation of butane by the reaction of deuterated water (D2O) with the Grignard reagent butylmagnesium chloride (C4H9MgCl). This method allows a precise measurement of butane concentration, and a full validation by accuracy profile is presented.
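The internal-standard quantification the abstract relies on can be sketched as a peak-area ratio scaled by the standard's concentration; the peak areas, concentration, and unit response factor below are hypothetical:

```python
def quantify(area_analyte, area_istd, conc_istd, response_factor=1.0):
    """Analyte concentration from the analyte / internal-standard area ratio."""
    return (area_analyte / area_istd) * conc_istd / response_factor

# Hypothetical peak areas for butane (C4H10) and the deuterated standard
# (C4H9D), with 50 mg/L of internal standard and a unit response factor:
print(quantify(2.4e5, 1.2e5, 50.0))  # 100.0 (mg/L)
```

In practice the response factor would be calibrated from standards of known concentration rather than assumed to be 1.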

Relevance: 20.00%

Abstract:

Following Erikson's seminal work on personal identity, Marcia's identity status model has been one of the most enduring paradigms. The Ego Identity Process Questionnaire (EIPQ; Balistreri, Busch-Rossnagel, & Geissinger, 1995) is a widely used measure of identity status. The purpose of this study was to evaluate the factor structure and reliability of a French version of the EIPQ. The hypothesized structures were not confirmed. In light of the failed attempts to validate the original version, an alternative short-form version of the EIPQ (EIPQ-SF), maintaining the integrity of the original model, was developed in one sample and cross-validated in another. Additionally, theoretically consistent associations between the EIPQ-SF dimensions and self-esteem confirmed convergent validity. Globally, the results indicate that the French short-form version of the EIPQ may be a useful instrument for the assessment of identity statuses in adolescence and emerging adulthood.

Relevance: 20.00%

Abstract:

Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1 = 1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2* = 1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias as well as the inter-site and intra-site coefficient of variation (CoV) for typical morphometric measures [i.e., gray matter (GM) probability maps used in voxel-based morphometry] and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (<20%). The GM probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived GM probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and the detailed insights into brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
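The inter-site bias and coefficient of variation (CoV) reported above can be computed as follows; the per-site R1 means are invented for illustration:

```python
import statistics

def cov_percent(values):
    """Coefficient of variation across sites, as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical per-site means of R1 (1/s) for one volunteer at three sites:
site_means = [0.98, 1.00, 1.03]
grand_mean = statistics.mean(site_means)

# Per-site bias relative to the grand mean, in percent.
bias = [100.0 * (v - grand_mean) / grand_mean for v in site_means]

print(round(cov_percent(site_means), 2), round(max(abs(b) for b in bias), 2))
```

In the study these statistics would be computed per voxel or per region across the five volunteers; the sketch only shows the definitions.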

Relevance: 20.00%

Abstract:

Introduction: THC-COOH has been proposed as a criterion to help distinguish occasional from regular cannabis users. To date, however, this indicator has not been adequately assessed under experimental and real-life conditions. Methods: We carried out a placebo-controlled administration study of smoked cannabis. Twenty-three heavy smokers and 25 occasional smokers, between 18 and 30 years of age, participated in this study [Battistella G et al., PloS one. 2013;8(1):e52545]. We also collected data from a second, real-case study of 146 traffic offenders for whom whole blood cannabinoid concentrations and the frequency of cannabis use were known. Cannabinoid levels were determined in whole blood using tandem mass spectrometry methods. Results: Highly significant differences in THC-COOH concentrations were found between the two groups when measured during the screening visit, prior to the smoking session, and throughout the day of the experiment. Receiver operating characteristic (ROC) curves were determined and two threshold criteria were proposed to distinguish between these groups: a free THC-COOH concentration below 3 μg/L suggests occasional consumption (≤ 1 joint/week), while a concentration higher than 40 μg/L corresponds to heavy use (≥ 10 joints/month). These thresholds were successfully tested against the second, real-case study. The two thresholds were not challenged by the presence of ethanol (40% of cases) or of other therapeutic and illegal drugs (24%), and they were consistent with previously published experimental data. Conclusion: We propose the following procedure, which can be very useful in the Swiss context but also in other countries with similar traffic policies: if the whole blood THC-COOH concentration is higher than 40 μg/L, traffic offenders must be directed first and foremost toward a medical assessment of their fitness to drive. This evaluation is not recommended if the THC-COOH concentration is lower than 3 μg/L. A THC-COOH level between these two thresholds cannot be reliably interpreted; in such a case, further medical assessment and follow-up of fitness to drive are also suggested, but with lower priority.
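The proposed procedure can be sketched directly from the two thresholds given in the abstract (3 μg/L and 40 μg/L of free THC-COOH in whole blood); the function name and wording of the outcomes are ours:

```python
# Decision rule using the two thresholds proposed in the abstract
# (3 ug/L and 40 ug/L of free THC-COOH in whole blood).

def interpret_thccooh(conc_ug_per_l):
    if conc_ug_per_l < 3:
        return "occasional use suggested; medical assessment not recommended"
    if conc_ug_per_l > 40:
        return "heavy use suggested; refer for fitness-to-drive assessment"
    return "not reliably interpretable; lower-priority medical follow-up"

print(interpret_thccooh(55))
```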

Relevance: 20.00%

Abstract:

AIMS: To validate a model for quantifying the prognosis of patients with pulmonary embolism (PE). The model was previously derived from 10 534 US patients. METHODS AND RESULTS: We validated the model in 367 patients prospectively diagnosed with PE at 117 European emergency departments. We used baseline data for the model's 11 prognostic variables to stratify patients into five risk classes (I-V). We compared 90-day mortality within each risk class and the area under the receiver operating characteristic curve between the validation and the original derivation samples. We also assessed the rate of recurrent venous thrombo-embolism and major bleeding within each risk class. Mortality was 0% in Risk Class I, 1.0% in Class II, 3.1% in Class III, 10.4% in Class IV, and 24.4% in Class V and did not differ between the validation and the original derivation samples. The area under the curve was larger in the validation sample (0.87 vs. 0.78, P=0.01). No patients in Classes I and II developed recurrent thrombo-embolism or major bleeding. CONCLUSION: The model accurately stratifies patients with PE into categories of increasing risk of mortality and other relevant complications. Patients in Risk Classes I and II are at low risk of adverse outcomes and are potential candidates for outpatient treatment.
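Stratified mortality reporting of the kind used in this validation can be sketched as follows; the cohort below is hypothetical, and the model's 11 prognostic variables are not reproduced:

```python
def mortality_by_class(records):
    """records: iterable of (risk_class, died) pairs -> % mortality per class."""
    totals, deaths = {}, {}
    for cls, died in records:
        totals[cls] = totals.get(cls, 0) + 1
        deaths[cls] = deaths.get(cls, 0) + int(died)
    return {cls: 100.0 * deaths[cls] / totals[cls] for cls in sorted(totals)}

# Hypothetical cohort: 50 Class I patients with no deaths,
# 50 Class II patients with one death.
records = [("I", False)] * 50 + [("II", False)] * 49 + [("II", True)]
print(mortality_by_class(records))  # {'I': 0.0, 'II': 2.0}
```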

Relevance: 20.00%

Abstract:

Aims: The psychometric properties of the EORTC QLQ-BN20, a brain cancer-specific HRQOL questionnaire, have been previously determined in an English-speaking sample of patients. This study examined the validity and reliability of the questionnaire in a multi-national, multi-lingual study. Methods: QLQ-BN20 data were selected from two completed phase III EORTC/NCIC clinical trials in brain cancer (N=891), covering 12 languages. Experimental treatments were surgery followed by radiotherapy (RT) and adjuvant PCV chemotherapy, or surgery followed by concomitant RT plus temozolomide (TMZ) chemotherapy and adjuvant TMZ chemotherapy. Standard treatment consisted of surgery and postoperative RT alone. The psychometrics of the QLQ-BN20 were examined by means of multi-trait scaling analyses, reliability estimation, known-groups validity testing, and responsiveness analysis. Results: All QLQ-BN20 items correlated more strongly with their own scale (r>0.70) than with other QLQ-BN20 scales. Internal consistency reliability coefficients were high (all alpha > 0.70). Known-groups comparisons yielded positive results, with the QLQ-BN20 distinguishing between patients with differing levels of performance status and mental functioning. Responsiveness of the questionnaire to changes over time was acceptable. Conclusion: The QLQ-BN20 demonstrates adequate psychometric properties and can be recommended for use in conjunction with the QLQ-C30 in assessing the HRQOL of brain cancer patients in international studies.
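The internal-consistency coefficients mentioned above are typically Cronbach's alpha; a minimal sketch on simulated item scores (the data are synthetic, not the trial data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

# Synthetic data: four items driven by one latent trait plus noise, so the
# items are strongly correlated and alpha comes out well above 0.70.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 4))

print(round(cronbach_alpha(items), 2))
```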

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: Knowledge of cerebral blood flow (CBF) alterations in cases of acute stroke could be valuable in the early management of these cases. Among imaging techniques affording evaluation of cerebral perfusion, perfusion CT studies involve sequential acquisition of cerebral CT sections obtained in an axial mode during the IV administration of iodinated contrast material. They are thus very easy to perform in emergency settings. Perfusion CT values of CBF have proved to be accurate in animals, and perfusion CT affords plausible values in humans. The purpose of this study was to validate perfusion CT studies of CBF by comparison with the results provided by stable xenon CT, which have been reported to be accurate, and to evaluate acquisition and processing modalities of CT data, notably the possible deconvolution methods and the selection of the reference artery. METHODS: Twelve stable xenon CT and perfusion CT cerebral examinations were performed within an interval of a few minutes in patients with various cerebrovascular diseases. CBF maps were obtained from perfusion CT data by deconvolution using singular value decomposition and least mean square methods. The CBF values were compared with the stable xenon CT results in multiple regions of interest through linear regression analysis and bilateral t tests for matched variables. RESULTS: Linear regression analysis showed good correlation between perfusion CT and stable xenon CT CBF values (singular value decomposition method: R² = 0.79, slope = 0.87; least mean square method: R² = 0.67, slope = 0.83). Bilateral t tests for matched variables did not identify a significant difference between the two imaging methods (P > .1). Both deconvolution methods were equivalent (P > .1). The choice of the reference artery is a major concern and has a strong influence on the final perfusion CT CBF map. CONCLUSION: Perfusion CT studies of CBF achieved with adequate acquisition parameters and processing lead to accurate and reliable results.
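The singular value decomposition deconvolution mentioned above can be sketched on toy one-dimensional curves; the arterial input and residue functions below are synthetic, and truncating small singular values (e.g. at 10% of the largest) is the usual way to stabilize the inversion on noisy data:

```python
import numpy as np

def svd_deconvolve(aif, tissue, rel_threshold=0.1):
    """Recover the residue function from tissue and arterial input curves.

    Singular values below rel_threshold * largest are truncated, which
    regularizes the inversion when the data are noisy.
    """
    n = len(aif)
    # Causal (lower-triangular) convolution matrix built from the AIF.
    A = np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                  for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))

aif = np.exp(-np.arange(10) / 3.0)        # toy arterial input function
residue = np.exp(-np.arange(10) / 5.0)    # toy residue function
tissue = np.convolve(aif, residue)[:10]   # noise-free forward model
recovered = svd_deconvolve(aif, tissue, rel_threshold=1e-12)  # no truncation
print(np.allclose(recovered, residue, atol=1e-6))  # True
```

With noise-free data and no truncation the residue function is recovered exactly; on real perfusion data the 0.1 default (or similar) is needed.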

Relevance: 20.00%

Abstract:

The current study aimed to explore the validity of a French adaptation of the self-rated form of the Health of the Nation Outcome Scales for Children and Adolescents (F-HoNOSCA-SR) and to test its usefulness in routine clinical use. One hundred and twenty-nine patients admitted to two inpatient units were asked to participate in the study. One hundred and seven patients filled out the F-HoNOSCA-SR (for a subsample (N=17): on two occasions, one week apart) and the Strengths and Difficulties Questionnaire (SDQ). In addition, the clinician completed the clinician-rated form of the HoNOSCA (HoNOSCA-CR, N=82). The reliability analyses (split-half coefficient, item response theory (IRT) models and intraclass correlations (ICC) between the two occasions) showed that the F-HoNOSCA-SR provides reliable measures. The concurrent validity, assessed by correlating the F-HoNOSCA-SR and the SDQ, revealed good convergent validity of the instrument. The relationship analyses between the F-HoNOSCA-SR and the HoNOSCA-CR revealed weak but significant correlations. The comparison between the F-HoNOSCA-SR and the HoNOSCA-CR with paired-sample t-tests revealed a higher score for the self-rated version. The F-HoNOSCA-SR thus provides reliable measures and yields complementary information when used together with the HoNOSCA-CR.
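The split-half reliability mentioned above can be sketched as an odd/even item split followed by the Spearman-Brown correction; the item scores are invented for illustration:

```python
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def split_half_reliability(items):
    """Odd/even item split, correlated and stepped up with Spearman-Brown."""
    odd = [sum(row[0::2]) for row in items]
    even = [sum(row[1::2]) for row in items]
    r = pearson(odd, even)
    return 2 * r / (1 + r)

# Invented item scores for five respondents on a four-item scale:
items = [[3, 4, 3, 4], [1, 2, 1, 1], [4, 4, 5, 4], [2, 2, 2, 3], [5, 4, 4, 5]]
print(round(split_half_reliability(items), 2))
```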

Relevance: 20.00%

Abstract:

Abstract: "I have often seen experts hold opposing opinions. I have never seen a single one be wrong." Auguste Detoeuf, Propos d'O.L. Barenton, confiseur, Editions du Tambourinaire, 1948. By deliberately choosing a typically empirical accounting problem, this work set out to demonstrate the possibility of producing purely accounting insights (i.e., within Accounting's own scheme of representation) while refraining from the systematic borrowing of ready-made theories from Economics, except where this proves genuinely necessary and legitimate, as in the use of the CAPM in the preceding chapter. Once again, let us recall that this thesis is not an indictment of the economic approach as such, but a plea for tempering that approach in Accounting. In line with the epistemological position taken in the first chapter, we sought to highlight the contribution and place of Accounting within Economics by positioning Accounting as a discipline that supplies measures for representing economic activity. It seems clear to us that if economic activity, as the direct accounting semiosphere, dictates accounting observations, the measurement of those observations must, as far as possible, strive to free itself from any dependence on the economic discipline and the theories and methods tied to it, by adopting an orthogonal, rational and systematic modus operandi within a framework of axioms of its own. This stance entails defining a new epistemological framework with respect to the positive approach to Accounting, which can be described as the philosophical expression of the takeover of accounting research by a mode of methodical reflection proper to economic research.
To be at least partially validated, this new framework, which we see as derived from constructivism, should demonstrate its ability to deal satisfactorily with a classical problem of empirical-positive accounting. The specific problem chosen was the treatment and validation of the going-concern principle. The going-concern principle postulates (the statement of a hypothesis) and establishes (the verification of that hypothesis) that the firm prepares its financial statements on the assumption of a normal continuation of its activities. The going-concern assumption is abandoned (in favor of a liquidation or disposal basis) in the case of a cessation of activity, total or partial, voluntary or involuntary, or upon the observation of facts liable to compromise the continuity of operations. Such facts concern the financial, economic and social situation of the firm and comprise all objective events, past or foreseeable, capable of affecting the continuation of activity in the foreseeable future. Like all accounting principles, the going-concern principle proceeds from a purely theoretical consideration. Its verification, however, requires a concrete analysis with real and measurable consequences, which is why it is such a popular research topic in positive accounting: it can (wrongly) be conflated with studies of corporate bankruptcy and failure. In practice, some of these studies, based on multivariate discriminant analyses, have become genuine working tools for the auditor owing to their ease of use and interpretation.
Through the problem addressed in this thesis, we attempted to meet numerous objectives, which can be grouped into two sets: those related to the methodological approach, and those pertaining to measurement and calibration. In a final step, these two groups of objectives made it possible to construct a model intended as a logical consequence of the choices and hypotheses retained.
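The multivariate discriminant tools mentioned above can be sketched as a linear, Altman-style score over financial ratios; the weights, cut-off, and ratios below are hypothetical, not taken from the thesis:

```python
# Hypothetical linear discriminant (Altman-style) going-concern screen;
# the weights, cut-off and ratios are illustrative only.

def discriminant_score(ratios, weights):
    """Linear combination of financial ratios."""
    return sum(w * r for w, r in zip(weights, ratios))

WEIGHTS = [1.2, 1.4, 3.3, 0.6]   # hypothetical loadings
CUTOFF = 1.8                     # hypothetical distress threshold

firm = [0.2, 0.15, 0.3, 1.0]     # e.g. liquidity / profitability ratios
z = discriminant_score(firm, WEIGHTS)
print("going concern supported" if z >= CUTOFF else "distress signal")
```

The ease of use noted in the text comes precisely from this form: the auditor computes one weighted sum and compares it with a published cut-off.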