1000 results for Code validation
Abstract:
Forensic document examiners may be confronted with handwriting produced under unconventional conditions. Such atypical circumstances could give rise to greater variability in handwriting form, particularly when a priori unusual positions of the body and/or of the writing surface are involved. Indeed, despite its apparently stereotyped/standardized aspect, the result of learning from a model, our handwriting is characterized by an intrinsic variability of form, which evolves over time and which, in its qualitative dimension, gives handwriting its individual character. In other words, we never write the same way twice. This intra-individual variability (or intra-variability) observed under conventional conditions, that is, seated in front of a horizontal writing surface, could increase under unconventional conditions, for example in an uncomfortable position. This could make it more difficult to identify writings produced under an unconventional or unknown condition. Not knowing the circumstances under which a handwritten entry was made, or failing to question them, could lead the examiner to errors of judgment. The mere fact of studying a trace on which the body can exert an influence makes handwriting examination a specialty distinct from other forensic disciplines. In this respect, the written trace differs from other types of "inanimate" traces (physical, chemical, biochemical) that are considered invariable (though potentially sensitive to other phenomena such as temperature, atmospheric pressure, etc.). Indeed, because the writing movement is commanded and controlled by the brain, it is endowed with a certain variability. It is therefore reasonable to think that knowledge of the neuroscientific mechanisms underlying this movement will facilitate the understanding of the phenomena observed from a forensic point of view. Two experiments were conducted to compare the performance of subjects writing under different conditions (conventional vs. unconventional). The results showed that five of the seven unconventional conditions had no significant impact on handwriting variability. Taken together, the results provide forensic experts with guidance for better understanding handwriting produced under unusual conditions.
Abstract:
The genotyping of human papillomaviruses (HPV) is essential for the surveillance of HPV vaccines. We describe and validate a low-cost PGMY-based PCR assay (PGMY-CHUV) for the genotyping of 31 HPV types by reverse blotting hybridization (RBH). Genotype-specific detection limits were 50 to 500 genome equivalents per reaction. RBH was 100% specific and 98.61% sensitive using DNA sequencing as the gold standard (n = 1,024 samples). PGMY-CHUV was compared to the validated and commercially available Linear Array (Roche) on 200 samples. Both assays identified the same positive (n = 182) and negative samples (n = 18). Seventy-six percent of the positives were fully concordant after restricting the comparison to the 28 genotypes shared by both assays. At the genotypic level, agreement was 83% (285/344 genotype-sample combinations; κ of 0.987 for single infections and 0.853 for multiple infections). Fifty-seven of the 59 discordant cases were associated with multiple infections and with the weakest genotypes within each sample (P < 0.0001). PGMY-CHUV was significantly more sensitive for HPV56 (P = 0.0026) and could unambiguously identify HPV52 in mixed infections. PGMY-CHUV was reproducible on repeat testing (n = 275 samples; 392 genotype-sample combinations; κ of 0.933) involving different reagent lots and different technicians. Discordant results (n = 47) were significantly associated with the weakest genotypes in samples with multiple infections (P < 0.0001). Successful participation in proficiency testing also supported the robustness of this assay. The PGMY-CHUV reagent costs were estimated at $2.40 per sample using the least expensive yet proficient genotyping algorithm that also included quality control. This assay may be used in low-resource laboratories that have sufficient manpower and PCR expertise.
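As an illustration of the agreement statistics reported above, the sketch below computes sensitivity, specificity, and Cohen's kappa from per-genotype call vectors. The data and the plain-Python helpers are hypothetical stand-ins, not the study's analysis pipeline.

```python
# Illustrative sketch (not the authors' pipeline): sensitivity, specificity,
# and Cohen's kappa for binary genotype calls, one flag per genotype-sample
# combination.

def sensitivity_specificity(calls, gold):
    """calls, gold: lists of 0/1 flags; gold is the reference method."""
    tp = sum(1 for c, g in zip(calls, gold) if c == 1 and g == 1)
    tn = sum(1 for c, g in zip(calls, gold) if c == 0 and g == 0)
    fp = sum(1 for c, g in zip(calls, gold) if c == 1 and g == 0)
    fn = sum(1 for c, g in zip(calls, gold) if c == 0 and g == 1)
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two assays for binary calls."""
    n = len(a)
    observed = sum(1 for x, y in zip(a, b) if x == y) / n
    p_a1, p_b1 = sum(a) / n, sum(b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

# Hypothetical example: 10 genotype-sample combinations
rbh_calls = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
sequencing = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
sens, spec = sensitivity_specificity(rbh_calls, sequencing)
print(f"sensitivity={sens:.3f} specificity={spec:.3f}")
print(f"kappa={cohens_kappa(rbh_calls, sequencing):.3f}")
```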
Abstract:
Agreed upon procedures report on the Department of Human Services' compliance with Chapter 249J.22 of the Code of Iowa for the year ended June 30, 2012
Abstract:
BACKGROUND: Genotypes obtained with commercial SNP arrays have been used extensively in many large case-control or population-based cohorts for SNP-based genome-wide association studies of a multitude of traits. Yet these genotypes capture only a small fraction of the variance of the studied traits. Genomic structural variants (GSV) such as Copy Number Variation (CNV) may account for part of the missing heritability, but their comprehensive detection requires either next-generation arrays or sequencing. Sophisticated algorithms that infer CNVs by combining the intensities from SNP probes for the two alleles can already be used to extract a partial view of such GSV from existing data sets. RESULTS: Here we present several advances that facilitate the latter approach. First, we introduce a novel CNV detection method based on a Gaussian Mixture Model. Second, we propose a new algorithm, PCA merge, for combining copy-number profiles from many individuals into consensus regions. We applied both our new methods and existing ones to data from 5612 individuals from the CoLaus study who were genotyped on Affymetrix 500K arrays. We developed a number of procedures to evaluate the performance of the different methods, including comparison with previously published CNVs and the use of a replication sample of 239 individuals genotyped with Illumina 550K arrays. We also established a new evaluation procedure that exploits the fact that related individuals are expected to share their CNVs more frequently than randomly selected individuals. The ability to detect both rare and common CNVs provides a valuable resource that will facilitate association studies exploring potential phenotypic associations with CNVs. CONCLUSION: Our new methodologies for CNV detection and their evaluation will help extract additional information from the large amount of SNP-genotyping data on various cohorts and use it to explore structural variants and their impact on complex traits.
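A minimal sketch of the general idea behind GMM-based CNV calling may help here: fit a mixture model to per-probe intensity ratios and map the fitted components to tentative copy-number states. The simulated log2 ratios and the use of scikit-learn's GaussianMixture are illustrative assumptions, not the authors' implementation.

```python
# Sketch: Gaussian Mixture Model on per-probe log2 intensity ratios to
# assign tentative copy-number states (loss / neutral / gain). Real data
# would come from the combined allele intensities of the SNP array.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated log2 ratios: mostly diploid (~0), a deletion (~-0.5), a gain (~+0.4)
log2_ratios = np.concatenate([
    rng.normal(0.0, 0.12, 900),   # copy number 2
    rng.normal(-0.5, 0.12, 60),   # copy number 1
    rng.normal(0.4, 0.12, 40),    # copy number 3
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(log2_ratios)
states = gmm.predict(log2_ratios)

# Order components by mean so labels map to loss / neutral / gain
order = np.argsort(gmm.means_.ravel())
labels = {order[0]: "loss", order[1]: "neutral", order[2]: "gain"}
for k in order:
    n = int(np.sum(states == k))
    print(f"{labels[k]:>7}: mean={gmm.means_[k, 0]:+.2f}, probes={n}")
```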
Abstract:
This is the Report of the Code Commissioners to the Twenty-fifth General Assembly of the State of Iowa
Abstract:
In this work, a previously developed, statistics-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The approach uses statistical differences between the actual and predicted behavior of the bridge under a subset of ambient truck loads. The predicted behavior is derived from a statistics-based model trained with field data from the undamaged bridge (not a finite element model). The differences between actual and predicted responses, called residuals, are then used to construct control charts, which compare undamaged- and damaged-structure data. Validation of the damage-detection approach was achieved using sacrificial specimens that were mounted to the bridge, exposed to ambient traffic loads, and designed to simulate actual damage-sensitive locations. Different damage types and levels were introduced to the sacrificial specimens to study the sensitivity and applicability of the approach. The damage-detection algorithm was able to identify damage, but it also had a high false-positive rate. An evaluation of the sub-components of the damage-detection methodology was completed for the purpose of improving the approach. Several of the underlying assumptions within the algorithm were being violated, which was the source of the false positives. Furthermore, the lack of an automated evaluation process was thought to be a potential impediment to widespread use. Recommendations for improving the methodology were developed and preliminarily evaluated; these recommendations are believed to improve the efficacy of the damage-detection approach.
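The residual/control-chart logic described above can be sketched as follows, using simulated strain readings and a simple least-squares prediction model in place of the project's trained statistical model; the gauge relationship, coefficients, and damage shift are invented for illustration.

```python
# Sketch: train a prediction model on undamaged-state data, compute
# residuals on later data, and flag points outside Shewhart-style
# 3-sigma control limits.
import numpy as np

rng = np.random.default_rng(1)

# "Training" period: undamaged structure, two correlated strain gauges
x_train = rng.uniform(10, 100, 500)                 # reference gauge (microstrain)
y_train = 0.8 * x_train + rng.normal(0, 1.5, 500)   # monitored gauge

# Fit a least-squares prediction model and set control limits from residuals
slope, intercept = np.polyfit(x_train, y_train, 1)
resid_train = y_train - (slope * x_train + intercept)
center = resid_train.mean()
ucl = center + 3 * resid_train.std()
lcl = center - 3 * resid_train.std()

# "Monitoring" period: a simulated stiffness change shifts the relationship
x_new = rng.uniform(10, 100, 100)
y_new = 0.7 * x_new + rng.normal(0, 1.5, 100)
resid_new = y_new - (slope * x_new + intercept)
alarms = np.flatnonzero((resid_new > ucl) | (resid_new < lcl))
print(f"control limits: [{lcl:.2f}, {ucl:.2f}], out-of-control points: {alarms.size}")
```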
Abstract:
False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features was built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and the dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method for linking documents to a common source or differentiating them. These results pave the way for an operational implementation of a systematic profiling process integrated into a developed forensic intelligence model.
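The two evaluation approaches named above (binary classification with error rates, and score-based likelihood ratios) can be sketched on simulated similarity scores; the score distributions, threshold, and kernel-density estimates below are illustrative assumptions, not the study's actual scores or model.

```python
# Sketch: evaluate a similarity-score comparison metric two ways, on
# simulated intra-source and inter-source score distributions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
same_source = rng.normal(0.80, 0.08, 400)   # intra-source similarity scores
diff_source = rng.normal(0.45, 0.10, 400)   # inter-source similarity scores

# (1) Binary classification at a fixed threshold
threshold = 0.65
missed_links = np.mean(same_source < threshold)    # same-source pairs not linked
false_links = np.mean(diff_source >= threshold)    # different-source pairs wrongly linked
print(f"missed links = {missed_links:.3f}, false links = {false_links:.3f}")

# (2) Likelihood ratio for a new comparison score, from the two densities
kde_same = gaussian_kde(same_source)
kde_diff = gaussian_kde(diff_source)
score = 0.72
lr = kde_same(score)[0] / kde_diff(score)[0]
print(f"LR at score {score}: {lr:.1f}")
```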
Abstract:
The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and whether the desired accuracy or correspondence exists between predicted and monitored performance for Iowa conditions. A comprehensive literature review was conducted to identify the MEPDG input parameters and the MEPDG verification/calibration process. Sensitivities of MEPDG input parameters to predictions were studied using different versions of the MEPDG software. Based on the literature review and sensitivity analysis, a detailed verification procedure was developed. A total of sixteen different types of pavement sections across Iowa, not used for national calibration in NCHRP 1-47A, were selected. A database of MEPDG inputs and the actual pavement performance measures for the selected pavement sites was prepared for verification. The accuracy of the MEPDG performance models for Iowa conditions was statistically evaluated. The verification testing showed promising results in terms of the MEPDG's performance prediction accuracy for Iowa conditions. Recalibrating the MEPDG performance models for Iowa conditions is recommended to improve the accuracy of predictions.
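One way the predicted-versus-monitored comparison could be carried out statistically is sketched below with made-up rutting values; the metrics shown (bias, RMSE, correlation, paired t-test) are common choices and not necessarily the exact tests used in the study.

```python
# Sketch: compare predicted distress against field measurements with a few
# standard accuracy statistics. Values are hypothetical.
import numpy as np
from scipy import stats

predicted = np.array([0.18, 0.22, 0.25, 0.30, 0.35, 0.28, 0.40, 0.33])  # rutting (in), hypothetical
measured  = np.array([0.15, 0.20, 0.27, 0.26, 0.31, 0.30, 0.36, 0.29])  # rutting (in), hypothetical

bias = np.mean(predicted - measured)
rmse = np.sqrt(np.mean((predicted - measured) ** 2))
t_stat, p_value = stats.ttest_rel(predicted, measured)   # paired t-test on the bias
r, _ = stats.pearsonr(predicted, measured)

print(f"bias={bias:.3f} in, RMSE={rmse:.3f} in, r={r:.2f}, paired t p={p_value:.3f}")
```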
Abstract:
Objective: Jaundice is the clinical manifestation of hyperbilirubinemia. It is considered a sign of either liver disease or, less often, a hemolytic disorder. It can be divided into obstructive and non-obstructive types, involving an increase in indirect (unconjugated) bilirubin or in direct (conjugated) bilirubin, respectively, but it can also present as a mixed type. Methods: This article updates current knowledge concerning the etiology, pathophysiological mechanisms, complications, and treatment of jaundice by reviewing the latest medical literature. It also presents an approach to the pathogenesis and treatment of jaundice in special populations such as neonates and pregnant women. Results: Treatment consists in the management of the underlying diseases responsible for the jaundice and of its complications. The clinical prognosis of jaundice depends on the etiology. Surgical treatment of jaundiced patients is associated with high mortality and morbidity rates. Studies have shown that the severity of jaundice and the presence of malignant disease are important risk factors for post-operative mortality. Conclusions: Early detection of jaundice is of vital importance because of its involvement in malignancy or in other benign conditions requiring immediate treatment in order to avoid further complications.
Abstract:
Photopolymerization is commonly used in a broad range of bioapplications, such as drug delivery, tissue engineering, and surgical implants, where liquid materials are injected and then hardened by means of illumination to create a solid polymer network. However, photopolymerization using a probe, e.g., a needle guiding both the liquid and the curing illumination, has not been thoroughly investigated. We present a Monte Carlo model that takes into account the dynamic absorption and scattering parameters, as well as the solid-liquid boundaries of the photopolymer, to yield the shape and volume of minimally invasively injected, photopolymerized hydrogels. In the first part of the article, our model is validated using a set of well-known poly(ethylene glycol) dimethacrylate hydrogels, showing excellent agreement between simulated and experimental volume growth rates. In the second part, in situ experimental results and simulations for photopolymerization in tissue cavities are presented. It was found that a cavity with a volume of 152 mm³ could be photopolymerized from the output of a 0.28-mm² fiber by adding scattering lipid particles, while only a volume of 38 mm³ (25%) was achieved without particles. The proposed model provides a simple and robust method to solve complex photopolymerization problems, where the dimension of the light source is much smaller than the volume of the photopolymerizable hydrogel.
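A highly simplified Monte Carlo photon-transport sketch of the underlying technique is shown below; the optical coefficients, geometry, and isotropic scattering are assumptions, and the dynamic parameter updates and solid-liquid boundary handling of the actual model are omitted.

```python
# Sketch: photons launched from a point source take exponentially distributed
# steps, scatter isotropically, and deposit absorbed weight into a voxel grid,
# approximating where curing dose accumulates. Coefficients are assumed values.
import numpy as np

rng = np.random.default_rng(3)
mu_a, mu_s = 0.05, 1.0            # absorption / scattering coefficients (1/mm), assumed
mu_t = mu_a + mu_s
voxel = 0.5                        # mm
grid = np.zeros((40, 40, 40))      # 20 mm cube of 0.5 mm voxels

def run_photon():
    pos = np.array([10.0, 10.0, 0.0])        # fiber tip at one face of the cube
    direction = np.array([0.0, 0.0, 1.0])
    weight = 1.0
    while weight > 1e-3:
        step = -np.log(rng.random()) / mu_t   # free path length (mm)
        pos = pos + step * direction
        idx = (pos / voxel).astype(int)
        if np.any(idx < 0) or np.any(idx >= grid.shape[0]):
            break                             # photon left the volume
        deposit = weight * (mu_a / mu_t)      # fraction absorbed at this interaction
        grid[tuple(idx)] += deposit
        weight -= deposit
        # isotropic scattering: draw a new random direction
        direction = rng.normal(size=3)
        direction /= np.linalg.norm(direction)

for _ in range(5000):
    run_photon()

print("total absorbed weight:", grid.sum())
```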