179 results for Code validation


Relevance: 20.00%

Abstract:

Background: It has been suggested that inaccuracies in cancer registries are distorting UK survival statistics. This study compared the Northern Ireland Cancer Registry (NICR) database of living patients with independent data held by Northern Ireland's General Practitioners (GPs), in order to validate the diagnoses and dates recorded by the registry.

Methods: All 387 GP practice managers were invited to participate, and 100 practices (25.84%) responded. Comparisons were made for 17,102 patients, equivalent to 29.08% of the 58,798 living patients extracted from the NICR between 1993 and 2010.

Results: There were no significant differences (p > 0.05) between the responding and non-responding GP patient profiles for age, marital status or deprivation score. However, the responding practices included more female patients (p = 0.02). NICR data accuracy was high: 0.08% of GP cancer patients (n = 15) were not included in registry records, and 0.02% (n = 2) had a diagnosis date that differed from the GP record by more than 2 weeks (by 3 weeks and 5 months, respectively). Compared with the GP records, the NICR had recorded a different tumour type in two cases and a different tumour status (benign vs. malignant) in three cases.

Conclusion: This comparison demonstrates a high level of accuracy within the NICR and indicates that survival statistics based on these data can be relied upon.
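The comparison above amounts to a record-linkage check between two data sources. A minimal sketch of that matching logic, using hypothetical patient identifiers and dates rather than NICR data:

```python
from datetime import date

# Hypothetical GP and registry records (patient id -> diagnosis date);
# illustrative only, not NICR data.
gp_records = {
    "p1": date(2001, 3, 14),
    "p2": date(2005, 7, 2),
    "p3": date(2010, 1, 20),
}
registry_records = {
    "p1": date(2001, 3, 14),
    "p2": date(2005, 8, 30),  # differs from the GP date by more than 2 weeks
    # "p3" is absent from the registry
}

# Patients the registry missed entirely
missing = [p for p in gp_records if p not in registry_records]

# Patients whose registered diagnosis date differs by more than 2 weeks
date_mismatch = [
    p for p in gp_records
    if p in registry_records
    and abs((registry_records[p] - gp_records[p]).days) > 14
]

missing_rate = len(missing) / len(gp_records)
print(missing, date_mismatch, f"{missing_rate:.2%}")
```

At the study's scale, the two headline proportions (0.08% of patients missing, 0.02% with discrepant dates) fall out of exactly this kind of comparison.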


Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require them, with a variety of SD techniques now used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells at four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate them. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
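Validation statistics of the kind compared here can be sketched in a few lines. The 1 mm wet-day threshold and the synthetic gamma-distributed daily series below are assumptions for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def wet_day_frequency(precip_mm, wet_threshold=1.0):
    """Fraction of days at or above the wet-day threshold (assumed 1 mm)."""
    precip_mm = np.asarray(precip_mm)
    return float(np.mean(precip_mm >= wet_threshold))

# Synthetic "observed" and "downscaled" daily series (mm/day), ten years each.
# The larger scale parameter mimics a model with a wetter tail.
observed = rng.gamma(shape=0.4, scale=6.0, size=3650)
simulated = rng.gamma(shape=0.4, scale=7.0, size=3650)

# Simple validation summary: bias of the simulated series in three statistics
bias = {
    "mean": float(simulated.mean() - observed.mean()),
    "max": float(simulated.max() - observed.max()),
    "wet_freq": wet_day_frequency(simulated) - wet_day_frequency(observed),
}
print(bias)
```

Comparing such biases against the needs of the chosen impact sector is exactly the kind of user-side validation the paper argues for.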


Despite the increasing availability of digital slide viewing, and the numerous advantages associated with its application, a lack of quality validation studies is amongst the reasons for poor uptake in routine practice. This study evaluated primary digital pathology reporting in the setting of routine subspecialist gastrointestinal pathology, commonplace in most tissue pathology laboratories and representing one of the highest-volume specialties in most laboratories. Individual digital and glass slide diagnoses were compared amongst three pathologists reporting in a gastrointestinal subspecialty team, in a prospective series of 100 consecutive diagnostic cases from routine practice in a large teaching hospital laboratory. The study included a washout period of at least 6 months. Discordant diagnoses were classified, and the study was evaluated against recent College of American Pathologists (CAP) recommendations for evaluating digital pathology systems for diagnostic use. The study design met all 12 of the CAP recommendations. The 100 study cases generated 300 pairs of diagnoses, comprising 100 glass slide diagnoses and 100 digital diagnoses from each of the three study pathologists. 286 of 300 pairs of diagnoses were concordant, representing intraobserver concordance of 95.3%, broadly comparable to rates previously published in this field. In ten of the 14 discordant pairs, the glass slide diagnosis was favoured; in four cases, the digital diagnosis was favoured, but importantly, all 14 discordant intraobserver diagnoses were considered to be of minor clinical significance. Interobserver, or viewing-modality-independent, concordance was found in 94 of the total of 100 study cases, providing a comparable baseline discordance rate expected in any second viewing of pathology material. These overall results support the safe use of digital pathology in primary diagnostic reporting in this setting.
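The concordance rates reported above are simple proportions over paired diagnoses. A minimal sketch with hypothetical case labels (not the study's diagnoses):

```python
# Hypothetical paired diagnoses for one pathologist: (glass slide, digital)
pairs = [
    ("tubular adenoma", "tubular adenoma"),
    ("hyperplastic polyp", "hyperplastic polyp"),
    ("tubular adenoma", "hyperplastic polyp"),  # an intraobserver discordance
    ("normal mucosa", "normal mucosa"),
]

# Intraobserver concordance: same diagnosis on both viewing modalities
concordant = sum(1 for glass, digital in pairs if glass == digital)
concordance = concordant / len(pairs)
print(f"{concordant}/{len(pairs)} concordant ({concordance:.1%})")
```

The study's figure of 286/300 (95.3%) is this same proportion computed over all three pathologists' pairs.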


A single-step lateral flow immunoassay (LFIA) was developed and validated for the rapid screening of paralytic shellfish toxins (PSTs) in a variety of shellfish species, at concentrations relevant to the regulatory limit of 800 μg STX-diHCl equivalents/kg shellfish meat. A simple aqueous extraction protocol was performed within several minutes from sample homogenate. The qualitative result was generated after a 5 min run time using a portable reader, which removed subjectivity from data interpretation. The test was designed to generate non-compliant results with samples containing approximately 800 μg of STX-diHCl/kg. The cross-reactivities in relation to STX, expressed as mean ± SD, were as follows: NEO: 128.9% ± 29%; GTX1&4: 5.7% ± 1.5%; GTX2&3: 23.4% ± 10.4%; dcSTX: 55.6% ± 10.9%; dcNEO: 28.0% ± 8.9%; dcGTX2&3: 8.3% ± 2.7%; C1&C2: 3.1% ± 1.2%; GTX5: 23.3% ± 14.4% (n = 5 LFIA lots). There were no indications of matrix effects from the different samples evaluated (mussels, scallops, oysters, clams, cockles), nor interference from other shellfish toxins (domoic acid, okadaic acid group). Evaluations of naturally contaminated samples showed that no false negative results were generated from a variety of different samples and profiles (n = 23), in comparison to reference methods (MBA method 959.08, LC-FD method 2005.06). External laboratory evaluations of naturally contaminated samples (n = 39) indicated good correlation with the reference methods (MBA, LC-FD). This is the first LFIA that has been shown, through rigorous validation, to detect most major PSTs reliably, and it will be of significant benefit to both industry and regulators, who need to perform rapid and reliable testing to ensure shellfish are safe to eat.
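Using the mean cross-reactivities above, a simple additive model can illustrate how a mixed congener profile might approach the 800 μg STX-diHCl eq./kg action level. The additive-response assumption and the sample profile are illustrative only, not the assay's validated calibration:

```python
# Mean cross-reactivities relative to STX, as fractions, taken from the
# abstract (STX itself is the 100% reference).
cross_reactivity = {
    "STX": 1.000, "NEO": 1.289, "GTX1&4": 0.057, "GTX2&3": 0.234,
    "dcSTX": 0.556, "dcNEO": 0.280, "dcGTX2&3": 0.083,
    "C1&C2": 0.031, "GTX5": 0.233,
}

def lfia_response_equivalents(congeners_ug_per_kg):
    """Approximate assay response in STX-diHCl equivalents (ug/kg) as the
    sum of congener concentration x cross-reactivity. This additive model
    is an assumption for illustration, not the assay's calibration."""
    return sum(cross_reactivity[name] * conc
               for name, conc in congeners_ug_per_kg.items())

# Hypothetical contamination profile dominated by STX and GTX2&3 (ug/kg)
sample = {"STX": 700.0, "GTX2&3": 1000.0}
response = lfia_response_equivalents(sample)
print(response, response >= 800.0)  # exceeds the 800 ug/kg action level
```

The low cross-reactivities for GTX1&4 and C1&C2 show why profiles dominated by those congeners are the harder case for any screening immunoassay.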


This article examines how the primary objective of validation—whether it is proving a model, a technology or a product—influences the engineering design process. Through the examination of a number of stiffened panel case studies, the relationships between simulation, validation, design and the final product are established and discussed. The work demonstrates the complex interactions between the original (or anticipated) design model, the analysis model, the validation activities and the product in service. The outcome clearly shows some unintended consequences. High-fidelity validation test simulations require a different set of detailed parameters to accurately capture behaviour. In capturing that behaviour, the analysis diverges from the original computer-aided design model, intrinsically limiting the value of the validation with respect to the product. This work represents a shift from the traditional perspective of encapsulating and controlling errors between simulation and experimental test to consideration of the wider design-test process. Specifically, it is a reflection on the implications of how models are built and validated, and the effect on results and understanding of structural behaviour. This article then identifies key checkpoints in the design process and how these should be used to update the computer-aided design system parameters for a design. This work strikes at a fundamental challenge in understanding the interaction between design, certification and operation of any complex system.


There is increasing interest in the biomedical field in creating implantable medical devices that provide a temporary mechanical function inside the human body. In many of these applications, bioresorbable polymer composites such as PLLA with β-TCP are increasingly being used due to their biocompatibility, biodegradability and mechanical strength [1,3]. These medical devices can be manufactured using conventional plastics processing methods such as injection moulding and extrusion; however, there is a great need to understand and control the process due to a lack of knowledge of the influence of processing on material properties. With the addition of biocompatible additives there is also a requirement to be able to predict the quality and level of dispersion within the polymer matrix. On-line UV-Vis spectroscopy has been shown to monitor the quality of filler dispersion in polymers, which can eliminate time-consuming and costly post-process evaluation of additive dispersion. The aim of this work was to identify process and performance relationships of PLLA/β-TCP composites with respect to melt-extrusion conditions. This is part of a wider study into on-line process monitoring of bioresorbable polymers as used in the medical industry.
These results show that the final properties of the PLLA/β-TCP composite are strongly influenced by particle size and loading. UV-Vis spectroscopy can be used on-line to monitor the final product, and this can be utilised as a valuable quality-control tool in an application where consistent performance is of paramount importance.
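The idea of using on-line UV-Vis absorbance as a dispersion quality check can be sketched as a linearity test against filler loading: for a well-dispersed filler, absorbance should scale roughly linearly with loading, and a large residual flags a problem. All readings and loadings below are illustrative, not measured PLLA/β-TCP values:

```python
# Illustrative on-line absorbance readings versus filler loading (wt%);
# the last point falls below the trend established by the first three.
loadings_wt_pct = [0.0, 5.0, 10.0, 15.0]
absorbance = [0.02, 0.51, 1.01, 1.28]

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Calibrate on the first three (assumed well-dispersed) points
slope, intercept = linear_fit(loadings_wt_pct[:3], absorbance[:3])

# Check the newest reading against the calibration line
predicted = slope * loadings_wt_pct[3] + intercept
residual = absorbance[3] - predicted
print(residual)  # a large negative residual may indicate agglomeration
```

In a process-monitoring context this residual would be tracked continuously, turning the spectrometer into the in-line quality-control tool described above.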


An experimental investigation is carried out to verify the feasibility of using an instrumented vehicle to detect and monitor bridge dynamic parameters. The low-cost method uses a moving vehicle fitted with accelerometers on its axles. In the laboratory experiment, the vehicle–bridge interaction model consists of a scaled two-axle vehicle model crossing a simply supported steel beam. The bridge model also includes a scaled road surface profile. The effects of varying the vehicle model configuration and speed are investigated. A finite element beam model is calibrated using the experimental results, and a novel algorithm for the identification of global bridge stiffness is validated. Using measured vehicle accelerations as input to the algorithm, the beam stiffness is identified with a reasonable degree of accuracy.
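In a greatly simplified form, global stiffness identification can be illustrated by back-calculating EI from the first natural frequency of a simply supported beam. This textbook relation is a stand-in for the paper's acceleration-based algorithm, and the beam properties are illustrative:

```python
import math

def simply_supported_EI(f1_hz, length_m, mass_per_m):
    """Back-calculate flexural stiffness EI (N m^2) from the first
    natural frequency of a simply supported beam, using
        omega_1 = (pi / L)**2 * sqrt(EI / m)
    This is a simplified stand-in for the paper's identification
    algorithm, which works from measured axle accelerations."""
    omega1 = 2.0 * math.pi * f1_hz
    return mass_per_m * (omega1 * length_m ** 2 / math.pi ** 2) ** 2

# Round-trip check with illustrative values:
# EI = 2.0e7 N m^2, L = 20 m, m = 500 kg/m
EI_true = 2.0e7
L, m = 20.0, 500.0
f1 = (math.pi / L) ** 2 * math.sqrt(EI_true / m) / (2.0 * math.pi)
EI_identified = simply_supported_EI(f1, L, m)
print(f1, EI_identified)
```

The practical difficulty the paper addresses is extracting a reliable bridge frequency (or stiffness) from accelerations measured on the moving vehicle, where the signal also contains vehicle dynamics and road profile effects.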


BACKGROUND: Understanding the heterogeneous genotypes and phenotypes of prostate cancer is fundamental to improving the way we treat this disease. As yet, there are no validated descriptions of prostate cancer subgroups derived from integrated genomics linked with clinical outcome.

METHODS: In a study of 482 tumour, benign and germline samples from 259 men with primary prostate cancer, we used integrative analysis of copy number alterations (CNA) and array transcriptomics to identify genomic loci that affect mRNA expression levels, in an expression quantitative trait loci (eQTL) approach. Patients were thereby stratified into subgroups that we then associated with future clinical behaviour, and the results were compared with those obtained from either CNA or transcriptomics alone.

FINDINGS: We identified five separate patient subgroups with distinct genomic alterations and expression profiles based on 100 discriminating genes in our separate discovery and validation sets of 125 and 103 men. These subgroups were able to consistently predict biochemical relapse (p = 0.0017 and p = 0.016, respectively) and were further validated in a third cohort with long-term follow-up (p = 0.027). We show the relative contributions of gene expression and copy number data to phenotype, and demonstrate the improved power gained from integrative analyses. We confirm alterations in six genes previously associated with prostate cancer (MAP3K7, MELK, RCBTB2, ELAC2, TPD52, ZBTB4), and also identify 94 genes not previously linked to prostate cancer progression that would not have been detected using either transcript or copy number data alone. We confirm a number of previously published molecular changes associated with high-risk disease, including MYC amplification, and NKX3-1, RB1 and PTEN deletions, as well as over-expression of PCA3 and AMACR, and loss of MSMB in tumour tissue. A subset of the 100 genes outperforms established clinical predictors of poor prognosis (PSA, Gleason score), as well as previously published gene signatures (p = 0.0001). We further show how our molecular profiles can be used for the early detection of aggressive cases in a clinical setting, and to inform treatment decisions.

INTERPRETATION: For the first time in prostate cancer, this study demonstrates the importance of integrated genomic analyses incorporating both benign and tumour tissue data in identifying molecular alterations, leading to the generation of robust gene sets that are predictive of clinical outcome in independent patient cohorts.
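Assigning a new patient to one of the subgroups from a discriminating gene panel could, in the simplest case, be sketched as nearest-centroid classification. The centroids and expression values below are synthetic, and the study's actual classifier integrates CNA and expression data, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical subgroup centroids over a 10-gene discriminating panel
n_genes = 10
centroids = {
    "subgroup_A": rng.normal(0.0, 1.0, n_genes),
    "subgroup_B": rng.normal(2.0, 1.0, n_genes),
}

def assign_subgroup(expression, centroids):
    """Assign a patient's expression vector to the nearest subgroup
    centroid by Euclidean distance (illustrative only)."""
    return min(centroids,
               key=lambda k: float(np.linalg.norm(expression - centroids[k])))

# Synthetic patient whose profile sits close to subgroup B's centroid
patient = centroids["subgroup_B"] + rng.normal(0.0, 0.1, n_genes)
print(assign_subgroup(patient, centroids))
```

In the study itself, such subgroup labels are the quantity then tested for association with biochemical relapse in the independent validation cohorts.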