123 results for False confession


Relevance:

10.00%

Publisher:

Abstract:

Objective To demonstrate the potential value of screening for Down's Syndrome using highly correlated repeated measures of serum markers taken in the first and second trimesters of pregnancy. Design A Monte Carlo simulation study. Population Detection rates and false positive rates relating to the maternal age distribution of England and Wales for the period 1996 to 1998 were obtained using marker distributions from the SURUSS study. Results Screening using first trimester nuchal translucency and repeated measures of uE3 and PAPP-A in the first and second trimester has an estimated false positive rate of 0.3% for an 85% detection rate. This should be compared with the integrated test with an estimated false positive rate of 1.2% for the same detection rate. Conclusions The performance of repeated measures screening tests, and their acceptability to women, should be assessed in further prospective studies.
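A minimal sketch of the kind of Monte Carlo calculation described, assuming a hypothetical marker measured in both trimesters with highly correlated repeated measures; the means, covariance and sample sizes are illustrative placeholders, not the SURUSS distributions, and the maternal age prior is ignored.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical log10-MoM means and covariance for one marker measured in
# both trimesters (highly correlated repeated measures).
mean_unaffected = np.array([0.00, 0.00])
mean_affected = np.array([0.25, 0.30])
cov = np.array([[0.040, 0.032],
                [0.032, 0.050]])

n = 200_000
unaffected = rng.multivariate_normal(mean_unaffected, cov, size=n)
affected = rng.multivariate_normal(mean_affected, cov, size=n)

def likelihood_ratio(x):
    """Affected-vs-unaffected likelihood ratio for observed marker vectors."""
    return (multivariate_normal.pdf(x, mean_affected, cov) /
            multivariate_normal.pdf(x, mean_unaffected, cov))

lr_affected = likelihood_ratio(affected)
lr_unaffected = likelihood_ratio(unaffected)

# Fix the risk cut-off so that 85% of affected pregnancies screen positive,
# then read off the corresponding false positive rate.
cutoff = np.quantile(lr_affected, 0.15)
false_positive_rate = np.mean(lr_unaffected >= cutoff)
print(f"detection rate 85% -> false positive rate {false_positive_rate:.2%}")
```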

Relevance:

10.00%

Publisher:

Abstract:

Objective Within the framework of a health technology assessment and using an economic model, to determine the most clinically and cost effective policy of scanning and screening for fetal abnormalities in early pregnancy. Design A discrete event simulation model of 50,000 singleton pregnancies. Setting Maternity services in Scotland. Population Women during the first 24 weeks of their pregnancy. Methods The mathematical model was populated with data on uptake of screening, prevalence, detection and false positive rates for eight fetal abnormalities and with costs for ultrasound scanning and serum screening. Inclusion of abnormalities was based on the relative prevalence and clinical importance of conditions and the availability of data. Six strategies for the identification of abnormalities prenatally including combinations of first and second trimester ultrasound scanning and first and second trimester screening for chromosomal abnormalities were compared. Main outcome measures The number of abnormalities detected and missed, the number of iatrogenic losses resulting from invasive tests, the total cost of strategies and the cost per abnormality detected were compared between strategies. Results First trimester screening for chromosomal abnormalities costs more than second trimester screening but results in fewer iatrogenic losses. Strategies which include a second trimester ultrasound scan result in more abnormalities being detected and have lower costs per anomaly detected. Conclusions The preferred strategy includes both first and second trimester ultrasound scans and a first trimester screening test for chromosomal abnormalities. It has been recommended that this policy is offered to all women in Scotland.
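The bottom-line comparison in such a model reduces to simple arithmetic over each strategy's simulated outputs; a sketch with invented figures (not the Scottish model results) is:

```python
# Illustrative comparison of screening strategies on cost per abnormality
# detected; all figures are hypothetical placeholders, not the study's data.
strategies = {
    "2nd-trimester scan only":          {"cost": 4_000_000, "detected": 240, "missed": 160, "iatrogenic": 12},
    "1st+2nd scans, 1st-tri screening": {"cost": 5_200_000, "detected": 310, "missed": 90,  "iatrogenic": 7},
}

for name, s in strategies.items():
    cost_per_detected = s["cost"] / s["detected"]
    print(f"{name}: total cost £{s['cost']:,}, "
          f"{s['detected']} detected / {s['missed']} missed, "
          f"{s['iatrogenic']} iatrogenic losses, "
          f"£{cost_per_detected:,.0f} per abnormality detected")
```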

Relevance:

10.00%

Publisher:

Abstract:

Objective To present a first and second trimester Down syndrome screening strategy, whereby second-trimester marker determination is contingent on the first-trimester results. Unlike non-disclosure sequential screening (the Integrated test), which requires all women to have markers in both trimesters, this allows a large proportion of the women to complete screening in the first trimester. Methods Two first-trimester risk cut-offs defined three types of results: positive and referred for early diagnosis; negative with screening complete; and intermediate, needing second-trimester markers. Multivariate Gaussian modelling with Monte Carlo simulation was used to estimate the false-positive rate for a fixed 85% detection rate. The false-positive rate was evaluated for various early detection rates and early test completion rates. Model parameters were taken from the SURUSS trial. Results Completion of screening in the first trimester for 75% of women resulted in a 30% early detection rate and a 55% second-trimester detection rate (net 85%), with a false-positive rate only 0.1% above that achievable by the Integrated test. The screen-positive rate was 0.1% in the first trimester and 4.7% for those continuing to be tested in the second trimester. If the early detection rate were to be increased to 45% or the early completion rate were to be increased to 80%, there would be a further 0.1% increase in the false-positive rate. Conclusion Contingent screening can achieve results comparable with the Integrated test but with earlier completion of screening for most women. Both strategies need to be evaluated in large-scale prospective studies, particularly in relation to psychological impact and practicability.
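The contingent rule itself is a two-cut-off triage; a sketch of that decision logic, with arbitrary illustrative cut-off values rather than those derived in the paper, is:

```python
# Sketch of the contingent decision rule described above: two first-trimester
# risk cut-offs split women into screen-positive, screen-negative and an
# intermediate group who go on to second-trimester markers.
UPPER_CUTOFF = 1 / 30     # risk above this: offer early diagnosis
LOWER_CUTOFF = 1 / 1500   # risk below this: screening complete, negative

def first_trimester_triage(risk: float) -> str:
    """Classify a first-trimester Down syndrome risk estimate."""
    if risk >= UPPER_CUTOFF:
        return "screen positive: offer first-trimester diagnostic test"
    if risk <= LOWER_CUTOFF:
        return "screen negative: screening complete"
    return "intermediate: measure second-trimester markers and re-assess"

for risk in (1 / 10, 1 / 400, 1 / 5000):
    print(f"risk 1 in {round(1 / risk)} -> {first_trimester_triage(risk)}")
```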

Relevance:

10.00%

Publisher:

Abstract:

Objective To demonstrate the potential value of three-stage sequential screening for Down syndrome. Methods Protocols were considered in which maternal serum pregnancy associated plasma protein-A (PAPP-A) and free β-human chorionic gonadotropin (β-hCG) measurements were taken on all women in the first trimester. Those women with very low Down syndrome risks were screened negative at that stage and nuchal translucency (NT) was measured on the remainder and the risk reassessed. Those with very low risk were then screened negative and those with very high risk were offered early diagnostic testing. Those with intermediate risks received second-trimester maternal serum α-fetoprotein, free β-hCG, unconjugated estriol and inhibin-A. Risk was then reassessed and those with high risk were offered diagnosis. Detection rates and false-positive rates were estimated by multivariate Gaussian modelling using Monte Carlo simulation. Results The modelling suggests that, with full adherence to a three-stage policy, overall detection rates of nearly 90% and false-positive rates below 2.0% can be achieved. Approximately two-thirds of pregnancies are screened on the basis of first-trimester biochemistry alone, five out of six women complete their screening in the first trimester, and the first-trimester detection rate is over 60%. Conclusion Three-stage contingent sequential screening is potentially highly effective for Down syndrome screening. The acceptability of this protocol, and its performance in practice, should be tested in prospective studies. Copyright © 2006 John Wiley & Sons, Ltd.
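A sketch of the three-stage contingent routing described above; the risk thresholds and the way risks are carried between stages are hypothetical placeholders, since the paper derives them by Gaussian modelling.

```python
def three_stage_screen(risk_biochem, risk_after_nt=None, risk_after_quad=None):
    """Return the screening outcome for one pregnancy.

    risk_biochem    -- risk after first-trimester PAPP-A / free beta-hCG
    risk_after_nt   -- risk after adding nuchal translucency (if measured)
    risk_after_quad -- risk after second-trimester quadruple markers (if measured)
    """
    VERY_LOW, VERY_HIGH = 1 / 2000, 1 / 30          # hypothetical cut-offs

    if risk_biochem <= VERY_LOW:
        return "stage 1: screen negative on biochemistry alone"
    if risk_after_nt is None:
        return "stage 2 required: measure NT"
    if risk_after_nt <= VERY_LOW:
        return "stage 2: screen negative"
    if risk_after_nt >= VERY_HIGH:
        return "stage 2: screen positive, offer early diagnosis"
    if risk_after_quad is None:
        return "stage 3 required: second-trimester quadruple markers"
    return ("stage 3: screen positive, offer diagnosis"
            if risk_after_quad >= VERY_HIGH else "stage 3: screen negative")

print(three_stage_screen(1 / 5000))
print(three_stage_screen(1 / 500, risk_after_nt=1 / 20))
print(three_stage_screen(1 / 500, risk_after_nt=1 / 300, risk_after_quad=1 / 10))
```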

Relevance:

10.00%

Publisher:

Abstract:

This is the first paper to show and theoretically analyse that the presence of auto-correlation can produce considerable alterations in the Type I and Type II errors of univariate and multivariate statistical control charts. To remove this undesired effect, linear inverse ARMA filters are employed, and the application studies in this paper show that false alarms (increased Type I errors) and insensitive monitoring statistics (increased Type II errors) were eliminated.
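A minimal sketch of the idea, assuming a simple AR(1) disturbance rather than a general ARMA model: positive autocorrelation inflates false alarms on an individuals chart that uses the usual moving-range sigma estimate, while charting the inverse-filtered residuals restores the nominal behaviour.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated in-control AR(1) process (phi is illustrative).
phi, n = 0.8, 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def individuals_chart_alarms(series, k=3.0):
    """Count points beyond +/- k sigma, with sigma from the moving range."""
    sigma_hat = np.mean(np.abs(np.diff(series))) / 1.128
    centre = series.mean()
    return np.count_nonzero(np.abs(series - centre) > k * sigma_hat)

# Inverse AR(1) filter: estimate phi by least squares, chart the residuals.
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
residuals = x[1:] - phi_hat * x[:-1]

print("alarms on raw autocorrelated data:", individuals_chart_alarms(x))
print("alarms on filtered residuals     :", individuals_chart_alarms(residuals))
```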

Relevance:

10.00%

Publisher:

Abstract:

The work in this paper is of particular significance since it considers the problem of modelling cross- and auto-correlation in statistical process monitoring. The presence of both types of correlation can lead to fault insensitivity or false alarms, although in the published literature to date only auto-correlation has been broadly considered. The proposed method, which uses a Kalman innovation model, effectively removes both correlations. The paper (and Part 2 [2]) has emerged from work supported by EPSRC grant GR/S84354/01 and is of direct relevance to problems in several application areas including chemical, electrical, and mechanical process monitoring.
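A minimal sketch of why filtering through a state-space (Kalman innovation) model helps: the one-step-ahead innovations of a well-specified filter are close to white and mutually uncorrelated, so a T² chart on them behaves as designed. Here the state-space matrices are assumed known and purely illustrative; in the approach described they would be identified from data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Cross- and auto-correlated 2-variable process: x_{t+1} = A x_t + w_t, y_t = x_t + v_t
A = np.array([[0.8, 0.3],
              [0.1, 0.7]])
Q = 0.1 * np.eye(2)      # state noise covariance
R = 0.05 * np.eye(2)     # measurement noise covariance
n = 1000

x = np.zeros(2)
Y = np.zeros((n, 2))
for t in range(n):
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
    Y[t] = x + rng.multivariate_normal(np.zeros(2), R)

# Standard Kalman filter; the one-step-ahead innovations are monitored.
x_hat = np.zeros(2)
P = np.eye(2)
innovations, S_last = [], None
for y in Y:
    x_pred = A @ x_hat
    P_pred = A @ P @ A.T + Q
    S = P_pred + R                     # innovation covariance (C = I)
    e = y - x_pred                     # innovation
    K = P_pred @ np.linalg.inv(S)      # Kalman gain
    x_hat = x_pred + K @ e
    P = (np.eye(2) - K) @ P_pred
    innovations.append(e)
    S_last = S

E = np.array(innovations[50:])         # discard the filter transient
T2 = np.einsum('ij,jk,ik->i', E, np.linalg.inv(S_last), E)
print("fraction of T^2 values above the chi^2(2) 99% limit:",
      np.mean(T2 > 9.21))
```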

Relevance:

10.00%

Publisher:

Abstract:

We report cross sections for Ps(1s)-Li(2s) scattering in the energy range up to 30 eV. The calculations have been carried out in a coupled state approximation. The Ps states consist of both eigenstates and pseudostates, the latter to allow for ionization of the Ps. The atom is treated as a frozen core represented by a model potential which supports the valence orbitals. The coupled state expansion includes only the 2s and 2p states of the atom as well as an unphysical 1s state which exists in the model potential. The inclusion of this 1s state is necessary in order to avoid pronounced false pseudostructure. Results are presented for excitation and ionization of the Ps as well as for collisions in which the Ps(1s) remains unchanged. These results also differentiate between the case where the Li(2s) remains unexcited and that where it is excited to the 2p level. (c) 2005 Published by Elsevier B.V.

Relevance:

10.00%

Publisher:

Abstract:

Aim The aim of this paper is to challenge recent sceptical approaches to the possibility of validating qualitative research and to underline the benefits of adopting a realist approach to validity. Background In recent discussion about the methodological bases for qualitative research it has been argued that, because different methodologies take different approaches to validity, attempts to develop a common set of validation criteria are futile. On the basis of this sceptical view, a number of strategies for judging qualitative research have been proposed. These include suggestions that: it should be judged according to aesthetic or rhetorical criteria, rather than epistemological validity; responsibility for appraisal should move from researchers to readers; each methodology should be assessed individually according to its own merits. Discussion None of these suggestions provides a viable alternative to validity, defined as the extent to which research reflects accurately that to which it refers. Because the form of research does not determine its content, replacement of epistemology by aesthetics is unsustainable. Because research reports mediate between writer and reader, a one-sided approach to this relationship constitutes a false dichotomy. If we accept the criterion of practitioner confidence as a means of judging methodological approaches, this involves rejection of judgement according to a methodology's own merits. Conclusion If qualitative research is actually about something, and if it is required to provide beneficial information, then a realist approach to validity holds out the greatest promise.

Relevance:

10.00%

Publisher:

Abstract:

Polymerase chain reaction (PCR) assessment of clonal immunoglobulin (Ig) and T-cell receptor (TCR) gene rearrangements is an important diagnostic tool in mature B-cell neoplasms. However, lack of standardized PCR protocols resulting in a high level of false negativity has hampered comparability of data in previous clonality studies. In order to address these problems, 22 European laboratories investigated the Ig/TCR rearrangement patterns as well as t(14;18) and t(11;14) translocations of 369 B-cell malignancies belonging to five WHO-defined entities using the standardized BIOMED-2 multiplex PCR tubes accompanied by international pathology panel review. B-cell clonality was detected by combined use of the IGH and IGK multiplex PCR assays in all 260 definitive cases of B-cell chronic lymphocytic leukemia (n=56), mantle cell lymphoma (n=54), marginal zone lymphoma (n=41) and follicular lymphoma (n=109). Two of 109 cases of diffuse large B-cell lymphoma showed no detectable clonal marker. The use of these techniques to assign cell lineage should be treated with caution as additional clonal TCR gene rearrangements were frequently detected in all disease categories. Our study indicates that the BIOMED-2 multiplex PCR assays provide a powerful strategy for clonality assessment in B-cell malignancies resulting in high Ig clonality detection rates particularly when IGH and IGK strategies are combined.

Relevance:

10.00%

Publisher:

Abstract:

Dipicolinic acid (DPA) is an excellent marker compound for bacterial spores, including those of Bacillus anthracis (anthrax). Surface-enhanced Raman spectroscopy (SERS) potentially has the sensitivity and discrimination needed for trace DPA analysis, but mixing DPA solutions with citrate-reduced silver colloid only yielded measurable SERS spectra at much higher (> 80 ppm) concentrations than would be desirable for anthrax detection. Aggregation of the colloid with halide salts eliminated even these small DPA bands, but aggregation with Na2SO4(aq) resulted in a remarkable increase in the DPA signals. With sulfate aggregation even 1 ppm solutions gave detectable signals with 10 s accumulation times, which is in the sensitivity range required. Addition of CNS- as an internal standard allowed quantitative DPA analysis: plotting the intensity of the strong DPA 1010 cm-1 band (normalised to the ca. 2120 cm-1 CNS- band) against DPA concentration gave a linear calibration (R² = 0.986) over the range 0-50 ppm DPA. The inclusion of thiocyanate also allows false negatives due to accidental deactivation of the enhancing medium to be detected.
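The quantification step is an ordinary internal-standard calibration; a sketch with made-up intensity values (not data from the paper) is:

```python
# Sketch of the internal-standard calibration described above: the DPA band
# intensity is normalised to the thiocyanate band before regression against
# concentration. The intensity values below are invented for illustration.
import numpy as np

conc_ppm = np.array([0, 5, 10, 20, 30, 40, 50], dtype=float)
dpa_band = np.array([0.02, 0.11, 0.21, 0.44, 0.63, 0.86, 1.05])   # ~1010 cm-1
scn_band = np.full_like(dpa_band, 1.00)                           # ~2120 cm-1 internal standard

ratio = dpa_band / scn_band                        # normalised intensity
slope, intercept = np.polyfit(conc_ppm, ratio, 1)  # linear calibration
fitted = slope * conc_ppm + intercept
r2 = 1 - np.sum((ratio - fitted) ** 2) / np.sum((ratio - ratio.mean()) ** 2)

print(f"ratio = {slope:.4f} * [DPA, ppm] + {intercept:.4f},  R^2 = {r2:.3f}")

# An unknown is quantified by inverting the calibration; a near-zero
# thiocyanate band would instead flag a deactivated substrate (false negative).
unknown_ratio = 0.50
print("estimated DPA concentration:", (unknown_ratio - intercept) / slope, "ppm")
```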

Relevance:

10.00%

Publisher:

Abstract:

Aims/hypothesis: Diabetic nephropathy, characterised by persistent proteinuria, hypertension and progressive kidney failure, affects a subset of susceptible individuals with diabetes. It is also a leading cause of end-stage renal disease (ESRD). Non-synonymous (ns) single nucleotide polymorphisms (SNPs) have been reported to contribute to genetic susceptibility in both monogenic disorders and common complex diseases. The objective of this study was to investigate whether nsSNPs are involved in susceptibility to diabetic nephropathy using a case-control design.

Methods: White type 1 diabetic patients with (cases) and without (controls) nephropathy from eight centres in the UK and Ireland were genotyped for a selected subset of nsSNPs using Illumina's GoldenGate BeadArray assay. A χ² test for trend, stratified by centre, was used to assess differences in genotype distribution between cases and controls. Genomic control was used to adjust for possible inflation of test statistics, and the False Discovery Rate method was used to account for multiple testing.

Results: We assessed 1,111 nsSNPs for association with diabetic nephropathy in 1,711 individuals with type 1 diabetes (894 cases, 817 controls). A number of SNPs demonstrated a significant difference in genotype distribution between groups before but not after correction for multiple testing. Furthermore, neither subgroup analysis (diabetic nephropathy with ESRD or diabetic nephropathy without ESRD) nor stratification by duration of diabetes revealed any significant differences between groups.

Conclusions/interpretation: The nsSNPs investigated in this study do not appear to contribute significantly to the development of diabetic nephropathy in patients with type 1 diabetes.
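For a single SNP, the per-SNP test described in the Methods is a χ² test for trend over the three genotype groups; a sketch using the equivalence between the Cochran-Armitage trend statistic and (N-1)·r², with invented genotype counts and ignoring the stratification by centre and the genomic-control adjustment, is:

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical counts of cases and controls carrying 0, 1 or 2 minor alleles.
cases    = np.array([420, 370, 104])
controls = np.array([440, 310,  67])

# Expand to per-individual genotype score and case indicator.
g = np.concatenate([np.repeat([0, 1, 2], cases), np.repeat([0, 1, 2], controls)])
y = np.concatenate([np.ones(cases.sum()), np.zeros(controls.sum())])

# Cochran-Armitage trend statistic via its correlation form: (N - 1) * r^2.
N = len(y)
r = np.corrcoef(y, g)[0, 1]
trend_stat = (N - 1) * r ** 2
p_value = chi2.sf(trend_stat, df=1)
print(f"trend chi-square = {trend_stat:.2f}, p = {p_value:.3g}")
```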

Relevance:

10.00%

Publisher:

Abstract:

Recently, genome wide association studies (GWAS) have identified a number of single nucleotide polymorphisms (SNPs) as being associated with coronary heart disease (CHD). We estimated the effect of these SNPs on incident CHD, stroke and total mortality in the prospective cohorts of the MORGAM Project. We studied cohorts from Finland, Sweden, France and Northern Ireland (total N=33,282, including 1,436 incident CHD events and 571 incident stroke events). The lead SNPs at seven loci identified thus far and additional SNPs (in total 42) were genotyped using a case-cohort design. We estimated the effect of the SNPs on disease history at baseline, disease events during follow-up and classic risk factors. Multiple testing was taken into account using false discovery rate (FDR) analysis. SNP rs1333049 on chromosome 9p21.3 was associated with both CHD and stroke (HR = 1.20, 95% CI 1.08-1.34 for incident CHD events and 1.15, 0.99-1.34 for incident stroke). SNP rs11670734 (19q12) was associated with total mortality and stroke. SNP rs2146807 (10q11.21) showed some association with the fatality of acute coronary events. SNP rs2943634 (2q36.3) was associated with high density lipoprotein (HDL) cholesterol and SNPs rs599839, rs4970834 (1p13.3) and rs17228212 (15q22.23) were associated with non-HDL cholesterol. SNPs rs2943634 (2q36.3) and rs12525353 (6q25.1) were associated with blood pressure. These findings underline the need for replication studies in prospective settings and confirm the candidacy of several SNPs that may play a role in the etiology of cardiovascular disease.
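The multiple-testing step mentioned above is a standard false discovery rate adjustment; a sketch of Benjamini-Hochberg adjustment on arbitrary illustrative p-values (not MORGAM results) is:

```python
import numpy as np

def benjamini_hochberg(p_values):
    """Return Benjamini-Hochberg adjusted p-values (q-values) in the original order."""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)
    # enforce monotonicity from the largest p-value downwards
    q = np.minimum.accumulate(ranked[::-1])[::-1].clip(max=1.0)
    out = np.empty(m)
    out[order] = q
    return out

p_values = [0.0004, 0.003, 0.012, 0.04, 0.08, 0.21, 0.47, 0.62, 0.81, 0.95]
for p, q in zip(p_values, benjamini_hochberg(p_values)):
    print(f"p = {p:.4f}  ->  q = {q:.4f}")
```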

Relevance:

10.00%

Publisher:

Abstract:

Background

Interaction of a drug or chemical with a biological system can result in a gene-expression profile or signature characteristic of the event. Using a suitably robust algorithm, these signatures can potentially be used to connect molecules with similar pharmacological or toxicological properties by gene-expression profile. Lamb et al. first proposed the Connectivity Map [Lamb et al. (2006), Science 313, 1929–1935] to make successful connections among small molecules, genes, and diseases using genomic signatures.

Results

Here we have built on the principles of the Connectivity Map to present a simpler and more robust method for the construction of reference gene-expression profiles and for the connection scoring scheme, which, importantly, allows the evaluation of the statistical significance of all the connections observed. We tested the new method with two randomly generated gene signatures and three experimentally derived gene signatures (for HDAC inhibitors, estrogens, and immunosuppressive drugs, respectively). Our testing with this method indicates that it achieves a higher level of specificity and sensitivity and so advances the original method.
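In the same spirit (though not the authors' exact statistic), a rank-based connection score between a query signature and a reference profile, with a permutation p-value, can be sketched as follows; the gene indices and expression values are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

n_genes = 1000
reference_profile = rng.standard_normal(n_genes)   # one reference expression profile
up_set = rng.choice(n_genes, 30, replace=False)    # query signature: up-regulated genes
down_set = rng.choice(np.setdiff1d(np.arange(n_genes), up_set), 30, replace=False)

def connection_score(profile, up, down):
    """Difference in mean rank of up- and down-genes, scaled by the number of genes."""
    ranks = profile.argsort().argsort()             # 0 = most down-regulated gene
    return (ranks[up].mean() - ranks[down].mean()) / len(profile)

observed = connection_score(reference_profile, up_set, down_set)

# Permutation null: random signatures of the same sizes scored on the same profile.
def random_score():
    picked = rng.choice(n_genes, len(up_set) + len(down_set), replace=False)
    return connection_score(reference_profile, picked[:len(up_set)], picked[len(up_set):])

null = np.array([random_score() for _ in range(2000)])
p_value = np.mean(np.abs(null) >= abs(observed))
print(f"connection score = {observed:.3f}, permutation p = {p_value:.3f}")
```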

Conclusion

The method presented here not only offers more principled statistical procedures for testing connections, but, more importantly, it provides an effective safeguard against false connections while at the same time achieving increased sensitivity. With its robust performance, the method has potential use in the drug development pipeline for the early recognition of pharmacological and toxicological properties in chemicals and new drug candidates, and also more broadly in other 'omics sciences.

Relevance:

10.00%

Publisher:

Abstract:

Motivation: Many biomedical experiments are carried out by pooling individual biological samples. However, pooling samples can potentially hide biological variance and give false confidence concerning the significance of the data. In the context of microarray experiments for detecting differentially expressed genes, recent publications have addressed the problem of the efficiency of sample pooling, and some approximate formulas were provided for the power and sample size calculations. It is desirable to have exact formulas for these calculations and to have the approximate results checked against the exact ones. We show that the difference between the approximate and the exact results can be large.
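The commonly used approximate calculation for a pooled design can be sketched as follows; the variance components, effect size and designs are illustrative, and the exact formulas derived in the paper are not reproduced here.

```python
# Approximate power for a two-group comparison when r subjects are pooled per
# array: the per-array variance is sigma_b^2 / r + sigma_e^2. This is the
# normal-approximation formula, not the paper's exact result.
from scipy.stats import norm

def approx_power(delta, sigma_b2, sigma_e2, n_arrays, pool_size, alpha=0.001):
    """Approximate power of a two-sided test of a group difference delta."""
    var_per_array = sigma_b2 / pool_size + sigma_e2
    se = (2 * var_per_array / n_arrays) ** 0.5      # SE of the group difference
    z_alpha = norm.ppf(1 - alpha / 2)
    return norm.cdf(delta / se - z_alpha) + norm.cdf(-delta / se - z_alpha)

# Effect size of one log2 unit with hypothetical variance components.
for pool_size, n_arrays in [(1, 10), (3, 10), (3, 4)]:
    p = approx_power(delta=1.0, sigma_b2=0.5, sigma_e2=0.1,
                     n_arrays=n_arrays, pool_size=pool_size)
    print(f"pool size {pool_size}, {n_arrays} arrays/group -> approx power {p:.2f}")
```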