903 results for Tests for Continuous Lifetime Data
Abstract:
Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen et al., 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2)$, where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the "DNAcopy" package of the Bioconductor project (Gentleman et al., 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
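To make the permutation idea above concrete, here is a minimal sketch assuming a single change-point and a plain two-sample t-statistic; the actual CBS test in DNAcopy uses a circular two-change-point statistic together with the hybrid p-value approximation and early-stopping rule described in the abstract.

    # A minimal sketch of a permutation p-value for a maximal t-statistic.
    # This is not the DNAcopy implementation: CBS maximizes a circular
    # two-change-point statistic; here we maximize over single splits.
    import numpy as np

    def max_t_statistic(x):
        """Maximal two-sample t-statistic over all binary splits of x."""
        n, best = len(x), 0.0
        for i in range(2, n - 1):              # at least 2 points per side
            left, right = x[:i], x[i:]
            s2 = left.var(ddof=1) / len(left) + right.var(ddof=1) / len(right)
            if s2 > 0:
                best = max(best, abs(left.mean() - right.mean()) / np.sqrt(s2))
        return best

    def permutation_p_value(x, n_perm=200, seed=0):
        """Fraction of permutations whose maximal t meets the observed one.
        The abstract's stopping rule would abandon this loop early once
        enough exceedances have accrued; omitted here for brevity."""
        rng = np.random.default_rng(seed)
        observed = max_t_statistic(x)
        hits = sum(max_t_statistic(rng.permutation(x)) >= observed
                   for _ in range(n_perm))
        return (hits + 1) / (n_perm + 1)       # add-one correction

    # Example: a simulated copy-number profile with one change-point.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.5, 1, 50)])
    print(permutation_p_value(x))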
Abstract:
It is of interest in some applications to determine whether there is a relationship between a hazard rate function (or a cumulative incidence function) and a mark variable which is only observed at uncensored failure times. We develop nonparametric tests for this problem when the mark variable is continuous. Tests are developed for the null hypothesis that the mark-specific hazard rate is independent of the mark versus ordered and two-sided alternatives expressed in terms of mark-specific hazard functions and mark-specific cumulative incidence functions. The test statistics are based on functionals of a bivariate test process equal to a weighted average of differences between a Nelson-Aalen-type estimator of the mark-specific cumulative hazard function and a nonparametric estimator of this function under the null hypothesis. The weight function in the test process can be chosen so that the test statistics are asymptotically distribution-free. Asymptotically correct critical values are obtained through a simple simulation procedure. The testing procedures are shown to perform well in numerical studies, and are illustrated with an AIDS clinical trial example. Specifically, the tests are used to assess whether the instantaneous or absolute risk of treatment failure depends on the amount of accumulation of drug resistance mutations in a subject's HIV. This assessment helps guide development of anti-HIV therapies that surmount the problem of drug resistance.
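As a point of reference for the test process described above, the following sketch computes the ordinary (unmarked) Nelson-Aalen estimator of a cumulative hazard from right-censored data; the paper's estimator is a mark-specific analogue, so this shows only the basic building block.

    # A minimal sketch of the (unmarked) Nelson-Aalen estimator of the
    # cumulative hazard from right-censored lifetime data.
    import numpy as np

    def nelson_aalen(times, events):
        """Return (distinct failure times, cumulative hazard estimates).

        times  : observed times (failure or censoring)
        events : 1 if a failure was observed, 0 if censored
        """
        times = np.asarray(times, dtype=float)
        events = np.asarray(events, dtype=int)
        fail_times = np.unique(times[events == 1])
        H, cum = [], 0.0
        for t in fail_times:
            d = np.sum((times == t) & (events == 1))   # failures at t
            r = np.sum(times >= t)                     # number still at risk
            cum += d / r
            H.append(cum)
        return fail_times, np.array(H)

    t, H = nelson_aalen([2, 3, 3, 5, 7, 8], [1, 1, 0, 1, 0, 1])
    print(dict(zip(t, np.round(H, 3))))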
Abstract:
This paper is the fourth in a series of reviews that will summarize available data and critically discuss the potential role of lung-function testing in infants with acute neonatal respiratory disorders and chronic lung disease of infancy. The current paper addresses information derived from tidal breathing measurements within the framework outlined in the introductory paper of this series, with particular reference to how these measurements inform on control of breathing. Infants with acute and chronic respiratory illness demonstrate differences in tidal breathing and its control that are of clinical consequence and can be measured objectively. The increased incidence of significant apnea in preterm infants and infants with chronic lung disease, together with the reportedly increased risk of sudden unexplained death within the latter group, suggests that control of breathing is affected by both maturation and disease. Clinical observations are supported by formal comparison of tidal breathing parameters and control of breathing indices in the research setting.
Abstract:
BACKGROUND: This study is based on a comprehensive survey of the neuropsychological attention-deficit hyperactivity disorder (ADHD) literature and presents the first psychometric analyses of different parameters of intra-subject variability (ISV) in patients with ADHD compared to healthy controls, using the Continuous Performance Test, a Go-NoGo task, a Stop Signal Task, as well as N-back tasks. METHODS: Data of 57 patients with ADHD and 53 age- and gender-matched controls were available for statistical analysis. Different parameters were used to describe central tendency (arithmetic mean, median), dispersion (standard deviation, coefficient of variation, consecutive variance), and shape (skewness, excess) of reaction time distributions, as well as errors (commissions and omissions). RESULTS: Group comparisons revealed by far the strongest effect sizes for measures of dispersion, followed by measures of central tendency, and by commission errors. Statistical control of ISV reduced group differences in the other measures substantially. One (patients) or two (controls) principal components explained up to 67% of the inter-individual differences in intra-individual variability. CONCLUSIONS: Results suggest that, across a variety of neuropsychological tests, measures of ISV contribute best to group discrimination, with limited incremental validity of measures of central tendency and errors. Furthermore, increased ISV might be a unitary construct in ADHD.
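The distributional parameters listed above are all standard summaries of a reaction-time vector. The sketch below computes them with numpy/scipy; note that "consecutive variance" is implemented here as the mean squared successive difference, which is an assumption about the paper's definition rather than something stated in the abstract.

    # A hedged sketch of the ISV parameters described above, for one
    # subject's reaction-time (RT) vector.
    import numpy as np
    from scipy import stats

    def isv_parameters(rt):
        rt = np.asarray(rt, dtype=float)
        diffs = np.diff(rt)
        return {
            "mean": rt.mean(),
            "median": np.median(rt),
            "sd": rt.std(ddof=1),
            "coefficient_of_variation": rt.std(ddof=1) / rt.mean(),
            "consecutive_variance": np.mean(diffs ** 2),  # assumed definition
            "skewness": stats.skew(rt),
            "excess_kurtosis": stats.kurtosis(rt),        # Fisher definition
        }

    rng = np.random.default_rng(1)
    # Lognormal values mimic a right-skewed RT distribution (in ms).
    print(isv_parameters(rng.lognormal(mean=6.0, sigma=0.3, size=200)))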
Abstract:
The last few years have seen the advent of high-throughput technologies to analyze various properties of the transcriptome and proteome of several organisms. The congruency of these different data sources, or lack thereof, can shed light on the mechanisms that govern cellular function. A central challenge for bioinformatics research is to develop a unified framework for combining the multiple sources of functional genomics information and testing associations between them, thus obtaining a robust and integrated view of the underlying biology. We present a graph theoretic approach to test the significance of the association between multiple disparate sources of functional genomics data by proposing two statistical tests, namely edge permutation and node label permutation tests. We demonstrate the use of the proposed tests by finding significant association between a Gene Ontology-derived "predictome" and data obtained from mRNA expression and phenotypic experiments for Saccharomyces cerevisiae. Moreover, we employ the graph theoretic framework to recast a surprising discrepancy presented in Giaever et al. (2002) between gene expression and knockout phenotype, using expression data from a different set of experiments.
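A node-label permutation test of the kind proposed can be illustrated with a small sketch. The statistic below simply counts edges joining nodes with the same label; it is a stand-in for the paper's actual association statistics, and the companion edge permutation test is not shown.

    # A minimal sketch of a node-label permutation test on a graph: the
    # null distribution is obtained by shuffling labels over nodes.
    import random

    def same_label_edges(edges, labels):
        return sum(labels[u] == labels[v] for u, v in edges)

    def node_label_permutation_test(edges, labels, n_perm=2000, seed=0):
        rng = random.Random(seed)
        nodes = sorted(labels)
        observed = same_label_edges(edges, labels)
        values = list(labels.values())
        hits = 0
        for _ in range(n_perm):
            rng.shuffle(values)                      # permute labels over nodes
            if same_label_edges(edges, dict(zip(nodes, values))) >= observed:
                hits += 1
        return observed, (hits + 1) / (n_perm + 1)   # one-sided p-value

    # Toy graph with a binary node attribute (e.g., a phenotype class).
    edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"), ("a", "c")]
    labels = {"a": 1, "b": 1, "c": 0, "d": 0}
    print(node_label_permutation_test(edges, labels))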
Abstract:
OBJECTIVE: To consider the reasons and context for test ordering by doctors when faced with an undiagnosed complaint in primary or secondary care. STUDY DESIGN AND SETTING: We reviewed any study of any design that discussed factors that may affect a doctor's decision to order a test. Articles were located through searches of electronic databases, authors' files on diagnostic methodology, and reference lists of relevant studies. We extracted data on: study design, type of analysis, setting, topic area, and any factors reported to influence test ordering. RESULTS: We included 37 studies. We carried out a thematic analysis to synthesize data. Five key groupings arose from this process: diagnostic factors, therapeutic and prognostic factors, patient-related factors, doctor-related factors, and policy and organization-related factors. To illustrate how the various factors identified may influence test ordering we considered the symptom low back pain and the diagnosis multiple sclerosis as examples. CONCLUSIONS: A wide variety of factors influence a doctor's decision to order a test. These are integral to understanding diagnosis in clinical practice. Traditional diagnostic accuracy studies should be supplemented with research into the broader context in which doctors perform their work.
Abstract:
Stem cells of various tissues are typically defined as multipotent cells with 'self-renewal' properties. Despite the increasing interest in stem cells, surprisingly little is known about the number of times stem cells can or do divide over a lifetime. Based on telomere-length measurements of hematopoietic cells, we previously proposed that the self-renewal capacity of hematopoietic stem cells is limited by progressive telomere attrition and that such cells divide very rapidly during the first year of life. Recent studies of patients with aplastic anemia resulting from inherited mutations in telomerase genes support the notion that the replicative potential of hematopoietic stem cells is directly related to telomere length, which is indirectly related to telomerase levels. To revisit conclusions about stem cell turnover based on cross-sectional studies of telomere length, we performed a longitudinal study of telomere length in leukocytes from newborn baboons. All four individual animals studied showed a rapid decline in telomere length (approximately 2-3 kb) in granulocytes and lymphocytes in the first year after birth. After 50-70 weeks the telomere length appeared to stabilize in all cell types. These observations suggest that hematopoietic stem cells, after an initial phase of rapid expansion, switch at around 1 year of age to a different functional mode characterized by a markedly decreased turnover rate.
Abstract:
INTRODUCTION: Recent advances in medical imaging have brought post-mortem minimally invasive computed tomography (CT) guided percutaneous biopsy to public attention. AIMS: The goals of the following study were to facilitate and automate post-mortem biopsy, to avoid the radiation exposure to the investigator that may occur when sampling tissue under CT guidance, and to minimize the number of needle insertion attempts to a single puncture per target. METHODS AND MATERIALS: Clinically approved and post-mortem tested ACN-III biopsy core needles (14 gauge x 160 mm) with an automatic pistol device (Bard Magnum, Medical Device Technologies, Denmark) were used for tissue sampling. The needles were navigated in a gelatine/peas phantom, an ex vivo porcine model and subsequently in two human bodies using a navigation system (MEM centre/ISTB Medical Application Framework, Marvin, Bern, Switzerland) with a guidance frame and a CT scanner (Emotion 6, Siemens, Germany). RESULTS: Biopsy of all peas could be performed within a single attempt. The average distance between the inserted needle tip and the pea centre was 1.4 mm (n=10; SD 0.065 mm; range 0-2.3 mm). The targets in the porcine liver were also accurately punctured. The average distance between the needle tip and the target was 0.5 mm (range 0-1 mm). Biopsies of brain, heart, lung, liver, pancreas, spleen, and kidney were performed on human corpses. For each target the biopsy needle was inserted only once. The examination of one body, with sampling of tissue at the above-mentioned locations, took approximately 45 min. CONCLUSIONS: Post-mortem navigated biopsy can reliably provide tissue samples from different body locations. Since the continuous update of positional data of the body and the biopsy needle is performed using optical tracking, no control CT images verifying the positional data are necessary, and no radiation exposure to the investigator need be taken into account. Furthermore, the number of needle insertions for each target can be minimized to a single one, with adequate accuracy proven ex vivo, and, in contrast to conventional CT guided biopsy, the insertion angle may be oblique. Navigation for minimally invasive tissue sampling is a useful addition to post-mortem CT guided biopsy.
Abstract:
AIMS/HYPOTHESIS: To assess the use of paediatric continuous subcutaneous infusion (CSII) under real-life conditions by analysing data recorded for up to 90 days and relating them to outcome. METHODS: Pump programming data from patients aged 0-18 years treated with CSII in 30 centres from 16 European countries and Israel were recorded during routine clinical visits. HbA(1c) was measured centrally. RESULTS: A total of 1,041 patients (age: 11.8 +/- 4.2 years; diabetes duration: 6.0 +/- 3.6 years; average CSII duration: 2.0 +/- 1.3 years; HbA(1c): 8.0 +/- 1.3% [means +/- SD]) participated. Glycaemic control was better in preschool (n = 142; 7.5 +/- 0.9%) and pre-adolescent (6-11 years, n = 321; 7.7 +/- 1.0%) children than in adolescent patients (12-18 years, n = 578; 8.3 +/- 1.4%). There was a significant negative correlation between HbA(1c) and daily bolus number, but not between HbA(1c) and total daily insulin dose. The use of <6.7 daily boluses was a significant predictor of an HbA(1c) level >7.5%. The incidence of severe hypoglycaemia and ketoacidosis was 6.63 and 6.26 events per 100 patient-years, respectively. CONCLUSIONS/INTERPRETATION: This large paediatric survey of CSII shows that glycaemic targets can be frequently achieved, particularly in young children, and the incidence of acute complications is low. Adequate substitution of basal and prandial insulin is associated with a better HbA(1c).
Abstract:
Turrialba is one of the largest and most active stratovolcanoes in the Central Cordillera of Costa Rica and an excellent target for validation of satellite data using ground based measurements due to its high elevation, relative ease of access, and persistent elevated SO2 degassing. The Ozone Monitoring Instrument (OMI) aboard the Aura satellite makes daily global observations of atmospheric trace gases and it is used in this investigation to obtain volcanic SO2 retrievals in the Turrialba volcanic plume. We present and evaluate the relative accuracy of two OMI SO2 data analysis procedures, the automatic Band Residual Index (BRI) technique and the manual Normalized Cloud-mass (NCM) method. We find a linear correlation and good quantitative agreement between SO2 burdens derived from the BRI and NCM techniques, with an improved correlation when wet season data are excluded. We also present the first comparisons between volcanic SO2 emission rates obtained from ground-based mini-DOAS measurements at Turrialba and three new OMI SO2 data analysis techniques: the MODIS smoke estimation, OMI SO2 lifetime, and OMI SO2 transect techniques. A robust validation of OMI SO2 retrievals was made, with both qualitative and quantitative agreements under specific atmospheric conditions, proving the utility of satellite measurements for estimating accurate SO2 emission rates and monitoring passively degassing volcanoes.
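The BRI-versus-NCM comparison above amounts to correlating two burden series and fitting a line through them. A sketch with purely illustrative numbers (not the study's retrievals):

    # Agreement check between two SO2 retrieval techniques: Pearson
    # correlation plus an ordinary least-squares fit.  Burden values
    # below are made up for illustration.
    import numpy as np

    bri = np.array([1.2, 0.8, 2.5, 3.1, 0.6, 1.9])   # hypothetical BRI burdens (kt)
    ncm = np.array([1.0, 0.9, 2.2, 3.4, 0.5, 2.1])   # hypothetical NCM burdens (kt)

    r = np.corrcoef(bri, ncm)[0, 1]
    slope, intercept = np.polyfit(bri, ncm, 1)
    print(f"r = {r:.3f}; fit: NCM = {slope:.2f}*BRI + {intercept:.2f}")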
Abstract:
Sustainable yields from water wells in hard-rock aquifers are achieved when the well bore intersects fracture networks. Fracture networks are often not readily discernable at the surface. Lineament analysis using remotely sensed satellite imagery has been employed to identify surface expressions of fracturing, and a variety of image-analysis techniques have been successfully applied in “ideal” settings. An ideal setting for lineament detection is where the influences of human development, vegetation, and climatic situations are minimal and hydrogeological conditions and geologic structure are known. There is not yet a well-accepted protocol for mapping lineaments, nor have different approaches been compared in non-ideal settings. A new approach for image-processing/synthesis was developed to identify successful satellite imagery types for lineament analysis in non-ideal terrain. Four satellite sensors (ASTER, Landsat7 ETM+, QuickBird, RADARSAT-1) and a digital elevation model were evaluated for lineament analysis in Boaco, Nicaragua, where the landscape is subject to varied vegetative cover, a plethora of anthropogenic features, and frequent cloud cover that limit the availability of optical satellite data. A variety of digital image processing techniques were employed and lineament interpretations were performed to obtain 12 complementary image products that were evaluated subjectively to identify lineaments. The 12 lineament interpretations were synthesized to create a raster image of lineament zone coincidence that shows the level of agreement among the 12 interpretations. A composite lineament interpretation was made using the coincidence raster to restrict lineament observations to areas where multiple interpretations (at least 4) agree. Nine of the 11 previously mapped faults were identified from the coincidence raster. An additional 26 lineaments were identified from the coincidence raster, and the locations of 10 were confirmed by field observation. Four manual pumping tests suggest that well productivity is higher for wells proximal to lineament features. Interpretations from RADARSAT-1 products were superior to interpretations from other sensor products, suggesting that quality lineament interpretation in this region requires anthropogenic features to be minimized and topographic expressions to be maximized. The approach developed in this study has the potential to improve the siting of wells in non-ideal regions.
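The coincidence-raster synthesis described above reduces to stacking binary lineament rasters and thresholding the per-cell agreement count. A minimal sketch, with random stand-in layers in place of the 12 digitized interpretations:

    # Stack binary lineament rasters, count agreement per cell, and keep
    # only cells where at least 4 of the 12 interpretations coincide.
    import numpy as np

    n_interpretations, shape = 12, (100, 100)
    rng = np.random.default_rng(0)
    # Stand-in rasters; in practice each layer is a rasterized interpretation.
    layers = rng.random((n_interpretations,) + shape) < 0.05

    coincidence = layers.sum(axis=0)        # 0..12 agreements per cell
    composite = coincidence >= 4            # the ">= 4 agree" rule from the text
    print(coincidence.max(), composite.sum(), "cells pass the threshold")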
Abstract:
We hypothesized that the spatial distribution of groundwater inflows through river bottom sediments is a critical factor associated with the selection of coaster brook trout (a life history variant of Salvelinus fontinalis) spawning sites. An 80-m reach of the Salmon Trout River, in the Huron Mountains of the upper peninsula of Michigan, was selected to test the hypothesis based on long-term documentation of coaster brook trout spawning at this site. Throughout this site, the river is relatively similar along its length with regard to stream channel and substrate features. A monitoring well system consisting of an array of 27 wells was installed to measure subsurface temperatures underneath the riverbed over a 13-month period. The monitoring well locations were separated into areas where spawning has and has not been observed. Over 200,000 total temperature measurements were collected from 5 depths within each of the 27 monitoring wells. Temperatures within the substrate at the spawning area were generally cooler and less variable than river temperatures. Substrate temperatures in the non-spawning area were generally warmer, more variable, and closely tracked temporal variations in river temperatures. Temperature data were inverted to obtain subsurface groundwater velocities using a numerical approximation of the heat transfer equation. Approximately 45,000 estimates of groundwater velocities were obtained. Estimated velocities in the spawning and non-spawning areas confirmed that groundwater velocities in the spawning area were primarily in the upward direction, and were generally greater in magnitude than velocities in the non-spawning area. In the non-spawning area there was a greater occurrence of velocities in the downward direction, and velocity estimates were generally smaller in magnitude than in the spawning area. Both the temperature and velocity results confirm the hypothesis that spawning sites correspond to areas of significant groundwater influx to the river bed.
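One classical way to invert riverbed temperature data for vertical groundwater flux is the steady-state profile solution of Bredehoeft and Papadopulos (1965). The study used a numerical approximation of the heat transfer equation, so the sketch below, with made-up normalized temperatures, illustrates the idea rather than the authors' scheme.

    # Fit the Peclet number of the steady 1-D advection-conduction
    # temperature profile between two depths; the sign of the fitted
    # Peclet number indicates flow direction (negative = upward here).
    import numpy as np
    from scipy.optimize import curve_fit

    def bp_profile(z_over_L, beta):
        """Normalized steady-state temperature between two depths."""
        return np.expm1(beta * z_over_L) / np.expm1(beta)

    # Hypothetical normalized depths and temperatures within the riverbed.
    z = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
    T_norm = np.array([0.20, 0.43, 0.71, 0.89, 0.96])   # illustrative values

    popt, _ = curve_fit(bp_profile, z, T_norm, p0=[-1.0])
    # Darcy flux follows as q = beta * kappa / (rho_w * c_w * L).
    print(f"fitted Peclet number: {popt[0]:.2f}")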
Abstract:
This dissertation has three separate parts: the first part deals with general pedigree association testing incorporating continuous covariates; the second part deals with association tests under population stratification using conditional likelihood tests; the third part deals with genome-wide association studies based on the real rheumatoid arthritis (RA) data sets from Genetic Analysis Workshop 16 (GAW16) problem 1. Many statistical tests have been developed to test linkage and association using either case-control status or phenotype covariates for family data structures, separately. Such univariate analyses may not use all the information available from family members in practical studies. Moreover, human complex diseases do not have a clear inheritance pattern; the genes involved may interact or act independently. In part I, the newly proposed approach, MPDT, focuses on using both case-control information and phenotype covariates, and it can be applied to detect multiple marker effects. Building on two popular existing statistics in family studies, one for case-control data and one for quantitative traits, the new approach can be used on simple family structures as well as general pedigrees. The combined statistic is calculated from the two component statistics, and a permutation procedure is applied to assess the p-value, with a Bonferroni adjustment for multiple markers. We use simulation studies to evaluate the type I error rates and the power of the proposed approach. Our results show that the combined test, using both case-control information and phenotype covariates, not only has correct type I error rates but is also more powerful than existing methods. For multiple marker interactions, the proposed method is also very powerful. Selective genotyping is an economical strategy for detecting and mapping quantitative trait loci in the genetic dissection of complex disease. When the samples arise from different ethnic groups or an admixed population, all existing selective genotyping methods may produce spurious associations due to differing ancestry distributions. The problem can be more serious when the sample size is large, a general requirement for sufficient power to detect the modest genetic effects typical of most complex traits. In part II, I describe a useful strategy for selective genotyping in the presence of population stratification. The procedure uses a principal-component-based approach to eliminate any effect of population stratification. Its performance is evaluated using both simulated data from an earlier study and HapMap data sets under a variety of population admixture models generated from empirical data. The rheumatoid arthritis data set of Problem 1 in GAW16 contains one binary trait and two continuous traits: RA status, anti-CCP and IgM. To accommodate multiple traits, we propose a set of SNP-level F statistics based on the concept of multiple correlation to measure the genetic association between multiple trait values and SNP-specific genotypic scores, and we obtain their null distributions. We then perform six genome-wide association analyses using novel one- and two-stage approaches based on single, double and triple traits.
Incorporating all six analyses, we validate the SNPs that have been reported in the literature as responsible for rheumatoid arthritis and detect additional disease susceptibility SNPs for future follow-up studies. Except for chromosomes 13 and 18, every chromosome is found to harbour genetic regions susceptible to rheumatoid arthritis or related diseases, e.g., lupus erythematosus. This topic is discussed in part III.
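The combining-and-permuting machinery of part I can be sketched loosely as follows. The z-like marker statistics here are simple placeholders, not the actual pedigree-based MPDT statistics, and the permutation scheme ignores family structure.

    # A loose sketch: combine a case-control statistic with a
    # quantitative-trait statistic for each marker, assess by permuting
    # genotypes, and apply a Bonferroni adjustment across markers.
    import numpy as np

    def combined_stat(genotype, case, trait):
        """Sum of squared z-like statistics for one marker (illustrative)."""
        n = len(genotype)
        z1 = np.corrcoef(genotype, case)[0, 1] * np.sqrt(n)    # case-control part
        z2 = np.corrcoef(genotype, trait)[0, 1] * np.sqrt(n)   # covariate part
        return z1 ** 2 + z2 ** 2

    def permutation_p(genotype, case, trait, n_perm=500, seed=0):
        rng = np.random.default_rng(seed)
        obs = combined_stat(genotype, case, trait)
        hits = 0
        for _ in range(n_perm):
            idx = rng.permutation(len(genotype))   # break genotype-phenotype link
            if combined_stat(genotype[idx], case, trait) >= obs:
                hits += 1
        return (hits + 1) / (n_perm + 1)

    rng = np.random.default_rng(2)
    n, n_markers = 300, 10
    genos = rng.integers(0, 3, size=(n_markers, n)).astype(float)  # 0/1/2 dosages
    case = rng.integers(0, 2, n).astype(float)
    trait = case + rng.normal(0, 1, n)
    p_values = [permutation_p(g, case, trait) for g in genos]
    print("smallest Bonferroni-adjusted p-value:", min(p_values) * n_markers)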
Abstract:
In-cylinder pressure transducers have been used for decades to record combustion pressure inside a running engine. However, due to the extreme operating environment, transducer design and installation must be considered in order to minimize measurement error. One such error is caused by thermal shock, where the pressure transducer experiences a high heat flux that can distort the pressure transducer diaphragm and also change the crystal sensitivity. This research focused on investigating the effects of thermal shock on in-cylinder pressure transducer data quality using a 2.0 L, four-cylinder, spark-ignited, direct-injected, turbo-charged GM engine. Cylinder four was modified with five ports to accommodate pressure transducers from different manufacturers: an AVL GH14D, an AVL GH15D, a Kistler 6125C, and a Kistler 6054AR. The GH14D, GH15D, and 6054AR were M5-size transducers; the 6125C was a larger, 6.2 mm transducer. Note that both of the AVL pressure transducers utilized a PH03 flame arrestor. Sweeps of ignition timing (spark sweep), engine speed, and engine load were performed to study the effects of thermal shock on each pressure transducer. The project consisted of two distinct phases: experimental engine testing and simulation using a commercially available software package. A comparison was performed to characterize the quality of the data between the actual cylinder pressure and the simulated results. This comparison was valuable because the simulation results did not include thermal shock effects. All three sets of tests showed that the peak cylinder pressure was essentially unaffected by thermal shock. Comparison of the experimental data with the simulated results showed very good correlation. The spark sweep was performed at 1300 RPM and 3.3 bar NMEP and showed that the differences between the simulated results (no thermal shock) and the experimental data for the indicated mean effective pressure (IMEP) and the pumping mean effective pressure (PMEP) were significantly less than the published accuracies. All transducers had an IMEP percent difference less than 0.038% and less than 0.32% for PMEP. Kistler and AVL publish that the accuracy of their pressure transducers is within plus or minus 1% for the IMEP (AVL 2011; Kistler 2011). In addition, the difference in average exhaust absolute pressure between the simulated results and experimental data was greatest for the two Kistler pressure transducers; the location and lack of a flame arrestor are believed to be the cause of the increased error. For the engine speed sweep, the torque output was held constant at 203 Nm (150 ft-lbf) from 1500 to 4000 RPM. The difference in IMEP was less than 0.01% and the PMEP was less than 1%, except for the AVL GH14D, which was 5%, and the AVL GH15DK, which was 2.25%. A noticeable error in PMEP appeared as the load increased during the engine speed sweeps, as expected. The load sweep was conducted at 2000 RPM over a range of NMEP from 1.1 to 14 bar. The difference in IMEP values was less than 0.08%, while the PMEP values were below 1%, except for the AVL GH14D, which was 1.8%, and the AVL GH15DK, which was 1.25%. In-cylinder pressure transducer data quality was effectively analyzed using a combination of experimental data and simulation results. Several criteria can be used to investigate the impact of thermal shock on data quality as well as to determine the best location and thermal protection for various transducers.
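IMEP, the headline accuracy metric above, is the cyclic integral of cylinder pressure with respect to cylinder volume divided by the displaced volume. A self-contained sketch with a toy slider-crank geometry and an artificial pressure trace (made-up values, not real transducer data):

    # Gross IMEP from a crank-angle-resolved pressure trace:
    # IMEP = (cyclic integral of p dV) / displaced volume.
    import numpy as np

    def cylinder_volume(theta, bore, stroke, conrod, cr):
        """Slider-crank cylinder volume (m^3) vs crank angle (rad)."""
        r = stroke / 2.0
        v_disp = np.pi / 4.0 * bore ** 2 * stroke
        v_clear = v_disp / (cr - 1.0)
        x = r * (1 - np.cos(theta)) + conrod - np.sqrt(conrod ** 2 - (r * np.sin(theta)) ** 2)
        return v_clear + np.pi / 4.0 * bore ** 2 * x

    theta = np.linspace(-np.pi, np.pi, 1441)       # compression + expansion
    V = cylinder_volume(theta, bore=0.086, stroke=0.086, conrod=0.145, cr=9.5)
    p = 1.0e5 * (V.max() / V) ** 1.32              # motored polytropic trace (Pa)
    p[theta > 0] *= 2.5                            # crude constant-volume burn at TDC

    v_disp = V.max() - V.min()
    imep_gross = np.trapz(p, V) / v_disp           # path integral of p dV over Vd
    print(f"gross IMEP = {imep_gross / 1e5:.2f} bar")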
Abstract:
Coal is an aggregation of vegetal matter with varying small amounts of mineral and animal matter which have been so changed by the processes of sedimentation, decay and metamorphism that it has become a dense, dark, combustible substance. It occurs in beds varying in thickness from one foot or less to over 300 feet. The horizontal extent of a bed is sometimes continuous over an area as large as the State of Montana.