891 results for event tree analysis
Abstract:
Triggered event-related functional magnetic resonance imaging requires sparse intervals of temporally resolved functional data acquisitions, whose initiation corresponds to the occurrence of an event, typically an epileptic spike in the electroencephalographic trace. However, conventional fMRI time series are greatly affected by non-steady-state magnetization effects, which obscure initial blood oxygen level-dependent (BOLD) signals. Here, conventional echo-planar imaging and a post-processing solution based on principal component analysis were employed to remove the dominant eigenimages of the time series, to filter out the global signal changes induced by magnetization decay and to recover BOLD signals starting with the first functional volume. This approach was compared with a physical solution using radiofrequency preparation, which nullifies magnetization effects. As an application of the method, the detectability of the initial transient BOLD response in the auditory cortex, which is elicited by the onset of acoustic scanner noise, was used to demonstrate that post-processing-based removal of magnetization effects allows detection of brain activity patterns identical to those obtained using radiofrequency preparation. Using the auditory responses as an ideal experimental model of triggered brain activity, our results suggest that reducing the initial magnetization effects by removing a few principal components from fMRI data may be useful in the analysis of triggered event-related echo-planar time series. The implications of this study are discussed with particular attention to remaining technical limitations and to the additional neurophysiological issues of triggered acquisition.
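The core post-processing step described above can be sketched in a few lines of numpy. This is an illustrative toy, not the paper's pipeline: the data matrix, decay constant, and component count `k` are all hypothetical, and the "BOLD" term is a synthetic sinusoid.

```python
import numpy as np

# Toy sketch: remove the first k principal components of an fMRI-like
# time series to suppress a global magnetization-decay trend.
# All sizes and signals here are hypothetical.
rng = np.random.default_rng(0)
n_vols, n_vox = 100, 500

t = np.arange(n_vols)[:, None]
decay = np.exp(-t / 3.0)                   # T1-driven global signal decay
bold = 0.5 * np.sin(2 * np.pi * t / 20)    # small "BOLD-like" signal
X = 10 * decay + bold + 0.1 * rng.standard_normal((n_vols, n_vox))

def remove_top_components(X, k):
    """Subtract the k dominant principal components (temporal eigenvariates)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    low_rank = U[:, :k] * s[:k] @ Vt[:k, :]   # dominant-eigenimage part
    return Xc - low_rank

X_clean = remove_top_components(X, k=2)
# The initial volumes are no longer dominated by the decay term:
print(abs(X_clean[0].mean()) < abs((X - X.mean(0))[0].mean()))
```

Because the decay is (approximately) a rank-one, high-variance component, it lands in the first eigenimages and is removed along with them, which is the intuition the abstract relies on.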
Abstract:
Searching for the neural correlates of visuospatial processing using functional magnetic resonance imaging (fMRI) is usually done in an event-related framework of cognitive subtraction, applying a paradigm comprising visuospatial cognitive components and a corresponding control task. Beyond the methodological caveats of the cognitive subtraction approach, the standard general linear model with fixed hemodynamic response predictors risks being underspecified: it does not take into account the variability of the blood oxygen level-dependent signal response due to variable task demand and performance at the level of each single trial. This underspecification may reduce sensitivity in identifying task-related brain regions. In a rapid event-related fMRI study, we used an extended general linear model including single-trial reaction-time-dependent hemodynamic response predictors for the analysis of an angle discrimination task. In addition to the already known regions in the superior and inferior parietal lobule, mapping the reaction-time-dependent hemodynamic response predictor revealed a more specific network, including task-demand-dependent regions not detectable with the cognitive subtraction method, such as the bilateral caudate nucleus and insula, right inferior frontal gyrus and left precentral gyrus.
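The design-matrix idea behind such an extended GLM can be sketched as follows. This is a hedged illustration, not the study's model: the HRF shape, onsets, reaction times, and noise level are all invented, and the second column is simply a trial regressor whose amplitude scales with the mean-centered reaction time.

```python
import numpy as np

# Sketch of an "extended GLM": a fixed-amplitude trial regressor plus a
# reaction-time-modulated regressor. All numbers are hypothetical.
rng = np.random.default_rng(1)
n_scans = 200
onsets = np.array([10, 50, 90, 130, 170])
rts = np.array([0.6, 1.2, 0.8, 1.5, 0.7])          # per-trial RTs, invented

def hrf(length=20):
    t = np.arange(length)
    return t**5 * np.exp(-t) / 120.0               # crude gamma-like HRF

def regressor(onsets, weights, n_scans):
    stick = np.zeros(n_scans)
    stick[onsets] = weights
    return np.convolve(stick, hrf())[:n_scans]

X = np.column_stack([
    regressor(onsets, np.ones(len(onsets)), n_scans),  # fixed-amplitude term
    regressor(onsets, rts - rts.mean(), n_scans),      # RT-modulated term
    np.ones(n_scans),                                  # intercept
])
y = X @ np.array([1.0, 2.0, 0.1]) + 0.02 * rng.standard_normal(n_scans)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))   # recovers approximately [1.0, 2.0, 0.1]
```

Mean-centering the reaction times keeps the modulated column roughly orthogonal to the fixed-amplitude column, so the two effects can be estimated separately.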
Abstract:
Use of microarray technology often leads to high-dimensional, low-sample-size data settings. Over the past several years, a variety of novel approaches have been proposed for variable selection in this context. However, only a small number of these have been adapted for time-to-event data where censoring is present. Among the standard variable selection methods shown both to have good predictive accuracy and to be computationally efficient is the elastic net penalization approach. In this paper, an adaptation of the elastic net approach is presented for variable selection both under the Cox proportional hazards model and under an accelerated failure time (AFT) model. The two methods are assessed through simulation studies and through analysis of microarray data from a set of patients with diffuse large B-cell lymphoma, where time to survival is of interest. The approaches are shown to match or exceed the predictive performance of a Cox-based and an AFT-based variable selection method, and to be much more computationally efficient than their respective Cox- and AFT-based counterparts.
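The elastic net penalty itself can be illustrated with a bare-bones coordinate-descent fit on log survival times of (for simplicity) uncensored observations, an AFT-style regression. The censoring machinery that is the paper's actual contribution is omitted here, and all data are simulated.

```python
import numpy as np

# Minimal elastic net via coordinate descent on a high-dimensional,
# low-sample-size problem (censoring handling omitted; data invented).
rng = np.random.default_rng(2)
n, p = 100, 200
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [1.0, -1.0, 0.5]
log_t = X @ true_beta + 0.1 * rng.standard_normal(n)

def elastic_net(X, y, lam=0.1, alpha=0.5, n_iter=100):
    """Minimize 0.5/n*||y-Xb||^2 + lam*(alpha*||b||_1 + 0.5*(1-alpha)*||b||_2^2)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X**2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r_j / n
            # soft-threshold (lasso part), then shrink (ridge part)
            beta[j] = np.sign(rho) * max(abs(rho) - lam * alpha, 0.0) \
                      / (col_sq[j] + lam * (1 - alpha))
    return beta

beta_hat = elastic_net(X, log_t)
print(np.flatnonzero(np.abs(beta_hat) > 0.1))  # mostly the first 3 indices
```

With 200 candidate variables and only 100 samples, the penalty drives almost all coefficients to zero while retaining the truly informative ones, which is exactly the behavior that makes the approach attractive for microarray data.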
Abstract:
Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it requires only a subset of the working models to be correct for consistency; furthermore, the analyst need not know which subset is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.
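The decomposition this line of work builds on is the standard counterfactual identity of the mediation literature (stated here on the mean scale for intuition; the paper works with its Cox and additive-hazards analogues). Writing $Y(a, M(a'))$ for the outcome under exposure level $a$ with the mediator set to its value under exposure level $a'$:

```latex
\begin{align*}
\underbrace{E[Y(1,M(1))] - E[Y(0,M(0))]}_{\text{total effect}}
  &= \underbrace{E[Y(1,M(0))] - E[Y(0,M(0))]}_{\text{natural direct effect}} \\
  &\quad + \underbrace{E[Y(1,M(1))] - E[Y(1,M(0))]}_{\text{natural indirect effect}}
\end{align*}
```

The direct term holds the mediator at its unexposed value while switching the exposure; the indirect term holds the exposure fixed while switching the mediator's counterfactual value.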
Abstract:
Recurrent event data are largely characterized by the rate function, but smoothing techniques for estimating the rate function have never been rigorously developed or studied in the statistical literature. This paper considers the moment and least squares methods for estimating the rate function from recurrent event data. Under an independent censoring assumption on the recurrent event process, we study the statistical properties of the proposed estimators and propose bootstrap procedures for bandwidth selection and for approximating confidence intervals in the estimation of the occurrence rate function. We show that the moment method, without resmoothing via a smaller bandwidth, produces a curve with nicks at the censoring times, whereas the least squares method has no such problem. Furthermore, the asymptotic variance of the least squares estimator is shown to be smaller under regularity conditions. However, in the implementation of the bootstrap procedures, the moment method is computationally more efficient than the least squares method because it uses condensed bootstrap data. The performance of the proposed procedures is studied through Monte Carlo simulations and an epidemiological example on intravenous drug users.
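A hedged sketch of the basic moment-type idea: smooth the pooled event times with a kernel and divide by the number of subjects still under observation at each time point. The simulated data (homogeneous Poisson processes with rate 2, uniform censoring), kernel, and bandwidth are all illustrative choices, not the paper's.

```python
import numpy as np

# Kernel-smoothed occurrence-rate estimate from recurrent event data.
# Data and bandwidth are hypothetical.
rng = np.random.default_rng(3)
n = 50
censor = rng.uniform(5, 10, size=n)   # per-subject censoring times
events = [np.sort(rng.uniform(0, c, size=rng.poisson(2 * c)))
          for c in censor]            # rate-2 Poisson process per subject

def rate_estimate(t, events, censor, h=0.5):
    """Gaussian-kernel rate estimate at time t, bandwidth h."""
    at_risk = np.sum(censor >= t)     # subjects still under observation
    if at_risk == 0:
        return 0.0
    all_events = np.concatenate(events)
    weights = np.exp(-0.5 * ((t - all_events) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return weights.sum() / at_risk

grid = np.linspace(1, 4, 7)
est = np.array([rate_estimate(t, events, censor) for t in grid])
print(np.round(est, 2))   # should hover around the true rate of 2
```

The "nicks" the paper describes arise because the at-risk denominator drops abruptly at each censoring time while the smoothed numerator does not; the interval chosen here stays below all censoring times, so the estimate is smooth.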
Abstract:
OBJECTIVE: To estimate the prognosis over 5 years of HIV-1-infected, treatment-naive patients starting HAART, taking into account the immunological and virological response to therapy. DESIGN: A collaborative analysis of data from 12 cohorts in Europe and North America on 20,379 adults who started HAART between 1995 and 2003. METHODS: Parametric survival models were used to predict the cumulative incidence at 5 years of a new AIDS-defining event or death, and death alone, first from the start of HAART and second from 6 months after the start of HAART. Data were analysed by intention-to-continue-treatment, ignoring treatment changes and interruptions. RESULTS: During 61,798 person-years of follow-up, 1005 patients died and an additional 1303 developed AIDS. A total of 10,046 (49%) patients started HAART either with a CD4 cell count of less than 200 cells/microl or with a diagnosis of AIDS. The 5-year risk of AIDS or death (death alone) from the start of HAART ranged from 5.6 to 77% (1.8-65%), depending on age, CD4 cell count, HIV-1-RNA level, clinical stage, and history of injection drug use. From 6 months the corresponding figures were 4.1-99% for AIDS or death and 1.3-96% for death alone. CONCLUSION: On the basis of data collected routinely in HIV care, prognostic models with high discriminatory power over 5 years were developed for patients starting HAART in industrialized countries. A risk calculator that produces estimates for progression rates at years 1 to 5 after starting HAART is available from www.art-cohort-collaboration.org.
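The parametric machinery such models rest on is easy to state: under a Weibull survival model, the cumulative incidence by time t is F(t) = 1 - exp(-(t/scale)^shape). The sketch below is not the ART Cohort Collaboration model; the shape/scale values for the two risk profiles are invented purely to show how the 5-year risk is read off the fitted parameters.

```python
import numpy as np

# Weibull cumulative incidence: F(t) = 1 - exp(-(t/scale)^shape).
def weibull_risk(t, shape, scale):
    return 1.0 - np.exp(-(t / scale) ** shape)

# Hypothetical parameter sets for a "low-risk" and a "high-risk" profile:
low = weibull_risk(5.0, shape=0.9, scale=40.0)   # low-risk 5-year risk
high = weibull_risk(5.0, shape=0.9, scale=3.0)   # high-risk 5-year risk
print(round(low, 3), round(high, 3))
```

In practice the scale (or a log-linear predictor of it) is modeled as a function of the covariates the abstract lists: age, CD4 cell count, HIV-1-RNA level, clinical stage and injection drug use.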
Abstract:
BACKGROUND: Previous meta-analyses described moderate to large benefits of chondroitin in patients with osteoarthritis. However, recent large-scale trials did not find evidence of an effect. PURPOSE: To determine the effects of chondroitin on pain in patients with osteoarthritis. DATA SOURCES: The authors searched the Cochrane Central Register of Controlled Trials (1970 to 2006), MEDLINE (1966 to 2006), EMBASE (1980 to 2006), CINAHL (1970 to 2006), and conference proceedings; checked reference lists; and contacted authors. The last update of searches was performed on 30 November 2006. STUDY SELECTION: Studies were included if they were randomized or quasi-randomized, controlled trials that compared chondroitin with placebo or with no treatment in patients with osteoarthritis of the knee or hip. There were no language restrictions. DATA EXTRACTION: The authors extracted data in duplicate. Effect sizes were calculated from the differences in means of pain-related outcomes between treatment and control groups at the end of the trial, divided by the pooled SD. Trials were combined by using random-effects meta-analysis. DATA SYNTHESIS: 20 trials (3846 patients) contributed to the meta-analysis, which revealed a high degree of heterogeneity among the trials (I² = 92%). Small trials, trials with unclear concealment of allocation, and trials that were not analyzed according to the intention-to-treat principle showed larger effects in favor of chondroitin than did the remaining trials. When the authors restricted the analysis to the 3 trials with large sample sizes and an intention-to-treat analysis, 40% of patients were included. This resulted in an effect size of -0.03 (95% CI, -0.13 to 0.07; I² = 0%) and corresponded to a difference of 0.6 mm on a 10-cm visual analogue scale. A meta-analysis of 12 trials showed a pooled relative risk of 0.99 (CI, 0.76 to 1.31) for any adverse event.
LIMITATIONS: For 9 trials, the authors had to use approximations to calculate effect sizes. Trial quality was generally low, heterogeneity among the trials made initial interpretation of results difficult, and exploring sources of heterogeneity in meta-regression and stratified analyses may be unreliable. CONCLUSIONS: Large-scale, methodologically sound trials indicate that the symptomatic benefit of chondroitin is minimal or nonexistent. Use of chondroitin in routine clinical practice should therefore be discouraged.
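The random-effects machinery used in such pooled analyses can be sketched with the DerSimonian-Laird estimator: Cochran's Q quantifies heterogeneity, I² expresses the share of variability beyond chance, and the between-trial variance tau² inflates each trial's weight denominator. The effect sizes and variances below are invented, not this meta-analysis's data.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling with I^2 heterogeneity.
def random_effects_pool(effects, variances):
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                           # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = 1.0 / (variances + tau2)             # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, tau2, i2

effects = [-0.9, -0.6, -0.4, -0.05, 0.0]          # hypothetical SMDs
variances = [0.04, 0.05, 0.03, 0.01, 0.01]        # hypothetical variances
pooled, tau2, i2 = random_effects_pool(effects, variances)
print(round(pooled, 2), round(i2, 1))
```

Note how the two large, precise trials near zero are down-weighted less under random effects than the small trials with big apparent benefits are up-weighted, mirroring the tension the abstract describes between small trials and large intention-to-treat trials.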
Abstract:
In 2009 and 2010 a study was conducted on the Hiawatha National Forest (HNF) to determine whether whole-tree harvest (WTH) of jack pine would deplete soil nutrients in the very coarse-textured Rubicon soil. WTH is restricted on Rubicon sand in order to preserve soil fertility, but the increasing construction of biomass-fueled power plants is expected to increase the demand for forest biomass. The specific objectives of this study were to estimate the biomass and nutrient content of above- and below-ground tree components in mature jack pine (Pinus banksiana) stands growing on a coarse-textured, low-productivity soil; to determine pools of total C and N and exchangeable soil cations in Rubicon sand; and to compare the possible impacts of conventional stem-only harvest (CH) and WTH on soil nutrient pools and the implications for productivity of subsequent rotations. Four even-aged jack pine stands on Rubicon soil were studied. Allometric equations were used to estimate above-ground biomass and nutrients, and soil samples from each stand were taken for physical and chemical analysis. Results indicate that WTH will result in cation deficits in all stands, with exceptionally large Ca deficits in two stands. Where a deficit does not occur, the cation surplus is small, and chemical weathering and atmospheric deposition are not anticipated to replace the removed cations. CH will result in a surplus of cations and will likely not cause productivity declines during the first rotation. However, even under CH, the surplus is small, and chemical weathering and atmospheric deposition will not supply enough cations for the second rotation.
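The nutrient-budget arithmetic behind the CH/WTH comparison is simple to illustrate: allometric biomass (B = a·DBH^b) times a tissue nutrient concentration, summed over the components each harvest type removes. Every number below (coefficients, diameters, stem fraction, Ca concentration) is invented for illustration, not taken from the study.

```python
# Toy nutrient-removal comparison for stem-only (CH) vs whole-tree (WTH)
# harvest. All coefficients and measurements are hypothetical.
a, b = 0.1, 2.4                    # allometric coefficients (kg, cm), invented
dbh_cm = [12.0, 15.0, 18.0]        # hypothetical tree diameters

stem_frac = 0.65                   # stem share of total biomass, invented
ca_mg_per_kg = 900.0               # Ca concentration in tissue, invented
biomass_kg = [a * d**b for d in dbh_cm]

ch_removal = sum(bm * stem_frac for bm in biomass_kg) * ca_mg_per_kg / 1000
wth_removal = sum(biomass_kg) * ca_mg_per_kg / 1000
print(round(wth_removal - ch_removal, 1), "g additional Ca removed under WTH")
```

The extra removal under WTH comes entirely from the branch/foliage fraction, which is disproportionately nutrient-rich in real stands, so this toy actually understates the difference.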
Abstract:
The electric utility business is an inherently dangerous field, with employees exposed to many potential hazards daily. One such hazard is an arc flash: a rapid release of energy, referred to as incident energy, caused by an electric arc. Because an arc flash occurs randomly, one can only prepare for and minimize the extent of harm to oneself and other employees, and the damage to equipment, from such a violent event. Effective January 1, 2009, the National Electrical Safety Code (NESC) requires that an arc-flash assessment be performed by companies whose employees work on or near energized equipment, to determine the potential exposure to an electric arc. To comply with the NESC requirement, Minnesota Power's (MP's) current short-circuit and relay-coordination software package, ASPEN OneLiner™, one of the first software packages to implement an arc-flash module, is used to conduct an arc-flash hazard analysis. At the same time, the package is benchmarked against the equations provided in IEEE Std 1584-2002 and ultimately used to determine the incident energy levels on the MP transmission system. This report covers the history of arc-flash hazards; analysis methods, both software-based and empirically derived equations; issues of concern with the calculation methods; and the work conducted at MP. This work also produced two offline software products to conduct and verify an offline arc-flash hazard analysis.
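The empirical IEEE Std 1584-2002 incident-energy calculation the report benchmarks against has the shape sketched below. The constants (K1, K2, Cf, distance exponent x) are as commonly tabulated for the 2002 edition for one equipment configuration, but they vary by enclosure, grounding, voltage class and equipment type, so verify every value against the standard before any real assessment; the inputs are hypothetical.

```python
import math

# Sketch of the IEEE 1584-2002 empirical incident-energy equation
# (box configuration, grounded system; constants illustrative -- check
# the standard). Inputs are hypothetical.
def incident_energy(Ia_kA, t_s, D_mm, gap_mm=32, x=0.973,
                    K1=-0.555, K2=-0.113, Cf=1.0):
    """Incident energy in cal/cm^2 at working distance D_mm."""
    lg_En = K1 + K2 + 1.081 * math.log10(Ia_kA) + 0.0011 * gap_mm
    En = 10 ** lg_En                      # normalized to 0.2 s and 610 mm
    E_joules = 4.184 * Cf * En * (t_s / 0.2) * (610.0 ** x / D_mm ** x)
    return E_joules / 4.184               # convert J/cm^2 to cal/cm^2

print(round(incident_energy(Ia_kA=10.0, t_s=0.2, D_mm=610.0), 2))
```

The structure is the point: incident energy scales linearly with clearing time and falls off with working distance as D^-x, which is why relay-coordination settings and working distance dominate the assessment.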
Abstract:
BACKGROUND: In HIV type-1-infected patients starting highly active antiretroviral therapy (HAART), the prognostic value of haemoglobin when starting HAART, and of changes in haemoglobin levels, is not well defined. METHODS: We combined data from 10 prospective studies of 12,100 previously untreated individuals (25% women). A total of 4,222 patients (35%) were anaemic: 131 patients (1.1%) had severe (<8.0 g/dl), 1,120 (9%) had moderate (male 8.0-<11.0 g/dl and female 8.0-<10.0 g/dl) and 2,971 (25%) had mild (male 11.0-<13.0 g/dl and female 10.0-<12.0 g/dl) anaemia. We separately analysed progression to AIDS or death from baseline and from 6 months using Weibull models, adjusting for CD4+ T-cell count, age, sex and other variables. RESULTS: During 48,420 person-years of follow-up, 1,448 patients developed at least one AIDS event and 857 patients died. Anaemia at baseline was independently associated with higher mortality: the adjusted hazard ratio (95% confidence interval) for mild anaemia was 1.42 (1.17-1.73), for moderate anaemia 2.56 (2.07-3.18) and for severe anaemia 5.26 (3.55-7.81). Corresponding figures for progression to AIDS were 1.60 (1.37-1.86), 2.00 (1.66-2.40) and 2.24 (1.46-3.42). At 6 months the prevalence of anaemia had declined to 26%. Baseline anaemia continued to predict mortality (and, to a lesser extent, progression to AIDS) in patients with normal haemoglobin or mild anaemia at 6 months. CONCLUSIONS: Anaemia at the start of HAART is an important factor for short- and long-term prognosis, including in patients whose haemoglobin levels improved or normalized during the first 6 months of HAART.
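To see what hazard ratios of this size mean on the survival scale, recall that under proportional hazards S(t | HR) = S0(t)^HR. The toy below applies the abstract's anaemia hazard ratios to an invented Weibull baseline, purely for intuition; the baseline shape and scale are not estimated from the study.

```python
import math

# Proportional-hazards intuition: S(t | HR) = S0(t)^HR.
# Baseline Weibull parameters are invented; HRs are from the abstract.
def weibull_surv(t, shape=1.2, scale=30.0):
    return math.exp(-((t / scale) ** shape))

def surv_under_hr(t, hr, **kw):
    return weibull_surv(t, **kw) ** hr

t = 5.0  # years
for label, hr in [("no anaemia", 1.0), ("mild", 1.42),
                  ("moderate", 2.56), ("severe", 5.26)]:
    print(label, round(1 - surv_under_hr(t, hr), 3))   # 5-year mortality risk
```

Because S0(t) < 1, raising it to a larger hazard ratio always lowers survival, so the mortality ordering across the anaemia categories is preserved at every time horizon.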