852 results for Random effect model
Abstract:
A theory is developed for diffusion-limited charge transfer on a non-fractally rough electrode. The perturbation expressions are obtained for concentration, current density and measured diffusion-limited current for arbitrary one- and two-dimensional surface profiles. The random surface model is employed for a rough electrode/electrolyte interface. In this model the gross geometrical property of an electrochemically active rough surface - the surface structure factor - is related to the average electrode current, current density and concentration. Under short and long time regimes, various morphological features of the rough electrodes, i.e. excess area (related to roughness slope), curvature, correlation length, etc., are related to the (average) current transients. A two-point Padé approximant is used to develop an all-time average current expression in terms of partial morphological features of the rough surface. The inverse problem of predicting the surface structure factor from the observed transients is also described. Finally, the effect of surface roughness is studied for specific surface statistics, namely a Gaussian correlation function. It is shown how the surface roughness enhances the overall diffusion-limited charge transfer current.
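As a point of reference for the short- and long-time limits discussed above, here is a minimal worked equation. It assumes the standard Cottrell response of a smooth planar electrode as the baseline and quotes the commonly used limiting picture rather than the paper's full perturbation result; the excess-area (roughness) factor R* is introduced here for illustration only.

    \[
      j_{\text{planar}}(t) = \frac{n F c_{0}\sqrt{D}}{\sqrt{\pi t}},
      \qquad
      j_{\text{rough}}(t \to 0) \approx R^{*}\, j_{\text{planar}}(t),
      \qquad
      j_{\text{rough}}(t \to \infty) \to j_{\text{planar}}(t),
      \qquad
      R^{*} = \frac{A_{\text{true}}}{A_{\text{projected}}}.
    \]

At short times the diffusion layer is thinner than the roughness features and the current scales with the true surface area; at long times the diffusion layer is much thicker and the planar limit is recovered.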
Abstract:
STUDY QUESTION: Is there an association between high levels of sperm DNA damage and miscarriage? SUMMARY ANSWER: Miscarriage rates are positively correlated with sperm DNA damage levels. WHAT IS KNOWN ALREADY: Most ejaculates contain a subpopulation of sperm with DNA damage, also referred to as DNA fragmentation, in the form of double- or single-strand breaks which have been induced in the DNA prior to or following ejaculation. This DNA damage may be particularly elevated in some subfertile men, hence several studies have examined the link between sperm DNA damage levels and conception and miscarriage rates. STUDY DESIGN, SIZE, DURATION: A systematic review and meta-analysis of studies which examined the effect of sperm DNA damage on miscarriage rates was performed. Searches were conducted on MEDLINE, EMBASE and the Cochrane Library without any language restrictions from database inception to January 2012. PARTICIPANTS/MATERIALS, SETTING, METHODS: We used the terms 'DNA damage' or 'DNA fragmentation' combined with 'miscarriage', 'abortion' or 'pregnancy' to generate a set of relevant citations. Data extraction was performed by two reviewers. Study quality was assessed using the Newcastle-Ottawa Scale. Meta-analysis of relative risks of miscarriage was performed with a random effects model. Subgroup analyses were performed by the type of DNA damage test, whether the sperm examined were prepared or from raw semen, and for pregnancies resulting from IVF or ICSI treatment. MAIN RESULTS AND THE ROLE OF CHANCE: We identified 16 cohort studies (2969 couples), 14 of which were prospective. Eight studies used acridine orange-based assays, six the TUNEL assay and two the COMET assay. Meta-analysis showed a significant increase in miscarriage in patients with high DNA damage compared with those with low DNA damage [risk ratio (RR) = 2.16 (1.54, 3.03), P < 0.00001]. A subgroup analysis showed that the miscarriage association is strongest for the TUNEL assay [RR = 3.94 (2.45, 6.32), P < 0.00001]. LIMITATIONS, REASONS FOR CAUTION: There is some variation in study characteristics, including the use of different assays, different thresholds for DNA damage and the definition of pregnancy loss. WIDER IMPLICATIONS OF THE FINDINGS: The use of methods which select sperm without DNA damage for use in assisted conception treatment may reduce the risk of miscarriage. This finding indicates that assays detecting DNA damage could be considered in those suffering from recurrent pregnancy loss. Further research is necessary to study the mechanisms of DNA damage and the potential therapeutic effects of antioxidant therapy. STUDY FUNDING/COMPETING INTEREST(S): None.
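For readers unfamiliar with the random-effects pooling used in meta-analyses like this one, the following is a minimal Python sketch of DerSimonian-Laird pooling of log risk ratios. The input values are illustrative placeholders, not the review's data.

    import numpy as np

    def pool_random_effects(log_rr, se):
        """DerSimonian-Laird random-effects pooling of study log risk ratios."""
        w = 1.0 / se**2                                   # inverse-variance (fixed-effect) weights
        theta_fe = np.sum(w * log_rr) / np.sum(w)
        q = np.sum(w * (log_rr - theta_fe)**2)            # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)      # between-study variance
        w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
        theta = np.sum(w_star * log_rr) / np.sum(w_star)
        se_theta = np.sqrt(1.0 / np.sum(w_star))
        return np.exp([theta, theta - 1.96 * se_theta, theta + 1.96 * se_theta]), tau2

    # illustrative study-level risk ratios and standard errors (NOT the review's data)
    log_rr = np.log(np.array([1.8, 2.5, 1.4, 3.0, 2.2]))
    se = np.array([0.35, 0.40, 0.30, 0.45, 0.38])
    (rr, lo, hi), tau2 = pool_random_effects(log_rr, se)
    print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.3f}")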
Abstract:
Purpose: We conducted a systematic review and meta-analysis of observational studies to evaluate the effect of oral statins on intraocular pressure (IOP) and the incidence and progression of glaucoma. Methods: This was a systematic review of the literature and meta-analysis. Searches of PubMed/Medline and Embase were conducted to include all types of studies. Gray literature abstracts were also considered for inclusion. Last search date was February 2016. Risk of bias was assessed using the Newcastle-Ottawa scale independently by two reviewers. Odds ratios (OR) or hazard ratios (HR) and 95% confidence intervals (CI) were extracted from each study. Pooled ORs for incidence of glaucoma were calculated using a random-effects model. Results: We identified seven cohort studies, three case–control studies, and one cross-sectional study with a total of 583,615 participants. No randomized controlled trials were retrieved. Pooled ORs demonstrated a statistically significant association between short-term statin use (≤2 years) and reduced incidence of glaucoma (OR 0.96, 95% CI 0.94, 0.99). Pooled ORs of long-term statin use (>2 years) did not demonstrate a statistically significant reduction in incidence of glaucoma (OR 0.70, 95% CI 0.46, 1.06). There was inconsistent evidence for the protective effect of statins against the progression of glaucoma, although there was no standard definition for progression across studies. There was no significant difference in IOP associated with statin use. Conclusions: Short-term statin use is associated with a reduced incidence of glaucoma. The effect of statins on glaucoma progression and IOP is uncertain.
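To complement the pooling sketch above, this hypothetical Python fragment shows how study log odds ratios and their standard errors can be recovered from reported ORs with 95% CIs and then summarized with I² and a 95% prediction interval. The numbers are invented, not those of the included studies.

    import numpy as np
    from scipy import stats

    # invented study ORs with 95% CI bounds (lower, upper) -- placeholders only
    or_ = np.array([0.95, 0.90, 1.02, 0.85])
    lo = np.array([0.90, 0.80, 0.95, 0.70])
    hi = np.array([1.00, 1.01, 1.10, 1.03])

    y = np.log(or_)
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)       # SE of log OR from the CI width

    w = 1.0 / se**2
    theta_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - theta_fe)**2)                 # Cochran's Q
    k = len(y)
    i2 = max(0.0, (q - (k - 1)) / q) * 100            # I^2: variability beyond chance
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
    theta = np.sum(w_star * y) / np.sum(w_star)
    se_theta = np.sqrt(1.0 / np.sum(w_star))
    t_crit = stats.t.ppf(0.975, k - 2)
    pred = np.exp([theta - t_crit * np.sqrt(tau2 + se_theta**2),
                   theta + t_crit * np.sqrt(tau2 + se_theta**2)])
    print(f"I^2 = {i2:.0f}%, pooled OR = {np.exp(theta):.2f}, "
          f"95% prediction interval {pred[0]:.2f}-{pred[1]:.2f}")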
Abstract:
In survival analysis frailty is often used to model heterogeneity between individuals or correlation within clusters. Typically frailty is taken to be a continuous random effect, yielding a continuous mixture distribution for survival times. A Bayesian analysis of a correlated frailty model is discussed in the context of inverse Gaussian frailty. An MCMC approach is adopted and the deviance information criterion is used to compare models. As an illustration of the approach a bivariate data set of corneal graft survival times is analysed.
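A minimal simulation sketch of the frailty idea follows. It is not the paper's Bayesian/MCMC analysis, and it uses a shared rather than correlated inverse Gaussian frailty for brevity; all parameter values are assumptions.

    import numpy as np

    rng = np.random.default_rng(2006)

    def simulate_pairs(n_pairs, base_rate=0.1, mu=1.0, lam=2.0):
        """Paired survival times whose hazards share an inverse Gaussian frailty.
        Conditional on the frailty z, each time is exponential with rate base_rate * z."""
        z = rng.wald(mu, lam, size=n_pairs)            # inverse Gaussian frailty, mean mu
        t1 = rng.exponential(1.0 / (base_rate * z))    # e.g. survival of the first graft
        t2 = rng.exponential(1.0 / (base_rate * z))    # second graft, correlated through z
        return t1, t2

    t1, t2 = simulate_pairs(1000)
    ranks = lambda x: np.argsort(np.argsort(x))
    print("Spearman correlation induced by the shared frailty:",
          np.corrcoef(ranks(t1), ranks(t2))[0, 1].round(2))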
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
It is believed that habitat heterogeneity can change the extent of predator-prey interactions. In this study we therefore examined the effect of habitat heterogeneity (characterized here as the addition of a refuge) on D. ater predation on M. domestica. Predation trials of D. ater on M. domestica larvae were carried out in experimental habitats with and without a refuge and at different prey densities. The number of prey eaten by beetles over 24 h of predator-prey interaction was recorded, and the strength of the interaction in both experimental habitats was assessed by determining the predator functional response. The mean number of prey eaten by beetles in the presence of the refuge was significantly higher than in its absence. Females had greater weight gains than males. Logistic regression analyses revealed a type II functional response in both experimental habitats, even though the data did not fit the random predator model well. The results suggest that the addition of the refuge in fact enhanced predation, as prey consumption increased in its presence. Predators kept with the refuge also consumed more prey at high prey densities. We therefore concluded that the addition of a refuge was an important component mediating D. ater-M. domestica population interactions. The structure actually acted as a refuge for the predators from the prey, since prey behaviors detrimental to predators were reduced.
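As a concrete illustration of fitting a type II functional response, here is a short Python sketch using Holling's disc equation (without prey depletion, so simpler than the random predator model mentioned above). The prey densities and counts are invented, not the experimental data.

    import numpy as np
    from scipy.optimize import curve_fit

    def holling_type_ii(N, a, h):
        """Holling type II disc equation for a 24 h (T = 1 day) trial:
        prey eaten = a*N / (1 + a*h*N), with attack rate a and handling time h (days/prey)."""
        return a * N / (1.0 + a * h * N)

    densities = np.array([5, 10, 20, 40, 80], dtype=float)   # hypothetical initial prey densities
    eaten = np.array([3.1, 5.4, 8.2, 10.9, 12.6])            # hypothetical mean numbers eaten

    (a_hat, h_hat), _ = curve_fit(holling_type_ii, densities, eaten, p0=[0.5, 0.05])
    print(f"attack rate a = {a_hat:.2f} per day, handling time h = {h_hat:.3f} days/prey")
    print(f"asymptotic maximum consumption ~ 1/h = {1.0 / h_hat:.1f} prey/day")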
Abstract:
Objective: To measure 2-week postoperative sensitivity in Class II composite restorations placed with a self-etching adhesive (Clearfil SE Bond) or a total-etch adhesive (Prime&Bond NT) with or without a flowable composite as cervical increment. Method and materials: Upon approval by the University of Guarulhos Committee on Human Subjects, 100 restorations were inserted in 46 patients who required Class II restorations in their molars and premolars. Enamel and dentin walls were conditioned with a self-etching primer (for Clearfil SE Bond) or etched with 34% phosphoric acid (for Prime&Bond NT). A 1- to 2-mm-thick increment of a flowable composite (Filtek Flow) was used in the proximal box in 50% of the restorations of each adhesive. Preparations were restored with a packable composite (Surefil). The restorations were evaluated preoperatively and 2 weeks postoperatively for sensitivity to cold, air, and masticatory forces using a visual analog scale. Marginal integrity of the accessible margins was also evaluated. Statistical analysis used a mixed linear model with subject as a random effect. Results: Ninety-eight teeth from 44 subjects were observed at 2 weeks. The type of adhesive and use of flowable composite had no significant effects or interaction for any of the four outcomes of interest, ie, change from baseline to 2 weeks in sensitivity and response time for the cold or air stimulus. For the air stimulus, the overall average change from baseline was not significant for either sensitivity or response time. For the cold stimulus, the overall average change from baseline was significant for both sensitivity and response time. No case of sensitivity to masticatory forces was observed. Conclusion: No differences in postoperative sensitivity were observed between a self-etch adhesive and a total-etch adhesive at 2 weeks. The use of flowable composite did not decrease postoperative sensitivity.
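A minimal sketch of the kind of mixed linear model described in the statistical analysis (patient as a random intercept), fitted here with statsmodels on simulated data. Variable names and values are assumptions, not the trial data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_subj, per_subj = 44, 2
    subject = np.repeat(np.arange(n_subj), per_subj)
    adhesive = rng.choice(["self_etch", "total_etch"], size=n_subj * per_subj)
    flowable = rng.choice(["yes", "no"], size=n_subj * per_subj)
    subj_re = rng.normal(0, 5, n_subj)[subject]               # random intercept per patient
    change = subj_re + rng.normal(0, 8, n_subj * per_subj)    # simulated change in VAS sensitivity

    df = pd.DataFrame({"change": change, "adhesive": adhesive,
                       "flowable": flowable, "subject": subject})

    # fixed effects for adhesive, flowable and their interaction; random intercept for subject
    fit = smf.mixedlm("change ~ adhesive * flowable", data=df, groups=df["subject"]).fit()
    print(fit.summary())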
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In the present work we perform an econometric analysis of the Tribal art market. To this aim, we use a unique and original database that includes information on Tribal art market auctions worldwide from 1998 to 2011. In the literature, art prices are modelled through the hedonic regression model, a classic fixed-effect model. The main drawback of the hedonic approach is the large number of parameters, since, in general, art data include many categorical variables. In this work, we propose a multilevel model for the analysis of Tribal art prices that takes into account the influence of time on artwork prices. In fact, it is natural to assume that time exerts an influence over the price dynamics in various ways. Nevertheless, since the set of objects changes at every auction date, we do not have repeated measurements of the same items over time. Hence, the dataset does not constitute a proper panel; rather, it has a two-level structure in which items, the level-1 units, are grouped in time points, the level-2 units. The main theoretical contribution is the extension of classical multilevel models to cope with the case described above. In particular, we introduce a model with time-dependent random effects at the second level. We propose a novel specification of the model, derive the maximum likelihood estimators and implement them through the E-M algorithm. We test the finite sample properties of the estimators and the validity of our own R code by means of a simulation study. Finally, we show that the new model considerably improves the fit of the Tribal art data with respect to both the hedonic regression model and the classic multilevel model.
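To make the idea of time-dependent level-2 random effects concrete, here is a small simulation sketch in Python: items nested in auction dates, with the date effects following an AR(1) process. All values are assumptions, not estimates from the Tribal art data.

    import numpy as np

    rng = np.random.default_rng(1998)

    n_dates, items_per_date = 40, 25
    rho, sigma_u, sigma_e = 0.6, 0.30, 0.50          # AR(1) coefficient, level-2 SD, level-1 SD

    # level 2: auction-date random effects that are serially correlated over time
    u = np.empty(n_dates)
    u[0] = rng.normal(0.0, sigma_u)
    for t in range(1, n_dates):
        u[t] = rho * u[t - 1] + rng.normal(0.0, sigma_u * np.sqrt(1.0 - rho**2))

    # level 1: log prices of the items sold at each date (hedonic fixed part held constant here)
    log_price = np.repeat(u, items_per_date) + rng.normal(0.0, sigma_e, n_dates * items_per_date)

    date_means = log_price.reshape(n_dates, items_per_date).mean(axis=1)
    print("lag-1 autocorrelation of auction-date mean log prices:",
          np.corrcoef(date_means[:-1], date_means[1:])[0, 1].round(2))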
Abstract:
BACKGROUND Several treatment strategies are available for adults with advanced-stage Hodgkin's lymphoma, but studies assessing two alternative standards of care - increased-dose bleomycin, etoposide, doxorubicin, cyclophosphamide, vincristine, procarbazine, and prednisone (BEACOPPescalated), and doxorubicin, bleomycin, vinblastine, and dacarbazine (ABVD) - were not powered to test differences in overall survival. To guide treatment decisions in this population of patients, we did a systematic review and network meta-analysis to identify the best initial treatment strategy. METHODS We searched the Cochrane Library, Medline, and conference proceedings for randomised controlled trials published between January, 1980, and June, 2013, that assessed overall survival in patients with advanced-stage Hodgkin's lymphoma given BEACOPPbaseline, BEACOPPescalated, BEACOPP variants, ABVD, cyclophosphamide (mechlorethamine), vincristine, procarbazine, and prednisone (C[M]OPP), hybrid or alternating chemotherapy regimens with ABVD as the backbone (eg, COPP/ABVD, MOPP/ABVD), or doxorubicin, vinblastine, mechlorethamine, vincristine, bleomycin, etoposide, and prednisone combined with radiation therapy (the Stanford V regimen). We assessed studies for eligibility, extracted data, and assessed their quality. We then pooled the data and used a Bayesian random-effects model to combine direct comparisons with indirect evidence. We also reconstructed individual patient survival data from published Kaplan-Meier curves and did standard random-effects Poisson regression. Results are reported relative to ABVD. The primary outcome was overall survival. FINDINGS We screened 2055 records and identified 75 papers covering 14 eligible trials that assessed 11 different regimens in 9993 patients, providing 59 651 patient-years of follow-up. 1189 patients died, and the median follow-up was 5.9 years (IQR 4.9-6.7). Included studies were of high methodological quality, and between-trial heterogeneity was negligible (τ² = 0.01). Overall survival was highest in patients who received six cycles of BEACOPPescalated (HR 0.38, 95% credibility interval [CrI] 0.20-0.75). Compared with a 5-year survival of 88% for ABVD, the survival benefit for six cycles of BEACOPPescalated is 7% (95% CrI 3-10) - ie, a 5-year survival of 95%. Reconstructed individual survival data showed that, at 5 years, BEACOPPescalated has a 10% (95% CI 3-15) advantage over ABVD in overall survival. INTERPRETATION Six cycles of BEACOPPescalated significantly improves overall survival compared with ABVD and other regimens, and thus we recommend this treatment strategy as standard of care for patients with access to the appropriate supportive care.
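The network meta-analysis itself is Bayesian and beyond a short fragment, but the basic logic of combining direct comparisons through a common comparator can be sketched with a Bucher-style adjusted indirect comparison in Python. The hazard ratios below are invented placeholders, not trial results.

    import numpy as np

    def log_hr_from_ci(hr, lo, hi):
        """Log hazard ratio and its standard error recovered from a reported 95% CI."""
        return np.log(hr), (np.log(hi) - np.log(lo)) / (2 * 1.96)

    # invented direct estimates: A vs B and B vs C share the common comparator B
    log_ab, se_ab = log_hr_from_ci(0.70, 0.50, 0.98)
    log_bc, se_bc = log_hr_from_ci(0.80, 0.60, 1.07)

    # indirect estimate of A vs C: log HRs add, variances add
    log_ac = log_ab + log_bc
    se_ac = np.sqrt(se_ab**2 + se_bc**2)
    hr_ac = np.exp(log_ac)
    ci = np.exp([log_ac - 1.96 * se_ac, log_ac + 1.96 * se_ac])
    print(f"indirect HR (A vs C) = {hr_ac:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")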
Abstract:
This study investigates a theoretical model in which a longitudinal process, namely a stationary Markov chain, and a Weibull survival process share a bivariate random effect. Furthermore, a quality-of-life adjusted survival is calculated as the weighted sum of survival time. Theoretical values of the population mean adjusted survival of the described model are computed numerically. The parameters of the bivariate random effect do significantly affect the theoretical values of the population mean. Maximum likelihood and Bayesian methods are applied to simulated data to estimate the model parameters. Based on the parameter estimates, the predicted population mean adjusted survival can then be calculated numerically and compared with the theoretical values. The Bayesian method and the maximum likelihood method provide parameter estimates and population mean predictions with comparable accuracy; however, the Bayesian method suffers from poor convergence due to autocorrelation and inter-variable correlation.
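A minimal simulation sketch of the joint-model structure follows: a two-state quality-of-life Markov chain and a Weibull survival time share a bivariate normal random effect, and adjusted survival is accumulated as a weighted sum of time in each state. The state weights, Weibull parameters and covariance matrix are assumptions, not the paper's values.

    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_subject(qol=(1.0, 0.5), shape=1.3, scale=5.0,
                         cov=((0.4, 0.2), (0.2, 0.3))):
        """One subject: a two-state QoL Markov chain and a Weibull survival time
        that share a bivariate normal random effect (b_long, b_surv)."""
        b_long, b_surv = rng.multivariate_normal([0.0, 0.0], cov)
        p_stay = 1.0 / (1.0 + np.exp(-(1.5 + b_long)))        # prob. of staying in the 'good' state
        t_death = scale * np.exp(b_surv) * rng.weibull(shape)  # random effect scales the Weibull time
        state, adj_surv = 0, 0.0
        for _ in range(int(np.floor(t_death))):               # yearly cycles until death
            adj_surv += qol[state]                            # weighted sum of time in each state
            if state == 0 and rng.random() > p_stay:
                state = 1
        return t_death, adj_surv

    sims = np.array([simulate_subject() for _ in range(2000)])
    print("mean survival:", sims[:, 0].mean().round(2),
          "| mean QoL-adjusted survival:", sims[:, 1].mean().round(2))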
Abstract:
Motivation: The clustering of gene profiles across some experimental conditions of interest contributes significantly to the elucidation of unknown gene function, the validation of gene discoveries and the interpretation of biological processes. However, this clustering problem is not straightforward as the profiles of the genes are not all independently distributed and the expression levels may have been obtained from an experimental design involving replicated arrays. Ignoring the dependence between the gene profiles and the structure of the replicated data can result in important sources of variability in the experiments being overlooked in the analysis, with the consequent possibility of misleading inferences being made. We propose a random-effects model that provides a unified approach to the clustering of genes with correlated expression levels measured in a wide variety of experimental situations. Our model is an extension of the normal mixture model to account for the correlations between the gene profiles and to enable covariate information to be incorporated into the clustering process. Hence the model is applicable to longitudinal studies with or without replication, for example, time-course experiments by using time as a covariate, and to cross-sectional experiments by using categorical covariates to represent the different experimental classes. Results: We show that our random-effects model can be fitted by maximum likelihood via the EM algorithm for which the E(expectation) and M(maximization) steps can be implemented in closed form. Hence our model can be fitted deterministically without the need for time-consuming Monte Carlo approximations. The effectiveness of our model-based procedure for the clustering of correlated gene profiles is demonstrated on three real datasets, representing typical microarray experimental designs, covering time-course, repeated-measurement and cross-sectional data. In these examples, relevant clusters of the genes are obtained, which are supported by existing gene-function annotation. A synthetic dataset is considered too.
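To make the E- and M-steps concrete, here is a minimal EM sketch for a plain spherical Gaussian mixture over gene profiles. It is not the authors' random-effects extension (which additionally models profile correlation and covariates); the synthetic data are placeholders.

    import numpy as np

    def em_gmm(X, k, n_iter=100, seed=0):
        """EM for a spherical Gaussian mixture over gene expression profiles.
        E-step: posterior cluster responsibilities; M-step: closed-form updates."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        mu = X[rng.choice(n, k, replace=False)]          # initial cluster means
        pi, var = np.full(k, 1.0 / k), np.full(k, X.var())
        for _ in range(n_iter):
            # E-step: log responsibilities under spherical Gaussians
            d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)            # (n, k) squared distances
            log_r = np.log(pi) - 0.5 * p * np.log(2 * np.pi * var) - d2 / (2 * var)
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: closed-form updates of mixing weights, means and variances
            nk = r.sum(axis=0)
            pi = nk / n
            mu = (r.T @ X) / nk[:, None]
            d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
            var = (r * d2).sum(axis=0) / (p * nk)
        return r.argmax(axis=1), mu

    data_rng = np.random.default_rng(1)
    X = np.vstack([data_rng.normal(m, 1.0, size=(50, 6)) for m in (-2.0, 0.0, 2.0)])
    labels, centers = em_gmm(X, k=3)
    print(np.bincount(labels))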
Abstract:
The aim of this paper is to provide a contemporary summary of statistical and non-statistical meta-analytic procedures that have relevance to the type of experimental designs often used by sport scientists when examining differences/change in dependent measure(s) as a result of one or more independent manipulation(s). Using worked examples from studies on observational learning in the motor behaviour literature, we adopt a random effects model and give a detailed explanation of the statistical procedures for the three types of raw score difference-based analyses applicable to between-participant, within-participant, and mixed-participant designs. Major merits and concerns associated with these quantitative procedures are identified and agreed methods are reported for minimizing biased outcomes, such as those for dealing with multiple dependent measures from single studies, design variation across studies, different metrics (i.e. raw scores and difference scores), and variations in sample size. To complement the worked examples, we summarize the general considerations required when conducting and reporting a meta-analysis, including how to deal with publication bias, what information to present regarding the primary studies, and approaches for dealing with outliers. By bringing together these statistical and non-statistical meta-analytic procedures, we provide the tools required to clarify understanding of key concepts and principles.
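As a worked example of one of the raw-score difference analyses, here is a short Python sketch computing Hedges' g and its sampling variance for the between-participant case. The summary statistics are invented placeholders; within- and mixed-participant designs additionally require the correlation between measures and are not shown.

    import numpy as np

    def hedges_g(m1, s1, n1, m2, s2, n2):
        """Standardized raw-score difference for a between-participant design,
        with the small-sample (Hedges) correction and its sampling variance."""
        sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))   # pooled SD
        d = (m1 - m2) / sp
        j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)                           # bias correction
        g = j * d
        var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2)))
        return g, var_g

    # invented group summaries: observational-learning group vs control
    g, v = hedges_g(m1=12.4, s1=3.1, n1=20, m2=10.1, s2=3.4, n2=22)
    print(f"g = {g:.2f}, 95% CI {g - 1.96*np.sqrt(v):.2f} to {g + 1.96*np.sqrt(v):.2f}")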