219 results for Exponential smoothing methods
Abstract:
PURPOSE: Nonvisual light-dependent functions in humans are conveyed mainly by intrinsically photosensitive retinal ganglion cells, which express melanopsin as their photopigment. We aimed to identify the effects of circadian phase and sleepiness across 24 hours on various aspects of the pupil response to light stimulation. METHODS: We tested 10 healthy adults hourly in two 12-hour sessions covering a 24-hour period. Pupil responses to narrow-bandwidth red (635 ± 18 nm) and blue (463 ± 24 nm) light (durations of 1 and 30 seconds) at equal photon fluxes were recorded and correlated with salivary melatonin concentrations at the same circadian phases and with subjective sleepiness ratings. The magnitude of pupil constriction was determined from the minimal pupil size. The post-stimulus pupil response was assessed from the pupil size at 6 seconds after light offset, the area within the redilation curve, and the exponential rate of redilation. RESULTS: Among the measured parameters, the pupil size 6 seconds after light offset correlated with melatonin concentrations (P < 0.05) and showed a significant modulation over 24 hours, with maximal values after the nocturnal peak of melatonin secretion. In contrast, the post-stimulus pupil response following red light stimulation correlated with subjective sleepiness (P < 0.05) without significant changes over 24 hours. CONCLUSIONS: The post-stimulus pupil response to blue light, as a marker of intrinsic melanopsin activity, showed a circadian modulation. In contrast, the effect of sleepiness was more apparent in the cone contribution to the pupil response. Thus, pupillary responsiveness to light is under the influence of the endogenous circadian clock and subjective sleepiness.
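As an aside for readers who want to reproduce the post-stimulus measures described in this abstract (pupil size 6 seconds after light offset, area within the redilation curve, exponential rate of redilation), a minimal sketch follows. It is not the authors' analysis code: the single-exponential recovery model, the sampling, and the baseline handling are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def redilation_model(t, baseline, amplitude, tau):
    """Assumed single-exponential recovery toward baseline after light offset."""
    return baseline - amplitude * np.exp(-t / tau)

def post_stimulus_metrics(t, pupil, baseline):
    """Compute the three post-stimulus measures named in the abstract.

    t        : time in seconds after light offset (ascending)
    pupil    : pupil diameter, one sample per time point
    baseline : pre-stimulus pupil diameter used as the reference level
    """
    # Pupil size 6 s after light offset (linear interpolation between samples).
    size_at_6s = float(np.interp(6.0, t, pupil))

    # Area enclosed between the redilation curve and the baseline level.
    area = float(np.trapz(baseline - pupil, t))

    # Exponential rate of redilation, from a least-squares fit of the model above.
    p0 = [baseline, baseline - pupil[0], 2.0]
    (fit_baseline, amplitude, tau), _ = curve_fit(redilation_model, t, pupil, p0=p0)
    rate = 1.0 / tau  # per second

    return {"size_at_6s": size_at_6s, "area": area, "redilation_rate": rate}

if __name__ == "__main__":
    # Synthetic example: constriction recovering toward a 6 mm baseline.
    t = np.linspace(0.0, 20.0, 400)
    pupil = 6.0 - 2.0 * np.exp(-t / 3.0) + np.random.normal(0.0, 0.02, t.size)
    print(post_stimulus_metrics(t, pupil, baseline=6.0))
```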
Abstract:
The ability to determine the location and relative strength of all transcription-factor binding sites in a genome is important both for a comprehensive understanding of gene regulation and for effective promoter engineering in biotechnological applications. Here we present a bioinformatically driven experimental method to accurately define the DNA-binding sequence specificity of transcription factors. A generalized profile was used as a predictive quantitative model for binding sites, and its parameters were estimated from in vitro-selected ligands using standard hidden Markov model training algorithms. Computer simulations showed that several thousand low- to medium-affinity sequences are required to generate a profile of desired accuracy. To produce data on this scale, we applied high-throughput genomics methods to the biochemical problem addressed here. A method combining systematic evolution of ligands by exponential enrichment (SELEX) and serial analysis of gene expression (SAGE) protocols was coupled to an automated quality-controlled sequence extraction procedure based on Phred quality scores. This allowed the sequencing of a database of more than 10,000 potential DNA ligands for the CTF/NFI transcription factor. The resulting binding-site model defines the sequence specificity of this protein with a high degree of accuracy not achieved earlier and thereby makes it possible to identify previously unknown regulatory sequences in genomic DNA. A covariance analysis of the selected sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism.
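To give a rough, self-contained illustration of how a quantitative binding-site model can be estimated from a set of selected ligands and then used to score candidate sites, here is a simplified sketch. The actual study trained a generalized profile with hidden Markov model algorithms on more than 10,000 SELEX-SAGE sequences; the position weight matrix below, with its pseudocount and uniform background, is only a stand-in for that procedure, and the toy ligand set is invented.

```python
import math
from collections import Counter

BASES = "ACGT"

def build_pwm(sites, pseudocount=1.0, background=0.25):
    """Estimate per-position log-odds weights from aligned, equal-length ligand sequences."""
    length = len(sites[0])
    pwm = []
    for pos in range(length):
        counts = Counter(site[pos] for site in sites)
        total = sum(counts[b] for b in BASES) + 4 * pseudocount
        pwm.append({b: math.log2((counts[b] + pseudocount) / total / background)
                    for b in BASES})
    return pwm

def score(pwm, sequence):
    """Sum of per-position log-odds scores for a candidate site of matching length."""
    return sum(col[base] for col, base in zip(pwm, sequence))

if __name__ == "__main__":
    # Toy set of selected ligands (the study sequenced >10,000 such sites).
    selected = ["TTGGCA", "TTGGCT", "TTGGCA", "ATGGCA", "TTGCCA"]
    pwm = build_pwm(selected)
    print(round(score(pwm, "TTGGCA"), 2), round(score(pwm, "GACTAG"), 2))
```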
Abstract:
"Most quantitative empirical analyses are motivated by the desire to estimate the causal effect of an independent variable on a dependent variable. Although the randomized experiment is the most powerful design for this task, in most social science research done outside of psychology, experimental designs are infeasible. (Winship & Morgan, 1999, p. 659)." This quote from earlier work by Winship and Morgan, which was instrumental in setting the groundwork for their book, captures the essence of our review of Morgan and Winship's book: It is about causality in nonexperimental settings.
Abstract:
The method of instrumental variables (referred to as Mendelian randomization when the instrument is a genetic variant) was initially developed to infer the causal effect of a risk factor on some outcome of interest in a linear model. Adapting this method to nonlinear models, however, is known to be problematic. In this paper, we consider the simple case in which the genetic instrument, the risk factor, and the outcome are all binary. We compare via simulations the usual two-stage estimate of a causal odds ratio and its adjusted version with a recently proposed estimate developed in the context of a clinical trial with noncompliance. In contrast to the former two, we confirm that the latter is (under some conditions) a valid estimate of a causal odds ratio defined in the subpopulation of compliers, and we propose its use in the context of Mendelian randomization. By analogy with a clinical trial with noncompliance, compliers are those individuals for whom the presence/absence of the risk factor X is determined by the presence/absence of the genetic variant Z (i.e., for whom we would observe X = Z whatever the alleles randomly received at conception). We also recall and illustrate the huge variability of instrumental variable estimates when the instrument is weak (i.e., with a low percentage of compliers, as is typically the case with genetic instruments, for which this proportion is frequently smaller than 10%): the interquartile range of our simulated estimates was up to 18 times larger than with a conventional (e.g., intention-to-treat) approach. We thus conclude that the need to find stronger instruments is probably as important as the need to develop a methodology that consistently estimates a causal odds ratio.
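The weak-instrument variability discussed in this abstract is easy to reproduce in a small simulation. The sketch below is not the estimator evaluated in the paper; it uses an invented data-generating process with a binary instrument Z, exposure X, outcome Y, and an unobserved confounder U, and compares the spread of a simple two-stage estimate of the causal log odds ratio with an intention-to-treat-style estimate as the proportion of compliers shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, p_complier):
    """Binary instrument Z, exposure X, outcome Y with an unobserved confounder U.

    A fraction `p_complier` of subjects have X determined by Z (compliers);
    the rest have X driven by U. Illustrative assumption, not the paper's model.
    """
    z = rng.integers(0, 2, n)
    u = rng.integers(0, 2, n)
    complier = rng.random(n) < p_complier
    x = np.where(complier, z, u)
    # Outcome depends on X (causal effect) and on U (confounding).
    p_y = 1.0 / (1.0 + np.exp(-(-1.0 + 1.0 * x + 1.5 * u)))
    y = rng.random(n) < p_y
    return z, x, y

def log_odds(p):
    return np.log(p / (1.0 - p))

def two_stage_log_or(z, x, y):
    """Two-stage estimate of the causal log odds ratio with a binary instrument:
    slope of logit P(Y) against the Z-specific fitted exposure probabilities."""
    p_x1, p_x0 = x[z == 1].mean(), x[z == 0].mean()
    p_y1, p_y0 = y[z == 1].mean(), y[z == 0].mean()
    return (log_odds(p_y1) - log_odds(p_y0)) / (p_x1 - p_x0)

def itt_log_or(z, x, y):
    """Intention-to-treat style log odds ratio of Y on Z, ignoring X."""
    p_y1, p_y0 = y[z == 1].mean(), y[z == 0].mean()
    return log_odds(p_y1) - log_odds(p_y0)

if __name__ == "__main__":
    for p_complier in (0.5, 0.1):  # strong vs. weak instrument
        iv = [two_stage_log_or(*simulate(5000, p_complier)) for _ in range(500)]
        itt = [itt_log_or(*simulate(5000, p_complier)) for _ in range(500)]
        iqr = lambda v: np.subtract(*np.percentile(v, [75, 25]))
        print(f"compliers={p_complier:.0%}  IQR(two-stage)={iqr(iv):.2f}  "
              f"IQR(ITT)={iqr(itt):.2f}")
```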
Abstract:
The pursuit of high response rates to minimise the threat of nonresponse bias continues to dominate decisions about resource allocation in survey research, yet a growing body of research has begun to question this practice. In this study, we use previously unavailable data from a new sampling frame based on population registers to assess the value of different methods designed to increase response rates on the European Social Survey in Switzerland. Sampling-frame data provide information about both respondents and nonrespondents, making it possible to examine how changes in response rates resulting from different fieldwork methods relate to changes in the composition and representativeness of the responding sample. We compute an R-indicator to assess representativity with respect to the register variables and find little improvement in sample composition as response rates increase. We then examine the impact of response rate increases on the risk of nonresponse bias, based on the Maximal Absolute Bias (MAB) and on coefficients of variation between subgroup response rates, alongside the associated costs of different types of fieldwork effort. The results show that increases in response rate help to reduce the MAB, while only small but important improvements to sample representativity are gained by varying the type of effort. These findings lend further support to research that has called into question the value of extensive investment in procedures aimed at reaching response rate targets, and they underline the need for more tailored fieldwork strategies aimed both at reducing survey costs and at minimising the risk of bias.
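For readers unfamiliar with the representativity measures mentioned here, the sketch below computes an R-indicator and the maximal absolute bias from estimated response propensities, using the commonly cited definitions R = 1 − 2·S(ρ̂) and MAB = S(ρ̂)/ρ̄. The propensity estimate (subgroup response rates based on register variables) and the toy data are assumptions for illustration; the study itself will have used its own propensity model.

```python
import numpy as np

def r_indicator(propensities):
    """Sample-based R-indicator: R = 1 - 2 * S(rho_hat)."""
    return 1.0 - 2.0 * np.std(propensities, ddof=1)

def maximal_absolute_bias(propensities):
    """Commonly used upper bound on (standardised) nonresponse bias: S(rho_hat) / mean(rho_hat)."""
    return np.std(propensities, ddof=1) / np.mean(propensities)

def propensities_from_subgroups(groups, responded):
    """Assign each sampled unit the response rate of its register subgroup as a
    crude propensity estimate (a richer model would use logistic regression)."""
    groups = np.asarray(groups)
    responded = np.asarray(responded, dtype=float)
    rates = {g: responded[groups == g].mean() for g in np.unique(groups)}
    return np.array([rates[g] for g in groups])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy register: four demographic subgroups with unequal response rates.
    groups = rng.integers(0, 4, 2000)
    true_rate = np.array([0.35, 0.50, 0.55, 0.70])[groups]
    responded = rng.random(2000) < true_rate
    rho = propensities_from_subgroups(groups, responded)
    print(f"response rate = {responded.mean():.2f}, "
          f"R = {r_indicator(rho):.3f}, MAB = {maximal_absolute_bias(rho):.3f}")
```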
Abstract:
PURPOSE: To compare different techniques for positive contrast imaging of susceptibility markers with MRI for three-dimensional visualization. As several different techniques have been reported, the choice of a suitable method depends on its properties with regard to the amount of positive contrast and the desired background suppression, as well as on other imaging constraints needed for a specific application. MATERIALS AND METHODS: Six positive contrast techniques were investigated for their ability to image a single susceptibility marker in vitro at 3 Tesla. The white marker method (WM), susceptibility gradient mapping (SGM), inversion recovery with on-resonant water suppression (IRON), frequency-selective excitation (FSX), fast low flip-angle positive contrast SSFP (FLAPS), and iterative decomposition of water and fat with echo asymmetry and least-squares estimation (IDEAL) were implemented and investigated. RESULTS: The methods were compared with respect to the volume of positive contrast, the product of volume and signal intensity, imaging time, and the level of background suppression. Quantitative results are provided, and the strengths and weaknesses of the different approaches are discussed. CONCLUSION: The appropriate choice of positive contrast imaging technique depends on the desired level of background suppression, acquisition speed, and robustness against artifacts, for which in vitro comparative data are now available.
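Two of the comparison metrics named in the abstract, the volume of positive contrast and the product of volume and signal intensity, can be computed from a reconstructed image in a few lines. The sketch below is only one possible reading of those metrics: the thresholding rule, the background region, and the voxel size are assumptions, not the criteria used in the paper.

```python
import numpy as np

def positive_contrast_metrics(image, background, voxel_volume_mm3, k=5.0):
    """Crude comparison metrics for a positive-contrast acquisition.

    A voxel counts as positive contrast when its signal exceeds the background
    mean by k standard deviations (an assumed criterion for illustration).
    """
    threshold = background.mean() + k * background.std()
    mask = image > threshold
    volume = mask.sum() * voxel_volume_mm3          # volume of positive contrast
    mean_signal = image[mask].mean() if mask.any() else 0.0
    return volume, volume * mean_signal             # volume, volume x signal intensity

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    img = rng.normal(100.0, 5.0, (32, 32, 32))      # suppressed background
    img[14:18, 14:18, 14:18] += 400.0               # bright region near the marker
    vol, vol_sig = positive_contrast_metrics(img, img[:8, :8, :8], voxel_volume_mm3=1.0)
    print(f"positive-contrast volume = {vol:.0f} mm^3, volume x signal = {vol_sig:.0f}")
```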
Abstract:
INTRODUCTION: Radiosurgery (RS) is gaining increasing acceptance in the upfront management of brain metastases (BM). It was initially used for so-called radioresistant metastases (melanoma, renal cell, sarcoma) because it allowed delivering a higher dose to the tumor. Now, RS is also used for BM of other cancers. The high risk of new BM raises the question of whether associated whole-brain radiotherapy (WBRT) is needed. Recent evidence suggests that RS alone avoids the cognitive impairment related to WBRT, which should therefore be reserved for salvage therapy. The increasing use of RS for single and multiple BM thus raises new technical challenges for treatment delivery and dosimetry. We present our single-institution experience, focusing on the criteria that led to selecting patients for RS treatment with Gamma Knife (GK) in lieu of Linac. METHODS: A Leksell Gamma Knife Perfexion (Elekta, Sweden) was installed in July 2010. Currently, Swiss federal health care covers the costs of RS for BM with Linac but not with GK. Therefore, in our center, we always first consider the possibility of using Linac for this indication and only select patients for GK in specific situations. All cases of BM treated with GK were retrospectively reviewed for the criteria leading to the GK indication, clinical information, and treatment data. Further work in progress includes an a posteriori dosimetric comparison with our Linac planning system (Brainscan V.5.3, Brainlab, Germany). RESULTS: From July 2010 to March 2012, 20 patients had RS for BM with GK (7 patients with single BM and 13 with multiple BM). During the same period, 31 had Linac-based RS. The primary tumor was melanoma in 9, lung in 7, renal in 2, and gastrointestinal tract in 2 patients. For single BM, the reason for choosing GK was an anatomical location close to, or in, highly functional areas (1 motor cortex, 1 thalamic, 1 ventricular, 1 mesio-temporal, 3 deep cerebellar close to the brainstem), especially since most of these tumors were intended to be treated with high-dose RS (24 Gy at the margin) because of their histology (3 melanomas, 1 renal cell). For multiple BM, the reason for choosing GK in relation to the anatomical location of the lesions was either technical (limitations of Linac movements, especially in lower posterior fossa locations) or the closeness of multiple lesions to highly functional areas (typically, multiple posterior fossa BM close to the brainstem), precluding optimal dosimetry with Linac. Again, this was more critical for multiple BM needing high-dose RS (6 melanoma, 2 hypernephroma). CONCLUSION: Radiosurgery for BM may represent a technical challenge in relation to the anatomical location and multiplicity of the lesions. These considerations may be accentuated for so-called radioresistant BM, when higher-dose RS is needed. In our experience, the Leksell Gamma Knife Perfexion proves useful in addressing these challenges for the treatment of BM.
Abstract:
Question: When multiple observers record the same spatial units of alpine vegetation, how much variation is there in the records, and what are the consequences of this variation for monitoring schemes designed to detect change? Location: One test summit in Switzerland (Alps) and one test summit in Scotland (Cairngorm Mountains). Method: Eight observers used the GLORIA protocols to record species composition and visual cover estimates in percent on large summit sections (>100 m2), and species composition and frequency in nested quadrats (1 m2). Results: The multiple records from the same spatial unit for species composition and species cover showed considerable variation in the two countries. Estimates of pseudoturnover of composition and coefficients of variation of cover estimates for vascular plant species in 1 m x 1 m quadrats showed less variation than previously published reports, whereas our results for the larger sections were broadly in line with previous reports. In Scotland, estimates for bryophytes and lichens were more variable than for vascular plants. Conclusions: Statistical power calculations indicated that, unless large numbers of plots were used, changes in cover or frequency were only likely to be detected for abundant species (exceeding 10% cover) or if relative changes were large (50% or more). Lower variation could be achieved with point methods and with larger numbers of small plots. However, as summits often differ strongly from one another, additional summits cannot be treated as a way of increasing statistical power without introducing an additional component of variance into the analysis and hence into the power calculations.
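Two of the between-observer statistics used in this study can be written down compactly. The sketch below computes pseudoturnover between two observers' species lists (species recorded by only one observer, relative to the summed richness, expressed as a percentage) and the coefficient of variation of repeated cover estimates; applying pseudoturnover pairwise and the example species are illustrative assumptions, not data from the study.

```python
import statistics

def pseudoturnover(list_a, list_b):
    """Pseudoturnover between two observers' species lists (percent):
    species recorded by only one observer, relative to the summed richness."""
    a, b = set(list_a), set(list_b)
    only_one = len(a - b) + len(b - a)
    return 100.0 * only_one / (len(a) + len(b))

def cover_cv(cover_estimates):
    """Coefficient of variation (percent) of one species' cover across observers."""
    mean = statistics.mean(cover_estimates)
    return 100.0 * statistics.stdev(cover_estimates) / mean

if __name__ == "__main__":
    obs1 = ["Carex curvula", "Salix herbacea", "Poa alpina"]
    obs2 = ["Carex curvula", "Salix herbacea", "Luzula spicata"]
    print(f"pseudoturnover = {pseudoturnover(obs1, obs2):.1f}%")
    print(f"CV of cover = {cover_cv([12.0, 8.0, 15.0, 10.0]):.1f}%")
```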