870 results for Time varying coefficients


Relevance: 30.00%

Abstract:

By means of fixed-links modeling, the present study identified different processes of visual short-term memory (VSTM) functioning and investigated how these processes are related to intelligence. We conducted an experiment in which participants were presented with a color change detection task. Task complexity was manipulated by varying the number of presented stimuli (set size). We collected hit rate and reaction time (RT) as indicators of the amount of information retained in VSTM and the speed of VSTM scanning, respectively. Because of the impurity of these measures, however, the variability in hit rate and RT was assumed to comprise not only genuine variance due to individual differences in VSTM retention and VSTM scanning but also other, non-experimental portions of variance. Therefore, we identified two qualitatively different types of components for both hit rate and RT: (1) non-experimental components representing processes that remained constant irrespective of set size and (2) experimental components reflecting processes that increased as a function of set size. For RT, intelligence was negatively associated with the non-experimental components but was unrelated to the experimental components assumed to represent variability in VSTM scanning speed. This finding indicates that individual differences in basic processing speed, rather than in speed of VSTM scanning, differentiate between individuals of high and low intelligence. For hit rate, the experimental component constituting individual differences in VSTM retention was positively related to intelligence. The non-experimental components of hit rate, representing variability in basal processes, however, were not associated with intelligence. By decomposing VSTM functioning into non-experimental and experimental components, significant associations with intelligence were revealed that might otherwise have been obscured.
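The two component types can be pictured with a toy per-participant regression of mean RT on set size: the intercept plays the role of the set-size-independent (non-experimental) part, and the slope plays the role of the part that grows with set size (experimental). This is only a sketch of the idea with invented numbers, not the fixed-links structural equation model itself:

```python
def decompose_rt(set_sizes, mean_rts):
    # Ordinary least-squares line fit: intercept ~ set-size-independent
    # component, slope ~ component that increases with set size.
    n = len(set_sizes)
    mean_x = sum(set_sizes) / n
    mean_y = sum(mean_rts) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(set_sizes, mean_rts))
             / sum((x - mean_x) ** 2 for x in set_sizes))
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Invented data: one participant's mean RTs (ms) at set sizes 2, 4, and 6.
intercept, slope = decompose_rt([2, 4, 6], [500, 600, 700])
```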

Relevance: 30.00%

Abstract:

How stable are individual differences in self-esteem? We examined the time-dependent decay of rank-order stability of self-esteem and tested whether stability asymptotically approaches zero or a nonzero value across long test–retest intervals. Analyses were based on 6 assessments across a 29-year period of a sample of 3,180 individuals aged 14 to 102 years. The results indicated that, as test–retest intervals increased, stability exponentially decayed and asymptotically approached a nonzero value (estimated as .43). The exponential decay function explained a large proportion of variance in observed stability coefficients, provided a better fit than alternative functions, and held across gender and for all age groups from adolescence to old age. Moreover, structural equation modeling of the individual-level data suggested that a perfectly stable trait component underlies stability of self-esteem. The findings suggest that the stability of self-esteem is relatively large, even across very long periods, and that self-esteem is a trait-like characteristic.
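The decay law described can be written as r(Δt) = a + (r0 − a)·exp(−b·Δt), where a is the nonzero asymptote. A minimal Python sketch assuming this three-parameter form; only the .43 asymptote comes from the abstract, while the initial stability and decay rate below are illustrative:

```python
import math

def stability(dt_years, asymptote=0.43, initial=0.90, rate=0.25):
    # Rank-order stability as a function of the test-retest interval:
    # exponential decay toward a nonzero asymptote (trait component).
    return asymptote + (initial - asymptote) * math.exp(-rate * dt_years)
```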

Relevance: 30.00%

Abstract:

Graphical presentation of regression results has become increasingly popular in the scientific literature, as graphs are often much easier to read than tables. In Stata, such plots can be produced by the -marginsplot- command. However, while -marginsplot- is very versatile and flexible, it has two major limitations: it can only process results left behind by -margins-, and it can only handle one set of results at a time. In this article I introduce a new command called -coefplot- that overcomes these limitations. It plots results from any estimation command and combines results from several models into a single graph. The default behavior of -coefplot- is to plot markers for coefficients and horizontal spikes for confidence intervals. However, -coefplot- can also produce various other types of graphs. The capabilities of -coefplot- are illustrated using a series of examples.
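The default display described (a marker at each coefficient with a horizontal spike for its confidence interval) is built from nothing more than point estimates and standard errors. A minimal Python sketch of that underlying computation with invented numbers; it illustrates the plot's ingredients, not the Stata command itself:

```python
# Normal-approximation 95% confidence intervals: the horizontal
# "spikes" a coefficient plot draws around each point estimate.
Z95 = 1.959964  # two-sided 95% critical value of the standard normal

def conf_interval(coef, se, z=Z95):
    return (coef - z * se, coef + z * se)

# Invented coefficients and standard errors, e.g. from two regressors.
results = {"age": (0.042, 0.010), "income": (-0.300, 0.120)}
spikes = {name: conf_interval(b, se) for name, (b, se) in results.items()}
```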

Relevance: 30.00%

Abstract:

Ecology and conservation require reliable data on the occurrence of animals and plants. A major source of bias is imperfect detection, which can, however, be corrected for by estimating detectability. In traditional occupancy models, this requires repeat or multi-observer surveys. Recently, time-to-detection models have been developed as a cost-effective alternative: they require no repeat surveys and hence can roughly halve costs. We compared the efficiency and reliability of time-to-detection and traditional occupancy models under varying survey effort. Two observers independently searched for 17 plant species in 44 Swiss grassland quadrats of 100 m² and recorded the time-to-detection for each species, enabling detectability to be estimated with both time-to-detection and traditional occupancy models. In addition, we gauged the relative influence on detectability of species, observer, plant height, and two measures of abundance (cover and frequency). Estimates of detectability and occupancy under the two models were very similar. Rare species were more likely to be overlooked; detectability was strongly affected by abundance. As a measure of abundance, frequency outperformed cover in its predictive power. The two observers differed significantly in their detection ability. Time-to-detection models were as accurate as traditional occupancy models, but their data are easier to obtain; they therefore provide a cost-effective alternative to traditional occupancy models for detection-corrected estimation of occurrence.
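The core of a time-to-detection model can be sketched with its simplest case: an exponential search time with rate λ, so that detectability within a survey of length T is 1 − exp(−λT), and species still undetected at T enter as censored observations. A minimal Python illustration under these assumptions (not the authors' full model, which also includes covariates such as abundance and observer):

```python
import math

def detectability(rate, survey_time):
    # Probability that a present species is found before the survey ends,
    # assuming an exponential time-to-detection with the given rate.
    return 1.0 - math.exp(-rate * survey_time)

def rate_mle(times, survey_time):
    # Maximum-likelihood detection rate: number of detections divided by
    # total search time, with undetected species censored at survey_time.
    detected = sum(1 for t in times if t < survey_time)
    exposure = sum(min(t, survey_time) for t in times)
    return detected / exposure
```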

Relevance: 30.00%

Abstract:

Graphical display of regression results has become increasingly popular in presentations and in scientific literature because graphs are often much easier to read than tables. Such plots can be produced in Stata by the marginsplot command (see [R] marginsplot). However, while marginsplot is versatile and flexible, it has two major limitations: it can only process results left behind by margins (see [R] margins), and it can handle only one set of results at a time. In this article, I introduce a new command called coefplot that overcomes these limitations. It plots results from any estimation command and combines results from several models into one graph. The default behavior of coefplot is to plot markers for coefficients and horizontal spikes for confidence intervals. However, coefplot can also produce other types of graphs. I illustrate the capabilities of coefplot by using a series of examples.

Relevance: 30.00%

Abstract:

The time-variable Earth's gravity field contains information about mass transport within the Earth system, i.e., the relationship between mass variations in the atmosphere, oceans, land hydrology, and ice sheets. For many years, satellite laser ranging (SLR) observations to geodetic satellites have provided valuable information on the low-degree coefficients of the Earth's gravity field. Today, the Gravity Recovery and Climate Experiment (GRACE) mission is the major source of information on the time-variable field at high spatial resolution. We recover the low-degree coefficients of the time-variable Earth's gravity field using SLR observations to nine geodetic satellites: LAGEOS-1, LAGEOS-2, Starlette, Stella, AJISAI, LARES, Larets, BLITS, and Beacon-C. We estimate monthly gravity field coefficients up to degree and order 10 for the time span 2003–2013 and compare the results with the GRACE-derived gravity field coefficients. We show that not only the degree-2 gravity field coefficients but also other coefficients up to degree 10 can be well determined from SLR, using a combination of short 1-day arcs for low orbiting satellites and 10-day arcs for LAGEOS-1/2. In this way, LAGEOS-1/2 allow recovery of the zonal terms, which are associated with long-term satellite orbit perturbations, whereas the tesseral and sectorial terms benefit most from low orbiting satellites, whose orbit modeling deficiencies are minimized by the short 1-day arcs. The amplitudes of the annual signal in the low-degree gravity field coefficients derived from SLR agree with GRACE K-band results at a level of 77%. This implies that SLR has great potential to fill the gap between the current GRACE mission and the future GRACE Follow-On mission for recovering the seasonal variations and secular trends of the longest wavelengths in the gravity field, which are associated with large-scale mass transport in the Earth system.
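Annual amplitudes like those compared above are typically obtained by fitting a constant plus annual cosine and sine terms to each monthly coefficient series and converting the two fitted terms to an amplitude and a phase. A small Python sketch of that conversion step only; the least-squares fit itself and the 77% agreement figure are not reproduced here:

```python
import math

def annual_signal(cos_coef, sin_coef):
    # Rewrite a*cos(wt) + b*sin(wt) as A*cos(wt - phi):
    # A is the annual amplitude compared between SLR and GRACE series.
    amplitude = math.hypot(cos_coef, sin_coef)
    phase = math.atan2(sin_coef, cos_coef)
    return amplitude, phase
```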

Relevance: 30.00%

Abstract:

The objectives of this research were (1) to study the effect of contact pressure, compression time, and liquid (moisture content of the fabric) on the transfer by sliding contact of non-fixed surface contamination to protective clothing constructed from uncoated, woven fabrics; (2) to study the effect of contact pressure, compression time, and liquid content on the subsequent penetration through the fabric; and (3) to determine whether varying the type of contaminant changes the effect of contact pressure, compression time, and liquid content on the transfer by sliding contact and penetration of non-fixed surface contamination. It was found that the combined influence of the liquid (moisture content of the fabric), load (contact pressure), compression time, and their interactions significantly affected the penetration of all three test agents, sucrose-14C, triolein-3H, and starch-14C, through 100% cotton fabric. The combined influence of the statistically significant main effects and their interactions increased the penetration of triolein-3H by 32,548%, sucrose-14C by 7,006%, and starch-14C by 1,900%.

Relevance: 30.00%

Abstract:

Prevalent sampling is an efficient and focused approach to the study of the natural history of disease. Right-censored time-to-event data observed in prospective prevalent cohort studies are often subject to left-truncated sampling. Left-truncated samples are not randomly selected from the population of interest and carry a selection bias. Extensive studies have focused on estimating the unbiased distribution from left-truncated samples. In many applications, however, the exact date of disease onset is not observed. For example, in an HIV infection study, the exact HIV infection time is not observable; it is only known that infection occurred between two observable dates. Meeting these challenges motivated our study. We propose parametric models to estimate the unbiased distribution of left-truncated, right-censored time-to-event data with uncertain onset times. We first consider data from length-biased sampling, a special case of left-truncated sampling, and then extend the proposed method to general left-truncated sampling. With a parametric model, we construct the full likelihood given a biased sample with unobservable onset of disease. The parameters are estimated by maximizing the constructed likelihood, adjusting for the selection bias and the unobservable exact onset. Simulations are conducted to evaluate the finite-sample performance of the proposed methods. We apply the proposed method to an HIV infection study, estimating the unbiased survival function and covariate coefficients.
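As a minimal illustration of length-biased sampling (the special case considered first), take an Exponential(θ) event time: under length bias the sampled density becomes g(t) = θ²·t·exp(−θt). A Python sketch of the resulting log-likelihood for fully observed times; the paper's actual models additionally handle right censoring and the uncertain onset date:

```python
import math

def loglik_length_biased_exp(rate, times):
    # Log-likelihood under length-biased sampling from Exponential(rate):
    # g(t) = t * f(t) / mean(T) = rate**2 * t * exp(-rate * t).
    return sum(2.0 * math.log(rate) + math.log(t) - rate * t
               for t in times)
```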

Relevance: 30.00%

Abstract:

The problem of analyzing data with updated measurements in the time-dependent proportional hazards model arises frequently in practice. One available option is to reduce the number of intervals (or updated measurements) included in the Cox regression model. We empirically investigated the bias of the estimator of the time-dependent covariate effect while varying the failure rate, sample size, true values of the parameters, and the number of intervals. We also evaluated how often a time-dependent covariate needs to be collected and assessed the effect of sample size and failure rate on the power of testing a time-dependent effect.

A time-dependent proportional hazards model with two binary covariates was considered. The time axis was partitioned into k intervals. The baseline hazard was assumed to be 1, so that the failure times were exponentially distributed within each interval. A type II censoring model was adopted to characterize the failure rate. The factors of interest were sample size (500, 1000), type II censoring with failure rates of 0.05, 0.10, and 0.20, and three values for each of the non-time-dependent and time-dependent covariate coefficients (1/4, 1/2, 3/4).

The mean bias of the estimator of the coefficient of the time-dependent covariate decreased as sample size and the number of intervals increased, whereas it increased as the failure rate and the true values of the covariates increased. The mean bias was smallest when all of the updated measurements were used in the model, compared with two models that used only selected measurements of the time-dependent covariate. For the model that included all the measurements, the coverage rates of the estimator of the coefficient of the time-dependent covariate were in most cases 90% or more, except when the failure rate was high (0.20). The power associated with testing a time-dependent effect was highest when all of the measurements of the time-dependent covariate were used. An example from the Systolic Hypertension in the Elderly Program Cooperative Research Group is presented.
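The simulation design described (unit baseline hazard, exponentially distributed failure times within intervals, a covariate that may change between intervals) can be sketched as a piecewise-exponential draw. A hedged Python illustration; the function name and unit interval width are invented for this sketch:

```python
import math
import random

def draw_failure_time(beta_td, z_by_interval, width=1.0, rng=random):
    # Piecewise-exponential failure time: with baseline hazard 1, the
    # hazard in interval k is exp(beta_td * z_k) for covariate value z_k.
    t = 0.0
    for z in z_by_interval:
        hazard = math.exp(beta_td * z)
        gap = rng.expovariate(hazard)
        if gap < width:
            return t + gap  # failure inside this interval
        t += width
    return t  # survived every interval (censored at the end)
```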

Relevance: 30.00%

Abstract:

An exposure system was constructed to evaluate the performance of a personal organic vapor dosimeter (3520 OVM) at ppb concentrations of nine selected target volatile organic compounds (VOCs). These concentration levels are generally encountered in community air environments, both indoor and outdoor. It was demonstrated that the chamber system could provide the closely controlled conditions of VOC concentration, temperature, and relative humidity (RH) required for the experiments. The target experimental conditions included combinations of three VOC concentrations (10, 20, and 200 µg/m³), three temperatures (10, 25, and 40°C), and three RHs (12, 50, and 90% RH), for a total of 27 exposure conditions. No backgrounds of the target VOCs were found in the exposure chamber system. In the exposure chamber, the variation of the temperature was controlled within ±1°C, and the variation of RH was controlled within ±1.5% at 12% RH, ±2% at 50% RH, and ±3% at 90% RH. High-emission permeation tubes were utilized to generate the target VOCs. Various patterns of the permeation rates were observed over time. The lifetimes and permeation rates of the tubes differed by compound, length of the tube, and manufacturer. By carefully selecting the source and length of the tubes, and closely monitoring tube weight loss over time, the permeation tubes can be used to deliver low and stable concentrations of VOCs over multiple days.

The results of this study indicate that the performance of the 3520 OVM is compound-specific and depends on concentration, temperature, and humidity. With the exception of 1,3-butadiene under most conditions, and styrene and methylene chloride at very high relative humidities, recoveries were generally within ±25% of theory, indicating that the 3520 OVM can be effectively used over the range of concentrations and environmental conditions tested with a 24-hour sampling period. Increasing humidity resulted in increasing negative bias from full recovery. Reverse diffusion tests conducted at 200 µg/m³ and five temperature/humidity combinations indicated severe diffusion losses only for 1,3-butadiene, methylene chloride, and styrene under increased humidity. Overall, the results of this study do not support the need to employ diffusion samplers with backup sections for the exposure conditions tested.
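The gravimetric bookkeeping behind "closely monitoring tube weight loss" reduces to two steps: an emission rate from the measured weight loss, then dilution into the chamber air flow (1 ng/L equals 1 µg/m³). A small Python sketch; the flow value in the example is invented, while 10 µg/m³ matches the lowest target concentration above:

```python
def permeation_rate_ng_min(mass_loss_mg, days):
    # Gravimetric emission rate of a permeation tube from its weight loss.
    return mass_loss_mg * 1e6 / (days * 24.0 * 60.0)

def chamber_conc_ug_m3(perm_rate_ng_min, flow_l_min):
    # Steady-state chamber concentration: emission rate over dilution flow.
    # ng/min divided by L/min gives ng/L, numerically equal to ug/m3.
    return perm_rate_ng_min / flow_l_min

# Example: a tube losing 1.44 mg/day, diluted into 100 L/min of clean air.
rate = permeation_rate_ng_min(1.44, 1)   # 1000 ng/min
conc = chamber_conc_ug_m3(rate, 100.0)   # 10 ug/m3
```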