989 results for covariance model


Relevance:

60.00%

Publisher:

Abstract:

Purpose. To assess the impact of a six-month stage-based intervention on fruit and vegetable intake, perceived benefits and barriers, and self-efficacy among adolescents. Design. Randomized treatment-control, pre-post design. Subjects/Setting. Schools were randomized between control and experimental groups. 860 adolescents from ten public schools in Brasília, Federal District, Brazil were evaluated at baseline; 771 (81%) completed the study. Intervention. The experimental group received monthly magazines and newsletters promoting healthy eating. Measures. Self-reported fruit and vegetable intake, stages of change, self-efficacy and decisional balance scores were evaluated at baseline and post-intervention in both groups. Analysis. The effectiveness of the intervention was evaluated using an analysis of covariance (ANCOVA) model and repeated-measures analysis by weighted least squares. The proportions of adolescents who advanced through the stages during the intervention were compared using the Mantel-Haenszel chi-square test. Results. After adjusting for sex and age, the study variables showed no changes attributable to the intervention. There was no statistically significant difference between the intervention and control groups in participants' movement through the stages of change over the course of the study. Conclusion. A nutritional intervention based exclusively on the distribution of stage-matched printed educational materials was insufficient to change adolescents' dietary behavior.
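
The analysis described here (ANCOVA adjusting post-intervention intake for group, baseline intake, sex, and age) can be sketched as below. This is a minimal illustration, not the study's code; the file name and column names (group, intake_post, intake_base, sex, age) are assumptions.

```python
# Minimal ANCOVA sketch: post-intervention intake modelled with group as a factor
# and baseline intake, sex, and age as covariates. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("intervention_data.csv")  # hypothetical data file

model = smf.ols("intake_post ~ C(group) + intake_base + C(sex) + age", data=df).fit()
print(model.summary())  # the C(group) coefficient estimates the adjusted treatment effect
```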

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE: To compare the effect of bimatoprost and the fixed combination of latanoprost and timolol (LTFC) on 24-hour mean intraocular pressure (IOP) after patients are switched from a nonfixed combination of latanoprost and timolol. DESIGN: Randomized, double-masked, multicenter clinical trial. PARTICIPANTS: Two hundred patients with glaucoma or ocular hypertension. METHODS: Included were patients who were controlled (IOP < 21 mmHg) on the nonfixed combination of latanoprost and timolol for at least 3 months before the baseline visit, or patients on monotherapy with either latanoprost or timolol who were not fully controlled and therefore eligible for dual therapy. The latter group underwent a 6-week wash-in phase with the nonfixed combination of latanoprost and timolol before baseline IOP determination and study inclusion. Supine and sitting IOPs were recorded at 8 pm, midnight, 5 am, 8 am, noon, and 4 pm at the baseline, week 6, and week 12 visits. MAIN OUTCOME MEASURE: An analysis of covariance model was used for a noninferiority test of the primary efficacy variable, with mean area under the 24-hour IOP curve after 12 weeks of treatment as the response variable and treatment, center, and baseline IOP as factors. A secondary analysis was performed on the within-treatment change from baseline. RESULTS: Mean baseline IOPs were 16.3+/-3.3 mmHg and 15.5+/-2.9 mmHg in the bimatoprost and LTFC groups, respectively. At week 12, mean IOPs were 16.1+/-2.5 mmHg for the bimatoprost group and 16.3+/-3.7 mmHg for the LTFC group, with no significant difference between the 2 treatment groups. Compared with baseline, mean IOP increased by 0.3+/-3.6 mmHg during the day and decreased by 0.8+/-3.8 mmHg during the night in the bimatoprost group, whereas it increased by 1.43+/-2.6 mmHg and 0.14+/-3.2 mmHg, respectively, in the LTFC group. CONCLUSIONS: Bimatoprost is not inferior to the LTFC in maintaining IOP at a controlled level over a 24-hour period in patients switched from the nonfixed combination of latanoprost and timolol.
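
A sketch of the primary analysis as described: each patient's diurnal IOP measurements are summarized as a time-weighted mean (area under the 24-hour curve by the trapezoidal rule), then analysed by ANCOVA with treatment, center, and baseline IOP as factors. The file layout and column names below are assumptions, not the trial's analysis code.

```python
# Compute a per-patient 24-hour IOP summary and fit the ANCOVA described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

hours = np.array([0.0, 4.0, 9.0, 12.0, 16.0, 20.0])  # 8 pm, midnight, 5 am, 8 am, noon, 4 pm

def mean_auc(iop):
    """Time-weighted mean IOP: trapezoidal area divided by the interval length."""
    iop = np.asarray(iop, dtype=float)
    area = np.sum(0.5 * (iop[1:] + iop[:-1]) * np.diff(hours))
    return area / (hours[-1] - hours[0])

df = pd.read_csv("iop_week12.csv")  # hypothetical wide file with columns iop_h0 ... iop_h20
df["auc"] = df[[f"iop_h{int(h)}" for h in hours]].apply(mean_auc, axis=1)

fit = smf.ols("auc ~ C(treatment) + C(center) + baseline_iop", data=df).fit()
print(fit.params, fit.conf_int(), sep="\n")  # compare the treatment-difference CI with the noninferiority margin
```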

Relevance:

60.00%

Publisher:

Abstract:

PURPOSE: To evaluate the relationship between ocular perfusion pressure and color Doppler measurements in patients with glaucoma. MATERIALS AND METHODS: Twenty patients with primary open-angle glaucoma with visual field deterioration in spite of an intraocular pressure lowered below 21 mm Hg, 20 age-matched patients with glaucoma with stable visual fields, and 20 age-matched healthy controls were recruited. After a 20-minute rest in a supine position, intraocular pressure and color Doppler measurement parameters of the ophthalmic artery and the central retinal artery were obtained. Correlations between mean ocular perfusion pressure and color Doppler measurement parameters were determined. RESULTS: Patients with glaucoma showed a higher intraocular pressure (P <.0008) and a lower mean ocular perfusion pressure (P <.0045) compared with healthy subjects. Patients with deteriorating glaucoma showed a lower mean blood pressure (P =.033) and a lower end diastolic velocity in the central retinal artery (P =.0093) compared with healthy controls. Mean ocular perfusion pressure correlated positively with end diastolic velocity in the ophthalmic artery (R = 0.66, P =.002) and central retinal artery (R = 0.74, P <.0001) and negatively with resistivity index in the ophthalmic artery (R = -0.70, P =.001) and central retinal artery (R = -0.62, P =.003) in patients with deteriorating glaucoma. Such correlations did not occur in patients with glaucoma with stable visual fields or in healthy subjects. The correlations differed significantly between the study groups (parallelism of regression lines in an analysis of covariance model) for end diastolic velocity (P =.001) and resistivity index (P =.0001) in the ophthalmic artery, as well as for end diastolic velocity (P =.0009) and resistivity index (P =.001) in the central retinal artery. CONCLUSIONS: The present findings suggest that alterations in ocular blood flow regulation may contribute to the progression of glaucomatous damage.
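
The "parallelism of regression lines" comparison corresponds to testing the group x covariate interaction in an ANCOVA: a common-slope model is compared against a model allowing group-specific slopes. A minimal sketch with assumed column names (group, mopp, edv_cra), not the authors' code:

```python
# Test whether the regression of end diastolic velocity on mean ocular perfusion
# pressure has the same slope in all study groups (parallelism of regression lines).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("doppler_data.csv")  # hypothetical data export

parallel = smf.ols("edv_cra ~ C(group) + mopp", data=df).fit()   # common slope
separate = smf.ols("edv_cra ~ C(group) * mopp", data=df).fit()   # group-specific slopes
print(anova_lm(parallel, separate))   # a significant F test indicates the slopes differ between groups
```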

Relevance:

60.00%

Publisher:

Abstract:

Background: Management of type 2 diabetes with metformin often does not provide adequate glycemic control, thereby necessitating add-on treatment. In a 24-week clinical trial, dapagliflozin, an investigational sodium glucose cotransporter 2 inhibitor, improved glycemic control in patients inadequately controlled with metformin. The present study is an extension that was undertaken to evaluate dapagliflozin as long-term therapy in this population. Methods: This was a long-term extension (total 102 weeks) of a 24-week phase 3, multicenter, randomized, placebo-controlled, double-blind, parallel-group trial. Patients were randomly assigned (1:1:1:1) to blinded daily treatment (placebo or dapagliflozin 2.5, 5, or 10 mg) plus open-label metformin (≥1,500 mg). The previously published primary endpoint was change from baseline in glycated hemoglobin (HbA1c) at 24 weeks. This paper reports the follow-up to week 102; an analysis of covariance model with last observation carried forward was applied at 24 weeks, and a repeated-measures analysis was used to evaluate changes from baseline in HbA1c, fasting plasma glucose (FPG), and weight. Results: A total of 546 patients were randomized to 1 of the 4 treatments. The completion rate for the 78-week double-blind extension period was lower for the placebo group (63.5%) than for the dapagliflozin groups (68.3% to 79.8%). At week 102, mean changes from baseline HbA1c (8.06%) were +0.02% for placebo compared with -0.48% (P = 0.0008), -0.58% (P
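
To illustrate the last-observation-carried-forward (LOCF) convention combined with an ANCOVA of the week-24 change, here is a rough sketch. The file layout and column names are assumptions (it presumes missing visits are present as NaN rows), and it is not the trial's actual analysis code.

```python
# LOCF imputation within each patient, then ANCOVA of week-24 change in HbA1c
# with treatment as factor and baseline HbA1c as covariate.
import pandas as pd
import statsmodels.formula.api as smf

long = pd.read_csv("hba1c_long.csv")  # hypothetical: patient, treatment, week, hba1c, hba1c_base
long = long.sort_values(["patient", "week"])
long["hba1c_locf"] = long.groupby("patient")["hba1c"].ffill()  # carry forward the last observed value

wk24 = long[long["week"] == 24].copy()
wk24["change"] = wk24["hba1c_locf"] - wk24["hba1c_base"]
fit = smf.ols("change ~ C(treatment) + hba1c_base", data=wk24).fit()
print(fit.summary())
```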

Relevance:

60.00%

Publisher:

Abstract:

This dissertation analyzes whether and how changes in federal tax policy affect local tax policies, specifically the elimination of the federal deductibility of state and local taxes for individual taxpayers by the Tax Reform Act of 1986 (TRA86), in 59 California cities. Two methods are used in the study: a survey of local revenue officials and a time-series/cross-sectional study of sales tax reliance. The reliance study uses a covariance model to pool cross-section and time-series observations. The results of the reliance study indicate a statistically significant overall decline in sales tax reliance after 1986. The results of the survey indicate that local policy makers generally do not believe that federal deductibility is an important factor when considering raising local sales taxes. Further analysis shows that local revenue officials who claim federal deductibility is not an important factor are associated mostly with cities that registered no significant decline in sales tax reliance after 1986. Similarly, local revenue officials who claim federal deductibility is an important factor when considering local tax policy are associated mostly with cities that suffered a significant decline in sales tax reliance after 1986. Of that group, further analysis shows that the declines in sales tax reliance are associated mostly with cities located in the southwestern part of the state. When compared to other cities in the state, an analysis of variance reveals a series of statistically significant factors associated with southwestern cities which may contribute to the decline in sales tax reliance following the enactment of the Tax Reform Act of 1986.
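
A covariance model for pooled cross-section/time-series data of this kind is, in essence, a dummy-variable (fixed-effects) regression. The sketch below is only illustrative; the file and variable names (city, year, sales_tax_reliance) are assumptions and the specification is not the dissertation's actual model.

```python
# City fixed effects absorb cross-sectional heterogeneity; a post-1986 indicator
# captures the overall shift in sales tax reliance after TRA86.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("city_panel.csv")  # hypothetical 59-city annual panel
panel["post_tra86"] = (panel["year"] > 1986).astype(int)

fe = smf.ols("sales_tax_reliance ~ post_tra86 + C(city)", data=panel).fit()
print(fe.params["post_tra86"])  # estimated average change in reliance after 1986
```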

Relevance:

60.00%

Publisher:

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix exponential transformation guarantees the positive-definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann’s seminal work on the estimation of highly non-linear model specifications (“Causality tests and observationally equivalent representations of econometric models”, Journal of Econometrics, 1988, 39(1-2), 69–104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
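
The matrix-exponential idea can be illustrated in a few lines: an unconstrained symmetric matrix is mapped to a covariance matrix via the matrix exponential, which is automatically symmetric positive definite. This is a toy demonstration of the property, not the RMESV-ALM estimation code.

```python
# Map an arbitrary symmetric matrix A to a covariance matrix Sigma = expm(A);
# the eigenvalues of Sigma are exp(lambda_i) > 0, so Sigma is positive definite.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
A = 0.5 * (A + A.T)                            # unconstrained symmetric matrix

Sigma = expm(A)
print(np.allclose(Sigma, Sigma.T))             # symmetric
print(np.all(np.linalg.eigvalsh(Sigma) > 0))   # positive definite
```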

Relevance:

40.00%

Publisher:

Abstract:

Spatial data analysis has become more and more important in studies of ecology and economics during the last decade. One focus of spatial data analysis is how to select predictors, variance functions and correlation functions. In general, however, the true covariance function is unknown and the working covariance structure is often misspecified. In this paper, our target is to find a good strategy for identifying the best model from a candidate set using model selection criteria. This paper evaluates the ability of several information criteria (the corrected Akaike information criterion, the Bayesian information criterion (BIC) and the residual information criterion (RIC)) to choose the optimal model when the working correlation function, the working variance function and the working mean function are correct or misspecified. Simulations are carried out for small to moderate sample sizes. Four candidate covariance functions (exponential, Gaussian, Matérn and rational quadratic) are used in the simulation studies. Summarizing the simulation results, we find that a misspecified working correlation structure can still capture some spatial correlation information in model fitting. When the sample size is large enough, BIC and RIC perform well even if the working covariance is misspecified. Moreover, the performance of these information criteria is related to the average level of model fit, as indicated by the average adjusted R-square, and overall RIC performs well.
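
For concreteness, the four candidate covariance functions can be written as functions of separation distance h, and BIC-based selection amounts to comparing penalized log-likelihoods. The parameterizations below (sigma2 = partial sill, phi = range, nu = Matérn smoothness) are common textbook forms used here for illustration, not the paper's simulation code.

```python
# Candidate spatial covariance functions and a BIC helper for model comparison.
import numpy as np
from scipy.special import gamma, kv   # kv: modified Bessel function of the second kind

def exponential(h, sigma2, phi):
    return sigma2 * np.exp(-h / phi)

def gaussian(h, sigma2, phi):
    return sigma2 * np.exp(-(h / phi) ** 2)

def matern(h, sigma2, phi, nu):
    h = np.where(h == 0, 1e-10, h)    # avoid 0 * inf at the origin
    t = np.sqrt(2 * nu) * h / phi
    return sigma2 * (2 ** (1 - nu) / gamma(nu)) * t ** nu * kv(nu, t)

def rational_quadratic(h, sigma2, phi):
    return sigma2 / (1 + (h / phi) ** 2)   # one common (Cauchy-type) parameterization

def bic(loglik, n_params, n_sites):
    """BIC = -2*loglik + p*log(n); the candidate with the smallest BIC is selected."""
    return -2.0 * loglik + n_params * np.log(n_sites)
```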

Relevance:

40.00%

Publisher:

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots) would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
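
As a rough illustration of the kind of model-data fusion algorithm compared here, the sketch below calibrates a single toy respiration parameter against noisy synthetic observations with a random-walk Metropolis sampler. The toy model, prior range, and noise level are placeholders; this is not the REFLEX carbon model or any participant's code.

```python
# Minimal random-walk Metropolis sketch: posterior for a toy Q10 temperature-sensitivity
# parameter given noisy synthetic "flux" observations.
import numpy as np

rng = np.random.default_rng(1)
temp = np.linspace(0, 25, 200)
obs = 1.5 * 2.0 ** (temp / 10.0) + rng.normal(0.0, 0.5, temp.size)   # synthetic data, true Q10 = 2

def log_post(q10, sigma=0.5):
    if not 1.0 < q10 < 4.0:                     # flat prior on a plausible range
        return -np.inf
    resid = obs - 1.5 * q10 ** (temp / 10.0)
    return -0.5 * np.sum((resid / sigma) ** 2)

chain, q10, lp = [], 1.5, log_post(1.5)
for _ in range(5000):
    prop = q10 + rng.normal(0.0, 0.05)          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis acceptance step
        q10, lp = prop, lp_prop
    chain.append(q10)

print(np.mean(chain[1000:]), np.percentile(chain[1000:], [5, 95]))   # posterior mean and 90% interval
```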

Relevance:

40.00%

Publisher:

Abstract:

A total of 20,065 weights recorded on 3016 Nelore animals were used to estimate covariance functions for growth from birth to 630 days of age, assuming a parametric correlation structure to model within-animal correlations. The model of analysis included fixed effects of contemporary groups and age of dam as a quadratic covariable. Mean trends were taken into account by a cubic regression on orthogonal polynomials of animal age. Genetic effects of the animal and its dam and maternal permanent environmental effects were modelled by random regressions on Legendre polynomials of age at recording. Changes in direct permanent environmental effect variances were modelled by a polynomial variance function, together with a parametric correlation function to account for correlations between ages. Stationary and nonstationary models were used to model within-animal correlations between different ages. Residual variances were considered homogeneous or heterogeneous, with changes modelled by a step or polynomial function of age at recording. Based on the Bayesian information criterion, a model with a cubic variance function combined with a nonstationary correlation function for permanent environmental effects, with 49 parameters to be estimated, fitted best. Modelling within-animal correlations through a parametric correlation structure can describe the variation pattern adequately. Moreover, the number of parameters to be estimated can be decreased substantially compared to a model fitting random regression on Legendre polynomials of age.
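
Random-regression analyses of this kind evaluate Legendre polynomials on age standardized to [-1, 1]. The sketch below builds such a covariable matrix with numpy; the polynomial order and the use of standard (unnormalized) Legendre polynomials are illustrative assumptions, and this is not the genetic-evaluation software used in the study.

```python
# Build a matrix of Legendre-polynomial covariables of age at recording.
import numpy as np
from numpy.polynomial import legendre

def legendre_covariables(age, age_min=0.0, age_max=630.0, order=3):
    """Rows = records, columns = Legendre polynomials P_0..P_order of standardized age."""
    x = 2.0 * (np.asarray(age, dtype=float) - age_min) / (age_max - age_min) - 1.0
    return np.column_stack(
        [legendre.legval(x, np.eye(order + 1)[k]) for k in range(order + 1)]
    )

Phi = legendre_covariables([0, 205, 365, 550, 630])
print(Phi.round(3))   # basis used for the animal, dam and permanent environmental regressions
```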

Relevance:

40.00%

Publisher:

Abstract:

Geostrophic surface velocities can be derived from the gradients of the mean dynamic topography (the difference between the mean sea surface and the geoid). Therefore, independently observed mean dynamic topography data are valuable input parameters and constraints for ocean circulation models. For a successful fit to observational dynamic topography data, not only the mean dynamic topography on the particular ocean model grid is required, but also information about its inverse covariance matrix. The calculation of the mean dynamic topography from satellite-based gravity field models and altimetric sea surface height measurements, however, is not straightforward. For this purpose, we previously developed an integrated approach to combining these two different observation groups in a consistent way without using the common filter approaches (Becker et al. in J Geodyn 59(60):99-110, 2012, doi:10.1016/j.jog.2011.07.0069; Becker in Konsistente Kombination von Schwerefeld, Altimetrie und hydrographischen Daten zur Modellierung der dynamischen Ozeantopographie, 2012, http://nbn-resolving.de/nbn:de:hbz:5n-29199). Within this combination method, the full spectral range of the observations is considered. Further, it allows the direct determination of the normal equations (i.e., the inverse of the error covariance matrix) of the mean dynamic topography on arbitrary grids, which is one of the requirements for ocean data assimilation. In this paper, we report progress through selection and improved processing of altimetric data sets. We focus on the preprocessing steps of along-track altimetry data from Jason-1 and Envisat to obtain a mean sea surface profile. During this procedure, a rigorous variance propagation is accomplished, so that, for the first time, the full covariance matrix of the mean sea surface is available. The combination of the mean profile and a combined GRACE/GOCE gravity field model yields a mean dynamic topography model for the North Atlantic Ocean that is characterized by a defined set of assumptions. We show that including the geodetically derived mean dynamic topography with the full error structure in a 3D stationary inverse ocean model improves modeled oceanographic features over previous estimates.
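
The opening statement corresponds to the standard geostrophic balance, u = -(g/f) d(eta)/dy and v = (g/f) d(eta)/dx, with eta the mean dynamic topography and f the Coriolis parameter. Below is a minimal finite-difference sketch on a regular latitude/longitude grid; the grid, the placeholder MDT array, and the variable names are assumptions, not the authors' processing chain.

```python
# Geostrophic surface velocities from mean dynamic topography gradients.
import numpy as np

g, omega, R = 9.81, 7.2921e-5, 6.371e6                 # gravity, Earth rotation rate, Earth radius
lat = np.linspace(30, 60, 61)                          # degrees north (North Atlantic band)
lon = np.linspace(-60, 0, 121)                         # degrees east
eta = np.random.default_rng(2).normal(0.0, 0.1, (lat.size, lon.size))   # placeholder MDT [m]

f = 2 * omega * np.sin(np.deg2rad(lat))[:, None]       # Coriolis parameter
dy = R * np.deg2rad(lat[1] - lat[0])                   # meridional grid spacing [m]
dx = R * np.cos(np.deg2rad(lat))[:, None] * np.deg2rad(lon[1] - lon[0])  # zonal spacing [m]

deta_dy = np.gradient(eta, axis=0) / dy
deta_dx = np.gradient(eta, axis=1) / dx
u = -(g / f) * deta_dy                                 # eastward geostrophic velocity
v = (g / f) * deta_dx                                  # northward geostrophic velocity
```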

Relevance:

30.00%

Publisher:

Abstract:

In this paper we consider the case of large cooperative communication systems in which terminals use the slotted amplify-and-forward protocol to aid the source in its transmission. Using perturbation expansion methods for resolvents and large deviation techniques, we obtain an expression for the Stieltjes transform of the asymptotic eigenvalue distribution of a sample covariance random matrix of the type HH†, where H is the channel matrix of the transmission model for the protocol we consider. We prove that the resulting expression is similar to the Stieltjes transform, in its quadratic-equation form, of the Marcenko-Pastur distribution.
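
For reference, the quadratic-equation form mentioned here, for the standard Marcenko-Pastur law with aspect ratio y and unit variance, is the following; this is the textbook reference form, not the paper's derived expression.

```latex
% Stieltjes transform m(z) of the Marcenko-Pastur law (ratio y, unit variance),
% written in its quadratic-equation form:
y\,z\,m(z)^{2} + (z + y - 1)\,m(z) + 1 = 0,
\qquad m(z) \sim -\frac{1}{z} \ \text{as } |z| \to \infty .
```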

Relevance:

30.00%

Publisher:

Abstract:

Purpose. To create a binocular statistical eye model based on previously measured ocular biometric data. Methods. Thirty-nine parameters were determined for a group of 127 healthy subjects (37 male, 90 female; 96.8% Caucasian) with an average age of 39.9 ± 12.2 years and a spherical equivalent refraction of −0.98 ± 1.77 D. These parameters described the biometry of both eyes and the subjects' age. Missing parameters were complemented by data from a previously published study. After confirmation of the Gaussian shape of their distributions, these parameters were used to calculate their mean and covariance matrices. These matrices were then used to define a multivariate Gaussian distribution, from which arbitrary amounts of random biometric data could be generated and then randomly selected to create a realistic population of random eyes. Results. All parameters had Gaussian distributions, with the exception of the parameters that describe total refraction (i.e., three parameters per eye). After these non-Gaussian parameters were omitted from the model, the generated data were found to be statistically indistinguishable from the original data for the remaining 33 parameters (TOST [two one-sided t tests]; P < 0.01). Parameters derived from the generated data were also statistically indistinguishable from those calculated from the original data (P > 0.05). The only exception was the lens refractive index, for which the generated data had a significantly larger SD. Conclusions. A statistical eye model can describe the biometric variations found in a population and is a useful addition to the classic eye models.
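
The core of such a statistical eye model is a multivariate Gaussian fitted to the biometric parameters: estimate the mean vector and covariance matrix, then draw synthetic eyes from the fitted distribution. The sketch below assumes a hypothetical n x 33 data matrix and is not the study's code.

```python
# Fit a multivariate Gaussian to biometric parameters and generate synthetic eyes.
import numpy as np

data = np.loadtxt("biometry_33params.csv", delimiter=",")   # hypothetical n x 33 matrix
mu = data.mean(axis=0)                                      # mean of each parameter
Sigma = np.cov(data, rowvar=False)                          # 33 x 33 covariance matrix

rng = np.random.default_rng(3)
synthetic_eyes = rng.multivariate_normal(mu, Sigma, size=1000)   # 1000 random "eyes"
print(synthetic_eyes.shape)
```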