919 results for working-correlation-structure


Relevance:

80.00%

Publisher:

Abstract:

Graduate Program in Mechanical Engineering - FEG

Relevance:

80.00%

Publisher:

Abstract:

Graduate Program in Agronomy (Genetics and Plant Breeding) - FCAV

Relevance:

80.00%

Publisher:

Abstract:

We present two-dimensional (2D) two-particle angular correlations measured with the STAR detector on relative pseudorapidity η and azimuth φ for charged particles from Au-Au collisions at √s_NN = 62 and 200 GeV with transverse momentum p_t ≥ 0.15 GeV/c, |η| ≤ 1, and full 2π azimuthal acceptance. Observed correlations include a same-side (relative azimuth < π/2) 2D peak, a closely related away-side azimuth dipole, and an azimuth quadrupole conventionally associated with elliptic flow. The same-side 2D peak and away-side dipole are explained by semihard parton scattering and fragmentation (minijets) in proton-proton and peripheral nucleus-nucleus collisions. Those structures follow N-N binary-collision scaling in Au-Au collisions until midcentrality, where a transition to a qualitatively different centrality trend occurs within one 10% centrality bin. Above the transition point the number of same-side and away-side correlated pairs increases rapidly relative to binary-collision scaling, the η width of the same-side 2D peak also increases rapidly (η elongation), while the φ width decreases significantly. Those centrality trends are in marked contrast with conventional expectations for jet quenching in a dense medium. The observed centrality trends are compared to perturbative QCD predictions computed with HIJING, which serve as a theoretical baseline, and to the trends expected for semihard parton scattering and fragmentation in a thermalized opaque medium as predicted by theoretical calculations and phenomenological models. We are unable to reconcile a semihard parton scattering and fragmentation origin for the observed correlation structure and centrality trends with heavy-ion collision scenarios that invoke rapid parton thermalization. If the collision system turns out to be effectively opaque to few-GeV partons, the present observations would be inconsistent with the minijet picture discussed here. DOI: 10.1103/PhysRevC.86.064902
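
For orientation, a correlation of this kind is at bottom a pair histogram on relative pseudorapidity and azimuth. Below is a minimal numpy sketch of how such a histogram can be accumulated from per-event track lists; the toy event generator and all names are illustrative, not the STAR analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_histogram(events, n_eta=25, n_phi=25):
    """Accumulate pair counts on relative pseudorapidity and azimuth.

    events: iterable of (eta, phi) array pairs, one per event.
    Returns a 2D histogram on (eta_1 - eta_2, phi_1 - phi_2).
    """
    eta_edges = np.linspace(-2.0, 2.0, n_eta + 1)  # |eta| <= 1 gives |delta eta| <= 2
    phi_edges = np.linspace(-np.pi / 2, 3 * np.pi / 2, n_phi + 1)
    hist = np.zeros((n_eta, n_phi))
    for eta, phi in events:
        i, j = np.triu_indices(len(eta), k=1)      # all distinct pairs
        deta = eta[i] - eta[j]
        # wrap relative azimuth into [-pi/2, 3pi/2), the conventional range
        dphi = np.mod(phi[i] - phi[j] + np.pi / 2, 2 * np.pi) - np.pi / 2
        h, _, _ = np.histogram2d(deta, dphi, bins=(eta_edges, phi_edges))
        hist += h
    return hist

# toy events: uniform eta in [-1, 1], uniform phi in [0, 2pi)
events = [(rng.uniform(-1, 1, 100), rng.uniform(0, 2 * np.pi, 100))
          for _ in range(200)]
print(pair_histogram(events).sum())  # total number of accumulated pairs
```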

Relevance:

80.00%

Publisher:

Abstract:

The aim of this thesis is to propose Bayesian estimation, via Markov chain Monte Carlo, of multidimensional item response theory models for graded responses with complex structures and correlated traits. In particular, the work focuses on the multiunidimensional and additive underlying latent structures: the first is widely used and represents a classical approach in multidimensional item response analysis, while the second is able to reflect the complexity of real interactions between items and respondents. A simulation study is conducted to evaluate parameter recovery for the proposed models under different conditions (sample size, test and subtest length, number of response categories, and correlation structure). The results show that parameter recovery is particularly sensitive to sample size, owing to the model complexity and the large number of parameters to be estimated. For a sufficiently large sample size the parameters of the multiunidimensional and additive graded response models are well recovered. The results are also affected by the trade-off between the number of items in the test and the number of item categories. An application of the proposed models to response data collected to investigate Romagna and San Marino residents' perceptions of and attitudes towards the tourism industry is also presented.
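
As background for the modeling approach: in the graded response family, each ordinal item response is driven by a latent trait through item discriminations and ordered category cutpoints. Below is a minimal Bayesian MCMC sketch, assuming the PyMC library, of the simplest unidimensional relative of the multiunidimensional and additive structures studied in the thesis; the toy data, priors, and variable names are all illustrative, not the thesis code.

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

# Toy data: n respondents, J items, K ordered response categories (0..K-1).
rng = np.random.default_rng(1)
n, J, K = 300, 6, 4
y = rng.integers(0, K, size=(n, J))  # placeholder graded responses

with pm.Model():
    # One latent trait per respondent; the multiunidimensional case would use
    # one trait per subtest, the additive case a combination of several traits.
    theta = pm.Normal("theta", 0.0, 1.0, shape=n)
    a = pm.LogNormal("a", 0.0, 0.5, shape=J)          # item discriminations
    # Ordered cutpoints per item, built from positive increments.
    c0 = pm.Normal("c0", -1.0, 1.0, shape=J)
    dc = pm.HalfNormal("dc", 1.0, shape=(J, K - 2))
    cut = pt.concatenate([c0[:, None], c0[:, None] + pt.cumsum(dc, axis=1)], axis=1)
    for j in range(J):                                 # graded response likelihood
        pm.OrderedLogistic(f"y{j}", eta=a[j] * theta,
                           cutpoints=cut[j], observed=y[:, j])
    idata = pm.sample(500, tune=500, chains=2)
```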

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES: Smoking is the most prevalent modifiable risk factor for cardiovascular diseases among HIV-positive persons. We assessed the effect on smoking cessation of training HIV care physicians in counselling. METHODS: The Swiss HIV Cohort Study (SHCS) is a multicentre prospective observational database. Our single-centre intervention at the Zurich centre included a half day of standardized training for physicians in counselling and in the pharmacotherapy of smokers, and a physicians' checklist for semi-annual documentation of their counselling. Smoking status was then compared between participants at the Zurich centre and other institutions. We used marginal logistic regression models with exchangeable correlation structure and robust standard errors to estimate the odds of smoking cessation and relapse. RESULTS: Between April 2000 and December 2010, 11 056 SHCS participants had 121 238 semi-annual visits and 64 118 person-years of follow-up. The prevalence of smoking decreased from 60 to 43%. During the intervention at the Zurich centre from November 2007 to December 2009, 1689 participants in this centre had 6068 cohort visits. These participants were more likely to stop smoking [odds ratio (OR) 1.23; 95% confidence interval (CI) 1.07-1.42; P = 0.004] and had fewer relapses (OR 0.75; 95% CI 0.61-0.92; P = 0.007) than participants at other SHCS institutions. The effect of the intervention was stronger than the calendar time effect (OR 1.19 vs. 1.04 per year, respectively). Middle-aged participants, injecting drug users, and participants with psychiatric problems or with higher alcohol consumption were less likely to stop smoking, whereas persons with a prior cardiovascular event were more likely to stop smoking. CONCLUSIONS: An institution-wide training programme for HIV care physicians in smoking cessation counselling led to increased smoking cessation and fewer relapses.
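
The marginal model described here is a GEE logistic regression with an exchangeable working correlation and robust (sandwich) standard errors. Below is a minimal sketch using statsmodels on toy longitudinal data; the column names and placeholder outcome are illustrative, not the SHCS data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy longitudinal data: semi-annual smoking status per participant.
rng = np.random.default_rng(2)
n_id, n_visits = 200, 6
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_id), n_visits),
    "intervention": np.repeat(rng.integers(0, 2, n_id), n_visits),
    "year": np.tile(np.arange(n_visits) / 2.0, n_id),
})
df["quit"] = rng.binomial(1, 0.3, len(df))  # placeholder outcome

model = sm.GEE.from_formula(
    "quit ~ intervention + year",
    groups="id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),  # exchangeable working correlation
)
result = model.fit()            # robust (sandwich) standard errors by default
print(np.exp(result.params))    # odds ratios, as reported in the abstract
```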

Relevance:

80.00%

Publisher:

Abstract:

High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but mainly increases the accuracy with which those processes are represented. This is notably the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space because electrodes are correlated due to volume conduction, and in time because neighboring time points are correlated), while the degrees of freedom of the data change only slightly. This has to be taken into account when statistical inferences are to be made from the data. However, in many ERP studies the intrinsic correlation structure of the data has been disregarded. Often, some electrodes or groups of electrodes are selected a priori as the analysis entity and treated as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented by the resulting statistics. In addition, the assumptions made (e.g., about what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural “atomic” analysis entity of EEG and ERP data is the scalp electric field.

Relevance:

80.00%

Publisher:

Abstract:

This review of late-Holocene palaeoclimatology represents the results from a PAGES/CLIVAR Intersection Panel meeting that took place in June 2006. The review is in three parts: the principal high-resolution proxy disciplines (trees, corals, ice cores and documentary evidence), emphasizing current issues in their use for climate reconstruction; the various approaches that have been adopted to combine multiple climate proxy records to provide estimates of past annual-to-decadal timescale Northern Hemisphere surface temperatures and other climate variables, such as large-scale circulation indices; and the forcing histories used in climate model simulations of the past millennium. We discuss the need to develop a framework through which current and new approaches to interpreting these proxy data may be rigorously assessed using pseudo-proxies derived from climate model runs, where the 'answer' is known. The article concludes with a list of recommendations. First, more raw proxy data are required from the diverse disciplines and from more locations, as well as replication of the basic raw measurements for all proxy sources, to improve absolute dating and to better distinguish the proxy climate signal from noise. Second, more effort is required to improve the understanding of what individual proxies respond to, supported by more site measurements and process studies. These activities should also be mindful of the correlation structure of instrumental data, indicating which adjacent proxy records ought to agree and which should not. Third, large-scale climate reconstructions should be attempted using a wide variety of techniques, emphasizing those for which quantified errors can be estimated at specified timescales. Fourth, a greater use of climate model simulations is needed to guide the choice of reconstruction techniques (the pseudo-proxy concept) and possibly help determine where, given limited resources, future sampling should be concentrated.

Relevance:

80.00%

Publisher:

Abstract:

Field soils show rather different spreading behavior at different water saturations, frequently caused by layering of the soil material. We performed tracer experiments in a laboratory sand tank; such experiments complement field investigations and aid their interpretation. Using image analysis, we estimated the first two moments of small plumes traveling through a two-dimensional heterogeneous medium with a strongly anisotropic correlation structure. Three steady-state flow regimes were analyzed. Two main conclusions were drawn. First, low saturation led to very large heterogeneity and to strong preferential flow; describing the flow paths and predicting the solute arrival times therefore require more accurate knowledge of the topological structure. Second, saturation-dependent macroscopic anisotropy is an essential element of transport in unsaturated media. For this reason, small structural soil features should be properly upscaled to give appropriate effective soil parameters for use in transport models.
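
The plume moments referred to here are direct sums over the concentration field: the first moment gives the centre of mass, the second central moment the spatial spread along each axis. A short numpy sketch follows, with a synthetic Gaussian plume standing in for the calibrated tank images.

```python
import numpy as np

def plume_moments(img, x, z):
    """First two spatial moments of a plume from a concentration image.

    img: 2D array of concentrations (e.g. from dye-tracer image analysis),
    x, z: 1D coordinate vectors for the columns and rows of img.
    Returns the centre of mass and the second central moments.
    """
    X, Z = np.meshgrid(x, z)
    m0 = img.sum()                            # zeroth moment: total mass
    xc = (X * img).sum() / m0                 # first moments: centre of mass
    zc = (Z * img).sum() / m0
    sxx = ((X - xc) ** 2 * img).sum() / m0    # second central moments
    szz = ((Z - zc) ** 2 * img).sum() / m0
    return (xc, zc), (sxx, szz)

# toy Gaussian plume on a 1 m x 0.5 m grid
x = np.linspace(0, 1.0, 200)
z = np.linspace(0, 0.5, 100)
X, Z = np.meshgrid(x, z)
img = np.exp(-((X - 0.4) ** 2 / 0.01 + (Z - 0.2) ** 2 / 0.002))
print(plume_moments(img, x, z))
```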

Relevance:

80.00%

Publisher:

Abstract:

Thigmomorphogenesis, the characteristic set of phenotypic changes by which plants react to mechanical stress, is a widespread and probably adaptive type of phenotypic plasticity. However, little is known about its genetic basis and population variation. Here, we examine genetic variation for thigmomorphogenesis within and among natural populations of the model system Arabidopsis thaliana. Offspring from 17 field-collected European populations were subjected to three levels of mechanical stress exerted by wind. Overall, plants were remarkably tolerant of mechanical stress; even high wind speed did not significantly alter the correlation structure among phenotypic traits. However, wind significantly affected plant growth and phenology, and there was genetic variation for some aspects of plasticity to wind among A. thaliana populations. Our most interesting finding was that phenotypic traits were organized into three distinct and largely statistically independent covariance modules associated with plant size, phenology, and growth form, respectively. These phenotypic modules differed in their responsiveness to wind, in the degree of genetic variability for plasticity, and in the extent to which plasticity affected fitness. It is likely, therefore, that thigmomorphogenesis in this species evolves quasi-independently in different phenotypic modules.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this study is to investigate the effects of predictor-variable correlations and patterns of missingness, with dichotomous and/or continuous data in small samples, when missing data are multiply imputed. Missing predictor data are multiply imputed under three different multivariate models: the multivariate normal model for continuous data, the multinomial model for dichotomous data, and the general location model for mixed dichotomous and continuous data. Following the multiple imputation process, Type I error rates of the regression coefficients obtained with logistic regression analysis are estimated under various conditions of correlation structure, sample size, type of data, and pattern of missing data. The distributional properties of the average means, variances, and correlations among the predictor variables are assessed after multiple imputation.

For continuous predictor data under the multivariate normal model, Type I error rates are generally within the nominal values with samples of size n = 100. Smaller samples of size n = 50 produced more conservative estimates (i.e., lower than the nominal value). Correlation and variance estimates of the original data are retained after multiple imputation with less than 50% missing continuous predictor data. For dichotomous predictor data under the multinomial model, Type I error rates are generally conservative, in part because of the sparseness of the data. The correlation structure of the predictor variables is not well retained in multiply imputed data from small samples with more than 50% missing data under this model. For mixed continuous and dichotomous predictor data, the results are similar to those found under the multivariate normal model for continuous data and under the multinomial model for dichotomous data. With all data types, including a fully observed variable alongside the variables subject to missingness in the multiple imputation process and subsequent statistical analysis produced liberal (larger than nominal) Type I error rates under a specific pattern of missing data. It is suggested that future studies focus on the effects of multiple imputation in multivariate settings with more realistic data characteristics and a variety of multivariate analyses, assessing both Type I error and power.
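
For orientation, a chained-equations multiple imputation followed by a pooled logistic regression can be sketched with statsmodels' MICE facilities; this generic sketch does not implement the general location model studied here, and the data-generating choices (a null predictor effect, roughly 30% missingness) are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

# Toy data: two correlated continuous predictors, one with ~30% of values
# missing, and a binary outcome with a true null effect (Type I error setting).
rng = np.random.default_rng(3)
n = 100
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(scale=0.9, size=n)   # correlated predictors
y = rng.binomial(1, np.full(n, 0.5))            # outcome unrelated to x1, x2
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})
df.loc[rng.random(n) < 0.3, "x2"] = np.nan

imp = mice.MICEData(df)                         # chained-equations imputer
fit = mice.MICE("y ~ x1 + x2", sm.GLM, imp,
                init_kwds={"family": sm.families.Binomial()}).fit(10, 20)
print(fit.summary())                            # Rubin-combined estimates
```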

Relevance:

80.00%

Publisher:

Abstract:

Generalized linear Poisson and logistic regression models were used to examine the relationship between temperature, precipitation, and cases of Saint Louis encephalitis virus in the Houston metropolitan area. The models were investigated with and without repeated measures, with a first-order autoregressive (AR1) correlation structure used for the repeated-measures model. The two Poisson regression models, with and without the correlation structure, showed that a one-degree (Fahrenheit) increase in temperature multiplies the expected occurrence of the virus by 1.7, and a one-inch increase in precipitation multiplies it by 1.5. Logistic regression did not show these covariates to be significant predictors of encephalitis activity in Houston under either correlation structure; this discrepancy for the logistic model could be attributed to the small data set.

Keywords: Saint Louis Encephalitis; Generalized Linear Model; Poisson; Logistic; First Order Autoregressive; Temperature; Precipitation.
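
A minimal version of the repeated-measures Poisson model, a GEE with an AR(1) working correlation, might look as follows in statsmodels; the grouping by season, covariate names, and placeholder counts are assumptions for illustration, not the Houston surveillance data. The exponentiated coefficients are the rate ratios quoted above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy weekly surveillance data for several seasons ("groups"), with an
# AR(1) working correlation across the repeated weekly measurements.
rng = np.random.default_rng(4)
weeks, seasons = 30, 8
df = pd.DataFrame({
    "season": np.repeat(np.arange(seasons), weeks),
    "week": np.tile(np.arange(weeks), seasons),
    "temp_f": rng.normal(85, 5, weeks * seasons),
    "precip_in": rng.gamma(2.0, 0.5, weeks * seasons),
})
df["cases"] = rng.poisson(1.0, len(df))  # placeholder counts

model = sm.GEE.from_formula(
    "cases ~ temp_f + precip_in",
    groups="season",
    data=df,
    time=df["week"],                       # ordering for the AR(1) structure
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Autoregressive(),
)
result = model.fit()
print(np.exp(result.params))  # rate ratios per unit of each covariate
```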

Relevance:

80.00%

Publisher:

Abstract:

The infant mortality rate (IMR) is considered one of the most important indices of a country's well-being. Countries around the world and health organizations such as the World Health Organization are dedicating resources, knowledge, and energy to reducing infant mortality rates. The well-known Millennium Development Goal 4 (MDG 4), which aims to achieve a two-thirds reduction of the under-five mortality rate between 1990 and 2015, is an example of this commitment.

In this study our goal is to model the trends in IMR from the 1950s to the 2010s for selected countries. We would like to know how the IMR changes over time and how it differs across countries.

IMR data collected over time form a time series, and the repeated observations are not statistically independent, so modeling the trend of IMR must account for these correlations. We propose the generalized least squares method, in the general linear model setting, to deal with the variance-covariance structure of the data. To estimate the variance-covariance matrix we turn to time-series models, especially the autoregressive and moving average models. Furthermore, we compare the results from the general linear model with a correlation structure to those from the ordinary least squares method, which ignores the correlation structure, to check how much the estimates change.
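
The comparison described in the last sentence, generalized least squares under an autoregressive error structure versus ordinary least squares, can be sketched with statsmodels' GLSAR on a synthetic declining series; the trend, AR coefficient, and noise scale are invented for illustration, not actual IMR data.

```python
import numpy as np
import statsmodels.api as sm

# Toy IMR series: a declining trend with AR(1) errors, 1950-2010.
rng = np.random.default_rng(5)
years = np.arange(1950, 2011)
e = np.zeros(len(years))
for t in range(1, len(e)):                 # AR(1) disturbances, rho = 0.7
    e[t] = 0.7 * e[t - 1] + rng.normal(scale=2.0)
imr = 140.0 - 1.5 * (years - 1950) + e

X = sm.add_constant(years - 1950)
ols = sm.OLS(imr, X).fit()                 # ignores serial correlation
gls = sm.GLSAR(imr, X, rho=1).iterative_fit(maxiter=10)  # AR(1) errors

# Slopes are similar; the standard errors are what change.
print(ols.params, ols.bse)
print(gls.params, gls.bse)
```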

Relevance:

80.00%

Publisher:

Abstract:

Count data with excess zeros relative to a Poisson distribution are common in many biomedical applications. A popular approach to the analysis of such data is the zero-inflated Poisson (ZIP) regression model. Often, because of the hierarchical study design or the data collection procedure, zero-inflation and lack of independence occur simultaneously, which renders the standard ZIP model inadequate. To account for the preponderance of zero counts and the inherent correlation of observations, a class of multilevel ZIP regression models with random effects is presented. Model fitting is facilitated by an expectation-maximization algorithm, and variance components are estimated via residual maximum likelihood estimating equations. A score test for zero-inflation is also presented. The multilevel ZIP model is then generalized to cope with a more complex correlation structure. An application to the analysis of correlated count data from a longitudinal infant feeding study illustrates the usefulness of the approach.
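
A single-level ZIP regression, the starting point that this paper extends with random effects and EM fitting, can be fitted with statsmodels as below; the multilevel extension and the score test are not part of standard libraries, and the simulated counts are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Toy feeding-study-like counts with excess zeros.
rng = np.random.default_rng(6)
n = 500
x = rng.normal(size=n)
mu = np.exp(0.3 + 0.5 * x)                 # Poisson mean
zero = rng.random(n) < 0.4                 # structural zeros (40%)
y = np.where(zero, 0, rng.poisson(mu))

X = sm.add_constant(x)
# constant-only inflation part; maximum likelihood fit
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit()
print(zip_fit.summary())
```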

Relevance:

80.00%

Publisher:

Abstract:

A two-factor no-arbitrage model is used to provide a theoretical link between stock and bond market volatility. While this model suggests that short-term interest rate volatility may, at least in part, drive both stock and bond market volatility, the empirical evidence suggests that past bond market volatility affects both markets and feeds back into short-term yield volatility. The empirical modelling goes on to examine the (time-varying) correlation structure between volatility in the stock and bond markets and finds that the sign of this correlation has reversed over the last 20 years. This has important implications for portfolio selection in financial markets. © 2005 Elsevier B.V. All rights reserved.
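
The sign reversal in the stock-bond correlation can be illustrated with a simple rolling-window correlation; below is a pandas sketch on synthetic returns whose true correlation drifts from positive to negative. The window length and return scales are arbitrary choices, not taken from the paper.

```python
import numpy as np
import pandas as pd

# Toy daily stock and bond returns whose correlation drifts from
# positive to negative over the sample, as the review describes.
rng = np.random.default_rng(7)
n = 2500
rho = np.linspace(0.4, -0.4, n)                  # slowly changing correlation
z1, z2 = rng.normal(size=(2, n))
stock = 0.01 * z1
bond = 0.005 * (rho * z1 + np.sqrt(1 - rho**2) * z2)
rets = pd.DataFrame({"stock": stock, "bond": bond})

# a 250-day rolling correlation recovers the sign change
roll = rets["stock"].rolling(250).corr(rets["bond"])
print(roll.iloc[[300, 1250, 2400]])
```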