989 results for Minimum Variance Model


Relevance: 80.00%

Abstract:

Graduate Program in Education - IBRC

Relevance: 80.00%

Abstract:

To compare the effect of exposure to cyclosporine 0.05% on fibroblasts from primary and recurrent pterygium. Primary cultures of fibroblasts from primary and recurrent pterygium were grown to the third passage; cultures were then either exposed to cyclosporine 0.05% or left unexposed (control group), in triplicate. After 3, 6, 12, and 17 days of exposure, viable cells were counted with a hemocytometer. The results were analyzed using a non-parametric analysis of variance model for repeated measures with three factors. Fibroblast proliferation was significantly reduced in both the primary and the recurrent pterygium cultures exposed to cyclosporine compared with unexposed cultures (P < 0.05). Among cultures that received the drug, there was no significant difference in cell proliferation between primary and recurrent pterygium. Cyclosporine 0.05% is thus effective in inhibiting fibroblast proliferation in culture, in both primary and recurrent pterygium.

Relevance: 80.00%

Abstract:

In recent years, new precision experiments have become possible with the high luminosity accelerator facilities at MAMI and JLab, supplying physicists with precision data sets for different hadronic reactions in the intermediate energy region, such as pion photo- and electroproduction and real and virtual Compton scattering. By means of the low energy theorem (LET), the global properties of the nucleon (its mass, charge, and magnetic moment) can be separated from the effects of its internal structure, which are effectively described by polarizabilities. The polarizabilities quantify the deformation of the charge and magnetization densities inside the nucleon in an applied quasistatic electromagnetic field. The present work is dedicated to developing a tool for the extraction of the polarizabilities from these precise Compton data with minimum model dependence, making use of the detailed knowledge of pion photoproduction by means of dispersion relations (DR). Due to the presence of t-channel poles, the dispersion integrals for two of the six Compton amplitudes diverge. Therefore, we have suggested subtracting the s-channel dispersion integrals at zero photon energy ($\nu=0$). The subtraction functions at $\nu=0$ are calculated through DR in the momentum transfer $t$ at fixed $\nu=0$, subtracted at $t=0$. For this calculation, we use the information about the t-channel process, $\gamma\gamma \to \pi\pi \to N\bar{N}$. In this way, four of the polarizabilities can be predicted using the unsubtracted DR in the s-channel.
The other two, $\alpha-\beta$ and $\gamma_\pi$, are free parameters in our formalism and can be obtained from a fit to the Compton data. We present the results for unpolarized and polarized RCS observables in the kinematics of the most recent experiments, and indicate an enhanced sensitivity to the nucleon polarizabilities in the energy range between the pion production threshold and the $\Delta(1232)$ resonance. Furthermore, we extend the DR formalism to virtual Compton scattering (VCS, radiative electron scattering off the nucleon), in which the concept of the polarizabilities is generalized to the case of a virtual initial photon by introducing six generalized polarizabilities (GPs). Our formalism provides predictions for the four spin GPs, while the two scalar GPs, $\alpha(Q^2)$ and $\beta(Q^2)$, have to be fitted to the experimental data at each value of $Q^2$. We show that at energies between the pion threshold and the $\Delta(1232)$ resonance position, the sensitivity to the GPs can be increased significantly compared to low energies, where the low energy expansion (LEX) is applicable. Our DR formalism can be used for analysing VCS experiments over a wide range of energy and virtuality $Q^2$, which allows one to extract the GPs from VCS data in different kinematics with a minimum of model dependence.
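
The subtracted dispersion relations described above can be written schematically (a sketch following the standard fixed-$t$ dispersion formalism for Compton amplitudes; $\nu_{\mathrm{thr}}$ denotes the pion production threshold):

```latex
\mathrm{Re}\, A_i(\nu, t) \;=\; A_i^{\mathrm{Born}}(\nu, t)
  + \left[ A_i(0, t) - A_i^{\mathrm{Born}}(0, t) \right]
  + \frac{2}{\pi}\, \nu^2 \, \mathcal{P}\!\int_{\nu_{\mathrm{thr}}}^{\infty}
      d\nu' \, \frac{\mathrm{Im}_s A_i(\nu', t)}{\nu' \left( \nu'^2 - \nu^2 \right)},
\qquad i = 1, \dots, 6 .
```

The subtraction functions $A_i(0, t)$ are in turn evaluated through dispersion relations in $t$ at fixed $\nu=0$, subtracted at $t=0$, with the t-channel input $\gamma\gamma \to \pi\pi \to N\bar{N}$ mentioned above.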

Relevance: 80.00%

Abstract:

Infrared stimulated luminescence (IRSL) and post-IR IRSL are applied to small aliquots and single grains to determine the equivalent dose (De) of eleven alluvial and fluvial sediment samples collected in the Pativilca valley, Central Peru, at ca. 10°S latitude. Small aliquot De distributions are rather symmetric and display over-dispersion values between 15 and 46%. Small aliquot g-values range between 4 and 8% per decade for the IRSL signal and 1 and 2% per decade for the post-IR IRSL signal. The single grain De distributions are highly over-dispersed, with some of them skewed to higher doses, implying partial bleaching; this is especially true for the post-IR IRSL. Measurements of a modern analog reveal that residuals due to partial bleaching are present in both the IRSL and the post-IR IRSL signal. The g-values of individual grains exhibit a wide range with high individual uncertainties and might contribute significantly to the spread of the single grain De values, at least for the IRSL data. Electron microprobe analysis performed on single grains reveals that varying K-content can be excluded as the origin of over-dispersion. Final ages for the different approaches are calculated using the Central Age Model and the Minimum Age Model (MAM). The samples are grouped into well-bleached, potentially well-bleached and partially bleached according to the evaluation of the single grain distributions and the agreement of age estimates between methods. The application of the MAM to the single grain data resulted in consistent age estimates for both the fading corrected IRSL and the post-IR IRSL ages, and suggests that both approaches are suitable for dating these samples.

Relevance: 80.00%

Abstract:

A Bayesian approach to estimating the intraclass correlation coefficient was used for this research project. The background of the intraclass correlation coefficient, a summary of its standard estimators, and a review of basic Bayesian terminology and methodology were presented. The conditional posterior density of the intraclass correlation coefficient was then derived, and estimation procedures related to this derivation were shown in detail. Three examples of applications of the conditional posterior density to specific data sets were also included. Two sets of simulation experiments were performed to compare the mean and mode of the conditional posterior density of the intraclass correlation coefficient to more traditional estimators. The non-Bayesian methods of estimation used were the methods of analysis of variance and maximum likelihood for balanced data, and the methods of MIVQUE (Minimum Variance Quadratic Unbiased Estimation) and maximum likelihood for unbalanced data. The overall conclusion of this research project was that Bayesian estimates of the intraclass correlation coefficient can be appropriate, useful and practical alternatives to traditional methods of estimation.
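
As a concrete illustration of the non-Bayesian baseline mentioned above, the ANOVA estimator of the intraclass correlation for balanced data can be sketched as follows (hypothetical numbers, not data from the study):

```python
import numpy as np

# Hypothetical balanced one-way layout: 4 groups, 5 observations each.
data = np.array([
    [9.1, 8.7, 9.4, 9.0, 8.9],
    [6.2, 6.8, 5.9, 6.5, 6.1],
    [7.7, 7.9, 8.1, 7.5, 7.8],
    [5.0, 5.4, 4.8, 5.2, 5.1],
])
n, k = data.shape                      # n groups, k observations per group

group_means = data.mean(axis=1)
grand_mean = data.mean()

# Between- and within-group mean squares from the ANOVA decomposition.
msb = k * np.sum((group_means - grand_mean) ** 2) / (n - 1)
msw = np.sum((data - group_means[:, None]) ** 2) / (n * (k - 1))

# ANOVA estimator of the intraclass correlation (balanced data).
icc = (msb - msw) / (msb + (k - 1) * msw)
```

With most of the spread between groups rather than within them, this estimate is close to 1.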

Relevance: 80.00%

Abstract:

The neurophysiological changes associated with Alzheimer's Disease (AD) and Mild Cognitive Impairment (MCI) include an increase in low frequency activity, as measured with electroencephalography or magnetoencephalography (MEG). A relevant property of spectral measures is the alpha peak, which corresponds to the dominant alpha rhythm. Here we studied the spatial distribution of MEG resting state alpha peak frequency and amplitude values in a sample of 27 MCI patients and 24 age-matched healthy controls. Power spectra were reconstructed in source space with a linearly constrained minimum variance (LCMV) beamformer. Then, 88 Regions of Interest (ROIs) were defined and an alpha peak per ROI and subject was identified. Statistical analyses were performed at every ROI, accounting for age, sex and educational level. Peak frequency was significantly decreased (p < 0.05) in MCIs in many posterior ROIs. The average peak frequency over all ROIs was 9.68 ± 0.71 Hz for controls and 9.05 ± 0.90 Hz for MCIs, and the average normalized amplitude was (2.57 ± 0.59)·10⁻² for controls and (2.70 ± 0.49)·10⁻² for MCIs. Age and gender were also found to play a role in the alpha peak, since its frequency was higher in females than in males in posterior ROIs and correlated negatively with age in frontal ROIs. Furthermore, we examined the dependence of peak parameters on hippocampal volume, which is a commonly used marker of early structural AD-related damage. Peak frequency was positively correlated with hippocampal volume in many posterior ROIs. Overall, these findings indicate a pathological alpha slowing in MCI.
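
The per-ROI alpha peak search described above can be sketched as follows (a minimal illustration on synthetic data, not the authors' pipeline; the beamformer source reconstruction step is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250.0                      # sampling rate in Hz (illustrative)
t = np.arange(0, 60, 1 / fs)    # 60 s of synthetic "resting state" signal
# Synthetic signal: a 9.5 Hz alpha rhythm buried in white noise.
signal = np.sin(2 * np.pi * 9.5 * t) + 0.5 * rng.standard_normal(t.size)

# Power spectrum via FFT (a Welch estimate would be typical in practice).
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2

# Restrict to an extended alpha band and locate the dominant peak,
# mirroring the per-ROI peak search described in the abstract.
band = (freqs >= 6) & (freqs <= 14)
peak_freq = freqs[band][np.argmax(power[band])]
peak_amp = power[band].max() / power.sum()   # normalized amplitude
```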

Relevance: 80.00%

Abstract:

A study which examines the use of aircraft as wind sensors in a terminal area for real-time wind estimation, in order to improve aircraft trajectory prediction, is presented in this paper. We describe not only the different sources in the aircraft systems that provide the variables needed to derive the wind velocity, but also the capabilities that allow this information to be presented for ATM applications. Based on wind speed samples from aircraft landing at Madrid-Barajas airport, a real-time wind field is estimated using a data processing approach based on a minimum variance method. Finally, the accuracy of this procedure is evaluated to determine whether the information is useful to Air Traffic Control.
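
A minimal sketch of a minimum variance combination of wind reports (illustrative values; the paper's actual data processing chain is more involved):

```python
import numpy as np

# Hypothetical wind-speed reports (m/s) derived from several aircraft,
# each with its own assumed measurement variance.
reports = np.array([12.3, 11.8, 12.9, 12.1])
variances = np.array([0.8, 0.5, 1.2, 0.6])

# Minimum variance (inverse-variance weighted) estimate: among unbiased
# linear combinations, weights proportional to 1/sigma^2 minimize the
# variance of the combined estimate.
weights = (1 / variances) / np.sum(1 / variances)
wind_estimate = np.dot(weights, reports)
estimate_variance = 1 / np.sum(1 / variances)
```

The combined variance is always smaller than that of the best single report, which is the point of fusing many aircraft measurements.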

Relevance: 80.00%

Abstract:

This work addresses the asset allocation (portfolio analysis) problem from a Bayesian perspective. To this end, the theory of the classical mean-variance model was reviewed and its shortcomings, which compromise its effectiveness in real cases, were identified. Curiously, its greatest deficiency is related not to the model itself but to its inputs, in particular the expected return computed from historical data. To overcome this deficiency, the Bayesian approach (the Black-Litterman model) treats the expected return as a random variable, then builds a prior distribution (based on the CAPM) and a likelihood (based on the investor's market views), and finally applies Bayes' theorem to obtain the posterior distribution. The new expected return that emerges from the posterior distribution replaces the earlier estimate computed from historical data. The results obtained show that the Bayesian model produces conservative and intuitive results relative to the classical mean-variance model.
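
The prior/view combination described above can be sketched with the standard Black-Litterman posterior mean (a toy two-asset example; every number here is an illustrative assumption):

```python
import numpy as np

# Toy two-asset example with made-up inputs.
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])          # return covariance
w_mkt = np.array([0.6, 0.4])              # market-cap weights
delta, tau = 2.5, 0.05                    # risk aversion, prior scaling

# CAPM-implied equilibrium returns serve as the prior mean.
pi = delta * Sigma @ w_mkt

# One investor view: asset 0 will return 6%, with uncertainty Omega.
P = np.array([[1.0, 0.0]])
q = np.array([0.06])
Omega = np.array([[0.02 ** 2]])

# Bayes' theorem: the posterior mean is a precision-weighted
# combination of the equilibrium prior and the view.
A = np.linalg.inv(tau * Sigma)
B = P.T @ np.linalg.inv(Omega) @ P
mu_bl = np.linalg.solve(A + B, A @ pi + P.T @ np.linalg.inv(Omega) @ q)
```

Here the prior implies a 7% return for asset 0, the view says 6%, and the posterior lands between the two, pulled toward the view in proportion to its precision.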

Relevance: 80.00%

Abstract:

During 1999 and 2000 a large number of articles appeared in the financial press arguing that the concentration of the FTSE 100 had increased. Many of these reports suggested that stock market volatility in the UK had risen because the concentration of its stock markets had increased. This study undertakes a comprehensive measurement of stock market concentration using the FTSE 100 index. We find that during 1999, 2000 and 2001 stock market concentration was noticeably higher than at any other time since the index was introduced. When we measure the volatility of the FTSE 100 index we do not find an association between concentration and volatility. When we examine the variances and covariances of the FTSE 100 constituents we find that security volatility appears to be positively related to concentration changes, but concentration and the size of security covariances appear to be negatively related. We simulate the variance of four versions of the FTSE 100 index; in each version the weighting structure reflects either an equally weighted index or one with a low, intermediate or high level of concentration. We find that moving from low to high concentration has very little impact on the volatility of the index. To complete the study we estimate the minimum variance portfolio for the FTSE 100 and compare the concentration levels of this index to those formed on the basis of market weighting. We find that concentration under realised FTSE index weightings is higher than for the minimum variance index.
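
The minimum variance portfolio used in the final comparison can be sketched as follows (an illustrative three-asset example, not FTSE data):

```python
import numpy as np

# Illustrative covariance matrix for three index constituents.
Sigma = np.array([[0.040, 0.006, 0.004],
                  [0.006, 0.090, 0.010],
                  [0.004, 0.010, 0.160]])

# Fully invested minimum variance portfolio (short sales allowed):
# w = Sigma^-1 1 / (1' Sigma^-1 1), the solution of
# min w' Sigma w subject to sum(w) = 1.
ones = np.ones(Sigma.shape[0])
w = np.linalg.solve(Sigma, ones)
w /= w.sum()
port_var = w @ Sigma @ w
```

By construction this portfolio's variance is no larger than that of any single constituent, which is why its weights tend to be far less concentrated than market-cap weights.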

Relevance: 80.00%

Abstract:

Objective: This study aimed to explore methods of assessing interactions between neuronal sources using MEG beamformers. However, beamformer methodology is based on the assumption of no linear long-term source interdependencies [Van Veen BD, van Drongelen W, Yuchtman M, Suzuki A. Localization of brain electrical activity via linearly constrained minimum variance spatial filtering. IEEE Trans Biomed Eng 1997;44:867-80; Robinson SE, Vrba J. Functional neuroimaging by synthetic aperture magnetometry (SAM). In: Recent advances in Biomagnetism. Sendai: Tohoku University Press; 1999. p. 302-5]. Although such long-term correlations are not efficient and should not be anticipated in a healthy brain [Friston KJ. The labile brain. I. Neuronal transients and nonlinear coupling. Philos Trans R Soc Lond B Biol Sci 2000;355:215-36], transient correlations seem to underlie functional cortical coordination [Singer W. Neuronal synchrony: a versatile code for the definition of relations? Neuron 1999;49-65; Rodriguez E, George N, Lachaux J, Martinerie J, Renault B, Varela F. Perception's shadow: long-distance synchronization of human brain activity. Nature 1999;397:430-3; Bressler SL, Kelso J. Cortical coordination dynamics and cognition. Trends Cogn Sci 2001;5:26-36]. Methods: Two periodic sources were simulated and the effects of transient source correlation on the spatial and temporal performance of the MEG beamformer were examined. Subsequently, the interdependencies of the reconstructed sources were investigated using coherence and phase synchronization analysis based on Mutual Information. Finally, two interacting nonlinear systems served as neuronal sources and their phase interdependencies were studied under realistic measurement conditions. Results: Both the spatial and the temporal beamformer source reconstructions were accurate as long as the transient source correlation did not exceed 30-40 percent of the duration of beamformer analysis.
In addition, the interdependencies of periodic sources were preserved by the beamformer and phase synchronization of interacting nonlinear sources could be detected. Conclusions: MEG beamformer methods in conjunction with analysis of source interdependencies could provide accurate spatial and temporal descriptions of interactions between linear and nonlinear neuronal sources. Significance: The proposed methods can be used for the study of interactions between neuronal sources. © 2005 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
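
The LCMV beamformer weights underlying these reconstructions can be sketched as follows (a scalar-leadfield toy example with synthetic data; real MEG analyses use vector leadfields and regularized covariance estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_samples = 32, 2000

# Hypothetical forward model: leadfield of a target source, and sensor
# data made of that source plus noise (a stand-in for real MEG data).
leadfield = rng.standard_normal(n_sensors)
source = np.sin(2 * np.pi * 10 * np.arange(n_samples) / 500)
data = (np.outer(leadfield, source)
        + 0.5 * rng.standard_normal((n_sensors, n_samples)))

# LCMV weights (Van Veen et al., 1997): minimize output power subject
# to unit gain at the target location: w = C^-1 l / (l' C^-1 l).
C = np.cov(data)
Cinv_l = np.linalg.solve(C, leadfield)
w = Cinv_l / (leadfield @ Cinv_l)

# Reconstructed source time course.
source_est = w @ data
```

The unit-gain constraint passes the target source through while the minimum-power criterion suppresses activity from elsewhere, so the reconstructed time course closely tracks the simulated source.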

Relevance: 80.00%

Abstract:

The consumption of energy on the planet is currently based on fossil fuels, which are responsible for adverse effects on the environment. Renewables offer solutions to this scenario, but must face issues related to the capacity of the power supply. Offshore wind energy is emerging as a promising alternative: wind speed and stability are greater over the oceans, but wind variability may cause inconvenient fluctuations in electric power generation. To reduce this, a combination of geographically distributed wind farms has been proposed. The greater the distance between them, the lower the correlation between their wind velocities, increasing the likelihood that together they form a more stable power system with smaller fluctuations in power generation. The efficient use of the production capacity of a wind park, however, depends on its distribution in marine environments. The objective of this research was to analyze the optimal allocation of offshore wind farms on the east coast of the U.S. using Modern Portfolio Theory. Modern Portfolio Theory was used so that the process of building offshore wind energy portfolios accounts for the intermittency of the wind, through calculations of the return and risk of the wind farms' production. The research was conducted with 25,934 observations of energy produced by 11 hypothetical offshore wind farms, each based on the simulation of one ocean turbine with a capacity of 5 MW. The data have hourly time resolution and cover the period between January 1, 1998 and December 31, 2002. Using the Matlab software, six minimum variance portfolios were calculated, each for a distinct period of time. Given the variability of the wind over time, four rebalancing strategies were set up to evaluate the performance of the related portfolios, which made it possible to identify the strategy most beneficial to the stability of offshore wind energy production.
The results showed that the production of wind energy for 1998, 1999, 2000 and 2001 should be weighted by the portfolio weights calculated for the same periods, respectively. Energy data for 2002 should use the weights derived from the portfolio calculated for the previous period. Finally, the production of wind energy over the whole 1998-2002 period should be equally weighted by 1/11. It follows, therefore, that the portfolios found did not show reduced levels of variability when compared to the individual production of the hypothetical offshore wind farms.
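
The diversification effect that motivates combining distant farms can be illustrated with an equally weighted (1/11) portfolio under a common pairwise correlation (a stylized assumption; the study estimates correlations from data):

```python
import numpy as np

# Equally weighted (1/11) portfolio of 11 farms, each with the same
# normalized production variance, under two assumed correlations.
n = 11
w = np.full(n, 1 / n)
sigma2 = 1.0                      # per-farm variance (normalized)

def portfolio_variance(rho):
    # Covariance matrix with common pairwise correlation rho.
    cov = sigma2 * (rho * np.ones((n, n)) + (1 - rho) * np.eye(n))
    return w @ cov @ w

var_high = portfolio_variance(0.8)   # nearby farms: correlated winds
var_low = portfolio_variance(0.2)    # distant farms: weak correlation
```

For equal weights the portfolio variance reduces to sigma2 * (1/n + (1 - 1/n) * rho), so lowering the correlation between farms directly lowers the variance of the combined production.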

Relevance: 80.00%

Abstract:

This work predicts the volatility of the daily return of the sugar price over the period from June 1, 2011 to October 24, 2013. The daily data used were the prices of sugar and ethanol and the exchange rate of the Brazilian currency (the real) against the dollar. Multivariate generalized autoregressive conditional heteroskedasticity (GARCH) models were used. From the prediction of sugar prices, the minimum variance hedge ratio is calculated. The results show that the hedge ratio is 0.37; this means that if a risk-averse producer who intends to eliminate a percentage of the volatility of the daily return of the world sugar market expects to sell 25 sugar contracts, each of 50.84 tonnes (1,271 tonnes in total), the optimal number of contracts hedged with futures will be 9 and the number of unhedged (spot) contracts will be 16.
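
The hedge calculation in the abstract can be reproduced as follows (the second moments below are hypothetical values chosen only to give the reported ratio of 0.37):

```python
# Minimum variance hedge ratio: h = Cov(dS, dF) / Var(dF), where dS and
# dF are the changes in the spot and futures prices.
cov_sf = 0.000074      # Cov(spot, futures) -- hypothetical value
var_f = 0.000200       # Var(futures)       -- hypothetical value
h = cov_sf / var_f     # = 0.37, the hedge ratio reported above

# With an expected sale of 25 contracts' worth of sugar, the optimal
# futures position is h * 25 = 9.25, i.e. about 9 contracts hedged
# and the remaining 16 left unhedged (spot).
n_total = 25
n_hedged = round(h * n_total)
n_unhedged = n_total - n_hedged
```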