967 results for Minimum Variance Model


Relevance:

30.00%

Publisher:

Abstract:

This article develops a weighted least squares version of Levene's test of homogeneity of variance for a general design, available for both univariate and multivariate situations. When the design is balanced, the univariate and two common multivariate test statistics turn out to be proportional to the corresponding ordinary least squares test statistics obtained from an analysis of variance of the absolute values of the standardized mean-based residuals from the original analysis of the data. The constant of proportionality is simply a design-dependent multiplier (which does not necessarily tend to unity). Explicit results are presented for randomized block and Latin square designs and are illustrated for factorial treatment designs and split-plot experiments. The distribution of the univariate test statistic is close to a standard F-distribution, although it can be slightly underdispersed. For a complex design, the test assesses homogeneity of variance across blocks, treatments, or treatment factors and offers an objective interpretation of the residual plot.
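As a point of reference for the mean-based residual construction described above, the sketch below shows the ordinary least squares, one-way special case: Levene's test computed as a one-way ANOVA of the absolute deviations from the group means. The data are invented, and this is a minimal illustration rather than the article's weighted least squares generalization.

# Minimal sketch: classical mean-based Levene test as a one-way ANOVA of the
# absolute residuals from the group means (illustrative data only).
import numpy as np
from scipy import stats

groups = [
    np.array([4.1, 5.2, 3.9, 4.8, 5.0]),
    np.array([6.3, 7.1, 5.8, 6.9, 7.4]),
    np.array([5.5, 5.9, 6.2, 5.1, 5.7]),
]

# Absolute values of mean-based residuals within each group
abs_resid = [np.abs(g - g.mean()) for g in groups]

# One-way ANOVA of the absolute residuals: the F statistic is the mean-based
# Levene statistic for homogeneity of variance across the groups.
F, p = stats.f_oneway(*abs_resid)
print(f"Levene-type F = {F:.3f}, p = {p:.3f}")

# Cross-check against SciPy's built-in implementation with mean centering
W, p_scipy = stats.levene(*groups, center='mean')
print(f"scipy.stats.levene: W = {W:.3f}, p = {p_scipy:.3f}")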

Relevance:

30.00%

Publisher:

Abstract:

Chest clapping, vibration, and shaking were studied in 10 physiotherapists who applied these techniques on an anesthetized animal model. Hemodynamic variables (such as heart rate, blood pressure, pulmonary artery pressure, and right atrial pressure) were measured during the application of these techniques to verify claims of adverse events. In addition, expired tidal volume and peak expiratory flow rate were measured to ascertain the effects of these techniques. Physiotherapists in this study applied chest clapping at a rate of 6.2 +/- 0.9 Hz, vibration at 10.5 +/- 2.3 Hz, and shaking at 6.2 +/- 2.3 Hz. With the use of these rates, esophageal pressure swings of 8.8 +/- 5.0, 0.7 +/- 0.3, and 1.4 +/- 0.7 mmHg resulted from clapping, vibration, and shaking, respectively. Variability in the rates and forces generated by these techniques was related to the physiotherapists' characteristics, which accounted for 80% of the variance in shaking force (P = 0.003). Application of these techniques by physiotherapists was found to have no significant effects on hemodynamic and most ventilatory variables in this study. From this study, we conclude that chest clapping, vibration, and shaking 1) can be consistently performed by physiotherapists; 2) are significantly related to physiotherapists' characteristics, particularly clinical experience; and 3) caused no significant hemodynamic effects.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new model, based on thermodynamics and molecular interactions between molecules, to describe the vapour-liquid phase equilibria and surface tension of pure components. The model assumes that the bulk fluid can be characterised as a set of parallel layers; because of this molecular structure, we term the model the molecular layer structure theory (MLST). Each layer has two energetic components. One is the interaction energy of one molecule of that layer with all surrounding layers. The other is the intra-layer Helmholtz free energy, which accounts for the internal energy and the entropy of that layer. The equilibrium between two separating phases is derived from the minimum of the grand potential, and the surface tension is calculated as the excess of the Helmholtz energy of the system. We test this model with a number of components (argon, krypton, ethane, n-butane, iso-butane, ethylene and sulphur hexafluoride), and the results are very satisfactory.
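Written out schematically, the two conditions mentioned above take the following form. The notation (Ω for the grand potential, F for the Helmholtz free energy, μ for the chemical potential, N_j and ρ_j for the number of molecules and density in layer j, A for the interfacial area) is assumed here for illustration rather than taken from the paper.

% Phase equilibrium: the layered density profile minimises the grand potential
\Omega[\{\rho_j\}] = F[\{\rho_j\}] - \mu \sum_j N_j , \qquad
\frac{\partial \Omega}{\partial \rho_j} = 0 \quad \text{for every layer } j .

% Surface tension: excess Helmholtz free energy of the inhomogeneous system,
% relative to the coexisting bulk liquid and vapour, per unit interfacial area
\gamma = \frac{1}{A}\Big( F - F^{\mathrm{bulk}}_{\mathrm{liq}} - F^{\mathrm{bulk}}_{\mathrm{vap}} \Big) .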

Relevance:

30.00%

Publisher:

Abstract:

The growth experienced in recent years in both the variety and volume of structured products implies that banks and other financial institutions have become increasingly exposed to model risk. In this article we focus on the model risk associated with the local volatility (LV) model and with the Variance Gamma (VG) model. The results show that the LV model performs better than the VG model in terms of its ability to match the market prices of European options. Nevertheless, both models are subject to significant pricing errors when compared with the stochastic volatility framework.
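For concreteness, the sketch below shows one standard way to simulate the Variance Gamma model mentioned above: Brownian motion with drift theta and volatility sigma evaluated on a gamma time change with variance rate nu, used to price a European call by Monte Carlo. The parameter values and the risk-neutral drift treatment are illustrative assumptions, not the article's calibration.

# Monte Carlo pricing of a European call under a Variance Gamma model (sketch).
import numpy as np

rng = np.random.default_rng(0)

S0, K, r, T = 100.0, 100.0, 0.02, 1.0          # spot, strike, rate, maturity
sigma, theta, nu = 0.2, -0.14, 0.3             # VG parameters (assumed)
n_paths = 200_000

# Martingale correction so that E[S_T] = S0 * exp(r*T)
omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu

# Gamma time change over [0, T]: shape T/nu, scale nu (mean T, variance nu*T)
G = rng.gamma(shape=T / nu, scale=nu, size=n_paths)

# VG increment: drifted Brownian motion evaluated at the gamma time
X = theta * G + sigma * np.sqrt(G) * rng.standard_normal(n_paths)

S_T = S0 * np.exp((r + omega) * T + X)
call_price = np.exp(-r * T) * np.mean(np.maximum(S_T - K, 0.0))
print(f"VG Monte Carlo call price: {call_price:.3f}")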

Relevance:

30.00%

Publisher:

Abstract:

Master's dissertation in Business Management (MBA), 16 July 2013, Universidade dos Açores.

Relevance:

30.00%

Publisher:

Abstract:

Research on cluster analysis for categorical data continues to develop, with new clustering algorithms being proposed. However, in this context, the determination of the number of clusters is rarely addressed. We propose a new approach in which clustering and the estimation of the number of clusters are performed simultaneously for categorical data. We assume that the data originate from a finite mixture of multinomial distributions and use a minimum message length (MML) criterion to select the number of clusters (Wallace and Bolton, 1986). For this purpose, we implement an EM-type algorithm (Silvestre et al., 2008) based on the approach of Figueiredo and Jain (2002). The novelty of the approach rests on the integration of the model estimation and the selection of the number of clusters in a single algorithm, rather than selecting this number from a set of pre-estimated candidate models. The performance of our approach is compared with the use of the Bayesian Information Criterion (BIC) (Schwarz, 1978) and the Integrated Completed Likelihood (ICL) (Biernacki et al., 2000) on synthetic data. The results illustrate the capacity of the proposed algorithm to attain the true number of clusters while outperforming BIC and ICL in speed, which is especially relevant when dealing with large data sets.
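The sketch below illustrates the general idea in simplified form: an EM algorithm for a finite mixture of categorical (multinomial) components that starts with too many components and annihilates those with insufficient support, in the spirit of the Figueiredo and Jain (2002) MML-based weight update. The data, the per-component parameter count, and all settings are illustrative assumptions, not the authors' implementation.

# EM for a mixture of categorical components with MML-style component annihilation (sketch).
import numpy as np

rng = np.random.default_rng(1)

# Synthetic categorical data: N samples, D features, each with C categories, 3 true clusters
N, D, C = 600, 4, 3
true_theta = rng.dirichlet(np.ones(C), size=(3, D))
z = rng.integers(0, 3, size=N)
X = np.stack([[rng.choice(C, p=true_theta[z[n], d]) for d in range(D)]
              for n in range(N)])

K = 10                                    # deliberately too many components
M = D * (C - 1)                           # free parameters per component (assumed count)
w = np.full(K, 1.0 / K)
theta = rng.dirichlet(np.ones(C), size=(K, D))

for it in range(200):
    # E-step: responsibilities r[n, k], computed in the log domain for stability
    log_p = np.zeros((N, K))
    for k in range(K):
        log_p[:, k] = np.log(w[k] + 1e-300) + \
            np.log(theta[k, np.arange(D), X] + 1e-300).sum(axis=1)
    log_p -= log_p.max(axis=1, keepdims=True)
    r = np.exp(log_p)
    r /= r.sum(axis=1, keepdims=True)

    # M-step with MML-style weight update: components whose effective sample
    # size falls below M/2 get weight zero and are annihilated.
    nk = r.sum(axis=0)
    w = np.maximum(nk - M / 2.0, 0.0)
    w /= w.sum()
    for k in range(K):
        if w[k] == 0.0:
            continue
        for d in range(D):
            counts = np.bincount(X[:, d], weights=r[:, k], minlength=C)
            theta[k, d] = counts / counts.sum()

print("estimated number of clusters:", int(np.sum(w > 0)))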

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa in fulfilment of the requirements for the degree of Master in Environmental Engineering.

Relevance:

30.00%

Publisher:

Abstract:

We consider a Bertrand duopoly model with unknown costs. Each firm aims to choose the price of its product according to the well-known concept of Bayesian Nash equilibrium, and the choices are made simultaneously by both firms. In this paper, we suppose that each firm has two different technologies and uses one of them according to a certain probability distribution. The use of one or the other technology affects the unitary production cost. We show that this game has exactly one Bayesian Nash equilibrium. We analyse the advantages, for firms and for consumers, of using the technology with the highest production cost versus the one with the cheapest production cost. We prove that the expected profit of each firm increases with the variance of its production costs. We also show that the expected price of each good increases with both expected production costs, with the effect of the rival's expected production costs dominated by the effect of the firm's own expected production costs.
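A small numerical sketch of this kind of equilibrium is given below. The abstract does not specify a demand system, so the linear demand q_i = a - b*p_i + d*p_j and all parameter values are assumptions made purely for illustration; with linear demand, each firm's best response is linear in the rival's expected price, so the expected prices solve a 2x2 linear system.

# Bayesian Nash equilibrium prices in a Bertrand duopoly with two privately known
# cost types per firm, under an assumed linear demand system (sketch).
import numpy as np

a, b, d = 10.0, 1.0, 0.5                 # assumed linear-demand parameters
costs = np.array([[1.0, 3.0],            # firm 1: cheap / expensive technology
                  [1.5, 2.5]])           # firm 2: cheap / expensive technology
probs = np.array([[0.6, 0.4],            # probability of using each technology
                  [0.5, 0.5]])
Ec = (costs * probs).sum(axis=1)         # expected unit costs

# Best response: p_i(c_i) = (a + d*E[p_j] + b*c_i) / (2b); taking expectations
# gives the linear system 2b*E[p_i] - d*E[p_j] = a + b*E[c_i].
A = np.array([[2 * b, -d],
              [-d, 2 * b]])
Ep = np.linalg.solve(A, a + b * Ec)

# Equilibrium pricing rules, one price per technology (cost type) of each firm
p = (a + d * Ep[::-1, None] + b * costs) / (2 * b)
print("expected prices:", np.round(Ep, 3))
print("type-contingent prices:\n", np.round(p, 3))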

Relevance:

30.00%

Publisher:

Abstract:

Crossed classification models are applied in many investigations, taking into consideration either the existence of interactions between all factors or, alternatively, excluding all interactions, in which case only the main effects and the error term are considered. In this work we use commutative Jordan algebras to study the algebraic structure of these designs and to obtain similar designs in which only some of the interactions are considered. We finish by presenting the expressions of the variance components estimators.

Relevance:

30.00%

Publisher:

Abstract:

Work presented in the context of the European Master in Computational Logics, as a partial requirement for graduation as Master in Computational Logics.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Mesenchymal stem/stromal cells (MSC) have unique properties favorable to their use in clinical practice and have been studied for cardiac repair. However, these cells are larger than coronary microvessels and there is controversy about the risk of embolization and microinfarctions, which could jeopardize the safety and efficacy of the intracoronary route for their delivery. The index of microcirculatory resistance (IMR) is an invasive method for quantitatively assessing the status of the coronary microcirculation. OBJECTIVES: To examine heart microcirculation after intracoronary injection of mesenchymal stem/stromal cells using the index of microcirculatory resistance. METHODS: Healthy swine were randomized to receive, by the intracoronary route, either 30×10^6 MSC or the same solution with no cells (1% human albumin/PBS; placebo). Blinded operators took coronary pressure and flow measurements prior to intracoronary infusion and at 5 and 30 minutes post-delivery. Coronary flow reserve (CFR) and the IMR were compared between groups. RESULTS: CFR and IMR were obtained with a variance within the 3 transit-time measurements of 6% at rest and 11% at maximal hyperemia. After intracoronary infusion there were no significant differences in CFR. The IMR was significantly higher in MSC-injected animals (at 30 minutes, 14.2 U vs. 8.8 U, p = 0.02) and intragroup analysis showed a significant increase of 112% from baseline to 30 minutes after cell infusion, although no electrocardiographic changes or clinical deterioration were noted. CONCLUSION: Overall, this study provides definitive evidence of microcirculatory disruption upon intracoronary administration of mesenchymal stem/stromal cells in a large animal model closely resembling human cardiac physiology, function and anatomy.
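As a quick reference for the two indices used above: thermodilution-derived CFR is the ratio of resting to hyperemic mean transit times, and IMR (in its non-corrected form) is the product of distal coronary pressure and mean transit time at maximal hyperemia. The sketch below computes both from three transit-time measurements per state; all numbers are made up and this is not the study's own analysis code.

# Thermodilution-derived CFR and IMR from illustrative measurements (sketch).
import statistics

tmn_rest = [0.62, 0.58, 0.66]        # transit times at rest (s), illustrative
tmn_hyper = [0.30, 0.34, 0.32]       # transit times at maximal hyperemia (s)
pd_hyper = 72.0                      # mean distal coronary pressure at hyperemia (mmHg)

mean_rest = statistics.mean(tmn_rest)
mean_hyper = statistics.mean(tmn_hyper)

cfr = mean_rest / mean_hyper         # coronary flow reserve
imr = pd_hyper * mean_hyper          # index of microcirculatory resistance (U)

# Within-state variability of the 3 transit-time measurements
cv_hyper = statistics.stdev(tmn_hyper) / mean_hyper * 100

print(f"CFR = {cfr:.2f}, IMR = {imr:.1f} U, hyperemic transit-time CV = {cv_hyper:.0f}%")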

Relevance:

30.00%

Publisher:

Abstract:

The future of health care delivery is becoming more citizen-centred, as today's user is more active, better informed and more demanding. The European Commission is promoting online health services and, therefore, member states will need to boost the deployment and use of online services. This makes e-health adoption an important field to be studied and understood. This study applied the extended unified theory of acceptance and use of technology (UTAUT2) to explain patients' individual adoption of e-health. An online questionnaire was administered in Portugal, using mostly the same instrument as UTAUT2 adapted to the e-health context. We collected 386 valid answers. Performance expectancy, effort expectancy, social influence, and habit had the most significant explanatory power over behavioural intention, and habit and behavioural intention over technology use. The model explained 52% of the variance in behavioural intention and 32% of the variance in technology use. Our research helps to understand the technology characteristics desired of e-health. By testing an information technology acceptance model, we are able to determine what is most valued by patients when deciding whether or not to adopt e-health systems.

Relevance:

30.00%

Publisher:

Abstract:

The suitability of a total-length-based minimum capture size and of different protection regimes was investigated for the gooseneck barnacle Pollicipes pollicipes shellfishery in northern Spain. For this analysis, individuals collected from 10 sites under different fishery protection regimes (permanently open, seasonally closed, and permanently closed) were used. First, we applied a non-parametric regression model to explore the relationship between the capitulum Rostro-Tergum (RT) size and the Total Length (TL). Important heteroskedastic disturbances were detected for this relationship, demonstrating a high variability of TL with respect to RT. This result substantiates, by means of a mathematical model, the unsuitability of a TL-based minimum size. Owing to these disturbances, an alternative growth-based minimum capture size of 26.3 mm RT (23 mm RC) was estimated using the first derivative of a kernel-based non-parametric regression model for the relationship between RT and dry weight. For this purpose, data from the permanently protected area were used to avoid bias due to the fishery. Second, the similarity of the size-frequency distributions across the studied sites was computed using an MDS analysis to evaluate the effectiveness of the protection regimes. The results of this analysis indicated a positive effect of permanent protection, while no effect of the seasonal closure was detected. This result needs to be interpreted with caution because the current harvesting, based on a potentially unsuitable minimum capture size, may dampen the efficacy of the seasonal protection regime.
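The curve-fitting step described above can be sketched as follows: a kernel (Nadaraya-Watson) regression of dry weight on capitulum RT size, followed by a numerical first derivative of the fitted curve. The synthetic growth data and the way the derivative is used here (simply inspected over a grid) are illustrative assumptions; the abstract does not give the exact rule applied to the derivative.

# Kernel regression of dry weight on RT and its first derivative (sketch).
import numpy as np
from statsmodels.nonparametric.kernel_regression import KernelReg

rng = np.random.default_rng(42)

# Synthetic barnacle data: RT size (mm) and dry weight (g) with multiplicative noise
rt = rng.uniform(5, 40, size=300)
weight = 25.0 / (1.0 + np.exp(-(rt - 22.0) / 4.0)) * np.exp(rng.normal(0, 0.15, rt.size))

# Kernel regression of dry weight on RT (one continuous regressor)
kr = KernelReg(endog=weight, exog=rt, var_type='c')
grid = np.linspace(6, 39, 200)
fitted, _ = kr.fit(grid[:, None])

# First derivative of the fitted growth curve along the RT grid
d_weight = np.gradient(fitted, grid)

print("RT at which the estimated weight gain per mm is largest:",
      round(grid[np.argmax(d_weight)], 1), "mm")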

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we attempt to give a theoretical underpinning to the well-established empirical stylized fact that asset returns in general, and spot FOREX returns in particular, display predictable volatility characteristics. Adopting Moore and Roche's habit-persistence version of the Lucas model, we find that both the innovation in the spot FOREX return and the FOREX return itself follow "ARCH"-style processes. Using impulse response functions (IRFs), we show that the baseline simulated FOREX series has "ARCH" properties at the quarterly frequency that match well the "ARCH" properties of the empirical monthly estimations, in that when we scale the x-axis to synchronize the monthly and quarterly responses we find similar impulse responses to a one-unit shock in variance. The IRFs for the ARCH processes we estimate "look the same", decaying in an approximately monotonic fashion. The Lucas two-country monetary model with habit can generate realistic conditional volatility in the spot FOREX return.
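A minimal sketch of the kind of object discussed above is given below: the impulse response of the conditional variance in an ARCH(1) model, h_t = omega + alpha * eps_{t-1}^2, to a one-unit shock in the squared innovation, which decays monotonically as alpha^k at horizon k. The alpha values are illustrative stand-ins for higher- and lower-frequency estimates, not the paper's numbers.

# ARCH(1) variance impulse responses at two illustrative persistence levels.
import numpy as np

horizons = np.arange(1, 13)

for label, alpha in [("monthly-style", 0.35), ("quarterly-style", 0.35**3)]:
    # For ARCH(1), the response of the conditional variance k periods after a
    # one-unit shock to the squared innovation is alpha**k.
    irf = alpha ** horizons
    print(label, np.round(irf[:6], 4))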

Relevance:

30.00%

Publisher:

Abstract:

There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time-varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state-space representations to model the evolution of parameters. In this paper, we show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows the cointegrating space to evolve over time in a manner comparable to the random-walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation.
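The building block referenced above can be sketched as a regression with time-varying coefficients that follow a random walk, written in state-space form and filtered with a Kalman filter. This is the generic TVP setup rather than the paper's time-varying cointegrating-space specification, and all settings below (noise variances treated as known, simulated data) are illustrative assumptions.

# Kalman filtering of a regression with random-walk time-varying coefficients (sketch).
import numpy as np

rng = np.random.default_rng(3)

T, k = 200, 2
X = np.column_stack([np.ones(T), rng.standard_normal(T)])        # regressors
beta_true = np.cumsum(rng.normal(0, 0.05, size=(T, k)), axis=0)  # random-walk coefficients
y = np.einsum('tk,tk->t', X, beta_true) + rng.normal(0, 0.5, size=T)

sigma_eps2, Q = 0.5**2, 0.05**2 * np.eye(k)   # measurement and state noise (assumed known)

beta = np.zeros(k)                  # filtered state E[beta_t | y_1..t]
P = 10.0 * np.eye(k)                # state covariance (diffuse-ish prior)
filtered = np.empty((T, k))

for t in range(T):
    # Prediction step: beta_t = beta_{t-1} + eta_t, eta_t ~ N(0, Q)
    P = P + Q
    # Update step with observation y_t = X_t' beta_t + eps_t
    x = X[t]
    f = x @ P @ x + sigma_eps2           # forecast variance of y_t
    K = P @ x / f                        # Kalman gain
    beta = beta + K * (y[t] - x @ beta)
    P = P - np.outer(K, x @ P)
    filtered[t] = beta

print("final filtered coefficients:", np.round(beta, 3))
print("true final coefficients:    ", np.round(beta_true[-1], 3))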