23 results for Mean-variance analysis

in the Aston University Research Archive


Relevance: 100.00%

Abstract:

This article investigates the performance of Full-Scale Optimisation, a recently presented model used for financial investment advice. The investor's preferences regarding expected risk and return are entered into the model, and a recommended portfolio is produced. The model is theoretically more accurate than the mainstream investment advice model, Mean-Variance Optimisation, because it makes fewer assumptions. Our investigation of the model's performance is broader with respect to investor preferences, and more general with respect to investment type, than previous studies. It shows that Full-Scale Optimisation is more widely applicable than previously known.
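The core of the approach can be sketched as a direct utility search: expected utility is evaluated on every candidate portfolio rather than on a mean-variance approximation. A minimal sketch assuming two assets, hypothetical returns, and log utility (all assumptions for illustration, not details from the article):

```python
import math

# Hypothetical period returns for two assets (illustrative, not from the article)
returns_a = [0.02, -0.01, 0.03, 0.015, -0.02]
returns_b = [0.01, 0.005, -0.005, 0.02, 0.0]

def expected_utility(w):
    # Average log utility of end-of-period wealth over the empirical sample;
    # full-scale optimisation accepts any utility function the investor specifies
    total = 0.0
    for ra, rb in zip(returns_a, returns_b):
        total += math.log(1.0 + w * ra + (1.0 - w) * rb)
    return total / len(returns_a)

# Full-scale search: evaluate every candidate weight on a grid and keep the best
grid = [i / 100 for i in range(101)]
best_w = max(grid, key=expected_utility)
```

Because the utility function is applied to the full empirical return distribution, no normality or quadratic-utility assumption is needed, which is the model's claimed advantage over mean-variance optimisation.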

Relevance: 90.00%

Abstract:

As a central integrator of basal ganglia function, the external segment of the globus pallidus (GP) plays a critical role in the control of voluntary movement. Driven by intrinsic mechanisms and excitatory glutamatergic inputs from the subthalamic nucleus, GP neurons receive GABAergic inhibitory input from the striatum (Str-GP) and from local collaterals of neighbouring pallidal neurons (GP-GP). Here we provide electrophysiological evidence for functional differences between these two inhibitory inputs. The basic synaptic characteristics of GP-GP and Str-GP GABAergic synapses were studied using whole-cell recordings with paired-pulse and train stimulation protocols and variance-mean (VM) analysis. We found: (i) IPSC kinetics are consistent with local collaterals innervating the soma and proximal dendrites of GP neurons, whereas striatal inputs innervate more distal regions. (ii) Compared with GP-GP synapses, Str-GP synapses have a greater paired-pulse ratio, indicative of a lower probability of release; this was confirmed using VM analysis. (iii) In response to 20 and 50 Hz train stimulation, GP-GP synapses are weakly facilitatory in 1 mM external calcium and depressant in 2.4 mM calcium, in contrast to Str-GP synapses, which display facilitation under both conditions. This is the first quantitative study comparing the properties of GP-GP and Str-GP synapses. The results are consistent with the differential location of these inhibitory synapses and with subtle differences in their release probability, which underpin stable GP-GP responses and robust short-term facilitation of Str-GP responses. These fundamental differences may provide the physiological basis for functional specialization.
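The variance-mean analysis referred to above exploits the parabolic relation V = qI - I²/N between response variance and mean amplitude to recover quantal size q and the number of release sites N. A minimal sketch with hypothetical amplitudes (the data and recovered parameters below are illustrative, not the study's estimates):

```python
# Variance-mean (VM) analysis sketch: fit V = q*I - I^2/N by least squares.
# Illustrative mean IPSC amplitudes (pA) and their variances, not study data.
means = [10.0, 20.0, 30.0, 40.0]
variances = [18.0, 32.0, 42.0, 48.0]

# Fit V = a*I + b*I^2 (a = q, b = -1/N) via the 2x2 normal equations
s_ii = sum(i * i for i in means)
s_i3 = sum(i ** 3 for i in means)
s_i4 = sum(i ** 4 for i in means)
s_vi = sum(v * i for v, i in zip(variances, means))
s_vi2 = sum(v * i * i for v, i in zip(variances, means))
det = s_ii * s_i4 - s_i3 * s_i3
a = (s_vi * s_i4 - s_vi2 * s_i3) / det
b = (s_ii * s_vi2 - s_i3 * s_vi) / det

q = a              # estimated quantal size (pA)
n_sites = -1.0 / b # estimated number of release sites
```

A flatter initial slope of the parabola corresponds to a lower release probability, which is how VM analysis confirms the paired-pulse result described above.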

Relevance: 80.00%

Abstract:

This paper reports on an aspect of the implementation of a sophisticated system of Casemix Budgeting within a large public hospital in New Zealand. The paper examines the role of accounting inscription in supporting a system of "remote" management control effected through the Finance function at the hospital, and provides a detailed description and analysis of part of the casemix technology in use at the research site. The implementation of clinical budgeting through the Transition casemix system is examined by describing an aspect of the casemix system in detail, and the design and use of management reporting is described. Reporting to different levels of management and for differing parts of the organisation is discussed, with particular emphasis on the adoption of traditional analysis of costs using standard costing and variance analysis techniques.
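The standard costing and variance analysis techniques mentioned above split a total cost variance into price and usage components. A minimal sketch with hypothetical figures (none of the numbers come from the paper):

```python
# Standard costing variance analysis sketch (illustrative figures only).
standard_price = 50.0   # budgeted cost per unit of resource
standard_qty = 120.0    # units of resource budgeted for the actual caseload
actual_price = 55.0
actual_qty = 110.0

price_variance = (actual_price - standard_price) * actual_qty   # adverse if positive
usage_variance = (actual_qty - standard_qty) * standard_price   # favourable if negative
total_variance = actual_price * actual_qty - standard_price * standard_qty
# price and usage variances reconcile exactly to the total cost variance
```

It is this decomposition, reported at different organisational levels, that makes "remote" control through the Finance function possible: a manager sees whether an overspend came from dearer inputs or from using more of them.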

Relevance: 80.00%

Abstract:

Flow control in computer communication systems is generally a multi-layered structure, consisting of several mechanisms operating independently at different levels. Evaluation of the performance of networks in which different flow control mechanisms act simultaneously is an important area of research, and is examined in depth in this thesis. This thesis presents the modelling of a finite-resource computer communication network equipped with three levels of flow control, based on closed queueing network theory. The flow control mechanisms considered are: end-to-end control of virtual circuits, network access control of external messages at the entry nodes, and hop-level control between nodes. The model is solved by a heuristic technique based on an equivalent reduced network and on heuristic extensions to the mean value analysis algorithm. The method has significant computational advantages and overcomes the limitations of the exact methods: it can be used to solve large network models with finite buffers and many virtual circuits. The model and its heuristic solution are validated by simulation. The interaction between the three levels of flow control is investigated. A queueing model is developed for the admission delay on virtual circuits with end-to-end control, in which messages arrive from independent Poisson sources, and the selection of the optimum window limit is considered. Several advanced network access schemes are postulated to improve the performance of the network as well as that of selected traffic streams, and numerical results are presented. Finally, a model for the dynamic control of input traffic is developed; based on Markov decision theory, an optimal control policy is formulated. Numerical results are given, and throughput-delay performance is shown to be better with dynamic control than with static control.
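The exact mean value analysis algorithm that the thesis extends heuristically can be sketched for a single-class closed network. The service demands and customer population below are hypothetical (the population plays the role of, say, a window limit):

```python
# Exact mean value analysis (MVA) for a closed queueing network.
# Illustrative per-visit service demands (seconds) at three queues.
demands = [0.2, 0.4, 0.1]
N = 5   # circulating customers, e.g. a window size (hypothetical)

n_at_queue = [0.0] * len(demands)   # mean queue lengths, start empty
for n in range(1, N + 1):
    # Residence time: service demand times (1 + queue length seen on arrival),
    # using the arrival theorem for a network with n - 1 customers
    resid = [d * (1.0 + q) for d, q in zip(demands, n_at_queue)]
    cycle = sum(resid)                # total cycle time
    throughput = n / cycle            # Little's law over the whole network
    n_at_queue = [throughput * r for r in resid]  # Little's law per queue
```

The exact recursion is linear in the population, which is why heuristic reductions matter once many virtual circuits (one closed chain each) are modelled simultaneously.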

Relevance: 80.00%

Abstract:

This paper explores a new method of analysing fatigue within the muscles predominantly used during microsurgery. The electromyographic (EMG) data captured from these muscles are analysed for any defining patterns relating to muscle fatigue. The analysis consists of dynamically embedding the EMG signal from a single muscle channel into an embedded matrix. Muscle fatigue is determined by an entropy measure defined on the singular values of the dynamically embedded (DE) matrix. The paper compares this new method with the traditional method of tracking mean frequency shifts in the EMG signal's power spectral density. Linear regressions are fitted to the results from both methods, and the coefficients of variation of both their slope and point of intercept are determined. It is shown that the complexity method is slightly more robust, in that the coefficient of variation for the DE method has lower variability than that of the conventional mean frequency analysis.
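The singular-value entropy idea can be sketched for one channel. The synthetic signal, embedding dimension and delay below are assumptions for illustration, not the paper's choices; with an embedding dimension of 2, the singular values follow from the eigenvalues of the 2x2 Gram matrix:

```python
import math

# Synthetic stand-in for one EMG channel (illustrative only)
signal = [math.sin(0.3 * t) + 0.1 * math.sin(2.1 * t) for t in range(200)]
m, tau = 2, 5   # embedding dimension and delay (hypothetical choices)

# Dynamically embedded (DE) matrix rows: [x(t), x(t + tau)]
rows = [(signal[t], signal[t + tau]) for t in range(len(signal) - tau)]

# Squared singular values = eigenvalues of the 2x2 Gram matrix A^T A
g11 = sum(a * a for a, _ in rows)
g22 = sum(b * b for _, b in rows)
g12 = sum(a * b for a, b in rows)
tr, det = g11 + g22, g11 * g22 - g12 * g12
disc = math.sqrt(tr * tr - 4.0 * det)
eigs = [(tr + disc) / 2.0, (tr - disc) / 2.0]

# Normalised singular-value entropy: lower entropy means the signal is
# concentrated in fewer components, used as a fatigue indicator
p = [e / tr for e in eigs]
entropy = -sum(pi * math.log(pi) for pi in p if pi > 0.0)
```

A fatigue study would track this entropy over successive EMG windows and fit a regression to its trend, in place of the mean-frequency trend of the conventional method.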

Relevance: 80.00%

Abstract:

This thesis presents research within empirical financial economics, focusing on liquidity and on portfolio optimisation through full-scale optimisation (FSO) in the stock market. The discussion of liquidity concentrates on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors. Furthermore, a framework for treating the two topics in combination is provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed to apply TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity covariances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns compared with static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when inventory cost is used as the liquidity measure and moving-window PCA as the systematic liquidity derivation technique. Systematic factors derived in this setting also have a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution.
The studies show that relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.
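The differential evolution search mentioned above can be sketched in a few lines. Everything below is illustrative: the returns, the log-utility choice, and the DE settings (population 20, F = 0.8, CR = 0.9) are assumptions, not values from the thesis.

```python
import math
import random

random.seed(1)

# Illustrative period returns for three assets (hypothetical, not thesis data)
rets = [
    (0.02, 0.01, 0.005), (-0.01, 0.02, 0.0),
    (0.03, -0.005, 0.01), (0.0, 0.015, -0.01), (0.01, 0.0, 0.02),
]

def neg_utility(w1, w2):
    # Negative average log utility of end-of-period wealth (DE minimises this);
    # the third weight is whatever the first two leave over
    w3 = 1.0 - w1 - w2
    total = 0.0
    for r1, r2, r3 in rets:
        wealth = 1.0 + w1 * r1 + w2 * r2 + w3 * r3
        if wealth <= 0.0:
            return float("inf")
        total += math.log(wealth)
    return -total / len(rets)

def clip(x):
    return min(1.0, max(0.0, x))

# Standard DE/rand/1/bin loop over a population of candidate weight pairs
pop = [(random.random(), random.random()) for _ in range(20)]
F, CR = 0.8, 0.9
for _ in range(100):
    for i, target in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = tuple(clip(a[k] + F * (b[k] - c[k])) for k in range(2))
        trial = tuple(
            mutant[k] if random.random() < CR else target[k] for k in range(2)
        )
        if neg_utility(*trial) <= neg_utility(*target):
            pop[i] = trial

best = min(pop, key=lambda w: neg_utility(*w))
```

Each utility evaluation touches the full return sample, so the computational saving comes from evaluating far fewer candidate portfolios than an exhaustive grid would require.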

Relevance: 40.00%

Abstract:

This article is aimed primarily at eye care practitioners who are undertaking advanced clinical research, and who wish to apply analysis of variance (ANOVA) to their data. ANOVA is a data analysis method of great utility and flexibility. This article describes why and how ANOVA was developed, the basic logic which underlies the method and the assumptions that the method makes for it to be validly applied to data from clinical experiments in optometry. The application of the method to the analysis of a simple data set is then described. In addition, the methods available for making planned comparisons between treatment means and for making post hoc tests are evaluated. The problem of determining the number of replicates or patients required in a given experimental situation is also discussed. Copyright (C) 2000 The College of Optometrists.
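As an illustration of the partitioning that underlies ANOVA, a minimal one-way sketch on invented data (three treatment groups with four replicates each; not data from the article):

```python
# One-way ANOVA sketch: split total variation into between-group and
# within-group sums of squares. Data are invented for illustration.
groups = [
    [12.0, 14.0, 11.0, 13.0],
    [15.0, 17.0, 16.0, 14.0],
    [10.0, 9.0, 11.0, 12.0],
]
n_total = sum(len(g) for g in groups)
grand_mean = sum(sum(g) for g in groups) / n_total

ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

df_between = len(groups) - 1          # k - 1
df_within = n_total - len(groups)     # N - k
f_stat = (ss_between / df_between) / (ss_within / df_within)
```

The F ratio is then compared with the F distribution on (k - 1, N - k) degrees of freedom; a large value indicates that treatment means differ by more than within-group noise can explain.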

Relevance: 40.00%

Abstract:

Analysis of variance (ANOVA) is the most efficient method available for the analysis of experimental data. Analysis of variance is a method of considerable complexity and subtlety, with many different variations, each of which applies in a particular experimental context. Hence, it is possible to apply the wrong type of ANOVA to data and, therefore, to draw an erroneous conclusion from an experiment. This article reviews the types of ANOVA most likely to arise in clinical experiments in optometry including the one-way ANOVA ('fixed' and 'random effect' models), two-way ANOVA in randomised blocks, three-way ANOVA, and factorial experimental designs (including the varieties known as 'split-plot' and 'repeated measures'). For each ANOVA, the appropriate experimental design is described, a statistical model is formulated, and the advantages and limitations of each type of design discussed. In addition, the problems of non-conformity to the statistical model and determination of the number of replications are considered. © 2002 The College of Optometrists.

Relevance: 40.00%

Abstract:

To carry out an analysis of variance, several assumptions are made about the nature of the experimental data which have to be at least approximately true for the tests to be valid. One of the most important of these assumptions is that a measured quantity must be a parametric variable, i.e., a member of a normally distributed population. If the data are not normally distributed, one approach is to transform the data to a different scale so that the new variable is more likely to be normally distributed. An alternative, however, is to use a non-parametric analysis of variance. A limited number of such tests are available, but two useful tests are described in this Statnote, viz., the Kruskal-Wallis test and Friedman's analysis of variance.
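The Kruskal-Wallis test replaces raw values with ranks before computing a between-group statistic, so no normality assumption is needed. A minimal sketch on invented, tie-free data:

```python
# Kruskal-Wallis sketch: a non-parametric one-way analysis of variance on
# ranks. Data are invented for illustration and contain no ties.
groups = [
    [6.4, 6.8, 7.2, 8.3],
    [2.5, 3.7, 4.9, 5.4],
    [1.3, 4.1, 4.5, 5.2],
]
pooled = sorted(x for g in groups for x in g)
rank = {x: i + 1 for i, x in enumerate(pooled)}   # 1-based ranks, no ties here
n = len(pooled)

rank_sums = [sum(rank[x] for x in g) for g in groups]
h_stat = 12.0 / (n * (n + 1)) * sum(
    rs ** 2 / len(g) for rs, g in zip(rank_sums, groups)
) - 3 * (n + 1)
# h_stat is compared with a chi-squared distribution on (k - 1) DF
```

With ties present, mid-ranks and a correction factor would be needed; the sketch omits both for clarity.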

Relevance: 40.00%

Abstract:

The techniques and insights from two distinct areas of financial economic modelling are combined to provide evidence of the influence of firm size on the volatility of stock portfolio returns. Portfolio returns are characterized by positive serial correlation induced by the varying levels of non-synchronous trading among the component stocks. This serial correlation is greatest for portfolios of small firms. The conditional volatility of stock returns has been shown to be well represented by the GARCH family of statistical processes. Using a GARCH model of the variance of capitalization-based portfolio returns, conditioned on the autocorrelation structure in the conditional mean, striking differences related to firm size are uncovered.
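The GARCH recursion that underlies such conditional-volatility models can be sketched directly for the GARCH(1,1) case. The parameters and returns below are illustrative, not the paper's estimates:

```python
# GARCH(1,1) conditional-variance recursion sketch (illustrative values only)
omega, alpha, beta = 0.00001, 0.08, 0.90      # hypothetical GARCH parameters
returns = [0.01, -0.02, 0.015, -0.005, 0.03]  # hypothetical daily returns

var_t = omega / (1.0 - alpha - beta)  # start at the unconditional variance
variances = [var_t]
for r in returns:
    # Today's variance responds to yesterday's squared shock and variance
    var_t = omega + alpha * r * r + beta * var_t
    variances.append(var_t)
```

In the paper's setting the conditional mean also carries the autocorrelation induced by non-synchronous trading, which is strongest for small-firm portfolios; the variance recursion above is then fitted conditional on that mean structure.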

Relevance: 40.00%

Abstract:

A method is described which enables the spatial pattern of discrete objects in histological sections of brain tissue to be determined. The method can be applied to cell bodies, sections of blood vessels or the characteristic lesions which develop in the brain of patients with neurodegenerative disorders. The density of the histological feature under study is measured in a series of contiguous sample fields arranged in a grid or transect. Data from adjacent sample fields are added together to provide density data for larger field sizes. A plot of the variance/mean ratio (V/M) of the data versus field size reveals whether the objects are distributed randomly, uniformly or in clusters. If the objects are clustered, the analysis determines whether the clusters are randomly or regularly distributed and the mean size of the clusters. In addition, if two different histological features are clustered, the analysis can determine whether their clusters are in phase, out of phase or unrelated to each other. To illustrate the method, the spatial patterns of senile plaques and neurofibrillary tangles were studied in histological sections of brain tissue from patients with Alzheimer's disease.
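The field-pooling procedure described above can be sketched directly: counts in adjacent fields are summed to form larger fields, and the V/M ratio is recomputed at each scale. The counts below are invented; V/M near 1 suggests a random pattern, above 1 clustering, and below 1 regularity.

```python
import statistics

# Hypothetical counts of a histological feature in 16 contiguous sample fields
counts = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]

def vm_ratio(data):
    # Variance/mean ratio of counts across fields
    return statistics.pvariance(data) / statistics.fmean(data)

ratios = {}
size, level = 1, list(counts)
while len(level) >= 2:
    ratios[size] = vm_ratio(level)
    # Pool adjacent fields to double the field size
    level = [level[i] + level[i + 1] for i in range(0, len(level) - 1, 2)]
    size *= 2
```

Plotting `ratios` against field size gives the V/M curve: a peak at some field size indicates clusters of roughly that size, and the behaviour at larger sizes shows whether the clusters themselves are randomly or regularly distributed.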

Relevance: 40.00%

Abstract:

The spatial patterns of discrete beta-amyloid (Abeta) deposits in brain tissue from patients with Alzheimer disease (AD) were studied using a statistical method based on linear regression, the results being compared with the more conventional variance/mean (V/M) method. Both methods suggested that Abeta deposits occurred in clusters (400 to <12,800 µm in diameter) in all but 1 of the 42 tissues examined. In many tissues, a regular periodicity of the Abeta deposit clusters parallel to the tissue boundary was observed. In 23 of 42 (55%) tissues, the two methods revealed essentially the same spatial patterns of Abeta deposits; in 15 of 42 (36%), the regression method indicated the presence of clusters at a scale not revealed by the V/M method; and in 4 of 42 (9%), there was no agreement between the two methods. Perceived advantages of the regression method are that there is a greater probability of detecting clustering at multiple scales, the dimension of larger Abeta clusters can be estimated more accurately, and the spacing between the clusters may be estimated. However, both methods may be useful, with the regression method providing greater resolution and the V/M method providing greater simplicity and ease of interpretation. Estimates of the distance between regularly spaced Abeta clusters were in the range 2,200-11,800 µm, depending on tissue and cluster size. The regular periodicity of Abeta deposit clusters in many tissues would be consistent with their development in relation to clusters of neurons that give rise to specific neuronal projections.

Relevance: 40.00%

Abstract:

The two-way design has been variously described as a matched-sample F-test, a simple within-subjects ANOVA, a one-way within-groups ANOVA, a simple correlated-groups ANOVA, and a one-factor repeated measures design! This confusion of terminology is likely to lead to problems in correctly identifying this analysis within commercially available software. The essential feature of the design is that each treatment is allocated by randomisation to one experimental unit within each group or block. The block may be a plot of land, a single occasion on which the experiment was performed, or a human subject. The 'blocking' is designed to remove an aspect of the error variation and so increase the 'power' of the experiment. If there is no significant source of variation associated with the 'blocking', however, the two-way design is at a disadvantage: the DF of the error term is reduced compared with a fully randomised design, thus reducing the 'power' of the analysis.
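The DF trade-off described above is simple bookkeeping: blocking spends (blocks - 1) degrees of freedom that would otherwise belong to the error term. A sketch with illustrative treatment and block counts:

```python
# DF bookkeeping for a randomised-blocks two-way design versus a fully
# randomised one-way design (illustrative treatment and block counts).
treatments, blocks = 4, 6
n_obs = treatments * blocks   # one observation per treatment per block

df_total = n_obs - 1
df_treatment = treatments - 1
df_blocks = blocks - 1
df_error_blocked = df_total - df_treatment - df_blocks   # two-way error DF
df_error_one_way = df_total - df_treatment               # fully randomised error DF
# Blocking costs (blocks - 1) error DF; it only pays off if the blocks
# soak up a real source of variation.
```

Here the blocked design has 15 error DF against 20 for the fully randomised one, so the blocks must remove enough error variation to compensate for the lost DF.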

Relevance: 40.00%

Abstract:

There is an alternative model of the one-way ANOVA, called the 'random effects' model or 'nested' design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in the planning of appropriate sampling strategies.
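For an equal-replication one-way random effects design, the components of variance follow from the expected mean squares: the within-group component equals MS_within, and the between-group component is (MS_between - MS_within) / n. A sketch on invented data (several measurements per randomly chosen source):

```python
# Components-of-variance sketch for a one-way random-effects (nested) model.
# Data are invented: four randomly chosen sources, three measurements each.
groups = [
    [10.1, 10.3, 9.8],
    [11.2, 11.0, 11.5],
    [9.5, 9.9, 9.7],
    [10.8, 10.6, 11.0],
]
k = len(groups)
n = len(groups[0])   # equal replication per group
grand = sum(sum(g) for g in groups) / (k * n)

ms_between = n * sum((sum(g) / n - grand) ** 2 for g in groups) / (k - 1)
ms_within = sum(sum((x - sum(g) / n) ** 2 for x in g) for g in groups) / (k * (n - 1))

var_within = ms_within                                 # measurement-to-measurement
var_between = max(0.0, (ms_between - ms_within) / n)   # source-to-source component
```

Comparing `var_between` with `var_within` shows which source of variation dominates, which is exactly the information needed to plan how to allocate sampling effort.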

Relevance: 40.00%

Abstract:

Experiments combining different groups or factors are a powerful method of investigation in applied microbiology. ANOVA enables not only the effects of individual factors to be estimated but also their interactions, information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the number of replications required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA are a more important indicator of the 'power' of the experiment than the number of replicates alone. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for each error term of the ANOVA. Finally, in a factorial experiment, it is important to define the design of the experiment in detail, because this determines the appropriate type of ANOVA. We will discuss some of the common variations of factorial ANOVA in future Statnotes. If there is doubt about which ANOVA to use, the researcher should seek advice from a statistician with experience of research in applied microbiology.
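The 15 DF rule of thumb can be turned into a small replication calculation for a fully randomised two-factor factorial with a single error term. The factor sizes below are illustrative:

```python
# Replication sketch for a two-factor factorial: smallest number of replicates
# giving at least 15 DF for the error term (illustrative factor sizes).
levels_a, levels_b = 3, 2
target_error_df = 15

def error_df(replicates):
    # Total DF minus DF for main effects and the interaction
    n_obs = levels_a * levels_b * replicates
    df_model = levels_a * levels_b - 1   # (a-1) + (b-1) + (a-1)(b-1)
    return (n_obs - 1) - df_model

replicates = 1
while error_df(replicates) < target_error_df:
    replicates += 1
```

For a 3 x 2 factorial this gives four replicates (18 error DF), illustrating the point above that treatment combinations, not the replicate count alone, determine the power of the experiment.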