960 results for stock mixture analysis
Abstract:
Mixture proportioning is routinely a matter of reusing a recipe from a previously produced concrete rather than adjusting the proportions to the needs of the mixture and the locally available materials. As budgets tighten and sustainability metrics receive increasing attention, interest is growing in mixtures that use materials more efficiently without compromising engineering performance. A performance-based mixture proportioning method is therefore needed to provide the desired concrete properties for a given project specification. Such a method should be user-friendly, easy to apply in practice, and flexible enough to allow a wide range of material selection. The objective of this study is to further develop an innovative performance-based mixture proportioning method by analyzing the relationships between selected mix characteristics and their corresponding effects on tested properties. The proposed method provides step-by-step instructions to guide the selection of the required aggregate and paste systems based on the performance requirements. Although the guidance in this report is aimed primarily at concrete pavements, the same approach can be applied to other concrete applications as well.
Abstract:
The stability of air bubbles in fresh concrete can have a profound influence on the potential durability of the system, because excessive losses during placement and consolidation can compromise the ability of the mixture to resist freezing and thawing. The stability of air-void systems developed by some air-entraining admixtures (AEAs) can be affected by the presence of some polycarboxylate-based water-reducing admixtures (WRAs). The foam drainage test provides a means of measuring the potential stability of air bubbles in a paste. A barrier to acceptance of the test was that its correlation with field performance had received little investigation. The work reported here was a limited exercise: first, to observe the stability of a range of currently available AEA/WRA combinations in the foam drainage test; then, to take the best and the worst and observe their stabilities in concrete mixtures in the lab. Based on the data collected, the foam drainage test appears to identify stable combinations of AEA and WRA.
Abstract:
Concrete durability may be considered as the ability to maintain serviceability over the design life without significant deterioration, and it is generally a direct function of the mixture permeability. Therefore, reducing permeability will improve the potential durability of a given mixture and, in turn, the serviceability and longevity of the structure. Given the importance of this property, engineers often look for methods that can decrease permeability. One approach is to add chemical compounds known as integral waterproofing admixtures or permeability-reducing admixtures, which help fill and block capillary pores in the paste. Currently, there are no standard approaches in the US to evaluate the effectiveness of permeability-reducing admixtures or to compare different products. A review of manufacturers' data sheets shows that a wide range of test methods have been used, and rarely are the same tests used on more than one product. This study investigated the fresh and hardened properties of mixtures containing commercially available hydrophilic and hydrophobic types of permeability-reducing admixtures. The aim was to develop a standard test protocol that would help owners, engineers, and specifiers compare different products and evaluate their effects on concrete mixtures that may be exposed to hydrostatic or non-hydrostatic pressure. In this experimental program, 11 concrete mixtures were prepared with a fixed water-to-cement ratio and cement content. One plain mixture served as a reference, 5 mixtures used the recommended dosage of the different permeability-reducing admixtures, and 5 mixtures used double the recommended dosage. Slump, air content, setting time, compressive and flexural strength, shrinkage, and durability-indicating tests, including electrical resistivity, rapid chloride penetration, air permeability, permeable voids, and sorptivity, were conducted at various ages. The data are presented and recommendations for a testing protocol are provided.
Abstract:
In this study, the population structure of the white grunt (Haemulon plumieri) from the northern coast of the Yucatan Peninsula was determined through an otolith shape analysis based on samples collected at three locations: Celestún (20°49′ N, 90°25′ W), Dzilam (21°23′ N, 88°54′ W), and Cancún (21°21′ N, 86°52′ W). The otolith outline was described using elliptic Fourier descriptors, which indicated that the H. plumieri population on the northern coast of the Yucatan Peninsula is composed of three geographically delimited units (Celestún, Dzilam, and Cancún). Significant differences were observed in mean otolith shape among all samples (PERMANOVA; F2,99 = 11.20, P = 0.0002), and subsequent pairwise comparisons showed that all samples were significantly different from each other. The samples therefore do not belong to a single white grunt population; the results suggest a structured population along the northern coast of the Yucatan Peninsula.
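A minimal sketch of the elliptic Fourier outline descriptors this kind of otolith study relies on (the Kuhl-Giardina formulation; the harmonic order and the unit-circle check below are illustrative, not the study's settings):

```python
import numpy as np

def elliptic_fourier_descriptors(contour, order=10):
    """Kuhl-Giardina elliptic Fourier coefficients of a closed 2-D outline.

    contour: (N, 2) array of x, y points tracing the shape once."""
    d = np.diff(np.vstack([contour, contour[:1]]), axis=0)  # close the loop
    dt = np.hypot(d[:, 0], d[:, 1])                         # segment lengths
    t = np.concatenate([[0.0], np.cumsum(dt)])
    T = t[-1]
    phi = 2.0 * np.pi * t / T
    coeffs = np.zeros((order, 4))                           # rows: a_n, b_n, c_n, d_n
    for n in range(1, order + 1):
        k = T / (2.0 * n**2 * np.pi**2)
        dcos = np.cos(n * phi[1:]) - np.cos(n * phi[:-1])
        dsin = np.sin(n * phi[1:]) - np.sin(n * phi[:-1])
        coeffs[n - 1] = k * np.array([
            np.sum(d[:, 0] / dt * dcos),
            np.sum(d[:, 0] / dt * dsin),
            np.sum(d[:, 1] / dt * dcos),
            np.sum(d[:, 1] / dt * dsin),
        ])
    return coeffs

# sanity check on a unit circle: the first harmonic carries all of the shape
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
coeffs = elliptic_fourier_descriptors(circle, order=5)
```

Shape comparisons such as the PERMANOVA above are then run on the matrix of (usually size-normalised) coefficients, one row per otolith.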
Abstract:
In this paper, we obtain sharp asymptotic formulas with error estimates for the Mellin convolution of functions defined on (0, ∞), and use these formulas to characterize the asymptotic behavior of marginal distribution densities of stock price processes in mixed stochastic models. Special examples of mixed models are jump-diffusion models and stochastic volatility models with jumps. We apply our general results to the Heston model with double-exponential jumps and make a detailed analysis of the asymptotic behavior of the stock price density, the call option pricing function, and the implied volatility in this model. We also obtain similar results for the Heston model with jumps distributed according to the NIG law.
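For orientation, the Heston model with double-exponential jumps referred to above is usually written as follows (this is the standard textbook formulation with conventional parameter names, not necessarily the paper's notation):

```latex
\begin{aligned}
\frac{dS_t}{S_{t^-}} &= \mu\,dt + \sqrt{V_t}\,dW_t
  + d\!\left(\sum_{k=1}^{N_t}\bigl(e^{J_k}-1\bigr)\right),\\
dV_t &= \kappa(\theta - V_t)\,dt + \sigma\sqrt{V_t}\,dB_t,
  \qquad d\langle W, B\rangle_t = \rho\,dt,
\end{aligned}
```

where $N_t$ is a Poisson process with intensity $\lambda$ and the jump sizes $J_k$ are i.i.d. double-exponential (Kou) variables with density $\nu(x) = p\,\eta_1 e^{-\eta_1 x}\mathbf{1}_{x\ge 0} + (1-p)\,\eta_2 e^{\eta_2 x}\mathbf{1}_{x<0}$. The marginal density of $S_t$ is then a Mellin convolution of the continuous and jump parts, which is what the asymptotic formulas above are applied to.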
Abstract:
A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of the crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all three solvents (water, ethanol, and acetone) were better than the binary mixtures at generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the various parameters in this analytical method, namely the time for the chromophoric reaction to stabilize, the wavelength of the absorption maximum to be monitored, the reference standard, and the concentration of sodium carbonate, were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w/v, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
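The solvent blends such a mixture design evaluates can be generated as a simplex-centroid design: every non-empty subset of the solvents, blended in equal proportions. A small sketch (the three-solvent case matching the abstract; the run order is illustrative):

```python
from itertools import combinations

def simplex_centroid(n_components):
    """Simplex-centroid mixture design: every non-empty subset of the
    components, with the chosen components in equal proportions."""
    design = []
    for k in range(1, n_components + 1):
        for subset in combinations(range(n_components), k):
            point = [0.0] * n_components
            for i in subset:
                point[i] = 1.0 / k
            design.append(point)
    return design

# columns: water, ethanol, acetone
points = simplex_centroid(3)
# 7 runs: 3 pure solvents, 3 binary 50/50 blends, 1 ternary 1/3-each blend
```

The response-surface model (e.g. a Scheffé polynomial) is then fitted to the yield, polyphenol, and antioxidant responses measured at these 7 runs.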
Abstract:
This thesis examines the application of data envelopment analysis (DEA) as an equity portfolio selection criterion in the Finnish stock market during the period 2001-2011. A sample covering the majority of the publicly traded firms in the Helsinki Stock Exchange is examined. Data envelopment analysis is used to determine the efficiency of firms from a set of input and output financial parameters consisting of asset utilization, liquidity, capital structure, growth, valuation, and profitability measures. Because the input and output parameters are industry-specific, the firms are divided into artificial industry categories. Comparable portfolios are formed within each industry category according to the efficiency scores given by the DEA, and the performance of the portfolios is evaluated with several measures. The empirical evidence of this thesis suggests that, with certain limitations, data envelopment analysis can successfully be used as a portfolio selection criterion in the Finnish stock market when the portfolios are rebalanced annually according to the DEA efficiency scores. When the portfolios were rebalanced every two or three years, however, the results are mixed and inconclusive.
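A minimal sketch of the DEA efficiency score underlying such a ranking, using the input-oriented CCR multiplier model solved as a linear program (the two-firm data set is purely illustrative; the thesis's actual inputs and outputs are the financial parameters listed above):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA efficiency scores.

    X: (n, s) array of inputs, Y: (n, m) array of outputs, one row per firm.
    For each firm o: maximise u.y_o subject to v.x_o = 1 and
    u.y_j - v.x_j <= 0 for every firm j, with u, v >= 0."""
    n, s = X.shape
    m = Y.shape[1]
    A_ub = np.hstack([Y, -X])       # u.y_j - v.x_j <= 0 for every firm j
    b_ub = np.zeros(n)
    scores = []
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(s)])             # maximise u.y_o
        A_eq = np.concatenate([np.zeros(m), X[o]])[None, :]  # v.x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (m + s))
        scores.append(-res.fun)
    return np.array(scores)

# two firms, one input, one output: firm B needs twice the input for the same output
X = np.array([[1.0], [2.0]])
Y = np.array([[1.0], [1.0]])
scores = dea_ccr_efficiency(X, Y)   # firm A efficient (1.0), firm B not (0.5)
```

Portfolios are then formed by sorting firms within each industry category on these scores.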
Abstract:
The aim of this research is to examine pricing anomalies in the U.S. market from 1986 to 2011. The sample of stocks is divided into decile portfolios based on seven individual valuation ratios (E/P, B/P, S/P, EBIT/EV, EBITDA/EV, D/P, and CE/P) and price momentum, to investigate the efficiency of individual valuation ratios and their combinations as portfolio formation criteria. This is the first time in the financial literature that CE/P has been employed as a constituent of a composite value measure. The combinations are based on median-scaled composite value measures and the TOPSIS method. During the sample period, value portfolios significantly outperform both the market portfolio and comparable glamour portfolios. The results show the highest return for the value portfolio based on the combination of the S/P and CE/P ratios. The outcome of this research will increase understanding of the suitability of different methodologies for portfolio selection, and will help managers take advantage of the results of different methodologies in order to earn returns above the market.
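A compact sketch of the TOPSIS method used to combine the valuation ratios (standard formulation; the three-stock data and equal weights are illustrative, not the study's):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """TOPSIS scores in [0, 1]; higher means closer to the ideal alternative.

    matrix: (alternatives, criteria); benefit[j] is True when larger is better."""
    M = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalise each criterion
    M = M * weights
    ideal = np.where(benefit, M.max(axis=0), M.min(axis=0))
    anti = np.where(benefit, M.min(axis=0), M.max(axis=0))
    d_pos = np.linalg.norm(M - ideal, axis=1)     # distance to ideal point
    d_neg = np.linalg.norm(M - anti, axis=1)      # distance to anti-ideal point
    return d_neg / (d_pos + d_neg)

# three stocks scored on two hypothetical value ratios (both larger-is-better)
ratios = np.array([[0.9, 0.30],
                   [0.5, 0.10],
                   [0.7, 0.25]])
scores = topsis(ratios, weights=np.array([0.5, 0.5]),
                benefit=np.array([True, True]))
```

Stocks are then sorted on the TOPSIS score to form the value and glamour deciles.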
Abstract:
This thesis uses an automatic pattern-recognition algorithm and common two-moving-average crossover rules to explain the sell-buy imbalance of retail investors trading on the Stuttgart stock exchange, and thereby to answer the question: do retail investors base their trading decisions on technical analysis? Based on prior research on investor behaviour and on the profitability of technical analysis, the working assumption was that retail investors do use technical-analysis methods. The empirical study, based on data for the DAX30 companies over 2009-2013, did not yield a sufficiently clear answer to the research question. Weak evidence nevertheless suggests that retail investors adjust their trading behaviour in the direction indicated by certain chart patterns and crossover rules.
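A minimal sketch of the moving-average crossover rule the thesis tests (window lengths are illustrative; the thesis's actual parameters are not stated in the abstract):

```python
import numpy as np

def ma_crossover_signal(prices, short=5, long=20):
    """+1 while the short moving average is above the long one, -1 otherwise;
    NaN until the long window has enough history."""
    prices = np.asarray(prices, dtype=float)

    def sma(w):
        out = np.full(len(prices), np.nan)
        c = np.cumsum(np.insert(prices, 0, 0.0))
        out[w - 1:] = (c[w:] - c[:-w]) / w
        return out

    s, l = sma(short), sma(long)
    sig = np.where(s > l, 1.0, -1.0)
    sig[np.isnan(l)] = np.nan
    return sig

# in a steady uptrend, the short MA stays above the long MA after the warm-up
trend = np.arange(1.0, 41.0)
signal = ma_crossover_signal(trend)
```

The study then compares retail buy/sell imbalances around the days on which this signal flips.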
Abstract:
Most of the economic literature has presented its analysis under the assumption of a homogeneous capital stock. However, capital composition differs across countries. What patterns of capital composition are associated with the world's economies? We carry out an exploratory statistical analysis of compositional data transformed by Aitchison log-ratio transformations, using tools for visualizing and measuring statistical estimators of association among the components. The goal is to detect distinctive patterns in the composition. Initial findings include: 1. Sectoral components behaved in a correlated way, with building industries on one side and, less clearly, equipment industries on the other. 2. Full-sample estimation shows a negative correlation between the durable goods and other-buildings components, and between the transportation and building-industries components. 3. Countries with zeros in some components are mainly low-income countries at the bottom of the income category; they behaved in an extreme way and distorted the main results observed in the full sample. 4. After removing these extreme cases, the conclusions do not seem very sensitive to the presence of other isolated cases.
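For reference, the Aitchison transformation most commonly used for this kind of correlation analysis is the centred log-ratio (clr); a minimal sketch (the three-part capital composition is a hypothetical example):

```python
import numpy as np

def clr(parts):
    """Aitchison centred log-ratio transform of a strictly positive composition.
    Zeros must be handled (e.g. replaced) before transforming."""
    x = np.asarray(parts, dtype=float)
    g = np.exp(np.mean(np.log(x)))      # geometric mean of the parts
    return np.log(x / g)

# capital shares of a hypothetical country: buildings, equipment, transport
shares = clr([0.5, 0.3, 0.2])
```

The clr coordinates sum to zero and are invariant to rescaling the composition, which is why association measures are computed on them rather than on the raw shares; the zero components noted in finding 3 must be imputed first, since the log is undefined at zero.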
Abstract:
Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and set into perspective. The maximum likelihood estimator of the mixing distribution is constructed for any number of components up to the global nonparametric maximum likelihood bound using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, along with some generalisations of Zelterman's estimator. All computations are done with CAMCR, software developed specifically for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems with the mixture model-based estimators are highlighted.
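Of the estimators mentioned, Chao's lower bound is simple enough to sketch directly (the frequency counts below are made up for illustration):

```python
def chao_lower_bound(f1, f2, n_observed):
    """Chao's lower-bound estimate of total population size.

    f1: units captured exactly once, f2: units captured exactly twice,
    n_observed: total number of distinct units observed."""
    if f2 == 0:
        # bias-corrected variant used when no doubletons are observed
        return n_observed + f1 * (f1 - 1) / 2.0
    return n_observed + f1 * f1 / (2.0 * f2)

estimate = chao_lower_bound(f1=10, f2=5, n_observed=15)   # 15 + 100/10 = 25.0
```

The mixture-model estimators discussed in the abstract generalise this idea by modelling the full frequency distribution rather than only the singleton and doubleton counts.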
Abstract:
The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion thanks to their higher resolution. The weakness of existing bottom-up models in capturing individual green-technology buying behaviour has been identified. Consequently, Markov chains, neural networks, and agent-based modelling are proposed as possible methods for incorporating buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
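A toy sketch of the kind of agent-based diffusion mechanism such a prototype builds on (fully mixed population, adoption probability proportional to the current adopter share; all parameters are illustrative, not the paper's calibration):

```python
import random

def simulate_adoption(n_agents=1000, steps=20, influence=0.1, seed=1):
    """Toy agent-based technology diffusion: each period, a non-adopter
    adopts with probability influence * (current share of adopters).
    Returns the adopter share after each step."""
    rng = random.Random(seed)
    adopted = [rng.random() < 0.02 for _ in range(n_agents)]  # early adopters
    shares = []
    for _ in range(steps):
        share = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < influence * share:
                adopted[i] = True
        shares.append(sum(adopted) / n_agents)
    return shares

shares = simulate_adoption()
```

A realistic model replaces the fully mixed market with a social network and conditions the adoption probability on agent attributes (income, dwelling type, policy incentives), which is where the large input-data requirement noted above comes from.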
Abstract:
This dissertation presents two papers on how simple systemic risk measures can be used to assess portfolio risk characteristics. The first paper examines the Granger-causation of systemic risk indicators based on correlation matrices of stock returns. Special focus is devoted to the Eigenvalue Entropy, as some previous literature reported strong results without considering different macroeconomic scenarios; the Index Cohesion Force and the Absorption Ratio are also considered. For the S&P500, there is no evidence of Granger-causation from the Eigenvalue Entropies or the Index Cohesion Force. The Absorption Ratio Granger-caused both the S&P500 and the VIX index, making it the only simple measure that passed this test. The second paper develops this measure to capture the regimes underlying the American stock market. New indicators are built using filtering and random matrix theory. The returns of the S&P500 are modelled as a mixture of normal distributions, with the activation of each normal distribution governed by a Markov chain whose transition probabilities are a function of the indicators. The model shows that a Herfindahl-Hirschman Index of the normalised eigenvalues gives the best fit to the returns from 1998-2013.
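A minimal sketch of the eigenvalue-concentration indicator named at the end, a Herfindahl-Hirschman index of the normalised eigenvalues of the return correlation matrix (the simulated four-stock panels are illustrative, not the dissertation's data):

```python
import numpy as np

def eigenvalue_hhi(returns):
    """Herfindahl-Hirschman index of the normalised eigenvalues of the
    return correlation matrix. Near 1/N when no common factor dominates,
    near 1 when a single factor drives every stock."""
    corr = np.corrcoef(returns, rowvar=False)
    lam = np.linalg.eigvalsh(corr)
    p = lam / lam.sum()                 # eigenvalue shares (sum to 1)
    return float(np.sum(p ** 2))

rng = np.random.default_rng(0)
common = rng.normal(size=(500, 1))
coupled = np.tile(common, (1, 4))       # one factor drives all four stocks
independent = rng.normal(size=(500, 4)) # no common factor
hhi_coupled = eigenvalue_hhi(coupled)
hhi_independent = eigenvalue_hhi(independent)
```

In the regime-switching model above, a rolling version of this index would feed the transition probabilities of the Markov chain.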