989 results for Sample data
Abstract:
The main purpose of an Experimental Design resides mainly in the search for relationships between variables and in comparing levels of factors, using statistical treatment of collected data. The use of blocks in Experimental Design is essential because it allows reducing or eliminating the variability introduced by factors that can influence the experiment but are not of main interest and/or were not explicitly included during the experiments. In this work we present the results of the study and research of Balanced Incomplete Block Designs (BIBD), Balanced Incomplete Block Designs with repeated blocks (BIBDR) and Incomplete Block Designs with blocks of different sizes (VBBD). We explore some properties and construction methods of such designs and illustrate, when possible, with examples. Based on block designs, we present an application of BIBDR in Education, with the aim of comparing five domains of algebraic thinking in a sample of 1st year students of higher education in Cape Verde. For the analysis of the sample data, the software R was used, version 2.12.1. We observed that significant differences exist between some of the domains of algebraic thinking, especially between the domains of Generalization of Arithmetic and Algebraic Technicality and the remaining domains. For greater representativeness, we recommend a larger sample consisting of students from all higher education institutions of Cape Verde.
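For orientation, the defining parameter relations of a BIBD can be checked mechanically; the sketch below is an illustration added here, not code from the work:

```python
def is_valid_bibd(v, b, r, k, lam):
    """Check the necessary parameter conditions for a BIBD(v, b, r, k, lambda):
    b blocks of size k over v treatments, each treatment appearing in r
    blocks, and every pair of treatments co-occurring in lambda blocks."""
    return (b * k == v * r                    # counting incidences two ways
            and lam * (v - 1) == r * (k - 1)  # counting pairs through one treatment
            and b >= v)                       # Fisher's inequality

# With v = 5 treatments (e.g. five domains) and blocks of size k = 2,
# taking all 10 pairs as blocks gives r = 4 and lambda = 1.
print(is_valid_bibd(5, 10, 4, 2, 1))  # prints True
```

These conditions are necessary but not sufficient; actual construction of a design with admissible parameters is the harder problem the work addresses.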
Abstract:
The modeling and estimation of the parameters that define the spatial dependence structure of a regionalized variable by geostatistical methods are fundamental, since these parameters, underlying the kriging of unsampled points, allow the construction of thematic maps. One or more atypical observations in the sample data can affect the estimation of these parameters. Thus, assessing the combined influence of these observations through Local Influence analysis is essential. The purpose of this paper was to propose local influence analysis methods for the regionalized variable, assuming it follows an n-variate Student's t-distribution, and to compare them with local influence analysis when the same regionalized variable follows an n-variate normal distribution. These local influence analysis methods were applied to soil physical properties and soybean yield data from an experiment carried out in a 56.68 ha commercial field in western Paraná, Brazil. Results showed that influential values are efficiently determined with the n-variate Student's t-distribution.
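The spatial-dependence structure referred to above is usually estimated from an empirical semivariogram before kriging; a minimal sketch of the classical Matheron estimator (toy data, not the soybean-field data) could look like:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) estimator: for each lag h, gamma(h) is half the
    mean squared difference of values over point pairs whose separation
    distance is within tol of h. A semivariogram model is then fitted to
    these points to obtain the spatial-dependence parameters."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)          # each pair counted once
    d = dist[iu]
    sq = ((values[:, None] - values[None, :]) ** 2)[iu]
    return [0.5 * sq[np.abs(d - h) <= tol].mean() for h in lags]
```

The fitted model (nugget, sill, range) is exactly the part of the analysis that atypical observations, and hence local influence diagnostics, can change.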
Abstract:
This thesis searches for a correlation between the results obtained through software metrics and the defects found in the software. Existing software products are used as the test group. The work investigates whether software metrics could have been used to locate the problem areas of the software and thereby provide valuable information for software development. Measurement could be used to allocate resources better in code reviews, code integration, system testing, and scheduling; with measurement data, these tasks would have more information to guide resource allocation. The test group consists of various software products. Common to all of them is their sequence of consecutive releases: when a new release is made, the previous release is used as the base on top of which new source code is developed. For this reason, the software measurement must be able to separate the source code of the previous release from the new source code. The software metrics used in this work are common ones, widely used in software engineering to measure various source-code properties thought to affect defect-proneness. The purpose of this work is to study the usability of these metrics in the software environments serving as the test group. The practical part of the work succeeded in finding a correlation between some of the software metrics and defects, while other metrics did not give convincing results. Using software metrics appears to make it possible to identify the defect-prone parts of a program and thus improve the effectiveness of software development. The use of software metrics in product development is justifiable, and they could potentially be used to influence software quality in future releases.
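The core of the empirical part, correlating a source-code metric with defect counts per module, can be sketched with entirely hypothetical data:

```python
import numpy as np

# Hypothetical per-module measurements (not the thesis's data): a source
# code metric (e.g. a complexity measure) and the defect count per module.
metric  = np.array([2.0, 5.0, 8.0, 12.0, 15.0, 20.0, 25.0, 30.0])
defects = np.array([0.0, 1.0, 1.0, 3.0, 4.0, 5.0, 8.0, 10.0])

# Pearson correlation coefficient; a value near +1 would support using the
# metric to rank modules by defect risk when allocating review effort.
r = np.corrcoef(metric, defects)[0, 1]
```

In practice only new or changed code per release would enter the analysis, per the separation requirement described above.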
Abstract:
The aim of this thesis was to examine the factors affecting the forecasting accuracy of innovation diffusion models. The thesis used a logistic model to forecast the diffusion of mobile phone subscriptions in three European countries: Finland, France, and Greece. The theoretical part focused on forecasting the diffusion of innovations with diffusion models, with particular emphasis on the models' predictive ability and their usability in different situations. The empirical part concentrated on forecasting with a logistic diffusion model calibrated on time series aggregated in different ways; the resulting forecasts were examined to determine the effects of the level of data aggregation. The research design was empirical, studying the forecasting accuracy of the logistic diffusion model while varying the aggregation level of the sample data. The data fed into the diffusion model can be collected monthly and per operator without affecting forecasting accuracy. The data must, however, include the turning point of the diffusion curve, i.e., the point of long-run peak demand.
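The logistic diffusion model used for these forecasts has a simple closed form; a minimal sketch (parameter values are placeholders, not estimates from the thesis) is:

```python
import math

def logistic_diffusion(t, K, b, t0):
    """Cumulative adopters at time t under the logistic diffusion model:
    N(t) = K / (1 + exp(-b * (t - t0))), where K is the long-run peak
    demand (saturation level), b the growth rate, and t0 the inflection
    point of the diffusion curve."""
    return K / (1.0 + math.exp(-b * (t - t0)))
```

N(t0) equals K/2, so the calibration series must reach the inflection point t0 for the peak-demand level K to be identifiable, which is the requirement noted above.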
Abstract:
The purpose of this work was to use linear regression analysis to study both the determinants of dividend policy and the stock price effect of dividends on the Helsinki Stock Exchange. The determination of dividend policy was studied in terms of company size, profitability, leverage, investment and growth opportunities, and insider ownership. The data consist of the financial statements and stock prices of companies listed on the Helsinki Stock Exchange in 2000-2010. The empirical analyses showed that the factors affecting dividend yield on the Helsinki Stock Exchange are the company's profitability, leverage, investment and growth opportunities, and insider ownership. The results are in line with previous research. Another significant finding is that dividends were found to have a positive relationship with stock price changes on the Helsinki Stock Exchange. The relationship between dividends and stock prices supports the signaling theory.
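A regression of this kind can be sketched with ordinary least squares; the data below are hypothetical stand-ins for two of the determinants, not the thesis's sample:

```python
import numpy as np

# Hypothetical firm-level observations: dividend yield explained by
# profitability and leverage (two of the determinants studied).
profitability = np.array([0.10, 0.15, 0.20, 0.25, 0.30, 0.35])
leverage      = np.array([0.20, 0.30, 0.10, 0.40, 0.20, 0.50])
div_yield     = np.array([0.020, 0.030, 0.045, 0.045, 0.060, 0.060])

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(div_yield)), profitability, leverage])
beta, *_ = np.linalg.lstsq(A, div_yield, rcond=None)
# beta = [intercept, coefficient on profitability, coefficient on leverage]
```

The full study would add company size, investment and growth opportunities, and insider ownership as further regressors.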
Abstract:
Changes in the frequency of occurrence of extreme weather events have been pointed out as a likely impact of global warming. In this context, this study aimed to detect climate change in the series of extreme minimum and maximum air temperatures of Pelotas, State of Rio Grande do Sul, Brazil (1896-2011), and its influence on the probability of occurrence of these variables. We used the generalized extreme value (GEV) distribution in its stationary and non-stationary forms; in the latter case, the GEV parameters vary over time. On the basis of goodness-of-fit tests and the maximum likelihood method, the GEV model in which the location parameter increases over time presents the best fit to the daily minimum air temperature series. This result describes a significant increase in the mean values of this variable, which indicates a potential reduction in the frequency of frosts. The daily maximum air temperature series is also described by a non-stationary model, whose location parameter decreases over time and whose scale parameter, related to the sample variance, rises between the beginning and end of the series. This result indicates a drop in the mean daily maximum air temperature and an increased dispersion of the sample data.
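The non-stationary GEV idea can be sketched as follows. The parameters are illustrative only, and extremes of minima are commonly handled by negating the series before fitting, a detail omitted here:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """CDF of the generalized extreme value distribution with location mu,
    scale sigma, and shape xi; xi = 0 reduces to the Gumbel case."""
    z = (x - mu) / sigma
    if xi == 0.0:
        return math.exp(-math.exp(-z))
    t = 1.0 + xi * z
    if t <= 0.0:                        # outside the support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def prob_below(x, year, mu0, mu1, sigma, xi):
    """Non-stationary variant with a linear trend in location,
    mu(year) = mu0 + mu1 * year: probability the extreme falls below x."""
    return gev_cdf(x, mu0 + mu1 * year, sigma, xi)
```

With mu1 > 0, the probability of the annual extreme falling below a fixed frost threshold declines over the years, matching the reported reduction in frost frequency.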
Abstract:
Knowledge of natural water availability, which is characterized by low flows, is essential for the planning and management of water resources. One of the most widely used hydrological techniques for determining streamflow is regionalization, but extrapolating regionalization equations beyond the limits of the sample data is not recommended. This paper proposes a new method for reducing the overestimation errors associated with extrapolating regionalization equations for low flows. The method is based on the use of a threshold value for the maximum specific low flow discharge estimated at the gauging sites used in the regionalization. When a specific low flow estimated using the regionalization equation exceeds the threshold value, the low flow can instead be obtained by multiplying the drainage area by the threshold value. This restriction imposes a physical limit on the low flow, which reduces the error of overestimating flows in regions of extrapolation. A case study was carried out in the Urucuia river basin, in Brazil, and the results showed that the method performed well in reducing the risk of extrapolation.
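The proposed restriction reduces to a simple rule; a sketch with illustrative units:

```python
def regionalized_low_flow(area_km2, regression_estimate_m3s, max_specific_m3s_km2):
    """Cap on extrapolated low-flow estimates: if the specific low flow
    implied by the regionalization equation (discharge per unit drainage
    area) exceeds the maximum specific low flow observed at the gauging
    sites, replace the estimate with area * threshold. Unit choices here
    are illustrative."""
    if regression_estimate_m3s / area_km2 > max_specific_m3s_km2:
        return area_km2 * max_specific_m3s_km2
    return regression_estimate_m3s
```

Inside the range of the sample data the regression estimate passes through unchanged; the cap only binds in the extrapolation region, where overestimation is the risk.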
Abstract:
The aim of this thesis is to examine whether pricing anomalies exist in the Finnish stock markets by comparing the performance of quantile portfolios formed on the basis of individual valuation ratios, composite value measures, or combined value and momentum indicators. All the research papers included in the thesis show evidence of value anomalies in the Finnish stock markets. In the first paper, the sample of stocks over the 1991-2006 period is divided into quintile portfolios based on four individual valuation ratios (i.e., E/P, EBITDA/EV, B/P, and S/P) and three hybrids of them (i.e., composite value measures). The results show the superiority of composite value measures as a selection criterion for value stocks, particularly when EBITDA/EV is employed as the earnings multiple. The main focus of the second paper is the impact of the holding period length on the performance of value strategies. As an extension of the first paper, two more individual ratios (i.e., CF/P and D/P) are included in the comparative analysis. The sample of stocks over the 1993-2008 period is divided into tercile portfolios based on six individual valuation ratios and three hybrids of them. The use of either the dividend yield criterion or one of the three composite value measures examined results in the best value portfolio performance according to all performance metrics used. Parallel to the findings of many international studies, our results from performance comparisons indicate that, for the sample data employed, yearly reformation of portfolios is not necessarily optimal for gaining maximally from the value premium. Instead, the value investor may extend the holding period up to 5 years without any decrease in long-term portfolio performance. The same holds for the results of the third paper, which examines the applicability of the data envelopment analysis (DEA) method in discriminating undervalued stocks from overvalued ones.
The fourth paper examines the added value of combining price momentum with various value strategies. Taking account of price momentum improves the performance of value portfolios in most cases. The performance improvement is greatest for value portfolios formed on the basis of the 3-composite value measure consisting of the D/P, B/P, and EBITDA/EV ratios. The risk-adjusted performance can be enhanced further by following a 130/30 long-short strategy, in which the long position in value winner stocks is leveraged by 30 percentage points while simultaneously selling short glamour loser stocks by the same amount. The average return of the long-short position proved to be more than double the stock market average, coupled with a decrease in volatility. The fifth paper offers a new approach to combining value and momentum indicators into a single portfolio-formation criterion using different variants of DEA models. The results over the 1994-2010 sample period show that the top-tercile portfolios outperform both the market portfolio and the corresponding bottom-tercile portfolios. In addition, the middle-tercile portfolios also outperform the comparable bottom-tercile portfolios when DEA models are used as the basis for stock classification criteria. To my knowledge, such strong performance differences have not been reported in earlier peer-reviewed studies that have employed the comparable quantile approach of dividing stocks into portfolios. Consistent with the previous literature, the division of the full sample period into bullish and bearish periods reveals that the top-quantile DEA portfolios lose far less of their value during bearish conditions than do the corresponding bottom portfolios. The sixth paper extends the sample period employed in the fourth paper by one year (i.e., 1993-2009), covering also the first years of the recent financial crisis. It contributes to the fourth paper by examining the impact of stock market conditions on the main results.
Consistent with the fifth paper, value portfolios lose much less of their value during bearish conditions than do stocks on average. The inclusion of a momentum criterion adds some value for an investor during bullish conditions, but this added value turns negative during bearish conditions. During bear market periods, some of the value loser portfolios perform even better than their value winner counterparts. Furthermore, the results show that the recent financial crisis has reduced the added value of using combinations of momentum and value indicators as portfolio-formation criteria. However, since the stock markets have historically been bullish more often than bearish, the combination of the value and momentum criteria has paid off for the investor despite the fact that its added value during bearish periods is, on average, negative.
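The quantile portfolio formation common to these papers, ranking stocks on several valuation ratios and averaging the ranks into a composite score, can be sketched as follows (a simplified assumption of the procedure, not the exact implementation):

```python
import numpy as np

def composite_value_terciles(ratios):
    """Rank stocks on each valuation ratio (higher = cheaper), average the
    ranks into a composite value score, and split into terciles. `ratios`
    is an (n_stocks, n_ratios) array, e.g. columns for D/P, B/P, EBITDA/EV.
    Returns 0 for the 'glamour' tercile, 1 for middle, 2 for 'value'."""
    ratios = np.asarray(ratios, float)
    ranks = ratios.argsort(axis=0).argsort(axis=0)  # per-ratio ranks, 0 = most expensive
    score = ranks.mean(axis=1)                      # composite value score
    cut = np.percentile(score, [100 / 3, 200 / 3])  # tercile breakpoints
    return np.digitize(score, cut)
```

Momentum could be combined with value here simply by appending a past-return column to `ratios` before ranking.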
Abstract:
The aim of this study is to determine whether variables can be found in companies' financial statements that are able to predict bankruptcy, and whether profitability, solvency, and liquidity are all equally important in predicting bankruptcy. A further aim is to compare which variables explain bankruptcy in different years. The study is carried out by building prediction models for each of the five years preceding bankruptcy using logistic regression. The study is limited to small and medium-sized Finnish limited companies. The data consist of companies that went bankrupt in 2012 and of randomly selected active comparison companies. Young companies that have operated for less than four years are excluded, because their bankruptcy processes differ from those of longer-established companies.
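A fitted model of the kind described scores a firm as follows; the weights and ratio values here are purely illustrative assumptions, not coefficients estimated in the study:

```python
import math

def bankruptcy_probability(ratios, weights, intercept):
    """Logistic-regression score: p = 1 / (1 + exp(-(intercept + w . x))),
    where x holds financial-statement ratios. Weights are illustrative
    sign choices only."""
    z = intercept + sum(w * x for w, x in zip(weights, ratios))
    return 1.0 / (1.0 + math.exp(-z))

# Order of ratios: [leverage, profitability, liquidity]; higher leverage
# raises the predicted bankruptcy probability, higher profitability and
# liquidity lower it.
weights = [4.0, -3.0, -1.0]
```

Fitting one such model per year before bankruptcy, as the study does, lets the relative importance of profitability, solvency, and liquidity be compared across horizons.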
Abstract:
This thesis examines whether herding behavior is present in the Finnish stock market. The sample data cover 2004 to 2013, a total of 2,516 trading days. Market-wide herding, herding in up and down markets, herding during extreme price movements, and herding conditional on turnover volume are measured. The methods used are cross-sectional absolute deviation and cross-sectional standard deviation. The thesis found no signs of herding in the Finnish stock market.
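The cross-sectional absolute deviation measure used in such herding tests can be sketched as:

```python
import numpy as np

def csad(stock_returns, market_return):
    """Cross-sectional absolute deviation for a single trading day: the
    mean absolute gap between individual stock returns and the market
    return. Herding tests regress daily CSAD on |R_m| and R_m^2; a
    significantly negative coefficient on R_m^2 would indicate herding,
    i.e. dispersion shrinking in extreme markets."""
    r = np.asarray(stock_returns, float)
    return float(np.mean(np.abs(r - market_return)))
```

A finding of no herding corresponds to the R_m^2 coefficient in that regression being non-negative or insignificant.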
Abstract:
Premenstrual syndrome and premenstrual dysphoric disorder (PMDD) seem to form a severity continuum with no clear-cut boundary. However, since the American Psychiatric Association proposed the research criteria for PMDD in 1994, there has been no agreement about the symptomatic constellation that constitutes this syndrome. The objective of the present study was to establish the core latent structure of PMDD symptoms in a non-clinical sample. Data concerning PMDD symptoms were obtained from 632 regularly menstruating college students (mean age 24.4 years, SD 5.9, range 17 to 49). For the first random half (N = 316), we performed principal component analysis (PCA), and for the remaining half (N = 316), we tested three theory-derived competing models of PMDD by confirmatory factor analysis. PCA allowed us to extract two correlated factors, i.e., dysphoric-somatic and behavioral-impairment factors. The two-dimensional latent model derived from PCA showed the best overall fit among the three models tested by confirmatory factor analysis (χ²(53) = 64.39, P = 0.13; goodness-of-fit index = 0.96; adjusted goodness-of-fit index = 0.95; root mean square residual = 0.05; root mean square error of approximation = 0.03; 90%CI = 0.00 to 0.05; Akaike's information criterion = -41.61). The items "out of control" and "physical symptoms" loaded conspicuously on the first factor, and "interpersonal impairment" loaded higher on the second factor. The construct validity of PMDD was accounted for by two highly correlated dimensions. These results support the argument for focusing on the core psychopathological dimension of PMDD in future studies.
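The PCA step can be sketched on the item correlation matrix; this is a generic illustration, not the study's analysis code:

```python
import numpy as np

def principal_components(data, n_components):
    """PCA on the correlation matrix, a common choice for item-level
    symptom data: eigendecompose the item correlations and return the
    largest eigenvalues together with the loadings (eigenvectors scaled
    by the square root of their eigenvalues)."""
    X = np.asarray(data, float)
    R = np.corrcoef(X, rowvar=False)      # items in columns
    vals, vecs = np.linalg.eigh(R)
    order = np.argsort(vals)[::-1][:n_components]
    return vals[order], vecs[:, order] * np.sqrt(vals[order])
```

Retaining two components and rotating them obliquely would be the usual route to correlated factors such as those extracted in the study.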
Abstract:
This Master’s thesis analyses the effectiveness of different hedging models for the BRICS countries (Brazil, Russia, India, China, and South Africa). Hedging performance is examined by comparing two dynamic hedging models to a conventional OLS regression-based model. The dynamic hedging models employed are Constant Conditional Correlation (CCC) GARCH(1,1) and Dynamic Conditional Correlation (DCC) GARCH(1,1) with Student’s t-distribution. In order to capture both the Great Moderation and the latest financial crisis, the sample period extends from 2003 to 2014. To determine whether the dynamic models outperform the conventional one, the reduction of portfolio variance for in-sample data with contemporaneous hedge ratios is first determined, and the holding period of the portfolios is then extended to one and two days. In addition, the accuracy of the hedge ratio forecasts is examined on the basis of out-of-sample variance reduction. The results are mixed and suggest that dynamic hedging models may not provide enough benefits to justify the more demanding estimation and daily portfolio adjustment. In this sense, the results are consistent with the existing literature.
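The conventional OLS benchmark reduces to the minimum-variance hedge ratio; a sketch (the return series in the example are toy numbers, not BRICS data):

```python
import numpy as np

def ols_hedge(spot_returns, futures_returns):
    """Static minimum-variance hedge: the OLS slope of spot on futures
    returns, h* = Cov(s, f) / Var(f). Variance reduction compares the
    hedged position s - h* f against the unhedged spot position."""
    s = np.asarray(spot_returns, float)
    f = np.asarray(futures_returns, float)
    h = np.cov(s, f)[0, 1] / np.var(f, ddof=1)
    hedged = s - h * f
    reduction = 1.0 - np.var(hedged, ddof=1) / np.var(s, ddof=1)
    return h, reduction
```

The CCC- and DCC-GARCH models replace the constant covariance and variance above with conditional, time-varying estimates, giving a hedge ratio that changes day by day.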
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but still need to be studied carefully. This thesis tries to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation to riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of the thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
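The forecasting step can be sketched as a one-step-ahead VAR(1) prediction; the coefficient matrix and intercept below are placeholders, not the thesis's estimates:

```python
import numpy as np

def var1_forecast(A, c, y_t):
    """One-step-ahead forecast from a VAR(1) model, y_{t+1} = c + A @ y_t,
    where the vector y stacks asset-class returns and economic variables.
    In a Black-Litterman setting, such forecasts can feed the investor
    'views' that tilt the portfolio away from the strategic benchmark."""
    return np.asarray(c, float) + np.asarray(A, float) @ np.asarray(y_t, float)
```

Higher-order VAR(p) models extend this by stacking p lags into the state vector; the one-lag case keeps the mapping from current data to views transparent.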
Abstract:
Pairs trading is an algorithmic trading strategy that is based on the historical co-movement of two separate assets, with trades executed on the basis of the degree of relative mispricing. The purpose of this study is to explore a new, alternative copula-based method for pairs trading. The objective is to find out whether the copula method generates more trading opportunities and higher profits than the more traditional distance and cointegration methods applied extensively in previous empirical studies. The methods are compared by selecting the top five pairs from the stocks of large and medium-sized companies in the Finnish stock market. The research period covers the years 2006-2015. All the methods prove profitable, and the Finnish stock market is found to be suitable for pairs trading. However, the copula method does not generate more trading opportunities or higher profits than the other methods. It seems that the limitations of the more traditional methods are not too restrictive for this particular sample data.
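Of the traditional methods mentioned, the distance method's pair-selection criterion is the simplest to sketch (an illustration, not the study's code):

```python
import numpy as np

def pair_distance(prices_a, prices_b):
    """Distance method: normalize both price series to start at 1, then sum
    the squared deviations between them over the formation period. The
    pairs with the smallest distance are selected, and trades open when
    the spread diverges beyond a chosen multiple of its historical spread."""
    a = np.asarray(prices_a, float)
    b = np.asarray(prices_b, float)
    return float(np.sum((a / a[0] - b / b[0]) ** 2))
```

The cointegration method instead tests the price series for a stationary linear combination, and the copula method models the joint distribution of the two return series directly.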