14 results for Cluster Analysis of Variables

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

This study examines the transport and logistics cluster of Northwest Russia. The aim is to determine the cluster's current structure and competitiveness, as well as the business opportunities it offers to Finnish logistics companies. The study covers four basic modes of transport: rail, road, sea and inland waterway, and air traffic. The research material was collected through surveys and interviews designed for the study, as well as from previously published material. Russia has planned to develop its transport infrastructure vigorously, for example by publishing a protectionist transport strategy plan. The problem has been implementation, which has usually remained incomplete. At present, real competitiveness is found only in rail transport; the other three modes possess only potential competitiveness. Russia has the opportunity to benefit from its vast land area as a link between Asian and European traffic. One of the most concrete examples is the Trans-Siberian railway, which still needs further development. Finland has served as a transit country for high-value goods in Russian traffic; in 2003 roughly 30–40 % of the value of Russian imports passed through Finland. High-value goods will continue to be imported into Russia for many years to come, but competition between routes has tightened. Two models are proposed for the business opportunities of Finnish companies: value-added logistics (VAL) operations for transit traffic in Finland, or establishment within Russian logistics chains. Finnish actors should improve cooperation between companies, universities and other educational institutions. Seeking partners, for example in Sweden, could also bring significant benefits. Finnish expertise could best be exploited by establishing a presence in the Russian market, for example by concentrating on the management of Russian logistics chains. Managing VAL services in Russia would also present a very good opportunity, because Russia's own logistics know-how has not yet developed to the international level, while its cost level is lower than Finland's.

This work studies the effect of variation in the local weld geometry on the fatigue strength of welded, non-load-carrying X-joints under tensile loading. The variables were the weld toe radius, the size of the cold lap and the flank angle. The parametric dependence of the geometric variables was analysed with several finite element models. The fatigue assessment was carried out using linear elastic fracture mechanics (LEFM) under plane strain, with steel as the material. The maximum principal stress criterion was used to predict the crack growth direction, and the stress intensity factors were determined with the J-integral. The crack initiation phase was not taken into account. The structure was assumed to be in the as-welded condition, and the stress range was taken as fully effective. Paris' law was used to predict the crack growth rate. The fatigue strength results are presented as characteristic fatigue classes (FAT) and fitted to a parametric equation. Finally, the results predicted by the fatigue analysis are compared with the available fatigue test results.
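
The Paris' law step described above amounts to integrating the crack growth rate over the crack depth. A minimal sketch follows; the material constants, geometry factor and stress range are illustrative assumptions, not values from the thesis, and the constant-geometry-factor handbook form stands in for the thesis's J-integral-based stress intensity factors.

```python
import numpy as np

# Paris' law: da/dN = C * (dK)^m, with dK = Y * d_sigma * sqrt(pi * a).
# Constants below are illustrative, chosen for units of mm and MPa*sqrt(mm).
C = 3e-13        # crack growth coefficient (assumed)
m = 3.0          # Paris exponent, typical order for structural steel
Y = 1.12         # geometry factor, assumed constant here
d_sigma = 100.0  # effective stress range, MPa (assumed)

a0, af = 0.1, 10.0               # initial and final crack depths, mm
a = np.linspace(a0, af, 10_000)
dK = Y * d_sigma * np.sqrt(np.pi * a)

# Cycles to grow from a0 to af: N = integral of da / (C * dK^m),
# evaluated with a trapezoidal rule.
f = 1.0 / (C * dK**m)
N = np.sum((f[1:] + f[:-1]) * 0.5 * np.diff(a))
print(f"predicted life: {N:.3e} cycles")
```

Most of the life is spent while the crack is small, which is why the integrand is dominated by the region near `a0`.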

Prediction of stock market valuation is of common interest to all market participants. A theoretically sound market valuation can be achieved by discounting the future earnings of equities to the present. Competing valuation models seek variables that explain equity market valuation and that could also be used to predict it. In this paper we test the contemporaneous relationship between stock prices, forward-looking earnings and long-term government bond yields. We test this so-called Fed model in a long- and short-term time series analysis. In order to test the dynamics of the relationship, we use the cointegration framework. The data used in this study span four decades of varying market conditions, from 1964 to 2007, for the United States. The empirical results of our analysis do not support the Fed model. We show that long-term government bonds do not play a statistically significant role in this relationship. The effect of the forward earnings yield on stock market prices is significant, and we therefore suggest the use of standard valuation ratios when trying to predict the future paths of equity prices. Changes in long-term government bond yields also have no significant short-term impact on stock prices.
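
The cointegration framework mentioned above can be sketched with the Engle–Granger two-step idea. The series below are synthetic stand-ins for the bond yield and the forward earnings yield, and the residual check uses a crude AR(1) coefficient rather than a proper ADF test with its critical values.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
# Synthetic illustration only: one common stochastic trend drives both series.
trend = np.cumsum(rng.normal(size=T))
bond_yield = trend + rng.normal(scale=0.5, size=T)
earnings_yield = 0.8 * trend + rng.normal(scale=0.5, size=T)

# Step 1: cointegrating regression earnings_yield = a + b * bond_yield + u.
X = np.column_stack([np.ones(T), bond_yield])
beta, *_ = np.linalg.lstsq(X, earnings_yield, rcond=None)
resid = earnings_yield - X @ beta

# Step 2: AR(1) coefficient of the residual; values well below 1 suggest
# mean reversion (a real test would use ADF critical values instead).
rho = np.polyfit(resid[:-1], resid[1:], 1)[0]
print(f"slope b = {beta[1]:.3f}, residual AR(1) rho = {rho:.3f}")
```

If the series were not cointegrated, the residual would itself behave like a random walk and `rho` would sit near 1.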

Electricity spot prices have always been a demanding data set for time series analysis, mostly because of the non-storability of electricity. This feature, which sets electric power apart from other commodities, causes pronounced price spikes. Moreover, the last several years in the financial world suggest that 'spiky' behaviour of time series is no longer an exception but rather a regular phenomenon. The purpose of this paper is to seek patterns and relations within electricity price outliers and to verify how they affect the overall statistics of the data. The study employs techniques such as the classical Box-Jenkins approach, DFT smoothing of the series and GARCH models. The results obtained for two geographically different price series show that the patterns in outlier occurrence are not straightforward. Additionally, there seems to be no rule that would predict the appearance of a spike from volatility, while the reverse effect is quite prominent. It is concluded that spikes cannot be predicted from the price series alone; geographical and meteorological variables probably need to be included in the modeling.
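
A simple way to flag the price outliers discussed above is a robust z-score based on the median and the median absolute deviation (MAD). The hourly series below is synthetic, and the threshold of 5 is an arbitrary illustrative choice, not one taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic hourly spot prices: a daily seasonal base plus injected spikes.
t = np.arange(24 * 60)
prices = 40 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(scale=2, size=t.size)
spike_idx = rng.choice(t.size, size=12, replace=False)
prices[spike_idx] += rng.uniform(80, 150, size=12)

# Robust spike detection: distance from the median in MAD units.
med = np.median(prices)
mad = np.median(np.abs(prices - med))
z = 0.6745 * (prices - med) / mad   # ~standard-normal scale for Gaussian data
spikes = np.flatnonzero(z > 5)
print(f"flagged {spikes.size} spikes")
```

Median and MAD are used instead of mean and standard deviation precisely because the spikes themselves would otherwise inflate the scale estimate.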

This thesis focuses on statistical analysis methods and proposes the use of Bayesian inference to extract the information contained in experimental data by estimating the parameters of an Ebola model. The model is a system of differential equations expressing the behavior and dynamics of Ebola. Two data sets (onset and death data) were both used to estimate the parameters, which was not done in previous research (Chowell, 2004). To be able to use both data sets, a new version of the model was built. The model parameters were estimated and then used to calculate the basic reproduction number and to study the disease-free equilibrium. The estimates were useful for determining how well the model fits the data and how good the estimates were, in terms of the information they provide about the possible relationships between variables. The Ebola model fits the observed onset data at 98.95 % and the observed death data at 93.6 %. Since Bayesian inference cannot be performed analytically, the Markov chain Monte Carlo approach was used to generate samples from the posterior distribution over the parameters. The samples were used to check the accuracy of the model and other characteristics of the target posteriors.
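
The Markov chain Monte Carlo step can be illustrated with a random-walk Metropolis sampler. The model below is a deliberately simplified stand-in (a single Poisson rate with a flat prior on count data), not the ODE-based Ebola model of the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy data: weekly counts with an unknown underlying rate per 100 exposures.
true_rate = 0.4
data = rng.poisson(true_rate * 100, size=50)

def log_post(rate):
    # Poisson log-likelihood up to a constant, flat prior on rate > 0.
    if rate <= 0:
        return -np.inf
    lam = rate * 100
    return np.sum(data * np.log(lam) - lam)

# Random-walk Metropolis: propose, then accept with probability
# min(1, posterior ratio).
samples = []
rate = 1.0
lp = log_post(rate)
for _ in range(5000):
    prop = rate + rng.normal(scale=0.02)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        rate, lp = prop, lp_prop
    samples.append(rate)

post = np.array(samples[1000:])   # discard burn-in
print(f"posterior mean rate = {post.mean():.3f}")
```

The same accept/reject skeleton carries over to the ODE model; only `log_post` changes to solve the differential equations and score the onset and death series.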

The strategic group theory provides an intermediate level of analysis, between a single company and the whole industry, for identifying issues about a company's competitive position and strategic choices. Strategic groups are companies within an industry with similar strategic characteristics or competing on similar bases. Strategic choices are aligned with a firm's resources. The purpose of this study was to identify the strategic groups in the European wind energy industry and to study whether a given group membership results in financial performance differences. Altogether 80 European wind energy companies were included in the study and clustered into four strategic groups according to their age and growth rate. Each group corresponds to a different strategy. The results show that wind energy companies can be clustered according to the chosen strategic characteristics. Strategic decisions were investigated with characteristic variables, and performance variables measuring the profitability, liquidity and solvency of the groups were used in the analysis. These strategic choices did not have a significant influence on the firms' performance: the more mature and slower-growing group proved to be the most successful, but the differences between groups were generally not statistically significant. The only statistically significant difference found was in the solvency ratio between the Mature Slow and Young Rapid groups. Measured with these variables, more mature and slower-growing companies performed better; thus strategic group membership is reflected in performance differences only to a limited extent.
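
Clustering firms on age and growth rate, as done above, can be sketched with a plain k-means pass. The two synthetic groups below merely echo the "Mature Slow" and "Young Rapid" labels; they are not the actual company data of the study, and k=2 is used instead of the study's four groups to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic firms as (age in years, annual growth rate) pairs.
mature_slow = np.column_stack([rng.normal(25, 3, 40), rng.normal(0.05, 0.02, 40)])
young_rapid = np.column_stack([rng.normal(5, 2, 40), rng.normal(0.40, 0.10, 40)])
firms = np.vstack([mature_slow, young_rapid])

# Standardize so that age (years) and growth (ratio) carry equal weight.
Z = (firms - firms.mean(axis=0)) / firms.std(axis=0)

# Plain k-means with k=2, deterministically seeded with one point per group.
centers = Z[[0, -1]].copy()
for _ in range(20):
    # Assign each firm to its nearest center, then move centers to the means.
    labels = np.argmin(((Z[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([Z[labels == k].mean(axis=0) for k in range(2)])
print(np.bincount(labels, minlength=2))
```

Standardizing first matters: without it, the age axis (measured in years) would dominate the Euclidean distances and the growth rate would barely influence the clustering.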

Raw measurement data do not always immediately convey useful information, but applying statistical analysis tools to the data can improve the situation. Data analysis can offer benefits such as acquiring meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. The study uses tools such as Qlucore Omics Explorer (QOE) and sparse Bayesian (SB) regression. Linear regression is then used to build a model based on a subset of the variables carrying the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, it is concluded that no single model fits the whole available dataset well. It is therefore proposed as future work either to build piecewise nonlinear regression models on the same dataset, or to have the plant provide a new dataset collected in a more systematic fashion than the present data.
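
The linear-regression step on a selected subset of variables, together with the check of whether the model underestimates the variance of the true data, can be sketched on synthetic data. The coefficients, noise level and the idea that three variables were pre-selected by the sparse step are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
# Stand-in for the plant data: predict a quality variable from a few
# process variables assumed to have been selected by the SB step.
n = 200
X = rng.normal(size=(n, 3))
quality = X @ np.array([0.6, -0.3, 0.1]) + rng.normal(scale=0.8, size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, quality, rcond=None)
pred = A @ coef

# Compare the variance of predictions to the variance of observations:
# a model that "seriously underestimates the variance of the true data"
# shows a low ratio.
ratio = pred.var() / quality.var()
print(f"explained-variance ratio = {ratio:.2f}")
```

For OLS this ratio coincides with R², so a value well below 1 is exactly the variance-underestimation symptom described in the abstract.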

Due to its non-storability, electricity must be produced at the same time that it is consumed. As a result, prices are determined on an hourly basis, which makes analysis more challenging. Moreover, the seasonal fluctuations in demand and supply lead to seasonal behavior of electricity spot prices. The purpose of this thesis is to identify and remove causal effects from electricity spot prices, leaving 'pure' prices for modeling purposes. To achieve this, we use Qlucore Omics Explorer (QOE) for visualization and exploration of the data set, and the time series decomposition method to estimate and extract the deterministic components from the series. To obtain the target series, we use regression on the background variables (water reservoir and temperature). The result is three price series (Swedish, Norwegian and System prices) with no apparent pattern.

The purpose of this academic economic-geographical dissertation is to study and describe how competitiveness in the Finnish paper industry developed during 2001–2008. During these years, the Finnish paper industry faced economically challenging times. This dissertation attempts to fill the existing gap between the theoretical and empirical discussions of economic-geographical issues in the paper industry. The main research questions are: How did supply chain costs and margins develop during 2001–2008? How do sales prices, transportation, and fixed and variable costs correlate with gross margins in a spatial context? The research object of this case study is a typical large Finnish paper mill that exports over 90 % of its production. The longitudinal economic research data were obtained from the case mill's economic control system, and correlation (R²) analysis was used as the main research method. The time series data cover monthly economic and manufacturing observations from the mill from 2001 to 2008. The study reveals the development of prices, costs and transportation in the case mill, and it shows how the economic variables correlate with the paper mill's gross margins in various markets in Europe. The research methods of economic geography offer perspectives that pay attention to spatial (market) heterogeneity. This type of research has been quite scarce in the Finnish research tradition of economic geography and supply chain management, and this case study gives new insight into that tradition and its applications. As a concrete empirical result, this dissertation states that the competitive advantages of the Finnish paper industry were significantly weakened during 2001–2008 by low paper prices, costly manufacturing and expensive transportation. Statistical analysis exposes that, in several important markets, transport costs lower gross margins as much as decreasing paper prices do, which is a new finding. Paper companies should continuously pay attention to lowering manufacturing and transportation costs to achieve more profitable economic performance. The location of a mill far from its markets clearly has an economic impact on paper manufacturing, as paper demand is decreasing and oversupply is pressuring paper prices down. Therefore, market and economic forecasting in the paper industry is advantageous at the country and product levels, while simultaneously taking the specific economic-geographical dimensions into account.
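
The correlation (R²) analysis used as the main research method reduces to squaring the Pearson correlation between each cost or price variable and the gross margin. The monthly figures below are invented for illustration, not the case mill's data.

```python
import numpy as np

rng = np.random.default_rng(6)
# Illustrative monthly series (96 months, matching a 2001-2008 window):
# gross margin driven up by paper price and down by transport cost.
months = 96
transport_cost = rng.normal(20, 3, months)
paper_price = rng.normal(500, 30, months)
margin = 0.2 * paper_price - 2.0 * transport_cost + rng.normal(scale=5, size=months)

def r_squared(x, y):
    # Squared Pearson correlation between two series.
    return np.corrcoef(x, y)[0, 1] ** 2

print(f"R2(margin, transport) = {r_squared(margin, transport_cost):.2f}")
print(f"R2(margin, price)     = {r_squared(margin, paper_price):.2f}")
```

With the coefficients chosen here the two explanatory variables contribute comparable R² values, mirroring the abstract's finding that transport costs matter as much as paper prices in several markets.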

Transportation of fluids is one of the most common and energy-intensive processes in the industrial and HVAC sectors. Pumping systems are frequently subject to engineering malpractice when dimensioned, which can lead to poor operational efficiency. Moreover, pump monitoring requires dedicated measuring equipment, which implies costly investments. Inefficient pump operation and improper maintenance can increase energy costs substantially and even lead to pump failure. A centrifugal pump is commonly driven by an induction motor. Driving the induction motor with a frequency converter can diminish the energy consumption of pump drives and provide better control of a process. In addition, modern frequency converters can estimate induction machine signals, dispensing with the use of sensors. If the estimates are accurate enough, a pump can be modelled and integrated into the frequency converter control scheme. This opens the possibility of joint motor and pump monitoring and diagnostics, allowing the detection of reliability-reducing operating states that can lead to additional maintenance costs. The goal of this work is to study the accuracy of the rotational speed, torque and shaft power estimates calculated by a frequency converter. Laboratory tests were performed in order to observe estimate behaviour in both steady-state and transient operation. An induction machine driven by a vector-controlled frequency converter, coupled with another induction machine acting as the load, was used in the tests. The estimated quantities were obtained through the frequency converter's Trend Recorder software. A high-precision HBM T12 torque-speed transducer was used to measure the actual values of these variables. The effect of the flux optimization energy-saving feature on estimate quality was also studied. A processing function was developed in MATLAB for comparison of the obtained data. The results confirm the suitability of this particular converter for providing sufficiently accurate estimates for pumping applications.
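
Comparing converter estimates against transducer measurements, as in the laboratory tests above, comes down to relative-error statistics over paired samples. The data below are simulated with an assumed ~0.5 % estimation error; they are not the Trend Recorder or HBM T12 measurements.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical paired samples: transducer-measured vs converter-estimated
# rotational speed in rpm (values are illustrative).
measured = rng.uniform(500, 1500, 300)
estimated = measured * (1 + rng.normal(scale=0.005, size=300))

# Relative error of each estimate against the reference measurement.
rel_err = (estimated - measured) / measured
print(f"mean abs. relative error = {np.abs(rel_err).mean() * 100:.2f} %")
print(f"worst case               = {np.abs(rel_err).max() * 100:.2f} %")
```

The same two summary numbers, computed separately for steady-state and transient samples, are the natural way to report whether the estimates are "accurate enough" for a pumping application.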

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, to test the type of missingness mechanism, and to estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms nor the classical assumptions about measurement errors turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to bias the estimates from event history models; low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers developing methods to correct for non-sampling biases in event history data.
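
The IPCW idea assessed in the simulation study can be sketched as follows: estimate the censoring survival function with a Kaplan-Meier estimator (with the event and censoring roles swapped) and weight each observed spell by the inverse of that survival probability at its end time. All distributions below are synthetic assumptions, not the FI ECHP setup.

```python
import numpy as np

rng = np.random.default_rng(8)
# Toy spells: true spell lengths subject to independent attrition (censoring).
n = 1000
event_time = rng.exponential(10, n)    # spell lengths
censor_time = rng.exponential(25, n)   # attrition times
time = np.minimum(event_time, censor_time)
observed = event_time <= censor_time   # True if the spell end was seen

def km_survival(times, events, grid):
    # Kaplan-Meier survival curve evaluated on an ascending time grid.
    order = np.argsort(times)
    t_sorted, e_sorted = times[order], events[order]
    surv = np.ones_like(grid)
    s, at_risk, j = 1.0, len(times), 0
    for i, g in enumerate(grid):
        while j < len(t_sorted) and t_sorted[j] <= g:
            if e_sorted[j]:
                s *= 1 - 1 / at_risk
            at_risk -= 1
            j += 1
        surv[i] = s
    return surv

# Censoring survival G(t): treat censoring as the "event" of interest.
grid = np.linspace(0, 30, 61)
G = km_survival(time, ~observed, grid)

# IPCW weight for each spell: 1 / G(end time); times beyond the grid are
# clamped to the last grid value by np.interp.
weights = 1.0 / np.interp(time, grid, G)
w_obs = weights[observed]
print(f"mean IPCW weight among observed spells = {w_obs.mean():.2f}")
```

Long observed spells get the largest weights, because spells like them were the most likely to be cut short by attrition; this is how the reweighting counters censoring bias in the Kaplan-Meier and Cox estimators.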

OBJECTIVES: The purpose of this thesis is to examine the liquidity levels of different industries between 2007 and 2013. It also reviews the literature on cash management and liquidity, various ratios describing liquidity, and factors that affect liquidity. In addition, it examines the information and communication (IC) sector in more detail. DATA: The data were collected from the Orbis database. The industry averages were calculated either with the formulas presented in Chapter 2 or retrieved directly from the database. The scatter plots were produced with Excel, and the correlation matrix and regression analyses with SAS EG. RESULTS: This study presents industry averages for the liquidity ratio, solvency ratio and gearing, as well as for many other ratios that describe or affect liquidity. The study shows that, on average, liquidity and solvency have remained fairly stable, but industry-specific changes are strong. In the IC sector, liquidity is affected by the contribution margin, the number of employees, turnover, total assets and payment terms.
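
The ratios examined above can be computed directly from balance sheet figures. The definitions below follow common usage (they may differ in detail from the Chapter 2 formulas), and the figures are invented, not drawn from the Orbis data.

```python
# Illustrative balance sheet figures (in millions, made up for the example).
current_assets = 120.0
current_liabilities = 80.0
equity = 150.0
total_assets = 400.0
interest_bearing_debt = 90.0
cash = 30.0

# Current ratio: short-term assets relative to short-term liabilities.
current_ratio = current_assets / current_liabilities

# Solvency (equity) ratio: share of assets financed with equity, in %.
solvency_ratio = equity / total_assets * 100

# Net gearing: net interest-bearing debt relative to equity, in %.
gearing = (interest_bearing_debt - cash) / equity * 100

print(f"current ratio  = {current_ratio:.2f}")
print(f"solvency ratio = {solvency_ratio:.1f} %")
print(f"net gearing    = {gearing:.1f} %")
```

Industry averages like those reported in the thesis are then just these ratios computed per firm and averaged within each industry classification.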

Over time, the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008, the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground but still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover the issues affecting active portfolio management. European fixed income data are employed in an empirical study that seeks to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation uses a Vector Autoregressive (VAR) model to create return forecasts from lagged values of the asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) are divided into two parts: the in-sample data are used for calibrating the strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. The results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts the portfolio weights in a controlled manner. The TAA portfolio shows promise, especially in moderately shifting the allocation toward riskier assets as the market turns bullish, without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues; however, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
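
The VAR forecasting step that feeds the tactical views can be sketched as a least-squares fit of a VAR(1) on synthetic return series. The coefficient matrix and noise scale are assumptions for illustration, not estimates from the European fixed income data, and the B-L view-blending step is omitted.

```python
import numpy as np

rng = np.random.default_rng(9)
# Simulate two zero-mean return series from a known, stable VAR(1).
T = 400
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.3]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate the VAR(1) coefficients equation by equation with least squares:
# y[t] ~ A_hat @ y[t-1].
X, Y = y[:-1], y[1:]
sol, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = sol.T

# One-step-ahead return forecast from the last observation; in a B-L
# application this forecast would become the investor's "view".
forecast = A_hat @ y[-1]
print("A_hat =\n", np.round(A_hat, 2))
```

In the thesis setup the state vector also includes lagged economic variables, but the fitting and forecasting mechanics are the same.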

This paper explores the behavioral patterns of web users on an online magazine website. The goal of the study is first to find and visualize user paths within the data generated during collection, and then to identify some generic typologies of user behavior. To form a theoretical foundation for processing the data and identifying behavioral archetypes, the study relies on established consumer behavior literature to propose typologies of behavior. For data processing, the study applies cluster analysis and sequential path analysis to a dataset of clickstream data generated from the real-life clicks of 250 randomly selected website visitors over a period of six weeks. Based on the data collected, an exploratory method is followed in order to find and visualize commonly occurring user paths on the website. Six distinct behavioral typologies were recognized, with the dominant user consuming mainly blog content as opposed to editorial content. Most importantly, approximately 80 % of clicks fell into the blog content category, meaning that the majority of web traffic on the site takes place in content other than the desired editorial content pages. The outcome of the study is a set of managerial recommendations for each identified behavioral archetype.
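
The first processing step for such clickstream data, computing per-category click shares and per-visitor paths as input for sequential path analysis, can be sketched as follows; the visitor records are invented for illustration.

```python
from collections import Counter

# Hypothetical clickstream records as (visitor_id, content_category) pairs.
clicks = [
    (1, "blog"), (1, "blog"), (1, "editorial"),
    (2, "blog"), (2, "blog"), (2, "blog"),
    (3, "editorial"), (3, "blog"), (3, "blog"),
]

# Share of clicks per content category (the study reports ~80 % blog).
totals = Counter(cat for _, cat in clicks)
share = {cat: n / len(clicks) for cat, n in totals.items()}

# Per-visitor paths in click order: the raw sequences that path analysis
# visualizes and that clustering groups into behavioral typologies.
paths = {}
for visitor, cat in clicks:
    paths.setdefault(visitor, []).append(cat)

print(share)
print(paths[1])
```

Each path is then typically encoded (for example as category-transition counts) before clustering visitors into archetypes.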