Abstract:
The thesis aims to build a theoretical model to explain consumer investment intentions regarding stocks and investment funds. The model examines the relationships between subjective investment knowledge, expected sacrifice, expected investment value, compatibility, perceived behavioral control, and investment intentions. The data were collected through a web-based survey of 45- to 65-year-old Finnish consumers (n = 154). Confirmatory factor analysis (CFA), structural equation modeling (SEM), and t-tests were applied in analyzing the data. The results suggest that among average household consumers expected investment value consists of three dimensions, namely economic, functional, and emotional, whereas expected sacrifice consists of effort, financial risk, source risk, and psychological risk. Two structural models were assessed, one for stock investments and one for investment funds. Although the models produced somewhat different outcomes, in both models compatibility played an essential role in explaining consumer investment intentions. Compatibility was affected by expected investment value and expected sacrifice. Subjective investment knowledge influenced consumers’ evaluations of the value and sacrifices. The effect of perceived behavioral control on investment intentions was small but significant. Moreover, the results suggest that consumers with no investment experience differ significantly from consumers with investment experience in subjective investment knowledge, the dimensions of expected sacrifice and expected investment value, perceived behavioral control, compatibility, and investment intentions.
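The structural relations summarized above lend themselves to a compact SEM specification. The following is a minimal sketch, assuming the Python library semopy and lavaan-style model syntax; all construct indicators (v1, f1, i1, and so on) and the data file are illustrative placeholders rather than the thesis' actual survey instrument.

```python
# Minimal SEM sketch; semopy and the indicator names are assumptions, not the
# thesis' actual instrument. The description uses lavaan-style syntax:
# "=~" defines a latent construct by its indicators, "~" a regression path.
import pandas as pd
from semopy import Model

MODEL_DESC = """
EconomicValue =~ v1 + v2 + v3
FunctionalValue =~ f1 + f2 + f3
EmotionalValue =~ e1 + e2 + e3
Compatibility =~ c1 + c2 + c3
PBC =~ p1 + p2
Intention =~ i1 + i2
Compatibility ~ EconomicValue + FunctionalValue + EmotionalValue
Intention ~ Compatibility + PBC
"""

data = pd.read_csv("survey_responses.csv")  # hypothetical survey data file
sem = Model(MODEL_DESC)
sem.fit(data)          # estimates loadings and path coefficients
print(sem.inspect())   # table of estimates, standard errors and p-values
```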
Abstract:
The open innovation paradigm states that the boundaries of the firm have become permeable, allowing knowledge to flow inwards to accelerate internal innovation and outwards to carry unused knowledge to the external environment. The successful implementation of open innovation practices in firms such as Procter & Gamble, IBM, and Xerox suggests that it is a sustainable trend that could provide a basis for achieving competitive advantage. However, implementing open innovation can be a complex process that involves several domains of management, and its terminology, classification, and practices have not been fully agreed upon. With many possible ways to address open innovation, the following research question was formulated: How could Ericsson LMF assess which open innovation mode to select depending on the attributes of the project at hand? The research followed the constructive research approach, which has the following steps: find a practically relevant problem, obtain a general understanding of the topic, innovate the solution, demonstrate that the solution works, show the theoretical contributions, and examine the scope of applicability of the solution. The research involved three phases of data collection and analysis: an extensive literature review of open innovation, strategy, business models, innovation, and knowledge management; direct observation of the environment of the case company through participative observation; and semi-structured interviews based on six cases involving multiple and heterogeneous open innovation initiatives. Results from the cases suggest that the selection of modes depends on multiple reasons, with a stronger influence of factors related to strategy, business models, and resource gaps. Based on these and other factors found in the literature review and observations, it was possible to construct a model that supports approaching open innovation. The model integrates perspectives from multiple domains of the literature review, observations inside the case company, and factors from the six open innovation cases. It provides steps, guidelines, and tools to approach open innovation and assess the selection of modes. Measuring the impact of open innovation could take years; thus, implementing and testing the model in its entirety was not possible due to time limitations. Nevertheless, it was possible to validate the core elements of the model with empirical data gathered from the cases. In addition to constructing the model, this research contributed to the literature by increasing the understanding of open innovation, providing suggestions to the case company, and proposing future steps.
Abstract:
The objective of this thesis is to understand how to create and develop a successful place brand and how to manage it systematically. The thesis explains the phenomenon of place brands and place branding in detail and presents different sub-categories of place branding. The theoretical part provides a wide overview of the prevailing literature on place branding, place brand development, and place brand management, which forms the basis of the thesis’ theoretical framework. The empirical evidence is gathered from a case living area, which is being developed by a single construction company with a significant role in the Finnish construction industry. The evidence is collected through semi-structured in-depth interviews with carefully selected stakeholder groups of the new living area. The empirical data is then analyzed and reflected against the theoretical findings. After examining the case living area, the thesis presents a new living-area branding process model based on the prevailing theories and the empirical findings.
Abstract:
The thesis aims to build a coherent view and understanding of the innovation process and organizational technology adoption in Finnish bio-economy companies, with a focus on innovations of a disruptive nature. Because disruptive innovations are exceptional, the perspective also covers less radical innovations in order to form generalizations and a unified view of the subject. Further interests of the thesis are how ideas are discovered and generated and how the nature of the innovation and the size of the company affect technology adoption and the innovation process. The data was collected by interviewing six small and six large Finnish bio-economy companies. The results suggest that companies, regardless of size, consider innovation a core asset in competitive markets. Organizations want to be seen as innovators and early adopters, yet these ambitions are limited by certain, mainly resource-based factors. In addition, the industry, scalability, and Finland’s geographical location when seeking funding pose certain challenges. The innovation process can be considered relatively similar whether the idea or technology stems from an internal or an external source, suggesting that the technology adoption process can in fact be linked to innovation process theories. The thesis therefore introduces a new theoretical model which, based on the results of the study and the theories of technology adoption and the innovation process, characterizes how ideas and technology from both external and internal sources develop into innovations. In large bio-economy companies the innovation process is most often the stage-gate model or a modified version of it, while small companies generally have less structured processes. Nevertheless, the more disruptive the innovation, the less well it fits structured processes. This implies that disruptive innovation cannot be forced into a single mould but is rather handled case by case.
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers now ask for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Furthermore, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models in order to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or lacking tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool-chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
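As a rough illustration of what "generating tests from behavioral models" means in practice, the sketch below derives test sequences from a toy finite state machine so that every transition is exercised once. It is only a minimal, hypothetical example of the general technique and does not reproduce the UML-based tool chain developed in the thesis.

```python
# Minimal model-based testing sketch: derive test sequences from a behavioral
# model given as a finite state machine, covering every transition once.
# The model and the coverage criterion are illustrative, not the thesis' tooling.
from collections import deque

# transitions: (state, event) -> next state
MODEL = {
    ("Idle", "insert_coin"): "Ready",
    ("Ready", "press_start"): "Running",
    ("Running", "finish"): "Idle",
    ("Ready", "refund"): "Idle",
}

def generate_tests(model, initial="Idle"):
    """Return one event sequence per transition, reaching each transition's
    source state from the initial state via a shortest path (BFS)."""
    tests = []
    for (src, event), _dst in model.items():
        queue, seen, path = deque([(initial, [])]), {initial}, None
        while queue:
            state, events = queue.popleft()
            if state == src:
                path = events
                break
            for (s, e), nxt in model.items():
                if s == state and nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, events + [e]))
        if path is not None:
            tests.append(path + [event])
    return tests

for i, test in enumerate(generate_tests(MODEL), 1):
    print(f"test {i}: {' -> '.join(test)}")
```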
Abstract:
As the efficiency of a wind turbine gearbox increases, more power can be transferred from the rotor blades to the generator and less power is lost to wear and heating in the gearbox. By using a simulation model, the behavior of the gearbox can be studied before building expensive prototypes. The objective of the thesis is to model a wind turbine gearbox and its lubrication system in order to study power losses and heat transfer inside the gearbox and to study the simulation methods of the software used. The software used to create the simulation model is Siemens LMS Imagine.Lab AMESim, with which one-dimensional mechatronic system simulation models can be built from different fields of engineering. By combining components from different libraries it is possible to create a simulation model that includes mechanical, thermal, and hydraulic models of the gearbox. Results for the mechanical, thermal, and hydraulic simulations are presented in the thesis. Due to the large scale of the wind turbine gearbox and the amount of power transmitted, the power loss calculations of the AMESim software are inaccurate, and power losses are instead modelled as a constant efficiency for each gear mesh. Starting values for the thermal and hydraulic simulations were chosen from test measurements and from an empirical study, as the compact and complex design of the gearbox prevents accurate test measurements. To increase the accuracy of the simulation model in further studies, the components used for power loss calculations need to be modified and the values of unknown variables need to be determined through accurate test measurements.
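The constant-efficiency simplification mentioned above can be illustrated with a short sketch: each gear mesh passes on a fixed fraction of its input power and the remainder is booked as heat. The efficiency values and power level below are hypothetical, not taken from the thesis or the AMESim model.

```python
# Rough sketch of the constant-efficiency-per-mesh simplification: each stage
# transmits a fixed fraction of its input power; the rest is treated as heat.
# All numbers are illustrative placeholders.
def gearbox_power_flow(input_power_kw, mesh_efficiencies):
    power = input_power_kw
    losses = []
    for eta in mesh_efficiencies:
        losses.append(power * (1.0 - eta))  # heat generated at this mesh
        power *= eta                        # power passed to the next stage
    return power, losses

# e.g. a hypothetical three-stage gearbox (one planetary + two helical stages)
output_kw, stage_losses_kw = gearbox_power_flow(3000.0, [0.985, 0.99, 0.99])
print(f"output power: {output_kw:.1f} kW, losses per stage: "
      + ", ".join(f"{loss:.1f} kW" for loss in stage_losses_kw))
```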
Abstract:
Mass transfer kinetics in osmotic dehydration is usually modeled by Fick's law, empirical models, and probabilistic models. The aim of this study was to determine the applicability of the Peleg model for investigating mass transfer during osmotic dehydration of mackerel (Scomber japonicus) slices at different temperatures. Osmotic dehydration was performed on mackerel slices by cooking-infusion in solutions with glycerol and salt (a_w = 0.64) at different temperatures: 50, 70, and 90 °C. The Peleg rate constant K1 (h (g/g dm)^-1) varied with temperature from 0.761 to 0.396 for water loss, from 5.260 to 2.947 for salt gain, and from 0.854 to 0.566 for glycerol intake. In all cases it followed the Arrhenius relationship (R² > 0.86). The Ea values obtained (kJ/mol) were 16.14, 14.21, and 10.12 for water, salt, and glycerol, respectively. The statistical parameters that qualify the goodness of fit (R² > 0.91 and RMSE < 0.086) indicate promising applicability of the Peleg model.
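For reference, the Peleg model and the Arrhenius-type temperature dependence commonly applied to its rate constant take the following standard form (symbols follow the usual convention; the K1 and Ea values reported above plug into these expressions, although the study's exact formulation may differ in detail).

```latex
% Peleg model: X_t is the water (or solute) content at time t, X_0 the initial
% content, K_1 the Peleg rate constant and K_2 the Peleg capacity constant.
X_t = X_0 \pm \frac{t}{K_1 + K_2\,t}
\qquad \text{($-$ for water loss, $+$ for solute gain)}

% Arrhenius-type relation usually fitted to the initial mass-transfer rate 1/K_1:
\frac{1}{K_1} = k_0 \exp\!\left(-\frac{E_a}{R\,T}\right)
```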
Abstract:
This study is based on a large survey of over 1,500 Finnish companies covering their usage of management accounting systems, their needs, and their implementation difficulties. The study uses quantitative, qualitative, and mixed methods to answer the research questions. The empirical data was gathered through structured interviews with randomly selected companies of varying sizes and industries. The study answers the three research questions by analyzing the characteristics and behaviors of companies operating in Finland. The study found five distinct groups of companies according to the characteristics of their cost information and management accounting system use. It also showed that the state of cost information and management accounting systems depends on the industry and size of the company. Over 50% of the companies either did not know how their systems could be updated or regarded their systems as inadequate. The qualitative side also highlighted the need for tailored and integrated management accounting systems that create more value for company managers. The major inhibitors of new system implementation were the lack of both monetary and human resources. Through the use of mixed methods and design science, a new and improved sophistication model is created, building on previous research results and the information gathered from the literature. The sophistication model shows the different stages of management accounting systems in use and what companies can achieve by implementing and upgrading their systems.
Abstract:
This research studied project performance measurement from the perspective of strategic management. The objective was to find a generic model for project performance measurement that emphasizes strategy and decision making. The research followed the guidelines of the constructive research methodology. As a result, the study suggests a model that measures projects with multiple metrics both during and after the project. Measurement after the project is suggested to be linked to the strategic performance measures of the company, and it should be conducted through centralized project portfolio management, e.g., by the project management office of the organization. After the project, the metrics measure the project's actual benefit realization. During the project, the metrics are universal and measure the accomplished objectives in relation to costs, schedule, and internal resource usage. The outcomes of these measures should be forecast using qualitative or stochastic methods. A solid theoretical background for the model was found in the literature covering performance measurement, projects, and uncertainty. The study states that the model can be implemented in companies; this statement is supported by empirical evidence from a single case study. Gathering empirical evidence on the actual usefulness of the model in companies is left to future evaluative research.
Abstract:
Return and volatility dynamics in financial markets across the world have recently become important for asset pricing, portfolio allocation, and risk management. An understanding of volatility, which comes about as a result of the actions of market participants, can help investors adapt to different situations and perform when it really matters. With the recent development and liberalization of financial markets in emerging and frontier economies, how the equity and foreign exchange markets interact, and the extent to which return and volatility spillovers spread across countries, is of importance to investors and policy makers at large. Financial markets in Africa have received attention, leading investors to diversify into them in times of crisis and contagion effects in developed countries. Regardless of the benefits these markets may offer, investors must be wary of issues such as thin trading and the volatility, and related fluctuations, in the equity and currency markets. The study employs a VAR-GARCH BEKK model to study the return and volatility dynamics between the stock and foreign exchange markets and among the equity markets of Egypt, Kenya, Nigeria, South Africa, and Tunisia. The main findings suggest a higher dependence on own returns in the stock markets and a one-way return spillover from the currencies to the equity markets, except for South Africa, where the interrelation between the two markets is weaker. Integration among the equity markets is relatively limited. Return and volatility spillovers are mostly uni-directional, except for a bi-directional relationship between the equity markets of Egypt and Tunisia. The implication is that portfolio managers can still benefit from diversifying into these African equity markets, since the markets are largely independent of each other and may not be highly affected by an influx of negative news from elsewhere. However, managers need to be wary of return and volatility spillovers between the equity and currency markets and devise better hedging strategies to curb them.
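In standard notation, the bivariate VAR(1)-GARCH(1,1) BEKK specification referred to above can be written as follows (the exact parameterization used in the study may differ in detail); the off-diagonal elements of A and B capture cross-market shock and volatility spillovers, while Φ captures return spillovers.

```latex
% Mean (VAR(1)) and variance (BEKK-GARCH(1,1)) equations for the return vector r_t:
r_t = \mu + \Phi\, r_{t-1} + \varepsilon_t,
\qquad \varepsilon_t \mid \mathcal{F}_{t-1} \sim N(0, H_t)

H_t = C'C + A'\,\varepsilon_{t-1}\varepsilon_{t-1}'\,A + B'\,H_{t-1}\,B
```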
Abstract:
This Master's thesis evaluates the competitors in the market for welding quality management software. The competitive field is new, and there is no precise information on what kinds of competitors are on the market. Welding quality management software helps companies guarantee high quality. The software ensures high quality by verifying that the welder is qualified and follows the welding procedure specifications and the given parameters. In addition, the software collects all data from the welding process and generates the required documents from it. The theoretical part of the thesis consists of a literature review on solution business, competitor analysis and competitive forces theory, and welding quality management. The empirical part is a qualitative study that examines competing welding quality management software products and interviews their users. As a result, the thesis produces a new competitor analysis model for welding quality management software. With the model, the software products can be rated on the basis of the primary and secondary features they offer. Secondly, the thesis analyzes the current competitive situation by applying the newly developed competitor analysis model.
Abstract:
The target of this study was to develop a total cost calculation model for a case company to compare all manufacturing and logistics costs of deliveries from its own factories or from partner factories to global distribution centers. In particular, the model was needed to simulate the effect of own-factory utilization in the total cost calculation context. The study consists of a theoretical literature review and an empirical case study, and it was completed using the constructive research approach. The result of the study is a new total cost calculation model that includes not only all the costs caused by manufacturing and logistics but also the relevant capital costs. Using the new model, the case company is able to perform total cost calculations that take into account the own-factory utilization effect in different volume situations and with different volume shares between an own factory and a partner factory.
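A bare-bones sketch of the kind of comparison such a model performs is given below: the own factory's unit cost depends on how fixed costs are spread over the produced volume (the utilization effect), while the partner factory's cost is a purchase price plus logistics and capital costs. All component names and figures are hypothetical placeholders, not the case company's data.

```python
# Illustrative total-cost comparison: own factory vs. partner factory.
# Hypothetical cost components; not the case company's actual model or figures.
def own_factory_unit_cost(volume, capacity, fixed_cost, variable_cost,
                          logistics_cost, capital_cost):
    utilization = min(volume / capacity, 1.0)
    # fixed costs are spread over produced volume, so low utilization raises unit cost
    fixed_per_unit = fixed_cost / max(volume, 1)
    return variable_cost + fixed_per_unit + logistics_cost + capital_cost, utilization

def partner_factory_unit_cost(purchase_price, logistics_cost, capital_cost):
    return purchase_price + logistics_cost + capital_cost

for volume in (20_000, 50_000, 80_000):
    own, util = own_factory_unit_cost(volume, capacity=80_000, fixed_cost=2_000_000,
                                      variable_cost=55.0, logistics_cost=6.0,
                                      capital_cost=3.0)
    partner = partner_factory_unit_cost(purchase_price=78.0, logistics_cost=9.0,
                                        capital_cost=2.0)
    print(f"volume {volume}: own {own:.2f} per unit at {util:.0%} utilization, "
          f"partner {partner:.2f} per unit")
```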
Abstract:
This Master’s thesis analyses the effectiveness of different hedging models in the BRICS (Brazil, Russia, India, China, and South Africa) countries. Hedging performance is examined by comparing two dynamic hedging models to a conventional OLS regression-based model. The dynamic hedging models employed are Constant Conditional Correlation (CCC) GARCH(1,1) and Dynamic Conditional Correlation (DCC) GARCH(1,1) with Student’s t-distribution. In order to capture both the Great Moderation and the latest financial crisis, the sample period extends from 2003 to 2014. To determine whether the dynamic models outperform the conventional one, the reduction of portfolio variance for in-sample data with contemporaneous hedge ratios is first determined, and then the holding period of the portfolios is extended to one and two days. In addition, the accuracy of the hedge ratio forecasts is examined on the basis of out-of-sample variance reduction. The results are mixed and suggest that dynamic hedging models may not provide enough benefit to justify the more demanding estimation and daily portfolio adjustment. In this sense, the results are consistent with the existing literature.
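For context, the hedge ratios compared above and the variance-reduction measure of hedging effectiveness are typically defined as follows (standard definitions, not thesis-specific notation): the OLS ratio is the slope of a regression of spot returns on futures returns, the dynamic ratio is recomputed each period from the conditional covariance matrix of the CCC/DCC model, and effectiveness compares the variance of the hedged portfolio to the unhedged position.

```latex
% Static (OLS) hedge ratio:
h^{*} = \frac{\operatorname{Cov}(r_{s}, r_{f})}{\operatorname{Var}(r_{f})}

% Time-varying hedge ratio from the conditional covariance matrix H_t:
h_{t} = \frac{h_{sf,t}}{h_{ff,t}}

% Hedging effectiveness (portfolio variance reduction):
HE = 1 - \frac{\operatorname{Var}(r_{s} - h\, r_{f})}{\operatorname{Var}(r_{s})}
```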
Abstract:
The importance of information technology (IT) and information systems has increased during the last few years, mainly because business is nowadays seen more and more as a service business and IT is one of the key elements supporting those business services. As the importance of IT services has grown, IT service support should also receive more attention. Especially after a merger and acquisition (M&A), it is more important than ever to consider service support. The purpose of this study is to discover the best practices for choosing a suitable service support model. The research question is: How to choose a service support organization model for the ERP service desk function after a merger? A qualitative method was selected as the research method. The thesis includes two parts: a literature review and a case study. The theoretical part compiles an integrated model of previous research on the topic and draws on a collection of academic articles, publications, and reports. The empirical part focuses on the issues in the case organization and tries to answer the question of what would be the most suitable service support model for the case organization. It was conducted by interviewing the employees of the case organization. The study finds that even though there are many ways of selecting a service support model, it is difficult to define unambiguous guidelines. However, there are a few main objectives that should be taken into account regardless of the case. Especially by using ITIL processes it is possible to implement comprehensive service support and raise overall awareness of the existing service support models. The main factors that need to be taken into account are the nature, industry, and size of the organization; the business strategy, goals, and resources also need to be considered. These are the same factors that were observed in the case study. Suggestions for the case organization are presented based on the interviews and the literature review.
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground but still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model’s qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that examines whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of the asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. The results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation towards riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of the thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
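In standard notation, the Black–Litterman posterior expected returns combine the market-implied equilibrium returns with the investor's views; in the setup described above the views Q would come from the VAR forecasts. This is the textbook formula; thesis-specific details such as the choice of τ and Ω may differ.

```latex
% Implied equilibrium returns from market-capitalization weights w_mkt
% (\delta is the risk-aversion coefficient, \Sigma the return covariance matrix):
\Pi = \delta\, \Sigma\, w_{\text{mkt}}

% Black–Litterman posterior expected returns, with view vector Q, pick matrix P,
% view uncertainty \Omega and scaling constant \tau:
\mu_{BL} = \left[(\tau\Sigma)^{-1} + P'\Omega^{-1}P\right]^{-1}
           \left[(\tau\Sigma)^{-1}\Pi + P'\Omega^{-1}Q\right]
```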