935 results for Implementation Model
Abstract:
EONIA is a market-based overnight interest rate whose role as the starting point of the yield curve makes it critical for the implementation of the European Central Bank's common monetary policy in the euro area. The financial crisis that started in 2007 had a large impact on the determination mechanism of this interest rate, which is considered the central bank's operational target. This thesis examines the monetary policy implementation framework of the European Central Bank and the changes made to it. Furthermore, we discuss the development of the recent turmoil in the money market. The EONIA rate is modelled by means of a regression equation using variables related to liquidity conditions, refinancing need, auction results and calendar effects. Conditional volatility is captured by an EGARCH model, and autocorrelation is taken into account by employing an autoregressive structure. The results highlight how the tensions in the initial stage of the market turmoil were successfully countered by the ECB's liquidity policy. The subsequent response of EONIA to liquidity conditions under the full-allotment liquidity provision procedure adopted after the demise of Lehman Brothers is also established. A clear distinction in the behavior of the interest rate between the sub-periods was evident. In light of the results obtained, some of the challenges posed by the implementation of the exit strategy are also addressed.
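The regression structure described above (an equation for EONIA with liquidity, refinancing, auction and calendar regressors, an autoregressive term, and EGARCH conditional volatility) could be sketched roughly as follows. This is a minimal illustration assuming Python's arch package; the file name, column names and lag orders are hypothetical placeholders rather than the actual specification estimated in the thesis.

import pandas as pd
from arch import arch_model

# Hypothetical daily data: EONIA spread over the policy rate plus explanatory
# variables for liquidity conditions, refinancing need, auction outcomes and
# a calendar-effect dummy. Column names are illustrative only.
data = pd.read_csv("eonia_daily.csv", index_col=0, parse_dates=True)
y = data["eonia_spread"]
X = data[["excess_liquidity", "refinancing_need", "auction_result", "period_end_dummy"]]

# AR(1) mean equation with exogenous regressors and an EGARCH(1,1) conditional
# variance, mirroring the kind of specification outlined in the abstract.
model = arch_model(y, x=X, mean="ARX", lags=1, vol="EGARCH", p=1, o=1, q=1)
result = model.fit(disp="off")
print(result.summary())

Splitting the sample at the adoption of the full-allotment procedure and re-estimating the same equation on each sub-period is one straightforward way to expose the kind of behavioural break the thesis reports.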
Abstract:
The purpose of this study was to develop co-operation between the business units of a company operating in the graphic industry. The development was done by searching for synergy opportunities between these business units. The final aim was to form a business model based on the co-operation of these business units. The literature review of this thesis examines synergies, and especially the process of searching for and implementing synergies. The concept of a business model and its components are also examined. The research was done using a qualitative research method. The main data collection method for the empirical part was theme interviews. The data were analyzed using thematisation and content analysis. The results of the study include seven identified possible synergies and a business model based on the co-operation of the business units. The synergy opportunities are evaluated and an implementation order for the synergies is suggested. The presented synergies create the basis for the proposed business model.
Abstract:
The objective of this study was to model mathematically and to simulate the dynamic behavior of an auger-type fertilizer applicator (AFA) in order to enable variable-rate application (VRA) and reduce the coefficient of variation (CV) of the application, proposing an angular speed controller θ' for the motor drive shaft. The model input was θ' and the response was the fertilizer mass flow, which depends on the construction, the density of the fertilizer, the fill factor and the end position of the auger. The model was used to simulate an open-loop control system with an electric drive for the AFA using an armature voltage (V_A) controller. By introducing a sinusoidal excitation signal in V_A with optimized amplitude and phase delay and varying θ' during an operation cycle, the CV was reduced from 29.8% (constant V_A) to 11.4%. The development of the mathematical model was a first step towards the introduction of electric drive systems and closed-loop control for the implementation of AFAs with low CV in VRA.
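As a rough, self-contained illustration of the mechanism described above, the sketch below uses a deliberately simplified mass-flow model in which the flow is proportional to the shaft speed times a periodic pulsation caused by the auger end position, and compares the coefficient of variation under a constant drive with that under a sinusoidally modulated drive. The flow model, amplitudes, frequencies and phase are hypothetical placeholders, not the model or the optimized values obtained in the study.

import numpy as np

def mass_flow(theta_dot, t, puls_freq=2.0, puls_amp=0.15):
    # Simplified flow model: proportional to the shaft speed, with a periodic
    # disturbance standing in for the auger end-position effect (hypothetical).
    return theta_dot * (1.0 + puls_amp * np.sin(2.0 * np.pi * puls_freq * t))

t = np.linspace(0.0, 10.0, 2000)

# Constant drive: fixed angular speed (arbitrary units).
q_const = mass_flow(8.0, t)

# Sinusoidally modulated drive: speed varied in counter-phase with the
# pulsation; amplitude and phase here are illustrative, not the optimized ones.
theta_dot_mod = 8.0 * (1.0 - 0.15 * np.sin(2.0 * np.pi * 2.0 * t))
q_mod = mass_flow(theta_dot_mod, t)

cv = lambda q: 100.0 * q.std() / q.mean()
print(f"CV, constant drive:  {cv(q_const):.1f} %")
print(f"CV, modulated drive: {cv(q_mod):.1f} %")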
Abstract:
This thesis presents a design for an asynchronous interface to the Robotiq S-Model adaptive gripper. The designed interface is a communication layer that works on top of the Modbus layer. The design contains function definitions, a finite state machine and exceptions. The design was not fully implemented, but enough of it was implemented for it to be usable. The implementation was done in C++ in a Linux environment. In addition to the implementation, a simple demo program was made to show how the interface is used. The gripper's closing speed and force were also measured. There is also a brief introduction to robotics and robot grasping.
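The layering described above, an asynchronous command interface with a finite state machine and exceptions sitting on top of Modbus, can be sketched as below (in Python rather than the C++ used in the thesis, purely for brevity). The state names, the command queue and the transport calls are hypothetical illustrations of the general pattern; they do not reproduce the thesis code or the actual Robotiq register interface.

import queue
from enum import Enum, auto

class GripperState(Enum):
    INACTIVE = auto()
    ACTIVATING = auto()
    READY = auto()
    MOVING = auto()
    FAULT = auto()

class GripperError(Exception):
    """Raised when the gripper reports a fault condition."""

class AsyncGripperInterface:
    """Non-blocking command layer on top of a (hypothetical) Modbus transport."""

    def __init__(self, transport):
        self._transport = transport      # injected Modbus client (hypothetical)
        self._pending = queue.Queue()    # commands waiting to be sent
        self.state = GripperState.INACTIVE

    def activate(self):
        self._pending.put(("activate",))

    def close_gripper(self, speed, force):
        # Returns immediately; the command is sent on the next poll() call.
        self._pending.put(("close", speed, force))

    def poll(self):
        # Called periodically by the application: advances the state machine
        # and sends at most one pending command per call.
        status = self._transport.read_status()        # hypothetical call
        if status.get("fault"):
            self.state = GripperState.FAULT
            raise GripperError(f"fault code {status['fault']}")
        if not self._pending.empty():
            command = self._pending.get()
            self._transport.write_command(*command)   # hypothetical call
            self.state = (GripperState.ACTIVATING if command[0] == "activate"
                          else GripperState.MOVING)
        elif status.get("motion_complete"):
            self.state = GripperState.READY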
Abstract:
According to many academic studies, the development of marketing capabilities can enhance organizational performance. Similarly, downstream marketing capabilities have an important role in accomplishing organizational goals. The downstream marketing capabilities identified in this research are Marketing Communication, Selling, Marketing Implementation, and Market Information Management. These four capabilities are summarized under the following abilities. First, the ability to manage customers' opinion of the value offered by the organization. Second, the ability of the organization to obtain orders from new and established customers. Third, the ability to align and translate the marketing strategy into an operating action plan along with the deployment of organizational resources. Fourth, the continuous process of gathering and managing information about the markets. Moreover, the literature review of this research sheds light on the elements that compose the downstream marketing capabilities. Specifically, this research examined the downstream processes and the information required to control these processes, based on the American Productivity and Quality Center's Process Classification Framework. Furthermore, the literature review examined some of the technological tools that are used in marketing processes, as well as some managerial implications regarding the management of downstream marketing employees. Along with the investigation of downstream marketing capabilities, the literature review investigated the utilization and benefits of the Component Business Model and the Process Classification Framework, as defined by the organizations that developed them. Beyond this initial study, the research presents how the examined organization uses the two frameworks together by cross-referencing them. Finally, the research presents the optimal deployment of the collected downstream capability elements in the current organizational structure. The optimal deployment has been grounded on the information collected from the literature review as well as internal documentation provided by the examined organization. By comparing the optimal deployment with the current condition of the organization, the research exhibits some points for improvement, but also some of the projects that are currently in progress inside the organization and will eventually provide solutions to these downsides.
Abstract:
Communications play a key role in modern smart grids. The new functionalities that make the grids 'smart' require the communication network to function properly. Data transmission between the intelligent electronic devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications, such as smart metering, demand side management (DSM), and grid protection applied with communications, are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution applied to implement the communication medium in power distribution grids is power line communication (PLC). There are power cables in the distribution grids, and hence they may be applied as a communication channel for the distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz–30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model, comprising models for each channel component, is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
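As a simple, self-contained illustration of the SNR and channel-capacity analysis mentioned above, the sketch below evaluates the Shannon capacity C = ∫ log2(1 + SNR(f)) df over the studied 100 kHz–30 MHz band for a hypothetical SNR profile. The transmit PSD, channel attenuation and noise figures are placeholder shapes, not the measured channel model or noise data of the thesis.

import numpy as np

# Frequency grid over the studied HF band (100 kHz ... 30 MHz).
f = np.linspace(100e3, 30e6, 5000)

# Hypothetical transmit PSD, channel gain and noise PSD, all in dB terms.
# A real analysis would insert the verified channel model and measured noise.
tx_psd_dbm_hz = -55.0                                  # flat transmit PSD
channel_gain_db = -10.0 - 1.5e-6 * f                   # attenuation grows with f
noise_psd_dbm_hz = -130.0 + 15.0 * np.exp(-f / 5e6)    # coloured background noise

snr_db = tx_psd_dbm_hz + channel_gain_db - noise_psd_dbm_hz
snr_lin = 10.0 ** (snr_db / 10.0)

# Shannon capacity: integrate log2(1 + SNR(f)) over the band.
capacity_bps = np.trapz(np.log2(1.0 + snr_lin), f)
print(f"Theoretical capacity of the hypothetical channel: {capacity_bps / 1e6:.1f} Mbit/s")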
Abstract:
The open innovation paradigm states that the boundaries of the firm have become permeable, allowing knowledge to flow inwards and outwards, to accelerate internal innovations and to take unused knowledge to the external environment, respectively. The successful implementation of open innovation practices in firms such as Procter & Gamble, IBM, and Xerox, among others, suggests that it is a sustainable trend which could provide a basis for achieving competitive advantage. However, implementing open innovation can be a complex process which involves several domains of management, and whose term, classification, and practices have not been fully agreed upon. Thus, with many possible ways to address open innovation, the following research question was formulated: How could Ericsson LMF assess which open innovation mode to select depending on the attributes of the project at hand? The research followed the constructive research approach, which has the following steps: find a practically relevant problem, obtain a general understanding of the topic, innovate the solution, demonstrate that the solution works, show the theoretical contributions, and examine the scope of applicability of the solution. The research involved three phases of data collection and analysis: an extensive literature review of open innovation, strategy, business models, innovation, and knowledge management; direct observation of the environment of the case company through participative observation; and semi-structured interviews based on six cases involving multiple and heterogeneous open innovation initiatives. The results from the cases suggest that the selection of modes depends on multiple reasons, with a stronger influence of factors related to strategy, business models, and resource gaps. Based on these and other factors found in the literature review and observations, it was possible to construct a model that supports approaching open innovation. The model integrates perspectives from multiple domains of the literature review, observations inside the case company, and factors from the six open innovation cases. It provides steps, guidelines, and tools to approach open innovation and assess the selection of modes. Measuring the impact of open innovation could take years; thus, implementing and testing the model in its entirety was not possible due to time limitations. Nevertheless, it was possible to validate the core elements of the model with the empirical data gathered from the cases. In addition to constructing the model, this research contributed to the literature by increasing the understanding of open innovation, providing suggestions to the case company, and proposing future steps.
Abstract:
In recent years the phenomenon called crowdsourcing has been acknowledged as an innovative form of value creation that must be taken seriously. Crowdsourcing can be defined as the act of outsourcing tasks originally performed inside an organization, or assigned externally in the form of a business relationship, to an undefined, large, heterogeneous mass of potential actors. This thesis constructs a framework for the successful implementation of crowdsourcing initiatives. Firms that rely entirely on their own research and ideas cannot compete with the innovative capacity that crowd-powered firms have. Nowadays, crowdsourcing has become one of the key capabilities of businesses due to its innovative potential, which complements the existing internal resources of the firm. By utilizing crowdsourcing the business gains access to an enormous pool of competence and knowledge. However, various risks remain, such as uncertainty about the crowd structure and loss of internal know-how. The Crowdsourcing Success Framework introduces a step-by-step model for implementing crowdsourcing into the everyday operations of the business. It starts from the decision to utilize crowdsourcing and continues further into planning, organizing and execution. Finally, this thesis presents the success factors of a crowdsourcing initiative.
Abstract:
The main objective of the study was to form a strategic process model and a project management tool to support IFRS change implementation projects in the future. These research results were designed based on the theoretical framework of Total Quality Management and on the facts collected during the empirical case study of the IAS 17 change. The use of a process-oriented approach in IFRS standard change implementation after the initial IFRS implementation is justified with the following arguments: 1) well-designed process tools lead to the optimization of resources; 2) with the help of process stages and related tasks, it is easy to ensure an efficient way of working and managing the project, as well as to make sure that all necessary stakeholders are included in the change process. This research follows a qualitative approach and the analysis is descriptive in format. The first part of the study is a literature review and the latter part has been conducted as a case study. The data have been collected in the case company through interviews and observation. The main findings are a process model for the IFRS standard change process and a checklist-formatted management tool for upcoming IFRS standard change projects. The process flow follows the main cornerstones of the IASB's standard-setting process, and the management tool has been divided into stages accordingly.
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality, reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves the companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to only demonstrate that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, usability, etc., also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is not only applicable to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability, rather than the output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or lacking tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
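The central step above, deriving test cases automatically from a behavioural model, can be illustrated with a minimal sketch that generates event sequences from a small state-transition model until every transition is covered. The example model, the event names and the plain transition-coverage criterion are illustrative only; they are far simpler than the UML-based models and tool chain developed in the thesis.

from collections import deque

# A tiny behavioural model: each state maps events to successor states.
# The states and events are illustrative placeholders.
MODEL = {
    "Idle":    {"insert_coin": "Ready"},
    "Ready":   {"press_start": "Running", "refund": "Idle"},
    "Running": {"finish": "Idle"},
}

def generate_tests(model, initial="Idle"):
    """Breadth-first search producing event sequences that cover every transition."""
    all_transitions = {(s, e) for s, trans in model.items() for e in trans}
    covered, tests = set(), []
    frontier = deque([(initial, [])])
    while covered != all_transitions and frontier:
        state, path = frontier.popleft()
        for event, nxt in model[state].items():
            new_path = path + [event]
            if (state, event) not in covered:
                covered.add((state, event))
                tests.append(new_path)            # each test is an event sequence
            if len(new_path) < len(all_transitions):
                frontier.append((nxt, new_path))
    return tests

for i, test in enumerate(generate_tests(MODEL), 1):
    print(f"test {i}: {' -> '.join(test)}")

The same sequences could then be executed concurrently against the system under test while observing responsiveness and stability, which corresponds to the simple form of performance testing mentioned above.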
Abstract:
In this paper, we review the advances in monocular model-based tracking over the ten-year period up to 2014. In 2005, Lepetit et al. [19] reviewed the status of monocular model-based rigid-body tracking. Since then, direct 3D tracking has become quite a popular research area, but monocular model-based tracking should still not be forgotten. We mainly focus on tracking that could be applied to augmented reality, but some other applications are covered as well. Given the wide subject area, this paper tries to give a broad view of the research that has been conducted, giving the reader an introduction to the different disciplines that are tightly related to model-based tracking. The work has been conducted by searching through well-known academic search databases in a systematic manner, and by selecting certain publications for closer examination. We analyze the results by dividing the found papers into different categories according to their way of implementation. The issues which have not yet been solved are discussed. We also discuss emerging model-based methods, such as fusing different types of features and region-based pose estimation, which could show the way for future research on this subject.
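A recurring building block in the monocular model-based tracking work surveyed above is estimating the camera pose from correspondences between known 3D model points and their 2D projections in the image. The sketch below shows this single step with OpenCV's solvePnP; the model points, image detections and camera intrinsics are placeholder values, and a full tracker would add feature matching, outlier rejection and frame-to-frame prediction on top of it.

import numpy as np
import cv2

# 3D points of a known object model (e.g. corners of a box), in metres,
# expressed in the object coordinate frame. Values are placeholders.
model_points = np.array([
    [0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.1, 0.1, 0.0],
    [0.0, 0.1, 0.0], [0.0, 0.0, 0.1], [0.1, 0.0, 0.1],
], dtype=np.float64)

# Corresponding 2D detections in the image (placeholder pixel coordinates).
image_points = np.array([
    [320.0, 240.0], [400.0, 238.0], [402.0, 310.0],
    [322.0, 312.0], [318.0, 180.0], [398.0, 178.0],
], dtype=np.float64)

# Pinhole camera intrinsics (placeholder focal length and principal point).
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)   # assume an undistorted image

# Estimate the object pose (rotation and translation) from the 3D-2D matches.
ok, rvec, tvec = cv2.solvePnP(model_points, image_points, camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    print("Rotation matrix:\n", R)
    print("Translation vector:", tvec.ravel())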
Abstract:
The objective of the study was to identify the factors underlying the successful implementation of automatic ordering systems in the retail industry and to find a solution for the successful implementation of such systems in this environment. The study analysed the system implementation and its results in more than one hundred stores. For the study, experts on the ordering system and its rollout, from both inside and outside the company, were interviewed. In addition, questionnaires were sent to the stores that had adopted the system, and the responses were analysed in groups based on data from the automatic ordering system. As a result of the work, a specific set of background factors that must be taken into account during implementation was identified, and based on the research results a model was developed for the implementation of similar systems in the retail sector.
Abstract:
This study is based on a large survey of the usage, needs and implementation difficulties of management accounting systems in over 1,500 Finnish companies. The study uses quantitative, qualitative and mixed methods to answer the research questions. The empirical data used in the study were gathered through structured interviews with randomly selected companies of varying sizes and industries. The study answers the three research questions by analyzing the characteristics and behaviors of companies operating in Finland. The study found five distinctive groups of companies according to the characteristics of their cost information and management accounting system use. The study also showed that the state of cost information and management accounting systems depends on the industry and size of the companies. It was found that over 50% of the companies either did not know how their systems could be updated or saw the systems as inadequate. The qualitative side also highlighted the need for tailored and integrated management accounting systems to create more value for the managers of companies. The major inhibitors of new system implementation were the lack of both monetary and human resources. Through the use of mixed methods and design science, a new and improved sophistication model is created based on previous research results combined with information gathered from the literature. The sophistication model shows the different stages of management accounting systems in use and what companies can achieve with the implementation and upgrading of their systems.
Abstract:
For the past decades, large-scale educational reforms have been elaborated and implemented in many countries and have often resulted in partial or complete failure. These results brought researchers to study policy processes in order to address this particular challenge. Studies on implementation processes brought to light a causal relationship between the implementation process and the effectiveness of a reform. This study aims to describe the implementation process of educational change in Finland, which has produced effective educational reforms over the last 50 years. The case used for the purpose of this study is the national reform of undivided basic education (yhtenäinen peruskoulu) implemented at the end of the 1990s. Therefore, this research aims to describe how the Finnish undivided basic education reform was implemented. The research was carried out using a pluralist and structuralist approach to the policy process and was analyzed according to the hybrid model of the implementation process. The data were collected using a triangulation of methods, i.e. documentary research, interviews and questionnaires. The data were qualitative and were analyzed using content analysis methods. This study concludes that the undivided basic education reform was applied in a very decentralized manner, which reflects the decentralized system present in Finland. Central authorities provided a clear vision of the purpose of the reform, but did not control the implementation process. They rather provided extensive support in the form of transmission of information and the development of collaborative networks. Local authorities had complete autonomy in terms of decision-making and the implementation process. Discussions, debates and decisions regarding implementation took place at the local level and included the participation of all actors present in the field. Implementation methods differed from one region to another, which is a consequence of the variation in the level of commitment of local actors but also of the diversity of local realities. The reform was implemented according to existing structures and values, which means that it was in cohesion with the context in which it was implemented. These results cannot be generalized to all implementation processes of educational change in Finland but give great insight into what the model used in Finland could be. Future studies could attempt to confirm the model described here by studying other reforms that have taken place in Finland.