870 results for Time Based Management (TBM)
Abstract:
A study of price risk management for high-grade steel alloys and their components was conducted. The study focused on metal commodities, with nickel, chromium and molybdenum in a central role. Possible hedging instruments and strategies for these metals were also studied. The literature part covers the price formation of Ni, Cr and Mo, the functioning of metal exchanges and the main hedging instruments for metal commodities. It also covers how micro- and macro-level variables may affect metal prices over both short and longer time periods. The experimental part consists of three sections. In the first, a multiple regression model with seven explanatory variables was constructed to describe the price behavior of nickel, and the results were compared with those of a comparable simple regression model. Additionally, the long-term mean reversion of the nickel price was studied. In the second section, the theoretical price of the CF8M alloy was studied using nickel, ferro-chrome and ferro-molybdenum as explanatory variables. In the last section, cross-hedging possibilities for the illiquid FeCr metal were studied with five LME futures; this section also covers new information concerning possible forthcoming molybdenum futures contracts. The results of this study confirm that linear regression models based on the assumption of market rationality are not able to reliably describe the price development of the metals in question. Models fulfilling the assumptions of linear regression may nevertheless provide useful information about statistically significant variables that affect metal prices. According to the experimental part, short-maturity futures were found to incorporate the most accurate information about future price movements; however, not even 3M futures were able to predict the turning point in the market before the slump. Cross hedging appeared to be a questionable risk-management strategy for illiquid metals, because the correlation coefficients were found to be very sensitive to the chosen time span.
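The abstract does not list the seven explanatory variables, so the sketch below is only a hedged illustration of the modelling approach described: an ordinary least squares multiple regression fitted to a metal price series, with synthetic data and placeholder variable names standing in for the real inputs.

```python
# Minimal sketch (not the thesis model): fitting a multiple linear regression
# for a metal price against hypothetical explanatory variables. The thesis
# used seven explanatory variables for nickel; the names below are placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120  # e.g. 10 years of monthly observations (illustrative)
data = pd.DataFrame({
    "lme_stock_level": rng.normal(size=n),
    "usd_index": rng.normal(size=n),
    "stainless_production": rng.normal(size=n),
})
# Synthetic target standing in for the nickel price series.
data["nickel_price"] = (
    -0.5 * data["lme_stock_level"] + 0.3 * data["usd_index"]
    + rng.normal(scale=0.5, size=n)
)

X = sm.add_constant(data[["lme_stock_level", "usd_index", "stainless_production"]])
model = sm.OLS(data["nickel_price"], X).fit()
print(model.summary())   # coefficient estimates, p-values, R^2
```

The summary output is where statistically significant explanatory variables would be identified, in the spirit of the models discussed in the abstract.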
Abstract:
Condition monitoring systems for physical assets are becoming increasingly common in the industrial sector, and a growing share of asset monitoring systems are remotely supported. As global competitors actively develop solutions for condition monitoring and the condition-based maintenance it enables, Wärtsilä also feels the pressure to provide customers with more sophisticated condition-based maintenance solutions. The main aim of this thesis is to consider Wärtsilä's remote condition monitoring solutions and how they relate to similar solutions from other suppliers and to end customers' needs, in the context of offshore assets. The thesis also includes a theoretical study in which the concepts of condition monitoring, condition-based maintenance, maintenance management and physical asset management are introduced.
Abstract:
In recent times of global turmoil, the need for uncertainty management has become ever more pressing. The need for enhanced foresight especially concerns capital-intensive industries, which must commit their resources and assets over long planning horizons. Scenario planning has been acknowledged to have many virtues - and limitations - in mapping the future and illustrating alternative development paths. The present study was initiated to address both the need for improved foresight in two capital-intensive industries, the paper and steel industries, and the imperfections in current scenario practice. The research problem is approached by constructing a problem-solving vehicle that combines elements of a generic scenario process, face-to-face group support methods, deductive scenario reasoning and causal mapping into a fully integrated scenario process. The process, called the SAGES scenario framework, was empirically tested by creating alternative futures for the two industries. Three scenarios were generated for each industry, together with the identification of the key megatrends, the most important foreign investment determinants, key future drivers and leading indicators for the materialisation of the scenarios. The empirical results revealed a two-fold outlook for the paper industry, while the future of the steel industry was seen as much more positive. The research found support for utilising group support systems in scenario and strategic planning, with some limitations. Key perceived benefits include high time-efficiency, productivity and lower resource-intensiveness. Group support also seems to enhance participant satisfaction, encourage innovative thinking and provide users with personalised qualitative scenarios.
Abstract:
The aim of this thesis was to analyze the background of an activity-based costing system used in a domestic forest industry company. The reports produced by the system have not been reliable, which has reduced the use of the system. The study began by examining the theory of activity-based costing. It was also found that the system produces management accounting information, so that theory was briefly introduced as well. Next, the possible sources of error were examined, their significance was evaluated, and waste handling was chosen for further study. The problem with waste handling is that the current model includes no waste compensation: when the paper or board machine produces waste, the waste can be used as raw material in the process, but the product being produced at the time receives no compensation for it. Compensation has not been possible because the quantity of process waste has not been known. As a result of the study, a calculation model was introduced that enables the quantity of process waste to be determined from mill-system data. This, in turn, makes it possible to introduce waste compensation in the future.
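The abstract does not describe the calculation model itself, so the following is only a hypothetical arithmetic sketch of what a waste-compensation credit could look like once the process-waste quantity is known; all figures and the simple mass-balance logic are assumptions.

```python
# Illustrative only: the thesis does not publish its model, so the figures and
# the mass-balance logic below are assumptions to show the idea of crediting a
# product with the raw-material value of reusable process waste (broke).

gross_production_t = 1000.0      # tonnes fed through the paper/board machine
saleable_output_t = 940.0        # tonnes of sellable product reported by the mill system
process_waste_t = gross_production_t - saleable_output_t   # broke returned to the process

raw_material_cost_per_t = 320.0  # EUR/t, hypothetical raw-material cost
# The broke replaces virgin raw material, so the product is credited with its value.
waste_compensation = process_waste_t * raw_material_cost_per_t
print(f"Process waste: {process_waste_t:.1f} t, compensation credit: {waste_compensation:.0f} EUR")
```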
Abstract:
The ability of a supplier firm to generate and utilise customer-specific knowledge has attracted increasing attention in the academic literature during the last decade. It has been argued that customer knowledge should be treated as a strategic asset, just like any other intangible asset. At the same time, it has been shown that the management of customer-specific knowledge is challenging in practice, and that many firms are better at acquiring customer knowledge than at making use of it. This study examines customer knowledge processing in the context of key account management in large industrial firms. This focus was chosen because key accounts are demanding and complex: it is not unusual for a single key account relationship to constitute a complex web of relationships between the supplier and the key account, easily leading to the dispersion of customer-specific knowledge within the supplier firm. Although the importance of customer-specific knowledge generation has been widely acknowledged in the literature, surprisingly little attention has been paid to the processes through which firms generate, disseminate and use such knowledge internally to enhance relationships with their major, strategically important key account customers. This thesis consists of two parts. The first part comprises a theoretical overview and draws together the main findings of the study, whereas the second part consists of five complementary empirical research papers based on survey data gathered from large industrial firms in Finland. The findings suggest that the management of customer knowledge generated about and from key accounts is a three-dimensional process consisting of acquisition, dissemination and utilization. It can be concluded from the results that customer-specific knowledge is a strategic asset, because the supplier's customer knowledge processing activities have a positive effect on the supplier's key account performance. Moreover, by examining the determinants of each phase separately, the study identifies a number of intra-organisational factors that facilitate the process in supplier firms. The main contribution of the thesis lies in linking the concept of customer knowledge processing to the previous literature on key account management. Moreover, given that this literature is mainly conceptual or case-based, a further contribution is to examine its consequences and determinants based on quantitative empirical data.
Abstract:
The objective of this work was to study why systems thinking should be used in combination with TQM, what the main benefits of the integration are, and how it can best be achieved. The work analyzes the development of systems thinking and TQM over time and the main differences between them. It defines the prerequisites for adopting a systems approach and the organizational factors that support the development of an efficient learning organization. The work proposes a model, based on a combination of an interactive management model and organizational redesign, for applying the systems approach together with TQM in practice. The results indicate that there are clear differences between systems thinking and TQM which justify their combination: the systems approach provides an additional, complementary perspective to quality management. TQM focuses on optimizing operations at the operational level, while interactive management and the redesign of the organization focus on optimizing operations at the conceptual level, providing a holistic system for value generation. The empirical study demonstrates the applicability of the proposed model in one case company, but its application is feasible beyond this particular company as well. System dynamics modeling and other systems-based techniques, such as cognitive mapping, are useful methods for increasing understanding of and learning about the behavior of systems. The empirical study also emphasizes the importance of using a proper early warning system.
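To illustrate what system dynamics modeling means in practice, here is a minimal stock-and-flow sketch using Euler integration. It is a generic example with made-up parameters, not the case company's model, and it ends with a simple early-warning check of the kind the study emphasizes.

```python
# Minimal stock-and-flow sketch of system dynamics modeling (Euler integration).
# Generic illustration with assumed parameters: a single stock (order backlog)
# with an inflow (incoming orders) and an outflow (production limited by
# capacity), showing the kind of delayed feedback such models capture.
dt = 0.25                 # time step (weeks)
weeks = 52
backlog = 100.0           # initial stock of unfinished orders
capacity = 22.0           # orders the organisation can complete per week

history = []
for step in range(int(weeks / dt)):
    t = step * dt
    incoming = 20.0 + (10.0 if 10 <= t < 20 else 0.0)   # demand surge between weeks 10 and 20
    completed = min(capacity, backlog / dt)              # cannot complete more than the backlog
    backlog += (incoming - completed) * dt
    history.append((t, backlog))

# A simple early-warning check: flag when the backlog implies > 6 weeks of delivery delay.
for t, b in history:
    if b / capacity > 6:
        print(f"Warning at week {t:.1f}: delivery delay ~{b / capacity:.1f} weeks")
        break
```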
Abstract:
Resonance energy transfer (RET) is a non-radiative transfer of the excitation energy from the initially excited luminescent donor to an acceptor. The requirements for the resonance energy transfer are: i) the spectral overlap between the donor emission spectrum and the acceptor absorption spectrum, ii) the close proximity of the donor and the acceptor, and iii) the suitable relative orientations of the donor emission and the acceptor absorption transition dipoles. As a result of the RET process the donor luminescence intensity and the donor lifetime are decreased. If the acceptor is luminescent, a sensitized acceptor emission appears. The rate of RET depends strongly on the donor–acceptor distance (r) and is inversely proportional to r6. The distance dependence of RET is utilized in binding assays. The proximity requirement and the selective detection of the RET-modified emission signal allow homogeneous separation free assays. The term lanthanide-based RET is used when luminescent lanthanide compounds are used as donors. The long luminescence lifetimes, the large Stokes’ shifts and the intense, sharply-spiked emission spectra of the lanthanide donors offer advantages over the conventional organic donor molecules. Both the organic lanthanide chelates and the inorganic up-converting phosphor (UCP) particles have been used as donor labels in the RET based binding assays. In the present work lanthanide luminescence and lanthanide-based resonance energy transfer phenomena were studied. Luminescence lifetime measurements had an essential role in the research. Modular frequency-domain and time-domain luminometers were assembled and used successfully in the lifetime measurements. The frequency-domain luminometer operated in the low frequency domain ( 100 kHz) and utilized a novel dual-phase lock-in detection of the luminescence. One of the studied phenomena was the recently discovered non-overlapping fluorescence resonance energy transfer (nFRET). The studied properties were the distance and temperature dependences of nFRET. The distance dependence was found to deviate from the Förster theory and a clear temperature dependence was observed whereas conventional RET was completely independent of the temperature. Based on the experimental results two thermally activated mechanisms were proposed for the nFRET process. The work with the UCP particles involved the measurement of the luminescence properties of the UCP particles synthesized in our laboratory. The goal of the UCP particle research is to develop UCP donor labels for binding assays. In the present work the effect of the dopant concentrations and the core–shell structure on the total up-conversion luminescence intensity, the red–green emission ratio, and the luminescence lifetime was studied. Also the non-radiative nature of the energy transfer from the UCP particle donors to organic acceptors was demonstrated for the first time in aqueous environment and with a controlled donor–acceptor distance.
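For reference, the conventional Förster expressions behind the r^6 dependence mentioned above are summarized below; these are the standard relations from which the nFRET distance dependence was found to deviate.

```latex
k_{T}(r) = \frac{1}{\tau_{D}}\left(\frac{R_{0}}{r}\right)^{6},
\qquad
E = \frac{R_{0}^{6}}{R_{0}^{6} + r^{6}} = 1 - \frac{\tau_{DA}}{\tau_{D}}
```

Here τ_D and τ_DA are the donor lifetimes in the absence and presence of the acceptor, and R_0 is the Förster radius, the donor–acceptor distance at which the transfer efficiency E equals 0.5.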
Abstract:
The most outstanding conceptual challenge of modern crisis management is the principle of consent. It is not a problem only at the operational level - it challenges the entire decision-making structures of crisis management operations. In the post-Cold War era, and especially in the 21st century, there has been a transition from peacekeeping of limited size and scope towards large and complex peace operations. This shift has presented peace operations with a dilemma: how to maintain consent for peace operations while being able to use military force to coerce those attempting to wreck peace processes? To address this dilemma, this research aims to promote understanding of what can be achieved by military crisis management operations (peace support operations) in the next decade. The research concentrates on the focal research question: should military components induce consent or rely on the compliance of conflicting parties in crisis management operations of the next decade (2020-2030)? The focus is on military-political strategic-level considerations, and especially on the time before political decisions to commit to a crisis management operation. This study does not address which actor or organisation should intervene. The framework of the thesis derives from the so-called 'peacebuilding space', the scope of peace operations and spoiler theory. The feasibility of both peace enforcement and peacekeeping in countering future risk conditions is analysed within this framework. This future-oriented qualitative research uses the Delphi method with a panel of national and international experts. Citation analysis supports the identification of relevant reference material, which consists of contemporary literature, the Delphi questionnaires and interviews. The research process followed three main stages. In the first stage, plausible future scenarios and risk conditions were identified with the Delphi panel. In the second stage, operating environments for peace support operations were described and the consequent hypotheses formulated. In the third stage, these hypotheses were tested on the Delphi panel. The Delphi panel is sufficiently wide and diverse to produce plausible yet differing insights. The research design drew specifically on military crisis management and peace operations theories, which produced various relevant normative considerations. One may therefore argue that this research, based on accepted contemporary theory, hypotheses derived from it and an expert panel, contributes to the realm of peace support operations. The research finds that some degree of peace enforcement will be feasible and necessary in at least the following risk conditions: failed governance; potential spillover of ethnic, religious or ideological conflict; vulnerability of strategic chokepoints and infrastructures in ungoverned spaces; and territorial and extra-territorial border disputes. In addition, some form of peace enforcement is probably necessary in risk conditions pertaining to the extremism of marginalised groups, potential disputes over previously uninhabited and resource-rich territories, and interstate rivalry.
Furthermore, the research finds that peacekeeping measures will be feasible and necessary in at least the risk conditions pertaining to potential spillover of ethnic, religious or ideological conflict; uncontrolled migration; the consequences of environmental catastrophes or changes; territorial and extra-territorial border disputes; and potential disputes over previously uninhabited and resource-rich territories. These findings are all subject to both generic and case-specific preconditions that must exist for a peace support operation. Some deductions can be drawn from the findings. Although some risk conditions may appear illogical, understanding the underlying logic of a conflict is fundamental to understanding transition in crisis management. Practitioners of crisis management should be cognizant of such transitions: they must understand how transition should occur from threat to safety, from conflict to stability, and so forth. Understanding transition is imperative for managing the dynamic evolution of preconditions, which begins at the outset of a peace support operation. Furthermore, it is pertinent that spoilers are defined from a peace-process point of view. If spoilers are defined otherwise, the nature of an operation shifts towards war, where the logic is breaking the will of an enemy and forcing surrender. In peace support operations the logic is different: actions towards spoilers are intended to bring about transition towards consent, not defeat. Notwithstanding future developments, history continues to provide strategic education; the distinction is that the risk conditions will occur in novel futures, so lessons learned from the past should be fitted to the case at hand. This research shows compelling evidence that swaying between intervention optimism and pessimism is not substantiated. Both peace enforcement and peacekeeping are sine qua non for successful military crisis management in the next decade.
Warning system based on theoretical-experimental study of dispersion of soluble pollutants in rivers
Abstract:
Information about the transport and dispersion capacity of soluble pollutants in natural streams is important in the management of water resources, especially in planning preventive measures to minimize the problems that accidental or intentional releases cause to public health and to the economic activities that depend on the use of water. Considering this importance, this study aimed to develop a warning system for rivers, based on experimental tracer techniques and on analytical equations for the one-dimensional transport of conservative soluble pollutants, to support decision-making in the management of water resources. The system, developed in the Java programming language with a MySQL database, can predict the travel time of pollutant clouds from a point of release and graphically displays the temporal distribution of concentrations of the passing cloud at a particular location downstream of the release point.
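The abstract does not reproduce the analytical equations used, but warning systems of this kind are commonly built on the classical one-dimensional advection-dispersion solution for an instantaneous release of a conservative pollutant, shown here as a reference sketch:

```latex
\frac{\partial C}{\partial t} + U\,\frac{\partial C}{\partial x}
  = D_L\,\frac{\partial^{2} C}{\partial x^{2}},
\qquad
C(x,t) = \frac{M}{A\sqrt{4\pi D_L t}}
         \exp\!\left[-\frac{(x - U t)^{2}}{4 D_L t}\right]
```

Here M is the released mass, A the cross-sectional area of the stream, U the mean flow velocity and D_L the longitudinal dispersion coefficient, the latter two typically estimated from tracer experiments; the peak of the cloud reaches a station at distance x after a travel time of roughly x/U.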
Abstract:
After decades of mergers and acquisitions and successive technology trends such as CRM, ERP and DW, the data in enterprise systems is scattered and inconsistent. Global organizations face the challenge of addressing local uses of shared business entities, such as customer and material, while at the same time maintaining a consistent, unique and consolidated view of financial indicators. In addition, current enterprise systems do not accommodate the pace of organizational change, and immense effort is required to maintain data. When it comes to systems integration, ERPs are considered "closed" and expensive: data structures are complex, and the "out-of-the-box" integration options offered are not based on industry standards. Expensive and time-consuming projects are therefore undertaken in order to have the required data flowing according to the needs of business processes. Master Data Management (MDM) emerges as a discipline focused on ensuring long-term data consistency. Presented as a technology-enabled business discipline, it emphasizes business processes and governance to model and maintain the data related to key business entities. There are immense technical and organizational challenges in accomplishing the "single version of the truth" MDM mantra. Adding one central repository of master data may prove unfeasible in some scenarios, so an incremental approach is recommended, starting from the areas most critically affected by data issues. This research aims at understanding the current literature on MDM and contrasting it with the views of professionals. The data collected from interviews revealed details of the complexities of data structures and data management practices in global organizations, reinforcing the call for more in-depth research on the organizational aspects of MDM. The most difficult piece of master data to manage is the "local" part: the attributes related to the sourcing and storing of materials in one particular warehouse in the Netherlands, or a complex set of pricing rules for a subsidiary of a customer in Brazil. From a practical perspective, this research evaluates one MDM solution under development at a Finnish IT solution provider. By applying an existing assessment method, the research attempts to provide the company with a possible tool for evaluating its product from a vendor-agnostic perspective.
Abstract:
The use of batteries as energy storage is emerging in automotive and mobile working machine applications. As battery systems become larger, battery management becomes an essential part of the application with regard to battery fault situations and the safety of the user. A properly designed battery management system (BMS) extends each charge cycle of the battery pack as well as the whole lifetime of the pack. In this thesis, the main objectives and principles of a BMS are studied and a first-order Thevenin model of a lithium-titanate battery cell is built based on laboratory measurements. The battery cell model is then verified by comparing it with the actual battery cell, and its suitability for use in a BMS is assessed.
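As an illustration of the model structure (not the thesis's fitted parameters, which are not given here), a first-order Thevenin model consists of an open-circuit voltage source, a series resistance R0 and one parallel RC branch; the sketch below simulates a discharge pulse with hypothetical parameter values.

```python
# Minimal sketch of a first-order Thevenin battery-cell model (illustrative
# parameters, not the thesis's): series resistance R0 plus one parallel RC
# branch (R1, C1) capturing the polarisation dynamics.
R0, R1, C1 = 0.0015, 0.0010, 2000.0     # ohm, ohm, farad (hypothetical)
capacity_As = 30.0 * 3600.0             # hypothetical 30 Ah cell capacity in ampere-seconds

def ocv(soc):
    # Placeholder open-circuit-voltage curve for a lithium-titanate cell (~1.8-2.7 V).
    return 1.8 + 0.9 * soc

dt = 1.0           # time step in seconds
soc, v1 = 0.9, 0.0
current = 60.0     # A (positive = discharge)

for _ in range(600):                              # simulate a 10-minute discharge pulse
    soc -= current * dt / capacity_As             # coulomb counting
    v1 += dt * (current / C1 - v1 / (R1 * C1))    # RC branch dynamics
    v_terminal = ocv(soc) - R0 * current - v1     # terminal voltage
print(f"SOC: {soc:.3f}, terminal voltage: {v_terminal:.3f} V")
```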
Abstract:
The meeting of the publication "Evidence Based Telemedicine - Trauma and Emergency Surgery" (TBE-CiTE) selected, through a literature review, three recent articles on the treatment of victims of stab wounds to the abdominal wall. The first study looked at the role of computed tomography (CT) in the treatment of patients with stab wounds to the abdominal wall. The second examined the use of laparoscopy over serial physical examinations to evaluate patients in need of laparotomy. The third reviewed surgical exploration of the abdominal wound, the use of diagnostic peritoneal lavage and CT for the early identification of significant lesions, and the best time for intervention. There was consensus in favour of laparotomy in the presence of hemodynamic instability, signs of peritonitis or evisceration. The wound should be explored under local anesthesia, and if there is no injury to the aponeurosis the patient can be discharged. In the presence of penetration into the abdominal cavity, serial abdominal examinations are safe without CT. Laparoscopy is well indicated when there is doubt about an intracavitary lesion, in centers experienced in this method.
Abstract:
According to several surveys and observations, the percentage of IT projects conducted successfully, without budget overruns or schedule delays, is extremely low. Many projects are also evaluated as failures in terms of delivered functionality. Nuldén (1996) compares IT projects to bad movies: after watching for two hours, one still tries to finish the film even while understanding that it is a complete waste of time, the argument being 'I've already invested too much time to terminate it now'. The same happens with IT projects: a company sometimes continues pouring money into a project for a long time even though no benefits are expected from it. Eventually such projects are terminated anyway, but by that point the company has already spent a great deal. The situation described above is a consequence of "escalation of commitment" - continuing a project even after a manager receives negative feedback about the project's probability of success. According to Keil and Mähring (2010), even though escalation can occur in any type of project, it is more common among complex technological projects, such as IT projects. Escalation of commitment very often results in runaway projects. To avoid this, managers use de-escalation strategies, which allow resources to be used more effectively. These strategies lead to project termination or turnaround, which stops the flow of wasted investments. A number of studies explore the escalation of commitment phenomenon based on experiments and business cases, and during the last decade several frameworks have been proposed for de-escalation strategies. However, there is no evidence in the literature of a successful implementation of a de-escalation of commitment strategy. In addition, despite the fact that IT project management methodologies are widely used in companies, none of them covers escalation of commitment risks. At the same time, no research proposes a way to implement a de-escalation of commitment strategy within an existing project management methodology. This research is focused on a single case of a large ERP implementation project by a consulting company. Hence, the main deliverables of the study include suggestions for improving de-escalation methods and techniques in the project and in the company, as well as a way to implement these methods within the existing project management methodology and the company's general policies.
Abstract:
With information technology (IT) playing an increasingly important role in driving the business, the value of IT investment is often challenged because not all investment decisions are made in a reasonable way or aligned with business strategy. IT investment portfolio management (PfM) is an effective way to prioritize and select the right IT projects to invest in, by taking all project proposals into consideration as a whole, based on their business value, risks, costs and interrelationships. There are different decision models for prioritizing projects, and the Analytic Hierarchy Process (AHP), one of the most commonly used methods, is discussed in this master's thesis. At the same time, a multinational company has IT projects on different levels, from global to local; many of them, for instance, are proposed by joint ventures at the local level. In the oil & gas industry, joint ventures are often formed, especially in the upstream area (exploration & production). Involving such projects in the parent company's IT investment PfM approach is a challenge, because the parent company cannot make the decisions on its own: it needs to prioritize all projects in an adequate way, communicate with the JVs and influence them. Different levels of control over the JVs also need to be considered. This thesis therefore introduces a tailored IT investment PfM approach for a multinational oil & gas company that addresses the issues around JVs.
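As a hedged illustration of how AHP produces priorities (the criteria and pairwise judgements below are made up, not the thesis's), the sketch derives weights from a pairwise comparison matrix via its principal eigenvector and computes Saaty's consistency ratio.

```python
# Minimal AHP sketch (illustrative, not the thesis's criteria or judgements):
# derive priority weights for four hypothetical criteria - business value,
# risk, cost, strategic alignment - from a pairwise comparison matrix.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j (Saaty's 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0, 1.0],
    [1/3, 1.0, 3.0, 1/3],
    [1/5, 1/3, 1.0, 1/5],
    [1.0, 3.0, 5.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalised priority weights

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)               # consistency index
cr = ci / 0.90                                # Saaty's random index for n = 4
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgements are acceptably consistent; project proposals would then be scored against the weighted criteria.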
Abstract:
Social tagging evolved in response to the need to tag heterogeneous objects whose automated tagging is usually not feasible by current technological means. Social tagging can be used for more flexible competence management within organizations: the profiles of employees can be built in the form of groups of tags, as employees tag each other based on their familiarity with each other's expertise. This can serve as a replacement for more traditional competence management approaches, which usually become outdated due to social and organizational hurdles and obsolete data. These limitations can be overcome by people tagging, as the information revealed by such tags is usually based on the most recent employee interaction and knowledge. Task management, as part of personal information management, aims to support users' individual task handling. This can include collaborating with other individuals, sharing one's knowledge, both functional and process-related, and distributing documents and web resources. In this context, task patterns can be used as templates that collect information and experience around the tasks associated with them at run time, facilitating agility. Effective collaboration among contributors requires the means to find the appropriate individuals to work with on a task, and this can be made possible by using social tagging to describe individual competencies. The goal of this study is to support finding and tagging people within task management through the effective exploitation of the work/task context. This involves utilizing knowledge of the workers' expertise, the nature of the task/task pattern and the information available from the documents and web resources attached to the task. Vice versa, task management provides an excellent environment for social tagging, because the task context already provides suitable tags. The study also aims at assisting users of the task management solution with the collaborative construction of a lightweight ontology by inferring semantic relations between tags. The thesis project aims at implementing people finding and tagging within the Java task management application, which consumes web services that provide the required ontology for the organization.
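A hypothetical sketch of the people-finding idea described above: rank colleagues for a task by the overlap between their expertise tags and the tags derived from the task context. The names, tags and the Jaccard measure are illustrative assumptions, not the thesis's implementation.

```python
# Hypothetical sketch of tag-based people finding: rank colleagues by the
# Jaccard similarity between their expertise tags and the task-context tags.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

employee_tags = {
    "alice": {"java", "web-services", "ontology"},
    "bob": {"task-management", "ui", "java"},
    "carol": {"ontology", "semantic-web", "tagging"},
}
task_context_tags = {"ontology", "tagging", "java"}   # e.g. inferred from attached documents

ranking = sorted(
    ((name, jaccard(tags, task_context_tags)) for name, tags in employee_tags.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
print(ranking)   # most suitable collaborators first
```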