117 results for GIS BASED PLANNING TOOLS


Relevance: 30.00%

Publisher:

Abstract:

The aim of this Master's thesis is to solve a development task for Company A's construction business. The task is to analyse the business in question, its operating environment and its operating conditions. On the basis of the analyses, conclusions are drawn about the state of the business, and a strategy and an implementation plan are created for Company A's construction business. The theoretical part of the thesis presents the analysis models and the strategy work step by step. The empirical part applies the presented tools and models and, building on them, creates the strategy and its implementation plan. The plans produced as the results of the work, in accordance with the assignment, can be put to direct use by Company A's owners and management. The reliability and validity of the results rest on the broad perspective of the steering group and the author, and on their long familiarity with the target company's business. The thesis employs the methods of research-oriented development, constructive case study and open interviews.

Relevance: 30.00%

Publisher:

Abstract:

The first main objective of this study was to develop a planning capacity management model for a chocolate and confectionery company in Finland. The second objective was to analyze how capacity updates would affect cost accounting practices. In addition, creating an update and maintenance process for planning capacities was set as a sub-objective of developing the capacity management model. The thesis began by analyzing the needs, requirements and constraints of the capacity management model and its connection with cost accounting. This was done by interviewing key personnel such as production planners, managers and controllers. A thorough literature review was also conducted at an early phase, and the internal systems and software architecture were studied. The model was constructed as an Excel-based platform which receives its input data from the Enterprise Resource Planning system and the Production Performance Measurement system. The main purpose of the planning capacity management model is to ensure that production planners can utilize more precise parameters, but it also offers the production managers tools to assess production performance and effectiveness at the product level. In addition to production planning, planning capacities are tightly involved with cost accounting, as many direct and indirect costs are allocated to products on the basis of planning capacities. For this reason, the linkage between product costing and capacity is also examined from several angles in the thesis. The report concludes with development suggestions that give guidelines for more precise and consistent production planning and cost accounting across the factories.
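As a loose illustration of the dual role planning capacities play in scheduling and cost allocation as described above (a minimal Python sketch; all names and figures are hypothetical, and the thesis's Excel model and ERP data are not reproduced here):

```python
# Minimal sketch of planning-capacity-driven scheduling and cost allocation.
# All names and figures are hypothetical illustration values.

def allocate_unit_cost(line_cost_per_hour: float, planning_capacity_uph: float) -> float:
    """Cost allocated to one product unit via its planning capacity (units/hour)."""
    return line_cost_per_hour / planning_capacity_uph

def planned_run_hours(order_units: int, planning_capacity_uph: float) -> float:
    """Hours a production planner would reserve for an order."""
    return order_units / planning_capacity_uph

print(allocate_unit_cost(120.0, 400.0))   # 0.30 EUR allocated per unit
print(planned_run_hours(10_000, 400.0))   # 25.0 planned hours
```

The same capacity parameter feeds both calculations, which is why an imprecise planning capacity distorts schedules and product costs at the same time.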

Relevance: 30.00%

Publisher:

Abstract:

The human genome comprises roughly 20 000 protein-coding genes. Proteins are the building material of cells and tissues, and they are functional compounds with an important role in many cellular responses, such as cell signalling. In multicellular organisms such as humans, cells need to communicate with each other in order to maintain the normal function of the tissues within the body. This complex signalling between and within cells is mediated by proteins and their post-translational modifications, one of the most important being phosphorylation. The work presented here concerns the development and use of tools for phosphorylation analysis. Mass spectrometers have become essential tools for studying proteins and proteomes. In mass spectrometry oriented proteomics, proteins can be identified and their post-translational modifications studied. The objectives of this Ph.D. thesis were to improve the robustness of sample handling methods used prior to mass spectrometric analysis of peptides and their phosphorylation status. The focus was on developing strategies that enable more MS measurements per sample, higher quality MS spectra, and simplified and rapid enrichment procedures for phosphopeptides. A further objective was to apply these methods to characterize the phosphorylation sites of phosphopeptides. In these studies a new MALDI matrix was developed which allowed more homogeneous, intense and durable signals to be acquired compared to the traditional CHCA matrix. This new matrix, along with other matrices, was subsequently used to develop a new method that combines multiple spectra of identical peptides acquired with different matrices. With this approach it was possible to identify more phosphopeptides than with conventional LC/ESI-MS/MS methods, while using five times less sample. In addition, a phosphopeptide-affinity MALDI target was prepared to capture and immobilise phosphopeptides from a standard peptide mixture while maintaining their spatial orientation, and a new protocol utilizing commercially available conductive glass slides was developed that enabled fast and sensitive phosphopeptide purification. This protocol was applied to characterize the in vivo phosphorylation of a signalling protein, NFATc1. Evidence for 12 phosphorylation sites was found, many of them in multiply phosphorylated peptides.

Relevance: 30.00%

Publisher:

Abstract:

Electricity distribution network operation (NO) models are being challenged, as they are expected to continue to undergo changes during the coming decades in the fairly developed and regulated Nordic electricity market. Network asset managers must adapt to competitive techno-economic business models for the operation of increasingly intelligent distribution networks. Factors driving the change towards new business models within network operation include: increased investments in distribution automation (DA), regulatory frameworks for annual profit limits and quality through outage cost, increasing end-customer demands, climatic changes, and the increasing use of data system tools such as the Distribution Management System (DMS). The doctoral thesis addresses the questions of a) whether the conditions and qualifications for competitive markets exist within electricity distribution network operation and b) if so, what the limitations and required business mechanisms are. The thesis aims to provide an analytical business framework, primarily for electric utilities, for evaluating and developing dedicated network operation models to meet future market dynamics within network operation. In the thesis, the generic build-up of a business model is addressed through the strategic business hierarchy levels of mission, vision and strategy, which define the strategic direction of the business, followed by the planning, management and process execution levels of enterprise strategy execution. Research questions within electricity distribution network operation are addressed at the specified hierarchy levels. The results represent interdisciplinary findings in the areas of electrical engineering and production economics. The main scientific contributions include further development of extended transaction cost economics (TCE) for governance decisions within electricity networks and validation of the usability of the methodology for the electricity distribution industry. Moreover, DMS benefit evaluations in the thesis, based on outage cost calculations, propose theoretical maximum benefits of DMS applications equalling roughly 25% of the annual outage costs and 10% of the respective operative costs in the case-study electric utility. Hence, the annual measurable theoretical benefits from the use of DMS applications are considerable. The theoretical results in the thesis are generally validated by surveys and questionnaires.
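Restating the quoted benefit proportions as a toy calculation (the euro inputs below are hypothetical placeholders; only the 25% and 10% proportions come from the abstract):

```python
# Toy restatement of the quoted DMS benefit proportions. The euro inputs are
# hypothetical placeholders, not the thesis's case data.

annual_outage_costs = 2_000_000.0      # EUR/year, hypothetical
annual_operative_costs = 5_000_000.0   # EUR/year, hypothetical

benefit_outage_basis = 0.25 * annual_outage_costs        # ~25% of outage costs
benefit_operative_basis = 0.10 * annual_operative_costs  # ~10% of operative costs

print(f"Theoretical max DMS benefit (outage basis):    {benefit_outage_basis:,.0f} EUR/a")
print(f"Theoretical max DMS benefit (operative basis): {benefit_operative_basis:,.0f} EUR/a")
```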

Relevance: 30.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving, after each addition, that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
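As a loose illustration of the invariant-first discipline described above (a runtime-assertion sketch in Python; the actual method uses invariant diagrams and machine-checked proofs in PVS, not runtime checks):

```python
# Sketch: the invariant is written down first and re-established after every
# extension of the code, mimicking invariant-based programming with runtime
# assertions instead of machine-checked proofs.

def sum_array(a):
    total, i = 0, 0
    assert total == sum(a[:i])          # invariant established initially
    while i < len(a):
        total += a[i]                   # each added step must preserve...
        i += 1
        assert total == sum(a[:i])      # ...the invariant
    # At exit i == len(a), so the invariant yields total == sum(a).
    return total

print(sum_array([3, 1, 4, 1, 5]))       # 14
```

In the diagram-based workflow these proof obligations are discharged statically by the theorem prover rather than checked at run time.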

Relevance: 30.00%

Publisher:

Abstract:

The appearance of vibration is a major problem in turning and milling with long tools. The solutions for minimizing vibration currently provided by tool suppliers are very expensive. This Master's thesis presents a new type of vibration-free machining tool, produced by Konepaja ASTEX Gear Oy, with lower production costs than competitors' products. Vibration problems in machining and their current solutions are analyzed in this work. The new vibration damping invention is presented and described. Moreover, the production, laboratory experimental modal analysis, and practical testing of the new vibration-free prototypes are reported and analyzed in this thesis. Based on the test results, the new invention is considered successful and approved for further study and development.

Relevance: 30.00%

Publisher:

Abstract:

The use of domain-specific languages (DSLs) has been proposed as an approach to cost-effectively develop families of software systems in a restricted application domain. Domain-specific languages, in combination with the accumulated knowledge and experience of previous implementations, can in turn be used to generate new applications with unique sets of requirements. For this reason, DSLs are considered an important approach to software reuse. However, the toolset supporting a particular domain-specific language is also domain-specific and is by definition not reusable. Therefore, creating and maintaining a DSL requires additional resources whose cost could even exceed the savings associated with using the DSL. As a solution, different tool frameworks have been proposed to simplify and reduce the cost of developing DSLs. Developers of tool support for DSLs need to instantiate, customize or configure the framework for a particular DSL, and there are different approaches to this. One approach is to use an application programming interface (API) and to extend the basic framework using an imperative programming language; an example of a tool based on this approach is Eclipse GEF. Another approach is to configure the framework using declarative languages that are independent of the underlying framework implementation. We believe this second approach can bring important benefits, as it puts the focus on specifying what the tool should be like instead of writing a program specifying how the tool achieves this functionality. In this thesis we explore this second approach. We use graph transformation as the basic approach to customizing a domain-specific modeling (DSM) tool framework. The contributions of this thesis include a comparison of different approaches for defining, representing and interchanging software modeling languages and models, and a tool architecture for an open domain-specific modeling framework that efficiently integrates several model transformation components and visual editors. We also present several specific algorithms and tool components for the DSM framework, including an approach to graph queries based on region operators and the star operator, and an approach for reconciling models and diagrams after executing model transformation programs. We exemplify our approach with two case studies, MICAS and EFCO, in which we show how our experimental modeling tool framework has been used to define tool environments for domain-specific languages.
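To give a flavour of what a graph-transformation step over a model graph looks like (a plain-Python toy with an invented rule and graph encoding; it is not the thesis's framework or its query operators):

```python
# Toy graph-rewrite step in the spirit of graph-transformation-based DSM
# tooling; the graph encoding and the rule are illustrative only.

# A model graph: node name -> (node type, set of successor node names)
graph = {
    "t1": ("Task", {"t2"}),
    "t2": ("Task", {"t3"}),
    "t3": ("Stop", set()),
}

def rewrite_task_chains(g):
    """Rule: a Task whose sole successor is another Task absorbs that successor."""
    for name, (ntype, succs) in list(g.items()):
        if ntype == "Task" and len(succs) == 1:
            (succ,) = succs
            if succ in g and g[succ][0] == "Task":
                g[name] = ("Task", set(g[succ][1]))  # redirect past the successor
                del g[succ]                          # remove the absorbed node
                return True                          # one rule application per call
    return False

while rewrite_task_chains(graph):    # apply the rule until no match remains
    pass
print(graph)  # {'t1': ('Task', {'t3'}), 't3': ('Stop', set())}
```

The declarative idea is that the developer states the pattern and its replacement; the engine handles matching and repeated application.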

Relevance: 30.00%

Publisher:

Abstract:

The integrated computation chain simulating weapon effect from launch to target consists of interior, exterior and terminal ballistics models. Exterior ballistics covers computational models for trajectory calculation, meteorological correction and projectile aerodynamics. There is a growing need, for both engineering and training purposes, for a comprehensive computation system with a graphical user interface that is based on physically accurate modelling. In particular, if the modelling of explosive effects is added to the computation chain, the effect of weapon systems on the target can be simulated in a way that users value. Data-intensive ballistics computation models are indispensable tools for covering technical design expertise and creating competitive advantage in a networked business environment. Putting the computational methods produced by university research to use in companies' design systems deepens technical expertise, which also has a motivating effect on personnel in technically demanding markets. The thesis assesses the industry by analysing different usage needs for the same database-backed computation models. Technical fundamentals, operating environments and markets are examined in order to identify business opportunities. As a result of the work, the view of core competencies is deepened, and a vision is formed of how the business idea can be differentiated from competitors, of its market, and of how to develop it.
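As a rough illustration of the trajectory calculation that exterior ballistics covers (a point-mass model with quadratic drag and explicit Euler integration; all coefficients are hypothetical placeholders, and real trajectory models are considerably more detailed):

```python
import math

# Point-mass trajectory with quadratic drag, explicit Euler integration.
# All parameters are illustrative placeholders, not real projectile data.

def trajectory(v0=800.0, angle_deg=30.0, k=2e-4, g=9.81, dt=0.01):
    """Return (range, time of flight) for a drag-affected point mass.

    k = 0.5 * rho * Cd * A / m, the combined drag constant [1/m].
    """
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        ax = -k * v * vx          # drag opposes velocity
        ay = -g - k * v * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return x, t

rng, tof = trajectory()
print(f"range ≈ {rng:.0f} m, time of flight ≈ {tof:.1f} s")
```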

Relevance: 30.00%

Publisher:

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we can recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using the approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study, and demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
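A condensed sketch of the pairwise least-squares idea behind RankRLS (a plain linear variant on synthetic data; the thesis's kernelized formulation and its matrix-algebra shortcuts are not reproduced here):

```python
import numpy as np

# Linear pairwise regularized least-squares ranking sketch on synthetic data.
# Loss: sum over pairs (i, j) of ((w.x_i - w.x_j) - (y_i - y_j))^2 + lam*||w||^2.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))                 # 50 objects, 5 features
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=50)   # underlying goodness scores

# Build all pairwise differences, then solve the ridge problem in closed form.
idx_i, idx_j = np.triu_indices(len(y), k=1)
D = X[idx_i] - X[idx_j]                      # pairwise feature differences
r = y[idx_i] - y[idx_j]                      # pairwise score differences
lam = 1.0
w = np.linalg.solve(D.T @ D + lam * np.eye(X.shape[1]), D.T @ r)

# Rank objects by their learned scores and check pairwise agreement.
scores = X @ w
concordant = np.mean(np.sign(scores[idx_i] - scores[idx_j])
                     == np.sign(y[idx_i] - y[idx_j]))
print(f"fraction of correctly ordered pairs: {concordant:.3f}")
```

Forming all pairs explicitly, as here, scales quadratically in the number of examples; the computational shortcuts mentioned in the abstract exist precisely to avoid this.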

Relevance: 30.00%

Publisher:

Abstract:

Case-based reasoning (CBR) is a relatively recent approach to problem solving and learning that has received a lot of attention in recent years. In this work, the CBR methodology is used to reduce the time and resources spent on carrying out experiments to determine the viscosity of new slurries. The aims of this work are: to develop a CBR system to support decision making about the type of slurry behavior, to collect a sufficient volume of qualitative data for the case base, and to calculate the viscosity of Newtonian slurries. First, a literature review of the types of fluid flow and of Newtonian and non-Newtonian slurries is presented, together with some physical properties of suspensions. The second part of the literature review provides an overview of the case-based reasoning field: different models and stages of CBR cycles and the benefits and disadvantages of the methodology are considered, along with a brief review of CBR tools. Finally, some results of the work and opportunities for modernizing the system are presented. The decision support system for slurry viscosity determination was implemented in MS Office Excel. The designed system consists of three parts: the workspace, the case base, and a section for calculating the viscosity of Newtonian slurries. The first and second sections handle Newtonian and Bingham fluids; in the last section, the apparent viscosity of Newtonian slurries can be calculated.
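For the viscosity calculation in the last section, a minimal sketch under standard rheology assumptions (for a Newtonian fluid the viscosity is the slope of shear stress versus shear rate, tau = mu * gamma_dot, while a Bingham plastic adds a yield stress, tau = tau0 + mu_p * gamma_dot; the data points below are hypothetical):

```python
import numpy as np

# Minimal viscosity estimation from (shear rate, shear stress) pairs.
# The data values are hypothetical illustration values.

gamma_dot = np.array([10.0, 20.0, 40.0, 80.0])   # shear rate [1/s]
tau = np.array([0.52, 1.01, 2.03, 3.98])         # shear stress [Pa]

# Newtonian fit: line through the origin, mu = slope.
mu = float(gamma_dot @ tau / (gamma_dot @ gamma_dot))
print(f"apparent viscosity ≈ {mu:.3f} Pa·s")

# Bingham fit: tau = tau0 + mu_p * gamma_dot (least squares with intercept).
mu_p, tau0 = np.polyfit(gamma_dot, tau, 1)
print(f"Bingham: tau0 ≈ {tau0:.3f} Pa, plastic viscosity ≈ {mu_p:.3f} Pa·s")
```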

Relevance: 30.00%

Publisher:

Abstract:

Delays in the justice system have been undermining the functioning and performance of court systems all over the world for decades. Despite the widespread concern about delays, the solutions have not kept up with the growth of the problem. The delay problem in court processes is a good example of the growing need and pressure in professional public organizations to start improving their business process performance. This study analyses the possibilities and challenges of process improvement in professional public organizations. It is based on experiences gained in two longitudinal action research improvement projects conducted in two Finnish courts: the Helsinki Court of Appeal and the Insurance Court. The thesis has two objectives. The first objective is to study what kinds of factors in court system operations cause delays and unmanageable backlogs, and how delays can be reduced and prevented. The second objective is, based on the lessons learned from the case projects, to give new insights into the critical factors of process improvement conducted in professional public organizations. Four main areas and factors behind the delay problem are identified: 1) goal setting and performance measurement practices, 2) the process control system, 3) production and capacity planning procedures, and 4) process roles and responsibilities. The appropriate improvement solutions include tools for enhancing project planning and scheduling and for monitoring the agreed time-frames for the different phases of the handling process and the pending inventory. The study introduces the critical factors identified in the different phases of process improvement work carried out in professional public organizations and the ways these critical factors can be incorporated into the different stages of the projects, and discusses the role of an external facilitator in assisting process improvement work and in enhancing ownership of the solutions and improvement. The study highlights the need to concentrate on the critical factors that get the employees to challenge their existing ways of working, analyze their own processes, and create procedures for diffusing a process improvement culture, instead of merely concentrating on finding tools, techniques, and solutions transferred from the manufacturing sector.

Relevance: 30.00%

Publisher:

Abstract:

The objective of the pilotage effectiveness study was to produce a process description of the pilotage procedure, to design performance indicators based on this process description for use by Finnpilot, and to work out a preliminary plan for the implementation of the indicators within the Finnpilot organisation. The theoretical aspects of pilotage as well as the guidelines and standards used were determined through a literature review. Based on the literature review, a process flow model with the following phases was created: the planning of pilotage, the start of pilotage, the act of pilotage, the end of pilotage and the closing of pilotage. The model based on the literature review was tested through interviews and observation of pilotage. At the same time, an e-mail survey directed at foreign pilotage organisations was conducted, with a questionnaire concerning their standards and management systems, operational procedures, measurement tools and their attitude towards passage planning. The main issues in the observations and interviews were the passage plan and bridge team co-operation. The phases of the pilotage process model emerged in both the pilotage activities and the interviews, whereas bridge team co-operation remained relatively marginal. Most of the pilotage organisations that responded to the survey also use some standards-based management system, and all of them use some sort of pilotage process model. According to the survey, the main measuring tools for pilotage are statistical information concerning pilotage and the organisations, customer feedback surveys, and financial results. Attitudes towards passage planning were mostly positive among the organisations. A workshop with pilotage experts was arranged in which the process model constructed on the basis of the literature review was tuned to match practical pilotage. In the workshop it was determined that certain phases, and the corresponding tasks through which pilotage can be described as a process, were identifiable in all pilotage. The result of the workshop was a complemented process model, which separates incoming and outgoing traffic, as well as fairway pilotage and harbour pilotage, from each other. Additionally, indicators divided according to the data gathering method were defined. Data concerning safety and traffic flow is gathered in the form of customer feedback. The pilots' own perceptions of the pilotage process are gathered through self-assessment. The measurement data connected to the phases of the pilotage process is generated, for example, by gathering statistics on the success of pilot dispatches, the accuracy of the pilotage, and the incidents that occurred during pilotage: near misses, deviations and accidents. The measurement data is collected via PilotWeb at the closing of the pilotage. A separate project, with a project group in which pilots also participate, will be established for the deployment of the performance indicators. The phases of the project are the definition phase, the implementation phase and the deployment phase. The purpose of the definition phase is to prepare the questions for ship commanders in the customer feedback questionnaire and to work out the self-assessment queries and the queries concerning the process indicators.

Relevance: 30.00%

Publisher:

Abstract:

Prostate-specific antigen (PSA) is a marker commonly used in estimating prostate cancer risk. Prostate cancer is usually a slowly progressing disease which may not cause any symptoms whatsoever; nevertheless, some cancers are aggressive and need to be treated before they become life-threatening. However, the blood PSA concentration may also rise in benign prostate diseases, and using a single total PSA (tPSA) measurement to guide the decision on further examinations leads to many unnecessary biopsies, over-detection, and overtreatment of indolent cancers which would not require treatment. Therefore, there is a need for markers that would better separate cancer from benign disorders and would also predict cancer aggressiveness. The aim of this study was to evaluate whether the intact and nicked forms of free PSA (fPSA-I and fPSA-N) or human kallikrein-related peptidase 2 (hK2) could serve as new tools in estimating prostate cancer risk. First, the immunoassays for fPSA-I and for free and total hK2 were optimized so that they would be less prone to assay interference caused by interfering factors present in some blood samples. The optimized assays were shown to work well and were used to study the marker concentrations in the clinical sample panels. The marker levels were measured from preoperative blood samples of prostate cancer patients scheduled for radical prostatectomy, and the association of the markers with cancer stage and grade was studied. Among all tested markers and their combinations, especially the ratio of fPSA-N to tPSA and the ratio of free PSA (fPSA) to tPSA were associated with both cancer stage and grade. They might be useful in predicting cancer aggressiveness, but further follow-up studies are necessary to fully evaluate the significance of the markers in this clinical setting. The markers tPSA, fPSA, fPSA-I and hK2 were combined in a statistical model which had previously been shown to reduce unnecessary biopsies when applied to large screening cohorts of men with elevated tPSA. The discriminative accuracy of this model was compared to models based on established clinical predictors in reference to biopsy outcome. The kallikrein model and the calculated fPSA-N concentrations (fPSA minus fPSA-I) correlated with prostate volume, and the model predicted prostate cancer in biopsy as well as the clinical models did. Hence, the measurement of kallikreins in a blood sample could replace the volume measurement, which is time-consuming, requires instrumentation and skilled personnel, and is an uncomfortable procedure. Overall, the model could simplify the estimation of prostate cancer risk. Finally, as fPSA-N seems to be an interesting new marker, a direct immunoassay for measuring fPSA-N concentrations was developed. Its analytical performance was acceptable, but the rather complicated assay protocol needs to be improved before it can be used for measuring large sample panels.
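The derived quantities used in the model above reduce to simple arithmetic (only the relation fPSA-N = fPSA − fPSA-I and the marker-to-tPSA ratios come from the abstract; the concentrations below are hypothetical illustration values):

```python
# Derived PSA quantities per the abstract. The concentrations are hypothetical
# illustration values in ng/mL, not clinical data.

def psa_ratios(tPSA: float, fPSA: float, fPSA_I: float) -> dict:
    fPSA_N = fPSA - fPSA_I          # nicked free PSA, calculated indirectly
    return {
        "fPSA_N": fPSA_N,
        "fPSA/tPSA": fPSA / tPSA,
        "fPSA_N/tPSA": fPSA_N / tPSA,
    }

print(psa_ratios(tPSA=6.0, fPSA=1.2, fPSA_I=0.8))
```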

Relevance: 30.00%

Publisher:

Abstract:

Life cycle assessment (LCA) is a method in which the inputs and outputs over a product system's life cycle are compiled, yielding its environmental load as a result. LCA is a tool for supporting decision-making. With the complete reform of the Finnish Waste Act, the use of LCA is likely to increase in municipally managed waste management. The waste management unit of the Helsinki Region Environmental Services Authority HSY aims to build a life cycle model with which the environmental load and economic impacts of its entire operation can be determined. HSY's waste management has decided to have the life cycle model built as consulting work. The goal of this work has been to draw up a set of guidelines for procuring life cycle modelling services for HSY's waste management. The life cycle model can be built using commercial software. Three LCA tools were selected for evaluation in this study: EASEWASTE, WRATE and GaBi 4.4. The features of the software packages were evaluated on the basis of the literature and an interview, and a set of criteria for evaluating them was drawn up. The application areas of LCA in municipally managed waste management were determined from the literature, and the application areas and modelling needs of HSY's waste management were identified by interviewing HSY waste management experts. The updating, use and further development of the model built for HSY's waste management should be managed by HSY itself. All of the evaluated software packages are suitable for computing the modelling needs identified by HSY's waste management. The procurement guidelines for life cycle modelling services aim to ensure the acquisition of a model suited to the needs of HSY's waste management and the planning of follow-up measures.

Relevance: 30.00%

Publisher:

Abstract:

The main goal of this study was to find the most efficient and sensible way to organize the tool operations of the company's maintenance unit. The aim was to create a new operating model that achieves functional and cost-efficient tool operations. The study was carried out using a constructive research approach. The theoretical part of the work consists mainly of studies, articles and books on industrial maintenance. The theoretical part also discusses the outsourcing of maintenance and its related challenges and opportunities, and briefly covers inventory management. The empirical material was gathered from the company's enterprise resource planning systems and by studying the tool operations and their special characteristics. At the beginning of the study, the special characteristics, problems and areas for improvement of the current operations were identified. Sensible solutions to these problems and improvement areas were then sought, and an action plan was developed with which the company can reach the targeted, efficient operating model.