99 results for European best practices
Abstract:
The purpose of this Master's thesis is to study best practices for virtual project management from the project manager's point of view. The best practices are organized according to a five-phase virtual project life cycle model, and each phase includes concrete suggestions for action. The theoretical background of the thesis is wide because of the broad subject matter: the theoretical part examines topics such as virtual work and the challenges of virtual project management, and introduces concrete actions for tackling these challenges. The thesis takes a constructive approach, in which a known problem is solved piece by piece after building a pre-understanding of the topic. Existing research is utilized in creating a model for virtual project team management. The basis of the model comes from various best practices drawn from the literature and from interviews conducted with experienced virtual project managers in the case organization. As a result, the model combines both previous research and the organization's empirical experience. The output of the thesis is a model for virtual project team management that virtual project managers can use as a guideline in their work. The model includes actions and practices that can be used to overcome the challenges of virtual project management.
Abstract:
Nurses' acceptance and use of information technology in psychiatric hospitals. The use of information technology (IT) has not played a very significant role in psychiatric nursing, even though IT applications have been found to have radically affected healthcare services and the work processes of nursing staff in recent years. The aim of this study is to describe the acceptance and use of information technology among nursing staff working in psychiatric care, and to create a recommendation that can be used to support these matters in psychiatric hospitals. The study consists of five sub-studies employing both statistical and qualitative research methods. The research data were collected among the nursing staff of nine acute psychiatric wards during 2003-2006. The Technology Acceptance Model (TAM) was used to structure the research process and to deepen the understanding of the results obtained. The study identified eight key factors that may support the acceptance and use of IT applications by nurses working in psychiatric hospitals when these factors are taken into account during the introduction of new applications. The factors fell into two groups: external factors (allocation of resources, collaboration, computer skills, IT training, practice in using the application, the patient-nurse relationship), and ease of use and application usability (guidance for use, ensuring usability). The TAM was found useful in interpreting the results. The recommendation developed includes the measures that make it possible to support the commitment of both organizational management and nursing staff, and thereby to ensure the acceptance and use of a new application in nursing. The recommendation can be applied in practice when new information systems are implemented in psychiatric hospitals.
Abstract:
This study answers the question "How can IT service management best practices be implemented as part of the target organization's operations?". The theoretical framework of the study is formed from the scientific discussions on best practice transfer, knowledge management and knowledge transfer. The empirical part of the study is carried out as a qualitative case study. In the theoretical part, an integrative model is constructed from the previous scientific discussions on which the framework of the study is built. The empirical part examines the target organization's preconditions for adopting best practices, as well as the procedures and support systems with which the adoption can be supported. The key measures recommended to the target organization are improving the unit's internal communication, defining and formalizing knowledge management practices, increasing incentives, and scaling operations to match the available resources.
Abstract:
The METKU project (Developing the Maritime Safety Culture) examines the effects of the International Safety Management (ISM) Code on maritime safety and seeks areas of development for improving maritime safety management. This interview report was produced jointly by work packages 1 and 2 of the METKU project. A total of 94 maritime professionals were interviewed for the report. The majority of the interviewees were active seafarers: crew members, officers and ship masters. Seven Finnish shipping companies were covered by the interviews, which gathered seafarers' experiences and opinions of the ISM Code's effect on their practical work. Finnish seafarers believe that today the managers of shipping companies are strongly committed to safety. Crew attitudes toward safety have also improved with the use of the ISM Code. A shared concern of the interviewees was the functioning of continuous improvement: all interviewed groups agreed that incident reporting does not work properly despite the requirements of the ISM Code. The introduction of the ISM Code has clearly benefited shipping. The interviewees cited as benefits the improved cooperation and flow of information between ships and the shipping company, as well as the improved quality of maritime operations. Many interviewees emphasized that the ISM Code's clear assignment of safety responsibilities to the company has been a significant benefit. The seafarers had little to criticize in the ISM Code itself; instead, they saw room for improvement in the practical implementation of safety management. Problems attributed to the ISM Code included increased bureaucracy and overly complex and detailed safety manuals. Many interviewees hoped that guidelines would be drawn up for the practical application of the ISM Code.
Abstract:
B2B document handling is moving from paper to electronic networks and the electronic domain very rapidly. Moving, handling and transforming large electronic business documents demands a lot from the systems that handle them. This paper explores new technologies such as SOA, event-driven systems and the ESB, and a scalable, event-driven enterprise service bus is built to demonstrate these new approaches to message handling. The end result is a small but fully functional messaging system with several different components. This was the first larger in-house Java project, so along the way we also developed our own set of best practices for Java development, covering configuration, tooling, code repositories, class naming and much more.
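The event-driven, publish/subscribe style the abstract describes can be sketched in a few lines. This is a minimal illustrative sketch in Python rather than the thesis's actual Java ESB; the `EventBus` class, the topic name and the message shape are all assumptions for illustration:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process event bus: producer components publish
    messages to named topics, and subscriber components registered on
    a topic receive every message published to it. A real ESB would
    add routing, transformation and transport on top of this idea."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subscribers[topic]:
            handler(message)

received = []
bus = EventBus()
bus.subscribe("orders", received.append)    # a consumer component
bus.publish("orders", {"id": 1, "qty": 5})  # a producer component
```

The key design property is the decoupling: the producer never references the consumer, only the topic, which is what lets an event-driven bus scale by adding components independently.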
Abstract:
The research focus of this thesis is to explore options for building systems for business-critical web applications. Business criticality here includes requirements for data protection and system availability. The focus is on open source software. The goals are to identify robust technologies and engineering practices for implementing such systems. The research methods include experiments with sample systems built around chosen software packages that represent certain technologies. The main research focused on finding a good method for database replication, a key functionality for high-availability, database-driven web applications. The research also included collecting engineering best practices from books written by administrators of high-traffic web applications. The replication experiment showed that the block-level synchronous replication offered by the DRBD replication software provides considerably more robust data protection and high-availability functionality than the leading open source database product MySQL with its built-in asynchronous replication. For master-master database setups, block-level replication is the more recommended way to build high availability into the system. Based on the thesis research, building high-availability web applications is possible using a combination of open source software and engineering best practices for data protection, availability planning and scaling.
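The difference the experiment highlights, synchronous versus asynchronous replication, comes down to when a write is acknowledged. The following sketch mimics the semantics only; it is not DRBD or MySQL code, and the class and function names are illustrative assumptions:

```python
class Replica:
    """A trivially simple key-value store standing in for a database volume."""
    def __init__(self):
        self.data = {}

def write_sync(primary, replica, key, value):
    """DRBD-style block-level synchronous replication: the write is
    acknowledged only after the replica has also applied it, so an
    acknowledged write survives a primary failure."""
    primary.data[key] = value
    replica.data[key] = value  # must complete before the ack
    return "ack"

def write_async(primary, replica_queue, key, value):
    """MySQL-style built-in asynchronous replication: the write is
    acknowledged as soon as the primary has it; the replica applies it
    later, so a primary crash can lose acknowledged writes."""
    primary.data[key] = value
    replica_queue.append((key, value))  # replayed to the replica later
    return "ack"

primary, replica = Replica(), Replica()
write_sync(primary, replica, "user:1", "alice")
# the replica is already up to date at this point

queue = []
write_async(primary, queue, "user:2", "bob")
# "user:2" is acknowledged but not yet on the replica
```

This is why the thesis finds synchronous block-level replication more robust for data protection: the acknowledgement itself guarantees durability on both nodes, at the cost of added write latency.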
Abstract:
IT outsourcing refers to the way companies focus on their core competencies and buy supporting functions from other companies specialized in that area. A service is the total outcome of numerous activities by employees and other resources intended to provide solutions to customers' problems. Outsourcing and service business each have their unique characteristics. Service Level Agreements quantify the minimum acceptable service to the user; service quality has to be objectively quantified so that its achievement or non-achievement can be monitored. Offshoring usually refers to transferring tasks to low-cost nations; it presents many challenges that require special attention and need to be assessed thoroughly. IT infrastructure management refers to the installation and basic usability assistance of operating systems, networks, and server tools and utilities. ITIL defines the industry best practices for organizing IT processes. This thesis analyzed a server operations service and the customers' perception of the quality of daily operations. The agreed workflows and processes should be followed more closely: the service provider's processes are thoroughly defined, but both the customer and the service provider may deviate from them. The service provider should review the workflows concerning customer-facing functions. Customer-facing functions require persistent skill development, as they communicate the quality to the customer. The service provider also needs better organized communication and knowledge exchange methods between specialists in different geographical locations.
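The point that an SLA must objectively quantify service quality can be made concrete with the most common SLA metric, availability. A minimal sketch; the function name and the 30-day measurement period are assumptions for illustration:

```python
def allowed_downtime_minutes(availability_pct, period_minutes=30 * 24 * 60):
    """Maximum downtime per period that still meets an availability SLA.

    For example, a 99.9% availability target over a 30-day period
    (43,200 minutes) permits about 43.2 minutes of downtime; measured
    downtime beyond that is an objective, monitorable SLA breach."""
    return period_minutes * (1 - availability_pct / 100)
```

Expressing the target this way is what makes "achievement or non-achievement" monitorable: the provider and customer compare one measured number against one agreed number.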
Abstract:
The purpose of this study is to develop IT project portfolio management at Metropolia University of Applied Sciences. Portfolio management covers creating project proposals, evaluating and prioritizing projects and project proposals, and launching projects. In addition to evaluation and prioritization, the focus is on examining the portfolio as a single whole. The study was carried out as constructive research in cooperation with Metropolia University of Applied Sciences. The research methods were a literature review, structured interviews and a questionnaire. The literature review focused on the portfolio management process, the methods used in it, and project evaluation criteria. Interviews and written questionnaires conducted at Metropolia were used to map the current state of portfolio management and the areas needing development. Best practices in portfolio management were surveyed by interviewing two public administration organizations. The result of the study is a portfolio management process for Metropolia's IT administration, together with supporting methods and project evaluation criteria. With these, the portfolio can be prioritized uniformly and systematically, taking into account the most important aspects of portfolio management. The results of the study can be utilized in Metropolia's other units or in similar organizations.
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that it is a hard skill to learn and that the outcomes of programming courses have often been substandard. Thus, a range of methods and tools has been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to support the learning process during the last few decades. The studies in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. The thesis includes results from multiple empirical studies of the effects that introducing and using these tools have on students' opinions and performance, and of the implications from a teacher's point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. Using the tools motivated students to work harder during their course, which showed in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher, more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. Such tools can help us cope with the fact that many programming courses are overcrowded while teaching resources are limited. They let us tackle this problem by applying automatic assessment to the exercises best suited to the web (such as tracing and simulation), since this supports students' independent learning regardless of time and place.
In summary, we can use a course's resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the students' performance. The thesis also yields methodological results that contribute to the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation the tool uses. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures when studying the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have relevant experience in conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This change can turn work on academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of new tools and techniques and the overall knowledge of best practices. In the future, we need to combine our forces and tackle these universal, common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time conduct multi-national research projects easily.
Abstract:
Diabetes is a rapidly increasing worldwide problem characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) together with automatic or semi-automatic image analysis algorithms provides a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For training and ground truth estimation, the algorithm combines manual annotations from several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance in experiments on colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is a benchmarking framework for eye fundus image analysis algorithms, needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth, and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical decision-making practice, providing protocols for image- and pixel-based evaluations.
During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made public on the web to set baseline results for the automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is also presented; optic disc localisation is discussed because normal eye fundus structures are fundamental in the characterisation of DR.
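The one-class idea described above, estimating only the lesion class's density and thresholding it, can be sketched as follows. This is a hedged, single-Gaussian stand-in for the thesis's Gaussian mixture model (a full GMM would fit several components with EM); the function names and two-dimensional colour features are assumptions for illustration:

```python
import numpy as np

def fit_gaussian(samples):
    """Estimate mean and covariance of lesion-colour samples: a
    one-component stand-in for the mixture model."""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    return mu, cov

def density(x, mu, cov):
    """Multivariate normal probability density at point x."""
    d = x - mu
    inv = np.linalg.inv(cov)
    norm = np.sqrt((2 * np.pi) ** len(mu) * np.linalg.det(cov))
    return np.exp(-0.5 * d @ inv @ d) / norm

def is_lesion(x, mu, cov, threshold):
    """One-class decision: label a pixel as the lesion type when its
    estimated density exceeds a threshold; everything else, including
    objects never seen in training, falls into the 'other' class."""
    return density(x, mu, cov) >= threshold
```

The appeal of the one-class formulation for this problem is that only the lesion class needs labelled training data; the unbounded variety of "everything else" in fundus images never has to be modelled explicitly.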
Abstract:
Project management has evolved in recent decades. Project portfolio management, together with multi-project management, is an emerging area in the project management field in practice, and correspondingly in academic research and forums. In multi-project management, projects cannot be handled in isolation from each other, as they often have interdependencies that have to be taken into account. If the interdependencies between projects are evaluated during the selection process, the success rate of the project portfolio increases. Interdependencies can be human-resource, technological, and/or market based. Despite the fact that interdependency as a phenomenon has roots in the 1960s and is related to famous management theories, it has not been studied much, although in practice most companies rely on it to a great extent. Some research on interdependency exists, but prior publications have not emphasized the phenomenon per se, because a practical orientation and practitioner techniques prevail in the literature. This research applies method triangulation: electronic surveys and a multiple case study. The research concentrates on small to large companies in Estonia and Finland, mainly in the construction, engineering, ICT, and machinery industries. The literature review reveals that interdependencies are deeply involved in R&D and innovation. The survey analysis shows that companies are aware of interdependency issues in general, but lack the detailed knowledge to use them thoroughly. Empirical evidence also indicates that interdependency techniques influence the success rate and other efficiency aspects to different extents. There are many similarities in interdependency-related managerial issues across companies of varying sizes and countries in Northern Europe. Among the differences found in the study is the fact that smaller companies face more difficulties in implementing and evaluating interdependency procedures.
Country differences between Estonia and Finland stem from historical and cultural reasons, such as the special features of a transition country compared to a mature country. An overview of the dominant problems, best practices, and commonly used techniques associated with interdependency is provided in the study. Empirical findings show that many interdependency techniques are not used in practice. A multiple case study was performed to find out how interdependencies are managed in real life on a daily basis. The results show that interdependencies are mostly managed in an informal manner. A description of managing the interdependencies and of implementation procedures is given. Interdependency procedures are hard to implement, especially in smaller companies, and companies have difficulties both in implementing such procedures and in evaluating them. The study contains detailed results on how companies have implemented working solutions to manage interdependencies on a daily basis.
Abstract:
This thesis studies articles published in scientific journals about working capital management using bibliometric methods. The study was restricted to articles published in 1990–2010 that deal with working capital management as a whole rather than a single sub-area of it. Working capital is defined as current assets minus current liabilities; sometimes the definition inventory plus accounts receivable minus accounts payable is also used. The data were retrieved from the ISI Web of Science and SciVerse Scopus databases. Twenty-three articles about working capital management were found. Content analysis, statistical analysis and citation analysis were performed on the articles. The most cited articles identified in the citation analysis were also analyzed with nearly the same methods. This study found that scientific research on working capital management does not seem to be concentrated in specific persons, organizations or journals. The originality and novelty of many articles is low. Many articles studied the relationship between working capital management and profitability in firms, or firms' working capital management practices, using statistical analyses. The data in the articles covered firms of all sizes, except that in developing economies only big firms were used. Interesting areas for future research could be surveys of working capital management practices in firms, identification of best practices, tools for working capital management, inventing or improving alternative views of working capital management such as a process-oriented view, and firm- or industry-specific studies.
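The two definitions of working capital used in the abstract are simple balance-sheet arithmetic and can be stated directly; the figures in the comments below are made up for illustration:

```python
def working_capital(current_assets, current_liabilities):
    """Broad definition: working capital = current assets - current liabilities."""
    return current_assets - current_liabilities

def operating_working_capital(inventory, accounts_receivable, accounts_payable):
    """Narrower operating definition:
    inventory + accounts receivable - accounts payable."""
    return inventory + accounts_receivable - accounts_payable

# Illustrative figures (in, say, thousands of euros):
wc = working_capital(500, 300)                     # 200
owc = operating_working_capital(120, 80, 60)       # 140
```

The two measures differ because the broad definition also includes items such as cash and short-term debt, which the operating definition deliberately leaves out to isolate the capital tied up in the operating cycle.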
Abstract:
Longitudinal studies are quite rare in the area of Operations Management. One reason might be the time needed to conduct such studies, together with the lack of experience and of real-life examples and results. The aim of the thesis is to examine longitudinal studies in the area of OM and the possible advantages, challenges and pitfalls of such studies. A longitudinal benchmarking study, Made in Finland, was analyzed in terms of the study methodology and its outcomes. The timeline of this longitudinal study is interesting: the first study was made in 1993, the second in 2004 and the third in 2010. Between these studies, some major changes occurred in the Finnish business environment. Between the first and second studies, Finland joined the EEA and the EU, and globalization took off with the rise of the Internet era, while between the second and third studies came the financial turmoil that started in 2007. The sample and cases used in this study were originally 23 manufacturing sites in Finland, which were interviewed in 1993, 2004 and 2010. One important and interesting aspect is that all the original sites participated in 2004, and 19 sites were still able to participate in 2010; four sites had been closed and/or moved abroad. All of this gave a good opportunity to study the changes that occurred in the Finnish manufacturing sites and their environment, how the sites reacted to these changes, and the effects on their performance. It is very seldom, if ever, that the same manufacturing sites have been studied in a longitudinal setting using three data points. The results of this study are thus unique, and the experience gained is valuable for practitioners.
Abstract:
Open innovation is becoming increasingly popular in academic literature and in business life, but even people who have heard about it might not understand what it really is; they may overestimate it, thinking it is a savior, or underestimate it, concentrating on its limitations and risks. The current work sheds light on the most important concepts of open innovation theory. The goal of the research is to propose business process improvements for both inbound and outbound modes in the case company. This is relevant because open innovation has been shown to affect firms' performance both in general and in the case company, and Nokia has planned to develop its open innovation implementation since 2008 while competitors still succeed at it more; therefore an analysis of the current situation with open innovation at Nokia and recommendations on how to improve it are topical. The case study method was used to answer the question "How can open innovation processes be improved?". Eleven in-depth interviews with Nokia senior managers and independent consultants, as well as secondary sources, were used to reach the goal of the thesis. The results of the work are as-is and to-be models (process models of today and best-practice models) of several open innovation modes, together with recommendations for the case company, which will be presented to company representatives and checked for practical applicability.
Abstract:
Environmental accountability has become a major source of competitive advantage for industrial companies, because customers consider it a relevant buying criterion. However, in order to leverage their environmental responsibility, industrial suppliers have to be able to demonstrate the environmental value of their products and services, which is also the aim of Kemira, the global water chemistry company considered in this study. The aim of this thesis is to develop a tool that Kemira can use to assess the environmental value of its solutions for customer companies in the mining industry. The study answers the questions of what kinds of methods exist for assessing environmental impacts, and what kind of tool could be used to assess the environmental value of Kemira's water treatment solutions. The environmental impacts of mining activities vary greatly between mines. Generally, the major impacts include water-related issues and wastes; energy consumption is also a significant environmental aspect. Water-related issues include water consumption and impacts on water quality. There are several methods for assessing environmental impacts, for example life cycle assessment, eco-efficiency tools, footprint calculations and process simulation. In addition, the corresponding financial value may be estimated using monetary assessment methods. Some of the industrial companies considered in the analysis of industry best practices use environmental and sustainability assessments. Based on the theoretical research and the conducted interviews, an Excel-based tool utilizing reference data on previous customer cases and customer-specific test results was considered the most suitable way to assess the environmental value of Kemira's solutions. The tool can be used to demonstrate the functionality of Kemira's solutions in customers' processes, their impacts on other process parameters, and their environmental and financial aspects.
In the future, the tool may be extended to Kemira's other segments as well, not only the mining industry.