11 results for systematic methods
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The term innovation, associated with development activity, can be defined in many ways. Regardless of the exact definition, innovation management can be defined as a process in which different execution phases follow one another. It is essential to distinguish the front end of the innovation process from the actual development project. The front end of the innovation process is often perceived as fuzzy and hard to manage, whereas systematic methods, such as the Stage-Gate process, have been developed for managing development projects. In managing development projects, the insufficiency of resources relative to the number of potential projects becomes a problem, which ties development activity closely to portfolio management. At UPM Timber, RTD refers to development activity that covers not only product development but also process and technology development. UPM Timber has lacked a systematic operating model for carrying out RTD projects, and there has been no system for selecting them or allocating resources to them. This study identifies the critical success factors of RTD activity and formulates the RTD development challenges for UPM Timber. Based on these, a systematic operating model, an RTD process description, is created for UPM Timber for carrying out RTD projects, and portfolio management is integrated into the model.
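The Stage-Gate process mentioned above can be sketched as a sequence of go/kill decisions: before each stage, a gate review decides whether the project continues. This is a minimal generic illustration, not UPM Timber's actual model; the stage names and review criteria are invented.

```python
# Illustrative sketch of a generic Stage-Gate flow (not UPM Timber's actual
# model): a project passes a gate review before each stage and is killed at
# the first failed gate.

STAGES = ["scoping", "business case", "development", "testing", "launch"]

def run_stage_gate(project, gate_reviews):
    """Advance a project through STAGES; stop at the first failed gate.

    gate_reviews maps each stage name to a go/kill decision function.
    Returns the list of stages the project was allowed to enter.
    """
    passed = []
    for stage in STAGES:
        if not gate_reviews[stage](project):  # gate says "kill"
            break
        passed.append(stage)
    return passed

# Example: every gate passes except the development gate, which checks budget.
reviews = {stage: (lambda p: True) for stage in STAGES}
reviews["development"] = lambda p: p["budget_left"] > 0
project = {"name": "example", "budget_left": 0}
result = run_stage_gate(project, reviews)  # -> ['scoping', 'business case']
```

In a portfolio-management setting, the same gate functions can also rank competing projects, since resources freed by a killed project become available to others.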
Abstract:
This study explores personal liberty in psychiatric care from a service user involvement perspective. The data were collected in four phases during the period 2000-2006 in psychiatric settings in Finland. Firstly, patient satisfaction and factors associated with user involvement were studied (n = 313). Secondly, patients' experiences of deprivation of their liberty were explored (n = 51). Thirdly, an overview of patients' options for lodging complaints was conducted, and all complaints (n = 4645) lodged in Finland from 2000 to 2004 were examined. Fourthly, the effects of different patient education methods on inpatients' experiences of deprivation of liberty were tested (n = 311). It emerged that patients were quite satisfied, but reported dissatisfaction with restrictions, compulsory care and information dissemination. Patients experienced restrictions on leaving the ward and on communication, confiscation of property and coercive measures as deprivation of liberty, and they experienced these interventions as negative. In Finland, the patient complaint process is complicated and not easily accessible. In general, patient complaints increased considerably in Finland during the study period; in psychiatric care the number of complaints was quite stable, and complaints less often led to consequences. An Internet-based patient education system was equivalent to traditional education and treatment as usual in supporting personal liberty during hospital care. This dissertation provides new information about the realization of patients' rights in psychiatric care. In order to improve patient involvement, systematic methods to increase personal liberty during care need to be developed, the procedures for lodging complaints should be simplified, and patients' access to information needs to be ensured using multiple methods.
Abstract:
Services have become an almost integral part of products, as well as a business area in their own right alongside them. For this reason, service development has in recent decades attracted the interest of both companies and the academic community. Successful service development often requires systematic methods, but the methods must also leave room for purposeful deviations. The case company aims to produce high-quality services that take customer needs into account. It therefore wants to develop its existing services continuously, and it collects information on service needs from both employees and customers as a starting point for developing new service offerings. The study observes the case company's service development, and the goal is to formulate, based on these observations, a process that the company can use and further develop in the future. The study is a developmental case study whose theoretical framework is formed by the perspectives of new service development and service innovation processes. The company's service development is examined through two primary and two secondary data sets, so it can be viewed from several perspectives using data triangulation. The study found that the case company's service development process turned out to be very similar to the new service development and service innovation processes presented in the theory section. Customers and employees were involved to some extent during the process. In addition, it was observed that new service development and service innovation processes may not be separate development processes, even though different names are used for them in the literature.
Abstract:
The present dissertation is devoted to a systematic approach to the development of abatement of toxic and refractory organic pollutants by chemical decomposition methods in aqueous and gaseous phases. The systematic approach outlines the basic scenario of chemical decomposition process applications, with a step-by-step approximation to the most effective result and a predictable outcome for the full-scale application, confirmed by successful experience. The strategy includes the following steps: chemistry studies; reaction kinetic studies in interaction with mass transfer processes under different control parameters; contact equipment design and studies; mathematical description of the process for its modelling and simulation; integration of the processes into a treatment technology and its optimisation; and the treatment plant design. The main idea of the systematic approach to introducing an oxidation process is a search for the most effective combination of the chemical reaction and the treatment device in which the reaction is supposed to take place. Under this strategy, knowledge of the reaction pathways, products, stoichiometry and kinetics is fundamental and, unfortunately, often unavailable beforehand. Therefore, chemistry research on novel treatment methods nowadays comprises a substantial part of the efforts. Chemical decomposition methods in the aqueous phase include oxidation by ozonation, ozone-associated methods (O3/H2O2, O3/UV, O3/TiO2), the Fenton reagent (H2O2/Fe2+/3+) and photocatalytic oxidation (PCO). In the gaseous phase, PCO and catalytic hydrolysis over zero-valent iron are developed.
The experimental studies within the described methodology involve aqueous-phase oxidation of natural organic matter (NOM) in potable water, phenolic and aromatic amino compounds, ethylene glycol and its derivatives as de-icing agents, and the oxygenated motor fuel additive methyl tert-butyl ether (MTBE) in leachates and polluted groundwater. Gas-phase chemical decomposition includes PCO of volatile organic compounds and dechlorination of chlorinated methane derivatives. The results of the research summarised here are presented in fifteen attachments (publications and papers submitted for publication or under preparation).
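The reaction kinetic studies named in the strategy above often start from a first-order rate law, C(t) = C0 * exp(-k t). The following is a minimal illustrative sketch, not code or data from the dissertation; the rate constant is invented.

```python
import math

# First-order decomposition kinetics, a common starting point for oxidation
# rate studies: the pollutant concentration decays exponentially with time.

def concentration(c0, k, t):
    """Remaining pollutant concentration after time t (first-order decay)."""
    return c0 * math.exp(-k * t)

def time_to_fraction(k, fraction):
    """Time needed to reduce the concentration to `fraction` of the initial C0."""
    return -math.log(fraction) / k

# Example with an assumed rate constant k = 0.1 min^-1:
half_life = time_to_fraction(0.1, 0.5)  # time to reach 50 % of C0, ~6.93 min
```

Fitting k from measured concentration-time data, and checking whether mass transfer rather than the reaction itself limits the observed rate, is where the interaction with the mass transfer steps of the strategy comes in.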
Abstract:
Requirements-related issues have been found to be the third most important risk factor in software projects and the biggest reason for software project failures. This is not a surprise, since requirements engineering (RE) practices have been reported deficient in more than 75% of all enterprises. A problem analysis of small and low-maturity software organizations revealed two central reasons for not starting process improvement efforts: lack of resources and uncertainty about the paybacks of process improvement efforts. In the constructive part of the study, a basic RE method, BaRE, was developed to provide an easy-to-adopt way to introduce basic systematic RE practices in small and low-maturity organizations. Based on the diffusion of innovations literature, thirteen desirable characteristics were identified for the solution, and the method was implemented in five key components: a requirements document template, requirements development practices, requirements management practices, tool support for requirements management, and training. The empirical evaluation of the BaRE method was conducted in three industrial case studies. In this evaluation, two companies established a completely new RE infrastructure following the suggested practices, while the third company continued requirements document template development based on the provided template and used it extensively in practice. The real benefits of adopting the method were visible in the companies within four to six months from the start of the evaluation project, and the two small companies in the project completed their improvement efforts with an input of about one person-month. The data collected in the case studies indicate that the companies implemented the new practices with little adaptation and little effort. Thus it can be concluded that the constructed BaRE method is indeed easy to adopt and can help introduce basic systematic RE practices in small organizations.
Abstract:
This thesis examines and explains the procedure used to redesign the attachment of permanent magnets to the surface of the rotor of a synchronous generator. The methodology followed, from the existing assembly to the final proposed innovation, was based on the systematic design approach. This meant that a series of steps first had to be predefined as a frame of reference, later used to compare and select proposals and finally to obtain the innovation that was sought. Firstly, a series of patents was used as background for the upcoming ideas. To this end, several different patented assemblies were found and categorized according to the main element on which this thesis is focused, namely the attachment element or method. After establishing the technological frame of reference, a brainstorming session was held to obtain as many ideas as possible. These ideas were then classified regardless of their degree of complexity or usability, since at this stage the quantity of ideas was what mattered. Subsequently, they were compared and evaluated from different points of view. The comparison and evaluation in this case was based on a requirement list, which established the main needs that the design had to fulfil. The selection could then be made by grading each idea against these requirements, so as to identify the idea or ideas that best fulfilled them. Once all of the ideas had been compared and evaluated, the best or most suitable idea or ideas were singled out. Finally, the selected idea or ideas were analyzed in depth and a number of improvements were made. Consequently, a final idea was refined and made more suitable in its performance, manufacture, and life-cycle assessment. In the end, the design process thus provided a solution to the problem stated at the beginning.
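The grading-and-selection step described above can be sketched as a weighted requirement-list evaluation. The requirement names, weights and grades below are invented for illustration; they are not the thesis's actual requirement list.

```python
# Sketch of requirement-list evaluation: grade each candidate idea against
# a weighted list of requirements and pick the highest-scoring one.

def select_best(ideas, weights):
    """Return (name, score) of the idea with the highest weighted score.

    ideas:   {idea_name: {requirement: grade}}
    weights: {requirement: importance}
    """
    def score(grades):
        return sum(weights[req] * grades.get(req, 0) for req in weights)
    return max(((name, score(g)) for name, g in ideas.items()),
               key=lambda pair: pair[1])

# Hypothetical requirements and candidate attachment concepts:
weights = {"holding force": 3, "manufacturability": 2, "cost": 1}
ideas = {
    "dovetail groove": {"holding force": 4, "manufacturability": 2, "cost": 3},
    "adhesive only":   {"holding force": 2, "manufacturability": 3, "cost": 5},
}
best = select_best(ideas, weights)  # -> ('dovetail groove', 19)
```

A requirement list used this way makes the selection repeatable: changing a weight documents a changed priority rather than an unexplained change of preference.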
Abstract:
In the 21st century, agile project management (APM) has emerged as a major evolutionary step in software project management. APM is defined as a conceptual framework consisting of various methods, such as Scrum and extreme programming (XP), that emphasize quick response to change, closer customer collaboration and minimal documentation, and that facilitate producing working software in multiple iterations through teamwork. Because agile project management has become more popular in the software industry in recent years, it constitutes an interesting and comprehensive research topic. This thesis presents a systematic literature review (SLR) of published research articles concerning agile project management. Based on a predefined search strategy, 273 such articles were identified, of which 44 were included in the review. The selected 44 articles were published between 2005 and 2012. The thesis defines a review process by developing a review protocol and presents the results of the review. The results are expected to provide researchers and software managers with an overview of the published research on agile project management.
Abstract:
Context: Web services have been gaining popularity due to the success of service-oriented architecture and cloud computing. Web services offer service developers a tremendous opportunity to publish their services and applications beyond the boundaries of their organization or company. To fully exploit these opportunities, however, an efficient discovery mechanism is needed, and Web service discovery mechanisms have therefore attracted considerable attention in Semantic Web research. Yet there have been no literature surveys that systematically map the existing research results, so the overall impact of these research efforts and the level of maturity of their results are still unclear. This thesis aims to provide an overview of the current state of research into Web service discovery mechanisms using a systematic mapping study. The work is based on papers published from 2004 to 2013 and elaborates various aspects of the analyzed literature, including classifying it in terms of the architectures, frameworks and methods used for Web service discovery. Objective: The objective of this work is to summarize the current knowledge available on Web service discovery mechanisms and to systematically identify and analyze the currently published research in order to identify the different approaches presented. Method: A systematic mapping study has been employed to assess the various Web service discovery approaches presented in the literature. Systematic mapping studies are useful for categorizing and summarizing the level of maturity of a research area. Results: The results indicate that numerous approaches are consistently being researched and published in this field. In terms of where this research is published, conferences are the major publishing arena: 48% of the selected papers were conference papers, illustrating the level of maturity of the research topic.
Additionally, the 52 selected papers are categorized into two broad segments, functional and non-functional approaches, taking into consideration architectural aspects and information retrieval approaches, semantic matching, syntactic matching, behavior-based matching, as well as QoS and other constraints.
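The screening and tallying steps of a systematic mapping study can be sketched as a simple filter-and-count pipeline. The papers, inclusion criterion and venue labels below are illustrative only, not the thesis's actual data.

```python
from collections import Counter

# Sketch of systematic-mapping bookkeeping: screen retrieved papers against
# an inclusion criterion, then tally publication venues, as done when
# reporting e.g. the share of conference papers among the selected studies.

def screen(papers, include):
    """Keep only the papers that satisfy the inclusion predicate."""
    return [p for p in papers if include(p)]

papers = [
    {"title": "A", "venue": "conference", "topic": "discovery"},
    {"title": "B", "venue": "journal",    "topic": "discovery"},
    {"title": "C", "venue": "conference", "topic": "other"},
]
selected = screen(papers, lambda p: p["topic"] == "discovery")
venues = Counter(p["venue"] for p in selected)
conference_share = venues["conference"] / len(selected)  # -> 0.5
```

Keeping the criterion as an explicit predicate is what makes the mapping repeatable: another researcher applying the same predicate to the same retrieved set obtains the same selection.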
Abstract:
This study reviews the research on interaction techniques and methods that could be applied in mobile augmented reality scenarios. The review focuses on the most recent advances and especially considers the use of head-mounted displays. In the review process, we have followed a systematic approach, which makes the review transparent, repeatable, and less prone to human error than if it were conducted in a more traditional manner. The main research subjects covered in the review are head orientation and gaze tracking, gestures and body-part tracking, and multimodality, as far as these subjects relate to human-computer interaction. Besides these, a number of other areas of interest are also discussed.
Abstract:
In this paper, we review the advances in monocular model-based tracking over the ten-year period up to 2014. In 2005, Lepetit et al. [19] reviewed the status of monocular model-based rigid-body tracking. Since then, direct 3D tracking has become quite a popular research area, but monocular model-based tracking should still not be forgotten. We mainly focus on tracking that could be applied to augmented reality, but some other applications are also covered. Given the wide subject area, this paper tries to give a broad view of the research that has been conducted, giving the reader an introduction to the different disciplines that are tightly related to model-based tracking. The work has been conducted by searching well-known academic search databases in a systematic manner and selecting certain publications for closer examination. We analyze the results by dividing the found papers into different categories by their way of implementation. Issues that have not yet been solved are discussed. We also discuss emerging model-based methods, such as fusing different types of features and region-based pose estimation, which could show the way for future research on this subject.
Abstract:
Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, handling the systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray world. Studies comparing normalization methods on proteomics data sets using some variability measures exist. However, a more thorough comparison is lacking: one that looks at the quantitative and qualitative differences in the performance of the different normalization methods and at their ability to preserve the true differential expression signal of proteins. In this thesis, several popular and widely used normalization methods (linear regression normalization, local regression normalization, variance stabilizing normalization, quantile normalization, median central tendency normalization, and variants of some of the aforementioned methods), representing different normalization strategies, are compared and evaluated with a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways. Their performance is evaluated qualitatively and quantitatively, both on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data set or pairwise for the examined comparison pairs affects how well a method normalizes the data and preserves the true differential expression signal.
In this thesis, both major and minor differences in the performance of the different normalization methods were found. The way in which the normalization was performed (global normalization of the whole data set or pairwise normalization of the comparison pair) also affected the performance of some of the methods in pairwise comparisons, and differences among variants of the same methods were observed.
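Of the methods compared above, median central-tendency normalization is simple enough to sketch directly: each sample (column of intensities) is shifted so that all sample medians coincide. This is an illustrative plain-Python implementation, not the thesis's code; real proteomics pipelines use dedicated tools, and intensities are typically normalized on a log scale.

```python
# Median central-tendency normalization: shift every sample so that all
# sample medians agree on a common reference level (the median of medians).

def median(values):
    """Median of a list of numbers (average of the two middle values if even)."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def median_normalize(samples):
    """samples: list of columns, one list of intensities per sample."""
    target = median([median(col) for col in samples])  # common reference level
    return [[x - median(col) + target for x in col] for col in samples]

# Example: two samples with medians 2 and 12 are aligned to the grand median 7.
normalized = median_normalize([[1, 2, 3], [11, 12, 13]])  # both become [6, 7, 8]
```

The additive shift removes a per-sample offset but leaves within-sample spread untouched, which is exactly why the thesis also compares it against variance-aware methods such as variance stabilizing normalization.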