916 results for automatic test case generation


Relevance:

30.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition through modeling and prediction to automatic data mapping. Their efficiency is competitive with that of geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.

Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps reveal the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.

The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; the classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and the assessment and susceptibility mapping of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to create a user-friendly and easy-to-use interface.
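The k-NN approach to ESDA described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the synthetic coordinates and values, the equal-weight averaging of neighbours and the leave-one-out choice of k are all assumptions made for the example.

```python
import numpy as np

def knn_predict(coords, values, query, k):
    """Predict at each query point as the mean of its k nearest samples."""
    d = np.linalg.norm(coords[None, :, :] - query[:, None, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    return values[idx].mean(axis=1)

def loo_rmse(coords, values, k):
    """Leave-one-out RMSE, the usual criterion for choosing k in spatial k-NN."""
    d = np.linalg.norm(coords[None, :, :] - coords[:, None, :], axis=2)
    np.fill_diagonal(d, np.inf)          # exclude the point itself
    idx = np.argsort(d, axis=1)[:, :k]
    pred = values[idx].mean(axis=1)
    return float(np.sqrt(np.mean((pred - values) ** 2)))

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))                      # synthetic monitoring sites
values = np.sin(coords[:, 0] / 15) + 0.1 * rng.normal(size=200)  # smooth field + noise
best_k = min(range(1, 21), key=lambda k: loo_rmse(coords, values, k))
```

Part of the method's interpretability lies in this selection step: a small optimal k suggests short-range spatial structure, a large one a smooth field.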

Relevance:

30.00%

Publisher:

Abstract:

Detection of latent tuberculosis infection (LTBI) is a cost-effective procedure in patients at high risk of developing tuberculosis later and who could benefit from preventive treatment. The commonest situation where screening is indicated is the search for infected contacts of an index case with pulmonary tuberculosis. As a screening procedure the current tendency is to replace the time-honoured tuberculin skin test by one of the new blood tests measuring the release of interferon gamma by sensitised T lymphocytes after stimulation by specific peptides from M. tuberculosis. The main advantage of the new tests is the absence of interference with BCG and non-tuberculous mycobacteria, which confers high specificity on the test. This allows a more selective choice of persons for whom preventive treatment is indicated. Some controversial issues remain, such as sensitivity in children and immunocompromised subjects, the predictive value of the blood test and interpretation of possible changes in test results over time. The technical aspects required for performance of the tests must be considered.

Relevance:

30.00%

Publisher:

Abstract:

A healthy 43-year-old man complains of fever with abdominal pain, vomiting and diarrhoea, followed by the development of thrombocytopenia and acute renal failure. Laboratory tests show the presence of Hantavirus-specific IgM and IgG, and a confirmatory test identifies the Puumala serotype as responsible. The patient received symptomatic treatment with a favourable evolution, allowing discharge about ten days after the onset of symptoms. Hantaviruses are transmitted by rodents, and this patient was certainly infected in Switzerland, given the absence of travel abroad during the incubation period. This means that when confronted in Switzerland with an acute nephritis of unknown origin, a diagnosis of nephropathia epidemica must be taken into account.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Anti-tumour necrosis factor (anti-TNF) treatments may reactivate latent tuberculosis infection (LTBI). For detecting LTBI, the tuberculin skin test (TST) has low sensitivity and specificity. Interferon-gamma release assays (IGRA) have been shown to be more sensitive and specific than the TST. OBJECTIVE: To compare the TST and the T-SPOT.TB IGRA for identifying LTBI in patients with psoriasis before anti-TNF treatment. METHODS: A retrospective study was carried out over a 4-year period on patients with psoriasis requiring anti-TNF treatment. All underwent the TST, T-SPOT.TB and chest X-ray. Risk factors for LTBI and history of bacillus Calmette-Guérin (BCG) vaccination were recorded. The association of T-SPOT.TB and TST results with risk factors for LTBI was tested with univariate logistic regression models. Agreement between tests was quantified using kappa statistics. Treatment for LTBI was started 1 month before anti-TNF therapy when indicated. RESULTS: Fifty patients were included; 90% had prior BCG vaccination. A positive T-SPOT.TB was strongly associated with a presumptive diagnosis of LTBI (odds ratio 7.43; 95% confidence interval 1.38-39.9), which was not the case for the TST. Agreement between the T-SPOT.TB and TST was poor, kappa = 0.33 (SD 0.13). LTBI was detected and treated in 20% of the patients. In 20% of cases, a diagnosis of LTBI was not retained, despite a positive TST, because the T-SPOT.TB was negative. All patients received an anti-TNF agent for a median of 56 weeks (range 20-188); among patients with a positive TST and negative T-SPOT.TB, no tuberculosis was detected over a median follow-up of 64 weeks (44-188). One case of disseminated tuberculosis occurred after 28 weeks of adalimumab treatment in a patient with LTBI, in spite of treatment with rifampicin. CONCLUSION: This study is the first to underline the frequency of LTBI in patients with psoriasis (20%) and to support the use of IGRA instead of the TST for its detection. Nevertheless, a risk of tuberculosis remains under anti-TNF therapy, even if LTBI is correctly diagnosed and treated.

Relevance:

30.00%

Publisher:

Abstract:

Leprosy is a contagious and chronic systemic granulomatous disease caused by Mycobacterium leprae (Hansen's bacillus). It is transmitted from person to person and has a long incubation period (between two and six years). The disease presents polar clinical forms (the "multibacillary" lepromatous leprosy and the "paucibacillary" tuberculoid leprosy), as well as other intermediate forms with hybrid characteristics. Oral manifestations usually appear in lepromatous leprosy and occur in 20-60% of cases. They may take the form of multiple nodules (lepromas) that progress to necrosis and ulceration. The ulcers are slow to heal and produce atrophic scarring or even tissue destruction. The lesions are usually located on the hard and soft palate, on the uvula, on the underside of the tongue, and on the lips and gums. There may also be destruction of the anterior maxilla and loss of teeth. The diagnosis, based on clinical suspicion, is confirmed through bacteriological and histopathological analyses, as well as by means of the lepromin test (an intradermal reaction that is usually negative in the lepromatous form and positive in the tuberculoid form). The differential diagnosis includes systemic lupus erythematosus, sarcoidosis, cutaneous leishmaniasis and other skin diseases, tertiary syphilis, lymphomas, systemic mycosis, traumatic lesions and malignant neoplasias, among other disorders. Treatment is difficult as it must be continued for long periods, requires several drugs with adverse effects and proves very expensive, particularly for less developed countries. The most commonly used drugs are dapsone, rifampicin and clofazimine. Quinolones, such as ofloxacin and pefloxacin, as well as clarithromycin and minocycline, are also effective. The present case report describes a patient with lepromatous leprosy acquired within a contagious family setting during childhood and adolescence.

Relevance:

30.00%

Publisher:

Abstract:

This abstract presents how we redesigned, with user-centred design methods, the way we organize and present the content on the UOC Virtual Library website. The content is now offered in a way that is more intuitive, usable and easy to understand, based on criteria of customization, transparency and proximity. The techniques used to achieve these objectives included benchmarking, interviews and focus groups during the user requirement capture phase, and user tests to assess the process and results.

Relevance:

30.00%

Publisher:

Abstract:

The size-advantage model (SAM) explains the temporal variation of energetic investment in reproductive structures (i.e. male and female gametes and reproductive organs) in long-lived hermaphroditic plants and animals. It proposes that an increase in the resources available to an organism induces a higher relative investment in the most energetically costly sexual structures. In plants, pollination interactions are known to play an important role in the evolution of floral features. Because the SAM directly concerns flower characters, pollinators are expected to have a strong influence on the applicability of the model. This hypothesis, however, has never been tested. Here, we investigate whether the identity and diversity of pollinators can be used as a proxy to predict the applicability of the SAM in exclusively zoophilous plants. We present a new approach to unravel the dynamics of the model and test it on several widespread Arum (Araceae) species. By identifying the species composition, abundance and spatial variation of arthropods trapped in inflorescences, we show that some species (i.e. A. cylindraceum and A. italicum) display a generalist reproductive strategy, relying on the exploitation of a low number of dipterans, in contrast to the pattern seen in the specialist A. maculatum (pollinated specifically by only two fly species). Based on the model presented here, the SAM is predicted to apply to the first two species but not to the latter, and these predictions are further confirmed by allometric measurements. We demonstrate that while an increase in the female zone occurs in larger inflorescences of generalist species, this does not happen in species with specific pollinators. This is the first time that this theory has been both proposed and empirically tested in zoophilous plants. Its overall biological importance is discussed through its application to other non-Arum systems.

Relevance:

30.00%

Publisher:

Abstract:

The paper deals with the development and application of a methodology for the automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by analysing the raw data and the residuals using variography. Maps of the probability of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. A real case study is based on the mapping of radioactively contaminated territories.
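A GRNN of the kind proposed here is, at its core, Nadaraya-Watson kernel regression: a Gaussian-weighted average of training values with a single bandwidth sigma, tuned automatically by cross-validation. The sketch below illustrates the isotropic case on assumed synthetic data; the kernel choice, sigma grid and data are illustrative, and the paper's anisotropic variant would replace the scalar sigma with per-axis bandwidths.

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """GRNN prediction: a Gaussian-kernel-weighted average of training values."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ train_z) / w.sum(axis=1)

def loo_mse(train_xy, train_z, sigma):
    """Leave-one-out MSE used to tune sigma automatically."""
    d2 = ((train_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)             # each point predicted without itself
    pred = (w @ train_z) / w.sum(axis=1)
    return float(np.mean((pred - train_z) ** 2))

rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(300, 2))                        # sample locations
z = np.cos(xy[:, 0]) * np.sin(xy[:, 1]) + 0.05 * rng.normal(size=300)
sigmas = np.geomspace(0.1, 5.0, 30)                           # candidate bandwidths
best_sigma = min(sigmas, key=lambda s: loo_mse(xy, z, s))
```

The same leave-one-out machinery is what makes the tuning "automatic": no parameter other than the bandwidth grid needs to be supplied by the user.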

Relevance:

30.00%

Publisher:

Abstract:

Experimental results of a new controller able to support bidirectional power flow in a full-bridge rectifier with boost-like topology are obtained. The controller is computed using port Hamiltonian passivity techniques for a suitable generalized state space averaging truncation system, which transforms the control objectives, namely constant output voltage dc-bus and unity input power factor, into a regulation problem. Simulation results for the full system show the essential correctness of the simplifications introduced to obtain the controller, although some small experimental discrepancies point to several aspects that need further improvement.

Relevance:

30.00%

Publisher:

Abstract:

Research on the condition monitoring of electric motors has been extensive for several decades. Research and development at universities and in industry has provided the means for predictive condition monitoring, and many devices and systems have been developed and are widely used in industry, transportation and civil engineering. In addition, many methods have been developed and reported in scientific forums to improve existing methods for the automatic analysis of faults. These methods, however, are not widely used as part of condition monitoring systems. The main reasons are, firstly, that many methods are presented in scientific papers without their performance being evaluated in different conditions and, secondly, that the methods include parameters so case-specific that implementing a system using them would be far from straightforward. In this thesis, some of these methods are evaluated theoretically and tested with simulations and with a drive in a laboratory, and a new automatic analysis method for bearing fault detection is introduced. In the first part of this work, the generation of the signal originating from a bearing fault is explained and its influence on the stator current is examined with qualitative and quantitative estimation. The feasibility of the stator current measurement as a bearing fault indicator is verified experimentally with a running 15 kW induction motor. The second part of the work concentrates on bearing fault analysis using the vibration measurement signal. The performance of a micromachined silicon accelerometer chip in conjunction with envelope spectrum analysis of the cyclic bearing fault is experimentally tested. Furthermore, different methods for creating feature extractors for bearing fault classification are researched, and an automatic fault classifier using multivariate statistical discrimination and fuzzy logic is introduced.

It is often important that the on-line condition monitoring system is integrated with the industrial communications infrastructure. Two types of sensor solution are tested in the thesis: the first is a sensor with computation capacity, for example for producing the envelope spectra; the other collects the measurement data in memory so that another device can read the data via a field bus. The data communications requirements depend strongly on the type of sensor solution selected: if the data is already analysed in the sensor, communications are needed only for the results, but otherwise all measurement data must be transferred. The classification method can be complex if the data is analysed at a management-level computer, but if the analysis is made in the sensor itself it must be simple, owing to the restricted calculation and memory capacity.
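The envelope spectrum analysis mentioned above can be sketched as follows. Everything numeric here is an illustrative assumption, not a value from the thesis: the sampling rate, the structural resonance excited by each defect impact, the decay constant and the bearing fault frequency are all invented for the example; only the technique (analytic-signal envelope followed by a spectrum of the envelope) is the one named in the text.

```python
import numpy as np

fs, dur = 10_000, 1.0                      # assumed sampling rate (Hz) and record length (s)
t = np.arange(int(fs * dur)) / fs
fault_hz, resonance_hz, decay = 87.0, 2_000.0, 800.0   # assumed fault setup

# Each bearing defect impact excites a decaying structural resonance.
x = np.zeros_like(t)
for t0 in np.arange(0.0, dur, 1.0 / fault_hz):
    tail = t[t >= t0] - t0
    x[t >= t0] += np.exp(-decay * tail) * np.sin(2 * np.pi * resonance_hz * tail)
x += 0.05 * np.random.default_rng(2).normal(size=t.size)   # measurement noise

def envelope(sig):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = sig.size
    spec = np.fft.fft(sig)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(spec * h))

# The spectrum of the envelope reveals the impact repetition rate,
# i.e. the bearing fault frequency, rather than the carrier resonance.
env = envelope(x)
amp = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
band = (freqs > 10) & (freqs < 500)        # search below the resonance
detected = freqs[band][np.argmax(amp[band])]
```

This is exactly why the text pairs the accelerometer with envelope analysis: the raw spectrum is dominated by the high-frequency resonance, while the envelope spectrum isolates the low fault repetition frequency.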

Relevance:

30.00%

Publisher:

Abstract:

The development of wireless broadband communication technology has raised interest in its professional use for public safety and crisis management needs. In emergencies, existing fixed communication systems are often unavailable, or the capacity they offer is insufficient. For this reason, a need has emerged for rapidly deployable, self-contained wireless broadband systems. The purpose of this Master's thesis is to study wireless ad hoc multi-hop networks from the standpoint of public safety requirements and to implement a test bed on which the operation of such a system can be demonstrated and studied in practice. Point-to-point and, in particular, point-to-multipoint communication are examined. The measurements cover the test bed's data rate, transmit power and receiver sensitivity. These results are used as simulator parameters, so that the simulator results are as realistic as possible and consistent with the test bed. A selection of applications and application models matching public safety requirements is then chosen, and their performance is measured under different routing schemes, both on the test bed and in the simulator. The results are evaluated and compared. Multicast multi-hop video was chosen as the main subject of study among the applications, and it and its properties are also to be examined in real field trials.

Relevance:

30.00%

Publisher:

Abstract:

There is currently a considerable diversity of quantitative measures available for summarizing the results of single-case studies. Given that the interpretation of some of them is difficult due to the lack of established benchmarks, the current paper proposes an approach for obtaining further numerical evidence on the importance of the results, complementing the substantive criteria, visual analysis, and primary summary measures. This additional evidence consists of obtaining the statistical significance of the outcome when referred to the corresponding sampling distribution. This sampling distribution is formed by the values of the outcomes (expressed as data nonoverlap, R-squared, etc.) when the intervention is ineffective. The approach proposed here is intended to offer the outcome's probability of being as extreme when there is no treatment effect, without the need for assumptions that cannot be checked with guarantees. Following this approach, researchers would compare their outcomes to reference values rather than constructing the sampling distributions themselves. The integration of single-case studies is problematic when different metrics are used across primary studies and not all raw data are available. Via the approach for assigning p values it is possible to combine the results of similar studies regardless of the primary effect size indicator. The alternatives for combining probabilities are discussed in the context of single-case studies, pointing out two potentially useful methods: one based on a weighted average and the other on the binomial test.
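The two combination methods named at the end can be sketched as follows. The example p values, the weighting by study size and the 0.05 significance threshold are illustrative assumptions; the code shows the standard weighted Stouffer-type combination and the binomial-test combination in general, not the paper's specific formulation.

```python
import math
from statistics import NormalDist

def combine_weighted_stouffer(p_values, weights):
    """Weighted-average combination: convert each p value to a z score,
    take a weighted average, and convert back to a combined p value."""
    nd = NormalDist()
    zs = [nd.inv_cdf(1.0 - p) for p in p_values]
    z = sum(w * s for w, s in zip(weights, zs)) / math.sqrt(sum(w * w for w in weights))
    return 1.0 - nd.cdf(z)

def combine_binomial(p_values, alpha=0.05):
    """Binomial-test combination: probability of observing at least this many
    'significant' studies if every study's null hypothesis were true."""
    n = len(p_values)
    k = sum(p < alpha for p in p_values)
    return sum(math.comb(n, j) * alpha**j * (1 - alpha)**(n - j)
               for j in range(k, n + 1))

# Three hypothetical single-case studies, weighted by number of measurements.
ps = [0.02, 0.04, 0.30]
weights = [12, 20, 8]
p_stouffer = combine_weighted_stouffer(ps, weights)
p_binom = combine_binomial(ps)
```

Both routes work purely from the per-study p values, which is the point made in the text: the primary effect size indicators need not be on a common metric.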

Relevance:

30.00%

Publisher:

Abstract:

Nowadays the variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed in order to make the right decisions already during a test run. The obstacle to determining the boiler balance during test runs is the long process of chemically analysing the collected samples of input and output matter. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to obtain the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä, Tampere. The calculation was created in the data management computer of the pilot plant's automation system. It is made in a Microsoft Excel environment, which provides a good basis and functions for handling large databases and calculations without any delicate programming. The automation system of the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new system, MetsoDNA, has good data management properties, which is necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system.

A sensitivity analysis showed that the most essential values for an accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the work environment.
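The heat-balance route described above can be sketched as follows. All numbers are illustrative assumptions, not values from the pilot plant: the loss fractions, the lower heating value and the flue gas O2 content are invented, and the excess-air formula is the common dry-flue-gas approximation rather than the thesis's calculation.

```python
# Sketch of the heat-balance method: estimate the fuel mass flow from the
# measured heat output, the fuel's lower heating value (LHV) and an
# estimated unburned carbon loss, per Q_out = m_fuel * LHV * (1 - losses).

def fuel_mass_flow(heat_output_kw, lhv_mj_per_kg,
                   unburned_carbon_loss=0.01, other_losses=0.10):
    """Fuel flow (kg/s) from the boiler heat balance; loss fractions assumed."""
    efficiency = 1.0 - unburned_carbon_loss - other_losses
    return heat_output_kw / (lhv_mj_per_kg * 1000.0 * efficiency)

def excess_air_ratio(flue_gas_o2_pct):
    """Approximate excess-air ratio from dry flue gas O2 content (vol-%)."""
    return 21.0 / (21.0 - flue_gas_o2_pct)

m_fuel = fuel_mass_flow(heat_output_kw=4000.0, lhv_mj_per_kg=20.0)  # 4 MW boiler
lam = excess_air_ratio(3.5)
```

The sensitivity finding quoted above follows directly from this form: the fuel flow estimate scales linearly with the measured heat output and inversely with the LHV, and the O2 content fixes the combustion air and flue gas flows in the mass balance.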

Relevance:

30.00%

Publisher:

Abstract:

Amphibole fractionation in the deep roots of subduction-related magmatic arcs is a fundamental process for the generation of the continental crust. Field relations and geochemical data from exposed lower crustal igneous rocks can be used to better constrain these processes. The Chelan Complex in the western U.S. forms the lowest level of a 40 km thick exposed crustal section of the North Cascades and is composed of olivine websterite, pyroxenite, hornblendite, and dominantly hornblende gabbro and tonalite. Magmatic breccias, comb layers and intrusive contacts suggest that the Chelan Complex was built by igneous processes. Phase equilibria, textural observations and mineral chemistry yield emplacement pressures of ~1.0 GPa followed by isobaric cooling to 700 °C. The widespread occurrence of idiomorphic hornblende and interstitial plagioclase, together with the lack of Eu anomalies in bulk rock compositions, indicates that the differentiation is largely dominated by amphibole. Major and trace element modeling constrained by field observations and bulk chemistry demonstrates that peraluminous tonalite could be derived by successively removing 3% olivine websterite, 12% pyroxene hornblendite, 33% pyroxene hornblendite, 19% gabbro, 15% diorite and 2% tonalite. Peraluminous tonalites with high Sr/Y, which are associated worldwide with active margin settings, can be derived from a parental basaltic melt by crystal fractionation at high pressure, provided that amphibole dominates the fractionation process. Crustal assimilation during fractionation is thus not required to generate peraluminous tonalite.
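The fractionation sequence above lends itself to a simple mass-balance check. Two assumptions are made for illustration: the quoted removal percentages are read as fractions of the initial parental melt mass, and a bulk partition coefficient D is assumed for a generic incompatible trace element; the Rayleigh law C/C0 = F^(D-1) used below is the standard trace element fractionation equation, not a result from the paper.

```python
# Mass-balance sketch of the fractionation sequence: summing the removed
# cumulate fractions (assumed to be fractions of the initial melt mass)
# gives the residual melt fraction F, and the Rayleigh fractionation law
# C/C0 = F**(D - 1) then gives the trace element enrichment of that melt.

removed = [0.03, 0.12, 0.33, 0.19, 0.15, 0.02]   # websterite ... tonalite
F = 1.0 - sum(removed)                           # residual melt fraction

def rayleigh_enrichment(f, bulk_d):
    """Trace element enrichment C/C0 after Rayleigh crystal fractionation."""
    return f ** (bulk_d - 1.0)

enrichment = rayleigh_enrichment(F, bulk_d=0.1)  # assumed incompatible element
```

Under this reading, only about 16% of the parental melt survives as tonalite, which is why strongly incompatible elements end up several times enriched over the parental basalt without any crustal assimilation.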

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to develop a value innovation for a business network. To concretise and evaluate the theoretical framework of value innovation, the ProPortfolio software product and the experience gained in its development are used as the case. ProPortfolio is a software application based on mobile technology, developed in 2001-2004. It is an operations management system for the contractor network of single-family house projects, developed in a project led by the management consultancy Tuloskunto Oy and funded by TEKES. Achieving long-term competitive advantage by standing out from the mass of competitors is a growing challenge for companies. The starting point of the value innovation and strategic planning process developed by the researchers Kim and Mauborgne is to bypass competition by differentiating from competitors and creating new markets. According to traditional strategic thinking, companies can deliver higher value to customers at higher cost, or reasonable value at lower cost; that is, strategic positioning is based on a choice between differentiation and cost leadership (Porter, 1985). Positioning based on value innovation pursues both differentiation and cost leadership simultaneously. The most central criteria of value innovation were met in the ProPortfolio software product that resulted from the development project. On the basis of this work it can be stated that the strategy planning process developed by Kim and Mauborgne is broad and demanding. A value innovation rarely arises as a momentary insight; rather, it develops as the result of systematic work. The formation of the value innovation in the ProPortfolio software product was based on the development project's successful identification and anticipation of customer needs. Forming a clear picture of the problems to be solved helped position the further development of the project correctly and created the preconditions for the value innovation to emerge.

Kim and Mauborgne have challenged the traditional models of strategy work. Based on their research results, they have developed a new theoretical framework and process for strategic planning based on value innovation. These models will certainly leave a lasting mark on current strategy practice.