823 results for Management: Collection Evaluation
Abstract:
This report evaluates the use of remotely sensed images in implementing the Iowa DOT linear referencing system (LRS), which is currently in the system architecture stage. The Iowa Department of Transportation is investing a significant amount of time and resources in the creation of the LRS. A significant portion of the implementation effort will be the creation of a datum, which includes geographically locating anchor points and then measuring anchor section distances between those anchor points. Currently, system architecture and the evaluation of different data collection methods to establish the LRS datum are being performed for the DOT by an outside consulting team.
Abstract:
RATIONALE, AIMS AND OBJECTIVES: There is little evidence regarding the benefit of stress ulcer prophylaxis (SUP) outside a critical care setting, and overprescription of SUP is not devoid of risks. This prospective study aimed to evaluate the use of proton pump inhibitors (PPIs) for SUP in a general surgery department. METHOD: Data were collected prospectively by pharmacists during an 8-week period on patients hospitalized in a general surgery department (58 beds). Patients with a PPI prescription for the treatment of ulcers, gastro-oesophageal reflux disease, oesophagitis or epigastric pain were excluded. Patients admitted twice during the study period were not reincluded. The American Society of Health-System Pharmacists guidelines on SUP were used to assess the appropriateness of de novo PPI prescriptions. RESULTS: Among the 255 patients in the study, 138 (54%) received PPI prophylaxis, of which 86 (62%) were de novo PPI prescriptions. A total of 129 patients (94%) received esomeprazole (in accordance with the hospital drug policy), most frequently at a dosage of 40 mg once daily. Use of PPIs for SUP was evaluated in 67 patients. A total of 53 patients (79%) had no risk factors for SUP; twelve and two patients had one or two risk factors, respectively. At discharge, PPI prophylaxis was continued in 33% of patients with a de novo PPI prescription. CONCLUSIONS: This study highlights the overuse of PPIs in non-intensive-care-unit patients and the inappropriate continuation of PPI prescriptions at discharge. Treatment recommendations for SUP are needed to restrict PPI use to justified indications.
Abstract:
The objective of this work was to evaluate the use of multispectral remote sensing for site-specific nitrogen fertilizer management. Satellite imagery from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (Aster) was acquired over a 23 ha corn-planted area in Iran. For the collection of field samples, a total of 53 pixels were selected by systematic randomized sampling, and the total nitrogen content in corn leaf tissues in these pixels was measured. To predict corn canopy nitrogen content, different vegetation indices, such as the normalized difference vegetation index (NDVI), soil-adjusted vegetation index (Savi), optimized soil-adjusted vegetation index (Osavi), modified chlorophyll absorption ratio index 2 (MCARI2), and modified triangle vegetation index 2 (MTVI2), were investigated. Supervised classification with the spectral angle mapper (SAM) classifier was performed to generate a nitrogen fertilization map. MTVI2 showed the highest correlation (R²=0.87) and is a good predictor of corn canopy nitrogen content at the V13 stage, 60 days after planting. Aster imagery can thus be used to predict nitrogen status in the corn canopy. Classification results indicate three levels of required nitrogen per pixel: low (0-2.5 kg), medium (2.5-3 kg), and high (3-3.3 kg).
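The two simplest indices named in this abstract can be written out directly. The sketch below computes NDVI and Savi from red and near-infrared reflectance; the band values and the Savi soil-brightness factor L=0.5 are illustrative assumptions, not data from the study:

```python
# Sketch: NDVI and SAVI from red and near-infrared (NIR) reflectance bands.
# Reflectance values and the soil factor L=0.5 are illustrative assumptions.
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil-adjusted vegetation index with soil-brightness factor L."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (1 + L) * (nir - red) / (nir + red + L)

nir = np.array([0.45, 0.50])   # two example pixels
red = np.array([0.05, 0.10])
print(ndvi(nir, red))          # dense vegetation gives values near 1
```

Each index would be computed per pixel over the whole image and then correlated with the field-sampled leaf nitrogen, as the abstract describes for MTVI2.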
Abstract:
A collective expert assessment of the risks that nanomaterials pose to the general population and the environment identified several hundred mass-market consumer products containing nanomaterials in everyday life: textiles, cosmetics, foodstuffs, sports equipment, construction materials... New studies suggest that some of these products may pose risks to health and the environment. Faced with this uncertainty, Afsset recommends acting without delay in the name of the precautionary principle and proposes to: 1) make the traceability of nanomaterials mandatory, through a compulsory declaration by manufacturers; 2) introduce clear labelling that indicates the presence of nanomaterials in products and informs consumers about possible release during use; 3) go as far as banning certain uses of nanomaterials whose usefulness is low compared with their potential hazards; 4) harmonise the French and European regulatory frameworks to generalise best practices: declaration, authorisation, substitution. In particular, a revision of REACh is needed to cover manufactured nanomaterials specifically and regardless of tonnage. The assessment also makes recommendations for building a renewed health risk assessment methodology suited to the specific features of nanomaterials. To that end, Afsset tested standard risk assessment methodologies on four specific, everyday products: antibacterial socks (silver nanoparticles), self-cleaning cement and sunscreen (titanium dioxide nanoparticles), and nanoscale food-grade silica. These four products are representative of the human exposure routes (dermal, inhalation, ingestion) and of possible environmental dispersion.
This work reveals an urgent need to advance knowledge of exposures and the potential hazards of nanomaterials. Today, only 2% of published studies on nanomaterials concern their risks to health and the environment. The first effort should focus on standardising the characterisation of nanomaterials. Research priorities should target toxicology, ecotoxicology and exposure measurement. Finally, Afsset plans to launch its own follow-up work to define, within two years and with its working group, a simplified risk assessment tool: a risk-scoring grid that categorises products into several risk classes. Given the scale of this task, networking among European and international bodies to share the work is necessary. This has begun around the OECD, which coordinates risk assessment work, and ISO, which is developing new standards. [Authors]
Abstract:
This master's thesis investigates the implementation of real-time activity-based costing (ABC) in the information system of a Finnish SME that manufactures laser chips. It also examines the effects of activity-based costing on operations and on the management of activities. The literature part reviews activity-based costing theories, calculation methods, and the technologies used in the technical implementation. In the practical part, a web-based activity-based costing system was designed and implemented to support the case company's cost accounting and financial administration. The tool was integrated into the company's enterprise resource planning and manufacturing execution systems. In contrast to traditional data collection for activity-based costing models, the inputs to the costing system in the case company arrive in real time as part of a broader information system integration. The thesis seeks to establish a link between the requirements of activity-based costing and database systems. The company can use the costing system, for example, in product pricing and cost accounting by viewing product-related costs from different perspectives. Decisions can be based on accurate cost information, and the data produced by the system can be used to determine whether developing a particular project, customer relationship or product is economically viable.
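The core activity-based costing calculation behind such a system (driver rates derived from activity cost pools, then assigned to products by their driver consumption) can be sketched as follows; all activity names and figures below are invented for illustration:

```python
# Minimal activity-based costing sketch: each activity cost pool is divided by
# its driver volume to get a driver rate, then a product is costed by the
# driver quantities it consumes. All names and figures are illustrative.
activity_cost = {"laser_cutting": 12000.0, "inspection": 3000.0}   # euros
driver_volume = {"laser_cutting": 400, "inspection": 150}          # machine-hours, checks

driver_rate = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

def product_cost(consumption):
    """Overhead assigned to one product from its activity-driver usage."""
    return sum(driver_rate[a] * qty for a, qty in consumption.items())

chip_batch = {"laser_cutting": 5, "inspection": 2}
print(product_cost(chip_batch))  # 5*30 + 2*20 = 190.0
```

In a real-time setting like the one described, the consumption dictionary would be fed continuously from the ERP and manufacturing execution systems rather than entered by hand.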
Abstract:
The aim of this study is to examine the software delivery process of a target company operating in the telecommunications industry. The study focuses on modelling the delivery process, defining roles and responsibilities, identifying problem areas, and proposing improvements to the process. These goals are examined through theoretical process modelling techniques and the SECI framework of knowledge management. The most important source of data was an interview-based survey involving all the units that take part in the process under study. The modelled delivery process gave the target company a better understanding of the process in question and of the roles and responsibilities of the units involved in it. The proposed improvements included defining channels for knowledge sharing, strengthening trust and social networks, and implementing knowledge management on a broad scale.
Abstract:
Printed electronics is an emerging concept in electronics manufacturing and is at a very early stage of development. The technology is not stable, design kits are not developed, and flows and Computer Aided Design (CAD) tools are not fixed yet. The European project TDK4PE addresses all these issues, and this PFC (final-year project) was carried out in that context. The goal is to develop an XML-based information system for the collection and management of information from the technology and cell libraries developed in TDK4PE. This system will ease the treatment of that information for the later generation of specific Design Kits (DK) and the corresponding documentation. This work proposes a web application to generate technology files and design kits in a formatted way; it also proposes a structure for them and a database implementation for storing the needed information. The application will allow its users to redefine the structure of those files, as well as to export and import XML files, among other formats.
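A minimal sketch of how a cell-library description might be stored and read back as XML, in the spirit of the system described above; the element and attribute names (library, cell, param) are invented for illustration and are not the TDK4PE schema:

```python
# Sketch: a cell library serialized as XML and parsed into a plain dict.
# Element/attribute names are illustrative assumptions, not a real schema.
import xml.etree.ElementTree as ET

doc = """<library name="demo_cells">
  <cell name="inv1">
    <param key="width" value="120"/>
  </cell>
</library>"""

root = ET.fromstring(doc)
# map each cell name to its parameter key/value pairs
cells = {c.get("name"): {p.get("key"): p.get("value") for p in c.findall("param")}
         for c in root.findall("cell")}
print(cells)  # {'inv1': {'width': '120'}}
```

Exporting is the reverse direction: building an `Element` tree from the database contents and serializing it with `ET.tostring`, which is what lets users round-trip the technology files.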
Abstract:
BACKGROUND: People with neurological disease have a much higher risk of both faecal incontinence and constipation than the general population. There is often a fine dividing line between the two conditions, with management intended to ameliorate one risking precipitating the other. Bowel problems are observed to be the cause of much anxiety and may reduce quality of life in these people. Current bowel management is largely empirical, with a limited research base. OBJECTIVES: To determine the effects of management strategies for faecal incontinence and constipation in people with neurological diseases affecting the central nervous system. SEARCH STRATEGY: We searched the Cochrane Incontinence Group Trials Register, the Cochrane Controlled Trials Register, MEDLINE, EMBASE and all reference lists of relevant articles. Date of the most recent searches: May 2000. SELECTION CRITERIA: All randomised or quasi-randomised trials evaluating any type of conservative or surgical measure for the management of faecal incontinence and constipation in people with neurological diseases were selected. Specific therapies for the treatment of neurological diseases that indirectly affect bowel dysfunction were also considered. DATA COLLECTION AND ANALYSIS: All three reviewers assessed the methodological quality of eligible trials, and two reviewers independently extracted data from included trials using a range of pre-specified outcome measures. MAIN RESULTS: Only seven trials were identified by the search strategy, and all were small and of poor quality. Oral medications for constipation were the subject of four trials. Cisapride does not seem to have clinically useful effects in people with spinal cord injuries (two trials). Psyllium was associated with increased stool frequency in people with Parkinson's disease but did not alter colonic transit time (one trial). Some rectal preparations to initiate defecation produced faster results than others (one trial).
Different time schedules for administration of rectal medication may produce different bowel responses (one trial). Mechanical evacuation may be more effective than oral or rectal medication (one trial). The clinical significance of any of these results is difficult to interpret. REVIEWER'S CONCLUSIONS: It is not possible to draw any recommendation for bowel care in people with neurological diseases from the trials included in this review. Bowel management for these people must remain empirical until well-designed controlled trials with adequate numbers and clinically relevant outcome measures become available.
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly in Switzerland since its reintroduction at the beginning of the 20th century, was used as the paradigm species.
This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and colonisation of new areas. Habitat suitability and obstacles to dispersal therefore also had to be modelled, so a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and results validation and further processing; one module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. ENFA was compared to a common habitat suitability modelling method, the Generalised Linear Model (GLM), and proved better suited to spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled as a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density.
The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software package named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper), and (2) running simulations. It allows studying the spread of an invading species across a complex landscape made up of areas of varying suitability and dispersal barriers. This model was applied to the history of ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these tools were designed to build a complex, realistic model from raw data, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed with these approaches.
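The deterministic core of an age-structured Leslie-matrix projection like the one underlying SIM-Ibex can be sketched as follows; the fecundities, survival rates and initial abundances are invented illustrative values, and the density dependence, stochasticity and culling that the study adds are omitted:

```python
# Sketch of a plain age-structured Leslie-matrix projection (no density
# dependence, stochasticity or culling). All vital rates are illustrative.
import numpy as np

fecundity = [0.0, 0.8, 1.1]        # offspring per individual in each age class
survival = [0.6, 0.75]             # probability of advancing to the next class

L = np.zeros((3, 3))
L[0, :] = fecundity                # top row: reproduction into age class 0
L[1, 0], L[2, 1] = survival        # sub-diagonal: survival to the next class

n = np.array([100.0, 50.0, 20.0])  # initial abundances per age class
for _ in range(5):                 # project five time steps
    n = L @ n                      # one step: n(t+1) = L n(t)
print(n, n.sum())
```

Density dependence is typically added by making the vital rates functions of total abundance, and culling by subtracting harvested individuals from the relevant age classes after each projection step.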
Abstract:
Due to intense international competition, demanding and sophisticated customers, and diverse, transformative technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex, but vital for many organizations to survive in a dynamic, turbulent environment. Thus, the increased interest among decision-makers in finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: strategic control, justifying the existence of R&D, providing information and improving activities, as well as motivation and benchmarking. The earlier research in the field of R&D performance analysis has generally focused either on the activities and relevant factors and dimensions - e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process - prior to the selection of R&D performance measures, or on proposed principles or actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of essential factors and dimensions of R&D performance analysis into developed selection processes of R&D measures, which have been applied in real-world organizations. The earlier models for corporate performance measurement found in the literature are to some extent also adaptable to developing measurement systems and selecting measures in R&D activities.
However, special aspects related to the measurement of R&D performance make the development of new approaches, especially for R&D performance measure selection, necessary. First, the special characteristics of R&D - such as the long time lag between inputs and outcomes, as well as the overall complexity and difficult coordination of activities - influence the R&D performance analysis problems, such as the need for more systematic, objective, balanced and multi-dimensional approaches to R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Secondly, the above-mentioned characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel types of approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support management in their R&D decision making with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on promoting the selection and development process of R&D indicators with the help of different tools and decision support systems, i.e. the research has normative features, providing guidelines through novel types of approaches.
Gathering data and conducting case studies in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations, which is essential, as recognition of the most important problem areas is a crucial element in the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various constructed approaches presented in this dissertation: 1) the selection of R&D measures became more systematic when compared to the empirical analysis, as it was common that no systematic approaches had been utilized in the studied organizations earlier; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be more directly utilized in decision-making, because of the thorough consideration of the purpose of measurement, as well as other dimensions of measurement; 3) more balance in the set of R&D measures was desired and gained through the holistic approaches to the selection processes; and 4) more objectivity was gained through organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to contribute to the present body of knowledge of R&D performance analysis by facilitating dealing with the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the developed novel types of approaches, methods and tools in the selection processes of R&D measures, applied in real-world organizations.
Throughout the research, facilitation of dealing with the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing R&D performance measure selection, is strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from both the scientific and the practical point of view.
Abstract:
The study consists of four articles dealing with the innovativeness of Finnish small and medium-sized manufacturing enterprises (SMEs), its attributes, and its indicators. It examines the definitions of innovativeness presented in the literature as well as those given in interviews with SME managers and with business analysts who take part in funding decisions on SME development projects. As innovativeness indicators, the financing and evaluation criteria applied to SME development projects are examined from the perspectives of both external financiers and SME managers. Particular attention is paid to the qualitative and non-numerical criteria used to assess innovativeness. Both in the literature and in the interviews with ten business analysts and the managers of six case companies, the novelty of an innovation is associated with innovativeness. Other important attributes linked to innovativeness were the market, differentiation from other companies, and the creativity of individuals. People-centred, individual-level perspectives are emphasised in the definitions given by the business analysts and SME managers, whereas the literature gives more weight to the environment, products and markets. The business analysts considered factors related to the company and its manager important input criteria when evaluating development projects for funding. From the financier's point of view, the commercial success of the products was the most important output factor. In the case companies examined, decision making on and evaluation of development projects is intuitive and may even be unconscious, because the companies' development activity is limited. Small-business managers emphasise financial measures in evaluation, although both numerical and qualitative criteria are applied. The most likely reason for this is the limited financial resources of small firms. Another possible reason for the emphasis on financial factors is that today's ideal manager is understood to be analytical and, among other things, to monitor cash flows. Nevertheless, innovative managers regard the creation of innovations as one of the fun sides of life. The innovative case companies are future- and growth-oriented at the strategic level. At the operational level they produce inventions and innovations, yet the companies examined hold few patents. Both the innovative and the less innovative case companies are strongly customer-oriented and specialised in particular products and customers. Customer needs are met by developing products that correspond to them. As a result, the bulk of the companies' development activity is directed at products or production.
Abstract:
The objective of the thesis is to structure and model the factors that contribute to and can be used in evaluating project success. The purpose of this thesis is to enhance the understanding of three research topics. The goal setting process, success evaluation and decision-making process are studied in the context of a project, business unitand its business environment. To achieve the objective three research questionsare posed. These are 1) how to set measurable project goals, 2) how to evaluateproject success and 3) how to affect project success with managerial decisions.The main theoretical contribution comes from deriving a synthesis of these research topics which have mostly been discussed apart from each other in prior research. The research strategy of the study has features from at least the constructive, nomothetical, and decision-oriented research approaches. This strategy guides the theoretical and empirical part of the study. Relevant concepts and a framework are composed on the basis of the prior research contributions within the problem area. A literature review is used to derive constructs of factors withinthe framework. They are related to project goal setting, success evaluation, and decision making. On the basis of this, the case study method is applied to complement the framework. The empirical data includes one product development program, three construction projects, as well as one organization development, hardware/software, and marketing project in their contexts. In two of the case studiesthe analytic hierarchy process is used to formulate a hierarchical model that returns a numerical evaluation of the degree of project success. It has its origin in the solution idea which in turn has its foundation in the notion of projectsuccess. The achieved results are condensed in the form of a process model thatintegrates project goal setting, success evaluation and decision making. 
The process of project goal setting is analysed as a part of an open system that includes a project, the business unit and its competitive environment. Four main constructs of factors are suggested. First, the project characteristics and requirements are clarified. The second and the third construct comprise the components of client/market segment attractiveness and sources of competitive advantage. Together they determine the competitive position of a business unit. Fourth, the relevant goals and the situation of a business unit are clarified to stress their contribution to the project goals. Empirical evidence is gained on the exploitation of increased knowledge and on the reaction to changes in the business environment during a project to ensure project success. The relevance of a successful project to a company or a business unit tends to increase the higher the reference level of project goals is set. However, normal performance or sometimes performance below this normal level is intentionally accepted. Success measures make project success quantifiable. There are result-oriented, process-oriented and resource-oriented success measures. The study also links result measurements to enablers that portray the key processes. The success measures can be classified into success domains determining the areas on which success is assessed. Empiricalevidence is gained on six success domains: strategy, project implementation, product, stakeholder relationships, learning situation and company functions. However, some project goals, like safety, can be assessed using success measures that belong to two success domains. For example a safety index is used for assessing occupational safety during a project, which is related to project implementation. Product safety requirements, in turn, are connected to the product characteristics and thus to the product-related success domain. Strategic success measures can be used to weave the project phases together. 
Empirical evidence on their static nature is gained. In order-oriented projects the project phases are often contractually divided among different suppliers or contractors. A project from the supplier's perspective can represent only a part of the "whole project" viewed from the client's perspective. Therefore static success measures are mostly used within the contractually agreed project scope and duration. Proof is also acquired on the dynamic use of operational success measures. They help to focus on the key issues during each project phase. Furthermore, it is shown that the original success domains and success measures, their weights and target values can change dynamically. New success measures can replace the old ones to correspond better with the emphasis of the particular project phase. This adjustment concentrates on the key decision milestones. In conclusion, the study suggests a combination of static and dynamic success measures. Their linkage to an incentive system can make project management proactive, enable fast feedback and enhance the motivation of the personnel. It is argued that the sequence of effective decisions is closely linked to the dynamic control of project success. According to the definition used, effective decisions aim at adequate decision quality and decision implementation. The findings support the view that project managers construct and use a chain of key decision milestones to evaluate and affect success during a project. These milestones can be seen as part of the business processes. Different managers prioritise the key decision milestones to varying degrees. Divergent managerial perspectives, power, responsibilities and involvement during a project offer some explanation for this. Finally, the study introduces the use of Hard Gate and Soft Gate decision milestones. The managers may use the former milestones to provide decision support on result measurements and ad hoc critical conditions.
At the latter milestones they may also make intermediate success evaluations on the basis of other types of success measures, such as process and resource measures.
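The analytic hierarchy process mentioned in the abstract can be illustrated with a minimal sketch: pairwise comparisons between success domains yield priority weights, which are combined with per-domain scores into one overall figure. The domain names, comparison values and scores below are hypothetical examples, not data from the thesis.

```python
import numpy as np

# Illustrative Saaty-scale pairwise comparison matrix for three hypothetical
# success domains (strategy, implementation, product). Entry [i, j] states how
# much more important domain i is judged to be than domain j.
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Approximate the principal eigenvector with row geometric means, a common
# shortcut for deriving AHP priority weights; normalise so they sum to 1.
geo_means = pairwise.prod(axis=1) ** (1 / pairwise.shape[1])
weights = geo_means / geo_means.sum()

# Hypothetical per-domain success scores on a 0..1 scale.
scores = np.array([0.8, 0.6, 0.9])

# Overall degree of project success as a weighted sum of domain scores.
overall = float(weights @ scores)
print(round(overall, 3))
```

In a full AHP model the hierarchy would have more levels (goal, domains, measures) and a consistency check on each comparison matrix; this sketch shows only the core weighting-and-aggregation step.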
Resumo:
Although usability evaluations have focused on assessing different contexts of use, no proper specifications have addressed the particular environment of academic websites in the Spanish-speaking context of use. Considering that this context involves hundreds of millions of potential users, the AIPO Association is running the UsabAIPO Project. The ultimate goal is to promote an adequate translation of international standards, methods and ideal values related to usability in order to adapt them to diverse Spanish-related contexts of use. This article presents the main statistical results from the Second and Third Stages of the UsabAIPO Project, in which the UsabAIPO Heuristic method (based on Heuristic Evaluation techniques) and seven Cognitive Walkthroughs were performed over 69 university websites. The planning and execution of the UsabAIPO Heuristic method and the Cognitive Walkthroughs, the definition of two usability metrics, and the outline of the UsabAIPO Heuristic Management System prototype are also sketched.
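How a heuristic evaluation is turned into a usability metric can be sketched briefly: evaluators assign a severity rating to each heuristic, and the ratings are aggregated into a single score per site. The heuristics, rating scale usage and aggregation rule below are hypothetical illustrations, not the metrics defined by the UsabAIPO Project.

```python
# Illustrative aggregation of heuristic-evaluation results into a usability
# metric. Severity scale (Nielsen-style): 0 = no problem .. 4 = catastrophe.
# One rating per evaluator for each heuristic; all values are invented.
ratings = {
    "visibility of system status": [1, 2, 1],
    "match with the real world": [0, 1, 0],
    "error prevention": [3, 4, 3],
}

def severity_index(per_evaluator):
    """Mean severity for one heuristic across all evaluators."""
    return sum(per_evaluator) / len(per_evaluator)

# Site-level metric: mean severity over all heuristics, rescaled so that
# 1.0 means no problems found and 0.0 means catastrophic throughout.
mean_severity = sum(severity_index(r) for r in ratings.values()) / len(ratings)
usability_score = 1.0 - mean_severity / 4.0
print(round(usability_score, 3))
```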
Resumo:
Precision Viticulture (PV) is a concept that is beginning to have an impact on the wine-growing sector. Its practical implementation depends on various technological developments: crop sensors and yield monitors, local and remote sensors, Global Positioning Systems (GPS), Variable-Rate Application (VRA) equipment and machinery, Geographic Information Systems (GIS) and systems for data analysis and interpretation. This paper reviews a number of research lines related to PV. These areas of research have focused on four very specific fields: 1) quantification and evaluation of within-field variability, 2) delineation of zones of differential treatment at parcel level, based on the analysis and interpretation of this variability, 3) development of Variable-Rate Technologies (VRT) and, finally, 4) evaluation of the opportunities for site-specific vineyard management. Research in these fields should allow winegrowers and enologists to know and understand why yield variability exists within the same parcel, what the causes of this variability are, how the yield and its quality are interrelated and, if spatial variability exists, whether site-specific vineyard management is justifiable on a technical and economic basis.
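The first two research fields above (quantifying within-field variability and delineating treatment zones) can be sketched with a toy example: a coefficient of variation summarises yield variability, and a median split stands in for zone delineation. The yield values and the two-zone median rule are illustrative assumptions, far simpler than the geostatistical methods actually used in PV research.

```python
import statistics

# Hypothetical per-cell yield map for one vineyard parcel (kg per cell);
# values are invented for illustration, not measurements from the paper.
yields = [4.1, 3.8, 5.6, 6.2, 2.9, 4.7, 5.1, 3.3, 6.0]

# 1) Quantify within-field variability with the coefficient of variation
#    (sample standard deviation divided by the mean).
mean_y = statistics.mean(yields)
cv = statistics.stdev(yields) / mean_y

# 2) Delineate two differential-treatment zones by splitting the parcel at
#    the median yield -- a crude stand-in for real zoning algorithms.
median_y = statistics.median(yields)
zones = ["high" if y >= median_y else "low" for y in yields]

print(round(cv, 3), zones.count("high"), zones.count("low"))
```

A high CV would suggest that site-specific management is worth investigating further; a near-zero CV would argue for uniform treatment of the parcel.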
Resumo:
BACKGROUND: Therapeutic hypothermia following hypoxic ischaemic encephalopathy (HIE) in term infants was introduced into Switzerland in 2005. Initial documentation of perinatal and resuscitation details was poor and neuromonitoring insufficient. In 2011, a National Asphyxia and Cooling Register was introduced. AIMS: To compare the management of cooled infants before and after introduction of the register with respect to documentation, neuromonitoring and cooling methods, and to evaluate temperature variability between cooling methods. STUDY DESIGN: Data on cooled infants before the register was in place (first time period: 2005-2010) and afterwards (second time period: 2011-2012) were collected with a case report form. RESULTS: 150 infants were cooled during the first time period and 97 during the second. Most infants were cooled passively or passively with gel packs during both time periods (82% in 2005-2010 vs 70% in 2011-2012); however, more infants were cooled actively during the second time period (18% versus 30%). Overall there was a significant reduction in temperature variability (p < 0.001) between the two time periods. A significantly higher proportion of temperature measurements within the target range (72% versus 77%, p < 0.001), fewer measurements above the target range (24% versus 7%, p < 0.001) and more below it (4% versus 16%, p < 0.001) were recorded during the second time period. Neuromonitoring improved after introduction of the cooling register. CONCLUSION: Management of infants with HIE has improved since the introduction of the register. Temperature variability was reduced, and more temperature measurements in the target range and fewer above the target range were observed. Neuromonitoring has improved; however, imaging should be performed more often.
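The percentages of measurements within, above and below the target range reported in this abstract come from a simple classification of each temperature reading against the cooling target. A minimal sketch of that calculation, using an invented series of readings and an assumed 33-34 degC target range (not register data):

```python
# Hypothetical target range and temperature readings (degC) for one cooled
# infant; none of these numbers are taken from the Swiss register.
TARGET_LOW, TARGET_HIGH = 33.0, 34.0
measurements = [33.5, 33.8, 34.2, 33.1, 32.8, 33.6, 34.5, 33.9, 33.2, 33.4]

# Classify every reading against the target range.
in_range = sum(TARGET_LOW <= t <= TARGET_HIGH for t in measurements)
above = sum(t > TARGET_HIGH for t in measurements)
below = sum(t < TARGET_LOW for t in measurements)

n = len(measurements)
print(f"in range {100 * in_range / n:.0f}%, "
      f"above {100 * above / n:.0f}%, below {100 * below / n:.0f}%")
```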