878 results for Diagnosis support systems


Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Exposure to combination antiretroviral therapy (cART) can lead to important metabolic changes and an increased risk of coronary heart disease (CHD). Computerized clinical decision support systems have been advocated to improve the management of patients at risk for CHD, but it is unclear whether such systems reduce patients' risk for CHD. METHODS: We conducted a cluster trial within the Swiss HIV Cohort Study (SHCS) of HIV-infected patients, aged 18 years or older, not pregnant and receiving cART for >3 months. We randomized 165 physicians to either guidelines for CHD risk factor management alone or guidelines plus CHD risk profiles. Risk profiles included the Framingham risk score, CHD drug prescriptions and CHD events based on biannual assessments, and were continuously updated by the SHCS data centre and integrated into patient charts by study nurses. Outcome measures were total cholesterol, systolic and diastolic blood pressure and Framingham risk score. RESULTS: A total of 3,266 patients (80% of those eligible) had a final assessment of the primary outcome at least 12 months after the start of the trial. Mean (95% confidence interval) differences in patients whose physicians received CHD risk profiles and guidelines, rather than guidelines alone, were total cholesterol -0.02 mmol/l (-0.09 to 0.06), systolic blood pressure -0.4 mmHg (-1.6 to 0.8), diastolic blood pressure -0.4 mmHg (-1.5 to 0.7) and Framingham 10-year risk score -0.2% (-0.5 to 0.1). CONCLUSIONS: Systematic computerized routine provision of CHD risk profiles in addition to guidelines does not significantly improve risk factors for CHD in patients on cART.

Relevance:

80.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features ("geo-features"). They are well suited to be implemented as predictive engines in decision support systems for environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to detect the presence of spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters with the following structure: theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed during the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with care taken to provide a user-friendly and easy-to-use interface.
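Since the abstract singles out the GRNN for automatic mapping, a minimal sketch may be useful: a GRNN is essentially Nadaraya-Watson kernel regression with a single smoothing parameter (sigma), typically tuned by cross-validation. The coordinates and values below are invented toy data, not the SIC 2004 dataset.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """General Regression Neural Network (Nadaraya-Watson) prediction.

    Each training point casts a Gaussian-weighted vote; the prediction
    is the weighted mean of the training values.
    """
    # Squared Euclidean distances between every query and training point.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # kernel weights
    return (w @ y_train) / w.sum(axis=1)      # weighted average per query

# Toy spatial data: (x, y) coordinates with one measured value per point.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

grid = np.array([[2.0, 5.0], [7.5, 1.0]])
print(grnn_predict(X, y, grid, sigma=0.8))    # interpolated values
```

The single smoothing parameter is what makes the GRNN attractive for automatic mapping: it can be selected by leave-one-out cross-validation without manual variogram fitting.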

Relevance:

80.00%

Publisher:

Abstract:

Integral abutment bridges are constructed without an expansion joint in the superstructure of the bridge; therefore, the bridge girders, deck, abutment diaphragms, and abutments are monolithically constructed. The abutment piles in an integral abutment bridge are vertically oriented and embedded into the pile cap. When this type of bridge experiences thermal expansion or contraction, horizontal displacements are induced at the top of the abutment piles. The flexibility of the abutment piles eliminates the need to provide an expansion joint at the inside face of the abutments. Integral abutment bridge construction has been used in Iowa and other states for many years. This research is evaluating the performance of integral abutment bridges by investigating thermally induced displacements, strains, and temperatures in two Iowa bridges. Each bridge has a skewed alignment, contains five prestressed concrete girders that support a 30-ft wide roadway for three spans, and involves a water crossing. The bridges will be monitored for about two years. For each bridge, an instrumentation package includes measurement devices and hardware and software support systems. The measurement devices are displacement transducers, strain gages, and thermocouples. The hardware and software systems include a data logger; multiplexers; direct-line telephone service and a computer terminal modem; direct-line electrical power; a laptop computer; and an assortment of computer programs for monitoring, transmitting, and management of the data. Instrumentation has been installed on a bridge located in Guthrie County, and similar instrumentation is currently being installed on a bridge located in Story County. Preliminary test results for the bridge located in Guthrie County have revealed that temperature changes of the bridge deck and girders induce both longitudinal and transverse displacements of the abutments and significant flexural strains in the abutment piles. For an average temperature range of 73°F for the superstructure concrete in the bridge located in Guthrie County, the change in the bridge length was about 1 1/8 in., and the maximum strong-axis flexural strain range for one of the abutment piles was about 400 microstrains, which corresponds to a stress range of about 11,600 psi.
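The reported figures can be sanity-checked with the standard thermal-expansion and Hooke's-law relations. The sketch below is a back-of-the-envelope verification assuming a concrete thermal expansion coefficient of about 6.0e-6/°F and a steel modulus of 29e6 psi, typical handbook values not stated in the abstract.

```python
# Back-of-the-envelope check of the reported Guthrie County measurements.
ALPHA_CONCRETE = 6.0e-6   # assumed thermal expansion coefficient, 1/degF
E_STEEL = 29.0e6          # assumed elastic modulus of steel piles, psi

delta_T = 73.0            # reported average temperature range, degF
delta_L = 1.125           # reported change in bridge length, in (1 1/8 in.)

# delta_L = alpha * L * delta_T  =>  implied expanding length L:
L = delta_L / (ALPHA_CONCRETE * delta_T)
print(f"implied bridge length: {L:.0f} in = {L / 12.0:.0f} ft")

# Pile stress range from the measured strain range (Hooke's law):
strain_range = 400e-6     # reported maximum flexural strain range
print(f"pile stress range: {E_STEEL * strain_range:.0f} psi")  # 11600, as reported
```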

Relevance:

80.00%

Publisher:

Abstract:

The circadian timing system controls cell cycle, apoptosis, drug bioactivation, and transport and detoxification mechanisms in healthy tissues. As a consequence, the tolerability of cancer chemotherapy varies up to severalfold as a function of the circadian timing of drug administration in experimental models. Best antitumor efficacy of single-agent or combination chemotherapy usually corresponds to the delivery of anticancer drugs near their respective times of best tolerability. Mathematical models reveal that such coincidence between chronotolerance and chronoefficacy is best explained by differences in the circadian and cell cycle dynamics of host and cancer cells, especially with regard to circadian entrainment and cell cycle variability. In the clinic, a large improvement in tolerability was shown in international randomized trials where cancer patients received the same sinusoidal chronotherapy schedule over 24 h, as compared to constant-rate infusion or wrongly timed chronotherapy. However, sex, genetic background, and lifestyle were found to influence optimal chronotherapy scheduling. These findings support systems biology approaches to cancer chronotherapeutics. They involve the systematic experimental mapping and modeling of chronopharmacology pathways in synchronized cell cultures and their adjustment to mouse models of both sexes and distinct genetic backgrounds, as recently shown for irinotecan. Model-based personalized circadian drug delivery aims at jointly improving the tolerability and efficacy of anticancer drugs based on the circadian timing system of individual patients, using dedicated circadian biomarker and drug delivery technologies.
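As an illustration of the "sinusoidal chronotherapy schedule over 24 h" mentioned above, chronomodulated infusion is often parameterized as a raised cosine peaking at the drug's time of best tolerability. The sketch below is a generic illustration under that assumption; the dose and peak time are placeholders, not the schedule used in the cited trials.

```python
import math

def sinusoidal_rate(t, daily_dose=180.0, peak_hour=4.0):
    """Raised-cosine delivery rate (mg/h) at clock time t (hours).

    Integrates to daily_dose over 24 h; maximal at peak_hour and
    zero 12 h away from it.
    """
    return (daily_dose / 24.0) * (1.0 + math.cos(2.0 * math.pi * (t - peak_hour) / 24.0))

# Hourly profile for a hypothetical drug best tolerated around 04:00:
for t in range(0, 24, 4):
    print(f"{t:02d}:00  {sinusoidal_rate(t):6.2f} mg/h")
```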

Relevance:

80.00%

Publisher:

Abstract:

The aim of the thesis is to develop a process for managing a company's strategic investments so that the company's strategic architecture tracks the continuously changing critical success factors of dynamic markets. The thesis offers a solution to the uncertainty, complexity and internal conflicts faced by strategic investments by creating a process based on dynamic capabilities, implemented with group decision support systems and drawing on expert knowledge. The company's strategic architecture can be modeled with a scenario-based strategy map, i.e. a capability map. The QFD and AHP models embedded in the capability map make it possible to value strategic investments against the critical success factors of the markets. The lead-user and scenario-planning phases, grounded in dynamic capabilities, in turn enable the creation of a flexible investment strategy. The thesis shows that a strategic investment management process exploiting dynamic capabilities and group decision support systems offers a solution to the challenges faced by strategic investment decisions. The capability-map-based model for optimizing the strategic architecture was considered realistic and credible, and to emphasize the strategic effects of investments.
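Since the capability map relies on AHP to weight investments against critical success factors, a minimal sketch of the standard AHP priority calculation is shown below: a reciprocal pairwise comparison matrix is reduced to its principal eigenvector, and Saaty's consistency ratio flags incoherent judgments. The matrix values are illustrative assumptions, not data from the thesis.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights and consistency ratio for an AHP pairwise matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized priority vector
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
    return w, ci / ri

# Illustrative comparison of three investment criteria (reciprocal matrix):
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, cr = ahp_priorities(A)
print(weights, f"CR={cr:.3f}")                  # CR < 0.10 is conventionally acceptable
```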

Relevance:

80.00%

Publisher:

Abstract:

The markets of biomass for energy are developing rapidly and becoming more international. A remarkable increase in the use of biomass for energy needs parallel and positive development in several areas, and there will be plenty of challenges to overcome. The main objective of the study was to clarify the alternative future scenarios for the international biomass market until the year 2020 and, based on the scenario process, to identify the underlying steps needed towards a vital, working and sustainable biomass market for energy purposes. Two scenario processes were conducted for this study: the first was carried out with a group of Finnish experts and the second involved an international group. A heuristic, semi-structured approach, including the use of preliminary questionnaires as well as manual and computerised group support systems (GSS), was applied in the scenario processes. The scenario processes reinforced the picture of the future of international biomass and bioenergy markets as a complex and multi-layered subject. The scenarios estimated that the biomass market will develop, grow rapidly and diversify in the future. The results of the scenario process also opened up new discussion and provided new information and collective expert views for the purposes of policy makers. An overall view resulting from this scenario analysis is the enormous opportunity relating to the utilisation of biomass as a resource for global energy use in the coming decades. The scenario analysis shows the key issues in the field: global economic growth including the growing need for energy, environmental forces in the global evolution, possibilities of technological development to solve global problems, capabilities of the international community to find solutions for global issues, and the complex interdependencies of all these driving forces. The results of the scenario processes provide a starting point for further research analysing the technological and commercial aspects related to the scenarios and foreseeing the scales and directions of biomass streams.

Relevance:

80.00%

Publisher:

Abstract:

Due to intense international competition, demanding and sophisticated customers, and diverse, transforming technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex but vital for many organizations to survive in a dynamic, turbulent environment. Thus, the increased interest among decision-makers in finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: for strategic control, for justifying the existence of R&D, for providing information and improving activities, as well as for motivating and benchmarking. Earlier research in the field of R&D performance analysis has generally focused either on the activities and considerable factors and dimensions - e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process - prior to the selection of R&D performance measures, or on proposed principles or the actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of essential factors and dimensions of R&D performance analysis into developed selection processes of R&D measures, which have been applied in real-world organizations. The earlier models for corporate performance measurement found in the literature are to some extent adaptable to the development of measurement systems and the selection of measures in R&D activities. However, it is necessary to emphasize the special aspects related to the measurement of R&D performance that make the development of new approaches, especially for R&D performance measure selection, necessary. First, the special characteristics of R&D - such as the long time lag between inputs and outcomes, as well as the overall complexity and difficult coordination of activities - give rise to R&D performance analysis problems, such as the need for more systematic, objective, balanced and multi-dimensional approaches to R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Secondly, the above-mentioned characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel types of approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support management in their R&D decision-making with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on promoting the selection and development process of R&D indicators with the help of different tools and decision support systems; i.e., the research has normative features, providing guidelines through novel types of approaches.
The gathering of data and the case studies conducted in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations, which is essential, as recognition of the most important problem areas is a crucial element in the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various constructed approaches presented in this dissertation: 1) the selection of R&D measures became more systematic when compared to the empirical analysis, as it was common that no systematic approaches had been utilized in the studied organizations earlier; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be utilized more directly in decision-making, because of the thorough consideration of the purpose of measurement, as well as other dimensions of measurement; 3) more balance in the set of R&D measures was desired and gained through the holistic approaches to the selection processes; and 4) more objectivity was gained through organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to contribute to the present body of knowledge of R&D performance analysis by facilitating the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the developed novel types of approaches, methods and tools in the selection processes of R&D measures, applied in real-world organizations. Throughout the research, dealing with the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing R&D performance measure selection, is strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from both the scientific and the practical point of view.

Relevance:

80.00%

Publisher:

Abstract:

The diffusion of mobile telephony began in 1971 in Finland, when the first car phones, called ARP, were taken into use. Technologies changed from ARP to NMT and later to GSM. The main application of the technology, however, was voice transfer. The birth of the Internet created an open public data network and easy access to other types of computer-based services over networks. Telephones had been used as modems, but the development of cellular technologies enabled automatic access from mobile phones to the Internet. Other wireless technologies, for instance wireless LANs, were also introduced. Telephony had developed from analog to digital in fixed networks, which allowed easy integration of fixed and mobile networks. This development opened a completely new functionality to computers and mobile phones. It also initiated the merger of the information technology (IT) and telecommunication (TC) industries. Despite the arising opportunities for new competition between firms, applications based on the new functionality were rare. Furthermore, technology development combined with innovation can be disruptive to industries. This research focuses on the new technology's impact on competition in the ICT industry through understanding the strategic needs and alternative futures of the industry's customers. The speed of change in the ICT industry is high, and therefore it was valuable to integrate the dynamic capability view of the firm into this research. Dynamic capabilities are an application of the resource-based view (RBV) of the firm. As is stated in the literature, strategic positioning complements the RBV. This theoretical framework leads the research to focus on three areas: customer strategic innovation and business model development, external future analysis, and process development combining these two. The theoretical contribution of the research is the development of a methodology integrating the theories of the RBV, dynamic capabilities and strategic positioning. The research approach has been constructive, due to the actual managerial problems initiating the study; the requirement for iterative and innovative progress in the research supported the chosen approach. The study applies known methods in product development, for instance an innovation process in the Group Decision Support Systems (GDSS) laboratory and Quality Function Deployment (QFD), and combines them with known strategy analysis tools like industry analysis and the scenario method. As the main result, the thesis presents the strategic innovation process, where new business concepts are used to describe the alternative resource configurations, and scenarios as alternative competitive environments, which can be a new way for firms to achieve competitive advantage in high-velocity markets. In addition to the strategic innovation process, the study has also resulted in approximately 250 new innovations for the participating firms, reduced technology uncertainty and helped strategic infrastructural decisions in the firms, and produced a knowledge bank including data from 43 ICT and 19 paper industry firms between the years 1999-2004. The methods presented in this research are also applicable to other industries.

Relevance:

80.00%

Publisher:

Abstract:

Producing operational data for end users for analytical examination causes problems for many companies. This Master's thesis seeks to solve that problem at Teleste Oyj. The work is divided into three main chapters. Chapter 2 clarifies the concept of On-Line Analytical Processing (OLAP). Chapter 3 introduces some OLAP product vendors and their architectures, typical application areas, and issues to consider when deploying OLAP. Chapter 4 presents the actual solution. The technical architecture plays a significant role in the structure of the solution; here, Microsoft's data warehousing framework has been applied. As Chapter 4 proceeds, transaction-processing data is transformed into information and further into end-user knowledge. End users are equipped with an efficient, real-time analysis tool in a multidimensional environment. Although inventory turnover is taken as the application example, the work does not attempt to find the optimal level for Teleste's inventories; nevertheless, some improvement suggestions are mentioned.
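To make the OLAP idea concrete, the sketch below shows a minimal multidimensional roll-up in the spirit of the thesis's inventory-turnover example: transactional rows aggregated into a product-by-quarter view with a derived measure. This uses pandas purely for illustration, not Teleste's Microsoft-based implementation, and all column names and figures are invented.

```python
import pandas as pd

# Toy transactional data (invented): one row per stock movement.
tx = pd.DataFrame({
    "product": ["amp", "amp", "modem", "modem", "amp", "modem"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "cogs":    [120.0, 150.0, 80.0, 95.0, 60.0, 40.0],  # cost of goods sold
    "avg_inv": [40.0, 50.0, 20.0, 25.0, 40.0, 25.0],    # average inventory
})

# OLAP-style roll-up over the product x quarter dimensions.
cube = tx.pivot_table(index="product", columns="quarter",
                      values=["cogs", "avg_inv"], aggfunc="sum")

# Derived measure on the cube: inventory turnover = COGS / average inventory.
print(cube["cogs"] / cube["avg_inv"])
```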

Relevance:

80.00%

Publisher:

Abstract:

This Master's thesis was sponsored by a large UK company in the aircraft industry, which recognized that, with a global manufacturing strategy as a focus and with IT systems such as CAD/CAM forming a significant part of production, an understanding had to be found of what the information system needs of manufacturing are and whether developing them would benefit the company. The thesis describes the development of an Internet-technology-based kiosk into an information support system for a production department manufacturing engine parts on CNC machines. Kiosks have features that could prove useful in production environments as well, and this work therefore investigates a kiosk-based approach applied to a production environment. The thesis describes the development of the information kiosk from the gathering of initial requirements for the information system, through the design and development phase, and finally analyses the success of the kiosk in the production environment by means of a usability study conducted after the kiosk had been implemented in the factory. The conclusions show that a kiosk can be implemented well in a production environment and demonstrate that distributing production information in electronic form is considerably more efficient than on paper. User comments indicate that the kiosk suits their information needs and benefits their work. The kiosk offers benefits not only at the production level but also at the management level.

Relevance:

80.00%

Publisher:

Abstract:

The impact of human activity on the atmosphere is likely to lead to climate change. One of these changes is the rise in the Earth's mean temperature, caused by the increased concentration of greenhouse gases in the atmosphere. To reduce the impacts, carbon dioxide emissions must be cut. The Kyoto Protocol sets emission obligations for the signatory countries; the European Union must reduce its greenhouse gas emissions by 8%. One of the reduction mechanisms is emissions trading. Emissions trading is both a means of protecting the environment and an environmental policy instrument for easing the costs of greenhouse gas emission reductions. Emissions trading does not directly reduce greenhouse gas emissions but balances them between countries and installations. Promoting the use of renewable energy, both internationally and nationally, leads to direct reductions in greenhouse gas emissions. The member states of the European Union have set national reference values for the consumption of renewable electricity. To reach these reference values, the countries must support renewable energy sources by various means, such as green certificates. Emissions trading and tradable green certificates will affect the business of energy producers. This work examines the effects of emissions trading and green certificates on the business of Vattenfall Kaukolämpö Oy, Vattenfall Sähköntuotanto Oy and Vamy Oy.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this work was to survey the current state of e-business among SMEs in the Kymenlaakso region. The work also identified the most significant benefits of, and barriers to, the development of e-business in the region, and outlined how the e-business development process proceeds in SMEs. The work can be divided into a theoretical and an empirical part. The theoretical part examines the content of e-business and presents some development models previously proposed by researchers. The empirical part contains the study used to survey the current state and future prospects of SMEs in Kymenlaakso. The study also serves as the basis for the forthcoming "eLiiketoiminta yrityksille" (e-Business for Companies) project. According to the results, SMEs in Kymenlaakso have developed their ability to exploit information technology and networks in recent years. SMEs also have considerable willingness and needs to develop further, but lack of knowledge, resources and time often stand in the way. These needs can be met effectively by offering more information, training and better support systems.

Relevance:

80.00%

Publisher:

Abstract:

Intravenous thrombolysis (IVT) as a treatment in acute ischaemic stroke may be insufficient to achieve recanalisation in certain patients. Predicting the probability of non-recanalisation after IVT may have the potential to influence patient selection towards more aggressive management strategies. We aimed at deriving and internally validating a predictive score for post-thrombolytic non-recanalisation, using clinical and radiological variables. In thrombolysis registries from four Swiss academic stroke centres (Lausanne, Bern, Basel and Geneva), patients were selected with large arterial occlusion on acute imaging and with repeated arterial assessment at 24 hours. Based on a logistic regression analysis, an integer-based score for each covariate of the fitted multivariate model was generated. Performance of the integer-based predictive model was assessed by bootstrapping the available data and cross-validation (delete-d method). In 599 thrombolysed strokes, five variables were identified as independent predictors of absence of recanalisation: Acute glucose >7 mmol/l (A), significant extracranial vessel STenosis (ST), decreased Range of visual fields (R), large Arterial occlusion (A) and decreased Level of consciousness (L). All variables were weighted 1, except for (L), which obtained 2 points based on β-coefficients on the logistic scale. ASTRAL-R scores 0, 3 and 6 corresponded to non-recanalisation probabilities of 18%, 44% and 74%, respectively. Predictive ability showed an AUC of 0.66 (95% CI 0.61-0.70) when using the bootstrap and 0.66 (0.63-0.68) when using delete-d cross-validation. In conclusion, the 5-item ASTRAL-R score moderately predicts non-recanalisation at 24 hours in thrombolysed ischaemic strokes. If its performance can be confirmed by external validation and its clinical usefulness can be proven, the score may influence patient selection for more aggressive revascularisation strategies in routine clinical practice.
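For illustration, the score as described can be computed directly: each item scores 1 point except decreased level of consciousness, which scores 2. The probability mapping below back-solves a logistic curve from the three anchor points reported in the abstract (score 0 → 18%, 3 → 44%, 6 → 74%); its coefficients are therefore an approximation, not the published model.

```python
import math

def astral_r(high_glucose, extracranial_stenosis, visual_field_deficit,
             large_occlusion, decreased_consciousness):
    """ASTRAL-R score: 1 point per item, 2 for decreased consciousness."""
    return (high_glucose + extracranial_stenosis + visual_field_deficit
            + large_occlusion + 2 * decreased_consciousness)

def p_no_recanalisation(score):
    """Approximate probability of non-recanalisation at 24 hours.

    Logistic curve fitted to the abstract's anchors; reproduces
    18%, 44% and 74% at scores 0, 3 and 6.
    """
    intercept, slope = -1.516, 0.427  # back-derived, not published values
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

# Example: glucose > 7 mmol/l, large occlusion, decreased consciousness.
s = astral_r(True, False, False, True, True)  # score = 4
print(s, f"{p_no_recanalisation(s):.0%}")     # ~55%
```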

Relevance:

80.00%

Publisher:

Abstract:

The aim of the work was to provide information on the factors affecting the safety of factory areas and to create a model for improving factory-area traffic safety using decision support tools. The work originated from the need to analyse an actual safety-improving investment. The work first reviews the safety of factory areas in general by examining the role, measurement and economic effects of occupational safety, risk management and safety management on a company's operations. This overview gives a picture of the operating environment for which the safety-improving model is being developed. The model consists of five phases, which can be used as a whole or separately. First, the routes and traffic of the area under study are identified and modeled. Next, problem spots are mapped and suitable alternatives for improving their safety are sought; the alternatives are analysed with the SWOT method. A few indicators are presented for evaluating safety-improving investments, which can be used to assess acquisitions. The final phase introduces decision support systems and presents AHP, a computer-aided decision-making method. Practical application is illustrated with examples in connection with the different phases of the model. The model is intended as a tool for designers, management and occupational safety personnel, with which the efficiency and safety of traffic in the factory environment can be improved and the user familiarized with decision-making methods.

Relevance:

80.00%

Publisher:

Abstract:

Prospective epidemiological data have shown that blood pressure has a graded, continuous adverse effect on the risk of various forms of cardiovascular disease (CVD), including stroke, myocardial infarction, heart failure, peripheral arterial disease and end-stage renal disease. 'Raised blood pressure' is frequently considered to be any systolic blood pressure greater than 115 mmHg. It accounts for 45% of all heart disease deaths and 51% of all stroke-related deaths [1], which together are the biggest causes of morbidity and mortality worldwide [2,3,4]. Annually, there are >17 million deaths due to CVD worldwide, of which 9.4 million are attributable to complications of raised blood pressure. This highlights the importance of both high-risk and population-based strategies in blood pressure management and control.