Abstract:
Management of neurocritical care patients focuses on the prevention and treatment of secondary brain injury, i.e. the numerous pathophysiological intracerebral events (edema, ischemia, energy dysfunction, seizures) and systemic events (hyperthermia, disorders of glucose homeostasis) that occur after the initial insult (stroke, hemorrhage, head trauma, brain anoxia) and may worsen patient outcome. The current therapeutic paradigm is based on multimodal neuromonitoring, including invasive (intracranial pressure, brain oxygen, cerebral microdialysis) and non-invasive (transcranial Doppler, near-infrared spectroscopy, EEG) tools that allow targeted, individualized management of acute coma in the early phase. The aim of this review is to describe the utility of multimodal neuromonitoring for the critical care management of acute coma.
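The invasive monitoring targets mentioned above are often combined into simple derived indices at the bedside. A minimal illustrative sketch: the relation CPP = MAP − ICP is standard in neurocritical care, but the alarm threshold used here is indicative only, not a recommendation from this review.

```python
def cerebral_perfusion_pressure(map_mmhg: float, icp_mmhg: float) -> float:
    """Cerebral perfusion pressure (mmHg): CPP = MAP - ICP."""
    return map_mmhg - icp_mmhg

def icp_alarm(icp_mmhg: float, threshold: float = 20.0) -> bool:
    """Flag intracranial hypertension above a commonly cited ~20 mmHg threshold."""
    return icp_mmhg > threshold

# Example: MAP 90 mmHg with ICP 25 mmHg -> CPP 65 mmHg, ICP alarm raised.
cpp = cerebral_perfusion_pressure(90, 25)
```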
Abstract:
Converging evidence favors an abnormal susceptibility to oxidative stress in schizophrenia. Decreased levels of glutathione (GSH), the major cellular antioxidant and redox regulator, were observed in the cerebrospinal fluid and prefrontal cortex of patients. Importantly, abnormal GSH synthesis of genetic origin was observed: two case-control studies showed an association with a GAG trinucleotide repeat (TNR) polymorphism in the gene for the catalytic subunit (GCLC) of glutamate-cysteine ligase (GCL), the key GSH-synthesizing enzyme. The most common TNR genotype, 7/7, was more frequent in controls, whereas the rarest TNR genotype, 8/8, was three times more frequent in patients. The disease-associated genotypes (35% of patients) correlated with decreased GCLC protein, GCL activity and GSH content. Similar GSH system anomalies were observed in early-psychosis patients. Such redox dysregulation, combined with environmental stressors at specific developmental stages, could underlie structural and functional connectivity anomalies. In pharmacological and knock-out (KO) models, GSH deficit induces anomalies analogous to those reported in patients. (a) Morphology: spine density and GABA-parvalbumin immunoreactivity (PV-I) were decreased in the anterior cingulate cortex. KO mice showed delayed cortical PV-I at PD10, an effect exacerbated in mice with increased dopamine from PD5-10. KO mice also exhibit cortical impairment of the myelin and perineuronal nets known to modulate PV connectivity. (b) Physiology: in cultured neurons, NMDA responses are depressed by D2 activation. In the hippocampus, NMDA-dependent synaptic plasticity is impaired and kainate-induced gamma oscillations are reduced in parallel with PV-I. (c) Cognition: low-GSH models show increased sensitivity to stress, hyperactivity, and abnormal object recognition, olfactory integration and social behavior. In a clinical study, the GSH precursor N-acetyl-cysteine (NAC), as add-on therapy, improved the negative symptoms and decreased the side effects of antipsychotics.
In an auditory oddball paradigm, NAC improves the mismatch negativity, an evoked potential related to pre-attentive processing and to NMDA receptor function. In summary, clinical and experimental evidence converge to demonstrate that a genetically induced dysregulation of GSH synthesis, combined with environmental insults in early development, represents a major risk factor contributing to the development of schizophrenia.
Abstract:
Monoubiquitination of the Fanconi anaemia protein FANCD2 is a key event leading to repair of interstrand cross-links. It was reported earlier that FANCD2 co-localizes with NBS1. However, the functional connection between FANCD2 and MRE11 is poorly understood. In this study, we show that inhibition of MRE11, NBS1 or RAD50 leads to a destabilization of FANCD2. FANCD2 accumulated from mid-S to G2 phase within sites containing single-stranded DNA (ssDNA) intermediates, or at sites of DNA damage, such as those created by restriction endonucleases and laser irradiation. Purified FANCD2, a ring-like particle by electron microscopy, preferentially bound ssDNA over various DNA substrates. Inhibition of MRE11 nuclease activity by Mirin decreased the number of FANCD2 foci formed in vivo. We propose that FANCD2 binds to ssDNA arising from MRE11-processed DNA double-strand breaks. Our data establish MRN as a crucial regulator of FANCD2 stability and function in the DNA damage response.
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Knowing that continents move, geologists have been trying to retrieve past continent distributions through the ages.
If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are embedded in a larger set comprising both oceanic and continental crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea has not yet received a sufficient echo in the reconstruction community. However, we are convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue. Using assemblies of continents (referred to as "key assemblies") as anchors distributed over the whole scope of our study (ranging from Eocene to Cambrian times), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to/from the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time and form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model.
This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are modern informatics tools specifically devoted to storing, analyzing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools we both facilitated the reconstruction work (task automation) and enhanced the model (by greatly increasing the kinematic control of plate motions thanks to plate velocity models). Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset we are now able to tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, throughout the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band going from the Northern Pacific through northern South America, the Central Atlantic, Northern Africa and Central Asia up to Japan. Essentially, this signifies that plates tend to escape this median plane. Barring an unidentified methodological bias, we interpret this as a potential secular influence of the Moon on plate motions. The oceanic realms are the cornerstone of our model, and we took particular care to reconstruct them in detail. In this model, the oceanic crust is preserved from one reconstruction to the next.
The crustal material is symbolised by synthetic isochrons whose ages are known. We also reconstruct the margins (active or passive), the ridges and the intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
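Plate kinematics of the kind described above is conventionally expressed as finite rotations about Euler poles. A minimal sketch of rotating a point on the sphere about an Euler pole via Rodrigues' formula (the pole, angle and point here are hypothetical, chosen only to make the result easy to check):

```python
import numpy as np

def to_xyz(lat, lon):
    """Convert latitude/longitude (degrees) to a unit vector."""
    lat, lon = np.radians(lat), np.radians(lon)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def rotate(point_latlon, pole_latlon, angle_deg):
    """Rotate a (lat, lon) point about an Euler pole by angle_deg degrees."""
    p = to_xyz(*point_latlon)
    k = to_xyz(*pole_latlon)  # unit vector along the rotation axis
    a = np.radians(angle_deg)
    # Rodrigues' rotation formula
    r = (p * np.cos(a) + np.cross(k, p) * np.sin(a)
         + k * np.dot(k, p) * (1 - np.cos(a)))
    return np.degrees(np.arcsin(r[2])), np.degrees(np.arctan2(r[1], r[0]))

# Rotating an equatorial point about the geographic north pole
# simply shifts its longitude by the rotation angle.
lat, lon = rotate((0.0, 10.0), (90.0, 0.0), 30.0)
```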
Abstract:
Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards: the values of the relevant metrics have to be predicted, too. These predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes such as cost and schedule, and on product attributes such as size and quality. Effort estimation can be used for several purposes; this thesis discusses only effort estimation in software projects for project management purposes. A short introduction to measurement issues is given, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn, and effort estimation accuracy has improved significantly since the model was taken into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement, for which the author has developed a three-level solution. All currently used size metrics are static in nature, whereas this newly proposed metric is dynamic: it exploits the increased understanding of the nature of the work as specification and design proceed, and thus 'grows up' along with the software project. Developing the effort estimation model is not possible without gathering and analyzing history data.
However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process; without a proper framework, the estimation capability of an organization declines, as it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity; an example is shown to shed more light on the calibration and on the model itself, together with remarks about the sensitivity of the model. Finally, an example of usage is shown.
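The idea of a hierarchical, 'growing' size metric described above can be caricatured in a few lines. This is a hypothetical sketch, not the thesis author's actual model: each refinement level re-estimates size at finer granularity (system, subsystem, function), and effort is size divided by a productivity rate.

```python
def effort_estimate(size_units: float, productivity: float) -> float:
    """Effort (person-hours) = size / productivity (size units per person-hour)."""
    return size_units / productivity

RATE = 0.5  # illustrative productivity: 0.5 size units per person-hour

# Level 1: one coarse size guess for the whole system.
level1 = effort_estimate(400.0, RATE)

# Level 2: per-subsystem sizes replace the coarse guess as design proceeds.
subsystems = [120.0, 180.0, 140.0]
level2 = sum(effort_estimate(s, RATE) for s in subsystems)

# Level 3: per-function sizes within the subsystems refine it further.
functions = [30.0, 45.0, 50.0, 60.0, 70.0, 55.0, 65.0, 75.0]
level3 = sum(effort_estimate(f, RATE) for f in functions)
```

The point of the hierarchy is that each level supersedes the previous one as understanding grows, rather than all three being combined.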
Abstract:
In this thesis, different parameters influencing critical flux in protein ultrafiltration and membrane fouling were studied. Short reviews of proteins, cross-flow ultrafiltration, flux decline and critical flux, and the basic theory of Partial Least Squares analysis (PLS) are given at the beginning. The experiments were mainly performed using dilute solutions of globular proteins, commercial polymeric membranes and laboratory-scale apparatus. Fouling was studied by flux, streaming potential and FTIR-ATR measurements. Critical flux was evaluated by different kinds of stepwise procedures and by both constant-pressure and constant-flux methods. The critical flux was affected by transmembrane pressure, flow velocity, protein concentration, membrane hydrophobicity, and protein and membrane charges. Generally, the lowest critical fluxes were obtained at the isoelectric point of the protein and the highest in the presence of electrostatic repulsion between the membrane surface and the protein molecules. In the laminar flow regime the critical flux increased with flow velocity, but no longer above this regime. An increase in concentration decreased the critical flux. Hydrophobic membranes showed fouling in all charge conditions and, furthermore, especially at the beginning of the experiment, even at very low transmembrane pressures. Fouling of these membranes was thought to be due to protein adsorption through hydrophobic interactions. The hydrophilic membranes used suffered more from reversible fouling and concentration polarisation than from irreversible fouling; they became fouled at higher transmembrane pressures because of pore blocking. In this thesis, some new aspects of critical flux are presented that are important for the ultrafiltration and fractionation of proteins.
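The flux-pressure behaviour summarised above is commonly analysed with the stagnant film model, in which the limiting flux depends on the mass-transfer coefficient (itself rising with cross-flow velocity) and the ratio of wall to bulk concentration. A minimal sketch with illustrative, not measured, values:

```python
import math

def limiting_flux(k: float, c_wall: float, c_bulk: float) -> float:
    """Film-model limiting flux J = k * ln(c_wall / c_bulk).

    k      : mass-transfer coefficient (m/s), increases with cross-flow velocity
    c_wall : gel/wall protein concentration (g/L)
    c_bulk : bulk protein concentration (g/L)
    """
    return k * math.log(c_wall / c_bulk)

# Doubling the bulk concentration lowers the attainable flux, consistent
# with the concentration effect on critical flux reported above.
j_dilute = limiting_flux(2e-6, 300.0, 1.0)
j_conc = limiting_flux(2e-6, 300.0, 2.0)
```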
Abstract:
Further genetic gains in wheat yield are required to match expected increases in demand. This may require the identification of physiological attributes able to produce such improvement, as well as of the genetic bases controlling those traits, in order to facilitate their manipulation. In the present paper, a theoretical framework of source and sink limitation to wheat yield is presented, and the fine-tuning of crop development as an alternative for increasing yield potential is discussed. Following a top-down approach, most crop physiologists have agreed that the main attribute explaining past genetic gains in yield was the harvest index (HI). By virtue of that previous success, no further gains may be expected in HI, and an alternative must be found. Using a bottom-up approach, the present paper first provides evidence of the generalized sink-limited condition of grain growth, implying that for further increases in yield potential, sink strength during grain filling has to be increased. The focus should be on further increasing grain number per m², through fine-tuning pre-anthesis developmental patterns. The rapid spike growth period (RSGP) is critical for grain number determination, and increasing spike growth during pre-anthesis would result in an increased number of grains. This might be achieved by lengthening the duration of the phase (though without altering flowering time), as there is genotypic variation in the proportion of pre-anthesis time elapsed either before or after the onset of the stem elongation phase. Photoperiod sensitivity during RSGP could then be used as a genetic tool to further increase grain number, since slower development results in smoother floret development, so that more floret primordia reach the fertile floret stage and are able to produce a grain. Far less progress has been achieved on the genetic control of this attribute. None of the well-known major Ppd alleles seems to be consistently responsible for RSGP sensitivity.
Alternatives for identifying the genetic factors responsible for this sensitivity (e.g. quantitative trait locus (QTL) identification in mapping populations) are being considered.
Abstract:
Isotope ratio mass spectrometry (IRMS) has been used in numerous fields of forensic science from a source-inference perspective. This review compiles the studies published so far on the application of IRMS to the traditional fields of forensic science. It completes the review of Benson et al. [1] and synthesises the knowledge already gathered in the following fields: illicit drugs, flammable liquids, human provenancing, microtraces, explosives and other specific materials (packaging tapes, safety matches, plastics, etc.). For each field, a discussion assesses the state of the science and highlights the relevance of the information in a forensic context. Through the discussions which mark out the review, the potential and limitations of IRMS, as well as the needs and challenges of future studies, are emphasized. The paper elicits the various dimensions of the source which can be obtained from isotope information and demonstrates the transversal nature of IRMS as a tool for source inference.
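IRMS results in the fields above are conventionally reported in delta notation relative to an international standard. A minimal sketch; the VPDB ¹³C/¹²C ratio used here is the commonly cited value, and the sample ratio is purely illustrative:

```python
R_VPDB = 0.0111802  # commonly cited 13C/12C ratio of the VPDB standard

def delta_per_mil(r_sample: float, r_standard: float = R_VPDB) -> float:
    """Delta value in per mil: ((R_sample / R_standard) - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample slightly depleted in 13C relative to VPDB gives a negative delta,
# e.g. a ratio of 0.0108965 corresponds to roughly -25.4 per mil.
d13c = delta_per_mil(0.0108965)
```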
Abstract:
This study described elite football (soccer) goalkeepers' activity and performance in critical game situations. The 11 best French players (M age = 15.5 yr., SD = 0.5) participated in the study. Interviews focused on the goalkeepers' experiences were conducted to identify meaningful events involved in failed actions. Players formulated 23 critical game situations. Verbatim coding using a thematic analysis indicated that four main categories (coming off the line, goal-line clearance, one-on-one, and diving) represented the most critical situations encountered during matches. The relations among experience and action, inner states, background, attention contents, and intentions were elucidated. The discussion is grounded in the properties of such critical game situations and their implications for improving goalkeepers' performance.
Abstract:
BACKGROUND: The pre-conditioning of tumor vessels by low-dose photodynamic therapy (L-PDT) was shown to enhance the distribution of chemotherapy in different tumor types. However, how light dose affects drug distribution and tumor response is unknown. Here we determined the effect of L-PDT fluence on vascular transport in human mesothelioma xenografts. The best L-PDT conditions regarding drug transport were then combined with Lipoplatin® to determine tumor response in vivo. METHODS: Nude mice bearing dorsal skinfold chambers were implanted with H-Meso1 cells. Tumors were treated by Visudyne®-mediated photodynamic therapy at a 100 mW/cm² fluence rate and a variable fluence (5, 10, 30, and 50 J/cm²). FITC-dextran (FITC-D) distribution was assessed in real time in tumor and normal tissues. Tumor response was then determined with the best L-PDT conditions combined with Lipoplatin® and compared to controls in luciferase-expressing H-Meso1 tumors by size and whole-body bioluminescence assessment (n = 7/group). RESULTS: Tumor uptake of FITC-D following L-PDT was significantly enhanced, by 10-fold, in the 10 J/cm² group but not in the 5, 30, and 50 J/cm² groups compared to controls. Uptake of FITC-D in the normal surrounding tissue following L-PDT was significantly enhanced in the 30 J/cm² and 50 J/cm² groups compared to controls. Altogether, the FITC-D tumor-to-normal-tissue ratio was significantly higher in the 10 J/cm² group compared to the others. Tumor growth was significantly delayed in animals treated by 10 J/cm² L-PDT combined with Lipoplatin® compared to controls. CONCLUSIONS: The fluence of L-PDT is critical for the optimal distribution and effect of subsequently administered chemotherapy. These findings are important for the translation of the vascular L-PDT concept to the clinic. Lasers Surg. Med. 47:323-330, 2015. © 2015 Wiley Periodicals, Inc.
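The fluence values tested follow directly from fluence rate and exposure time (fluence [J/cm²] = fluence rate [W/cm²] × time [s]). A worked sketch for the conditions reported above:

```python
def exposure_time_s(fluence_j_cm2: float, fluence_rate_mw_cm2: float) -> float:
    """Time (s) needed to deliver a given fluence at a given fluence rate.

    Converts mW/cm2 to W/cm2, then time = fluence / fluence_rate.
    """
    return fluence_j_cm2 / (fluence_rate_mw_cm2 / 1000.0)

# At the 100 mW/cm2 rate used in the study, the tested fluences of
# 5, 10, 30 and 50 J/cm2 correspond to these exposure times in seconds.
times = {f: exposure_time_s(f, 100.0) for f in (5, 10, 30, 50)}
```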
Abstract:
In this commentary, we argue that the term 'prediction' is overused when, in fact, referring to the foundational writings of de Finetti, the correct term should be 'inference'. In particular, we intend (i) to summarize and clarify relevant subject matter on prediction from established statistical theory, and (ii) to point out the logic of this understanding with respect to practical uses of the term prediction. Written from an interdisciplinary perspective, associating statistics and forensic science as an example, this discussion also connects to related fields such as medical diagnosis and other areas of application where reasoning based on scientific results is practiced in societally relevant contexts. This includes forensic psychology, which uses prediction as part of its vocabulary when dealing with matters that arise in the course of legal proceedings.
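In the de Finetti-style view sketched above, 'prediction' is predictive inference: a probability assigned to the next observation given the data. A minimal Beta-Binomial illustration with a uniform prior, which reduces to Laplace's rule of succession (purely illustrative, not drawn from the commentary itself):

```python
from fractions import Fraction

def predictive_prob(successes: int, trials: int) -> Fraction:
    """P(next observation is a success | data) under a Beta(1, 1) prior:
    (s + 1) / (n + 2), i.e. Laplace's rule of succession."""
    return Fraction(successes + 1, trials + 2)

# Having observed 7 successes in 10 exchangeable trials, the probability
# assigned to a success on trial 11 is 8/12 = 2/3.
p_next = predictive_prob(7, 10)
```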
Abstract:
In the modern business environment, the information systems supporting business operations have become critical resources for companies. The ability to exploit these resources depends on the reliability of the business-critical systems and on the availability of the applications used. One situation in which the systems' ability to support real business processes is endangered is a disaster, whose impact can be local or can cover wide areas; different types of disasters must be prepared for in the ways they require. One trend that shaped the architecture of critical information systems in the 1990s was the client/server approach. In the client/server paradigm, an application is divided into tiers so that the presentation, application and database layers can be physically separated while still forming a logically unified whole. From a business perspective, the revolutionary IT novelties of the 1990s were enterprise resource planning (ERP) systems, which made it possible to manage the entire production chain and other process entities almost in real time. The reliability of multi-tier ERP systems has proven challenging, since completely protecting all tiers against every possible disaster is impossible with current technology. To make sound compromises, the financial and business impact of each lost process must be known. This is precisely why ERP systems are interesting: they affect business processes throughout a company's entire process chain. Protecting multi-tier client/server-based ERP systems against disasters therefore requires applying several techniques and technologies and combining the whole into a process framework. In this way, a planned part of the IT strategy can be created that addresses business continuity in a disaster situation and enables fast and complete recovery under all circumstances.
Abstract:
The main objective of this study is to identify the core competencies needed in the end-user-driven business of an international forest industry company. A further objective is to create a competence model that can be used, among other things, in recruitment and in planning training and job rotation. The aim is to examine the core competencies needed both at present and in the future. The study focuses on describing the competencies of sales personnel. Matters and concepts related to competencies, such as organizational learning, innovation and customer orientation, were first examined through the literature and expert interviews. The competencies of the sales staff of the case company's carton board unit were then surveyed. The competence survey was carried out using theme interviews, together with a competence matrix drawn up during the study. The results were compared with the views of one customer. According to the results, the three most important current competencies are a holistic view of matters, relationship building and language skills. The most important future competencies, in turn, are customer orientation, e-commerce and relationship building. Finally, general future challenges of the forest industry are discussed and some ideas for competence development are given.