74 results for Distributed Material Flow Control


Relevance: 30.00%

Abstract:

Background: Beryllium sensitization (BeS) is caused by exposure to beryllium in the workplace and may progress to chronic beryllium disease (CBD). This granulomatous lung disorder mimics sarcoidosis clinically but is characterized by a beryllium-specific CD4+ T-cell immune response. BeS is classically detected by the beryllium lymphocyte proliferation test (BeLPT), but this assay requires radioactivity and is not very sensitive. In the context of a study aiming to evaluate whether CBD patients are misdiagnosed as sarcoidosis patients in Switzerland, we developed an EliSpot assay and a CFSE-based beryllium flow cytometric test. Methods: Twenty-three patients considered as having sarcoidosis (n = 21), CBD (n = 1) or possible CBD (n = 1) were enrolled. EliSpot was performed using plates coated with anti-IFN-gamma mAb. Cells were added to the wells and incubated overnight at 37 °C with medium (negative control), SEB (positive control) or BeSO4 at 1, 10 and 100 microM. Biotinylated anti-IFN-gamma mAb were added and spots were visualized using streptavidin-horseradish peroxidase and AEC substrate reagent. Results were reported as spot-forming units (SFU). For beryllium-specific CFSE flow cytometry analysis, CFSE-labelled cells were cultured in the presence of SEB and 1, 10 or 100 microM BeSO4. Unstimulated CFSE-labelled cells served as controls. The cells were incubated for 6 days at 37 °C and 5% CO2. Surface labelling of T-lymphocytes and Vivid staining as a control of cell viability were performed at the time of harvest. Results: Using EliSpot technology, we were able to detect BeS in 1/23 enrolled patients, with a mean of 780 SFU (cut-off value of 50 SFU). This positive result was confirmed using different concentrations of BeSO4. Among the 23 patients tested, 22 showed negative results with EliSpot. Using CFSE flow cytometry, 1/7 tested patients showed a positive result, with a beryllium-specific CD4+ count of around 30%, versus 45% for SEB stimulation as positive control and 0.6% for the negative control. This patient was the one with a positive EliSpot assay. Conclusions: These preliminary data demonstrate the feasibility of EliSpot and CFSE flow cytometry to detect BeS. The patient with positive beryllium-specific EliSpot and CFSE flow cytometry results had been exposed to beryllium at her workplace 20 years ago and is still regularly followed up for her pulmonary status. A positive BeLPT had already been described for this patient in 2001 in France. Further validation of these techniques is in progress.
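For illustration only, a minimal sketch of how the EliSpot positivity call described above might be encoded. Only the 50-SFU cut-off is taken from the abstract; the well counts, the background handling and the averaging across BeSO4 concentrations are assumptions.

```python
# Hypothetical illustration of the EliSpot positivity call described above.
# Only the 50-SFU cut-off comes from the abstract; everything else is assumed.

def elispot_positive(be_wells_sfu, neg_ctrl_sfu, cutoff_sfu=50):
    """Call a sample beryllium-sensitized if the mean background-corrected
    spot count across the BeSO4 concentrations exceeds the cut-off."""
    corrected = [max(sfu - neg_ctrl_sfu, 0) for sfu in be_wells_sfu]
    mean_sfu = sum(corrected) / len(corrected)
    return mean_sfu > cutoff_sfu, mean_sfu

# example with made-up counts for BeSO4 at 1, 10 and 100 microM
positive, mean_sfu = elispot_positive([700, 820, 830], neg_ctrl_sfu=10)
print(positive, round(mean_sfu))   # True, with a mean well above the cut-off
```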

Relevance: 30.00%

Abstract:

The biocontrol activity of the root-colonizing Pseudomonas fluorescens strain CHA0 is largely determined by the production of antifungal metabolites, especially 2,4-diacetylphloroglucinol. The expression of these metabolites depends on abiotic and biotic environmental factors, in particular, elements present in the rhizosphere. In this study, we have developed a new method for the in situ analysis of antifungal gene expression using flow cytometry combined with green fluorescent protein (GFP)-based reporter fusions to the phlA and prnA genes, which are essential for the production of the antifungal compounds 2,4-diacetylphloroglucinol and pyrrolnitrin, respectively, in strain CHA0. Expression of phlA-gfp and prnA-gfp in CHA0 cells harvested from the rhizosphere of a set of plant species, as well as from the roots of healthy, leaf pathogen-attacked, and physically stressed plants, was analyzed using a FACSCalibur flow cytometer. After subtraction of background fluorescence emitted by plant-derived particles and CHA0 cells not carrying the gfp reporters, the average gene expression per bacterial cell could be calculated. Levels of phlA and prnA expression varied significantly in the rhizospheres of different plant species. Physical stress and leaf pathogen infection lowered phlA expression levels in the rhizosphere of cucumber. Our results demonstrate that the newly developed approach is suitable for monitoring differences in levels of antifungal gene expression in response to various plant-derived factors. An advantage of the method is that it allows quantification of bacterial gene expression in rhizosphere populations at the single-cell level. To the best of our knowledge, this is the first study using flow cytometry for the in situ analysis of biocontrol gene expression in a plant-beneficial bacterium in the rhizosphere.
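A hypothetical sketch of the background-correction step described above: average reporter fluorescence per bacterial cell estimated after subtracting the signal of gfp-negative events. The array names and the simple mean subtraction are assumptions, not the authors' exact gating pipeline.

```python
import numpy as np

# Assumed illustration: subtract the mean fluorescence of gfp-negative events
# (plant-derived particles and CHA0 cells without the reporter) before
# computing the average reporter expression per cell.

def mean_gfp_per_cell(sample_fl, control_fl):
    """sample_fl: green fluorescence of gated events carrying phlA-gfp or
    prnA-gfp; control_fl: the same gate for gfp-negative controls."""
    background = np.mean(control_fl)
    corrected = np.clip(np.asarray(sample_fl) - background, 0, None)
    return corrected.mean()

# toy usage with random numbers standing in for FACSCalibur FL1 values
rng = np.random.default_rng(0)
print(mean_gfp_per_cell(rng.normal(300, 50, 10_000), rng.normal(40, 10, 10_000)))
```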

Relevance: 30.00%

Abstract:

A semi-automatic capillary gas chromatographic method with classical flame ionization detection, which satisfies the required performance conditions and gave acceptable results within the framework of an interlaboratory certification programme for PAHs in sewage sludge, is described. An interesting feature of the procedure is that it incorporates automated operations such as sample fractionation by semi-preparative HPLC, fraction collection triggered by signal-level recognition, and evaporation under nitrogen flow. Multiple injections into the GC capillary column are performed in the on-column mode via an autosampler with a temperature-programmable injector. Data acquisition and chromatogram treatment are automated via computer software. This partially automated procedure releases personnel from tedious and time-consuming tasks, and its robustness was validated through the certification of a reference material for PAHs in sewage sludge, demonstrating its reliable performance.

Relevance: 30.00%

Abstract:

Tumour immunologists strive to develop efficient tumour vaccination and adoptive transfer therapies that enlarge the pool of tumour-specific and -reactive effector T-cells in vivo. To assess the efficiency of the various strategies, ex vivo assays are needed for the longitudinal monitoring of the patient's specific immune responses, providing both quantitative and qualitative data. In particular, since tumour cell cytolysis is the end goal of tumour immunotherapy, routine immune monitoring protocols need to include a read-out for the cytolytic efficiency of Ag-specific cells. We propose to combine current immune monitoring techniques in a highly sensitive and reproducible multi-parametric, flow cytometry-based cytotoxicity assay that has been optimised to require low numbers of Ag-specific T-cells. The possibility of re-analysing those T-cells that have undergone lytic activity is illustrated by the concomitant detection of CD107a upregulation on the surface of degranulated T-cells. To date, the LiveCount Assay provides the only possibility of assessing the ex vivo cytolytic activity of low-frequency Ag-specific cytotoxic T-lymphocytes from patient material.

Relevance: 30.00%

Abstract:

Background. Anastomotic leak remains a common and potentially deleterious complication after esophagectomy. Preoperative embolization of the left gastric artery and splenic artery (PAE) has been suggested to lower anastomotic leak rates. We present the results of our 5-year experience with this technique. Methods. All patients undergoing PAE before esophagectomy since the introduction of this technique in 2004 were compared in a 1:2 matched-pair analysis with patients without PAE. Matching criteria were type of anastomosis, neoadjuvant treatment, comorbidity, and age. Data were derived from a retrospective chart review covering 2000 to 2006 that was subsequently continued as a prospective database. Outcome measures were anastomotic leak, overall complications, and hospital stay. Results. Between 2000 and 2009, 102 patients underwent esophagectomy for cancer in our institution, with an overall leak rate of 19% and a mortality of 8%. All 19 patients having PAE since 2004 were successfully matched 1:2 to 38 control patients without PAE; both groups were similar regarding demographics and operative characteristics. Two PAE patients (11%) and 8 control patients (21%) had an anastomotic leak, but the difference was not statistically significant (p = 0.469). Overall and major complication rates for the PAE and control groups were 89% versus 79% (p = 0.469) and 37% versus 34% (p = 1.000), respectively. Median intensive care unit and hospital stays were 3 versus 3 days (p = 1.000) and 22 versus 17 days (p = 0.321), respectively. Conclusions. In our experience, PAE has no significant impact on complications after esophagectomy, and on anastomotic leak in particular. (Ann Thorac Surg 2011;91:1556-61) (C) 2011 by The Society of Thoracic Surgeons
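For illustration only: the abstract reports the leak counts and a p-value but does not name the statistical test. A two-sided Fisher exact test on the 2x2 table is one plausible way such a comparison could be made; whether it reproduces the reported p = 0.469 depends on the authors' actual choice of test.

```python
from scipy.stats import fisher_exact

# Illustrative only: 2/19 leaks with PAE versus 8/38 without, as reported
# above; the choice of a two-sided Fisher exact test is an assumption.
table = [[2, 19 - 2],    # PAE group: leak / no leak
         [8, 38 - 8]]    # control group: leak / no leak
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```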

Relevance: 30.00%

Abstract:

We present a spatiotemporal adaptive multiscale algorithm, which is based on the Multiscale Finite Volume method. The algorithm offers a very efficient framework to deal with multiphysics problems and to couple regions with different spatial resolution. We employ the method to simulate two-phase flow through porous media. At the fine scale, we consider a pore-scale description of the flow based on the Volume Of Fluid method. In order to construct a global problem that describes the coarse-scale behavior, the equations are averaged numerically with respect to auxiliary control volumes, and a Darcy-like coarse-scale model is obtained. The space adaptivity is based on the idea that a fine-scale description is only required in the front region, whereas the resolution can be coarsened elsewhere. Temporal adaptivity relies on the fact that the fine-scale and the coarse-scale problems can be solved with different temporal resolution (longer time steps can be used at the coarse scale). By simulating drainage under unstable flow conditions, we show that the method is able to capture the coarse-scale behavior outside the front region and to reproduce complex fluid patterns in the front region.
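A 1-D, single-phase toy sketch of the coarse-scale averaging idea described above: fine-scale fluxes and pressures are averaged over an auxiliary control volume and an effective, Darcy-like coefficient is extracted. The actual method works on pore-scale, two-phase VOF fields in several dimensions; everything below is a simplified assumption for illustration.

```python
import numpy as np

# Toy illustration of numerically averaging fine-scale quantities over an
# auxiliary control volume to obtain a Darcy-like coarse-scale coefficient.

def effective_mobility(q_fine, p_fine, dx):
    """Return lambda_eff of a Darcy-like law <q> = -lambda_eff * d<p>/dx,
    given fine-scale face fluxes q_fine and cell pressures p_fine."""
    q_avg = np.mean(q_fine)
    dp_dx = (p_fine[-1] - p_fine[0]) / ((len(p_fine) - 1) * dx)
    return -q_avg / dp_dx

# toy fine-scale data on one coarse control volume
x = np.linspace(0.0, 1.0, 11)
p = 1.0 - x              # linear pressure drop across the control volume
q = np.full(10, 0.5)     # uniform flux on the 10 faces
print(effective_mobility(q, p, dx=0.1))   # -> 0.5
```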

Relevance: 30.00%

Abstract:

OBJECTIVES: The purpose of this study was to compare myocardial blood flow (MBF) and myocardial flow reserve (MFR) estimates from rubidium-82 positron emission tomography ((82)Rb PET) data using 10 software packages (SPs) based on 8 tracer kinetic models. BACKGROUND: It is unknown how MBF and MFR values from existing SPs agree for (82)Rb PET. METHODS: Rest and stress (82)Rb PET scans of 48 patients with suspected or known coronary artery disease were analyzed in 10 centers. Each center used 1 of 10 SPs to analyze global and regional MBF using the different kinetic models implemented. Values were considered to agree if they simultaneously had an intraclass correlation coefficient >0.75 and a difference <20% of the median across all programs. RESULTS: The most commonly evaluated model was the Ottawa Heart Institute 1-tissue compartment model (OHI-1-TCM). MBF values from 7 of 8 SPs implementing this model agreed best. Values from 2 other models (an alternative 1-TCM and an axially distributed model) also agreed well, with occasional differences. MBF results from other models (e.g., 2-TCM and retention) were less in agreement with values from OHI-1-TCM. CONCLUSIONS: SPs using the most common kinetic model (OHI-1-TCM) provided consistent results in measuring global and regional MBF values, suggesting that they may be used interchangeably to process data acquired with a common imaging protocol.
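A minimal sketch of the agreement criterion stated above (intraclass correlation coefficient >0.75 and a difference <20% of the median across all programs). The abstract does not specify the ICC variant or whether the comparison is pairwise; a one-way random-effects ICC(1,1) and a pairwise check are used here as assumptions.

```python
import numpy as np

# Sketch of the stated agreement rule; the ICC(1,1) formula and the pairwise
# comparison are assumptions, not necessarily the study's exact procedure.

def icc_1_1(data):
    """data: (n_subjects, k_raters) array of MBF values (ml/min/g)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand_mean = data.mean()
    msb = k * np.sum((data.mean(axis=1) - grand_mean) ** 2) / (n - 1)
    msw = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def packages_agree(mbf_a, mbf_b, mbf_all_programs):
    icc = icc_1_1(np.column_stack([mbf_a, mbf_b]))
    mean_diff = np.mean(np.abs(np.asarray(mbf_a) - np.asarray(mbf_b)))
    return icc > 0.75 and mean_diff < 0.20 * np.median(mbf_all_programs)

# toy example with made-up regional MBF values from two hypothetical packages
a = [0.8, 1.1, 2.3, 1.9, 0.7]
b = [0.9, 1.0, 2.1, 2.0, 0.8]
print(packages_agree(a, b, mbf_all_programs=a + b))   # True
```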

Relevance: 30.00%

Abstract:

Background: We analyzed the relationship between cholelithiasis and cancer risk in a network of case-control studies conducted in Italy and Switzerland in 1982-2009. Methods: The analyses included 1997 oropharyngeal, 917 esophageal, 999 gastric, 23 small intestinal, 3726 colorectal, 684 liver, 688 pancreatic, 1240 laryngeal, 6447 breast, 1458 endometrial, 2002 ovarian, 1582 prostate, 1125 renal cell, and 741 bladder cancers, and 21,284 controls. Odds ratios (ORs) were estimated by multiple logistic regression models. Results: The ORs for subjects with a history of cholelithiasis compared with those without were significantly elevated for small intestinal (OR = 3.96), prostate (OR = 1.36), and kidney cancers (OR = 1.57). These positive associations were observed ≥10 years after the diagnosis of cholelithiasis and were consistent across strata of age, sex, and body mass index. No relation was found with the other selected cancers. A meta-analysis including this and three other studies on the relation of cholelithiasis with small intestinal cancer gave a pooled relative risk of 2.35 [95% confidence interval (CI) 1.82-3.03]. Conclusion: In subjects with cholelithiasis, we found an appreciably increased risk of small intestinal cancer and a suggestion of a moderately increased risk of prostate and kidney cancers. We found no material association with the other cancers considered.
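As a sketch, fixed-effect inverse-variance pooling of relative risks on the log scale is the standard way to obtain a pooled RR such as the 2.35 (95% CI 1.82-3.03) reported above. The study-level estimates below are placeholders, not the four studies actually combined in the meta-analysis.

```python
import math

# Fixed-effect inverse-variance pooling on the log scale; inputs are made-up
# placeholder studies, not the data behind the pooled RR quoted above.

def pool_rr(study_estimates):
    """study_estimates: list of (rr, ci_low, ci_high) tuples with 95% CIs."""
    weights, weighted_logs = [], []
    for rr, lo, hi in study_estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE of log(RR)
        w = 1.0 / se ** 2
        weights.append(w)
        weighted_logs.append(w * math.log(rr))
    log_pooled = sum(weighted_logs) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

print(pool_rr([(2.0, 1.2, 3.3), (3.1, 1.5, 6.4), (1.8, 0.9, 3.6), (2.6, 1.4, 4.8)]))
```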

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE: Knowledge of cerebral blood flow (CBF) alterations in cases of acute stroke could be valuable in the early management of these cases. Among imaging techniques affording evaluation of cerebral perfusion, perfusion CT studies involve sequential acquisition of cerebral CT sections obtained in an axial mode during the IV administration of iodinated contrast material. They are thus very easy to perform in emergency settings. Perfusion CT values of CBF have proved to be accurate in animals, and perfusion CT affords plausible values in humans. The purpose of this study was to validate perfusion CT studies of CBF by comparison with the results provided by stable xenon CT, which have been reported to be accurate, and to evaluate acquisition and processing modalities of CT data, notably the possible deconvolution methods and the selection of the reference artery. METHODS: Twelve stable xenon CT and perfusion CT cerebral examinations were performed within an interval of a few minutes in patients with various cerebrovascular diseases. CBF maps were obtained from perfusion CT data by deconvolution using singular value decomposition and least mean square methods. The CBF values were compared with the stable xenon CT results in multiple regions of interest through linear regression analysis and bilateral t tests for matched variables. RESULTS: Linear regression analysis showed good correlation between perfusion CT and stable xenon CT CBF values (singular value decomposition method: R(2) = 0.79, slope = 0.87; least mean square method: R(2) = 0.67, slope = 0.83). Bilateral t tests for matched variables did not identify a significant difference between the two imaging methods (P > .1). Both deconvolution methods were equivalent (P > .1). The choice of the reference artery is a major concern and has a strong influence on the final perfusion CT CBF map. CONCLUSION: Perfusion CT studies of CBF achieved with adequate acquisition parameters and processing lead to accurate and reliable results.
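A minimal sketch of truncated-SVD deconvolution of a tissue time-attenuation curve with an arterial input function (AIF), the operation used above to derive CBF maps from perfusion CT. The toy curves, time step and truncation threshold are assumptions; absolute CBF additionally requires density/haematocrit scaling factors not shown here.

```python
import numpy as np

# Sketch of SVD deconvolution for perfusion CT; the clinical pipeline and
# parameters differ, this only illustrates the principle.

def svd_deconvolution(aif, tissue, dt, rel_threshold=0.2):
    n = len(aif)
    # lower-triangular convolution matrix built from the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.zeros_like(s)
    keep = s > rel_threshold * s[0]            # truncate small singular values
    s_inv[keep] = 1.0 / s[keep]
    residue = Vt.T @ (s_inv * (U.T @ tissue))  # ~ CBF * residue function R(t)
    return residue.max()                       # flow estimate

# noiseless toy example: gamma-variate-like AIF, boxcar residue function
t = np.arange(0.0, 40.0, 1.0)
aif = t ** 3 * np.exp(-t / 1.5)
true_flow, mtt = 0.01, 4.0
tissue = np.convolve(aif, true_flow * (t < mtt))[:len(t)] * 1.0   # dt = 1 s
print(svd_deconvolution(aif, tissue, dt=1.0, rel_threshold=0.05))
# should be on the order of the assumed flow of 0.01 (truncation biases it low)
```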

Relevance: 30.00%

Abstract:

We propose a new approach and related indicators for globally distributed software support and development based on a 3-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying the domain knowledge from operations management on lead time reduction and its multiple benefits to process performance, the workflows of globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that the global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. Due to the new performance indicators based on lead times and their variation with fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
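A hypothetical sketch of the kind of lead-time indicator described above: end-to-end lead time per support ticket, with its median and variation. Field names, the date format and the ticket data are assumptions, not the case company's actual reporting schema.

```python
from datetime import datetime
from statistics import median, pstdev

# Assumed illustration of lead-time indicators for a multitier support process.

def lead_time_days(opened, closed, fmt="%Y-%m-%d"):
    return (datetime.strptime(closed, fmt) - datetime.strptime(opened, fmt)).days

tickets = [("2009-01-05", "2009-01-19"),
           ("2009-01-07", "2009-01-12"),
           ("2009-02-01", "2009-02-20")]

lead_times = [lead_time_days(o, c) for o, c in tickets]
print("median lead time:", median(lead_times), "days")
print("variation (population std dev):", round(pstdev(lead_times), 1), "days")
```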

Relevance: 30.00%

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher the past. Since they know that continents are moving, geologists have been trying to retrieve the distribution of the continents through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are included in a larger set comprising, all at once, oceanic and continental crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea still does not receive sufficient echo among the reconstruction community. However, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view to, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue.
Using assemblies of continents (referred to as "key assemblies") as anchors distributed over the whole scope of our study (ranging from Eocene to Cambrian times), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a hardly surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are modern informatics tools specifically devoted to storing, analyzing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools we both facilitated the reconstruction work (task automation) and enhanced the model (by greatly increasing the kinematic control of plate motions through plate velocity models). Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset we are now able to tackle major issues of modern geology (such as global sea-level variations and climate change). We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, all along the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band going from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Basically, this means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpreted it as the potential secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model and we attached particular interest to reconstructing it in great detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages we know. We also reconstruct the margins (active or passive), mid-ocean ridges and intra-oceanic subductions. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
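A minimal sketch of the elementary operation behind such reconstructions: applying a finite rotation about an Euler pole to a point on the sphere (Rodrigues' rotation formula). The pole, angle and point below are made-up values, not rotations taken from the PaleoDyn model.

```python
import numpy as np

# Illustrative finite rotation about an Euler pole; inputs are hypothetical.

def to_xyz(lat, lon):
    lat, lon = np.radians([lat, lon])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def rotate(point_latlon, pole_latlon, angle_deg):
    p, k = to_xyz(*point_latlon), to_xyz(*pole_latlon)
    a = np.radians(angle_deg)
    r = p * np.cos(a) + np.cross(k, p) * np.sin(a) + k * np.dot(k, p) * (1 - np.cos(a))
    return np.degrees(np.arcsin(r[2])), np.degrees(np.arctan2(r[1], r[0]))

# rotate a point at 10N, 30W by 20 degrees about a hypothetical Euler pole
print(rotate((10.0, -30.0), (45.0, 120.0), 20.0))
```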

Relevance: 30.00%

Abstract:

After incidentally learning about a hidden regularity, participants can either continue to solve the task as instructed or, alternatively, apply a shortcut. Past research suggests that the amount of conflict implied by adopting a shortcut seems to bias the decision for vs. against continuing instruction-coherent task processing. We explored whether this decision might transfer from one incidental learning task to the next. Theories that conceptualize strategy change in incidental learning as a learning-plus-decision phenomenon suggest that high demands to adhere to instruction-coherent task processing in Task 1 will impede shortcut usage in Task 2, whereas low control demands will foster it. We sequentially applied two established incidental learning tasks differing in stimuli, responses and hidden regularity (the alphabet verification task followed by the serial reaction task, SRT). While some participants experienced a complete redundancy in the task material of the alphabet verification task (low demands to adhere to instructions), for others the redundancy was only partial. Thus, shortcut application would have led to errors (high demands to follow instructions). The low control demand condition showed the strongest usage of the fixed and repeating sequence of responses in the SRT. The transfer results are in line with the learning-plus-decision view of strategy change in incidental learning, rather than with resource theories of self-control.

Relevance: 30.00%

Abstract:

Snow cover is an important control in mountain environments, and a shift of the snow-free period triggered by climate warming can strongly impact ecosystem dynamics. Changing snow patterns can have severe effects on alpine plant distribution and diversity. It thus becomes urgent to provide spatially explicit assessments of snow cover changes that can be incorporated into correlative or empirical species distribution models (SDMs). Here, we provide for the first time a comparison of two physically based snow distribution models (PREVAH and SnowModel) used to produce snow cover maps (SCMs) at a fine spatial resolution in a mountain landscape in Austria. The SCMs were evaluated against SPOT-HRVIR images, and the predictions of snow water equivalent from the two models against ground measurements. Finally, the SCMs of the two models were compared under a climate warming scenario for the end of the century. The predictive performances of PREVAH and SnowModel were similar when validated with the SPOT images. However, the tendency to overestimate snow cover was slightly lower with SnowModel during the accumulation period, whereas it was lower with PREVAH during the melting period. The rate of true positives during the melting period was on average two times higher with SnowModel, with a lower overestimation of snow water equivalent. Our results allow us to recommend the use of SnowModel in SDMs because it better captures persisting snow patches at the end of the snow season, which is important when modelling the response of species to long-lasting snow cover and evaluating whether they might survive under climate change.
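A sketch of the validation metrics implied above: true-positive rate and overestimation of a modelled binary snow cover map against a SPOT-derived reference map. The array names and the binary encoding (1 = snow) are assumptions, not the study's actual workflow.

```python
import numpy as np

# Assumed illustration of confusion-matrix scores for binary snow cover maps.

def snow_map_scores(modelled, observed):
    modelled, observed = np.asarray(modelled, bool), np.asarray(observed, bool)
    tp = np.sum(modelled & observed)
    fp = np.sum(modelled & ~observed)
    tpr = tp / max(np.sum(observed), 1)             # observed snow captured
    overestimation = fp / max(np.sum(modelled), 1)  # modelled snow absent in SPOT
    return tpr, overestimation

observed = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
modelled = np.array([[1, 1, 1], [0, 1, 0], [0, 0, 0]])
print(snow_map_scores(modelled, observed))   # (1.0, 0.25)
```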

Relevance: 30.00%

Abstract:

Bark beetle outbreaks have a devastating effect on economically important forests worldwide, thus requiring extensive application of management control strategies. The presence of unmanaged protected areas in close proximity to managed forests can raise concerns that bark beetle infestations may spread from unmanaged into managed stands. We studied the impact of differential management of forest stands on the dispersal dynamics of the European spruce bark beetle, Ips typographus, making use of inferential population genetics on mitochondrial and nuclear genomes. Bayesian inferences of migration rates and a most parsimonious dispersal tree show that outgoing gene flow was consistently higher from managed to unmanaged areas. The likely reason is the thorough removal of potential breeding material in managed forests, and thus the dispersal of the base stock of beetles from these areas to unmanaged areas where breeding material is available. Our study suggests that the potential threat posed by unmanaged forests to managed forests with regard to I. typographus infestation needs to be carefully reconsidered.