888 results for Supervisory Control and Data Acquisition (SCADA) Topology


Relevance:

100.00%

Publisher:

Abstract:

Recently, an ever increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs in design, realization and maintenance. This trend towards complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy boxed products such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator in charge of the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time the diagnostic information that supports machine maintenance operations. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have traditionally been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very “unstructured” way. No clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), again leading to a deep confusion between the functional and technological views. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been adopting this approach, as testified by some IEC standards (IEC 61131-3, IEC 61499) which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years there has been considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also handle other important duties, such as diagnosis and fault isolation, recovery, and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, complex systems such as AMS contain, together with reliable mechanical elements, an increasing number of electronic devices, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigms applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
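The discrete-event dynamics mentioned above can be made concrete with a minimal finite-automaton sketch; the states, events and working-station cycle below are hypothetical illustrations, not models taken from the thesis.

```python
# Illustrative sketch: logic control modelled as a discrete-event system,
# i.e. a finite automaton over an event alphabet. All names are hypothetical.

class Automaton:
    def __init__(self, initial, transitions, marked):
        self.state = initial
        self.transitions = transitions  # (state, event) -> next state
        self.marked = marked            # "task completed" states

    def step(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            # an event not enabled in the current state is a specification violation
            raise ValueError(f"event '{event}' not enabled in state '{self.state}'")
        self.state = self.transitions[key]
        return self.state

# A toy working-station cycle: idle -> loaded -> working -> loaded_out -> idle
station = Automaton(
    initial="idle",
    transitions={
        ("idle", "load"): "loaded",
        ("loaded", "start"): "working",
        ("working", "done"): "loaded_out",
        ("loaded_out", "unload"): "idle",
    },
    marked={"idle"},
)

for ev in ["load", "start", "done", "unload"]:
    station.step(ev)
print(station.state in station.marked)  # True: one full cycle completed
```

In this formalism a supervisory controller enables or disables events so that only transitions defined in the model can occur, which is what makes formal verification of the logic possible.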

Relevance:

100.00%

Publisher:

Abstract:

Hybrid vehicles (HVs), which combine a conventional ICE-based powertrain with a secondary energy source that is also converted into mechanical power, represent a well-established alternative for substantially reducing both the fuel consumption and the tailpipe emissions of passenger cars. Several HV architectures are either being studied or already available on the market, e.g. Mechanical, Electric, Hydraulic and Pneumatic Hybrid Vehicles. Among these, the Electric (HEV) and Mechanical (HSF-HV) parallel hybrid configurations are examined throughout this Thesis. To fully exploit the potential of HVs, the choice of the hybrid components to be installed must be properly designed, while an effective Supervisory Control must be adopted to coordinate how the different power sources are managed and how they interact. Real-time controllers can be derived starting from the optimal benchmark results obtained offline. However, the application of these powerful instruments requires a simplified and yet reliable and accurate model of the hybrid vehicle system. This can be a complex task, especially as the complexity of the system grows, as for the HSF-HV system assessed in this Thesis. The first task of the following dissertation is to establish the optimal modeling approach for an innovative and promising mechanical hybrid vehicle architecture. It will be shown how the chosen modeling paradigm can affect the quality of the solution and the computational effort it requires, using an optimization technique based on Dynamic Programming. The second goal concerns the control of pollutant emissions in a parallel Diesel HEV. The emissions level obtained under real-world driving conditions is substantially higher than the usual result obtained over a homologation cycle.
For this reason, an on-line control strategy capable of guaranteeing that the desired emissions level is respected, while minimizing fuel consumption and avoiding excessive battery depletion, is the target of the corresponding section of the Thesis.
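As a sketch of the Dynamic Programming benchmark idea, the toy problem below optimizes the engine/battery power split over a short drive cycle by backward recursion over a state-of-charge grid. The demand profile, fuel map, battery model and grid are all invented for illustration; they are not the Thesis models.

```python
# Backward Dynamic Programming on a toy parallel-hybrid energy-management
# problem (all numbers are hypothetical assumptions).

SOC = [round(0.30 + 0.05 * i, 2) for i in range(9)]  # state-of-charge grid
DEMAND = [10, 20, 15, 25, 10]                        # kW requested per step
INF = float("inf")

def fuel_cost(p_engine):
    # convex toy fuel map: idling penalty plus quadratic growth
    return 0.0 if p_engine <= 0 else 1.0 + 0.02 * p_engine ** 2

def solve(soc_final_min=0.50):
    # V[s] = minimal fuel from the current step to the end, given SOC s;
    # terminal condition enforces a minimum final state of charge
    V = {s: (0.0 if s >= soc_final_min else INF) for s in SOC}
    for p_dem in reversed(DEMAND):
        V_new = {}
        for s in SOC:
            best = INF
            for p_batt in (0, 10, 20):            # battery share of demand, kW
                if p_batt > p_dem:
                    continue
                s2 = round(s - 0.005 * p_batt, 2)  # 10 kW for one step = 0.05 SOC
                if s2 in V and V[s2] < INF:
                    best = min(best, fuel_cost(p_dem - p_batt) + V[s2])
            V_new[s] = best
        V = V_new
    return V

V = solve()
print(V[0.7])  # minimal toy fuel cost starting from SOC = 0.70
```

A real application replaces the toy fuel map and battery dynamics with validated vehicle models, which is exactly where the choice of modeling paradigm affects both the quality of the solution and the computational effort.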

Relevance:

100.00%

Publisher:

Abstract:

The aging process is characterized by a progressive fitness decline experienced at all levels of physiological organization, from single molecules up to the whole organism. Studies have confirmed inflammaging, a chronic low-level inflammation, as a deeply intertwined partner of the aging process, which may provide the “common soil” upon which age-related diseases develop and flourish. Thus, although inflammation per se is a physiological process, it can rapidly become detrimental if it goes out of control, causing an excess of local and systemic inflammatory response, a striking risk factor for the elderly population. Developing interventions to counteract the establishment of this state is thus a top priority. Diet, among other factors, is a good candidate for regulating inflammation. Building on this consideration, the EU project NU-AGE is now trying to assess whether a Mediterranean diet, fortified for the needs of the elderly population, may help in modulating inflammaging. To do so, NU-AGE enrolled a total of 1250 subjects, half of whom followed a 1-year-long diet, and characterized them by means of the most advanced omics and non-omics analyses. The aim of this thesis was the development of a solid data management pipeline able to cope efficiently with the results of these assays, which are now flowing into a centralized database, ready to be used to test the most disparate scientific hypotheses. At the same time, the work described here encompasses the data analysis of the GEHA project, which was focused on identifying the genetic determinants of longevity, with a particular focus on developing and applying a method for detecting epistatic interactions in human mtDNA. Finally, in an effort to propel the adoption of NGS technologies in everyday pipelines, we developed an NGS variant-calling pipeline devoted to solving all the sequencing-related issues of the mtDNA.

Relevance:

100.00%

Publisher:

Abstract:

This study was the final stage of a four-year study of managerial behaviour and company performance in Bulgaria and examined the influence of changing ownership and control structures of companies on managerial behaviour and initiative. It provides a theoretical summary of the specific types of ownership, control, governance structures and managerial strategies in the Bulgarian transitional economy during 1992-1996. It combines two theoretical approaches: the property-rights approach, to show the concentrated property-rights structure and the private and majority types of control as determinants of efficient enterprise risk bearing and constrained managerial discretion, and the agency theory approach, to reveal the efficient role of direct non-market governance mechanisms over managers. Mr. Peev also used empirical information collected from the Central Statistical Office in Bulgaria, three different enterprise investigations of corporatised state-owned enterprises between 1992 and 1994, and his own database of privatised and private de novo industrial companies in 1996-1996. The project gives a detailed description of the main property-rights structures in Bulgaria at the present time and of the various control structures related to these. It found that there is a strong owner type of control in private and privatised firms, although, contrary to expectations, 100% state-owned enterprises tended to be characterised by a separation of ownership from control, leaving scope for managerial discretion. Mr. Peev predicts that after the forthcoming mass privatisation many companies will acquire a dispersed ownership structure, and there will be a greater separation of ownership from control and potentially inefficient managerial behaviour. The next aspect considered in detail was governance structures and the influence of the generally unstable macroeconomic environment in the country during the period in question. In examining managerial strategies, Mr. Peev divided the years since 1990 into three periods. Even in the first period (1990-1992) there were some signs of a more efficient role for managers, and between 1992 and 1994 the picture of control structures and managerial behaviour in state-owned companies became more diversified. The managerial strategies identified included: managerial initiatives for privatisation, where managers took the initiative in resolving problems of property rights and introducing restructuring measures and privatisation proposals; managerial initiatives for restructuring without privatisation; and passive adjustment and passive management, where managers seek outside services for marketing, finance management, etc. in order to adjust to the new environment. During 1995-1996 some similarities and differences between the managerial behaviour of privatised and state-owned firms emerged. Firstly, the former have undergone many changes in investment and technology, while managers of state-owned companies have changed little in this field, indicating that the private property-rights structure is more efficient for the long-term adaptation of enterprises. In the area of strategies relating to product quality, marketing and pricing policy there was little difference between managers of private, privatised and state-owned firms. The most passive managerial behaviour was found in non-incorporated state-owned firms, although these have only an insignificant stake in the economy.

Relevance:

100.00%

Publisher:

Abstract:

Investigators interested in whether a disease aggregates in families often collect case-control family data, which consist of disease status and covariate information for families selected via case or control probands. Here, we focus on the use of case-control family data to investigate the relative contributions to the disease of additive genetic effects (A), shared family environment (C), and unique environment (E). To this end, we describe an ACE model for binary family data and then introduce an approach to fitting the model to case-control family data. The structural equation model, which has been described previously, combines a general-family extension of the classic ACE twin model with a (possibly covariate-specific) liability-threshold model for binary outcomes. Our likelihood-based approach to fitting involves conditioning on the proband’s disease status, as well as setting the prevalence equal to a pre-specified value that can be estimated from the data themselves if necessary. Simulation experiments suggest that our approach to fitting yields approximately unbiased estimates of the A, C, and E variance components, provided that certain commonly made assumptions hold. These assumptions include: the usual assumptions for the classic ACE and liability-threshold models; assumptions about shared family environment for relative pairs; and assumptions about the case-control family sampling, including single ascertainment. When our approach is used to fit the ACE model to Austrian case-control family data on depression, the resulting estimate of heritability is very similar to those from previous analyses of twin data.
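For intuition only: under the classic ACE twin model, the expected correlations on the standardized liability scale are r_MZ = a² + c² and r_DZ = a²/2 + c², which invert to Falconer's simple estimates. This is a back-of-envelope alternative to the likelihood-based fitting described above, and the correlations used below are invented.

```python
# Classic ACE twin decomposition on standardized liability (toy sketch, not
# the paper's fitting procedure). Input correlations are made-up numbers.

def falconer_ace(r_mz, r_dz):
    a2 = 2.0 * (r_mz - r_dz)  # additive genetic variance (A), i.e. heritability
    c2 = 2.0 * r_dz - r_mz    # shared family environment (C)
    e2 = 1.0 - r_mz           # unique environment (E)
    return a2, c2, e2

a2, c2, e2 = falconer_ace(r_mz=0.6, r_dz=0.4)
print(round(a2, 3), round(c2, 3), round(e2, 3))  # 0.4 0.2 0.4
```

The likelihood approach of the abstract additionally handles binary outcomes via the liability threshold, covariates, general family structures, and the case-control ascertainment, none of which the simple inversion above accounts for.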

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Interaction refers to the situation in which the effect of 1 exposure on an outcome differs across strata of another exposure. We did a survey of epidemiologic studies published in leading journals to examine how interaction is assessed and reported. METHODS: We selected 150 case-control and 75 cohort studies published between May 2001 and May 2007 in leading general medicine, epidemiology, and clinical specialist journals. Two reviewers independently extracted data on study characteristics. RESULTS: Of the 225 studies, 138 (61%) addressed interaction. Among these, 25 (18%) presented no data or only a P value or a statement of statistical significance; 40 (29%) presented stratum-specific effect estimates but no meaningful comparison of these estimates; and 58 (42%) presented stratum-specific estimates and appropriate tests for interaction. Fifteen articles (11%) presented the individual effects of both exposures and also their joint effect or a product term, providing sufficient information to interpret interaction on an additive and multiplicative scale. Reporting was poorest in articles published in clinical specialist journals and most adequate in articles published in general medicine journals, with epidemiology journals in an intermediate position. CONCLUSIONS: A majority of articles reporting cohort and case-control studies address possible interactions between exposures. However, in about half of these, the information provided was unsatisfactory, and only 1 in 10 studies reported data that allowed readers to interpret interaction effects on an additive and multiplicative scale.
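The additive and multiplicative scales mentioned in the conclusions can be computed directly from the individual and joint effect estimates, which is why reporting all three is sufficient for readers. A small sketch with hypothetical relative risks:

```python
# Assessing interaction on both scales from the individual effects (rr10, rr01)
# and the joint effect (rr11) of two exposures; the numbers are hypothetical.

def interaction_measures(rr10, rr01, rr11):
    # additive scale: relative excess risk due to interaction (RERI);
    # RERI > 0 indicates super-additive interaction
    reri = rr11 - rr10 - rr01 + 1.0
    # multiplicative scale: joint effect vs the product of individual effects;
    # ratio > 1 indicates super-multiplicative interaction
    mult = rr11 / (rr10 * rr01)
    return reri, mult

reri, mult = interaction_measures(rr10=2.0, rr01=3.0, rr11=7.0)
print(reri, round(mult, 3))  # additive interaction present, ratio close to 1
```

Note how the same three estimates can show interaction on one scale (here, RERI = 3.0) while being nearly exactly multiplicative on the other, which is why the scale must always be stated.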

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing services have emerged as an essential component of the Enterprise IT infrastructure. Migration towards a full-range, large-scale convergence of Cloud and network services has become the current trend for addressing the requirements of the Cloud environment. Our approach takes the infrastructure-as-a-service paradigm to build converged virtual infrastructures, which allow offering tailored performance and enable multi-tenancy over a common physical infrastructure. Thanks to virtualization, new exploitation activities of the physical infrastructures may arise for both transport network and Data Centre services. This approach makes the network and Data Centre resources dedicated to Cloud computing converge on the same flexible and scalable level. The work presented here is based on the automation of the virtual infrastructure provisioning service. On top of the virtual infrastructures, a coordinated operation and control of the different resources is performed with the objective of automatically tailoring connectivity services to the Cloud service dynamics. Furthermore, in order to support the elasticity of Cloud services through the optical network, dynamic re-planning features have been added to the virtual infrastructure service, allowing existing virtual infrastructures to be scaled up or down to optimize resource utilisation and dynamically adapt to users’ demands. Thus, the dynamic re-planning of the service becomes a key component for coordinating Cloud and optical network resources optimally in terms of resource utilisation. The presented work is complemented with a use case in which the virtual infrastructure service is adopted in a distributed Enterprise Information System that scales up and down as a function of the application requests.
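As a toy illustration of the re-planning idea, the sketch below scales the bandwidth plan of a virtual link up or down with observed utilisation. The function name, thresholds and step size are hypothetical, not part of the described service.

```python
# Threshold-based re-planning sketch (hypothetical policy): scale a virtual
# link up under congestion, release capacity when it sits idle.

def replan(current_bw_gbps, utilisation, high=0.8, low=0.3, step=10):
    """Return the bandwidth (Gb/s) to request for a virtual link."""
    if utilisation > high:                  # congested: scale the link up
        return current_bw_gbps + step
    if utilisation < low and current_bw_gbps > step:
        return current_bw_gbps - step       # idle: shrink to free resources
    return current_bw_gbps                  # within band: keep the current plan

print(replan(40, 0.9))  # 50
print(replan(40, 0.1))  # 30
print(replan(40, 0.5))  # 40
```

The actual service coordinates such decisions across the optical network and Data Centre resources rather than per link in isolation, which is the point of the converged virtual infrastructure.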

Relevance:

100.00%

Publisher:

Abstract:

This paper introduces an area- and power-efficient approach for the compressive recording of cortical signals in an implantable system prior to transmission. Recent research on compressive sensing has shown promising results for sub-Nyquist sampling of sparse biological signals. Still, any large-scale implementation of this technique faces critical issues caused by the increased hardware intensity. The cost of implementing compressive sensing in a multichannel system, in terms of area usage, can be significantly higher than that of a conventional data acquisition system without compression. To tackle this issue, a new multichannel compressive sensing scheme is proposed which exploits the spatial sparsity of the signals recorded from the electrodes of the sensor array. The analysis shows that with this method the power efficiency is preserved to a great extent while the area overhead is significantly reduced, resulting in an improved power-area product. The proposed circuit architecture is implemented in a UMC 0.18 µm CMOS technology. Extensive performance analysis and design optimization have been carried out, resulting in a low-noise, compact and power-efficient implementation. The results of simulations and subsequent reconstructions show the possibility of recovering fourfold-compressed intracranial EEG signals with an SNR as high as 21.8 dB, while consuming 10.5 µW of power within an effective area of 250 µm × 250 µm per channel.
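The reconstruction side of compressive sensing can be illustrated with a deliberately tiny example: a 1-sparse signal is measured with fewer random projections than samples and then located by a single matching-pursuit step. Everything below (dimensions, seed, signal) is invented for illustration and has nothing to do with the paper's circuit.

```python
# Minimal compressive-sensing sketch: y = A x with M < N, x 1-sparse in the
# identity basis, recovered by one matching-pursuit step (toy example).

import random

random.seed(0)
N, M = 16, 6                       # signal length, number of measurements
x = [0.0] * N
x[11] = 3.0                        # unknown 1-sparse signal

A = [[random.gauss(0, 1) for _ in range(N)] for _ in range(M)]  # sensing matrix
y = [sum(A[i][j] * x[j] for j in range(N)) for i in range(M)]   # measurements

def col(j):
    return [A[i][j] for i in range(M)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# matched filter: the column most correlated with y locates the nonzero sample
scores = [abs(dot(col(j), y)) / dot(col(j), col(j)) ** 0.5 for j in range(N)]
support = scores.index(max(scores))
amplitude = dot(col(support), y) / dot(col(support), col(support))

print(support, round(amplitude, 6))  # recovers index 11 and amplitude 3.0
```

Real cortical recordings are sparse in a transform basis rather than the identity, and recovery uses full greedy or convex solvers over many measurements; the principle of sampling below Nyquist and exploiting sparsity at reconstruction is the same.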

Relevance:

100.00%

Publisher:

Abstract:

With more experience in the labor market, some job characteristics increase and some decrease. For example, among young employees who have just entered the labor market, job control may initially be low but increase with more routine and experience. Job control is a job resource that is valued in itself and is positively associated with job satisfaction; job control also helps in dealing with stressors at work. There is little research on correlated changes, but the existing evidence suggests a joint development over time. However, even less is known about the relevance of such changes for employees. Research usually tends to use mean levels to predict mean levels in outcomes, but development in job control and stressors may be as relevant for job satisfaction as having a certain level in those job characteristics. Job satisfaction is typically regarded as a positive attitude towards one’s work. What has received less attention is that some employees may lower their expectations if their job situation does not reflect their needs, resulting in a resigned attitude towards their job. The present study investigates the development of job control and task-related stressors over ten years and tests the predictive value of changes in job control and task-related stressors for a resigned attitude towards one’s job. We used data from a Swiss panel study (N=356) spanning ten years. Job control, task-related stressors (an index consisting of time pressure, concentration demands, performance constraints, interruptions, and uncertainty about tasks), and resigned attitude towards one’s job were assessed in 1998, 1999, 2001, and 2008. Latent growth modeling revealed that the growth rates of job control and task-related stressors were not correlated with one another. We predicted resigned attitude towards one’s job in 2008 a) by initial levels, and b) by changes in job control and stressors, controlling for resigned attitude in 1998.
There was some prediction by initial levels (job control: β = -.15, p < .05; task-related stressors: β = .12, p = .06). However, as expected, changes in control and stressors predicted resigned attitude much better, with β = -.37, p < .001, for changes in job control, and β = .31, p < .001, for changes in task-related stressors. Our data confirm the importance of low levels of task-related stressors and high levels of job control for job attitudes. However, the development of these job characteristics seems even more important than their initial levels.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE Vitamin D (D₃) status is reported to correlate negatively with insulin production and insulin sensitivity in patients with type 2 diabetes mellitus (T2DM). However, few placebo-controlled intervention data are available. We aimed to assess the effect of a large dose of parenteral D₃ on glycosylated haemoglobin (HbA1c) and estimates of insulin action (homeostasis model assessment insulin resistance: HOMA-IR) in patients with stable T2DM. MATERIALS AND METHODS We performed a prospective, randomised, double-blind, placebo-controlled pilot study at a single university care setting in Switzerland. Fifty-five patients of both genders with T2DM of more than 10 years' duration were enrolled and randomised to either 300,000 IU D₃ or placebo, intramuscularly. The primary endpoint was the intergroup difference in HbA1c levels. Secondary endpoints were: changes in insulin sensitivity, albuminuria, calcium/phosphate metabolism, activity of the renin-aldosterone axis, and changes in 24-hour ambulatory blood pressure values. RESULTS After 6 months of D₃ supply, there was a significant intergroup difference in the change in HbA1c levels (relative change [mean ± standard deviation] +2.9% ± 1.5% in the D₃ group vs +6.9% ± 2.1% in the placebo group, p = 0.041), while HOMA-IR decreased by 12.8% ± 5.6% in the D₃ group and increased by 10% ± 5.4% in the placebo group (intergroup difference, p = 0.032). Twenty-four-hour urinary albumin excretion decreased in the D₃ group (from 200 ± 41 to 126 ± 39, p = 0.021). There was no significant intergroup difference for the other secondary endpoints. CONCLUSIONS D₃ improved insulin sensitivity (based on HOMA-IR) and positively affected the course of HbA1c compared with placebo in patients with T2DM.
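HOMA-IR itself is the standard one-line Matthews formula from fasting glucose and fasting insulin; the patient values below are invented, not from the study.

```python
# HOMA-IR: homeostasis model assessment of insulin resistance.
# Units: glucose in mmol/L, insulin in microU/mL; 22.5 is the model constant.

def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

print(round(homa_ir(5.5, 10.0), 2))  # 2.44
```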

Relevance:

100.00%

Publisher:

Abstract:

Sample preparation procedures for AMS measurements of 129I and 127I in environmental materials, and some methodological aspects of quality assurance, are discussed. Measurements from analyses of some pre-nuclear soil and thyroid gland samples and from a systematic investigation of natural waters in Lower Saxony, Germany, are described. Although the lowest 129I/127I ratios observed so far in soils and thyroid glands were obtained, they are still suspected of being affected by contamination, since they are significantly higher than the pre-nuclear equilibrium ratio in the marine hydrosphere. A survey of all available 129I/127I isotopic ratios in precipitation shows a dramatic increase until the middle of the 1980s and a stabilization since 1987 at high isotopic ratios of about (3.6–8.3)×10⁻⁷. In surface waters, ratios of (57–380)×10⁻¹⁰ are measured, while shallow ground waters show significantly lower values of (1.3–200)×10⁻¹⁰ with a much larger spread. The data for 129I in soils and in precipitation are used to estimate pre-nuclear and modern 129I deposition densities.
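A wet deposition density of the kind estimated above follows from an isotopic ratio in precipitation, the stable-iodine concentration and the annual precipitation amount. The sketch below uses invented values, not the paper's measurements:

```python
# Back-of-envelope 129I wet deposition (hypothetical inputs, for illustration).

N_A = 6.022e23  # Avogadro's number, atoms/mol

def i129_deposition(ratio_129_127, i127_ug_per_l, precip_mm_per_yr):
    """Atoms of 129I deposited per m^2 per year via precipitation."""
    litres_per_m2 = precip_mm_per_yr                    # 1 mm rain = 1 L/m^2
    i127_atoms_per_l = i127_ug_per_l * 1e-6 / 127.0 * N_A
    return ratio_129_127 * i127_atoms_per_l * litres_per_m2

# e.g. ratio 5e-7, 2 ug/L stable iodine, 700 mm/yr of precipitation
print(f"{i129_deposition(5e-7, 2.0, 700):.2e} atoms 129I per m^2 per year")
```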

Relevance:

100.00%

Publisher:

Abstract:

Academic and industrial research in the late 1990s brought about an exponential explosion of DNA sequence data. Automated expert systems are being created to help biologists extract patterns, trends and links from this ever-deepening ocean of information. Two such systems, aimed at retrieving and subsequently utilizing phylogenetically relevant information, have been developed in this dissertation, the major objective of which was to automate the often difficult and confusing phylogenetic reconstruction process. Popular phylogenetic reconstruction methods, such as distance-based methods, attempt to find an optimal tree topology (one that reflects the relationships among related sequences and their evolutionary history) by searching through the topology space. Various compromises between the fast (but incomplete) and exhaustive (but computationally prohibitive) search heuristics have been suggested. An intelligent compromise algorithm that relies on a flexible “beam” search principle from the Artificial Intelligence domain, and uses pre-computed local topology reliability information to adjust the beam search space continuously, is described in the second chapter of this dissertation. However, sometimes even a (virtually) complete distance-based method is inferior to the significantly more elaborate (and computationally expensive) maximum likelihood (ML) method. In fact, depending on the nature of the sequence data in question, either method might prove to be superior. Therefore, it is difficult (even for an expert) to tell a priori which phylogenetic reconstruction method (distance-based, ML, or perhaps maximum parsimony, MP) should be chosen for any particular data set. A number of factors, often hidden, influence the performance of a method. For example, it is generally understood that for a phylogenetically “difficult” data set, more sophisticated methods (e.g., ML) tend to be more effective and thus should be chosen.
However, it is the interplay of many factors that one needs to consider in order to avoid choosing an inferior method (potentially a costly mistake, both in terms of computational expense and in terms of reconstruction accuracy). Chapter III of this dissertation details a phylogenetic reconstruction expert system that automatically selects a proper method. It uses a classifier (a Decision Tree-inducing algorithm) to map a new data set to the appropriate phylogenetic reconstruction method.
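The flexible beam-search principle can be sketched generically: at each step, only the `beam_width` best partial solutions are kept and expanded, a compromise between greedy and exhaustive search. The toy string-growing problem below merely stands in for topology search; it is not the dissertation's algorithm.

```python
# Generic beam search (toy scoring problem; lower score = better candidate).

def beam_search(initial, expand, score, beam_width, steps):
    """Keep the `beam_width` best candidates at every expansion step."""
    beam = [initial]
    for _ in range(steps):
        candidates = [c for state in beam for c in expand(state)]
        beam = sorted(candidates, key=score)[:beam_width]
    return beam[0]

# toy problem: grow a string one symbol at a time towards a target "sequence"
TARGET = "acgt"

def expand(s):
    return [s + ch for ch in "acgt"]

def score(s):
    return sum(a != b for a, b in zip(s, TARGET))  # mismatches so far

print(beam_search("", expand, score, beam_width=3, steps=4))  # acgt
```

In the dissertation's setting, the beam width is not fixed: the pre-computed local topology reliability information is used to widen or narrow the retained set continuously, spending search effort where the topology is least certain.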

Relevance:

100.00%

Publisher:

Abstract:

Extreme winter warming events in the sub-Arctic have caused considerable vegetation damage due to rapid changes in temperature and loss of snow cover. The frequency of extreme weather is expected to increase due to climate change, thereby increasing the potential for recurring vegetation damage in Arctic regions. Here we present data on vegetation recovery from one such natural event and from multiple experimental simulations in the sub-Arctic, using remote sensing, handheld passive proximal sensors and ground surveys. The normalized difference vegetation index (NDVI) recovered quickly (within 2 years) from the 26% decline that followed one natural extreme winter warming event. Recovery was associated with declines in dead Empetrum nigrum (the dominant dwarf shrub) in ground surveys. However, the NDVI of healthy E. nigrum leaves was also reduced (by 16%) following this winter warming event in experimental plots (both control and treatments), suggesting that non-obvious plant damage (i.e., physiological stress) had occurred in addition to the dead E. nigrum shoots that were considered responsible for the regional 26% NDVI decline. Plot- and leaf-level NDVI provided useful additional information that could not be obtained from vegetation surveys and regional remote sensing (MODIS) alone. The major damage of an extreme winter warming event appears to be relatively transitory. However, potential knock-on effects on higher trophic levels (e.g., rodents, reindeer, and bear) could be unpredictable and large. Repeated warming events year after year, which can be expected under winter climate warming, could result in damage that takes much longer to recover.
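NDVI is a simple band ratio of near-infrared and red reflectance; vegetation damage lowers NIR reflectance and raises red reflectance, pulling the index down. The reflectance values below are invented to show the mechanics, not measurements from the study.

```python
# Normalized difference vegetation index from NIR and red reflectance
# (illustrative reflectance values, not the study's data).

def ndvi(nir, red):
    return (nir - red) / (nir + red)

healthy = ndvi(0.50, 0.08)   # dense green canopy: high NIR, low red
stressed = ndvi(0.40, 0.15)  # damaged canopy: lower NIR, higher red
print(round(healthy, 3), round(stressed, 3))
print(round((healthy - stressed) / healthy * 100, 1), "% decline")
```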

Relevance:

100.00%

Publisher:

Abstract:

Ecosystems at high northern latitudes are subject to strong climate change. Soil processes such as the carbon and nutrient cycles, which determine the functioning of these ecosystems, are controlled by soil fauna. Assessing the responses of soil fauna communities to environmental change will therefore improve the predictability of climate change impacts on ecosystem functioning. For this purpose, trait assessment is a promising method compared to the traditional taxonomic approach, but it had not been applied before. In this study, the response of a sub-arctic soil Collembola community to long-term (16-year) climate manipulation by open-top chambers was assessed. The drought-susceptible Collembola community responded strongly to the climate manipulation, which substantially reduced soil moisture and slightly increased soil temperature. The total density of Collembola decreased by 51% and the average number of species was reduced from 14 to 12. Although the community assessment showed species-specific responses, taxonomically based community indices, species diversity and evenness, were not affected. However, morphological and ecological trait assessments were more sensitive in revealing community responses. Drought-tolerant, larger-sized, epiedaphic species survived the climate manipulation better than their counterparts, the meso-hydrophilic, smaller-sized and euedaphic species. Moreover, the trait assessment also explained the significant responses shown by four taxa. This study shows that trait analysis can both reveal the responses of a soil fauna community to climate change and improve the understanding of the mechanisms behind them.