988 results for Design Event


Relevance:

30.00%

Publisher:

Abstract:

Tumors are diverse and heterogeneous, but all share the ability to proliferate without control. Deregulated cell proliferation coupled with suppressed apoptotic sensitivity constitutes a minimal requirement for tumor evolution to occur. One of the most commonly used treatments is chemotherapy, which frequently relies on chemical compounds that induce DNA damage. Anticancer agents are effective only when tumor cells are more readily killed than the surrounding normal tissue, and their efficacy is partly determined by their ability to induce apoptosis. We have recently demonstrated that the protein RasGAP is an unconventional caspase substrate because it can induce both anti- and pro-apoptotic signals, depending on the extent of its cleavage by caspases. At low levels of caspase activity, RasGAP is cleaved, generating an N-terminal fragment (fragment N) and a C-terminal fragment (fragment C). Fragment N appears to be a general blocker of apoptosis downstream of caspase activation. At higher levels of caspase activity, the ability of fragment N to counteract apoptosis is suppressed when it is further cleaved. This latter cleavage event generates two fragments, N1 and N2, which, in contrast to fragment N, potently sensitize cancer cells to apoptosis induced by DNA-damaging agents. In the present study we show that a cell-permeable peptide derived from the N2 fragment of RasGAP, hereafter called TAT-RasGAP317-326, specifically sensitizes cancer cells to three different genotoxins commonly used in chemotherapy, both in vitro and in vivo. Importantly, this peptide appears to have no effect on non-cancerous cells. We have also begun to characterize the molecular mechanisms underlying the sensitizing function of TAT-RasGAP317-326. We have demonstrated that the p53 transcription factor and one of its transcriptional targets, Puma, are required for the activity of TAT-RasGAP317-326. We have also shown that TAT-RasGAP317-326 requires the presence of G3BP1, a RasGAP-binding protein, to potentiate the effect of the DNA-damaging drug cisplatin. The data obtained in this study show that it may be possible to increase the efficacy of current chemotherapies with a compound that raises the sensitivity of tumors to genotoxins, which could be beneficial for patients undergoing chemotherapy.

Relevance:

30.00%

Publisher:

Abstract:

The Spanish Government has established post-market environmental monitoring (PMEM) as mandatory for genetically modified (GM) crop varieties cultivated in Spain. In order to comply with this regulation, the effects of Bt maize varieties derived from the event MON810 on the predatory fauna were monitored for two years in northeast and central Spain. The study was carried out with a randomized block design in maize fields of 3-4 ha, in which the abundance of plant-dwelling predators and the activity-density of soil-dwelling predators in Bt vs. non-Bt near-isogenic varieties were compared. To this end, the plots were sampled by visual inspection of a fixed number of plants and by pitfall traps, 6 or 7 times throughout two seasons. No significant differences in predator densities on plants were found between Bt and non-Bt varieties. In the pitfall traps, significant differences between the two types of maize were found only for Staphylinidae, whose trap catches in non-Bt maize were higher than in Bt maize in central Spain. Based on the statistical power of the assays, surrogate arthropods for PMEM purposes are proposed: Orius spp. and Araneae for visual sampling, and Carabidae, Araneae, and Staphylinidae for pitfall trapping. The other predator groups recorded in the study, Nabis sp. and Coccinellidae in visual sampling and Dermaptera in pitfall trapping, yielded very low statistical power. To help establish a standardized protocol for PMEM of genetically modified crops, the effect size detectable with a power of 0.8 is given for each predator group.
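Since the proposal of surrogate arthropods rests on the statistical power of the assays, a short illustration of the underlying computation may help. This is a generic two-sample power calculation, a hedged sketch only: the sample size of 24 plot-level observations is a hypothetical stand-in, not a figure from the study.

```python
# Illustrative power calculation in the spirit of the monitoring study: solve
# for the smallest standardized difference in predator counts between Bt and
# non-Bt plots detectable with power 0.8. Sample sizes are hypothetical.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
detectable = analysis.solve_power(effect_size=None, nobs1=24, alpha=0.05, power=0.8)
print(f"minimal detectable effect size (Cohen's d): {detectable:.2f}")
```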

Relevance:

30.00%

Publisher:

Abstract:

Measurement is a fundamental research tool, so it is important that the measuring process is carried out correctly, without distorting the signal or the measured event. Research on thermoelectric phenomena has increasingly focused on transverse thermoelectric effects in recent decades. The transverse Seebeck effect makes it possible to produce thinner and faster heat flux sensors than before. Studies of the transverse Seebeck effect have so far concentrated on materials, so this Master's Thesis studies the instrumentation of a heat flux sensor based on the transverse Seebeck effect. It examines an equivalent circuit of transverse Seebeck effect heat flux sensors, their connection to electronics, and the selection and design of a suitable amplifier type. The research is carried out through a case study involving Gradient Heat Flux Sensors and an electric motor. In this work, a general equivalent circuit is presented for the transverse Seebeck effect-based heat flux sensor. An amplifier was designed for the sensor of the case study, and a solution was produced for measuring the local heat flux of the electric motor while improving electromagnetic compatibility.
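To make the instrumentation chain concrete, here is a minimal sketch of how a transverse-Seebeck heat flux reading could be post-processed, assuming the sensor output is proportional to the heat flux and is amplified before digitization. The sensitivity and gain values are hypothetical placeholders, not parameters from the thesis.

```python
# Minimal sketch of post-processing for a transverse-Seebeck heat flux sensor.
# Assumptions (hypothetical values, not from the thesis): the sensor produces a
# thermoelectric voltage proportional to the heat flux, V = S * q, and the
# signal is amplified with gain G before the ADC.

def heat_flux_from_voltage(v_measured: float, sensitivity: float, gain: float) -> float:
    """Recover heat flux q [W/m^2] from the amplified sensor voltage [V]."""
    return v_measured / (gain * sensitivity)

# Example: 10 uV/(W/m^2) sensitivity, gain of 1000, 0.25 V at the ADC input.
q = heat_flux_from_voltage(0.25, sensitivity=10e-6, gain=1000.0)
print(f"local heat flux = {q:.0f} W/m^2")  # 25000 W/m^2
```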

Relevance:

30.00%

Publisher:

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects: whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data, and unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting, and the use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to bias estimates from event history models, with low measurement accuracy affecting the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias; using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of these results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers developing methods to correct for non-sampling biases in event history data.
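To illustrate the IPCW idea evaluated in the simulation study, below is a minimal Python sketch: a dropout model estimates each respondent's probability of remaining in the panel, and a Kaplan-Meier estimator is fitted with inverse-probability weights. The toy data, the single covariate, and the use of static (rather than time-varying) weights are simplifying assumptions for illustration.

```python
# A minimal sketch of Inverse Probability of Censoring Weighting (IPCW) for a
# Kaplan-Meier estimator, in the spirit of the correction studied here. The
# variable names and the logistic dropout model are illustrative assumptions,
# not the study's actual specification.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "duration": [3, 7, 2, 9, 5, 4, 8, 6],      # unemployment spell length
    "event":    [1, 0, 1, 1, 0, 1, 0, 1],      # 1 = exit observed, 0 = censored
    "age":      [25, 40, 31, 52, 29, 45, 38, 60],
    "dropout":  [0, 1, 0, 0, 1, 0, 1, 0],      # attrited from the panel?
})

# Model the probability of *remaining* in the panel given covariates,
# then weight each respondent by the inverse of that probability.
m = LogisticRegression().fit(df[["age"]], df["dropout"])
p_stay = 1.0 - m.predict_proba(df[["age"]])[:, 1]
df["ipcw"] = 1.0 / np.clip(p_stay, 0.05, None)   # truncate extreme weights

kmf = KaplanMeierFitter()
kmf.fit(df["duration"], event_observed=df["event"], weights=df["ipcw"])
print(kmf.survival_function_)
```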

Relevance:

30.00%

Publisher:

Abstract:

In just-in-time, assemble-to-order production environments, the scheduling of material requirements and production tasks is difficult but of paramount importance. Various enterprise resource planning solutions with master scheduling functionality have been created to ease this problem, and they work as expected unless there is a problem in the material flow. This case-based candidate's thesis introduces a tool for the Microsoft Dynamics AX multisite environment that site managers and production coordinators can use to get an overview of the current open sales order base and to prioritize production in the event of material shortages, so as to avoid partial deliveries.
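A sketch of the kind of prioritization logic such a tool could apply is shown below, assuming hypothetical order and stock structures (the actual tool works against Microsoft Dynamics AX data, which is not modeled here): orders that can be shipped complete from available material are scheduled first, so partial deliveries are avoided.

```python
# Illustrative sketch (not the thesis tool itself): rank open sales orders for
# production when a component is short, favoring orders that can be shipped
# complete, so partial deliveries are avoided. All field names are hypothetical.
from dataclasses import dataclass

@dataclass
class SalesOrder:
    order_id: str
    due_date: str               # ISO date, used as a tie-breaker
    required: dict              # component -> quantity needed

def fillable(order: SalesOrder, stock: dict) -> bool:
    """Can the order be produced in full from current stock?"""
    return all(stock.get(c, 0) >= q for c, q in order.required.items())

def prioritize(orders: list, stock: dict) -> list:
    # Orders that can ship complete come first, earliest due date next.
    return sorted(orders, key=lambda o: (not fillable(o, stock), o.due_date))

orders = [
    SalesOrder("SO-001", "2015-05-10", {"valve": 4, "frame": 1}),
    SalesOrder("SO-002", "2015-05-08", {"valve": 9}),
]
stock = {"valve": 6, "frame": 2}
print([o.order_id for o in prioritize(orders, stock)])  # ['SO-001', 'SO-002']
```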

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the life-event approach has been widely used by governments all over the world for designing and providing web services to citizens through their e-government portals. Despite the wide usage of this approach, it remains a challenge to use it to design e-government portals that automatically provide personalised services to citizens. We propose a conceptual framework for e-government service provision based on the life-event approach and the use of a citizen profile to capture citizen needs, since the process of finding web services in a government-to-citizen (G2C) system involves understanding citizens' needs and demands, selecting the relevant services, and delivering services that match the requirements. The proposed framework, which incorporates the citizen profile, is based on three components that complement each other: anticipatory life events, non-anticipatory life events and recurring services.
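The three-component structure can be made concrete with a small sketch. Everything here (class names, the rule set, the matching function) is an illustrative assumption rather than the paper's actual specification; it only shows how a citizen profile could drive the selection of anticipatory life events, non-anticipatory life events and recurring services.

```python
# A minimal sketch of the three-component structure described above. The class
# and field names are illustrative assumptions, not the paper's specification.
from dataclasses import dataclass, field

@dataclass
class CitizenProfile:
    age: int
    employment_status: str
    subscriptions: list = field(default_factory=list)  # recurring services

LIFE_EVENTS = {
    # anticipatory: predictable from the profile (e.g. reaching pension age)
    "retirement": lambda p: p.age >= 64,
    # non-anticipatory: depends on facts the citizen reports (e.g. job loss)
    "unemployment": lambda p: p.employment_status == "unemployed",
}

def personalised_services(profile: CitizenProfile) -> list:
    """Match a citizen profile against life events and recurring services."""
    events = [name for name, rule in LIFE_EVENTS.items() if rule(profile)]
    return events + profile.subscriptions

p = CitizenProfile(age=66, employment_status="retired", subscriptions=["waste collection"])
print(personalised_services(p))  # ['retirement', 'waste collection']
```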

Relevance:

30.00%

Publisher:

Abstract:

Research into design methodology is one of the most challenging issues in the field of persuasive technology. However, the introduction of the Persuasive Systems Design model, and the consideration of the 3-Dimensional Relationship between Attitude and Behavior, offer to make persuasive technologies more practically viable. In this paper we demonstrate how the 3-Dimensional Relationship between Attitude and Behavior guides the analysis of the persuasion context in the Persuasive Systems Design model. As a result, we propose a modification of the persuasion context and assert that the technology should be analyzed as part of the strategy rather than as part of the event.

Relevance:

30.00%

Publisher:

Abstract:

One of the primary features of modern government-to-citizen (G2C) service provision is the ability to offer a citizen-centric view of the e-government portal. The life-event approach is one of the most widely adopted paradigms supporting the idea of solving a complex event in a citizen's life through a single service provision. Several studies have used this approach to design e-government portals, but they have been limited in terms of usability and scalability: there were no mechanisms showing how to specify a life-event for structuring public e-services, or how to systematically match life-events with these services while taking citizen needs into consideration. We introduce the NOrm-Based Life-Event (NoBLE) framework for G2C e-service provision, with a set of mechanisms serving as a guide for designing active life-event-oriented e-government portals.

Relevance:

30.00%

Publisher:

Abstract:

This article investigates the relation between stimulus-evoked neural activity and cerebral hemodynamics. Specifically, the hypothesis is tested that hemodynamic responses can be modeled as a linear convolution of experimentally obtained measures of neural activity with a suitable hemodynamic impulse response function. To obtain a range of neural and hemodynamic responses, the rat whisker pad was stimulated using brief (≤2 seconds) electrical stimuli consisting of single pulses (0.3 millisecond, 1.2 mA) combined both at different frequencies and in a paired-pulse design. Hemodynamic responses were measured using concurrent optical imaging spectroscopy and laser Doppler flowmetry, whereas neural responses were assessed through current source density analysis of multielectrode recordings from a single barrel. General linear modeling was used to deconvolve the hemodynamic impulse response to a single "neural event" from the hemodynamic and neural responses to stimulation. The model provided an excellent fit to the empirical data. The implications of these results for modeling schemes and for physiologic systems coupling neural and hemodynamic activity are discussed.
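The convolution hypothesis and GLM-based deconvolution described here lend themselves to a compact numerical sketch. The following Python fragment simulates sparse neural events, convolves them with a known impulse response, and recovers that response by least squares; the synthetic signal and noise level are assumptions for illustration, not the article's data.

```python
# Sketch of the linear-convolution hypothesis tested in the article: the
# hemodynamic response y(t) is modeled as neural activity n(t) convolved with
# an impulse response h(t). Here we recover h by least squares from simulated
# data; the signals are synthetic, not the article's recordings.
import numpy as np

rng = np.random.default_rng(0)
T, L = 300, 30                       # samples; impulse-response length

n = (rng.random(T) < 0.05).astype(float)          # sparse "neural events"
h_true = np.exp(-np.arange(L) / 8.0) * np.sin(np.arange(L) / 4.0)
y = np.convolve(n, h_true)[:T] + 0.01 * rng.standard_normal(T)

# Build the convolution (design) matrix X so that y = X @ h, as in a GLM.
X = np.column_stack([np.r_[np.zeros(k), n[: T - k]] for k in range(L)])
h_est, *_ = np.linalg.lstsq(X, y, rcond=None)     # deconvolved impulse response

print(np.allclose(h_est, h_true, atol=0.05))      # expect True: good linear fit
```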

Relevance:

30.00%

Publisher:

Abstract:

Architectural description languages (ADLs) are used to specify a high-level, compositional view of a software application, specifying how a system is to be composed from coarse-grain components. ADLs usually come equipped with a formal dynamic semantics, facilitating the specification and analysis of distributed and event-based systems. In this paper, we describe TrustME, an ADL framework that provides both a process view and a structural view of web service-based systems. We use Petri-net descriptions to give a dynamic view of business workflows for web service collaboration. We adapt Schmidt's approach to define a form of Meyer's design-by-contract for configuring workflow architectures. This serves as a configuration-level means of constructing safer, more robust systems.
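Because the dynamic view rests on Petri nets, a minimal token-game interpreter may help fix ideas. This is generic textbook Petri-net machinery sketched under our own naming, not part of TrustME.

```python
# A minimal Petri-net interpreter, sketching the kind of dynamic workflow view
# the paper builds on (generic textbook machinery, not TrustME itself).
# A transition fires when every input place holds enough tokens.

def enabled(marking: dict, pre: dict) -> bool:
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking: dict, pre: dict, post: dict) -> dict:
    """Consume tokens from input places, produce tokens in output places."""
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# Two-step workflow: receive an order, then invoke a web service.
m0 = {"order_received": 1}
t_invoke = ({"order_received": 1}, {"service_running": 1})
if enabled(m0, t_invoke[0]):
    m1 = fire(m0, *t_invoke)
    print(m1)  # {'order_received': 0, 'service_running': 1}
```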

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents the study and development of fault-tolerant techniques for programmable architectures, the well-known Field Programmable Gate Arrays (FPGAs), customizable by SRAM. FPGAs are becoming increasingly valuable for space applications because of their high density, high performance, reduced development cost and reprogrammability. In particular, SRAM-based FPGAs are very valuable for remote missions because they can be reprogrammed by the user as many times as necessary in a very short period. SRAM-based FPGAs and microcontrollers represent a wide range of components in space applications and are therefore the focus of this work, more specifically the Virtex® family from Xilinx and the architecture of the 8051 microcontroller from Intel. Triple Modular Redundancy (TMR) with voters is a common high-level technique to protect ASICs against single event upsets (SEUs), and it can also be applied to FPGAs. The TMR technique was first tested on the Virtex® FPGA architecture using a small design based on counters. Faults were injected in all sensitive parts of the FPGA and a detailed analysis of the effect of a fault in a TMR design synthesized on the Virtex® platform was performed. Results from fault injection and from a radiation ground test facility showed the efficiency of TMR for the case study circuit. Although TMR showed high reliability, the technique presents some limitations, such as area overhead, three times as many input and output pins and, consequently, a significant increase in power dissipation. Aiming to reduce the costs of TMR while preserving reliability, an innovative high-level technique for designing fault-tolerant systems in SRAM-based FPGAs was developed that requires no modification of the FPGA architecture. This technique combines time and hardware redundancy to reduce overhead and ensure reliability. It is based on duplication with comparison and concurrent error detection. The new technique proposed in this work was developed specifically for FPGAs, to cope with transient faults in the user combinational and sequential logic while also reducing pin count, area and power dissipation. The methodology was validated by fault injection experiments on an emulation board. The thesis presents comparative results on fault coverage, area and performance for the discussed techniques.
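The TMR scheme evaluated in the thesis can be illustrated in a few lines: three replicas compute the same function and a bitwise majority voter masks a single event upset. This is a software analogy of the hardware technique; the counter function and the fault-injection interface are illustrative assumptions.

```python
# Sketch of the Triple Modular Redundancy idea discussed above: three replicas
# compute the same function and a majority voter masks a single upset. This is
# a software illustration of the concept, not the thesis's FPGA implementation.

def voter(a: int, b: int, c: int) -> int:
    """Bitwise majority of three redundant outputs."""
    return (a & b) | (a & c) | (b & c)

def tmr(f, x, fault_in=None, flipped_bit=0):
    """Run three copies of f; optionally inject a single event upset (SEU)."""
    outs = [f(x), f(x), f(x)]
    if fault_in is not None:
        outs[fault_in] ^= (1 << flipped_bit)   # simulate an SEU in one replica
    return voter(*outs)

counter_step = lambda v: (v + 1) & 0xFF        # a small counter-based design
print(tmr(counter_step, 41))                   # 42, no fault
print(tmr(counter_step, 41, fault_in=1, flipped_bit=3))  # still 42: fault masked
```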

Relevance:

30.00%

Publisher:

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems, considering specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymous technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, follows the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches:

- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.

- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.

- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.

- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his or her design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other views, according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics in case a network connection between them is unavailable.

- The use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. Connecting to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime.

- The implemented CAD Framework is entirely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.

All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experience among designers, improving the collaboration potential significantly when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions (design representation primitives and tool blocks) are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model, a possibility explored in the frame of an online educational and training platform.
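The event-based consistency mechanism and the event pools can be sketched compactly. The following Python fragment (class and method names are our own illustrative choices, not Cave2's API, which is Java-based) shows the inversion of control between view and semantics and the buffering of interactions for late synchronization.

```python
# Sketch of the event-based view/semantics consistency mechanism described
# above: a view turns each user interaction into an event object; the semantic
# model validates it and, if the state change is legal, notifies every view.
# An event pool buffers interactions while the connection is unavailable.
class SemanticModel:
    def __init__(self):
        self.state, self.views = {}, []

    def submit(self, event):
        """Inversion of control: views delegate state changes to the model."""
        key, value = event
        if value is not None:                 # stand-in for a real legality check
            self.state[key] = value
            for v in self.views:
                v.refresh(key, value)         # propagate to all registered views

class View:
    def __init__(self, name, model):
        self.name, self.pool = name, []
        self.model, self.online = model, True
        model.views.append(self)

    def interact(self, event):
        self.pool.append(event)               # every interaction enters the pool
        if self.online:
            while self.pool:                  # late synchronization when online
                self.model.submit(self.pool.pop(0))

    def refresh(self, key, value):
        print(f"{self.name} shows {key}={value}")

m = SemanticModel()
a, b = View("designer-A", m), View("designer-B", m)
a.interact(("gate_count", 128))               # both views stay consistent
```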

Relevance:

30.00%

Publisher:

Abstract:

Background: Acute respiratory distress syndrome (ARDS) is associated with high in-hospital mortality. Alveolar recruitment followed by ventilation at optimal titrated PEEP may reduce ventilator-induced lung injury and improve oxygenation in patients with ARDS, but the effects on mortality and other clinical outcomes remain unknown. This article reports the rationale, study design, and analysis plan of the Alveolar Recruitment for ARDS Trial (ART). Methods/Design: ART is a pragmatic, multicenter, randomized (concealed), controlled trial, which aims to determine whether maximum stepwise alveolar recruitment associated with PEEP titration is able to increase 28-day survival in patients with ARDS compared to conventional treatment (the ARDSNet strategy). We will enroll adult patients with ARDS of less than 72 h duration. The intervention group will receive an alveolar recruitment maneuver, with stepwise increases of PEEP up to 45 cmH2O and a peak pressure of 60 cmH2O, followed by ventilation with optimal PEEP titrated according to the static compliance of the respiratory system. In the control group, mechanical ventilation will follow a conventional protocol (ARDSNet). In both groups, we will use volume-controlled mode with low tidal volumes (4 to 6 mL/kg of predicted body weight) and a target plateau pressure <= 30 cmH2O. The primary outcome is 28-day survival; the secondary outcomes are: length of ICU stay; length of hospital stay; pneumothorax requiring chest tube during the first 7 days; barotrauma during the first 7 days; mechanical ventilation-free days from days 1 to 28; and ICU, in-hospital, and 6-month survival. ART is an event-guided trial planned to last until 520 events (deaths within 28 days) are observed; this number of events allows detection of a hazard ratio of 0.75, with 90% power and a two-tailed type I error of 5%. All analyses will follow the intention-to-treat principle. Discussion: If the ART strategy with maximum recruitment and PEEP titration improves 28-day survival, this will represent a notable advance in the care of ARDS patients. Conversely, if the ART strategy is similar or inferior to the current evidence-based strategy (ARDSNet), this should also change current practice, as many institutions routinely employ recruitment maneuvers and set PEEP levels according to some titration method.
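The event target can be checked with the standard Schoenfeld approximation for survival trials, which for 1:1 allocation requires roughly 4(z_{1-α/2} + z_{1-β})² / (ln HR)² events. The quick computation sketched below gives about 508 deaths for HR = 0.75, 90% power and two-sided α = 0.05, consistent with the trial's target of 520 once design adjustments are allowed for.

```python
# Worked check of the trial's event target using the Schoenfeld approximation
# for the number of deaths needed to detect a hazard ratio with given power.
from math import log
from scipy.stats import norm

def required_events(hr: float, alpha: float = 0.05, power: float = 0.90) -> float:
    z_a = norm.ppf(1 - alpha / 2)     # 1.96 for two-sided alpha = 0.05
    z_b = norm.ppf(power)             # 1.28 for 90% power
    return 4 * (z_a + z_b) ** 2 / log(hr) ** 2   # assumes 1:1 allocation

print(round(required_events(0.75)))   # 508, close to the trial's 520 target
```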

Relevance:

30.00%

Publisher:

Abstract:

The continuous increase in genome sequencing projects has produced a huge amount of data in the last 10 years: currently more than 600 prokaryotic and 80 eukaryotic genomes are fully sequenced and publicly available. However, the sequencing process alone determines only raw nucleotide sequences. This is just the first step of the genome annotation process, which deals with assigning biological information to each sequence. Annotation is performed at each level of the biological information processing mechanism, from DNA to protein, and cannot be accomplished solely by in vitro analysis procedures, which would be extremely expensive and time-consuming at such a large scale. Thus, in silico methods are needed to accomplish the task. The aim of this work was the implementation of predictive computational methods to allow fast, reliable, and automated annotation of genomes and proteins starting from amino acid sequences. The first part of the work focused on the implementation of a new machine-learning-based method for the prediction of the subcellular localization of soluble eukaryotic proteins. The method, called BaCelLo, was developed in 2006. Its main peculiarity is independence from the biases present in the training dataset, which cause over-prediction of the most represented examples in all the other available predictors developed so far. This important result was achieved through a modification I made to the standard Support Vector Machine (SVM) algorithm, creating the so-called Balanced SVM. BaCelLo is able to predict the most important subcellular localizations in eukaryotic cells, and three kingdom-specific predictors were implemented. In two extensive comparisons, carried out in 2006 and 2008, BaCelLo was shown to outperform all the available state-of-the-art methods for this prediction task. BaCelLo was subsequently used to completely annotate 5 eukaryotic genomes, by integrating it into a pipeline of predictors developed at the Bologna Biocomputing group by Dr. Pier Luigi Martelli and Dr. Piero Fariselli. An online database, called eSLDB, was developed by integrating, for each amino acid sequence extracted from the genomes, the predicted subcellular localization merged with experimental and similarity-based annotations. In the second part of the work, a new machine-learning-based method was implemented for the prediction of GPI-anchored proteins. The method efficiently predicts from the raw amino acid sequence both the presence of the GPI anchor (by means of an SVM) and the position in the sequence of the post-translational modification event, the so-called ω-site (by means of a Hidden Markov Model, HMM). The method, called GPIPE, was shown to greatly improve prediction performance for GPI-anchored proteins over all previously developed methods. GPIPE was able to predict up to 88% of the experimentally annotated GPI-anchored proteins while maintaining a false positive rate as low as 0.1%. GPIPE was used to completely annotate 81 eukaryotic genomes, and more than 15000 putative GPI-anchored proteins were predicted, 561 of which are found in H. sapiens. On average, 1% of a proteome is predicted to be GPI-anchored. A statistical analysis of the composition of the regions surrounding the ω-site allowed the definition of specific amino acid abundances in the different regions considered. Furthermore, the hypothesis, proposed in the literature, that compositional biases exist among the four major eukaryotic kingdoms was tested and rejected. All the developed predictors and databases are freely available at: BaCelLo http://gpcr.biocomp.unibo.it/bacello eSLDB http://gpcr.biocomp.unibo.it/esldb GPIPE http://gpcr.biocomp.unibo.it/gpipe
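The class-balancing idea behind the Balanced SVM can be approximated with scikit-learn's per-class error penalties; the sketch below uses a toy imbalanced dataset and is an analogue of the concept, not the authors' actual algorithm.

```python
# Sketch of the class-balancing idea behind BaCelLo's "Balanced SVM": penalize
# errors on each class in inverse proportion to its frequency so abundant
# localizations do not dominate. This uses scikit-learn's built-in reweighting
# as an analogue; it is not the authors' actual modification.
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Toy stand-in for a skewed localization dataset (9:1 class imbalance).
X, y = make_classification(n_samples=400, weights=[0.9, 0.1], random_state=0)

plain = SVC(kernel="rbf").fit(X, y)
balanced = SVC(kernel="rbf", class_weight="balanced").fit(X, y)  # reweight C per class

# The balanced model trades some majority-class accuracy for recall on the
# under-represented class, mirroring the bias-independence goal stated above.
print((plain.predict(X) == y).mean(), (balanced.predict(X) == y).mean())
```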

Relevance:

30.00%

Publisher:

Abstract:

In recent years an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with greater performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottled water or soda and buy boxed products such as food or cigarettes. Another indication of this complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems organized in a modular and distributed manner. Even if the success of a modern AMS, from a functional and behavioural point of view, is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activities inherent in the automation of the machine cycles, the supervisory system is called to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies according to the different production needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; directing the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing real-time diagnostic information in support of machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.

What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured", way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), again leading to deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years there has been considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also handle other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.

This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, complex systems such as AMS contain, together with reliable mechanical elements, an increasing number of electronic devices, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to the formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of formal software verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
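To hint at the Discrete Event Systems machinery used in Chapter 5, here is a minimal finite-automaton sketch with an unobservable fault event; the plant, states and events are illustrative inventions, intended only to show why diagnosability is non-trivial when faulty and nominal traces share the same observable projection.

```python
# A minimal discrete-event-system (DES) model of the kind used in Chapter 5
# for formal verification and diagnosis: a finite automaton whose transitions
# are labelled by events, including an unobservable fault event. The plant and
# event names are illustrative, not the thesis's actual models.

AUTOMATON = {
    # state: {event: next_state}
    "idle":     {"start": "working"},
    "working":  {"done": "idle", "fault": "degraded"},  # 'fault' is unobservable
    "degraded": {"done": "idle"},                       # same observable behaviour
}

def run(state: str, events: list) -> str:
    """Execute a sequence of events; raise KeyError if an event is not enabled."""
    for e in events:
        state = AUTOMATON[state][e]
    return state

print(run("idle", ["start", "done"]))            # 'idle' (nominal cycle)
print(run("idle", ["start", "fault", "done"]))   # 'idle' (both traces look
# identical to an observer that cannot see 'fault' -- exactly the situation a
# diagnoser must resolve using additional sensors or timing information)
```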