Abstract:
Background: Exercise training is a non-pharmacological strategy for the treatment of heart failure. Exercise training improves functional capacity and quality of life in these patients. Moreover, exercise training reduces muscle sympathetic nerve activity (MSNA) and peripheral vasoconstriction. However, most of these studies have been conducted in middle-aged patients; thus, the effects of exercise training in older patients are much less understood. The present study was undertaken to investigate whether exercise training improves functional capacity, muscle sympathetic activation and muscle blood flow in older heart failure patients, as it does in middle-aged heart failure patients. Design: Fifty-two consecutive outpatients with heart failure from the database of the Unit of Cardiovascular Rehabilitation and Exercise Physiology were divided by age (middle-aged, defined as 45-59 years, and older, defined as 60-75 years) and exercise status (trained and untrained). Methods: MSNA was recorded directly from the peroneal nerve using the microneurography technique. Forearm blood flow (FBF) was measured by venous occlusion plethysmography. Functional capacity was evaluated by cardiopulmonary exercise testing. Results: Exercise training significantly and similarly increased FBF and peak VO2 in middle-aged and older heart failure patients. In addition, exercise training significantly and similarly reduced MSNA and forearm vascular resistance in these patients. No significant changes were found in untrained patients. Conclusion: Exercise training improves neurovascular control and functional capacity in heart failure patients regardless of age.
Abstract:
The objective of this study is to retrospectively report the results of interventions for controlling a vancomycin-resistant enterococcus (VRE) outbreak in a tertiary-care pediatric intensive care unit (PICU) of a university hospital. After identification of the outbreak, interventions were made at the following levels: patient care, microbiological surveillance, and medical and nursing staff training. Data were collected from computer-based databases and from the electronic prescription system. Vancomycin use progressively increased after March 2008, peaking in August 2009. Five cases of VRE infection were identified, with 3 deaths. After the interventions, we noted a significant reduction in vancomycin prescription and use (a 75% reduction), and the last case of VRE infection was identified 4 months later. The survivors remained colonized until hospital discharge. After the interventions there was a transient increase in PICU length of stay and mortality. Since then, the use of vancomycin has remained strictly controlled and relatively constant, no other cases of VRE infection or colonization have been identified, and length of stay and mortality have returned to baseline. In conclusion, we showed that a bundled intervention aiming at strict control of vancomycin use and full compliance with the Hospital Infection Control Practices Advisory Committee guidelines, along with contact precautions and hand-hygiene promotion, can be effective in reducing vancomycin use and the emergence and spread of vancomycin-resistant bacteria in a tertiary-care PICU.
Abstract:
This paper studies the asymptotic optimality of discrete-time Markov decision processes (MDPs) with general state and action spaces and having weak and strong interactions. Using an approach similar to that developed by Liu, Zhang, and Yin [Appl. Math. Optim., 44 (2001), pp. 105-129], the idea in this paper is to consider an MDP with general state and action spaces and to reduce the dimension of the state space by considering an averaged model. This formulation is often described by introducing a small parameter epsilon > 0 in the definition of the transition kernel, leading to a singularly perturbed Markov model with two time scales. Our objective is twofold. First, it is shown that the value function of the control problem for the perturbed system converges to the value function of a limit averaged control problem as epsilon goes to zero. In the second part of the paper, it is proved that a feedback control policy for the original control problem, defined by using an optimal feedback policy for the limit problem, is asymptotically optimal. Our work extends existing results in the literature in two directions: the underlying MDP is defined on general state and action spaces, and we do not impose strong conditions on the recurrence structure of the MDP such as Doeblin's condition.
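For illustration only, the two-time-scale structure described above can be sketched as follows; the specific kernel decomposition and the discounted-cost form of the value function are assumptions made here for exposition, not taken from the paper:

```latex
% Sketch of a singularly perturbed transition kernel with a small parameter epsilon:
% the dominant part P governs the fast dynamics, the epsilon-scaled part Q the slow
% (weak) interactions.
\[
  P_\varepsilon(\mathrm{d}y \mid x, a) \;=\; P(\mathrm{d}y \mid x, a) \;+\; \varepsilon\, Q(\mathrm{d}y \mid x, a),
  \qquad \varepsilon > 0 \text{ small}.
\]
% Value function of the perturbed problem (written here for an assumed discounted cost c)
% and the convergence stated in the first part of the paper, where \bar{V} is the value
% function of the averaged (limit) control problem:
\[
  V_\varepsilon(x) \;=\; \inf_{\pi}\; \mathbb{E}_x^{\pi}\!\left[\sum_{t \ge 0} \beta^{t}\, c(x_t, a_t)\right],
  \qquad
  \lim_{\varepsilon \to 0} V_\varepsilon(x) \;=\; \bar{V}(x).
\]
```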
Abstract:
The influence of glycerol concentration (Cg), process temperature (Tp), drying temperature (Ts), and relative humidity (RH) on the properties of achira flour films was initially assessed. The optimized process conditions were a Cg of 17 g glycerol/100 g flour, Tp of 90 °C, Ts of 44.8 °C, and RH of 36.4%. The films produced under these conditions displayed high mechanical strength (7.0 MPa), low solubility (38.3%), and satisfactory elongation values (14.6%). This study showed that achira flour is a promising source for the development of biodegradable films with good mechanical properties, low water vapor permeability, and low solubility compared to films based on other tubers. (c) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Current scientific applications produce large amounts of data. The processing, handling and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. In order to achieve this goal, distributed storage systems have been considering techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Based on these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that this new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
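The abstract does not detail the classification and modeling techniques; the following minimal Python sketch (the function names and the trend-strength heuristic are assumptions, not the paper's method) only illustrates the general idea of classifying an access-pattern time series and selecting a predictor accordingly:

```python
import numpy as np

def classify_series(x):
    """Roughly classify an access-pattern series as 'trending' or 'stationary'
    using the strength of a fitted linear trend (illustrative threshold)."""
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    residual = x - (slope * t + intercept)
    trend_strength = np.var(slope * t) / (np.var(residual) + 1e-12)
    return "trending" if trend_strength > 1.0 else "stationary"

def predict_next(x):
    """Select a predictor according to the series class and forecast the next value."""
    if classify_series(x) == "trending":
        t = np.arange(len(x))
        slope, intercept = np.polyfit(t, x, 1)
        return slope * len(x) + intercept          # linear extrapolation
    return float(np.mean(x[-5:]))                  # short moving average

# Example: sizes (MB) requested by an application in its recent accesses.
history = np.array([10, 12, 15, 18, 21, 24, 28, 31], dtype=float)
print(classify_series(history), predict_next(history))
```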
Abstract:
Background: ArtinM is a D-mannose-specific lectin from Artocarpus integrifolia seeds that induces neutrophil migration and activation, degranulation of mast cells, acceleration of wound healing, induction of interleukin-12 production by macrophages and dendritic cells, and a protective T helper 1 immune response against Leishmania major, Leishmania amazonensis and Paracoccidioides brasiliensis infections. Considering the important biological properties of ArtinM and its therapeutic applicability, this study was designed to achieve high-level expression of active recombinant ArtinM (rArtinM) in an Escherichia coli system. Results: The ArtinM coding region was inserted into the pET29a(+) vector and expressed in E. coli BL21(DE3)-CodonPlus-RP. The conditions for overexpression of soluble ArtinM were optimized by testing different parameters: temperature (20, 25, 30 or 37°C) and shaking speed (130, 200 or 220 rpm) during induction, concentration of the induction agent IPTG (0.01-4 mM), and induction period (1-19 h). BL21-CodonPlus(DE3)-RP cells induced under the optimized conditions (incubation at 20°C, at a shaking speed of 130 rpm, induction with 0.4 mM IPTG for 19 h) accumulated large amounts of soluble rArtinM. The culture provided 22.4 mg/L of rArtinM, whose activity was confirmed by its one-step purification through affinity chromatography on immobilized D-mannose and by glycoarray analysis. Gel filtration showed that rArtinM is monomeric, contrasting with the tetrameric form of the native plant protein (jArtinM). The analysis of intact rArtinM by mass spectrometry revealed a molecular mass of 16,099.5 Da, and the peptide mass fingerprint and ESI-CID-MS/MS amino acid sequences of peptides from a tryptic digest covered 41% of the total ArtinM amino acid sequence. In addition, circular dichroism and fluorescence spectroscopy of rArtinM indicated that its global fold comprises β-sheet structure. Conclusions: Overall, the optimized process to express rArtinM in E. coli provided high amounts of soluble, correctly folded and active recombinant protein, compatible with large-scale production of the lectin.
Abstract:
This paper presents the new active absorption wave basin, named Hydrodynamic Calibrator (HC), constructed at the University of São Paulo (USP), in the laboratory facilities of the Numerical Offshore Tank (TPN). The square (14 m × 14 m) tank is able to generate and absorb waves from 0.5 Hz to 2.0 Hz by means of 148 active hinged-flap wave makers. An independent mechanical system drives each flap by means of a 1 HP servo-motor and a ball-screw based transmission system. A customized ultrasonic wave probe is installed on each flap and is responsible for measuring the wave elevation at the flap. A complex automation architecture was implemented, with three Programmable Logic Controllers (PLCs), and low-level software is responsible for all the interlocks and maintenance functions of the tank. Furthermore, all the control algorithms for generation and absorption are implemented using higher-level software (MATLAB/Simulink block diagrams). These algorithms calculate the motions of the wave makers both to generate and to absorb the required wave field, taking into account the layout of the flaps and the limits of wave generation. The experimental transfer function that relates the flap amplitude to the wave elevation amplitude is used for the calculation of the motion of each flap. This paper describes the main features of the tank, followed by a detailed presentation of the whole automation system. It includes the measuring devices, signal conditioning, PLC and network architecture, real-time and synchronizing software, and the motor control loop. Finally, a validation of the whole automation system is presented, by means of the experimental analysis of the transfer function of the waves generated and the calculation of all the delays introduced by the automation system.
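As a rough illustration of how an experimental transfer function can be used to compute flap motion, the Python sketch below inverts a tabulated gain curve; the table values, stroke limit and function names are assumptions for illustration, not the HC calibration data:

```python
import numpy as np

# Hypothetical experimental transfer function |H(f)| = wave elevation amplitude / flap amplitude,
# tabulated at a few frequencies (values are illustrative, not measured data).
freq_table = np.array([0.5, 1.0, 1.5, 2.0])        # Hz
gain_table = np.array([0.6, 1.1, 0.9, 0.5])        # elevation (m) per unit flap stroke (m)

def flap_amplitude(target_elevation, frequency, max_stroke=0.3):
    """Invert the tabulated transfer function to obtain the flap command amplitude,
    saturating at the mechanical stroke limit of the wave maker."""
    gain = np.interp(frequency, freq_table, gain_table)
    command = target_elevation / gain
    return min(command, max_stroke)

# Example: request a 0.10 m wave at 1.2 Hz.
print(f"Flap amplitude: {flap_amplitude(0.10, 1.2):.3f} m")
```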
Abstract:
Doctoral program: Aquaculture
Abstract:
Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control, the position, velocity, force, pressure, etc., profiles are designed in such a way that the different mechanical parts work as a harmonious whole, in which perfect synchronization must be achieved. The real-time exchange of information in the distributed system that an industrial plant is nowadays plays an important role in achieving ever better performance, effectiveness and safety. The network connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now a task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. In this thesis, the problem of trajectory reconstruction in the case of an event-triggered communication system is addressed. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency, and spatial consistency. Starting from the basic system composed of one master and one slave, and passing through systems made up of many slaves and one master or many masters and one slave, the problems in profile reconstruction and preservation of temporal properties, and subsequently in the synchronization of different profiles in networks adopting an event-triggered communication system, are shown. These networks are characterized by the fact that a common knowledge of the global time is not available; therefore they are non-deterministic networks. Each topology is analyzed, and the proposed solution based on phase-locked loops adopted for the basic master-slave case is extended to handle the other configurations.
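A minimal Python sketch of the idea of phase-locked-loop-based profile reconstruction is given below; the observer structure, gains and event timing are assumptions for illustration and are not the solution developed in the thesis:

```python
# A slave reconstructs the master position profile between event-triggered updates by
# integrating an estimated velocity and correcting the estimate whenever a sample arrives.
class ProfileTracker:
    def __init__(self, kp=4.0, ki=8.0):
        self.pos = 0.0      # reconstructed position
        self.vel = 0.0      # estimated velocity (the PLL "frequency")
        self.kp, self.ki = kp, ki

    def advance(self, dt):
        """Free-run between network events: extrapolate with the current velocity."""
        self.pos += self.vel * dt
        return self.pos

    def update(self, measured_pos, dt):
        """Correct the estimate when an event-triggered sample is received."""
        error = measured_pos - self.pos          # phase detector
        self.vel += self.ki * error * dt         # loop filter (integral action)
        self.pos += self.kp * error * dt         # proportional correction
        return self.pos

# Example: master moves at 1.0 unit/s, samples arrive irregularly (~0.1 s apart).
tracker = ProfileTracker()
t = 0.0
for k in range(1, 11):
    dt = 0.08 if k % 2 else 0.12
    t += dt
    tracker.advance(dt)
    tracker.update(1.0 * t, dt)
print(f"reconstructed={tracker.pos:.3f}, true={t:.3f}")
```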
Abstract:
Sixty strains (belonging to the genera Lactobacillus, Bifidobacterium, Leuconostoc and Enterococcus) were tested for their capacity to inhibit the growth of 3 strains of Campylobacter jejuni: lactobacilli and bifidobacteria were grown in MRS or TPY broth at 37°C overnight in anaerobic conditions; Campylobacter jejuni was inoculated on blood agar plates at 37°C for 24-48 hours in microaerophilic conditions. The inhibition experiments were carried out in vitro using the "spot agar test" and "well diffusion assay" techniques, testing both cellular activity and that of the supernatant. Eleven strains proved to inhibit the growth of Campylobacter jejuni. These strains were subsequently analysed in order to evaluate their resistance to particular stress conditions found in the gastrointestinal tract and during industrial transformation processes (starvation stress, osmotic stress, heat stress, resistance to pH and to bile salts). Resistance to starvation stress: all strains seemed to resist the stress (except one strain). Resistance to osmotic stress: all strains were relatively resistant to a concentration of 6% w/v NaCl (except one strain). Resistance to heat stress: only one strain showed little resistance to the 55°C temperature. Resistance to pH: in the presence of a low pH (2.5), many strains rapidly lost their viability after approximately 1 hour. Resistance to bile salts: except for one strain, all strains seemed to be relatively resistant to a 2% w/v concentration of bile salts. Afterwards, the strains were identified using phenotypic and molecular techniques. Phenotypic identification was carried out using the API 50 CHL (bioMérieux) and API 20 STREP (bioMérieux) identification systems; molecular identification was performed with species-specific PCR: the molecular techniques confirmed the results of the phenotypic identification. For testing the antibiotic resistance profile, bacterial strains were subcultured in MRS or TPY broth and incubated for 18 h at 37°C under anaerobic conditions. The antibiotics tested (tetracycline, trimethoprim, cefuroxime, kanamycin, chloramphenicol, vancomycin, ampicillin, streptomycin, erythromycin) were diluted to final concentrations of 2, 4, 8, 16, 32, 64, 128 and 256 mg/ml. Then, 20 μl of fresh bacterial culture (final concentration in the plates approximately 10^6 cfu/ml) were added to 160 μl of MRS or TPY broth and 20 μl of antibiotic solution. As a positive control, the bacterial culture (20 μl) was added to broth (160 μl) and water (20 μl). The test was performed on 96-well plates which, after inoculation, were incubated for 24 h at 37°C; the antibiotic resistance was then determined by measuring the optical density (OD) at 620 nm with a Multiscan EX reader. All strains showed a similar behaviour: resistance to all antibiotics tested. Further studies are needed.
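As a purely illustrative aid (the cutoff value and OD readings below are assumptions, not data from the study), the resistance reading can be thought of as finding the lowest antibiotic concentration whose OD620 falls below a growth threshold:

```python
def find_mic(concentrations, od_values, od_cutoff=0.1):
    """Return the lowest concentration whose OD620 falls below the growth cutoff, or None."""
    for conc, od in sorted(zip(concentrations, od_values)):
        if od < od_cutoff:
            return conc
    return None  # growth at every tested concentration -> resistant over the whole range

concs = [2, 4, 8, 16, 32, 64, 128, 256]                    # mg/ml, as reported in the abstract
od620 = [0.85, 0.82, 0.80, 0.78, 0.75, 0.74, 0.73, 0.72]   # hypothetical resistant strain
print(find_mic(concs, od620))  # None: no tested concentration inhibited growth
```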
Abstract:
This PhD thesis was entirely developed at the Telescopio Nazionale Galileo (TNG, Roque de los Muchachos, La Palma, Canary Islands) with the aim of designing, developing and implementing a new Graphical User Interface (GUI) for the Near Infrared Camera Spectrometer (NICS) installed at the Nasmyth A focus of the telescope. The idea of a new GUI for NICS arose from the need to optimize the astronomers' work through a set of powerful tools not present in the existing GUI, such as the possibility to automatically move an object onto the slit or to perform a preliminary image analysis and spectrum extraction. The new GUI also provides a wide and versatile image display, an automatic procedure to identify the astronomical objects, and a facility for automatic image crosstalk correction. In order to test the overall correct functioning of the new GUI for NICS, and to provide some information on the atmospheric extinction at the TNG site, two telluric standard stars, namely Hip031303 and Hip031567, were observed spectroscopically during some engineering time. The NICS set-up used is as follows: Large Field (0.25''/pixel) mode, 0.5'' slit, and spectral dispersion through the AMICI prism (R~100) and the higher resolution (R~1000) JH and HK grisms.
Abstract:
Recently, an ever increasing degree of automation has been observed in most industrial processes. This increase is motivated by the higher requirement for systems with great performance in terms of quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, buy boxed products such as food or cigarettes, and so on. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact among themselves in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator in charge of the machine to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing in real time information on diagnostics, as a support for the maintenance operations of the machine. The kind of facilities that designers can directly find on the market, in terms of software component libraries, in fact provides adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focussing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, an important improvement to formal verification of logic control, fault diagnosis and fault-tolerant control results derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader in understanding some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
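To give a concrete, if simplified, flavour of the Discrete Event Systems formalism referred to above, the Python sketch below models a small actuator automaton with an unobservable fault and a naive diagnoser-style check; the states, events and models are illustrative assumptions, not the thesis' Generalized Actuator or Generalized Device models:

```python
# Finite automaton for an actuator with an unobservable 'fault' event, plus a naive
# diagnoser that flags observation sequences only producible by faulty behaviour.
TRANSITIONS = {
    ("idle",   "start"):   "moving",
    ("moving", "done"):    "idle",
    ("moving", "fault"):   "broken",   # 'fault' is unobservable
    ("broken", "timeout"): "alarm",    # observable symptom of the fault
}
OBSERVABLE = {"start", "done", "timeout"}

def run(events, state="idle"):
    """Execute an event sequence on the automaton; raise KeyError if an event is not enabled."""
    for e in events:
        state = TRANSITIONS[(state, e)]
    return state

def diagnose(observed):
    """Declare a fault if the observed sequence cannot be explained by fault-free behaviour."""
    fault_free = {("idle", "start"): "moving", ("moving", "done"): "idle"}
    state = "idle"
    for e in observed:
        if (state, e) not in fault_free:
            return True       # only the faulty model explains this observation
        state = fault_free[(state, e)]
    return False

print(run(["start", "fault", "timeout"]))               # -> alarm
print(diagnose(["start", "done", "start", "timeout"]))  # -> True
```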
Abstract:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Due to both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of SPSS imaging data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curve production and analysis.
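As a hedged illustration of the aperture photometry step of such a pipeline (not the actual SPSS pipeline; the aperture radii, sky annulus and synthetic frame below are assumptions), a minimal sketch in Python could look like this:

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
    """Sum the counts inside a circular aperture and subtract the median sky
    estimated in a surrounding annulus (illustrative, single-star case)."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(r >= r_in) & (r < r_out)])      # local background per pixel
    aperture = r < r_ap
    flux = image[aperture].sum() - sky * aperture.sum()     # background-subtracted counts
    return flux

# Synthetic frame: flat sky plus a Gaussian star at (x, y) = (50, 40).
yy, xx = np.indices((100, 100))
frame = 100.0 + 5000.0 * np.exp(-((xx - 50)**2 + (yy - 40)**2) / (2 * 2.0**2))
print(f"instrumental flux: {aperture_photometry(frame, 50, 40):.1f} counts")
```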
Abstract:
The research activities described in the present thesis have been oriented to the design and development of components and technological processes aimed at optimizing the performance of plasma sources in advanced material treatments. Consumable components for high-definition plasma arc cutting (PAC) torches were studied and developed. Experimental activities have in particular focussed on modifications of the emissive insert with respect to the standard electrode configuration, which comprises a press-fit hafnium insert in a copper body holder, to improve its durability. Based on a deep analysis of both the scientific and patent literature, different solutions were proposed and tested. First, the behaviour of Hf cathodes operating at high current levels (250 A) in an oxidizing atmosphere was experimentally investigated, optimizing, with respect to expected service life, the initial shape of the electrode emissive surface. Moreover, the microstructural modifications of the Hf insert in PAC electrodes were experimentally investigated during the first cycles, in order to understand the phenomena occurring on and under the Hf emissive surface that are involved in the electrode erosion process. Thereafter, the research activity focussed on producing, characterizing and testing prototypes of composite inserts, combining powders of high thermal conductivity (Cu, Ag) and high thermionic emissivity (Hf, Zr) materials. The complexity of the thermal plasma torch environment required an integrated approach also involving physical modelling. Accordingly, a detailed line-by-line method was developed to compute the net emission coefficient of Ar plasmas at temperatures ranging from 3000 K to 25000 K and pressures ranging from 50 kPa to 200 kPa, for optically thin and partially autoabsorbed plasmas. Finally, prototypal electrodes were studied and realized for a newly developed plasma source, based on the plasma needle concept and devoted to the generation of atmospheric-pressure non-thermal plasmas for biomedical applications.