978 results for backstepping control concept


Relevance: 30.00%

Publisher:

Abstract:

In recent years an ever-increasing degree of automation has been observed in industrial processes. This growth is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs in design, realization and maintenance. This trend toward complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy products packaged in boxes, such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated that around 350 types of manufacturing machine exist. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in automating the machine cycles, the supervisory system is called on to perform other main functions: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies to the different productive needs and operational scenarios; obtaining a high quality of the final product by verifying the correctness of the processing; directing the operator attending the machine to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, as a support for machine maintenance operations. The facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for implementing either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed empirically, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power-electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured" way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control: in logic control, discrete-event dynamics replace time-driven dynamics, hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly tied to the adopted implementation technology (relays in the past, software nowadays), again leading to deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
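The contrast drawn here between time-driven and discrete-event dynamics can be sketched with a minimal finite automaton whose state changes only when events occur, never with the mere passage of time. The clamp states and events below are a hypothetical illustration, not components taken from the thesis.

```python
# Minimal discrete-event system: a state/event transition map. The state
# evolves only on events (commands, sensor edges), not continuously in time.

class DiscreteEventSystem:
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions  # {(state, event): next_state}

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"event {event!r} not enabled in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# Hypothetical clamp actuator on a packaging line.
clamp = DiscreteEventSystem(
    initial="open",
    transitions={
        ("open", "close_cmd"): "closing",
        ("closing", "closed_sensor"): "closed",
        ("closed", "open_cmd"): "opening",
        ("opening", "open_sensor"): "open",
    },
)

clamp.fire("close_cmd")      # -> "closing"
clamp.fire("closed_sensor")  # -> "closed"
```

Because the transition map is explicit data, the same component can be analyzed formally (reachability, blocking) independently of the technology that eventually implements it.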
Industrial automation has lately been adopting this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years there has been considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As regards logic control design, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety. In other words, the control system should not only handle the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, along with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, complex systems such as AMS contain, together with reliable mechanical elements, an increasing number of electronic devices, which are by their own nature more vulnerable. The diagnosis and fault-isolation problem in a generic dynamical system consists in designing an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, an important contribution to formal verification of logic control, fault diagnosis and fault-tolerant control comes from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics of industrial automated systems in Chapter 1. Chapter 2 surveys the state of the art of software engineering paradigms applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator and shows its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.


The main concern of the A4 parity violation experiment at the Mainz Microtron accelerator facility is to study the electric and magnetic contributions of strange quarks to the charge and magnetism of the nucleons in the low-momentum-transfer region. More precisely, the A4 collaboration investigates the strange quarks' contribution to the electric and magnetic vector form factors of the nucleons. It is therefore important that the A4 experiment use an adequate and precise non-destructive online monitoring tool for the electron beam polarization when measuring single-spin asymmetries in elastic scattering of polarized electrons from unpolarized nucleons. As a consequence, the A4 Compton backscattering polarimeter was designed and installed so that an absolute measurement of the electron beam polarization can be taken without interrupting the parity violation experiment. The present study describes the development of an electron beam line, called the chicane, for the A4 Compton backscattering polarimeter. The chicane is an electron beam transport line that provides an interaction region where the electron beam and the laser beam overlap. After carefully studying the properties of the beam line components, we developed an electron beam control system that establishes the overlap between the electron beam and the laser beam. Using this system, the beam overlap can easily be achieved in a short time. The electron beam control system, whose performance is outstanding, is being used in production beam times. The study also presents the development of a scintillating fiber electron detector that reduces the statistical error in the electron polarization measurement. The scintillating fiber detector was completely redesigned. Data taken during a 2008 beam time show a huge background suppression, approximately 80 percent, while leaving the Compton spectra almost unchanged, when a coincidence between the fiber detector and the photon detector is used.
Thus, the statistical error of the polarization measurement is reduced by about 40 percent in the preliminary result. This represents significant progress in measuring the degree of polarization of the electron beam.
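As a rough illustration of why background suppression improves the polarization measurement, one can assume the statistical error on the measured asymmetry scales as sqrt(S + B)/S for signal S and background B. The signal-to-background ratio below (B = 4S) is a hypothetical choice, not a value reported by the A4 collaboration; it is picked only to show how an 80% background suppression can translate into a roughly 40% error reduction.

```python
from math import sqrt

# Illustrative scaling only: error on the asymmetry ~ sqrt(S + B) / S.
S = 1.0          # Compton signal counts (arbitrary units)
B = 4.0 * S      # assumed background before the coincidence cut (hypothetical)

err_before = sqrt(S + B) / S
err_after = sqrt(S + 0.2 * B) / S   # 80% of the background suppressed

reduction = 1 - err_after / err_before
print(f"error reduced by {reduction:.0%}")   # prints "error reduced by 40%"
```

The exact figure depends on the true signal-to-background ratio, which is why the 40% quoted in the abstract is a measured, preliminary result rather than a pure counting argument.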


The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that this correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools typical of proof systems; such analysis can also cover quantitative properties of programs, such as the number of steps they take to terminate. Another is the possibility of describing the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators; that is, Classical Logic adds the possibility of manipulating continuations. In this thesis we see how the ideas described above work in this larger context.
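The proofs-as-programs reading can be sketched with ordinary functions standing in for typed lambda terms (Python here is only a stand-in for a typed lambda calculus; the classical extension via control operators such as call/cc has no direct Python analogue and is omitted):

```python
# Sketch of Curry-Howard: a typed term is a proof of its type read as a
# logical formula. The types live in annotations only.

from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# Proof of A -> (B -> A): the K combinator.
def const(a: A) -> Callable[[B], A]:
    return lambda b: a

# Proof of (A -> B) -> ((B -> C) -> (A -> C)): composition, i.e.
# transitivity of implication.
def compose(f: Callable[[A], B]) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    return lambda g: lambda a: g(f(a))

# Normalizing the proof term corresponds to running the program:
inc = lambda n: n + 1
double = lambda n: 2 * n
print(compose(inc)(double)(5))  # (5 + 1) * 2 = 12
```

Evaluating `compose(inc)(double)(5)` is exactly cut elimination on the corresponding proof: the redexes that disappear during execution mirror the detours removed by normalization.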


Since non-synonymous tumor-specific point mutations occur only in malignant tissue and the altered protein product can be recognized as "foreign" by the immune system, they represent a so-far untapped pool of target structures for immunotherapy. Individual human tumors can carry up to thousands of non-synonymous point mutations in their genome, which are not subject to central immune tolerance. The aim of the present work was to test the hypothesis that the immune system should be able to recognize mutated epitopes on tumor cells, and to clarify whether an effective mRNA (RNA)-based anti-tumoral vaccination can be established on this basis. To this end, Ugur Sahin and colleagues sequenced the entire genome of the murine B16-F10 melanoma and analyzed it bioinformatically. More than 500 non-synonymous point mutations were identified by next-generation sequencing (NGS), of which 50 mutations were selected and validated by Sanger sequencing. After establishing the immunological test systems, a main question of this work was to systematically test the selected non-synonymous point mutations for antigenicity in an in vivo approach. For these studies, mutated sequences 27 amino acids in length were used, with the mutated amino acid positioned centrally. Because of the length of these peptides, in principle all possible MHC class I and class II epitopes containing the mutation can be covered. A basic idea of the project is to develop an oligotope vaccine based on in vitro transcribed RNA. Therefore, naive mice were vaccinated both with long peptides and, in an independent approach, with peptide-encoding RNA. Immunophenotyping of the vaccine-induced T cells showed that 16 of the 50 (32%) mutated sequences induced T-cell reactivity.
The use of the predicted epitopes in therapeutic vaccination studies confirmed the hypothesis that mutated neo-epitopes can represent potent target structures for anti-tumoral vaccination therapy. In therapeutic tumor studies it was shown that, on an RNA basis, 9 of 12 confirmed epitopes showed an anti-tumoral effect. Surprisingly, for one MHC class II-restricted mutated epitope (Mut-30), a strong anti-tumoral effect on B16-F10 was observed both in a subcutaneous model and in an independent therapeutic lung-metastasis model, establishing this epitope as a new immunodominant epitope for the B16-F10 melanoma. To investigate the immunological mechanism behind this effect in more detail, the roles of CD4+, CD8+ and NK cells at different time points of tumor development were examined in various experiments. Analysis of the tumor tissue revealed a significantly increased frequency of NK cells in the animals vaccinated with Mut-30 RNA. That NK cells play a decisive role in the early phase of therapy was confirmed by depletion studies. Subsequently it was shown that in the advanced tumor stage the NK cells make no further relevant contribution to the anti-tumoral effect of the RNA vaccination; instead, the vaccine-induced adaptive immune response does. By isolating lymphocytes from the tumor tissue and using them as effector cells in an IFN-γ ELISPOT assay, it was demonstrated that Mut-30-specific T cells infiltrate the tumor tissue and secrete IFN-γ there, among other cytokines. That this specific IFN-γ release plays a central role in the observed anti-tumoral effect was confirmed using IFN-γ knockout mice.
The concept of an individual RNA-based mutation-specific vaccine envisages vaccinating patients not with a single mutation-specific epitope but with several RNA-encoded mutations, in order to counteract the emergence of escape mutants. Since experience existed only with the production and administration of monotope RNA, i.e. RNA encoding a single epitope, an important question was to what extent oligotopes, which encode the mutated sequences sequentially joined by linkers as a fusion protein, can induce immune responses. For this purpose, pentatopes with varying positions of the individual epitope were characterized with respect to the T-cell reactivities they induce in vivo. The experiments showed that it is possible to induce an immune response against an epitope regardless of its position in the pentatope. Furthermore, it was observed that the induced T-cell frequencies after pentatope vaccination can be significantly increased compared with the use of monotopes. In summary, the present work provides the first preclinical proof that non-synonymous mutations represent a numerically relevant source of target structures for anti-tumoral immunotherapy. Surprisingly, a dominant induction of MHC class II-restricted immune responses was observed, which were partly able to induce massive tumor-rejection reactions. To translate these findings, an RNA-based oligotope format was established, which has entered clinical testing of the concept.


This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and data processing required for empirical transient emission and torque models. The current work focuses on modelling and optimization. The unexpected result of this investigation is that when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data have been explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution to prevent extrapolation during the optimization process has been proposed and demonstrated. Separate from the issue of extrapolation is preventing the search from being quasi-static. Second-order linear dynamic constraint models have been proposed to prevent the search from returning solutions that are feasible if each point were run at steady state, but which are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters to actually achieved parameters that then feed into the transient emission and torque models. Combined model inaccuracies have been used to adjust the optimized solutions. To frame the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The resulting strategy, different from the corresponding manual calibration strategy and resulting in lower emissions and improved efficiency, is intended to improve rather than replace the manual calibration process.
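The idea of a second-order linear dynamic constraint model can be sketched as a discrete-time filter that maps a commanded parameter trajectory onto the trajectory actually achieved, so that an optimizer cannot exploit instantaneous (quasi-static) changes. The pole locations and gain below are illustrative round numbers, not calibrated values from the study.

```python
# Minimal sketch of a second-order linear dynamic constraint model:
# y[k] = a1*y[k-1] + a2*y[k-2] + b*u[k], with b chosen for unit DC gain
# (a double real pole at z = 0.8 here; purely hypothetical dynamics).

def dynamic_constraint(commanded, a1=1.6, a2=-0.64):
    b = 1.0 - a1 - a2          # unit steady-state gain
    achieved = [0.0, 0.0]      # assume the system starts at rest
    for u in commanded:
        y = a1 * achieved[-1] + a2 * achieved[-2] + b * u
        achieved.append(y)
    return achieved[2:]

# A step in the commanded parameter is achieved only gradually:
step = [1.0] * 40
response = dynamic_constraint(step)
print(round(response[0], 3), round(response[-1], 3))  # starts at 0.04, approaches 1.0
```

The achieved trajectory, not the commanded one, is what would feed the transient emission and torque models, penalizing solutions that are only feasible at steady state.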


Aortic stenosis has become the most frequent type of valvular heart disease in Europe and North America and presents in the large majority of patients as calcified aortic stenosis in adults of advanced age. Surgical aortic valve replacement has been recognized for more than 40 years as the definitive therapy that considerably improves survival in severe aortic stenosis. In the most recent period, the operative mortality of isolated aortic valve replacement for aortic stenosis varies between 1 and 3% in low-risk patients younger than 70 years and between 4 and 8% in selected older adults. Long-term survival following aortic valve replacement is close to that observed in a control population of similar age. Numerous observational studies have consistently demonstrated that corrective surgery in symptomatic patients is invariably followed by a subjective improvement in quality of life and a substantial increase in survival rates. More recently, transcatheter aortic valve implantation (TAVI) has been demonstrated to be feasible in patients at high surgical risk using either a retrograde transfemoral or transsubclavian approach or an antegrade, transapical access. Reported 30-day mortality ranges between 5 and 15%, which is acceptable when compared to the risk predicted by the logistic EuroSCORE (varying between 20 and 35%) and the STS score, although the EuroSCORE has been shown to markedly overestimate the effective operative risk. One major concern remains the high rate of paravalvular regurgitation, which is observed in up to 85% of patients and requires further follow-up and critical evaluation. In addition, the long-term durability of these valves, with a focus on the effects of crimping, remains to be addressed, although 3-5 year results are promising. Sutureless biological valves were designed to simplify and significantly accelerate the surgical replacement of a diseased valve while allowing complete excision of the calcified native valve.
Until now, three different sutureless prostheses have been approved: the 3f Enable valve from ATS-Medtronic received CE mark approval in 2010, the Perceval S from Sorin in the first quarter of 2011, and the Intuity sutureless prosthesis from Edwards in 2012. All these devices aim to facilitate valve surgery and therefore have the potential to decrease invasiveness and shorten the conventional procedure without compromise in terms of excision of the diseased valve. This review summarizes the history and the current knowledge of sutureless valve technology.


Arterio-venous malformations (AVMs) are congenital vascular malformations (CVMs) that result from birth defects involving vessels of both arterial and venous origin, resulting in direct communications between vessels of different sizes or a meshwork of primitive reticular networks of dysplastic minute vessels which have failed to mature into 'capillary' vessels, termed the "nidus". These lesions are defined by shunting of high-velocity, low-resistance flow from the arterial vasculature into the venous system in a variety of fistulous conditions. Systematic classification systems developed by various groups of experts (the Hamburg classification, ISSVA classification, Schobinger classification, and angiographic classification of AVMs) have resulted in a better understanding of the biology and natural history of these lesions and improved management of CVMs and AVMs. The Hamburg classification, based on the embryological differentiation between extratruncular and truncular types of lesions, allows the determination of the potential for progression and recurrence of these lesions. The majority of all AVMs are extra-truncular lesions with persistent proliferative potential, whereas truncular AVM lesions are exceedingly rare. Regardless of the type, AV shunting may ultimately result in significant anatomical, pathophysiological and hemodynamic consequences. Therefore, despite their relative rarity (10-20% of all CVMs), AVMs remain the most challenging and potentially limb- or life-threatening form of vascular anomalies. The initial diagnosis and assessment may be facilitated by non- to minimally invasive investigations such as duplex ultrasound, magnetic resonance imaging (MRI), MR angiography (MRA), computerized tomography (CT) and CT angiography (CTA). Arteriography remains the diagnostic gold standard and is required for planning subsequent treatment.
A multidisciplinary team approach should be utilized to integrate surgical and non-surgical interventions for optimum care. Currently available treatments are associated with a significant risk of complications and morbidity. However, an early aggressive approach to eliminate the nidus (if present) may be undertaken if the benefits exceed the risks. Trans-arterial coil embolization or ligation of feeding arteries that leaves the nidus intact is an incorrect approach and may result in proliferation of the lesion. Furthermore, such procedures would prevent future endovascular access to the lesions via the arterial route. Surgically inaccessible, infiltrating, extra-truncular AVMs can be treated with endovascular therapy as an independent modality. Among the various embolo-sclerotherapy agents, ethanol sclerotherapy produces the best long-term outcomes with minimal recurrence. However, this procedure requires extensive training and sufficient experience to minimize complications and associated morbidity. For surgically accessible lesions, surgical resection may be the treatment of choice, with a chance of optimal control. Preoperative sclerotherapy or embolization may supplement the subsequent surgical excision by reducing morbidity (e.g. operative bleeding) and defining the lesion borders. Such a combined approach may provide excellent potential for a curative result. Conclusion: AVMs are high-flow congenital vascular malformations that may occur in any part of the body. The clinical presentation depends on the extent and size of the lesion and can range from an asymptomatic birthmark to congestive heart failure. Detailed investigations including duplex ultrasound, MRI/MRA and CT/CTA are required to develop an appropriate treatment plan. Appropriate management is best achieved via a multidisciplinary approach, and interventions should be undertaken by appropriately trained physicians.


PURPOSE OF REVIEW: Predicting asthma episodes is notoriously difficult but has potentially significant consequences for the individual, as well as for healthcare services. The purpose of this review is to describe recent insights into the prediction of acute asthma episodes in relation to classical clinical, functional or inflammatory variables, as well as present a new concept for evaluating asthma as a dynamically regulated homeokinetic system. RECENT FINDINGS: Risk prediction for asthma episodes or relapse has been attempted using clinical scoring systems, considerations of environmental factors and lung function, as well as inflammatory and immunological markers in induced sputum or exhaled air, and these are summarized here. We have recently proposed that newer mathematical methods derived from statistical physics may be used to understand the complexity of asthma as a homeokinetic, dynamic system consisting of a network comprising multiple components, and also to assess the risk for future asthma episodes based on fluctuation analysis of long time series of lung function. SUMMARY: Apart from the classical analysis of risk factor and functional parameters, this new approach may be used to assess asthma control and treatment effects in the individual as well as in future research trials.
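The "fluctuation analysis of long time series of lung function" mentioned above is commonly done with methods such as detrended fluctuation analysis (DFA). The sketch below is a generic, simplified DFA on synthetic data, not the specific method or data of the review; real analyses would use long series of daily peak-flow or FEV1 measurements and many window sizes.

```python
import math
import random

# Simplified detrended fluctuation analysis (first-order detrending,
# non-overlapping windows). Illustrative only.

def dfa_fluctuation(x, window):
    # Integrate the mean-centered series to obtain the "profile".
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:
        s += v - mean
        profile.append(s)
    # In each window, remove a least-squares linear trend; collect residuals.
    sq_sum, count = 0.0, 0
    for start in range(0, len(profile) - window + 1, window):
        seg = profile[start:start + window]
        n = len(seg)
        t = list(range(n))
        tm, ym = sum(t) / n, sum(seg) / n
        slope = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, seg)) \
            / sum((ti - tm) ** 2 for ti in t)
        for ti, yi in zip(t, seg):
            trend = ym + slope * (ti - tm)
            sq_sum += (yi - trend) ** 2
            count += 1
    return math.sqrt(sq_sum / count)

random.seed(0)
series = [random.gauss(0, 1) for _ in range(1024)]  # synthetic stand-in data
# For uncorrelated noise, F(n) grows roughly like n**0.5 (exponent ~0.5);
# long-range correlated physiological series would give a larger exponent.
f4, f64 = dfa_fluctuation(series, 4), dfa_fluctuation(series, 64)
alpha = math.log(f64 / f4) / math.log(64 / 4)
print(f"estimated scaling exponent ~ {alpha:.2f}")
```

In the homeokinetic view sketched in the review, a shift of the scaling exponent over time could then be read as a change in how tightly lung function is being regulated.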


The objective of this research was to develop a high-fidelity dynamic model of a parafoil-payload system with respect to its application in the Ship Launched Aerial Delivery System (SLADS). SLADS is a concept in which cargo can be transferred from ship to shore using a parafoil-payload system. It is accomplished in two phases: an initial towing phase, in which the glider follows the towing vessel in a passive lift mode, and an autonomous gliding phase, in which the system is guided to the desired point. While many previous researchers have analyzed the parafoil-payload system when it is released from another airborne vehicle, limited work has been done on towing the system up from ground or sea. One of the main contributions of this research was the development of a nonlinear dynamic model of a towed parafoil-payload system. After an extensive literature review of existing methods of modeling a parafoil-payload system, a five-degree-of-freedom model was developed. The inertial and geometric properties of the system were investigated to predict accurate results in the simulation environment. Since extensive research has been done on determining the aerodynamic characteristics of a paraglider, an existing aerodynamic model was chosen to incorporate the effects of air flow around the flexible paraglider wing. During the towing phase, it is essential that the parafoil-payload system follow the line of the towing vessel's path to prevent an unstable flight condition called 'lockout'. A detailed study of the causes of lockout, its mathematical representation, and the flight conditions and parameters related to lockout constitutes another contribution of this work. A linearized model of the parafoil-payload system was developed and used to analyze the stability of the system about equilibrium conditions. The relationship between the control surface inputs and stability was investigated.
In addition to stability of flight, another important objective of SLADS is to tow up the parafoil-payload system as fast as possible. The tension in the tow cable is directly proportional to the rate of ascent of the parafoil-payload system, but lockout instability is more likely when tow tensions are large. There is thus a tradeoff between susceptibility to lockout and rapid deployment. Control strategies were also developed for optimal tow-up and to maintain stability in the event of disturbances.
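The stability analysis of a linearized model reduces to checking the eigenvalues of the state matrix: an equilibrium is (locally) stable when every eigenvalue has a negative real part. The 2x2 matrices below are hypothetical reduced examples, not the thesis's five-degree-of-freedom system.

```python
import cmath

# Stability check on a linearized model x' = A x: stable iff all
# eigenvalues of A have negative real part. Matrices here are illustrative.

def eigenvalues_2x2(a, b, c, d):
    # Roots of lambda^2 - (a + d)*lambda + (a*d - b*c).
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def is_stable(A):
    (a, b), (c, d) = A
    return all(lam.real < 0 for lam in eigenvalues_2x2(a, b, c, d))

A_damped = [[0.0, 1.0], [-4.0, -1.2]]   # damped oscillation: stable
A_diverge = [[0.0, 1.0], [3.0, 0.5]]    # one positive eigenvalue: divergent

print(is_stable(A_damped), is_stable(A_diverge))  # True False
```

A lockout study would examine how such eigenvalues migrate into the right half-plane as tow tension and tow-point geometry change.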


Wireless sensor networks are an emerging research topic due to their vast and ever-growing range of applications. Wireless sensor networks are made up of small nodes whose main goal is to monitor, compute and transmit data. The nodes are basically made up of low-powered microcontrollers, wireless transceiver chips, sensors to monitor their environment, and a power source. The applications of wireless sensor networks range from basic household applications, such as health monitoring, appliance control and security, to military applications, such as intruder detection. The widespread application of wireless sensor networks has brought to light many research issues, such as battery efficiency, unreliable routing protocols due to node failures, localization issues and security vulnerabilities. This report describes the hardware development of a fault-tolerant routing protocol for a railroad pedestrian warning system. The protocol implemented is a peer-to-peer multi-hop TDMA-based protocol for nodes arranged in a linear zigzag chain. The basic working of the protocol was derived from the Wireless Architecture for Hard Real-Time Embedded Networks (WAHREN).
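The core of a TDMA protocol for a linear chain can be sketched as a fixed frame in which each node owns one slot and readings are forwarded hop by hop toward a sink. The frame layout below (one slot per node, node 0 as sink, farthest node speaking first) is an assumption made for illustration; WAHREN's actual slot structure and fault-tolerance mechanisms are not reproduced here.

```python
# Sketch of a TDMA schedule for nodes 0..n-1 in a linear chain, node 0 = sink.

def tdma_frame(num_nodes):
    # Slot i belongs to one node; the farthest node transmits first so a
    # reading can be relayed all the way to the sink within a single frame.
    return [num_nodes - 1 - slot for slot in range(num_nodes - 1)]

def deliver(num_nodes, source):
    """Count the transmissions needed for `source`'s reading to reach node 0."""
    hops = 0
    for node in tdma_frame(num_nodes):
        if node <= source:   # this node now holds the reading and forwards it
            hops += 1
    return hops

schedule = tdma_frame(5)
print(schedule)              # [4, 3, 2, 1]: node 4 transmits first
print(deliver(5, source=4))  # 4 transmissions: 4 -> 3 -> 2 -> 1 -> 0
```

Because every node knows the whole frame, collisions are avoided by construction, and missing an expected slot is itself a detectable symptom of node failure.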


Free-radical retrograde-precipitation polymerization, FRRPP in short, is a novel polymerization process discovered by Dr. Gerard Caneba in the late 1980s. The current study is aimed at gaining a better understanding of the reaction mechanism of the FRRPP and its thermodynamically-driven features that are predominant in controlling the chain reaction. A previously developed mathematical model to represent free radical polymerization kinetics was used to simulate a classic bulk polymerization system from the literature. Unlike other existing models, such a sparse-matrix-based representation allows one to explicitly accommodate the chain length dependent kinetic parameters. Extrapolating from the past results, mixing was experimentally shown to be exerting a significant influence on reaction control in FRRPP systems. Mixing alone drives the otherwise severely diffusion-controlled reaction propagation in phase-separated polymer domains. Therefore, in a quiescent system, in the absence of mixing, it is possible to retard the growth of phase-separated domains, thus producing isolated polymer nanoparticles (globules). Such a diffusion-controlled, self-limiting phenomenon of chain growth was also observed using time-resolved small angle x-ray scattering studies of reaction kinetics in quiescent systems of FRRPP. Combining the concept of self-limiting chain growth in quiescent FRRPP systems with spatioselective reaction initiation of lithography, microgel structures were synthesized in a single step, without the use of molds or additives. Hard x-rays from the bending magnet radiation of a synchrotron were used as an initiation source, instead of the more statistally-oriented chemical initiators. Such a spatially-defined reaction was shown to be self-limiting to the irradiated regions following a polymerization-induced self-assembly phenomenon. 
The pattern-transfer aspects of this technique were therefore studied in the FRRP polymerization of N-isopropylacrylamide (NIPAm) and methacrylic acid (MAA), a thermoreversible and an ionic hydrogel, respectively. Raising the reaction temperature increases the contrast between the exposed and unexposed zones of the formed microgels, while the extent of phase separation is directly proportional to the irradiation dose. The response of poly(NIPAm) microgels prepared by this technique was also characterized by small-angle neutron scattering.
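The idea behind the sparse, chain-length-dependent kinetic model mentioned above can be sketched with a dictionary keyed by chain length, which is sparse by construction: only occupied lengths are stored. The rate law kp(n) = kp0 / (1 + alpha·n) below is a hypothetical diffusion-limited propagation coefficient chosen for illustration, not the rate expression from the study.

```python
def step(population, monomer, dt, kp0=1.0, alpha=0.05):
    """One explicit-Euler step of a toy chain-growth population balance.

    population: dict {chain length n: concentration of radicals P_n},
    a sparse representation storing only occupied chain lengths.
    kp(n) = kp0 / (1 + alpha * n) is an assumed chain-length-dependent
    propagation coefficient mimicking diffusion control of long chains.
    """
    new = {}
    consumed = 0.0
    for n, conc in population.items():
        kp = kp0 / (1.0 + alpha * n)
        grown = kp * monomer * conc * dt      # P_n + M -> P_{n+1}
        consumed += grown
        new[n] = new.get(n, 0.0) + conc - grown
        new[n + 1] = new.get(n + 1, 0.0) + grown
    return new, monomer - consumed
```

Because each step only touches occupied chain lengths, the cost scales with the width of the chain-length distribution rather than the maximum chain length, which is the practical advantage of the sparse representation.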

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The Chair of Transportation and Warehousing at the University of Dortmund, together with its industrial partner, has developed and implemented a decentralized control system based on embedded technology and Internet standards. This innovative, highly flexible system uses autonomous software modules to control the flow of unit loads in real time. The system is integrated into the Chair's test facility, which consists of a wide range of conveying and sorting equipment. It was built for proof-of-concept purposes and will be used for further research in the fields of decentralized automation and embedded controls. This presentation describes the implementation of this decentralized control system.
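The essence of such a decentralized architecture is that each conveyor segment decides the next hop for a unit load using only local knowledge, with no central routing table. A minimal sketch of this idea follows; the class and method names are illustrative assumptions, not taken from the Dortmund system.

```python
class ConveyorModule:
    """Autonomous control module for one conveyor segment (sketch).

    Each module knows only its own neighbours and decides the next hop
    for a unit load locally; there is no central controller.
    """
    def __init__(self, name):
        self.name = name
        self.next_hop = {}              # destination name -> next module

    def connect(self, destination, next_module):
        self.next_hop[destination] = next_module

    def route(self, destination):
        if destination == self.name:
            return None                 # load has arrived
        return self.next_hop[destination]

def transport(start, destination):
    """Follow the modules' local decisions hop by hop; return the path."""
    path, module = [start.name], start
    while True:
        nxt = module.route(destination)
        if nxt is None:
            return path
        path.append(nxt.name)
        module = nxt
```

Because each module holds only its own links, modules can be added or exchanged without reconfiguring a central system, which is the flexibility claim made for decentralized material-flow control.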

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Decentralised controls offer advantages for both the implementation and the operation of control systems for continuous conveyors. Such concepts are mainly based on RFID. Owing to the reduced expense for devices and software, however, the plant behaviour cannot be determined as accurately as in centrally controlled systems. This article describes a simulation-based method by which the performance of the two control concepts can easily be compared in order to determine the suitability of the decentralised concept.
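The kind of comparison the article describes can be illustrated with a deliberately simple toy simulation (not the authors' method): a merge point fed by two inflows, where the central controller sees both queue lengths while the decentralised controller follows a purely local round-robin rule. All rates and rules here are illustrative assumptions.

```python
import random

def simulate(steps, decentralized, arrival=(0.7, 0.1), seed=42):
    """Toy merge element: two inflows share one outflow lane.

    Central control sees both queues and serves a non-empty one each
    step; decentralised control alternates between the inflows using
    only local information, so a step can be wasted on an empty queue.
    Returns (loads moved, maximum queue length observed).
    """
    rng = random.Random(seed)
    queues, moved, max_q, turn = [0, 0], 0, 0, 0
    for _ in range(steps):
        for i in (0, 1):                           # stochastic arrivals
            if rng.random() < arrival[i]:
                queues[i] += 1
        if decentralized:
            pick = turn                            # local round-robin
            turn = 1 - turn
            if queues[pick] > 0:
                queues[pick] -= 1
                moved += 1                         # else: wasted cycle
        else:
            pick = 0 if queues[0] >= queues[1] else 1   # global view
            if queues[pick] > 0:
                queues[pick] -= 1
                moved += 1
        max_q = max(max_q, queues[0], queues[1])
    return moved, max_q
```

Running both variants under identical load already exposes the trade-off the article measures: under asymmetric arrivals the information-poor local rule wastes cycles and lets a queue grow, while throughput under central control tracks the total arrival rate.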

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Due to the widespread development of anthelmintic resistance in equine parasites, recommendations for their control are currently undergoing marked changes, with a shift of emphasis toward more coprological surveillance and reduced treatment intensity. Denmark was the first nation to introduce prescription-only restrictions on anthelmintic drugs, in 1999, but other European countries have implemented similar legislation over recent years. A questionnaire survey was performed in 2008 among Danish horse owners to provide a current status of practices and perceptions relating to parasite control. The questions aimed to describe the current use of coprological surveillance and the resulting anthelmintic treatment intensities, to evaluate knowledge and perceptions about the importance of various attributes of parasite control, and to assess respondents' willingness to pay for advice and parasite surveillance services from their veterinarians. A total of 1060 respondents completed the questionnaire. A large majority of respondents (71.9%) were familiar with the concept of selective therapy. Results illustrated that the respondents' self-evaluation of their knowledge about parasites and their control was significantly associated with their level of interest in the topic and their type of education (P<0.0001). The large majority of respondents dewormed their horses twice a year and/or performed two fecal egg counts per horse per year. This approach was almost equally pronounced in foals, horses aged 1-3 years, and adult horses. The respondents rated prevention of parasitic disease and prevention of drug resistance as the most important attributes, while cost and frequent fecal testing were rated least important. Respondents' actual spending on parasite control per horse in the previous year correlated significantly with the amount they declared themselves willing to spend (P<0.0001). However, 44.4% declared themselves willing to pay more than what they were spending.
Altogether, the results indicate that respondents were generally familiar with equine parasites and the concept of selective therapy, although there was some confusion over the terms small and large strongyles. They made extensive use of fecal surveillance in all age groups, with a majority of respondents sampling and/or treating around twice a year. Finally, respondents appeared willing to spend money on parasite control for their horses. It is of concern that the survey suggested that foals and young horses are treated in a manner very similar to adult horses, which is against current recommendations. Thus, the survey illustrates the importance of clear communication of guidelines for equine parasite control.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Distinguishing between physical and social aggression, this study examined whether the predictive effect of aggression on resource control a) is moderated by prosocial behavior and b) corresponds to a linear or a curvilinear trend. Moderating effects of children's social preference among peers and of child sex were also tested. Based on a sample of 682 kindergarten children (348 girls; average age 72.7 months, SD 3.6), multilevel regressions revealed additive linear effects of social preference and prosociality on resource control. Moderate (but not high) levels of social aggression also facilitated resource control for disliked children. There was no such threshold effect for well-liked children, who controlled the resource increasingly the more socially aggressive they were. In contrast, physical aggression hampered resource control unless used very sparingly. The present study has a number of positive features. First, the distinction between physical and social aggression improves our understanding of the relation between aggression and social competence and sketches a more differentiated picture of the role of different forms of aggression in resource control. Second, this study combines the concept of resource control with the concept of social preference and investigates curvilinear effects of aggression. Third, the direct observation of resource control in the Movie Viewer increases the internal validity of this study.