926 results for Optimal Control Problems
                                
Abstract:
The Canadian Wildlife Service has had twenty-five years of experience with the problems caused by bird contacts with aircraft. I experienced my first bird strike while flying as an observer on a waterfowl survey in August, 1940. Officers of the Service investigated bird problems at airports at Yarmouth, Nova Scotia, and Cartierville, Quebec, in the late 1940s. Those incidents, involving gulls and low-speed piston-engined aircraft, caused minor damage to the aircraft but considerable disturbance to the operators. As aircraft speeds increased and airports became more numerous and busier, the problem grew in extent and complexity. By 1960 it was apparent that the problem would worsen and that work should be directed toward reducing the number of incidents. In 1960 an Electra aircraft crashed at Boston, Massachusetts, killing 61 passengers; starlings were involved in the engine malfunction that preceded the crash. In November 1962 a Viscount aircraft was damaged by collision with two swans between Baltimore and Washington and crashed with a loss of 17 lives. Those incidents focused attention on the bird hazard problem in the United States.
                                
Abstract:
SUMMARY: Federal Urban Rat Control Program grants were awarded to cities in different areas of the United States. Severe rat infestations have been detected in many of the cities by the Environmental Health Service. Approximately 20% of the 3.8 million people in the project areas were occupying homes infested with rats. Control operations are now in effect in all cities, and the living conditions of the people have been substantially improved. Increased interest in rodent control is also evident in countries outside the United States. The Technical Development Laboratories of the National Communicable Disease Center are participating in the World Health Organization program of research on new rodenticides. The evaluation program involves five steps which carry a candidate toxicant from the laboratory phase through field testing. Acceptability and suitable concentrations of both acute and accumulative rodenticides are determined. Observations are made on the hazard of the compound to pets and to other nontarget vertebrates. Laboratory and field studies have been completed on a new, promising stabilized scilliroside glycoside, which has given excellent control of the Norway rat in 16 out of 19 premises. Another new coded compound has shown a unique specificity for roof rats as compared to Norway rats. Although anticoagulant-resistant rat populations have occurred in several countries in Europe, as yet no evidence of such resistance has been noted in rats in the United States.
Manipulation of follicle development to ensure optimal oocyte quality and conception rates in cattle
                                
Abstract:
Over the last several decades, a number of therapies have been developed that manipulate ovarian follicle growth to improve oocyte quality and conception rates in cattle. Various strategies have been proposed to improve the responses to reproductive biotechnologies following timed artificial insemination (TAI), superovulation (SOV) or ovum pickup (OPU) programmes. During TAI protocols, final follicular growth and the size of the ovulatory follicle are key factors that may significantly influence oocyte quality, ovulation, the uterine environment and, consequently, pregnancy outcomes. Progesterone concentrations during SOV protocols influence follicular growth, oocyte quality and embryo quality; therefore, several adjustments to SOV protocols have been proposed depending on the animal category and breed. In addition, the success of in vitro embryo production is directly related to the number and quality of cumulus-oocyte complexes harvested by OPU. Control of follicle development has a significant impact on the OPU outcome. This article discusses a number of key points related to the manipulation of ovarian follicular growth to maximize oocyte quality and improve conception rates following TAI and the transfer of in vivo- and in vitro-derived embryos in cattle.
                                
Abstract:
It is well known that control systems are the core of electronic differential systems (EDSs) in electric vehicles (EVs) and hybrid electric vehicles (HEVs). However, conventional closed-loop control architectures do not completely provide the needed ability to reject noise and disturbances, especially in the input acceleration signal coming from the driver's commands, which can render the EDS ineffective. For this reason, this paper proposes a novel EDS control architecture that offers a new approach for the traction system and can be used with a wide variety of controllers (e.g., classic, artificial intelligence (AI)-based, and modern/robust). In addition, a modified proportional-integral-derivative (PID) controller, an AI-based neuro-fuzzy controller, and a robust optimal H-infinity controller were designed and evaluated to demonstrate the versatility of the novel architecture. Kinematic and dynamic models of the vehicle are briefly introduced; then, simulated and experimental results are presented and discussed. A Hybrid Electric Vehicle in Low Scale (HELVIS)-Sim simulation environment was employed for the preliminary analysis of the proposed EDS architecture. Later, the EDS itself was embedded in a dSpace 1103 high-performance interface board, so that real-time control of the rear wheels of the HELVIS platform was successfully achieved.
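
To give a concrete, hedged illustration of the simplest controller family mentioned above, here is a minimal discrete-time PID loop in Python. The gains, sample time, and the wheel-speed error signal are invented for the example; the paper's modified PID and the EDS architecture itself are not reproduced here.

    # Minimal discrete-time PID sketch (illustrative; not the paper's
    # modified PID). Gains and sample time are hypothetical values.
    class PID:
        def __init__(self, kp, ki, kd, dt, out_min=-1.0, out_max=1.0):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.out_min, self.out_max = out_min, out_max
            self.integral = 0.0
            self.prev_error = 0.0

        def step(self, error):
            self.integral += error * self.dt                 # integral term
            derivative = (error - self.prev_error) / self.dt # derivative term
            self.prev_error = error
            u = self.kp * error + self.ki * self.integral + self.kd * derivative
            return max(self.out_min, min(self.out_max, u))   # saturate output

    # Example: one control step on a hypothetical wheel-speed error at 1 kHz.
    pid = PID(kp=0.8, ki=2.0, kd=0.01, dt=1e-3)
    command = pid.step(error=0.35)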
                                
Abstract:
Background: The optimal revascularization strategy for diabetic patients with multivessel coronary artery disease (MVD) remains uncertain for lack of an adequately powered randomized trial. The FREEDOM trial was designed to compare contemporary coronary artery bypass grafting (CABG) with percutaneous coronary intervention (PCI) with drug-eluting stents in diabetic patients with MVD, against a background of optimal medical therapy. Methods: A total of 1,900 diabetic participants with MVD were randomized to PCI or CABG worldwide from April 2005 to March 2010. FREEDOM is a superiority trial with a mean follow-up of 4.37 years (minimum 2 years) and 80% power to detect a 27.0% relative reduction. We present the baseline characteristics of patients screened and randomized, and provide a comparison with other MVD trials involving diabetic patients. Results: The randomized cohort was 63.1 ± 9.1 years old and 29% female, with a mean diabetes duration of 10.2 ± 8.9 years. Most (83%) had 3-vessel disease and on average took 5.5 ± 1.7 vascular medications, with 32% on insulin therapy. Nearly all had hypertension and/or dyslipidemia, and 26% had a prior myocardial infarction. Mean hemoglobin A1c was 7.8 ± 1.7%, 29% had low-density lipoprotein <70 mg/dL, and mean systolic blood pressure was 134 ± 20 mm Hg. The mean SYNTAX score was 26.2, with a symmetric distribution. FREEDOM trial participants have baseline characteristics similar to those of contemporary multivessel and diabetes trial cohorts. Conclusions: The FREEDOM trial has successfully recruited a high-risk diabetic MVD cohort. Follow-up efforts include aggressive monitoring to optimize background risk factor control. FREEDOM will contribute significantly to the PCI versus CABG debate in diabetic patients with MVD. (Am Heart J 2012;164:591-9.)
                                
Abstract:
This work studies the optimization and control of a styrene polymerization reactor. The proposed strategy addresses the case where, because of market conditions and equipment deterioration, the optimal operating point of the continuous reactor shifts significantly over the operating time, and the control system has to search for this optimum point while keeping the reactor system stable at any admissible point. The approach considered here consists of three layers: Real Time Optimization (RTO), Model Predictive Control (MPC), and a Target Calculation (TC) layer that coordinates the communication between the other two layers and guarantees the stability of the whole structure. The proposed algorithm is simulated with the phenomenological model of a styrene polymerization reactor, which has been widely used as a benchmark for process control. The complete optimization structure for the styrene process, including disturbance rejection, is developed. The simulation results show the robustness of the proposed strategy and its capability to handle disturbances while the economic objective is optimized.
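
As a hedged sketch of how the three layers might interact per control cycle, the following Python skeleton chains an RTO call, a target calculation, and an MPC move. The plant model, rates, and all function bodies are illustrative assumptions, not the paper's formulation.

    # Illustrative three-layer skeleton (RTO -> Target Calculation -> MPC).
    # All models, rates, and names are hypothetical placeholders.
    def rto_layer(market_params):
        """Slow layer: return an economically optimal operating point."""
        return market_params["ideal_setpoint"]

    def target_calculation(econ_setpoint, x_now, max_step=0.5):
        """Reconcile the RTO setpoint with a reachable steady-state target."""
        delta = max(-max_step, min(max_step, econ_setpoint - x_now))
        return x_now + delta

    def mpc_layer(target, x_now, gain=0.4):
        """Fast layer: one proportional move toward the target (a stand-in
        for solving a real receding-horizon optimization)."""
        return gain * (target - x_now)

    x = 1.0                      # current (scalar) reactor state
    for k in range(50):
        if k % 10 == 0:          # RTO runs ten times slower than MPC
            setpoint = rto_layer({"ideal_setpoint": 2.0})
            target = target_calculation(setpoint, x)
        u = mpc_layer(target, x)
        x = x + u                # trivial integrator plant, for illustration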
                                
Abstract:
This work proposes the development of an Adaptive Neuro-Fuzzy Inference System (ANFIS) estimator applied to speed control in a sensorless three-phase induction motor drive. Usually, ANFIS is used to replace the traditional PI controller in induction motor drives. The evaluation of the estimation capability of the ANFIS in a sensorless drive is one of the contributions of this work. The ANFIS speed estimator is validated in a magnetizing-flux-oriented control scheme, which constitutes a further contribution. As an open-loop estimator, it is intended for moderate-performance drives; solving the low- and zero-speed estimation problems is not the purpose of this work. Simulations to evaluate the performance of the estimator within the vector drive system were carried out in Matlab/Simulink®. To determine the benefits of the proposed model, a practical system was implemented using a voltage source inverter (VSI) to drive the motor, with the vector control, including the ANFIS estimator, executed by the Real Time Toolbox for Matlab/Simulink® and a data acquisition card from National Instruments.
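
As a rough illustration of the inference structure an ANFIS trains, the sketch below evaluates a two-rule, first-order Sugeno fuzzy system in Python. The inputs, membership parameters, and consequent coefficients are invented for the example and bear no relation to the estimator trained in this work.

    import math

    # Two-rule, first-order Sugeno fuzzy inference: the structure that
    # ANFIS trains. All parameters below are invented for the example.
    def gauss(x, c, sigma):
        return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

    def anfis_like_estimate(x1, x2):
        # Layers 1-2: rule firing strengths (product of memberships).
        w1 = gauss(x1, 0.2, 0.3) * gauss(x2, 0.5, 0.4)
        w2 = gauss(x1, 0.8, 0.3) * gauss(x2, 0.1, 0.4)
        # Layer 3: normalize the firing strengths.
        s = w1 + w2
        nw1, nw2 = w1 / s, w2 / s
        # Layer 4: linear (first-order) rule consequents.
        f1 = 1.5 * x1 + 0.7 * x2 + 0.1
        f2 = -0.4 * x1 + 2.1 * x2 + 0.3
        # Layer 5: weighted sum gives the estimated output (e.g., speed).
        return nw1 * f1 + nw2 * f2

    speed_hat = anfis_like_estimate(0.4, 0.3)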
                                
Abstract:
Background: In areas with limited infrastructure for microscopy diagnosis, rapid diagnostic tests (RDTs) have been demonstrated to be effective. Methods: The cost-effectiveness of the OptiMAL® RDT and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon ran from the onset of fever until the diagnostic result was provided to the patient, and the temporal reference was the year 2006. The results were expressed as cost per adequately diagnosed case, in 2006 U.S. dollars. Sensitivity analysis was performed on key model parameters. Results: In the base case scenario, considering 92% and 95% sensitivity of thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion: Microscopy is more cost-effective than OptiMAL® in these remote areas if its high accuracy is maintained in the field. The decision regarding the use of rapid tests for the diagnosis of malaria in these areas depends on the current accuracy of microscopy in the field.
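
The comparison above rests on the standard incremental cost-effectiveness ratio (ICER): the extra cost of one strategy divided by its extra effectiveness. The short Python example below computes it for hypothetical figures; the US$549.9 result in the abstract comes from the study's own data, which are not reproduced here.

    # Incremental cost-effectiveness ratio between two diagnostic
    # strategies. All figures are hypothetical, for illustration only.
    def icer(cost_new, eff_new, cost_old, eff_old):
        """Extra cost per extra adequately diagnosed case."""
        return (cost_new - cost_old) / (eff_new - eff_old)

    # Hypothetical: microscopy costs more but diagnoses more cases adequately.
    cost_microscopy, cases_microscopy = 120_000.0, 950   # US$, cases
    cost_rdt, cases_rdt = 95_000.0, 900                  # US$, cases

    print(icer(cost_microscopy, cases_microscopy, cost_rdt, cases_rdt))
    # -> 500.0 US$ per additional adequately diagnosed case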
                                
Abstract:
Many engineering sectors are challenged by multi-objective optimization problems. Even if the idea behind these problems is simple and well established, implementing any procedure to solve them is not a trivial task. The use of evolutionary algorithms to find candidate solutions is widespread; usually they supply a discrete picture of the non-dominated solutions, a Pareto set. Although it is very useful to know the non-dominated solutions, an additional criterion is needed to select the one solution to be deployed. To better support the design process, this paper presents a new method for solving non-linear multi-objective optimization problems by adding a control function that guides the optimization process over the Pareto set, which does not need to be found explicitly. The proposed methodology differs from the classical methods that combine the objective functions into a single scalar, and is based on a single run of non-linear single-objective optimizers.
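
For readers unfamiliar with the notion of non-dominated solutions, the small sketch below filters the Pareto-optimal points out of a finite candidate set (assuming all objectives are minimized). It illustrates the concept only, not the paper's control-function method.

    # Filter the non-dominated (Pareto) points from a finite candidate set,
    # assuming every objective is to be minimized.
    def dominates(a, b):
        """a dominates b: no worse in every objective, better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        return [p for p in points
                if not any(dominates(q, p) for q in points if q != p)]

    candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
    print(pareto_front(candidates))   # [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]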
                                
Abstract:
This paper proposes two new approaches for the sensitivity analysis of multiobjective design optimization problems whose performance functions are highly susceptible to small variations in the design variables and/or design environment parameters. In both methods, the less sensitive design alternatives are preferred over others during the multiobjective optimization process. In the first approach, the designer chooses the design variables and/or parameters that cause uncertainties, associates a robustness index with each design alternative, and adds each index as an objective function in the optimization problem. In the second approach, the designer must know, a priori, the interval of variation of the design variables or of the design environment parameters, because the resulting interval of variation of the objective functions must be accepted. The second method does not require any probability distribution for the uncontrollable variations. Finally, the authors give two illustrative examples to highlight the contributions of the paper.
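
One simple way to realize a robustness index of the kind described in the first approach is a finite-difference estimate of how much an objective shifts under small perturbations of the design variables. The sketch below is a generic illustration under that assumption, not the authors' exact index.

    # Toy robustness index: worst-case change of objective f under a +/- delta
    # perturbation of each design variable (smaller index = less sensitive).
    # A generic illustration, not the paper's exact formulation.
    def robustness_index(f, x, delta=0.01):
        nominal = f(x)
        worst = 0.0
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                xp = list(x)
                xp[i] += sign * delta
                worst = max(worst, abs(f(xp) - nominal))
        return worst

    f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * v[1] ** 2   # sample objective
    print(robustness_index(f, [1.2, 0.3]))
    # The index can then be added as an extra objective to be minimized.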
                                
Abstract:
The world of communication has changed quickly in the last decade, resulting in a rapid increase in the pace of people's lives. This is due to the explosion of mobile communication and the Internet, which have now reached all levels of society. With such pressure for access to communication, there is increased demand for bandwidth. Photonic technology is the right solution for high-speed networks that have to supply wide bandwidth to new communication service providers. In particular, this Ph.D. dissertation deals with DWDM optical packet-switched networks. The subject raises a huge number of problems, from the physical layer up to the transport layer; here it is tackled from the network-level perspective. The long-term solution represented by optical packet switching has been explored in depth over these years together with the Network Research Group at the Department of Electronics, Computer Science and Systems of the University of Bologna. Several national and international projects supported this research, such as the Network of Excellence (NoE) e-Photon/ONe, funded by the European Commission in the Sixth Framework Programme, and the INTREPIDO project (End-to-end Traffic Engineering and Protection for IP over DWDM Optical Networks), funded by the Italian Ministry of Education, University and Scientific Research. Optical packet switching for DWDM networks is studied at the single-node level as well as at the network level. In particular, the techniques discussed are intended to be implemented in a long-haul transport network that connects local and metropolitan networks around the world. The main issues faced are contention resolution in an asynchronous, variable-packet-length environment, adaptive routing, wavelength conversion and node architecture. Characteristics that a network must assure, such as quality of service and resilience, are also explored at both the node and the network level. Results are mainly evaluated via simulation and through analysis.
                                
Abstract:
Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control, the position, velocity, force, pressure, etc., profiles are designed in such a way that the different mechanical parts work as a harmonious whole in which perfect synchronization must be achieved. The real-time exchange of information in the distributed system that an industrial plant is nowadays plays an important role in achieving ever better performance, effectiveness and safety. The network connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now the task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. In this thesis, the problem of trajectory reconstruction in the case of an event-triggered communication system is addressed. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency, and spatial consistency. Starting from the basic system composed of one master and one slave, and passing through systems made up of many slaves and one master, or many masters and one slave, the problems in profile reconstruction and in the preservation of temporal properties, and subsequently in the synchronization of different profiles in networks adopting an event-triggered communication system, are shown. These networks are characterized by the fact that a common knowledge of the global time is not available; therefore they are non-deterministic networks. Each topology is analyzed, and the proposed solution based on phase-locked loops, adopted for the basic master-slave case, is extended to cope with the other configurations.
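
As a hedged illustration of the phase-locked-loop idea adopted for the master-slave case, the sketch below lets a slave reconstruct a master's ramp-like position profile from sporadic samples using a PI-controlled PLL. Gains, rates, and signals are invented for the example and are not the thesis's parameters.

    # Discrete PLL sketch: a slave reconstructs a master's ramp profile from
    # sporadic (event-triggered) samples. Gains, rates, and signals are
    # hypothetical, not the thesis's parameters.
    dt = 1e-3                  # slave update period [s]
    sample_period = 50         # a master sample arrives every 50 ticks
    kp, ki = 10.0, 20.0        # PI loop-filter gains (illustrative)

    est_pos = est_vel = integ = 0.0     # slave-side reconstruction state
    master_pos, master_vel = 0.0, 2.0   # true profile (unknown to the slave)

    for k in range(4000):
        master_pos += master_vel * dt
        if k % sample_period == 0:           # event: a new sample arrives
            err = master_pos - est_pos       # phase detector
            integ += err * sample_period * dt
            est_vel = kp * err + ki * integ  # PI filter sets the tracking rate
        est_pos += est_vel * dt              # integrate reconstructed velocity

    print(abs(master_pos - est_pos))         # small once the loop has locked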
                                
Abstract:
The increasing aversion of society to technological risks requires the development of inherently safer and environmentally friendlier processes, besides assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent the “risk shift”. In the present work a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools were specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances dangerous for humans and the environment are used or stored. The tools were mainly devoted to application in the “conceptual” and “basic design” stages, when the project is still open to changes (due to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities, all through the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools gives a substantial contribution toward filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design. The proposed decision support system was based on the development of a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs were based on impact models (some of them complex), but are easy and quick to apply in practice. Their full evaluation is possible even starting from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based both on the calculation of the expected consequences of potential accidents and on the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indices, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing “out of control” conditions. In the assessment of layout plans, “ad hoc” tools were developed to account for the hazard of domino escalation and for safety economics.
The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, of the plant, and of the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of isomers of nitrobenzaldehyde) provided the input data needed to demonstrate the method for the inherent safety assessment of materials.
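
The KPI aggregation step described above can be pictured with a small sketch: normalize each heterogeneous indicator to a common scale and combine the results with policy weights. All names, bounds, and weights below are hypothetical placeholders, not the thesis's actual criteria.

    # Toy KPI aggregation: map heterogeneous indicators to [0, 1] (1 = best)
    # and combine them with policy weights into one sustainability score.
    # Names, bounds, and weights are hypothetical placeholders.
    KPI_BOUNDS = {                       # (worst, best) value per indicator
        "economic_cost": (10e6, 1e6),    # lower cost is better
        "societal_risk": (1e-3, 1e-6),   # lower risk is better
        "co2_emissions": (5e4, 1e3),     # lower emissions are better
    }
    WEIGHTS = {"economic_cost": 0.4, "societal_risk": 0.35, "co2_emissions": 0.25}

    def normalize(name, value):
        worst, best = KPI_BOUNDS[name]
        score = (value - worst) / (best - worst)
        return min(1.0, max(0.0, score))          # clamp to [0, 1]

    def sustainability_score(kpis):
        return sum(WEIGHTS[n] * normalize(n, v) for n, v in kpis.items())

    design_a = {"economic_cost": 4e6, "societal_risk": 2e-4, "co2_emissions": 2e4}
    print(sustainability_score(design_a))         # higher = preferable option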
                                
Abstract:
Providing support for multimedia applications on low-power mobile devices remains a significant research challenge. This is primarily due to two reasons:
• Portable mobile devices have modest sizes and weights, and therefore inadequate resources: low CPU processing power, reduced display capabilities, and limited memory and battery lifetimes as compared to desktop and laptop systems.
• Multimedia applications, on the other hand, tend to have distinctive QoS and processing requirements which make them extremely resource-demanding.
This innate conflict introduces key research challenges in the design of multimedia applications and in device-level power optimization. Energy efficiency in this kind of platform can be achieved only via a synergistic hardware and software approach. In fact, while Systems-on-Chip are more and more programmable, thus providing functional flexibility, hardware-only power reduction techniques cannot keep consumption under acceptable bounds. It is well understood, both in research and in industry, that system configuration and management cannot be controlled efficiently by relying only on low-level firmware and hardware drivers: at this level there is a lack of information about user application activity and, consequently, about the impact of power management decisions on QoS. Even though operating system support and integration is a requirement for effective performance and energy management, more effective and QoS-sensitive power management is possible if power awareness and hardware configuration control strategies are tightly integrated with domain-specific middleware services.

The main objective of this PhD research has been the exploration and integration of a middleware-centric energy management with applications and the operating system. We chose to focus on the CPU-memory and video subsystems, since they are the most power-hungry components of an embedded system. A second main objective has been the definition and implementation of software facilities (such as toolkits, APIs, and run-time engines) to improve the programmability and performance efficiency of such platforms.

Enhancing energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Consumer applications are characterized by tight time-to-market constraints and extreme cost sensitivity. The software that runs on modern embedded systems must be high-performance, real-time and, even more important, low-power. Although much progress has been made on these problems, much remains to be done. Multi-Processor Systems-on-Chip (MPSoCs) are increasingly popular platforms for high-performance embedded applications. This leads to interesting challenges in software development, since efficient software development is a major issue for MPSoC designers. An important step in deploying applications on multiprocessors is to allocate and schedule concurrent tasks on the processing and communication resources of the platform. The problem of allocating and scheduling precedence-constrained tasks on the processors of a distributed real-time system is NP-hard, so there is a clear need for deployment technology that addresses these multiprocessing issues. The problem can be tackled by means of specific middleware which takes care of allocating and scheduling tasks on the different processing elements and which also tries to optimize the power consumption of the entire multiprocessor platform (a baseline heuristic is sketched below).
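
Because allocation and scheduling of precedence-constrained tasks is NP-hard, a common baseline is a greedy list scheduler such as the sketch below. It is a generic heuristic shown for illustration; the thesis pursues complete methods that find provably optimal solutions.

    # Greedy list scheduling of precedence-constrained tasks on m processors.
    # A generic baseline heuristic, not the thesis's complete-search method.
    def list_schedule(tasks, deps, durations, m):
        """tasks: ids; deps: {task: set of predecessors};
        durations: {task: time}; m: number of processors."""
        finish = {}                  # task -> finish time
        proc_free = [0.0] * m        # earliest free time of each processor
        schedule = {}                # task -> (processor, start time)
        pending = list(tasks)
        while pending:
            # Ready tasks: all predecessors already scheduled.
            ready = [t for t in pending if deps.get(t, set()) <= set(finish)]
            t = max(ready, key=lambda x: durations[x])    # longest task first
            p = min(range(m), key=lambda i: proc_free[i]) # least-loaded CPU
            start = max(proc_free[p],
                        max((finish[d] for d in deps.get(t, set())), default=0.0))
            finish[t] = start + durations[t]
            proc_free[p] = finish[t]
            schedule[t] = (p, start)
            pending.remove(t)
        return schedule, max(finish.values())             # plan and makespan

    deps = {"c": {"a"}, "d": {"a", "b"}}
    durations = {"a": 2.0, "b": 3.0, "c": 1.0, "d": 2.0}
    plan, makespan = list_schedule(["a", "b", "c", "d"], deps, durations, m=2)
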
This dissertation is an attempt to develop insight into efficient, flexible and optimal methods for allocating and scheduling concurrent applications to multiprocessor architectures. This is a well-known problem in the literature: optimization problems of this kind are very complex even in much simplified variants, therefore most authors propose simplified models and heuristic approaches to solve them in reasonable time. Model simplification is often achieved by abstracting away platform implementation "details". As a result, optimization problems become more tractable, even reaching polynomial time complexity. Unfortunately, this approach creates an abstraction gap between the optimization model and the real HW-SW platform. The main issue with heuristics or, more generally, with incomplete search is that they introduce an optimality gap of unknown size: they provide very limited or no information on the distance between the best computed solution and the optimal one. The goal of this work is to address both the abstraction and the optimality gap, formulating accurate models which account for a number of "non-idealities" of real-life hardware platforms, developing novel mapping algorithms that deterministically find optimal solutions, and implementing the software infrastructures required by developers to deploy applications on the target MPSoC platforms.

Energy-efficient LCD backlight autoregulation on a real-life multimedia application processor. Despite the ever-increasing advances in Liquid Crystal Display (LCD) technology, LCD power consumption is still one of the major limitations to the battery life of mobile appliances such as smart phones, portable media players, and gaming and navigation devices. There is a clear trend towards increasing LCD sizes to exploit the multimedia capabilities of portable devices that can receive and render high-definition video and pictures. Multimedia applications running on these devices require LCD screen sizes of 2.2 to 3.5 inches and more to display video sequences and pictures with the required quality. LCD power consumption depends on the backlight and on the pixel matrix driving circuits, and is typically proportional to the panel area. As a result, its contribution is also likely to be considerable in future mobile appliances. To address this issue, companies are proposing low-power technologies suitable for mobile applications, supporting low-power states and image control techniques. On the research side, several power-saving schemes and algorithms can be found in the literature. Some of them exploit software-only techniques that change the image content to reduce the power associated with the crystal polarization; others aim at decreasing the backlight level while compensating for the luminance reduction, limiting the user-perceived quality degradation by means of pixel-by-pixel image processing algorithms. The major limitation of these techniques is that they rely on the CPU to perform pixel-based manipulations, and their impact on CPU utilization and power consumption has not been assessed. This PhD dissertation shows an alternative approach that exploits, in a smart and efficient way, the hardware image processing unit integrated in almost every current multimedia application processor to implement hardware-assisted image compensation, allowing dynamic scaling of the backlight with a negligible impact on QoS.
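
The backlight-compensation idea can be condensed into a few lines: dim the backlight by a factor b and boost pixel luminance by 1/b, clipping where the panel saturates. The sketch below is a purely software illustration with invented values; the dissertation's contribution is precisely to offload this work to the hardware image processing unit.

    # Backlight dimming with pixel compensation (software sketch, invented
    # values). Dim the backlight to a factor b in (0, 1]; boost pixel
    # luminance by 1/b and clip where the panel saturates.
    def compensate(frame, b):
        """frame: 2D list of 8-bit luminance values; b: backlight factor."""
        gain = 1.0 / b
        return [[min(255, int(round(px * gain))) for px in row] for row in frame]

    frame = [[10, 120, 200], [255, 60, 180]]
    dimmed = compensate(frame, b=0.7)   # backlight at 70%, pixels boosted
    # Pixels above 255 * b (about 178 here) saturate: 180, 200 and 255 clip.
    # That saturation is the QoS loss adaptive algorithms try to keep small.
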
The proposed approach overcomes CPU-intensive techniques by saving system power without requiring either a dedicated display technology or hardware modification.

Thesis overview. The remainder of the thesis is organized as follows. The first part focuses on enhancing the energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Chapter 2 gives an overview of architectural trends in embedded systems, illustrating the principal features of new technologies and the key challenges still open. Chapter 3 presents a QoS-driven methodology for optimal allocation and frequency selection for MPSoCs, based on functional simulation and full-system power estimation. Chapter 4 targets the allocation and scheduling of pipelined, stream-oriented applications on top of distributed-memory architectures with messaging support; we tackled the complexity of the problem by means of decomposition and no-good generation, and prove the increased computational efficiency of this approach with respect to traditional ones. Chapter 5 presents a cooperative framework to solve the allocation, scheduling and voltage/frequency selection problem to optimality for energy-efficient MPSoCs, while in Chapter 6 applications with conditional task graphs are taken into account. Finally, Chapter 7 proposes a complete framework, called Cellflow, to help programmers achieve efficient software implementations on a real architecture, the Cell Broadband Engine processor. The second part focuses on energy-efficient software techniques for LCD displays. Chapter 8 gives an overview of portable-device display technologies, illustrating the principal features of LCD video systems and the key challenges still open. Chapter 9 reviews several energy-efficient software techniques from the literature, while Chapter 10 illustrates in detail our method for saving significant power in an LCD panel. Finally, conclusions are drawn, reporting the main research contributions discussed throughout this dissertation.