945 results for open robot control
Abstract:
This dissertation presents competitive control methodologies for small-scale power systems (SSPS). A SSPS is a collection of sources and loads sharing a common network that can be isolated during terrestrial disturbances. Micro-grids, naval ship electric power systems (NSEPS), aircraft power systems and telecommunication power systems are typical examples of SSPS. The analysis and development of control systems for a SSPS is complicated by the lack of a defined slack bus; moreover, a change in any load or source alters the system parameters in real time. The control system should therefore provide the flexibility required to ensure operation as a single aggregated system. In most cases the sources and loads of a SSPS must be equipped with power electronic interfaces, which can be modeled as dynamic controllable quantities. The mathematical formulation of the micro-grid is carried out with the help of game theory, optimal control and the fundamental theory of electrical power systems. The micro-grid can then be viewed as a dynamic multi-objective optimization problem with nonlinear objectives and variables. Detailed analyses of the optimal solutions were carried out for start-up transient modeling, bus selection modeling and the level of communication within the micro-grid. In each approach a detailed mathematical model is formed to observe the system response. A differential game theoretic approach was used for modeling and optimizing the start-up transients. The start-up transient controller was implemented with open-loop, PI and feedback control methodologies, and a hardware implementation was carried out to validate the theoretical results. The proposed game theoretic controller shows higher performance than the traditional PI controller during start-up; in addition, the optimal transient surface is necessary when implementing the feedback controller for the start-up transient. The experimental results are in agreement with the theoretical simulations. Bus selection and team communication were modeled with discrete and continuous game theory models. Although the players have multiple choices, the controller is capable of choosing the optimum bus, and the team communication structures are able to optimize the players' Nash equilibrium point. All mathematical models are based on the local information of the load or source; as a result, these models are the keys to developing accurate distributed controllers.
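For readers unfamiliar with the differential game formulation referred to above, its generic structure is sketched here; the specific state equations, cost functionals and player sets (sources and loads) are those defined in the dissertation, and the notation below is only an illustrative assumption. Each of the N players chooses a control input for the shared dynamics and minimizes its own cost,

$$\dot{x} = f\big(x, u_1, \dots, u_N, t\big), \qquad J_i(u_1, \dots, u_N) = \int_0^T L_i\big(x, u_1, \dots, u_N, t\big)\,dt, \quad i = 1, \dots, N,$$

and a set of strategies $(u_1^*, \dots, u_N^*)$ is a Nash equilibrium when no player can reduce its own cost by unilaterally deviating:

$$J_i\big(u_i^*, u_{-i}^*\big) \le J_i\big(u_i, u_{-i}^*\big) \quad \text{for all admissible } u_i \text{ and every player } i.$$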
Abstract:
Polycarbonate (PC) is an important engineering thermoplastic that is currently produced on a large industrial scale using monomers such as bisphenol A and phosgene. Since phosgene is highly toxic, the non-phosgene route developed by Asahi Corporation of Japan, which uses diphenyl carbonate (DPC) as an alternative monomer, is significantly more environmentally friendly. Other advantages include the use of CO2 instead of CO as a raw material and the elimination of major waste-water production. However, for the production of DPC to be economically viable, reactive-distillation units are needed to obtain the necessary yields by shifting the reaction equilibrium toward the desired products and separating the products at the point where the equilibrium reaction occurs. In the field of chemical reaction engineering, many reactions suffer from a low equilibrium constant. The main goal of this research is to determine the optimal process needed to shift these reactions by using appropriate control strategies for the reactive-distillation system. An extensive dynamic mathematical model has been developed to investigate different control and processing strategies of the reactive-distillation units to increase the production of DPC. The high-fidelity dynamic models include extensive thermodynamic and reaction-kinetics models and incorporate the necessary mass and energy balances for the various stages of the reactive-distillation units. The study presented in this document shows the possibility of producing DPC via a single reactive-distillation column instead of the conventional two-column configuration, with a production rate of 16.75 tons/h from starting materials of 74.69 tons/h of phenol and 35.75 tons/h of dimethyl carbonate (DMC). This represents a threefold increase over the projected production rate given in the literature for a two-column configuration. In addition, the purity of the DPC produced can reach levels as high as 99.5% with the effective use of controls. These studies are based on simulations performed with the high-fidelity dynamic models.
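As a rough consistency check on the reported flow rates, assuming the overall transesterification stoichiometry 2 C6H5OH + (CH3O)2CO -> (C6H5O)2CO + 2 CH3OH (which in practice proceeds through a methyl phenyl carbonate intermediate), the molar feed rates are

$$\frac{74{,}690\ \text{kg/h}}{94.11\ \text{kg/kmol}} \approx 794\ \text{kmol/h phenol}, \qquad \frac{35{,}750\ \text{kg/h}}{90.08\ \text{kg/kmol}} \approx 397\ \text{kmol/h DMC},$$

essentially the 2:1 stoichiometric ratio, while the product rate

$$\frac{16{,}750\ \text{kg/h}}{214.22\ \text{kg/kmol}} \approx 78\ \text{kmol/h DPC}$$

corresponds to converting roughly 20% of the DMC fed to DPC per pass.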
Abstract:
A novel solution to the long-standing issue of chip entanglement and breakage in metal cutting is presented in this dissertation. Through this work, an attempt is made to achieve universal chip control in machining by using chip guidance and subsequent breakage by backward bending (tensile loading of the chip's rough top surface) to effectively break long continuous chips into small segments. A major limitation of the chip breaker geometries on disposable carbide inserts is that their application range is limited to a narrow band of cutting conditions. Even within a recommended operating range, chip breakers do not function as effectively as designed due to inherent variations in the cutting process. Moreover, for a particular process, matching the chip breaker geometry with the right cutting conditions to achieve effective chip control is a highly iterative exercise. The existence of a large variety of proprietary chip breaker designs further exacerbates the problem of easily implementing a robust and comprehensive chip control technique. To address the need for a robust and universal chip control technique, a new method is proposed in this work. By using a single tool top form geometry coupled with a tooling system for inducing chip breaking by backward bending, the proposed method achieves comprehensive chip control over a wide range of cutting conditions. A geometry-based model is developed to predict a variable edge inclination angle that guides the chip flow to a predetermined target location. Chip kinematics for the new tool geometry is examined via photographic evidence from experimental cutting trials, using both qualitative and quantitative characterization methods. Results from the chip characterization studies indicate that the chip flow and final form show remarkable consistency across multiple levels of workpiece and tool configurations as well as cutting conditions. A new tooling system is then designed to comprehensively break the chip by backward bending. Test results with the new tooling system prove that, by utilizing the chip guidance and backward bending mechanism, long continuous chips can be more consistently broken into smaller segments that are generally deemed acceptable or good chips. The proposed tool can be applied effectively over a wider range of cutting conditions than present chip breakers, possibly taking the first step towards achieving universal chip control in machining.
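As context for the geometry-based flow-direction model (this is classical background, not the model developed in the dissertation), oblique-cutting analyses often start from Stabler's chip-flow rule, which states that the chip flow angle measured on the rake face is approximately equal to the edge inclination angle,

$$\eta_c \approx \lambda_s,$$

so, to first order, a tool whose inclination angle varies along the cutting edge can steer the chip toward a chosen target location.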
Abstract:
This report presents a study of the problem of spacecraft attitude control using magnetic actuators. Several existing approaches are reviewed, and one control strategy is implemented and simulated: a time-varying feedback control law achieving inertial pointing for a magnetically actuated spacecraft. The report explains the modeling of the spacecraft rigid-body dynamics, kinematics and attitude control in detail. While control laws have previously been established for stabilization around a local equilibrium, this report presents results for a control law that yields a generic, global solution for attitude stabilization of a magnetically actuated spacecraft. The report also covers the use of MATLAB as a tool for both modeling and simulation of the spacecraft and controller. In conclusion, the simulations illustrate the performance of the controller in independently stabilizing the spacecraft about three mutually perpendicular axes.
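A standard formulation of the underlying plant (the notation here is assumed for illustration, not taken from the report) is the rigid-body dynamics with magnetic actuation,

$$J\dot{\omega} + \omega \times J\omega = m \times B(t) + \tau_d,$$

where $J$ is the spacecraft inertia matrix, $\omega$ the body angular velocity, $m$ the commanded magnetic dipole moment of the torquer coils, $B(t)$ the local geomagnetic field expressed in the body frame, and $\tau_d$ a disturbance torque. Because the control torque $m \times B(t)$ is always perpendicular to $B(t)$, the system is underactuated at every instant; stabilization relies on the variation of $B(t)$ along the orbit, which is why the implemented feedback law is time-varying.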
Abstract:
This research evaluated an Intelligent Compaction (IC) unit on the M-189 highway reconstruction project at Iron River, Michigan. The results from the IC unit were compared to several traditional compaction measurement devices, including the Nuclear Density Gauge (NDG), Geogauge, Light Weight Deflectometer (LWD), Dynamic Cone Penetrometer (DCP), and Modified Clegg Hammer (MCH). Point-measurement data were collected on a test section with 30 test locations on both the final Class II sand base layer and the 22A gravel layer. These point measurements were compared with the IC measurement values (ICMVs) on a point-to-point basis through linear regression analysis. Poor correlations were obtained when comparing the ICMVs to the compaction point measurements using simple regression analysis. Factors contributing to the weak correlations include soil heterogeneity, variation in IC roller operating parameters, in-place moisture content, the narrow measurement ranges of the compaction devices, and the support conditions of the underlying layers. After incorporating some of these factors into a multiple regression analysis, the strength of the correlation improved significantly, especially on the stiffer gravel layer. Measurements were also studied from an overall distribution perspective in terms of average, measurement range, standard deviation, and coefficient of variation. Based on the data analysis, on-site project observation and a literature review, conclusions were drawn on how IC performed with regard to compaction control on the M-189 reconstruction project.
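The point-to-point comparison described above amounts to regressing ICMV on each point measurement and then adding covariates such as moisture content in a multiple regression. A minimal sketch of that workflow is shown below; the arrays icmv, lwd_mod and moisture are hypothetical placeholders, not data from the M-189 project.

```python
import numpy as np

# Hypothetical point-to-point data (one row per test location); placeholder values only.
icmv     = np.array([18.2, 22.5, 20.1, 25.3, 19.8, 23.7])   # IC measurement values
lwd_mod  = np.array([41.0, 55.2, 47.5, 60.1, 44.3, 57.8])   # LWD modulus, MPa
moisture = np.array([ 6.1,  4.8,  5.5,  4.2,  5.9,  4.5])   # in-place moisture content, %

# Simple linear regression: ICMV explained by a single point measurement.
slope, intercept = np.polyfit(lwd_mod, icmv, 1)
pred_simple = slope * lwd_mod + intercept
r2_simple = 1 - np.sum((icmv - pred_simple)**2) / np.sum((icmv - icmv.mean())**2)

# Multiple regression: add moisture content as a second explanatory variable.
X = np.column_stack([np.ones_like(lwd_mod), lwd_mod, moisture])
coef, *_ = np.linalg.lstsq(X, icmv, rcond=None)
pred_multi = X @ coef
r2_multi = 1 - np.sum((icmv - pred_multi)**2) / np.sum((icmv - icmv.mean())**2)

print(f"simple R^2 = {r2_simple:.2f}, multiple R^2 = {r2_multi:.2f}")
```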
Abstract:
Two important and upcoming technologies, microgrids and electricity generation from wind resources, are increasingly being combined. Various control strategies can be implemented, and droop control provides a simple option without requiring communication between microgrid components. Eliminating the communication system as a single point of failure is especially important in the remote, islanded microgrids considered in this work. However, traditional droop control does not allow the microgrid to utilize much of the power available from the wind. This dissertation presents a novel droop control strategy, which implements a droop surface in a higher dimension than the traditional strategy. The droop control relationship then depends on two variables: the dc microgrid bus voltage and the current wind speed. An approach for optimizing this droop control surface to meet a given objective, for example utilizing all of the power available from a wind resource, is proposed and demonstrated. Various cases are used to test the proposed optimal high-dimension droop control method and demonstrate its function. First, the use of linear multidimensional droop control without optimization is demonstrated through simulation. Next, an optimal high-dimension droop control surface is implemented with a simple dc microgrid containing two sources and one load. Various cases of changing load and wind speed are investigated using simulation and hardware-in-the-loop techniques. Optimal multidimensional droop control is then demonstrated with a wind resource in a full dc microgrid example containing an energy storage device as well as multiple sources and loads. Finally, the optimal high-dimension droop control method is applied with a solar resource and with a load model developed for a military patrol base application. The operation of the proposed control is again investigated using simulation and hardware-in-the-loop techniques.
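To illustrate the idea of a droop relationship in a higher dimension, the sketch below looks up a per-unit power command from a surface indexed by dc bus voltage and wind speed. The breakpoints, values and the droop_power helper are illustrative assumptions, not the optimized surface developed in the dissertation; the interpolation uses scipy's RegularGridInterpolator.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical droop surface: per-unit power command for the wind source as a
# function of dc bus voltage and wind speed (placeholder breakpoints and values).
v_grid = np.array([370.0, 380.0, 390.0, 400.0])   # dc bus voltage, V
w_grid = np.array([4.0, 8.0, 12.0])               # wind speed, m/s
p_surface = np.array([
    [0.20, 0.60, 1.00],   # 370 V: bus voltage low -> inject more of the available power
    [0.15, 0.45, 0.80],
    [0.10, 0.30, 0.55],
    [0.00, 0.10, 0.25],   # 400 V: bus voltage high -> back off
])

surface = RegularGridInterpolator((v_grid, w_grid), p_surface)

def droop_power(v_bus, wind_speed):
    """Look up the per-unit power command, clamping queries to the grid."""
    v = np.clip(v_bus, v_grid[0], v_grid[-1])
    w = np.clip(wind_speed, w_grid[0], w_grid[-1])
    return float(surface([[v, w]])[0])

print(droop_power(385.0, 9.0))   # an operating point between breakpoints
```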
Abstract:
For a microgrid with a high penetration level of renewable energy, energy storage use becomes more integral to the system performance due to the stochastic nature of most renewable energy sources. This thesis examines the use of droop control of an energy storage source in dc microgrids in order to optimize a global cost function. The approach involves using a multidimensional surface to determine the optimal droop parameters based on load and state of charge. The optimal surface is determined using knowledge of the system architecture and can be implemented with fully decentralized source controllers. The optimal surface control of the system is presented. Derivations of a cost function along with the implementation of the optimal control are included. Results were verified using a hardware-in-the-loop system.
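One plausible form of such a global cost function (the terms, weights and symbols here are illustrative assumptions; the thesis derives its own cost function) penalizes bus-voltage deviation, state-of-charge drift and losses, with the storage droop parameters read from an optimized surface over load and state of charge:

$$J = \int_0^T \Big[\alpha\big(V_{bus}(t)-V_{ref}\big)^2 + \beta\big(SOC(t)-SOC_{ref}\big)^2 + \gamma\,P_{loss}(t)\Big]\,dt, \qquad k_{droop}(t) = g\big(P_{load}(t),\,SOC(t)\big).$$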
Abstract:
Semi-active damping devices have been shown to be effective in mitigating unwanted vibrations in civil structures. These devices impart force indirectly through real-time alterations to structural properties. Simulating the complex behavior of these devices for laboratory-scale experiments is a major challenge. Commercial devices for seismic applications typically operate in the 2-10 kN range; this force is too high for small-scale testing applications, where requirements typically range from 0-10 N. Several challenges must be overcome to produce damping forces at this level. In this study, a small-scale magneto-rheological (MR) damper utilizing a fluid-absorbent metal foam matrix is developed and tested to accomplish this goal. This matrix allows MR fluid to be extracted upon magnetic excitation in order to produce MR-fluid shear stresses and viscosity effects between an electromagnetic piston, the foam, and the damper housing. Dampers for uniaxial seismic excitation are traditionally positioned in the horizontal orientation, allowing MR fluid to gather in the lower part of the damper housing when partially filled. Thus, the absorbent matrix is placed in the bottom of the housing, eliminating the need to fill the entire device with MR fluid, a practice that requires seals that add significant unwanted friction to the desired low-force device. The damper, once constructed, can be used in feedback control applications to reduce seismic vibrations and to test structural control algorithms and wireless command devices. To validate this device, a parametric study was performed utilizing force and acceleration measurements to characterize damper performance and controllability for this actuator. A discussion of the results is presented to demonstrate the attainment of the damper design objectives.
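A parametric form commonly used to characterize MR dampers is the Bingham-plastic model; whether it fits this foam-matrix device is exactly the kind of question the parametric study addresses, so the expression below is background rather than the study's own model:

$$F = c_0\,\dot{x} + f_c(I)\,\operatorname{sgn}(\dot{x}) + f_0,$$

where $\dot{x}$ is the piston velocity, $c_0$ a viscous damping coefficient, $f_c(I)$ the controllable friction force set by the coil current $I$ through the field-dependent yield stress of the MR fluid, and $f_0$ a force offset.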
Abstract:
As microgrid power systems gain prevalence and renewable energy comprises greater and greater portions of distributed generation, energy storage becomes important to offset the higher variance of renewable energy sources and maximize their usefulness. One of the emerging techniques is to utilize a combination of lead-acid batteries and ultracapacitors to provide both short and long-term stabilization to microgrid systems. The different energy and power characteristics of batteries and ultracapacitors imply that they ought to be utilized in different ways. Traditional linear controls can use these energy storage systems to stabilize a power grid, but cannot effect more complex interactions. This research explores a fuzzy logic approach to microgrid stabilization. The ability of a fuzzy logic controller to regulate a dc bus in the presence of source and load fluctuations, in a manner comparable to traditional linear control systems, is explored and demonstrated. Furthermore, the expanded capabilities (such as storage balancing, self-protection, and battery optimization) of a fuzzy logic system over a traditional linear control system are shown. System simulation results are presented and validated through hardware-based experiments. These experiments confirm the capabilities of the fuzzy logic control system to regulate bus voltage, balance storage elements, optimize battery usage, and effect self-protection.
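As a flavor of what such a controller looks like, the sketch below is a toy Sugeno-style fuzzy regulator mapping bus-voltage error and its rate to a per-unit storage power command. The function names, membership breakpoints, rule table and output levels are illustrative assumptions, not the rules developed in this research.

```python
def tri(x, a, b, c):
    """Triangular membership with vertices a <= b <= c, clipped to [0, 1]."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b), 1.0))

def fuzzy_storage_command(v_err, dv_err):
    """Map voltage error (V_ref - V_bus, V) and its rate (V/s) to a per-unit
    storage power command (positive = discharge into the bus). Toy example."""
    # Fuzzify the inputs into negative / zero / positive sets.
    e = {"N": tri(v_err, -30, -15, 0), "Z": tri(v_err, -15, 0, 15), "P": tri(v_err, 0, 15, 30)}
    d = {"N": tri(dv_err, -10, -5, 0), "Z": tri(dv_err, -5, 0, 5), "P": tri(dv_err, 0, 5, 10)}

    # Rule table: consequent power level (per unit) for each (error, rate) pair.
    levels = {("N", "N"): -1.0, ("N", "Z"): -0.6, ("N", "P"): -0.2,
              ("Z", "N"): -0.3, ("Z", "Z"):  0.0, ("Z", "P"):  0.3,
              ("P", "N"):  0.2, ("P", "Z"):  0.6, ("P", "P"):  1.0}

    # Weighted-average defuzzification (product t-norm for rule firing strength).
    num = sum(e[a] * d[b] * p for (a, b), p in levels.items())
    den = sum(e[a] * d[b] for a, b in levels) or 1.0
    return num / den

# Example: bus voltage 12 V below the reference and still falling -> discharge command.
print(fuzzy_storage_command(v_err=12.0, dv_err=4.0))
```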
Abstract:
Free-radical retrograde-precipitation polymerization (FRRPP) is a novel polymerization process discovered by Dr. Gerard Caneba in the late 1980s. The current study is aimed at gaining a better understanding of the reaction mechanism of FRRPP and the thermodynamically driven features that predominate in controlling the chain reaction. A previously developed mathematical model representing free-radical polymerization kinetics was used to simulate a classic bulk polymerization system from the literature. Unlike other existing models, such a sparse-matrix-based representation allows one to explicitly accommodate chain-length-dependent kinetic parameters. Extrapolating from past results, mixing was experimentally shown to exert a significant influence on reaction control in FRRPP systems; mixing alone drives the otherwise severely diffusion-controlled reaction propagation in phase-separated polymer domains. Therefore, in a quiescent system, in the absence of mixing, it is possible to retard the growth of phase-separated domains, thus producing isolated polymer nanoparticles (globules). Such a diffusion-controlled, self-limiting phenomenon of chain growth was also observed using time-resolved small-angle x-ray scattering studies of reaction kinetics in quiescent FRRPP systems. Combining the concept of self-limiting chain growth in quiescent FRRPP systems with the spatioselective reaction initiation of lithography, microgel structures were synthesized in a single step, without the use of molds or additives. Hard x-rays from the bending-magnet radiation of a synchrotron were used as the initiation source, instead of the more statistically oriented chemical initiators. Such a spatially defined reaction was shown to be self-limiting to the irradiated regions, following a polymerization-induced self-assembly phenomenon. The pattern-transfer aspects of this technique were therefore studied in the FRRPP of N-isopropylacrylamide (NIPAm) and methacrylic acid (MAA), a thermoreversible and an ionic hydrogel, respectively. The reaction temperature increases the contrast between the exposed and unexposed zones of the formed microgels, while the extent of phase separation is directly proportional to the irradiation dose. The response of poly(NIPAm) microgels prepared by the technique described in this study was also characterized by small-angle neutron scattering.
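For reference, the textbook steady-state relations that chain-length-independent free-radical models reduce to are shown below; the sparse-matrix representation used in this study generalizes them by allowing the propagation and termination constants to depend on chain length.

$$R_p = k_p\,[M]\,[P^\bullet], \qquad [P^\bullet] \approx \sqrt{\frac{f\,k_d\,[I]}{k_t}},$$

where $[M]$ and $[I]$ are the monomer and initiator concentrations, $k_d$, $k_p$ and $k_t$ the decomposition, propagation and termination rate constants, and $f$ the initiator efficiency.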
Abstract:
This study evaluates the clinical applicability of administering sodium nitroprusside with a closed-loop titration system compared with a manually adjusted system. The mean arterial pressure (MAP) was recorded every 10 sec in 20 patients under computer regulation (group 1) and every 30 sec in ten patients under manual regulation (group 2) during the first 150 min after open heart surgery. The results (16,343 and 2,912 data points in groups 1 and 2, respectively) were then analyzed in four time frames and five pressure ranges to indicate clinical efficacy. Sixty percent of the measured MAP values in both groups were within the desired +/- 10% during the first 10 min. Thereafter until the end of observation, the MAP was maintained within +/- 10% of the desired set-point 90% of the time in group 1 vs. 60% of the time in group 2. One percent and 11% of the data points deviated by more than 20% from the set-point in groups 1 and 2, respectively (p less than .05, chi-square test). The computer-assisted therapy provided better control of MAP, was safe to use, and helped to reduce nursing demands.
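A rough consistency check on the reported sample counts, assuming one fixed sampling interval per group (10 sec for group 1, 30 sec for group 2) over the 150-min observation window:

$$20 \times \frac{150 \times 60}{10} = 18{,}000 \ \text{possible points vs. } 16{,}343 \text{ recorded}, \qquad 10 \times \frac{150 \times 60}{30} = 3{,}000 \ \text{vs. } 2{,}912 \text{ recorded}.$$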
Abstract:
BACKGROUND: Gene expression analysis has emerged as a major biological research area, with real-time quantitative reverse transcription PCR (RT-QPCR) being one of the most accurate and widely used techniques for expression profiling of selected genes. In order to obtain results that are comparable across assays, a stable normalization strategy is required. In general, the normalization of PCR measurements between different samples uses one to several control genes (e.g. housekeeping genes), from which a baseline reference level is constructed. Thus, the choice of the control genes is of utmost importance, yet there is no generally accepted standard technique for screening a large number of candidates and identifying the best ones. RESULTS: We propose a novel approach for scoring and ranking candidate genes for their suitability as control genes. Our approach relies on publicly available microarray data and allows the combination of multiple data sets originating from different platforms and/or representing different pathologies. The use of microarray data allows the screening of tens of thousands of genes, producing very comprehensive lists of candidates. We also provide two lists of candidate control genes: one which is breast cancer-specific and one with more general applicability. Two genes from the breast cancer list which had not previously been used as control genes are identified and validated by RT-QPCR. Open source R functions are available at http://www.isrec.isb-sib.ch/~vpopovic/research/ CONCLUSION: We proposed a new method for identifying candidate control genes for RT-QPCR, able to rank thousands of genes according to predefined suitability criteria, and applied it to the case of breast cancer. We also showed empirically that translating the results from the microarray to the PCR platform is achievable.
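The scoring idea can be caricatured as ranking genes by how stable and well expressed they are across pooled microarray samples. The snippet below is a deliberately simplified stand-in, with placeholder data and a toy weighting, for the published method and its R functions.

```python
import numpy as np

# Hypothetical expression matrix: rows = genes, columns = microarray samples
# (e.g. log2 intensities pooled from several data sets). Placeholder values only.
genes = ["GAPDH", "ACTB", "GENE_X", "GENE_Y"]
expr = np.array([
    [10.1, 10.3, 10.0, 10.2, 10.1],
    [11.5, 11.9, 11.2, 12.0, 11.6],
    [ 8.0,  9.5,  7.1,  8.8,  9.9],
    [ 9.9,  9.8, 10.0,  9.9, 10.1],
])

# Simplified stability score: low variability across samples is good, and a
# reasonably high mean expression is preferred so the gene is measurable by PCR.
mean = expr.mean(axis=1)
cv = expr.std(axis=1) / mean            # coefficient of variation per gene
score = cv - 0.01 * mean                # smaller score = better candidate (toy weighting)

for g, s in sorted(zip(genes, score), key=lambda t: t[1]):
    print(f"{g:8s} score = {s:.3f}")
```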
Abstract:
BACKGROUND: Unlike most antihyperglycaemic drugs, glucagon-like peptide-1 (GLP-1) receptor agonists have a glucose-dependent action and promote weight loss. We compared the efficacy and safety of liraglutide, a human GLP-1 analogue, with exenatide, an exendin-based GLP-1 receptor agonist. METHODS: Adults with inadequately controlled type 2 diabetes on maximally tolerated doses of metformin, sulphonylurea, or both, were stratified by previous oral antidiabetic therapy and randomly assigned to receive additional liraglutide 1.8 mg once a day (n=233) or exenatide 10 microg twice a day (n=231) in a 26-week open-label, parallel-group, multinational (15 countries) study. The primary outcome was change in glycosylated haemoglobin (HbA(1c)). Efficacy analyses were by intention to treat. The trial is registered with ClinicalTrials.gov, number NCT00518882. FINDINGS: Mean baseline HbA(1c) for the study population was 8.2%. Liraglutide reduced mean HbA(1c) significantly more than did exenatide (-1.12% [SE 0.08] vs -0.79% [0.08]; estimated treatment difference -0.33; 95% CI -0.47 to -0.18; p<0.0001) and more patients achieved a HbA(1c) value of less than 7% (54% vs 43%, respectively; odds ratio 2.02; 95% CI 1.31 to 3.11; p=0.0015). Liraglutide reduced mean fasting plasma glucose more than did exenatide (-1.61 mmol/L [SE 0.20] vs -0.60 mmol/L [0.20]; estimated treatment difference -1.01 mmol/L; 95% CI -1.37 to -0.65; p<0.0001) but postprandial glucose control was less effective after breakfast and dinner. Both drugs promoted similar weight losses (liraglutide -3.24 kg vs exenatide -2.87 kg). Both drugs were well tolerated, but nausea was less persistent (estimated treatment rate ratio 0.448, p<0.0001) and minor hypoglycaemia less frequent with liraglutide than with exenatide (1.93 vs 2.60 events per patient per year; rate ratio 0.55; 95% CI 0.34 to 0.88; p=0.0131; 25.5% vs 33.6% had minor hypoglycaemia). Two patients taking both exenatide and a sulphonylurea had a major hypoglycaemic episode. INTERPRETATION: Liraglutide once a day provided significantly greater improvements in glycaemic control than did exenatide twice a day, and was generally better tolerated. The results suggest that liraglutide might be a treatment option for type 2 diabetes, especially when weight loss and risk of hypoglycaemia are major considerations.
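Note that the reported estimated treatment differences coincide with the simple differences of the group means:

$$-1.12\% - (-0.79\%) = -0.33\%\ (\mathrm{HbA_{1c}}), \qquad -1.61 - (-0.60) = -1.01\ \text{mmol/L (fasting plasma glucose)}.$$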
Abstract:
"Open source and European antitrust laws: An analysis of copyleft and the prohibition of software license fees on the basis of art. 101 TFEU and the block exemptions" Open source software and open source licenses (like the GNU GPL) are not only relevant for computer nerds or activists – they are already business. They are, for example, the foundation of LINUX, the only real rival of MICROSOFT's WINDOWS line in the field of operating systems for IBM PC compatibles. Art. 101 TFEU (like its identical predecessor, art. 81 TEC), as part of EU antitrust law, prohibits contract terms such as price fixing and some forms of technology control. Are copyleft – the "viral effect", the "cancer" – and the prohibition of software license fees in the cross hairs of this legal rule? On the other hand, since 2004 the European Union has had a new Technology Transfer Block Exemption that, for the first time, includes software license agreements in its scope: a safe harbour and a dry place under an umbrella for open source software? After the introduction (A), which describes open source software, the text analyses the system of European Union competition law, or antitrust law, and the requirements of the block exemptions (B). The starting point of the antitrust analysis are undertakings – but who are the undertakings (C) in a field of widespread, independent developers organized as a "bazaar"? To see how much open source has to fear from the law of the European Union, the anti-competitive and pro-competitive effects of open source are finally weighed against each other within the legal framework (D). The conclusion (E) shows: not nothing, but not much.