16 results for Field Oriented Control

em Digital Commons at Florida International University


Relevance: 80.00%

Abstract:

High efficiency of the power converters placed between renewable energy sources and the utility grid is required to maximize the utilization of these sources. Power quality is another concern, one that traditionally requires large passive elements (inductors, capacitors) to be placed between these sources and the grid. The main objective is to develop a higher-level, high-frequency-based power converter system (HFPCS) that optimizes the use of hybrid renewable power injected into the power grid. The HFPCS provides high efficiency, reduced size of passive components, higher power density, lower harmonic distortion, higher reliability, and lower cost. A dynamic model for each part of this system is developed, simulated, and tested. The steady-state performance of the grid-connected hybrid power system with battery storage is analyzed. Various types of simulations were performed, and a number of algorithms were developed and tested to verify the effectiveness of the power conversion topologies. A modified hysteresis-control strategy for the rectifier and the battery charging/discharging system was developed and implemented. A voltage-oriented control (VOC) scheme was developed to control the energy injected into the grid. The developed HFPCS was compared experimentally with other currently available power converters. It was then employed inside a microgrid infrastructure and connected to the power grid to verify its power transfer capabilities and grid connectivity. Grid connectivity tests confirmed these power transfer capabilities, in addition to the converter's ability to serve the load in a shared manner. To investigate the performance of the developed system, an experimental setup for the HF-based hybrid generation system was constructed. We designed a board containing a digital signal processor chip on which the developed control system was embedded; the board was fabricated and experimentally tested. The system's high-precision requirements were verified. Each component of the system was built and tested separately, and then the whole system was connected and tested. The simulation and experimental results confirm the effectiveness of the developed converter system for grid-connected hybrid renewable energy systems as well as for hybrid electric vehicles and other industrial applications.
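The abstract mentions a modified hysteresis-control strategy for the rectifier and battery charging/discharging system. As a rough illustration of the underlying idea (a minimal two-level hysteresis band controller; the dissertation's modified strategy is more elaborate, and the function and parameter names here are hypothetical):

```python
def hysteresis_switch(i_meas, i_ref, band, state):
    """Two-level hysteresis current controller: turn the converter leg
    on when the measured current falls below the lower band limit, off
    when it rises above the upper limit; otherwise hold the last state."""
    if i_meas < i_ref - band / 2:
        return True   # current too low: switch on to raise it
    if i_meas > i_ref + band / 2:
        return False  # current too high: switch off to let it decay
    return state      # inside the band: keep the previous state
```

The controller never commands a switching action while the current stays inside the band, which is what bounds the switching frequency in practice.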

Relevance: 40.00%

Abstract:

Access control (AC) limits access to the resources of a system to authorized entities only. Given that information systems today are increasingly interconnected, AC is extremely important. The implementation of an AC service is a complicated task, yet the requirements for an AC service vary widely. Accordingly, the design of an AC service should be flexible and extensible in order to save development effort and time. Unfortunately, with conventional object-oriented techniques, when an extension has not been anticipated at design time, the modification incurred by the extension is often invasive. Invasive changes destroy design modularity, further deteriorate design extensibility, and, even worse, reduce product reliability.

A concern is crosscutting if it spans multiple object-oriented classes. It was identified that invasive changes were due to the crosscutting nature of most unplanned extensions. To overcome this problem, an aspect-oriented design approach for AC services was proposed, as aspect-oriented techniques can effectively encapsulate crosscutting concerns. The proposed approach was applied to develop an AC framework that supports the role-based access control model. In the framework, the core role-based access control mechanism is given in an object-oriented design, while each extension is captured as an aspect. The resulting framework is well-modularized, flexible, and, most importantly, supports noninvasive adaptation.

In addition, a process to formalize the aspect-oriented design was described, the purpose being to provide high assurance for AC services. Object-Z was used to specify the static structure, and Predicate/Transition nets were used to model the dynamic behavior. Object-Z was extended to facilitate specification in an aspect-oriented style. The process of formal modeling helps designers enhance their understanding of the design and hence detect problems. Furthermore, the specification can be mathematically verified, which provides confidence that the design is correct. It was illustrated through an example that the model was ready for formal analysis.
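The core idea of treating access control as a crosscutting concern can be sketched in a few lines: instead of scattering permission checks inside every business method, the check is "woven" around the methods from one place. The sketch below uses a Python decorator as a stand-in for an aspect (the dissertation's framework uses aspect-oriented techniques proper; the role table and function names here are hypothetical):

```python
import functools

# Hypothetical role-to-permission table for the RBAC model.
ROLE_PERMISSIONS = {"admin": {"read", "write"}, "viewer": {"read"}}

def requires_permission(permission):
    """Aspect-like decorator: the access check is applied around the
    business method rather than duplicated inside each class, so adding
    or changing the policy does not invasively modify the core design."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(user_role, *args, **kwargs):
            if permission not in ROLE_PERMISSIONS.get(user_role, set()):
                raise PermissionError(f"role {user_role!r} lacks {permission!r}")
            return func(user_role, *args, **kwargs)
        return wrapper
    return decorator

@requires_permission("write")
def update_record(user_role, record, value):
    """A business method that stays free of access-control code."""
    record["value"] = value
    return record
```

Because the policy lives entirely in the decorator and the role table, an unplanned extension (say, a new constraint on roles) changes only the aspect, not the classes it advises.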

Relevance: 30.00%

Abstract:

This thesis chronicles the design and implementation of an Internet/Intranet- and database-based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting the desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.
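The nearest-neighbor comparison that users perform visually can also be stated as a simple automated screen: flag any observation whose wind speed differs too much from its spatially closest neighbor. A minimal sketch (the thesis application is interactive and Java-based; this standalone function, its threshold, and its coordinate convention are illustrative assumptions):

```python
import math

def flag_outliers(obs, threshold=10.0):
    """Flag each observation whose wind speed differs from that of its
    nearest (spatially closest) neighbour by more than `threshold`.
    obs: list of (x_km, y_km, wind_speed) in storm-relative coordinates.
    Returns a list of booleans, True where the observation is suspect."""
    flags = []
    for i, (x, y, w) in enumerate(obs):
        # Find the spatially closest other observation.
        nearest = min(
            (o for j, o in enumerate(obs) if j != i),
            key=lambda o: math.hypot(o[0] - x, o[1] - y),
        )
        flags.append(abs(w - nearest[2]) > threshold)
    return flags
```

A human analyst would still review the flagged points, since a large neighbor difference can be a real gradient rather than a bad datum.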

Relevance: 30.00%

Abstract:

Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. Under this accuracy-oriented conception of memory, unlike the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony.

This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free-narrative response was elicited, or a series of yes/no or cued questions was asked. Instructions provided by the interviewer indicated to the participants that they should stress either being informative or being accurate. The interviews were then transcribed and scored.

Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and responses that were less precise. Participants in the free-narrative condition were the most accurate. Participants in the cued-questions condition provided the most precise responses. Participants in the yes/no-questions condition were the most likely to say "I don't know." The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said "I don't know" to maintain accuracy. When withholding responses and adjusting precision were both possible, people used both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.
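The distinction between quantity-oriented and accuracy-oriented scoring comes down to the denominator: withheld ("I don't know") responses are excluded when computing accuracy. A minimal sketch of that scoring arithmetic (the study's actual coding scheme is richer; the labels used here are assumptions):

```python
def accuracy_rate(responses):
    """Accuracy over volunteered answers only: withheld responses are
    excluded from the denominator, so withholding uncertain answers can
    keep accuracy high even as the amount of information drops.
    responses: list of 'correct', 'incorrect', or 'withheld'."""
    answered = [r for r in responses if r != "withheld"]
    if not answered:
        return None  # nothing was volunteered
    return sum(r == "correct" for r in answered) / len(answered)
```

Under a quantity-oriented score (correct answers over all questions), withholding always hurts; under this accuracy-oriented score, it can only help.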

Relevance: 30.00%

Abstract:

We tested the relative importance of top-down and bottom-up effects by experimentally evaluating the combined and separate effects of nutrient availability and grazer species composition on epiphyte communities and seagrass condition in Florida Bay. Although we succeeded in substantially enriching our experimental cylinders, as indicated by elevated nitrogen concentrations in epiphytes and seagrass leaves, we did not observe any major increases in epiphyte biomass or major loss of Thalassia testudinum by algal overgrowth. Additionally, we did not detect any strong grazer effects and found very few significant nutrient-grazer interactions. While this might suggest that there was no important differential response to nutrients by individual grazer species or by various combinations of grazers, our results were complicated by the lack of significant differences between control and grazer treatments, and as such, these results are best explained by the presence of unwanted amphipod grazers (mean = 471 ind. m⁻²) in the control cylinders. Our estimates of grazing rates and epiphyte productivities indicate that amphipods in the control cylinders could have lowered epiphyte biomass to the same level that the experimental grazers did, thus effectively transforming the control treatments into grazer treatments. If so, our experiments suggest that the effects of invertebrate grazing (and those of amphipods alone) were stronger than the effects of nutrient enrichment on epiphytic algae, and that it does not require a large density

Relevance: 30.00%

Abstract:

As users continually request additional functionality, software systems will continue to grow in their complexity, as well as in their susceptibility to failures. Particularly for sensitive systems requiring higher levels of reliability, faulty system modules may increase development and maintenance cost. Hence, identifying them early would support the development of reliable systems through improved scheduling and quality control. Research effort to predict software modules likely to contain faults has, as a consequence, been substantial. Although a wide range of fault prediction models have been proposed, we remain far from having reliable tools that can be widely applied to real industrial systems. For projects with known fault histories, numerous research studies show that statistical models can provide reasonable estimates when predicting faulty modules using software metrics. However, as context-specific metrics differ from project to project, predicting across projects is difficult. Prediction models obtained from one project's experience are ineffective at predicting fault-prone modules when applied to other projects. Hence, taking full benefit of the existing work in the software development community has been substantially limited. As a step towards solving this problem, in this dissertation we propose a fault prediction approach that exploits existing prediction models, adapting them to improve their ability to predict faulty system modules across different software projects.
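One simple reason models fail to transfer is that raw metric scales differ between projects (a 500-line module may be large in one codebase and average in another). A minimal sketch of one adaptation idea: standardize metrics within each project before applying a shared decision rule. This is an illustrative stand-in, not the dissertation's actual approach, and the metric names, weights, and threshold are hypothetical:

```python
from statistics import mean, stdev

def zscore(values):
    """Standardise one metric within a single project, so that a rule
    learned on one project can be applied to another with different
    absolute metric scales."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def predict_faulty(loc_z, complexity_z, w_loc=1.0, w_cc=1.0, threshold=1.0):
    """Flag a module as fault-prone when its standardised size and
    complexity jointly exceed a threshold (illustrative linear rule)."""
    return [w_loc * l + w_cc * c > threshold
            for l, c in zip(loc_z, complexity_z)]
```

In this toy setup, only modules unusually large and complex *relative to their own project* are flagged, which is the kind of relative judgment that survives a change of project.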

Relevance: 30.00%

Abstract:

Large read-only transactions, or read-write transactions with a large read set and a small write set, constitute an important class of transactions used in applications such as data mining, data warehousing, statistical applications, and report generators. Such transactions are best supported with optimistic concurrency, because locking large amounts of data for extended periods of time is not an acceptable solution. The abort rate in regular optimistic concurrency algorithms increases exponentially with the size of the transaction. The algorithm proposed in this dissertation solves this problem by using a new transaction scheduling technique that allows a large transaction to commit safely with a probability that can exceed that of regular optimistic concurrency algorithms by several orders of magnitude. A performance simulation study and a formal proof of serializability and external consistency of the proposed algorithm are also presented.

This dissertation also proposes a new query optimization technique (lazy queries). Lazy Queries is an adaptive query execution scheme that optimizes itself as the query runs. Lazy queries can be used to find an intersection of sub-queries very efficiently, without requiring full execution of large sub-queries or any statistical knowledge about the data.

Finally, an efficient optimistic concurrency control algorithm used in a massively parallel B-tree with variable-length keys is introduced. B-trees with variable-length keys can be used effectively in a variety of database types; in particular, we show how such a B-tree was used in our implementation of a semantic object-oriented DBMS. The concurrency control algorithm uses semantically safe optimistic virtual "locks" that achieve very fine granularity in conflict detection. This algorithm ensures serializability and external consistency by using logical clocks and backward validation of transactional queries. A formal proof of correctness of the proposed algorithm is also presented.
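The backward validation mentioned above has a compact textbook form: at commit time, a transaction's read set is checked against the write sets of every transaction that committed during its lifetime, and any overlap forces a restart. A minimal sketch of that check (the dissertation's algorithm adds semantic "locks" and logical clocks on top of this basic idea; the data shapes here are assumptions):

```python
def backward_validate(read_set, start_ts, committed):
    """Backward validation for optimistic concurrency control: the
    validating transaction may commit only if no transaction that
    committed after it started wrote an item this transaction read.
    read_set: set of item keys read by the validating transaction.
    start_ts: timestamp at which the validating transaction started.
    committed: iterable of (commit_ts, write_set) pairs."""
    for commit_ts, write_set in committed:
        if commit_ts > start_ts and read_set & write_set:
            return False  # read-write conflict: abort and restart
    return True
```

This also makes the abort-rate problem for large transactions visible: the bigger the read set and the longer the lifetime, the more likely the intersection is nonempty, which is exactly what the proposed scheduling technique mitigates.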

Relevance: 30.00%

Abstract:

This investigation reports the magnetic field effect on natural convection heat transfer in a curved-shape enclosure. The numerical investigation is carried out using the control-volume-based finite element method (CVFEM). The numerical investigations are performed for various values of the Hartmann number and the Rayleigh number. The obtained results are depicted in terms of streamlines and isotherms, which show the significant effects of the Hartmann number on the fluid flow and temperature distribution inside the enclosure. Also, it was found that the Nusselt number decreases with an increase in the Hartmann number.
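For reference, the dimensionless groups named above have standard definitions (given here with conventional symbols, which are assumptions rather than the paper's own notation: magnetic field strength $B_0$, characteristic length $L$, electrical conductivity $\sigma$, dynamic viscosity $\mu$, thermal expansion coefficient $\beta$, kinematic viscosity $\nu$, thermal diffusivity $\alpha$, convective coefficient $h$, thermal conductivity $k$):

```latex
\mathrm{Ha} = B_0 L \sqrt{\frac{\sigma}{\mu}}, \qquad
\mathrm{Ra} = \frac{g\,\beta\,(T_h - T_c)\,L^3}{\nu\,\alpha}, \qquad
\mathrm{Nu} = \frac{h\,L}{k}
```

The reported trend is then the expected one: a larger Ha means stronger Lorentz-force damping of the buoyancy-driven flow, which suppresses convection and lowers Nu.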

Relevance: 30.00%

Abstract:

Biological detectors, such as canines, are valuable tools used for the rapid identification of illicit materials. However, the reliability, field accuracy, and capabilities of detection canines have recently come under increased scrutiny in the legal system. For example, the Supreme Court case State of Florida v. Harris discussed the need for continuous monitoring of canine abilities, thresholds, and search capabilities. As a result, the fallibility of canines for detection was brought to light, as well as the need for further research into and understanding of canine detection. This study is two-fold: it looks not only to create new canine training aids whose dissipation can be controlled, but also to investigate canine field accuracy with objects whose odors are similar to those of illicit materials.

It was the goal of this research to improve upon current canine training aid mimics. Sol-gel polymer training aids, imprinted with the active odor of cocaine, were developed. This novel training aid improved upon the longevity of currently existing training aids, while also providing a way to manipulate the polymer network to alter the dissipation rate of the imprinted active odors. Manipulating the polymer network could allow handlers to control the abundance of odors presented to their canines, familiarizing themselves with their canines' capabilities and thresholds and thereby strengthening the canines' standing in court.

The field accuracy of detection canines was recently called into question during the Supreme Court case State of Florida v. Jardines, where it was argued that if cocaine's active odor, methyl benzoate, were found to be produced by the popular landscaping flower snapdragon, canines would falsely alert to those flowers. Therefore, snapdragon flowers were grown and tested both in the laboratory and in the field to determine the odors produced by snapdragon flowers, the persistence of these odors once the flowers have been cut, and whether detection canines would alert to both growing and cut flowers during a blind search scenario. Results revealed that although methyl benzoate is produced by snapdragon flowers, certified narcotics detection canines can distinguish cocaine's odor profile from that of snapdragon flowers and will not alert.

Relevance: 30.00%

Abstract:

Supervisory Control and Data Acquisition (SCADA) systems are used by many industries because of their ability to manage sensors and control external hardware. The problem with commercially available systems is that they are restricted to a local network of users running proprietary software. There was no Internet development guide for giving remote users outside that network control of, and access to, SCADA data and external hardware through simple user interfaces. To solve this problem, a server/client paradigm was implemented to make SCADA systems available via the Internet. Two methods were applied and studied: polling of a text file as a low-end technology solution, and a Transmission Control Protocol/Internet Protocol (TCP/IP) socket connection. Users were allowed to log in to a website and remotely control a network of pumps and valves interfaced to a SCADA system, enabling them to sample the water quality of different reservoir wells. The results were based on the real-time performance, stability, and ease of use of the remote interface and its programming. These indicated that the most feasible server to implement is the TCP/IP connection. For the user interface, Java applets and ActiveX controls provide the same real-time access.
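The TCP/IP server/client idea above can be sketched in a few lines: a gateway process accepts a connection, reads a text command, and acknowledges it, while the client side opens a socket and sends the command. This is a minimal, hypothetical illustration (the thesis implementation sits in front of real SCADA hardware and uses its own protocol; the command format and function names here are assumptions):

```python
import socket
import threading

def handle_client(conn):
    """Gateway side: receive one text command (e.g. "OPEN VALVE 3")
    from the remote user and acknowledge it. A real gateway would
    translate the command into a SCADA hardware action first."""
    with conn:
        command = conn.recv(1024).decode().strip()
        conn.sendall(f"ACK {command}".encode())

def serve_once(host="127.0.0.1", port=0):
    """Start a TCP listener that handles a single remote command in a
    background thread; returns the bound port. Port 0 lets the OS pick
    a free port; a real deployment would use a fixed, firewalled one."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    threading.Thread(
        target=lambda: handle_client(srv.accept()[0]), daemon=True
    ).start()
    return srv.getsockname()[1]

def send_command(port, command, host="127.0.0.1"):
    """Client side: connect, send one command, return the reply."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(command.encode())
        return sock.recv(1024).decode()
```

Compared with the text-file polling approach, the socket gives the client an immediate, per-command acknowledgment rather than waiting for the next poll cycle, which matches the thesis's conclusion that TCP/IP was the more feasible server.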

Relevance: 30.00%

Abstract:

This thesis chronicles the design and implementation of an Internet/Intranet- and database-based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting the desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.

Relevance: 30.00%

Abstract:

The discovery of high-temperature superconductors (HTSCs) has spurred the need for the fabrication of superconducting electronic devices able to match the performance of today's semiconductor devices. While there are several HTSCs in use today, YBa2Cu3O7−x (YBCO) is the better-characterized and more widely used material for small electronic applications. This thesis explores the fabrication of a two-terminal device with a superconductor and a painted-on electrode as the terminals and a ferroelectric, BaTiO3 (BTO), in between. The methods used to construct such a device and the challenges faced in the fabrication of a viable device are examined. The ferroelectric layer of the devices that proved adequate for use was poled by the application of an electric field. Temperature Bias Poling used an applied field of 10^5 V/cm at a temperature of approximately 135 °C. High Potential Poling used an applied field of 10^6 V/cm at room temperature (20 °C). The devices were then tested for a change in their superconducting critical temperature, Tc. A shift of 1–2 K in the Tc(onset) of YBCO was observed for Temperature Bias Poling and a shift of 2–6 K for High Potential Poling. These are the first reported results of the field effect using BTO on YBCO. The mechanism involved in the shifting of Tc is discussed, along with possible applications.

Relevance: 30.00%

Abstract:

The purpose of this study was twofold: (I) to find out the impact of field experience in hospitality education, and whether field experiences such as the semi-practicum, cooperative, and work-study programs can play an important role in building a closer alliance between academia and the hospitality industry; and (II) to determine whether it is possible to provide field experience that enhances professionally oriented course work, while educators and employers strive to design curricula that meet educational and industry demands and goals.

Relevance: 30.00%

Abstract:

Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. Under this accuracy-oriented conception of memory, unlike the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony.

This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free-narrative response was elicited, or a series of yes/no or cued questions was asked. Instructions provided by the interviewer indicated to the participants that they should stress either being informative or being accurate. The interviews were then transcribed and scored.

Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and responses that were less precise. Participants in the free-narrative condition were the most accurate. Participants in the cued-questions condition provided the most precise responses. Participants in the yes/no-questions condition were the most likely to say "I don't know." The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said "I don't know" to maintain accuracy. When withholding responses and adjusting precision were both possible, people used both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.

Relevance: 30.00%

Abstract:

As users continually request additional functionality, software systems will continue to grow in their complexity, as well as in their susceptibility to failures. Particularly for sensitive systems requiring higher levels of reliability, faulty system modules may increase development and maintenance cost. Hence, identifying them early would support the development of reliable systems through improved scheduling and quality control. Research effort to predict software modules likely to contain faults has, as a consequence, been substantial. Although a wide range of fault prediction models have been proposed, we remain far from having reliable tools that can be widely applied to real industrial systems. For projects with known fault histories, numerous research studies show that statistical models can provide reasonable estimates when predicting faulty modules using software metrics. However, as context-specific metrics differ from project to project, predicting across projects is difficult. Prediction models obtained from one project's experience are ineffective at predicting fault-prone modules when applied to other projects. Hence, taking full benefit of the existing work in the software development community has been substantially limited. As a step towards solving this problem, in this dissertation we propose a fault prediction approach that exploits existing prediction models, adapting them to improve their ability to predict faulty system modules across different software projects.