12 results for Speed and torque observers

em Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

With the advantages and popularity of Permanent Magnet (PM) motors due to their high power density, there is an increasing incentive to use them in a variety of applications, including electric actuation. These applications have strict noise emission standards. The generation of audible noise and associated vibration modes is characteristic of all electric motors, but it is especially problematic in low-speed sensorless rotary actuation applications that use the high-frequency voltage injection technique. This dissertation is aimed at solving the problem of optimizing the sensorless control algorithm for low noise and vibration while achieving at least 12-bit absolute accuracy for speed and position control. The low-speed sensorless algorithm is simulated using an improved Phase Variable Model, developed and implemented in a hardware-in-the-loop prototyping environment. Two experimental testbeds were developed and built to test and verify the algorithm in real time.

A neural network based modeling approach was used to predict the audible noise due to the high-frequency injected carrier signal. This model was created based on noise measurements in a purpose-built chamber. The developed noise model is then integrated into the high-frequency-based sensorless control scheme so that appropriate tradeoffs and mitigation techniques can be devised. This improves the position estimation and control performance while keeping the noise below a specified level. Genetic algorithms were used to incorporate the noise optimization parameters into the developed control algorithm.

A novel wavelet-based filtering approach was proposed in this dissertation for the sensorless control algorithm at low speed. This filter is capable of extracting the position information at low values of injection voltage where conventional filters fail. This filtering approach can be used in practice to reduce the injected voltage in the sensorless control algorithm, resulting in a significant reduction of noise and vibration.

Online optimization of the sensorless position estimation algorithm was performed to reduce vibration and to improve the position estimation performance. The results obtained are important and represent original contributions that can help in choosing optimal parameters for sensorless control algorithms in many practical applications.
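The dissertation's wavelet filter itself is not reproduced here; as a minimal illustration of the underlying idea, the sketch below shows how a single-level Haar wavelet split separates a slow "position" drift from a fast injected carrier. The function name and signal are invented for this example.

```python
import math

def haar_split(signal):
    """One-level Haar DWT: return (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# A slowly drifting "position" component plus a fast injected carrier
# (the carrier alternates sign every sample, i.e. sits at the Nyquist rate).
samples = [0.01 * i + 0.5 * math.cos(math.pi * i) for i in range(256)]
approx, detail = haar_split(samples)
# approx tracks the slow drift; detail isolates the carrier's energy.
```

In this toy case the detail coefficients are nearly constant (pure carrier) while the approximation grows with the drift, which is the kind of separation a position-extraction filter relies on.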

Relevance:

100.00%

Publisher:

Abstract:

As long as governmental institutions have existed, efforts have been undertaken to reform them. This research examines a particular strategy, coercive controls, exercised through a particular instrument, executive orders, by a singular reformer, the president of the United States. The presidents studied (Johnson, Nixon, Ford, Carter, Reagan, Bush, and Clinton) are those whose campaigns for office were characterized, to varying degrees, as against the Washington bureaucracy and for executive reform. Executive order issuance is assessed through an examination of key factors for each president, including political party affiliation, levels of political capital, and legislative experience. A classification typology is used to identify the topical dimensions and levels of coerciveness. The portrayal of the federal government is analyzed through examination of public, media, and presidential attention. The results show that executive orders are significant management tools for the president. Executive orders also represent an important component of the transition plans for incoming administrations. The findings indicate that overall, while executive orders have not increased in the aggregate, they have become more intrusive and significant. When the factors of political party affiliation, political capital, and legislative experience are examined, a strong relationship emerges between executive orders and previous executive experience, specifically among presidents who served as state governors prior to winning national election. Presidents Carter, Reagan, and Clinton (all former governors) have the highest percentage of executive orders focusing on the federal bureaucracy. Additionally, the highest percentage of forceful orders was issued by former governors (41.0%), as compared to their presidential counterparts who had not served as governors (19.9%). Second, political party affiliation is an important, but not significant, predictor of the use of executive orders.

Third, management strategies that provide the president with the greatest level of autonomy (executive orders) redefine the concept of presidential power and autonomous action. Interviews with elite government officials and political observers support the idea that executive orders can provide the president with a successful management strategy, requiring less expenditure of political resources and less risk to political capital, and offering a way of achieving objectives without depending on an unresponsive Congress.

Relevance:

100.00%

Publisher:

Abstract:

With the recent explosion in the complexity and volume of digital multimedia data, there has been a huge impact on the operations of organizations in many areas, such as government services, education, medical care, business, and entertainment. To satisfy the growing demand for multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications, offering a full scope of multimedia-related tools and an appealing experience for users. This research mainly focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called the Hierarchical Markov Model Mediator (HMMM) is proposed to model high-dimensional media data, including video objects, low-level visual/audio features, and historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only general queries but also complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated so that user interests are mined efficiently to improve retrieval performance. Third, video clustering techniques are proposed to continuously increase searching speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to address the perception subjectivity issue by considering each individual user's profile. Moreover, to deal with security and privacy issues in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control.

SMARXO efficiently combines role-based access control (RBAC), XML, and an object-relational database management system (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed with the proposed framework DIMUSE to support multimedia capturing, analysis, retrieval, authoring, and presentation in a single framework.
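SMARXO's actual policies are expressed in XML and enforced over an ORDBMS; as context only, the following hypothetical sketch shows the core RBAC idea it builds on, where a user's access to a permission flows through assigned roles. All role and user names here are invented for illustration.

```python
# Hypothetical role -> permission and user -> role mappings (invented names).
ROLE_PERMISSIONS = {
    "admin":  {"view", "edit", "delete"},
    "editor": {"view", "edit"},
    "viewer": {"view"},
}
USER_ROLES = {"alice": {"admin"}, "bob": {"viewer"}}

def is_permitted(user, permission):
    """Grant access if any role assigned to the user carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

The indirection through roles is what makes multilevel control manageable: permissions are edited on a handful of roles rather than on every user individually.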

Relevance:

100.00%

Publisher:

Abstract:

Deception research has traditionally focused on three methods of identifying liars and truth tellers: observing non-verbal or behavioral cues, analyzing verbal cues, and monitoring changes in physiological arousal during polygraph tests. Research shows that observers are often incapable of discriminating between liars and truth tellers with better than chance accuracy when they use these methods. One possible explanation for observers' poor performance is that they are not properly applying existing lie detection methods. An alternative explanation is that the cues on which these methods — and observers' judgments — are based do not reliably discriminate between liars and truth tellers. It may be possible to identify more reliable cues, and potentially improve observers' ability to discriminate, by developing a better understanding of how liars and truth tellers try to tell a convincing story.

This research examined (a) the verbal strategies used by truthful and deceptive individuals during interviews concerning an assigned activity, and (b) observers' ability to discriminate between them based on their verbal strategies. In Experiment I, pre-interview instructions manipulated participants' expectations regarding verifiability; each participant was led to believe that the interviewer could check some types of details, but not others, before deciding whether the participant was being truthful or deceptive. Interviews were then transcribed and scored for quantity and type of information provided. In Experiment II, observers listened to a random sample of the Experiment I interviews and rendered veracity judgments; half of the observers were instructed to judge the interviews according to the verbal strategies used by liars and truth tellers, and the other half were uninstructed.

Results of Experiment I indicate that liars and truth tellers use different verbal strategies, characterized by a differential amount of detail. Overall, truthful participants provided more information than deceptive participants. This effect was moderated by participants' expectations regarding verifiability, such that truthful participants provided more information only with regard to verifiable details. Results of Experiment II indicate that observers instructed about liars' and truth tellers' verbal strategies identify them with greater accuracy than uninstructed observers.

Relevance:

100.00%

Publisher:

Abstract:

The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the amount of data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high-performance, flexible architectures that allow for quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, but with tradeoffs among processing performance (achieving specified frame rates on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This dissertation contains dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance.

The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting occlusion occurrence during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analyzed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gesture sets involved in different applications may vary; therefore, it is highly essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
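As an illustrative pure-Python sketch of contribution (1), the function below implements a running-average mean-threshold style of background subtraction: the global threshold adapts as the running average of the mean frame difference. The update rule and parameter names are assumptions, not the dissertation's exact RAMT formulation; frames are flat lists of grayscale pixel values.

```python
def detect_foreground(frames, alpha=0.1):
    """Yield a boolean foreground mask per frame, after the first."""
    background = list(frames[0])   # initialize the background model
    threshold = 0.0                # global threshold, adapted each frame
    for frame in frames[1:]:
        diffs = [abs(p - b) for p, b in zip(frame, background)]
        # Running average of the mean frame difference globalizes the threshold.
        threshold = (1 - alpha) * threshold + alpha * (sum(diffs) / len(diffs))
        yield [d > threshold for d in diffs]
        # Blend the current frame into the background model.
        background = [(1 - alpha) * b + alpha * p
                      for b, p in zip(background, frame)]
```

Because the threshold is driven by the scene's own frame-to-frame statistics, the same code adapts to quiet indoor footage and noisy outdoor footage without manual tuning, which is the property the abstract highlights.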

Relevance:

100.00%

Publisher:

Abstract:

Glycogen Synthase Kinase 3 (GSK3), a serine/threonine kinase initially characterized in the context of glycogen metabolism, has repeatedly been shown to be a multitasking protein that can regulate numerous cellular events in both metazoa and protozoa. I recently found that GSK3 plays a role in regulating chemotaxis, a guided cell movement in response to an external chemical gradient, in one of the best-studied model systems for chemotaxis, Dictyostelium discoideum.

It was initially found that, compared to wild-type cells, gsk3- cells showed aberrant chemotaxis with a significant decrease in both speed and chemotactic indices. In Dictyostelium, phosphatidylinositol 3,4,5-trisphosphate (PIP3) signaling is one of the best-characterized pathways that regulate chemotaxis. Molecular analysis uncovered that gsk3- cells suffer from a high basal level of PIP3, the product of PI3K. Upon stimulation with the chemoattractant cAMP, wild-type cells displayed a transient increase in the level of PIP3. In contrast, gsk3- cells exhibited neither a significant increase nor adaptation. On the other hand, no aberrant dynamics of phosphatase and tensin homolog (PTEN), which antagonizes PI3K function, were observed. Upon membrane localization, PI3K becomes activated by Ras, which in turn further facilitates membrane localization of PI3K in an F-actin-dependent manner. gsk3- cells treated with the F-actin inhibitor Latrunculin A showed no significant difference in the PIP3 level.

I also showed that GSK3 affects the phosphorylation level of the localization domain of PI3K1 (PI3K1-LD). PI3K1-LD proteins from gsk3- cells displayed less phosphorylation on serine residues than those from wild-type cells. When the potential GSK3 phosphorylation sites of PI3K1-LD were substituted with aspartic acid (phosphomimetic substitution), its membrane localization was suppressed in gsk3- cells. When these serine residues of PI3K1-LD were substituted with alanine, an aberrantly high level of membrane localization of PI3K1-LD was observed in wild-type cells. GFP fusions of the wild-type, phosphomimetic, and alanine-substituted PI3K1-LD also displayed the same localization behavior suggested by the cell fractionation studies. Lastly, I identified that all three potential GSK3 phosphorylation sites on PI3K1-LD could be phosphorylated in vitro by GSK3.

Relevance:

100.00%

Publisher:

Abstract:

The need for efficient, sustainable, and planned utilization of resources is ever more critical. In the U.S. alone, buildings consume 34.8 quadrillion (10^15) BTU of energy annually, at a cost of $1.4 trillion. Of this energy, 58% is used for heating and air conditioning.

Several building energy analysis tools have been developed to assess energy demands and lifecycle energy costs in buildings. Such analyses are also essential for an efficient HVAC design that overcomes the pitfalls of an under- or over-designed system. DOE-2 is among the most widely known full-building energy analysis models. It also constitutes the simulation engine of other prominent software such as eQUEST, EnergyPro, and PowerDOE. Therefore, it is essential that DOE-2 energy simulations be characterized by high accuracy.

Infiltration is an uncontrolled process through which outside air leaks into a building. Studies have estimated infiltration to account for up to 50% of a building's energy demand. This, considered alongside the annual cost of building energy consumption, reveals the cost of air infiltration. It also stresses the need for prominent building energy simulation engines to accurately account for its impact.

In this research, the relative accuracy of current air infiltration calculation methods is evaluated against an intricate Multiphysics Hygrothermal CFD building envelope analysis. The full-scale CFD analysis is based on a meticulous representation of cracking in building envelopes and on real-life conditions. The research found that even the most advanced current infiltration methods, including those in DOE-2, exhibit up to 96.13% relative error versus the CFD analysis.

An Enhanced Model for Combined Heat and Air Infiltration Simulation was developed. The model resulted in a 91.6% improvement in relative accuracy over current models. It reduces the error versus the CFD analysis to less than 4.5% while requiring less than 1% of the time needed for such a complex hygrothermal analysis. The algorithm used in our model was demonstrated to be easy to integrate into DOE-2 and other engines as a standalone method for evaluating infiltration heat loads. This will vastly increase the accuracy of such simulation engines while maintaining the speed and ease of use that make them so widely used in building design.
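The enhanced model itself is not given in this abstract; as background, the sketch below shows the standard sensible infiltration load calculation that infiltration methods ultimately feed into an engine like DOE-2 (IP units; the 1.08 factor is 60 min/h x 0.075 lb/ft3 air density x 0.24 BTU/lb-F specific heat). Function names and the example figures are for illustration only.

```python
def infiltration_cfm(ach, volume_ft3):
    """Convert air changes per hour (ACH) to cubic feet per minute (CFM)."""
    return ach * volume_ft3 / 60.0

def sensible_infiltration_load_btuh(cfm, delta_t_f):
    """Sensible infiltration load in BTU/h; 1.08 = 60 * 0.075 * 0.24."""
    return 1.08 * cfm * delta_t_f

# Example: a 12,000 ft^3 building at 0.5 ACH with a 20 F indoor-outdoor
# temperature difference.
cfm = infiltration_cfm(0.5, 12_000)              # 100.0 CFM
load = sensible_infiltration_load_btuh(cfm, 20)  # 2160.0 BTU/h
```

The accuracy question the research addresses lives upstream of this formula: how well a method estimates the airflow term (here, the ACH/CFM input) for a real, cracked envelope.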

Relevance:

100.00%

Publisher:

Abstract:

Indigenous movements have become increasingly powerful in the last couple of decades, and they are now important political actors in some South American countries, such as Bolivia, Ecuador, and, to a lesser extent, Peru and Chile. The rise of indigenous movements has provoked concern among U.S. policymakers and other observers, who have feared that these movements will exacerbate ethnic polarization, undermine democracy, and jeopardize U.S. interests in the region. This paper argues that concern over the rise of indigenous movements is greatly exaggerated. It maintains that the rise of indigenous movements has not brought about a marked increase in ethnic polarization in the region because most indigenous organizations have been ethnically inclusive and have eschewed violence. Although the indigenous movements have at times demonstrated a lack of regard for democratic institutions and procedures, they have also helped deepen democracy in the Andean region by promoting greater political inclusion and participation and by aggressively combating ethnic discrimination and inequality. Finally, this study suggests that the indigenous population has opposed some U.S.-sponsored initiatives, such as coca eradication and market reform, for legitimate reasons. Such policies have had negative environmental, cultural, and economic consequences for indigenous people, which U.S. policymakers should try to address. The conclusion provides specific policy recommendations on how to go about this.

Relevance:

100.00%

Publisher:

Abstract:

Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching logic-based controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before traffic breakdown occurs, the proposed VSL's goal is to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution in speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream. In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, as part of this research, Vehicle-to-Infrastructure communication is modeled in the microscopic simulation to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual vehicles' speed data to determine the locations of congestion. The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment.

Several enhancements were proposed in this study to account for the breakdown characteristics at bottleneck locations in the calibration process. In this dissertation, the performance of the shockwave-based VSL is compared to VSL systems with fixed VSL message sign locations using the calibrated microscopic model. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during breakdown.
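A heuristic switching VSL controller of the kind described above can be sketched as two small decisions: pick a limit from measured speed and occupancy, and move the active sign upstream as the queue grows. All thresholds and limit values below are invented for illustration; the dissertation calibrates its own thresholds from prevailing traffic flow conditions.

```python
def advisory_limit_mph(avg_speed_mph, occupancy_pct):
    """Step the posted limit down as speed drops and occupancy rises
    (hypothetical thresholds)."""
    if occupancy_pct > 25 or avg_speed_mph < 30:   # breakdown conditions
        return 40
    if occupancy_pct > 18 or avg_speed_mph < 45:   # approaching capacity
        return 50
    return 60                                      # free flow

def furthest_upstream_congested(station_speeds_mph, congested_below_mph=30):
    """Stations are ordered upstream-to-downstream; return the index of the
    furthest-upstream congested station (where the active VSL should sit),
    or None if no station is congested."""
    for i, speed in enumerate(station_speeds_mph):
        if speed < congested_below_mph:
            return i
    return None
```

The second function captures the shockwave-based behavior: as congestion propagates upstream, the first congested index moves toward 0, pulling the VSL location upstream with it.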
