914 results for DYNAMIC-ANALYSIS
Abstract:
This thesis presents an approach to cutting dynamics during turning based upon the mechanism of deformation of the work material around the tool nose known as "ploughing". Starting from the shearing process in the cutting zone and accounting for ploughing, new mathematical models relating the turning force components to cutting conditions, tool geometry and tool vibration are developed. These models are developed separately for steady-state and for oscillatory turning with new and worn tools. Experimental results are used to determine mathematical functions expressing the parameters introduced by the steady-state model in the case of a new tool. The form of these functions is of general validity, though their coefficients depend on the work and tool materials. Good agreement is achieved between experimental and predicted forces. On one hand, the model is extended to include different work materials by introducing a hardness factor; it provides good predictions when compared with present and published experimental results. On the other hand, the extension of the ploughing model to turning with a worn edge demonstrated the model's ability to predict machining forces during steady-state turning with the worn flank of the tool. In the development of the dynamic models, the dynamic turning force equations define the cutting process as a system for which vibration of the tool tip in the feed direction is the input and the measured forces are the output. The model takes into account the shear plane oscillation and the variation of the cutting configuration in response to tool motion. Theoretical expressions for the turning forces are obtained for new and worn cutting edges. The dynamic analysis revealed the interaction between the cutting mechanism and the machine tool structure. The effect of the machine tool and tool post is accounted for by using experimental data for the transfer function of the tool post system. Steady-state coefficients are corrected to include the changes in the cutting configuration with tool vibration and are used in the dynamic model. A series of oscillatory cutting tests at various conditions and various tool flank wear levels were carried out, and the experimental results are compared with model-predicted forces. Good agreement between predictions and experiments was achieved over a wide range of cutting conditions. This research bridges the gap between the analysis of vibration and the analysis of turning forces in turning. It offers an explicit expression for the dynamic turning force generated during machining and highlights the relationships between tool wear, tool vibration and turning force. Spectral analysis of the tool acceleration and turning force components led to the definition of an "Inertance Power Ratio" as a flank wear monitoring factor. A formulation of an on-line flank wear monitoring methodology is presented, showing how the results of the present model can be applied to practical in-process tool wear monitoring in turning operations.
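The abstract does not reproduce the thesis' exact definition of the "Inertance Power Ratio"; the Python sketch below only illustrates one plausible reading, in which the band-limited spectral power of the measured tool acceleration is compared with that of the feed-direction force, both estimated with Welch's method. The signals, sampling rate and frequency band are hypothetical placeholders.

    # Hypothetical band-limited "inertance power ratio" as a wear-monitoring
    # indicator (assumed definition, not the thesis' own).
    import numpy as np
    from scipy.signal import welch

    def inertance_power_ratio(accel, force, fs, band=(500.0, 2000.0), nperseg=4096):
        # Band-limited power of tool acceleration divided by that of the feed force.
        f, p_aa = welch(accel, fs=fs, nperseg=nperseg)   # acceleration auto-spectrum
        _, p_ff = welch(force, fs=fs, nperseg=nperseg)   # force auto-spectrum
        mask = (f >= band[0]) & (f <= band[1])
        return p_aa[mask].sum() / p_ff[mask].sum()

    fs = 10_000.0                            # sampling rate [Hz], assumed
    t = np.arange(0.0, 2.0, 1.0 / fs)
    accel = np.random.randn(t.size)          # placeholder for measured tool acceleration
    force = 100.0 + np.random.randn(t.size)  # placeholder for measured feed-direction force
    print(inertance_power_ratio(accel, force, fs))

In practice the ratio would be tracked over consecutive cuts and compared against a baseline obtained with a fresh tool.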
Abstract:
The first essay developed a respondent model of Bayesian updating for a double-bound dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioral model, can correct for much of the WTP bias. Additional results provided caution against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, and coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across the six particular WQ variables used, and is furthermore dependent upon the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results continue to underscore the need to better understand both the WQ measure and the statistical form of it that homebuyers use in making their purchase decision. The third essay addressed a limitation of existing hurricane evacuation models by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, at every potential evacuation time period prior to actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more period for a revised hurricane forecast. A hypothetical two-period model of evacuation and a realistic multi-period model that incorporates actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated against existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes regarding the timing of the evacuation decision was achieved.
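The dissertation's dynamic program is not spelled out in the abstract, so the following Python sketch only illustrates the optimal-stopping structure it describes: in each period before landfall the household compares the cost of evacuating now with the expected value of waiting for a revised forecast, solved by backward induction over a discretized belief. The evacuation costs, loss if struck and forecast likelihoods are hypothetical placeholders, not the calibrated values.

    # Minimal optimal-stopping sketch of the evacuate-or-wait decision.
    # All parameters are hypothetical placeholders.
    import numpy as np

    T = 6                      # decision periods before landfall
    D = 50_000.0               # assumed loss if the household is struck at home
    C = np.linspace(300.0, 900.0, T)   # assumed evacuation cost, rising near landfall
    a, b = 0.8, 0.3            # P("track toward us" | hit), P("track toward us" | miss)

    p_grid = np.linspace(0.0, 1.0, 201)   # belief that the hurricane will strike

    def posterior(p, toward):
        like_hit = a if toward else 1.0 - a
        like_miss = b if toward else 1.0 - b
        return p * like_hit / (p * like_hit + (1.0 - p) * like_miss + 1e-12)

    V = p_grid * D                         # value at landfall if still at home
    thresholds = []
    for t in reversed(range(T)):
        p_toward = p_grid * a + (1.0 - p_grid) * b        # prob. of a "toward" forecast
        cont = (p_toward * np.interp(posterior(p_grid, True), p_grid, V)
                + (1.0 - p_toward) * np.interp(posterior(p_grid, False), p_grid, V))
        V = np.minimum(C[t], cont)         # evacuate now vs. wait for a new forecast
        thresholds.append((t, p_grid[cont > C[t]].min() if (cont > C[t]).any() else None))

    # Belief above which evacuating immediately is optimal, per period.
    for t, threshold in sorted(thresholds):
        print(f"period {t}: evacuate if P(strike) >= {threshold}")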
Abstract:
The relationship between trade policy and productivity growth is regarded as ambiguous in the literature. This dissertation examines under what conditions the relationship is positive (or negative). Through static and dynamic analysis, we find two conflicting effects (the pro-protection effect and the pro-competitive effect) that cause the relationship to be ambiguous. If there exists a productivity gap between the import-competing and foreign industries, and if the level of protection is low (high), the relationship is positive (negative). We also show that the import-competing firm responds to a change in the protection level by choosing a level of investment in innovation that yields a different rate of productivity growth. The policy implication, therefore, is that a trade-policy maker should set trade protection at a level that induces the firm to choose the highest rate of productivity growth and, as a result, leads the firm to close the initial productivity gap in the most efficient way.
Abstract:
Human development requires a broad balance between ecological, social and economic factors in order to ensure its own sustainability. In this sense, the search for new sources of energy generation, with low deployment and operating costs and the least possible impact on the environment, has been the focus of attention of all segments of society. To this end, reducing the exploitation of fossil fuels and encouraging the use of renewable energy resources for distributed generation have proved to be interesting alternatives for the expansion of the energy matrix of various countries in the world. In this context, wind energy has acquired an increasingly significant role, presenting increasing rates of power grid penetration and highlighting technological innovations such as the use of permanent magnet synchronous generators (PMSG). In Brazil, this fact has also been noted and, as a result, the impact of the inclusion of this source in the distribution and sub-transmission power grids has been a major concern of utilities and agents connected to the Brazilian electrical sector. Thus, it is relevant to develop appropriate computational tools that allow detailed predictive studies of the dynamic behavior of wind farms, either operating with an isolated load or connected to the main grid, also taking into account the implementation of control strategies for active/reactive power generation and the maintenance of adequate voltage and frequency levels. This work fits into this context, since it comprises the mathematical and computational development of a complete wind energy conversion system (WECS) endowed with a PMSG using time-domain techniques of the Alternative Transients Program (ATP), which enjoys a recognized reputation among the scientific and academic communities as well as among electricity professionals in Brazil and elsewhere. The modeling procedures performed allowed the elaboration of blocks representing each of the elements of a real WECS, comprising the primary source (the wind), the wind turbine, the PMSG, the frequency converter, the step-up transformer, the load composition and the power grid equivalent. Special attention is also given to the implementation of wind turbine control techniques, mainly the pitch control, responsible for keeping the generator at the maximum power operating point, and the vector control theory, which aims at adjusting the active/reactive power flow between the wind turbine and the power grid. Several simulations are performed to investigate the dynamic behavior of the wind farm when subjected to different operating conditions and/or to variations in wind intensity. The results show the effectiveness of both the mathematical and the computational modeling developed for the wind turbine and the associated controls.
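As a purely illustrative companion to the pitch-control discussion above, the Python sketch below uses a generic power-coefficient curve fit Cp(lambda, beta) (a common textbook approximation, not the coefficients of the ATP model) together with a crude pitch loop that raises the blade angle until output no longer exceeds rated power. All turbine parameters are hypothetical.

    # Illustrative turbine aerodynamic power and a simple pitch limiter.
    # Generic placeholders, not the WECS model described in the abstract.
    import numpy as np

    rho = 1.225          # air density [kg/m^3]
    R = 40.0             # rotor radius [m], assumed
    P_rated = 2.0e6      # rated power [W], assumed
    omega = 1.8          # rotor speed [rad/s], assumed constant for simplicity

    def cp(lam, beta):
        # One common curve-fit form of the power coefficient Cp(lambda, beta).
        lam_i = 1.0 / (1.0 / (lam + 0.08 * beta) - 0.035 / (beta**3 + 1.0))
        return 0.5176 * (116.0 / lam_i - 0.4 * beta - 5.0) * np.exp(-21.0 / lam_i) + 0.0068 * lam

    def aerodynamic_power(v_wind, beta):
        lam = omega * R / v_wind
        return 0.5 * rho * np.pi * R**2 * cp(lam, beta) * v_wind**3

    def pitch_for(v_wind, step=0.1, beta_max=30.0):
        # Very simple pitch loop: increase the pitch angle until power <= rated.
        beta = 0.0
        while aerodynamic_power(v_wind, beta) > P_rated and beta < beta_max:
            beta += step
        return beta

    for v in (8.0, 12.0, 16.0, 20.0):
        b = pitch_for(v)
        print(f"v = {v:4.1f} m/s  pitch = {b:5.1f} deg  P = {aerodynamic_power(v, b)/1e6:5.2f} MW")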
Abstract:
Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on the specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field-sensitivity, array-sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from the Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of them during execution.
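The abstract summarizes the KQguard idea (learn legitimate kernel-queue requests during training, reject unknown ones at run time) without giving its data structures, so the Python sketch below is only a schematic illustration of that whitelist pattern. The request fields used here (owning module, callback offset, parameter signature) are hypothetical, not KQguard's actual specification.

    # Schematic whitelist check in the spirit of the KQguard approach: requests
    # whose (module, callback offset, parameter signature) tuple was not seen
    # during training are rejected.  Field names are hypothetical.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class KQRequest:
        module: str           # driver or kernel module owning the callback
        callback_offset: int  # offset of the callback within that module
        param_signature: str  # coarse signature of the request parameters

    class KQGuard:
        def __init__(self):
            self.whitelist: set[KQRequest] = set()

        def learn(self, request: KQRequest) -> None:
            # Training phase: record a request observed under a benign workload.
            self.whitelist.add(request)

        def validate(self, request: KQRequest) -> bool:
            # Run time: accept only requests that match a learned legitimate one.
            return request in self.whitelist

    guard = KQGuard()
    guard.learn(KQRequest("ntfs.sys", 0x1A40, "timer,period=low"))

    benign = KQRequest("ntfs.sys", 0x1A40, "timer,period=low")
    rootkit = KQRequest("unknown.sys", 0x0040, "timer,period=low")
    print(guard.validate(benign))    # True  -> dispatched
    print(guard.validate(rootkit))   # False -> rejected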
Abstract:
Highway bridges are of great value to a country because, in the event of a natural disaster, they may serve as lifelines. Since bridges are vulnerable to significant seismic loads, different methods can be considered to design resistant highway bridges and to rehabilitate existing ones. In this study, base isolation has been considered as one efficient method in this regard, which in some cases significantly reduces the seismic load effects on the structure. By reducing the ductility demand without a notable increase in strength, the structure can be designed to remain elastic under seismic loads. A problem associated with isolated bridges, especially those on elastomeric bearings, can be their excessive displacements under service and seismic loads. This can defeat the purpose of using elastomeric bearings for small- to medium-span typical bridges, where expansion joints and clearances may significantly increase the initial and maintenance costs. Thus, supplementing the structure with dampers that provide some stiffness can serve as a solution, which, however, may in turn increase the base shear of the structure. The main objective of this thesis is to provide a simplified method for the evaluation of optimal damper parameters in isolated bridges. First, through a parametric study, some directions are given for the use of simple isolation devices such as elastomeric bearings to rehabilitate existing bridges of high importance. Parameters such as the geometry of the bridge, code provisions and the type of soil on which the structure is constructed were considered for a typical two-span bridge. It is concluded that the stiffness of the substructure, the soil type and special provisions in the code can determine whether base isolation is suitable for retrofitting a bridge. Second, based on the elastic response coefficient of isolated bridges, a simplified design method for dampers in seismically isolated regular highway bridges is presented. By setting objectives for the reduction of displacement and the variation of base shear, the required stiffness and damping of a hysteretic damper can be determined. Numerical analyses of a typical two-span bridge model were then carried out to verify the effectiveness of the method. The method was used to identify equivalent linear parameters and, subsequently, nonlinear parameters of the hysteretic damper for various designated scenarios of displacement and base shear requirements. Comparison of the results of the nonlinear numerical model with and without the damper showed that the method is sufficiently accurate. Finally, an innovative and simple hysteretic steel damper was designed. Five specimens were fabricated from two steel grades and were tested together with a full-scale elastomeric isolator in the structural laboratory of the Université de Sherbrooke. The test procedure was to characterize the specimens by cyclic displacement-controlled tests and subsequently to test them by the real-time dynamic substructuring (RTDS) method. The test results were then used to establish a numerical model of the system, which was subjected to nonlinear time-history analyses under several earthquakes. The experimental and numerical results showed acceptable conformity with the simplified method.
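The simplified design method itself is not reproduced in the abstract; the sketch below only shows the standard equivalent-linearization relations that such a method builds on, converting a bilinear hysteretic damper (characteristic strength, post-yield stiffness, yield displacement) plus an elastomeric isolator into an effective stiffness and an equivalent viscous damping ratio at a trial design displacement. The numerical values are hypothetical.

    # Standard equivalent-linearization bookkeeping for a bilinear hysteretic
    # damper added to an elastomeric isolation system (generic formulas, not
    # the thesis' simplified method).  Values are hypothetical.
    import math

    def equivalent_linear(d, k_iso, q_d, k_d, d_y):
        # Effective stiffness and equivalent viscous damping ratio at displacement d.
        k_eff = k_iso + q_d / d + k_d                  # isolator + damper secant stiffness
        area = 4.0 * q_d * (d - d_y)                   # energy dissipated per cycle (bilinear loop)
        xi_eq = area / (2.0 * math.pi * k_eff * d**2)  # equivalent viscous damping ratio
        return k_eff, xi_eq

    # Hypothetical trial design point: 80 mm displacement, 2 kN/mm elastomeric
    # stiffness, damper with 60 kN characteristic strength, 0.5 kN/mm post-yield
    # stiffness and 5 mm yield displacement.
    k_eff, xi_eq = equivalent_linear(d=80.0, k_iso=2.0, q_d=60.0, k_d=0.5, d_y=5.0)
    print(f"K_eff = {k_eff:.2f} kN/mm, equivalent damping = {xi_eq:.1%}")

In a design iteration these two quantities would feed back into a response-spectrum estimate of the displacement until the assumed and computed displacements agree.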
Abstract:
The aim was to review the literature on the risk factors for hamstring injury in soccer from a biomechanical point of view. METHODOLOGY. The bibliographic databases searched were Medline, Scopus and SPORTDiscus. RESULTS AND DISCUSSION. Many prospective studies have shown that previous injury is the greatest risk factor for sustaining a new injury. However, the primary causes of the injury in soccer remain unclear. A lack of hamstring flexibility has been proposed as one of the main risk factors, although the results are controversial. Isokinetic strength imbalance is a risk factor, but the electrical coactivation of all muscles participating in knee flexion and extension has not been studied in football. While the synchronization of the lumbopelvic and hamstring muscles during running seems to be crucial for understanding the risk of injury, no research has addressed this topic in football. CONCLUSIONS. More research using new data recording procedures such as dynamic scanners, surface EMG and inverse dynamic analysis is needed. The analysis of more specific movements such as running, kicking or jumping is clearly required. Managers, coaches, physical trainers, physiotherapists, sport physicians and researchers should work together in order to improve the injury prevention and rehabilitation programs of football players. Key Words: sports biomechanics, soccer, hamstring injury, risk factors
Abstract:
This paper estimates short-run and long-run coefficients of exchange rate pass-through into the prices of tradable and non-tradable goods in Costa Rica. The coefficients are estimated by OLS, and a VAR analysis is conducted in order to study the dynamic adjustment of prices to exchange rate movements. A Granger causality test and a stability test are also conducted. The short-run pass-through coefficients are 13% and 10% for tradable and non-tradable goods respectively, and the long-run coefficients are 68% and 52% in the same order. A second-stage (indirect) pass-through of 7%, arising from the effect of tradable prices on non-tradable prices, is included in the long-run coefficient for non-tradable goods. The dynamic analysis shows that the adjustment of prices following an exchange rate shock takes 17 months for tradable goods and 27 months for non-tradable goods. The Granger causality test shows precedence between variations in the exchange rate and inflation, and between the inflation of tradable and non-tradable goods. There is statistical evidence of a structural change in the non-tradable model between the end of 1995 and the beginning of 1996.
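The abstract reports OLS pass-through coefficients and a VAR-based adjustment analysis; the Python sketch below shows a generic version of that workflow with statsmodels on simulated placeholder series (log changes of the exchange rate and of tradable prices). The variable names, lag choice and the simulated 13% pass-through are illustrative, not the paper's specification or data.

    # Generic pass-through workflow: OLS of inflation on exchange-rate variation,
    # then a VAR impulse-response to trace the adjustment dynamics.
    # The simulated series below are placeholders for the Costa Rican data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    n = 240                                          # monthly observations, hypothetical
    d_e = rng.normal(0.005, 0.01, n)                 # log change of the exchange rate
    d_p = 0.13 * d_e + rng.normal(0.003, 0.005, n)   # tradable inflation, assumed 13% short-run pass-through

    # Short-run pass-through: OLS of inflation on the contemporaneous exchange-rate change.
    ols = sm.OLS(d_p, sm.add_constant(d_e)).fit()
    print("short-run pass-through:", ols.params[1])

    # Dynamics of adjustment: VAR and impulse responses to an exchange-rate shock.
    data = pd.DataFrame({"d_e": d_e, "d_p": d_p})
    var_res = VAR(data).fit(maxlags=12, ic="aic")
    irf = var_res.irf(periods=30)                    # months after the shock
    print(irf.cum_effects[-1])                       # cumulative responses after 30 months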
Abstract:
Conventional threading operations involve two distinct machining processes: drilling and threading. They are therefore time consuming, since the tools must be changed and the workpiece has to be moved to another machine. This paper presents an analysis of the combined process (drilling followed by threading) using a single tool for both operations: the tap-milling tool. Before presenting the methodology used to evaluate this hybrid tool, the basics of ODS (operating deflection shapes) are briefly described. ODS and finite element modeling (FEM) were used during this research to optimize the process, aiming to achieve more stable machining conditions and increase tool life. Both methods allowed the determination of the natural frequencies and displacements of the machining center and the optimization of the workpiece fixture system. The results showed an excellent correlation between the dynamic stability of the machining center-tool holder and the tool life, avoiding premature catastrophic tool failure. Nevertheless, evidence showed that the tool is very sensitive to the working conditions. Undoubtedly, the use of ODS and FEM eliminates empirical decisions concerning the optimization of machining conditions and drastically increases tool life. After the ODS and FEM studies, it was possible to optimize the process and the work material fixture system and to machine more than 30,000 threaded holes without reaching the tool life limit or suffering a catastrophic failure.
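The ODS methodology is not detailed in the abstract; the sketch below only illustrates the preliminary step such studies rely on, estimating the dominant vibration frequencies of the machining-center/tool-holder assembly from a measured acceleration signal via a Welch spectrum and peak picking. The synthetic signal, sampling rate and peak threshold are hypothetical.

    # Illustrative spectral peak-picking step used before an ODS study: estimate
    # dominant natural frequencies from a measured acceleration signal.  The
    # signal below is synthetic; real data would come from machine accelerometers.
    import numpy as np
    from scipy.signal import welch, find_peaks

    fs = 5_000.0                                  # sampling rate [Hz], assumed
    t = np.arange(0, 4.0, 1.0 / fs)
    # Synthetic response with two structural resonances plus noise (placeholders).
    accel = (np.sin(2 * np.pi * 310 * t) * np.exp(-0.5 * t)
             + 0.6 * np.sin(2 * np.pi * 870 * t) * np.exp(-0.8 * t)
             + 0.1 * np.random.randn(t.size))

    f, pxx = welch(accel, fs=fs, nperseg=4096)
    peaks, _ = find_peaks(pxx, height=0.05 * pxx.max())
    print("candidate natural frequencies [Hz]:", f[peaks])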
Abstract:
The aim of this work is to study the wheel/workpiece dynamic interactions in high-speed grinding using a vitrified CBN wheel and DTG (difficult-to-grind) work materials. This problem is typical of the grinding of engine valve heads. The influence of the tangential force per abrasive grain was investigated as an important control variable for the determination of the G ratio. Experiments were carried out to observe the influence of vibrations on wheel wear. Measurements of acoustic emission (AE) and vibration signals helped to identify the correlation between the dynamic interactions (produced by forced random excitation) and the wheel wear. The wheel regenerative chatter phenomenon was observed by using the wheel mapping technique. (c) 2008 CIRP.
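The quantitative relations used in the paper are not given in the abstract; the short sketch below simply spells out the two bookkeeping quantities it names, the G ratio (volume of work material removed per volume of wheel wear) and an average tangential force per active abrasive grain, using hypothetical numbers.

    # Worked example of the two monitoring quantities named above; all values
    # are hypothetical, for illustration only.
    material_removed_mm3 = 12_000.0   # volume of work material removed
    wheel_wear_mm3 = 4.0              # volume of CBN wheel worn away
    g_ratio = material_removed_mm3 / wheel_wear_mm3
    print(f"G ratio = {g_ratio:.0f}")                # 3000

    tangential_force_N = 45.0         # measured tangential grinding force
    active_grains = 1_500             # estimated number of grains in contact
    force_per_grain_N = tangential_force_N / active_grains
    print(f"tangential force per grain = {force_per_grain_N * 1000:.1f} mN")   # 30.0 mN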
Abstract:
Vessel dynamic positioning (DP) systems are based on conventional PID-type controllers and an extended Kalman filter. However, they present a difficult tuning procedure, and the closed-loop performance varies with environmental or loading conditions, since the dynamics of the vessel are markedly nonlinear. Gain scheduling is normally used to address the nonlinearity of the system. To overcome these problems, a sliding mode controller was evaluated. This controller is robust to variations in environmental and loading conditions, maintains performance and stability over a large range of conditions, and presents an easy tuning methodology. The performance of the controller was evaluated numerically and experimentally in order to assess its effectiveness. The results are compared with those obtained from a conventional PID controller. (c) 2010 Elsevier Ltd. All rights reserved.
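The paper's controller design is not given in the abstract; the Python sketch below illustrates the basic sliding-mode idea on a one-degree-of-freedom surge model of a vessel (mass plus linear damping) with a boundary-layer tanh switching term to limit chattering. The vessel parameters, gains, set point and disturbance are hypothetical placeholders.

    # Minimal single-axis sliding-mode positioning sketch (illustrative only):
    #   m*x'' + d*x' = u + w,  s = e' + lam*e,  u = d*x' - m*lam*e' - k*tanh(s/phi)
    # Parameters and disturbance are placeholders, not the paper's vessel model.
    import numpy as np

    m, d = 1.2e5, 5.0e3              # vessel mass [kg] and linear damping [N s/m], assumed
    lam, k, phi = 0.5, 4.0e4, 0.05   # sliding-surface slope, switching gain, boundary layer
    x_ref = 10.0                     # position set point [m]

    dt, t_end = 0.05, 200.0
    x, v = 0.0, 0.0
    for step in range(int(t_end / dt)):
        t = step * dt
        w = 2.0e4 * np.sin(0.1 * t)              # slowly varying environmental load (placeholder)
        e, e_dot = x - x_ref, v
        s = e_dot + lam * e
        u = d * v - m * lam * e_dot - k * np.tanh(s / phi)   # equivalent control + switching term
        a = (u + w - d * v) / m                  # surge acceleration
        v += a * dt
        x += v * dt
    print(f"final position error: {x - x_ref:.3f} m")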
Abstract:
This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make the inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, the efficiency implications of environmental tax recycling, the endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal theoretical efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference that arises is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and fewer explicit technological options, likely have effects on the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented. (C) 2009 Elsevier B.V. All rights reserved.
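One concrete implication mentioned above is that, along an efficient path, the CO2 price (the marginal abatement cost) rises at the interest rate. The Python sketch below illustrates that rule for a stylized linear marginal-abatement-cost curve and a fixed cumulative abatement budget; it is a toy calculation, not the EPPA model.

    # Toy illustration of the intertemporal efficiency rule noted above: allocate
    # a fixed cumulative abatement budget so that marginal abatement cost (the
    # CO2 price) rises at the interest rate.  Numbers are arbitrary placeholders.
    T = 10            # periods
    r = 0.05          # interest rate
    c = 2.0           # slope of a linear marginal abatement cost curve: MAC(a) = c * a
    A_total = 100.0   # cumulative abatement budget over the horizon

    growth = [(1 + r) ** t for t in range(T)]
    p0 = c * A_total / sum(growth)           # initial CO2 price that exactly exhausts the budget
    prices = [p0 * g for g in growth]        # price grows at rate r
    abatement = [p / c for p in prices]      # per-period abatement implied by MAC(a) = price

    for t in range(T):
        print(f"t={t}: price = {prices[t]:6.2f}, abatement = {abatement[t]:6.2f}")
    print("cumulative abatement:", round(sum(abatement), 2))   # equals A_total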
Abstract:
In this second paper, the three structural measures which have been developed are used in the modelling of a three-stage centrifugal synthesis gas compressor. The goal of this case study is to determine the essential mathematical structure which must be incorporated into the compressor model to accurately model the shutdown of this system. A simple, accurate and functional model of the system is created via the three structural measures. It was found that the model can be correctly reduced to its basic modes and that the order of the differential system can be reduced from 51st to 20th. Of the 31 differential equations, 21 reduce to algebraic relations, 8 become constants and 2 can be deleted, thereby increasing the algebraic set from 70 to 91 equations. An interpretation is also obtained as to which physical phenomena dominate the dynamics of the compressor and whether the compressor will enter surge during the shutdown. Comparisons of the reduced model performance against the full model are given, showing the accuracy and applicability of the approach. Copyright (C) 1996 Elsevier Science Ltd
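The three structural measures themselves are not described in the abstract; the sketch below only illustrates the underlying idea of ranking how fast each state of a linearized model evolves and flagging fast modes as candidates for reduction to algebraic (quasi-steady-state) relations. The random stable system stands in for the linearized compressor model, and the eigenvector-magnitude "participation" shown is only a rough indicator.

    # Illustrative time-scale screening for model reduction: modes dominated by
    # fast eigenvalues are candidates to be replaced by algebraic relations.
    # The random stable system below is a stand-in for the linearized model.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 8
    A = rng.normal(size=(n, n))
    A -= (np.max(np.linalg.eigvals(A).real) + 5.0) * np.eye(n)   # shift to make the system stable

    eigvals, eigvecs = np.linalg.eig(A)
    time_constants = 1.0 / np.abs(eigvals.real)                  # seconds, per mode

    threshold = 0.2   # modes faster than this (hypothetical) cutoff are treated as "fast"
    for i, (lam, tau) in enumerate(zip(eigvals, time_constants)):
        tag = "fast -> candidate algebraic relation" if tau < threshold else "slow -> keep as ODE"
        print(f"mode {i}: Re(lambda) = {lam.real:7.2f}, tau = {tau:.3f} s  ({tag})")

    # Magnitudes of the right eigenvectors give a rough indication of which
    # states participate in each mode (and hence which states could be reduced).
    participation = np.abs(eigvecs)
    print(np.array2string(participation, precision=2))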