917 results for Simulation in robotics


Relevance:

80.00%

Publisher:

Abstract:

Due to the dynamic and multihop nature of the Mobile Ad-hoc Network (MANET), voice communication over MANET may encounter many challenges. We set up a subjective quality evaluation model using the ITU-T E-model with extensions, and through simulation in NS-2 we evaluate how the following factors impact voice quality in a MANET: the number of hops, the number of route breakages, the number of communication pairs, and the background traffic. Using AODV as the underlying routing protocol, and with the MAC layer changed from 802.11 DCF to 802.11e EDCF, we observe that 802.11e is more suitable for implementing voice communication over MANET. © 2005 IEEE.
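
The extended E-model referenced above maps measurable network impairments to a scalar transmission rating R, which in turn maps to a mean opinion score (MOS). As a hedged illustration of the kind of computation involved, the sketch below uses a commonly cited piecewise simplification of the ITU-T G.107 delay impairment; the default rating R0 = 93.2 and the example impairment values are generic assumptions, not figures from the paper.

```python
# Minimal sketch of an ITU-T G.107 (E-model) style quality rating.
# The piecewise delay impairment is a common simplification, not the
# full standard; the impairment inputs below are illustrative assumptions.

def delay_impairment(d_ms: float) -> float:
    """One-way-delay impairment Id (simplified piecewise form)."""
    if d_ms < 177.3:
        return 0.024 * d_ms
    return 0.024 * d_ms + 0.11 * (d_ms - 177.3)

def r_factor(d_ms: float, ie_eff: float, r0: float = 93.2) -> float:
    """R = R0 - Id - Ie_eff (advantage factor A omitted)."""
    return r0 - delay_impairment(d_ms) - ie_eff

def mos_from_r(r: float) -> float:
    """Map the R rating to a mean opinion score per G.107."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)

# Example: a multihop path adding 120 ms one-way delay, with a combined
# codec/packet-loss impairment of 15 (hypothetical values).
print(mos_from_r(r_factor(d_ms=120.0, ie_eff=15.0)))
```

Route breakages and background traffic enter such a model indirectly, through the delay and loss statistics they induce.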

Relevance:

80.00%

Publisher:

Abstract:

In this paper, a new open-winding control strategy is proposed for a brushless doubly fed reluctance generator (BDFRG) used for stand-alone wind-turbine or ship generators. The BDFRG is characterized by two windings on the stator: a power winding and a control winding. The control winding is fed by dual two-level three-phase converters, and a vector control scheme based on space vector pulsewidth modulation (SVPWM) is designed. Compared with traditional three-level inverter systems, the dc-link voltage and the voltage rating of the power devices in the proposed system are reduced by 50%, while the increased number of switching modes greatly improves the reliability, redundancy, and fault tolerance of the system. Its performance is evaluated by simulation in MATLAB/Simulink and by an experimental study on a 42-kW prototype machine.
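
The vector control scheme rests on standard space vector modulation. The sketch below is a minimal, generic two-level SVPWM dwell-time calculation, not the paper's controller: in the open-winding arrangement each of the dual converters runs such a modulator, and the control winding sees the difference of their output vectors, which is consistent with the halved dc-link voltage noted above.

```python
import math

def svpwm_dwell_times(v_ref: float, theta: float, v_dc: float, t_s: float):
    """Dwell times for one two-level converter using standard SVPWM.

    v_ref: magnitude of the reference voltage vector (V)
    theta: reference vector angle in radians, 0 <= theta < 2*pi
    v_dc:  dc-link voltage (V); t_s: switching period (s)
    Returns (sector, t1, t2, t0): the active sector, the dwell times of the
    two adjacent active vectors, and the remaining zero-vector time.
    """
    sector = int(theta // (math.pi / 3)) % 6        # 60-degree sector index
    alpha = theta - sector * (math.pi / 3)          # angle inside the sector
    m = math.sqrt(3.0) * v_ref / v_dc               # modulation index
    t1 = t_s * m * math.sin(math.pi / 3 - alpha)    # first adjacent vector
    t2 = t_s * m * math.sin(alpha)                  # second adjacent vector
    return sector, t1, t2, t_s - t1 - t2
```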

Relevance:

80.00%

Publisher:

Abstract:

This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to the user, will counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens, high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems and the collected data were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected for the human-subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing; this would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.
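
A common way to frame such pre-compensation is as inverse filtering: given the eye's point-spread function (PSF) computed from the measured wavefront aberration, pre-filter the image so that the eye's blur approximately cancels. The sketch below is a minimal Wiener-style version under that framing, not the dissertation's method; the regularization constant `k` and the PSF layout are assumptions.

```python
import numpy as np

def precompensate(image: np.ndarray, psf: np.ndarray, k: float = 0.01) -> np.ndarray:
    """Wiener-style pre-compensation sketch.

    image: grayscale image with values in [0, 1]; psf: eye's point-spread
    function, assumed already centered at index [0, 0]; k: regularization
    that limits amplification at frequencies the PSF nearly destroys.
    """
    H = np.fft.fft2(psf, s=image.shape)            # eye's transfer function
    G = np.conj(H) / (np.abs(H) ** 2 + k)          # regularized inverse filter
    pre = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    # A display can only render a limited dynamic range -- one of the
    # shortcomings the dissertation targets -- so clip as a crude stand-in.
    return np.clip(pre, 0.0, 1.0)
```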

Relevance:

80.00%

Publisher:

Abstract:

Prior research has established that idiosyncratic volatility of the securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling.

A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to account for the dynamics of the conditional covariance matrices that are employed.

The stochastic aspects of the expected return of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time-series of the securities are fitted into a probability distribution that matches the time-series characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets.

The results indicate an increase in risk-return performance. Choosing the Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the comparison for benchmarking, the new greedy technique clearly outperforms others using a sample of the S&P500 and the Russell 1000 securities. The resulting improvements in performance are consistent among five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
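
The abstract does not spell out the greedy algorithm, so the sketch below is only one plausible shape for it: weight is added in small increments, each time to the asset that most improves a historical VaR objective. The step size, objective, and data layout are illustrative assumptions, not the dissertation's algorithm.

```python
import numpy as np

def greedy_weights(returns: np.ndarray, step: float = 0.01, alpha: float = 0.05):
    """Greedy portfolio construction sketch.

    returns: (T, N) array of simulated or historical asset returns.
    At each of the 1/step iterations, add `step` of weight to whichever
    asset most raises the alpha-quantile (historical VaR) of the portfolio.
    """
    _, n_assets = returns.shape
    w = np.zeros(n_assets)

    def objective(weights: np.ndarray) -> float:
        # Lower-tail quantile of portfolio returns; higher means smaller VaR.
        return float(np.quantile(returns @ weights, alpha))

    for _ in range(int(round(1.0 / step))):
        trial = [objective(w + step * np.eye(n_assets)[i]) for i in range(n_assets)]
        w[int(np.argmax(trial))] += step
    return w
```

In the dissertation, the returns matrix would itself come from the Monte Carlo draws and the IDCC-GARCH covariance dynamics described above.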

Relevance:

80.00%

Publisher:

Abstract:

The first chapter analyzes conditional assistance programs. They generate conflicting relationships between international financial institutions (IFIs) and member countries. The experience of IFIs with conditionality in the 1990s led them to allow countries more latitude in the design of their reform programs. A reformist government does not need conditionality, and conditionality is useless if the government does not want to reform. A government that faces opposition may use conditionality and the help of pro-reform lobbies as a lever to counteract anti-reform groups and succeed in implementing reforms.

The second chapter analyzes economies saddled with taxes and regulations. I consider an economy in which many taxes, subsidies, and other distortionary restrictions are in place simultaneously. Starting from a laissez-faire equilibrium that is inefficient because of some domestic distortion, a small trade tax or subsidy can yield a first-order welfare improvement, even if the instrument itself creates distortions of its own. This may result in "welfare paradoxes". The purpose of the chapter is to quantify the welfare effects of changes in tax rates in a small open economy. I conduct the simulation in the context of an intertemporal utility-maximization framework, applying numerical methods to the model developed by Karayalcin. I introduce changes in the tax rates and quantify the impact on welfare, consumption, and foreign assets, as well as the path to the new steady-state values.

The third chapter studies the role of stock markets and adjustment costs in the international transmission of supply shocks. The analysis of a positive supply shock that originates in one of the countries shows that on impact the shock leads to an immediate stock market boom in that country, which enjoys the technological advance, while the other country suffers depressed stock market prices as demand for its equity declines. A period of adjustment begins, culminating in steady-state capital and output levels identical to those before the shock. The capital stock of one country undergoes a non-monotonic adjustment. The model is tested with plausible values of the variables, and the numerical results confirm the predictions of the theory.

Relevance:

80.00%

Publisher:

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and on the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with partial turbulence simulation. In this dissertation, the test procedure and scaling requirements for tests with partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of the load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings like roof pavers are not well covered by many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of the existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
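
The dissertation derives the joint probability of the two processes analytically; the sketch below conveys the same quasi-steady idea numerically, under stated assumptions: the measured high-frequency pressure coefficients are rescaled by the squared gust speed of a Gaussian low-frequency velocity fluctuation whose intensity equals the missing part of the turbulence spectrum. It is a rough stand-in, not the proposed method.

```python
import numpy as np

rng = np.random.default_rng(0)

def full_scale_peak_cp(cp_hf: np.ndarray, iu_missing: float,
                       n: int = 200_000, q: float = 0.9999) -> float:
    """Quasi-steady combination sketch for partial turbulence simulation.

    cp_hf:      measured pressure-coefficient record (high-frequency process)
    iu_missing: turbulence intensity of the unsimulated low-frequency part
    Returns a high quantile of the combined process as a stand-in for the
    full-scale equivalent peak coefficient.
    """
    cp = rng.choice(cp_hf, size=n)                    # high-frequency draws
    gust = 1.0 + iu_missing * rng.standard_normal(n)  # (U + u_low) / U
    return float(np.quantile(cp * gust ** 2, q))      # quasi-steady rescaling
```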

Relevance:

80.00%

Publisher:

Abstract:

The first chapter analyzes conditional assistance programs. They generate conflicting relationships between international financial institutions (IFIs) and member countries. The experience of IFIs with conditionality in the 1990s led them to allow countries more latitude in the design of their reform programs. A reformist government does not need conditionality, and conditionality is useless if the government does not want to reform. A government that faces opposition may use conditionality and the help of pro-reform lobbies as a lever to counteract anti-reform groups and succeed in implementing reforms.

The second chapter analyzes economies saddled with taxes and regulations. I consider an economy in which many taxes, subsidies, and other distortionary restrictions are in place simultaneously. Starting from a laissez-faire equilibrium that is inefficient because of some domestic distortion, a small trade tax or subsidy can yield a first-order welfare improvement, even if the instrument itself creates distortions of its own. This may result in "welfare paradoxes". The purpose of the chapter is to quantify the welfare effects of changes in tax rates in a small open economy. I conduct the simulation in the context of an intertemporal utility-maximization framework, applying numerical methods to the model developed by Karayalcin. I introduce changes in the tax rates and quantify the impact on welfare, consumption, and foreign assets, as well as the path to the new steady-state values.

The third chapter studies the role of stock markets and adjustment costs in the international transmission of supply shocks. The analysis of a positive supply shock that originates in one of the countries shows that on impact the shock leads to an immediate stock market boom in that country, which enjoys the technological advance, while the other country suffers depressed stock market prices as demand for its equity declines. A period of adjustment begins, culminating in steady-state capital and output levels identical to those before the shock. The capital stock of one country undergoes a non-monotonic adjustment. The model is tested with plausible values of the variables, and the numerical results confirm the predictions of the theory.

Relevance:

80.00%

Publisher:

Abstract:

This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to the user, will counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, with a single-lens, high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems and the collected data were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method produced a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected for the human-subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing; this would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment of the pre-compensation process to yield maximum viewing enhancement.

Relevance:

80.00%

Publisher:

Abstract:

The study aims to examine the methodology of realistic simulation as a facilitator of the teaching-learning process in nursing, and is justified by the possibility of proposing conditions that envisage improvements in the training process, with a view to assessing the impact of new teaching and learning strategies in the formative areas of health and nursing. It is a descriptive study with a quantitative and qualitative approach, conducted as action research and focused on teaching through realistic simulation of Nursing in Primary Care at a public institution of higher education. The research was developed in the Comprehensive Health Care II discipline, which is offered in the third year of the course in order to prepare the nursing student for the Primary Health Care internship. The study population comprised 40 subjects: 37 students and 3 teachers of that discipline. Data collection took place from February to May 2014 and was performed using questionnaires and semi-structured interviews. The following sequence was adopted: identification of the use of simulation in the discipline targeted by the intervention; consultation with professors about the possibility of implementing the survey; investigation of the syllabus of the discipline, its objectives, skills, and abilities; preparation of the plan for the execution of the intervention; preparation of the checklist for skills training; construction and execution of simulation scenarios; and evaluation of the scenarios. Quantitative data were analyzed using simple descriptive statistics (percentages), and qualitative data through the collective subject discourse technique. A high-fidelity simulation based on the use of a standardized patient was inserted in the curriculum of the course under study, and three cases were created and executed. In the students' view, the simulation contributed to the synthesis of the contents covered in the Comprehensive Health Care II discipline (100%), with scores between 8 and 10 (100%) assigned to the executed scenarios. In addition, the simulation generated a considerable percentage of high expectations for the activities of the discipline (70.27%) and proved to be a strategy for generating student satisfaction (97.30%). Of the 97.30% who claimed to be quite satisfied with the activities proposed by the discipline, 94.59% of the sample indicated the simulation as a determining factor for that satisfaction. Regarding the students' perception of the simulation strategy, the most prominent category was the possibility of prior experience of practice (23.91%). Nervousness was one of the most cited negative aspects of the experience in simulated scenarios (50.0%), while the most representative positive point (63.89%) was the approximation to the reality of Primary Care. In addition, the 3 professors of the discipline were trained in the simulation methodology. The study highlighted the contribution of realistic simulation to teaching and learning in nursing, and showed that this strategy generates expectation and satisfaction among undergraduate nursing students.

Relevance:

80.00%

Publisher:

Abstract:

The understanding of the occurrence and flow of groundwater in the subsurface is of fundamental importance for water exploitation, as is knowledge of the whole associated hydrogeological context. Given the nature of sedimentary aquifers, these factors are primarily controlled by the geometry of the pore system. Thus, microstructural characterization, including the interconnectivity of the system, is essential to determine the macro properties of the reservoir rock, porosity and permeability, which can be approached through statistical characterization based on two-dimensional analysis. The latter is carried out on a computing platform, using images of thin sections of the reservoir rock, allowing the prediction of effective porosity and hydraulic conductivity. For the Barreiras Aquifer, such parameters are usually derived from the interpretation of aquifer tests, a practice that involves fairly complex logistics in terms of equipment and personnel, in addition to a high cost of operation. Thus, digital image analysis and processing is presented as an alternative tool for the characterization of hydraulic parameters, proving to be a practical and inexpensive method. The methodology is based on a workflow involving sampling, preparation of thin sections and their respective images, segmentation and geometric characterization, three-dimensional reconstruction, and flow simulation. In this research, computational image analysis of thin sections has shown that aquifer storage coefficients range from 0.035 to 0.12, with an average of 0.076, while the hydrogeological substrate (associated with the top of the carbonate sequence that does not outcrop in the region) presents effective porosities on the order of 2%. For the transport regime, the methodology yields hydraulic conductivities below those found in the literature, with a mean value of 1.04 × 10⁻⁶ m/s and fluctuations between 2.94 × 10⁻⁶ m/s and 3.61 × 10⁻⁸ m/s, probably due to the larger scale of the study and the heterogeneity of the medium.
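
The platform described above reconstructs the pore system in 3-D and simulates flow; as a much simpler hedged illustration of how a segmented thin-section image relates to the macro properties discussed, the sketch below computes 2-D porosity from a binary pore mask and a Kozeny-Carman style permeability estimate. The grain diameter and fluid properties are assumed values, not data from the study.

```python
import numpy as np

def porosity(pore_mask: np.ndarray) -> float:
    """Fraction of pixels segmented as pore space in a binary thin-section image."""
    return float(pore_mask.mean())

def kozeny_carman_permeability(phi: float, d_grain: float) -> float:
    """Kozeny-Carman estimate k = d^2 * phi^3 / (180 * (1 - phi)^2), in m^2."""
    return d_grain ** 2 * phi ** 3 / (180.0 * (1.0 - phi) ** 2)

def hydraulic_conductivity(k: float, rho: float = 998.0,
                           g: float = 9.81, mu: float = 1.0e-3) -> float:
    """K = k * rho * g / mu (m/s), for water near 20 degrees C."""
    return k * rho * g / mu

# Example with assumed values: 12% porosity, 0.2 mm characteristic grain size.
k = kozeny_carman_permeability(phi=0.12, d_grain=2.0e-4)
print(hydraulic_conductivity(k))   # on the order of 1e-6 m/s
```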

Relevance:

80.00%

Publisher:

Abstract:

Prior research has established that idiosyncratic volatility of the securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling.

A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: a) increase the efficiency of the portfolio optimization process, b) implement large-scale optimizations, and c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to account for the dynamics of the conditional covariance matrices that are employed.

The stochastic aspects of the expected return of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time-series of the securities are fitted into a probability distribution that matches the time-series characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets.

The results indicate an increase in risk-return performance. Choosing the Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the comparison for benchmarking, the new greedy technique clearly outperforms others using a sample of the S&P500 and the Russell 1000 securities. The resulting improvements in performance are consistent among five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).

Relevance:

80.00%

Publisher:

Abstract:

The past years have witnessed an increased use of applied games for developing and evaluating communication skills. These skills benefit from interpersonal interactions. Providing feedback to students practicing communication skills is difficult in a traditional class setting with one teacher and many students. This logistic challenge may be partly overcome by providing training using a simulation in which a student practices with communication scenarios. A scenario is a description of a series of interactions, where at each step the player is faced with a choice. We have developed a scenario editor that enables teachers to develop scenarios for practicing communication skills. A teacher can develop a scenario without knowledge of the implementation. This paper presents the implementation architecture for such a scenario-based simulation.
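
The paper presents the architecture rather than a data model, but the core notion above, a scenario as a series of interactions with a choice at each step, suggests a small branching structure. The sketch below is one plausible representation, with all names hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str                    # option presented to the student
    feedback: str                # teacher-authored feedback for this option
    next_step: "Step | None"     # continuation of the scenario; None ends it

@dataclass
class Step:
    prompt: str                  # what the virtual interlocutor says
    choices: list["Choice"] = field(default_factory=list)

def play(step: "Step | None", pick) -> None:
    """Walk a scenario; `pick(step)` returns the index of the chosen option."""
    while step is not None and step.choices:
        choice = step.choices[pick(step)]
        print(choice.feedback)
        step = choice.next_step
```

An editor in this spirit would let a teacher author Step and Choice records through a form, leaving the simulation engine untouched, which matches the paper's goal of authoring without implementation knowledge.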

Relevance:

80.00%

Publisher:

Abstract:

Real estate is by nature a hands-on business in which real-world experience and new challenges are the best teachers. With this in mind, graduate real estate education has embraced case competitions as a way to apply education-based learning to real-world project simulation. In recent years, teams from Cornell have consistently stood out in these competitions, making impressions and forming relationships that they will carry with them over their careers. In this issue of the Review, we recognize a composite of previous winners of the four major real estate-focused case competitions and look back on what was a very successful year for case competition teams at Cornell. The case competitions draw students from all the constituent programs of Real Estate at Cornell, including the Baker Program, the Johnson Graduate School of Management, City and Regional Planning, Architecture, and Landscape Architecture.

Relevance:

80.00%

Publisher:

Abstract:

The occurrence frequency of failure events serves as a critical index representing the safety status of dam-reservoir systems. Although overtopping is the most common failure mode and has significant consequences, this type of event in most cases has a small probability. Estimating such rare-event risks for dam-reservoir systems with crude Monte Carlo (CMC) simulation requires a prohibitively large number of trials, and significant computational resources are needed to reach satisfactory estimates; otherwise, the estimates are not accurate enough. In order to reduce the computational expense and improve the efficiency of risk estimation, an importance sampling (IS) based simulation approach is proposed in this dissertation to address the overtopping risks of dam-reservoir systems. Deliverables of this study mainly include the following five aspects: 1) the reservoir inflow hydrograph model; 2) the dam-reservoir system operation model; 3) the CMC simulation framework; 4) the IS-based Monte Carlo (ISMC) simulation framework; and 5) a comparison of the overtopping risk estimates from the CMC and ISMC simulations. In a broader sense, this study meets the following three expectations: 1) to address the natural stochastic characteristics of the dam-reservoir system, such as the reservoir inflow rate; 2) to build up the fundamental CMC and ISMC simulation frameworks of the dam-reservoir system in order to estimate the overtopping risks; and 3) to compare the simulation results and the computational performance in order to demonstrate the advantages of ISMC simulation. The estimates of overtopping probability can be used to guide future dam safety investigations and studies, and to supplement conventional analyses in decision making on dam-reservoir system improvements. At the same time, the proposed ISMC simulation methodology is reasonably robust and is shown to improve overtopping risk estimation: the more accurate estimates, smaller variance, and reduced CPU time expand the applicability of Monte Carlo (MC) techniques to evaluating rare-event risks for infrastructure.
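
As a self-contained illustration of why importance sampling pays off here, the sketch below estimates a rare exceedance probability for a toy inflow model: a lognormal peak inflow exceeding a high threshold stands in for the full hydrograph-routing overtopping model of the dissertation. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

MU, SIGMA, THRESHOLD = 5.0, 0.6, 1200.0   # toy inflow model (assumed values)
N = 100_000

def overtops(inflow: np.ndarray) -> np.ndarray:
    """Failure indicator: peak inflow exceeds the overtopping threshold."""
    return inflow > THRESHOLD

# Crude Monte Carlo: sample the nominal lognormal inflow distribution.
x = rng.lognormal(MU, SIGMA, N)
p_cmc = overtops(x).mean()

# Importance sampling: draw from a distribution shifted toward the failure
# region, then correct each sample by the likelihood ratio f_nominal / f_IS.
MU_IS = 6.5
y = rng.lognormal(MU_IS, SIGMA, N)
log_w = ((np.log(y) - MU_IS) ** 2 - (np.log(y) - MU) ** 2) / (2.0 * SIGMA ** 2)
p_is = (overtops(y) * np.exp(log_w)).mean()

print(f"CMC: {p_cmc:.2e}   IS: {p_is:.2e}")
# With the same N, the IS estimator concentrates samples where failures
# occur, giving far smaller variance for the same CPU time -- the effect
# the dissertation exploits for dam-reservoir overtopping risk.
```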