983 results for Composite Dynamic Object
Abstract:
Highly redundant or statically indeterminate structures, such as cable-stayed bridges, are of particular concern to the engineering community because of the complex parameters that must be taken into account for structural health monitoring. The purpose of this study was to verify the reliability and practicability of using GPS to characterize the dynamic oscillations of small-span bridges. The test was carried out on a cable-stayed wood footbridge at the Escola de Engenharia de Sao Carlos-Universidade de Sao Paulo, Brazil. First, a static load trial was carried out to estimate the deck amplitude and oscillation frequency. Next, a calibration trial was performed by applying a well-known oscillation to the rover antenna to check the detectable limits of the method in that environment. Finally, a dynamic load trial was carried out using GPS and a displacement transducer to measure the deck oscillation; the displacement transducer was used only to confirm the results obtained by the GPS. The results show that the frequencies and displacement amplitudes obtained by the GPS are in good agreement with the displacement transducer responses. GPS can therefore be used as a reliable tool to characterize the dynamic behavior of large structures, such as cable-stayed footbridges, undergoing dynamic loads.
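The abstract does not describe the signal-processing step, but characterizing deck oscillations from a GPS (or transducer) displacement record typically reduces to finding the dominant frequency and amplitude of the time series. The sketch below is a minimal, hypothetical illustration of that step using an FFT; the sampling rate and the synthetic signal are assumptions, not data from the study.

```python
import numpy as np

def dominant_oscillation(displacement, sample_rate_hz):
    """Return (frequency_hz, amplitude) of the strongest spectral peak.

    displacement : 1-D array of deck displacements (e.g. from GPS), in metres.
    sample_rate_hz : sampling rate of the record.
    """
    n = len(displacement)
    # Remove the static offset so the peak reflects the oscillation, not the mean.
    signal = displacement - np.mean(displacement)
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate_hz)
    peak = np.argmax(np.abs(spectrum[1:])) + 1    # skip the DC bin
    amplitude = 2.0 * np.abs(spectrum[peak]) / n  # single-sided amplitude
    return freqs[peak], amplitude

# Synthetic example only: a 2.1 Hz, 5 mm oscillation sampled at 20 Hz.
t = np.arange(0, 60, 1 / 20)
record = 0.005 * np.sin(2 * np.pi * 2.1 * t) + 0.0005 * np.random.randn(t.size)
print(dominant_oscillation(record, 20))
```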
Abstract:
Vessel dynamic positioning (DP) systems are typically based on conventional PID-type controllers and an extended Kalman filter. However, they involve a difficult tuning procedure, and the closed-loop performance varies with environmental and loading conditions, since the dynamics of the vessel are strongly nonlinear. Gain scheduling is normally used to address the nonlinearity of the system. To overcome these problems, a sliding mode controller was evaluated. This controller is robust to variations in environmental and loading conditions, maintains performance and stability over a large range of conditions, and has a simple tuning methodology. The performance of the controller was evaluated numerically and experimentally to assess its effectiveness. The results are compared with those obtained from a conventional PID controller.
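The abstract does not give the control law, but the core idea of sliding mode control can be illustrated with a minimal single-axis sketch: drive a sliding variable built from the position and velocity errors to zero with a switching (here saturated) term. Everything below — the plant model, gains, and boundary-layer width — is an illustrative assumption, not the controller evaluated in the paper.

```python
import numpy as np

def smc_step(x, x_ref, dt, m=1.0e6, lam=0.2, K=5.0e5, phi=0.05):
    """One control step of a toy 1-DOF sliding mode controller.

    x      : (position, velocity) of the vessel along one axis [m, m/s]
    x_ref  : desired position [m]
    m      : assumed vessel mass [kg]; lam, K, phi are illustrative gains.
    """
    pos, vel = x
    e, e_dot = pos - x_ref, vel
    s = e_dot + lam * e                   # sliding variable
    sat = np.clip(s / phi, -1.0, 1.0)     # boundary layer instead of sign()
    u = -m * lam * e_dot - K * sat        # thrust command [N]
    # Simple double-integrator plant update (no environmental forces).
    vel += (u / m) * dt
    pos += vel * dt
    return (pos, vel), u

state, setpoint = (5.0, 0.0), 0.0         # start 5 m off station
for _ in range(600):                       # 60 s at dt = 0.1 s
    state, thrust = smc_step(state, setpoint, dt=0.1)
print(round(state[0], 3), round(state[1], 3))
```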
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, using simple system identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model.
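The abstract names the compartments but not the equations, so the sketch below is only a generic compartmental-model illustration of how an SAIC-style system can be integrated numerically; the flow structure and all rate constants are assumptions for demonstration, not the fitted model from the paper.

```python
def saic_step(state, dt, beta=0.4, delta=0.1, alpha=0.25, gamma=0.05):
    """Advance a hypothetical SAIC compartmental model by one Euler step.

    state : fractions (S, A, I, C) of susceptible, antidotal, infectious
            and contaminated machines (assumed to sum to 1).
    beta  : infection rate from contact between S and I (assumed).
    delta : rate at which infectious machines become contaminated (assumed).
    alpha : rate at which machines receive antidotal software (assumed).
    gamma : repair rate of contaminated machines back to susceptible (assumed).
    """
    S, A, I, C = state
    new_infections = beta * S * I
    dS = -new_infections - alpha * S + gamma * C
    dA = alpha * S
    dI = new_infections - delta * I
    dC = delta * I - gamma * C
    return tuple(x + dt * dx for x, dx in zip(state, (dS, dA, dI, dC)))

state = (0.95, 0.04, 0.01, 0.0)       # illustrative initial network
for _ in range(int(365 / 0.1)):        # one year, dt = 0.1 day
    state = saic_step(state, dt=0.1)
print([round(x, 3) for x in state])
```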
Abstract:
The Piracicaba, Capivari, and Jundiaí River Basins (RB-PCJ) are located mainly in the State of São Paulo, Brazil. Using a system dynamics simulation model (WRM-PCJ) to assess water resources sustainability, five 50-year simulations were run. WRM-PCJ was developed as a tool to aid decision and policy makers on the RB-PCJ Watershed Committee. The model has 254 variables and was calibrated and validated using information available from the 1980s. The Falkenmark Water Stress Index went from 1,403 m³ person⁻¹ year⁻¹ in 2004 to 734 m³ person⁻¹ year⁻¹ in 2054, and the Xu Sustainability Index from 0.44 to 0.20. In 2004, the Keller River Basin Development Phase was Conservation, and by 2054 it was Augmentation. The three criteria used to evaluate water resources showed that the watershed is at a crucial water resources management turning point. The WRM-PCJ performed well and proved to be an excellent tool for decision and policy makers at RB-PCJ.
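For context, the Falkenmark index quoted here is simply the renewable freshwater availability per person per year, usually read against fixed stress thresholds. The sketch below shows that arithmetic; the thresholds are the commonly cited Falkenmark bands, and the population and volume figures are illustrative assumptions rather than values from the WRM-PCJ model.

```python
def falkenmark_index(renewable_water_m3_per_year, population):
    """Renewable freshwater availability per capita [m3 person^-1 year^-1]."""
    return renewable_water_m3_per_year / population

def stress_category(index):
    """Classify using the commonly cited Falkenmark thresholds."""
    if index >= 1700:
        return "no stress"
    if index >= 1000:
        return "water stress"
    if index >= 500:
        return "water scarcity"
    return "absolute scarcity"

# Illustrative numbers only (not taken from the RB-PCJ model):
idx = falkenmark_index(renewable_water_m3_per_year=7.0e9, population=5.0e6)
print(round(idx), stress_category(idx))   # 1400 m3/person/yr -> "water stress"
```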
Abstract:
Using a system dynamics model developed specifically for the Piracicaba, Capivari and Jundiaí River Basins (BH-PCJ) as a tool to help policy and decision makers analyze water resources management alternatives, five simulations over a 50-year timeframe were performed. The model estimates water supply and demand, as well as wastewater generation by the consumers in the BH-PCJ. One run was performed keeping the mean precipitation constant and the current water supply and demand rates, the business-as-usual scenario. Under these assumptions, water demand is expected to increase by about 76%, about 39% of the available water volume will come from wastewater reuse, and the waste load will increase by about 91%. The Falkenmark Index will change from 1,403 m³ person⁻¹ year⁻¹ in 2004 to 734 m³ person⁻¹ year⁻¹ by 2054, and the Sustainability Index from 0.44 to 0.20. Another four simulations were performed by varying the annual precipitation to 90% and 110% of the mean, considering an ecological flow equal to 30% of the mean daily flow, and keeping the same rates for all other factors except ecological flow and household water consumption. All of them showed a tendency toward a water crisis in the near future at the BH-PCJ.
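The ~76% demand increase over the 50-year horizon implies a modest compound annual growth rate, which is the kind of arithmetic a business-as-usual projection rests on. The snippet below just shows that calculation; the starting demand value is an arbitrary placeholder, not a figure from the BH-PCJ model.

```python
# Compound annual growth rate implied by a 76% increase over 50 years.
total_growth = 1.76
years = 50
annual_rate = total_growth ** (1 / years) - 1
print(f"implied annual growth: {annual_rate:.2%}")   # ~1.14% per year

# Placeholder projection (starting demand is illustrative, not model output).
demand_2004 = 100.0                                   # arbitrary units
demand_2054 = demand_2004 * (1 + annual_rate) ** years
print(round(demand_2054, 1))                          # ~176.0
```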
Abstract:
This article presents the results obtained from an experimental device designed for the accurate determination of the wood/water relationship on microsamples. The moisture content of the sample is measured with a highly sensitive electronic microbalance, and two dimensions of the sample are recorded continuously and without contact using high-speed laser scan micrometers. The whole device is placed in a climatic chamber. The microsamples investigated were prepared with a diamond wire saw. The unique ability of this device to work with small samples allowed normal, opposite, and reaction wood to be characterized separately. Experiments were carried out on three wood species (beech, spruce, and poplar). In the case of beech, a deviation from the linear relation between tangential shrinkage and moisture content is particularly noticeable between 40% and 20% moisture content during the first desorption. A localized collapse of ray cells could explain this result. Compared to normal wood, a pronounced longitudinal shrinkage and a low tangential shrinkage were observed in the compression wood of spruce. Both the tension wood and the opposite wood of poplar exhibit a high longitudinal shrinkage, but no significant difference between the three types of wood is noticeable in the tangential direction.
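The quantities tracked by the device reduce to two standard wood-science definitions: moisture content on an oven-dry basis from the microbalance readings, and shrinkage relative to the swollen dimension from the laser-micrometer readings. The sketch below simply encodes those definitions; the numerical readings are invented for illustration and are not data from the article.

```python
def moisture_content(mass_g, oven_dry_mass_g):
    """Moisture content (%) on an oven-dry basis, from microbalance readings."""
    return 100.0 * (mass_g - oven_dry_mass_g) / oven_dry_mass_g

def shrinkage(dimension_mm, saturated_dimension_mm):
    """Shrinkage (%) relative to the fully swollen (green) dimension,
    from the laser-micrometer readings."""
    return 100.0 * (saturated_dimension_mm - dimension_mm) / saturated_dimension_mm

# Illustrative readings only (not data from the article):
print(round(moisture_content(0.01520, 0.01180), 1))   # ~28.8 % MC
print(round(shrinkage(1.962, 2.000), 2))              # ~1.90 % tangential shrinkage
```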
Abstract:
This study evaluated the influence of gastrointestinal environmental factors (pH, digestive enzymes, food components, medicaments) on the survival of Lactobacillus casei Shirota and Lactobacillus casei LC01, using a semi-dynamic in vitro model that simulates the transit of microorganisms through the human gastrointestinal tract (GIT). The strains were first exposed to different simulated gastric juices for different periods of time (0, 30, 60 and 120 min), and then to simulated intestinal fluids for 0, 120, 180 and 240 min, in a step-wise format. The number of viable cells was determined after each step. The influence of food residues (skim milk) in the fluids and the resistance to medicaments commonly used for varied therapeutic purposes (analgesics, antiarrhythmics, antibiotics, antihistaminics, proton pump inhibitors, etc.) were also evaluated. The results indicated that survival of both cultures was pH- and time-dependent, and that digestive enzymes had little influence. Milk components had a protective effect, and medicaments, especially anti-inflammatory drugs, markedly influenced the viability of the probiotic cultures, indicating that the beneficial health effects of the two probiotic cultures depend on environmental factors encountered in the human gastrointestinal tract.
Abstract:
This paper is concerned with the problem of argument-function mismatch observed in the apparent subject-object inversion in Chinese consumption verbs, e.g., chi 'eat' and he 'drink', and accommodation verbs, e.g., zhu 'live' and shui 'sleep'. These verbs seem to allow the linking of [agent-SUBJ theme-OBJ] as well as [agent-OBJ theme-SUBJ], but only when the agent is also the semantic role denoting the measure or extent of the action. The account offered is formulated within LFG's lexical mapping theory. Under the simplest and also the strictest interpretation of the one-to-one argument-function mapping principle (or the theta-criterion), a composite role such as ag-ext receives syntactic assignment via one composing role only. One-to-one linking thus entails the suppression of the other composing role. Apparent subject-object inversion occurs when the more prominent agent role is suppressed and thus allows the less prominent extent role to dictate the linking of the entire ag-ext composite role. This LMT account also potentially facilitates a natural explanation of markedness among the competing syntactic structures.
Abstract:
Granule impact deformation has long been recognised as important in determining whether or not two colliding granules will coalesce. Work in the last 10 years has highlighted the fact that viscous effects are significant in granulation. The relative strengths of different formulations can vary with strain rate. Therefore, traditional strength measurements made at pseudo-static conditions give no indication, even qualitatively, of how materials will behave at high strain rates, and hence are actually misleading when used to model granule coalescence. This means that new standard methods need to be developed for determining the strain rates encountered by granules inside industrial equipment and also for measuring the mechanical properties of granules at these strain rates. The constitutive equations used in theoretical models of granule coalescence also need to be extended to include strain-rate dependent components.
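The strain-rate argument here connects to how granule coalescence is usually screened: a widely cited criterion (the viscous Stokes number analysis of Ennis and co-workers) compares a granule's impact kinetic energy with viscous dissipation in the binder layer. The sketch below is only that textbook criterion, not a model from this paper, and the parameter values are invented for illustration.

```python
import math

def viscous_stokes_number(density, radius, velocity, viscosity):
    """St_v = 8 * rho * r * u0 / (9 * mu) -- kinetic energy vs viscous dissipation."""
    return 8.0 * density * radius * velocity / (9.0 * viscosity)

def critical_stokes_number(restitution, liquid_layer, asperity_height):
    """St* = (1 + 1/e) * ln(h / h_a); coalescence is predicted when St_v < St*."""
    return (1.0 + 1.0 / restitution) * math.log(liquid_layer / asperity_height)

# Invented parameter values for illustration (SI units):
st_v = viscous_stokes_number(density=1500.0, radius=0.5e-3, velocity=0.1, viscosity=0.1)
st_c = critical_stokes_number(restitution=0.3, liquid_layer=50e-6, asperity_height=5e-6)
print(st_v, st_c, "coalescence" if st_v < st_c else "rebound")
```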
Abstract:
Results of two experiments are reported that examined how people respond to rectangular targets of different sizes in simple hitting tasks. If a target moves in a straight line and a person is constrained to move along a linear track oriented perpendicular to the target's motion, then the length of the target along its direction of motion constrains the temporal accuracy and precision required to make the interception. The dimensions of the target perpendicular to its direction of motion place no constraints on performance in such a task. In contrast, if the person is not constrained to move along a straight track, the target's dimensions may constrain the spatial as well as the temporal accuracy and precision. The experiments reported here examined how people responded to targets of different vertical extent (height): the task was to strike targets that moved along a straight, horizontal path. In experiment 1 participants were constrained to move along a horizontal linear track to strike targets and so target height did not constrain performance. Target height, length and speed were co-varied. Movement time (MT) was unaffected by target height but was systematically affected by length (briefer movements to smaller targets) and speed (briefer movements to faster targets). Peak movement speed (Vmax) was influenced by all three independent variables: participants struck shorter, narrower and faster targets harder. In experiment 2, participants were constrained to move in a vertical plane normal to the target's direction of motion. In this task target height constrains the spatial accuracy required to contact the target. Three groups of eight participants struck targets of different height but of constant length and speed, hence constant temporal accuracy demand (different for each group, one group struck stationary targets = no temporal accuracy demand). On average, participants showed little or no systematic response to changes in spatial accuracy demand on any dependent measure (MT, Vmax, spatial variable error). The results are interpreted in relation to previous results on movements aimed at stationary targets in the absence of visual feedback.
Abstract:
Studies concerning the processing of natural scenes using eye movement equipment have revealed that observers retain surprisingly little information from one fixation to the next. Other studies, in which fixation remained constant while elements within the scene were changed, have shown that, even without refixation, objects within a scene are surprisingly poorly represented. Although this effect has been studied in some detail in static scenes, there has been relatively little work on scenes as we would normally experience them, namely dynamic and ever changing. This paper describes a comparable form of change blindness in dynamic scenes, in which detection is performed in the presence of simulated observer motion. The study also describes how change blindness is affected by the manner in which the observer interacts with the environment, by comparing detection performance of an observer as the passenger or driver of a car. The experiments show that observer motion reduces the detection of orientation and location changes, and that the task of driving causes a concentration of object analysis on or near the line of motion, relative to passive viewing of the same scene.
Abstract:
This review reflects the state of the art in the study of contact and dynamic phenomena occurring in cold roll forming. The importance of taking these phenomena into account stems from the significant machine time and tooling costs spent on replacing worn-out forming rolls and adjusting equipment in cold roll forming. Predictive modelling of the tool wear caused by contact and dynamic phenomena can reduce the production losses in this technological process.
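The review itself does not prescribe a wear law, but predictive tool-wear modelling commonly starts from Archard's relation, in which worn volume scales with normal load and sliding distance and inversely with hardness. The sketch below is only that generic starting point, not one of the models surveyed in the review, and the roll/contact values are invented for illustration.

```python
def archard_wear_volume(load_n, sliding_distance_m, hardness_pa, wear_coefficient):
    """Archard's wear law: V = K * F * s / H (worn volume in m^3)."""
    return wear_coefficient * load_n * sliding_distance_m / hardness_pa

# Invented illustration: one forming stand over a production campaign.
volume = archard_wear_volume(
    load_n=2.0e4,              # assumed roll contact load [N]
    sliding_distance_m=5.0e4,  # assumed accumulated slip at the contact [m]
    hardness_pa=6.0e9,         # assumed roll surface hardness (~HV 600)
    wear_coefficient=1.0e-6,   # assumed dimensionless wear coefficient
)
print(f"{volume * 1e9:.1f} mm^3 of roll material removed")
```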
Abstract:
While a number of studies have shown that object-extracted relative clauses are more difficult to understand than subject-extracted counterparts for second language (L2) English learners (e.g., Izumi, 2003), less is known about why this is the case and how they process these complex sentences. This exploratory study examines the potential applicability of Gibson's (1998, 2000) Syntactic Prediction Locality Theory (SPLT), a theory proposed to predict first language (L1) processing difficulty, to L2 processing and considers whether the theory might also account for the processing difficulties of subject- and object-extracted relative clauses encountered by L2 learners. Results of a self-paced reading time experiment from 15 Japanese learners of English are mainly consistent with the reading time profile predicted by the SPLT and thus suggest that the L1 processing theory might also be able to account for L2 processing difficulty.
Abstract:
In an open channel, the transition from super- to sub-critical flow is a flow singularity (the hydraulic jump) characterised by a sharp rise in free-surface elevation, strong turbulence and air entrainment in the roller. A key feature of the hydraulic jump flow is the strong free-surface aeration and air-water flow turbulence. In the present study, similar experiments were conducted with identical inflow Froude numbers Fr1 using a geometric scaling ratio of 2:1. The results of the Froude-similar experiments showed some drastic scale effects in the smaller hydraulic jumps in terms of void fraction, bubble count rate and bubble chord time distributions. Void fraction distributions implied comparatively greater detrainment at low Reynolds numbers, yielding lesser aeration of the jump roller. The dimensionless bubble count rates were significantly lower in the smaller channel, especially in the mixing layer. The bubble chord time distributions were quantitatively close in both channels and did not scale according to a Froude similitude. Simply put, the hydraulic jump remains a fascinating two-phase flow motion that is still poorly understood.
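The scale-effect argument rests on how quantities transform under a Froude similitude: for a geometric scale ratio λ, velocities and times scale with √λ, frequencies (such as bubble count rates) scale with 1/√λ, and the Reynolds number is not preserved. The snippet below just tabulates those standard ratios for the 2:1 ratio used here; it is a similitude bookkeeping aid, not data from the experiments.

```python
import math

def froude_scale_factors(length_ratio):
    """Prototype-to-model scale factors under a Froude similitude."""
    lam = length_ratio
    return {
        "length": lam,
        "velocity": math.sqrt(lam),
        "time": math.sqrt(lam),
        "frequency (e.g. bubble count rate)": 1.0 / math.sqrt(lam),
        "discharge": lam ** 2.5,
        "Reynolds number": lam ** 1.5,   # not preserved -> potential scale effects
    }

for quantity, factor in froude_scale_factors(2.0).items():
    print(f"{quantity:>36s}: x{factor:.3f}")
```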
Abstract:
This paper reports the results of an experiment involving a sample of 204 members of the public who were assessed on three occasions about their willingness to pay for the conservation of the mahogany glider. They were asked this question prior to information being provided to them about the glider and other focal wildlife species, after such information was provided, and finally after participants had had an opportunity to see live specimens of the glider. The mean willingness-to-pay values of the relevant samples are compared and found to show significant variations. Theories are considered that help explain the dynamics of these variations. Serious concerns are raised about the capacity of information provision to reveal ‘true’ contingent valuations of public goods.