876 results for user behavior model
Abstract:
Aircraft systems are highly nonlinear and time-varying. High-performance aircraft at high angles of incidence experience undesired coupling of the lateral and longitudinal variables, resulting in departure from normal controlled flight. The aim of this work is to construct a robust closed-loop control that optimally extends the stable and decoupled flight envelope. Nonlinear analysis methods are needed to study such systems. Previously, bifurcation techniques have been used mainly to analyze open-loop nonlinear aircraft models and investigate control effects on dynamic behavior. In this work, linear feedback control designs calculated by eigenstructure assignment methods are investigated for a simple aircraft model at a fixed flight condition. Bifurcation analysis in conjunction with linear control design methods is shown to aid control law design for the nonlinear system.
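As a hedged illustration of the linear feedback design step (a toy two-state system with invented numbers, not the paper's aircraft model), the sketch below places the closed-loop eigenvalues by state feedback; pole placement is the single-input special case of eigenstructure assignment.

```python
import numpy as np

# Toy 2-state open-loop system in controller canonical form:
# x' = A x + B u, with open-loop characteristic polynomial s^2 + a1 s + a0.
a0, a1 = -1.0, 0.5          # chosen so the open loop has one unstable eigenvalue
A = np.array([[0.0, 1.0],
              [-a0, -a1]])
B = np.array([[0.0], [1.0]])

# Desired closed-loop eigenvalues -1 and -2, i.e. s^2 + 3 s + 2.
d0, d1 = 2.0, 3.0

# In controller canonical form, the feedback u = -K x with
# K = [d0 - a0, d1 - a1] places the poles exactly.
K = np.array([[d0 - a0, d1 - a1]])

Acl = A - B @ K
eig = np.sort(np.linalg.eigvals(Acl).real)
print(eig)   # approximately [-2., -1.]
```

Eigenstructure assignment generalizes this by additionally shaping the closed-loop eigenvectors, which is what decouples the lateral and longitudinal responses.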
Abstract:
The theoretical understanding of online shopping behavior has received much attention. Less focus has been given to the formation of the customer experience (CE) that results from online shopper interactions with e-retailers. This study develops and empirically tests a model of the relationship between antecedents and outcomes of online customer experience (OCE) within Internet shopping websites using an international sample. The study identifies and provides operational measures of these variables plus the cognitive and affective components of OCE. The paper makes contributions towards new knowledge and understanding of how e-
Abstract:
The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applied to children. In an experimental design, 81 children aged 9–12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms or on inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations and checks; as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children.
Abstract:
Research into design methodology is one of the most challenging issues in the field of persuasive technology. However, the introduction of the Persuasive Systems Design model and the consideration of the 3-Dimensional Relationship between Attitude and Behavior offer to make persuasive technologies more practically viable. In this paper we demonstrate how the 3-Dimensional Relationship between Attitude and Behavior guides the analysis of the persuasion context in the Persuasive Systems Design model. As a result, we propose a modification of the persuasion context and assert that the technology should be analyzed as part of the strategy rather than the event.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will enable investigation of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limits on computing power have severely constrained such investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
Abstract:
We present a model of market participation in which the presence of non-negligible fixed costs leads to random censoring of the traditional double-hurdle model. Fixed costs arise when household resources must be devoted a priori to the decision to participate in the market. These costs, usually of time, are manifested in non-negligible minimum-efficient supplies and a supply correspondence that requires modification of the traditional Tobit regression. The costs also complicate econometric estimation of household behavior. These complications are overcome by application of the Gibbs sampler. The algorithm thus derived provides robust estimates of the fixed-costs double-hurdle model. The model and procedures are demonstrated in an application to milk market participation in the Ethiopian highlands.
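The censoring mechanism can be made concrete with a small synthetic sketch (the data-generating process and all numbers below are invented for illustration; this is not the paper's estimator): a household supplies the market only when latent desired supply exceeds a random, unobserved minimum-efficient quantity induced by fixed costs, which is what distinguishes this setting from a standard Tobit censored at a known constant.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10000

# Latent desired supply y* = 0.8 x + e; fixed costs impose a household-
# specific minimum-efficient supply s_min, so the market is entered
# only when y* exceeds s_min (random censoring, threshold unobserved).
x = rng.normal(1.0, 1.0, n)
y_star = 0.8 * x + rng.normal(0.0, 1.0, n)
s_min = np.abs(rng.normal(0.5, 0.2, n))   # random, strictly positive threshold

participate = y_star > s_min
y_obs = np.where(participate, y_star, 0.0)

print(participate.mean())            # market participation rate
print(y_obs[participate].min() > 0)  # True: observed supplies exceed thresholds
```

Because the censoring point varies across households and is never observed, the likelihood has no simple closed form, which is the motivation for the Gibbs-sampling approach described in the abstract.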
Abstract:
Revealing the evolution of well-organized social behavior requires understanding the mechanism by which collective behavior is produced. A well-organized group may be produced by two possible mechanisms: central control or distributed control. In the latter case, local interactions between interchangeable components underlie the collective behavior. We focused on the simple behavior of individual ants and analyzed the interactions between pairs of ants. In an experimental set-up, we placed workers in a hemisphere without a nest, food, or a queen, and recorded their trajectories, obtaining the temporal pattern of velocity of each ant. From this bottom-up approach, we found the following characteristic behavior of single workers and pairs of workers: (1) the activity of each individual has a rhythmic component; (2) interactions between a pair of individuals result in two types of coupling, namely anti-phase and in-phase coupling. Direct physical contacts between the pair of workers might cause a phase shift of the rhythmic components in individual ants. We also build a simple model based on coupled oscillators as a step toward understanding whole-colony behavior.
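The two observed coupling modes can be reproduced with the simplest pair of coupled phase oscillators (a generic Kuramoto-type sketch with invented parameters, not necessarily the authors' specific model): attractive coupling locks the pair in phase, repulsive coupling locks it in anti-phase.

```python
import math

def phase_difference(K, steps=20000, dt=0.01):
    """Integrate two identical coupled phase oscillators (Euler method)
    and return the final phase difference folded into [0, 2*pi)."""
    w = 1.0                      # common natural frequency (rhythmic component)
    th1, th2 = 0.3, 2.0          # arbitrary initial phases
    for _ in range(steps):
        d1 = w + K * math.sin(th2 - th1)
        d2 = w + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    return (th2 - th1) % (2 * math.pi)

print(phase_difference(+0.5))   # near 0  -> in-phase coupling
print(phase_difference(-0.5))   # near pi -> anti-phase coupling
```

The phase difference obeys dφ/dt = -2K sin φ, so φ = 0 is stable for K > 0 and φ = π is stable for K < 0, matching the two coupling types reported for ant pairs.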
Abstract:
The gamergate (generally called the “queen”) of the Diacamma sp. walks around in the nest and comes into contact with the workers. The gamergate informs the workers of its presence by physical contact. This behavior is called a “patrol.” In previous work, it was reported that the gamergate controls its patrolling time depending on the colony size. How does the gamergate know the colony size, and how does it control the patrolling time? In this article, we propose a simple dynamics to explain this behavior. We assume that the gamergate and the workers have internal states which interact by physical contacts. By numerical simulations, we confirm that the patrol time of the proposed model depends on the size of the colony.
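The qualitative claim that patrol time grows with colony size can be illustrated with a deliberately crude stand-in for the proposed internal-state dynamics (the stopping rule below is an invented assumption, not the paper's model): if the gamergate keeps patrolling until it has contacted every worker at least once, patrol time follows coupon-collector scaling, roughly N ln N.

```python
import random

def patrol_time(colony_size, seed=0):
    """Toy patrol model (hypothetical stopping rule): each time step the
    gamergate contacts one uniformly random worker, and the patrol ends
    once every worker has been contacted at least once."""
    rng = random.Random(seed)
    contacted = set()
    t = 0
    while len(contacted) < colony_size:
        contacted.add(rng.randrange(colony_size))
        t += 1
    return t

# Patrol time grows with colony size, roughly like N * ln(N).
print(patrol_time(10), patrol_time(100))
```

Any contact-based stopping rule of this kind makes patrol time an increasing function of colony size without the gamergate ever "knowing" the size explicitly, which is the flavor of explanation the abstract proposes.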
Abstract:
Health care provision is significantly impacted by the ability of health providers to engineer a viable healthcare space that supports the needs of care stakeholders. In this paper we discuss and propose the use of organisational semiotics as a set of methods to link stakeholders to systems, which allows us to capture clinician activity, information transfer, and building use; this in turn allows us to define the value of specific systems in the care environment to specific stakeholders and the dependence between systems in a care space. We suggest the use of a semantically enhanced building information model (BIM) to support the linking of clinician activity to physical resource objects and space, and to facilitate the capture of quantifiable data, over time, concerning resource use by key stakeholders. Finally, we argue for the inclusion of appropriate stakeholder feedback and persuasive mechanisms to incentivise building-user behaviour that supports organisational-level sustainability policy.
Abstract:
The interannual variability of the stratospheric polar vortex during winter in both hemispheres is observed to correlate strongly with the phase of the quasi-biennial oscillation (QBO) in tropical stratospheric winds. It follows that the lack of a spontaneously generated QBO in most atmospheric general circulation models (AGCMs) adversely affects the nature of polar variability in such models. This study examines QBO–vortex coupling in an AGCM in which a QBO is spontaneously induced by resolved and parameterized waves. The QBO–vortex coupling in the AGCM compares favorably to that seen in reanalysis data [from the 40-yr ECMWF Re-Analysis (ERA-40)], provided that careful attention is given to the definition of QBO phase. A phase angle representation of the QBO is employed that is based on the two leading empirical orthogonal functions of equatorial zonal wind vertical profiles. This yields a QBO phase that serves as a proxy for the vertical structure of equatorial winds over the whole depth of the stratosphere and thus provides a means of subsampling the data to select QBO phases with similar vertical profiles of equatorial zonal wind. Using this subsampling, it is found that the QBO phase that induces the strongest polar vortex response in early winter differs from that which induces the strongest late-winter vortex response. This is true in both hemispheres and for both the AGCM and ERA-40. It follows that the strength and timing of QBO influence on the vortex may be affected by the partial seasonal synchronization of QBO phase transitions that occurs both in observations and in the model. This provides a mechanism by which changes in the strength of QBO–vortex correlations may exhibit variability on decadal time scales. In the model, such behavior occurs in the absence of external forcings or interannual variations in sea surface temperatures.
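The phase-angle construction can be sketched on synthetic data (an idealized downward-propagating wave stands in for observed equatorial winds; the EOF recipe follows the description above, the data are invented): the two leading EOFs of the wind profiles yield principal components whose polar angle tracks the QBO's progress through its cycle.

```python
import numpy as np

# Synthetic QBO-like record: a downward-propagating zonal-wind wave
# (invented stand-in for ERA-40 equatorial profiles), 240 months, 30 levels.
nt, nz, period = 240, 30, 28.0
t = np.arange(nt)[:, None]                # months
z = np.linspace(0.0, 1.0, nz)[None, :]    # normalized log-pressure height
u = np.cos(2 * np.pi * (t / period - 2.0 * z))

# Two leading EOFs of the anomaly matrix via SVD; project to get the PCs.
ua = u - u.mean(axis=0)
_, _, vt = np.linalg.svd(ua, full_matrices=False)
pc = ua @ vt[:2].T

# QBO phase angle: polar angle in the (PC1, PC2) plane, unwrapped in time.
phase = np.unwrap(np.arctan2(pc[:, 1], pc[:, 0]))

# Net phase progression in QBO cycles (~8.5 over 240 months at a 28-month period).
cycles = abs(phase[-1] - phase[0]) / (2 * np.pi)
print(cycles)
```

Because the angle encodes the full vertical profile rather than the wind at one level, subsampling by similar phase angles selects similar vertical wind profiles, which is the point of the diagnostic described above.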
Abstract:
Background. Within a therapeutic gene-by-environment (G×E) framework, we recently demonstrated that variation in the serotonin transporter promoter polymorphism (5HTTLPR) and marker rs6330 in the nerve growth factor gene (NGF) is associated with poorer outcomes following cognitive behaviour therapy (CBT) for child anxiety disorders. The aim of this study was to explore one potential means of extending the translational reach of G×E data in a way that may be clinically informative. We describe a 'risk-index' approach combining genetic, demographic and clinical data and test its ability to predict diagnostic outcome following CBT in anxious children. Method. DNA and clinical data were collected from 384 children with a primary anxiety disorder undergoing CBT. We tested our risk model in five cross-validation training sets. Results. In predicting treatment outcome, six variables had a minimum mean beta value of 0.5: 5HTTLPR, NGF rs6330, gender, primary anxiety severity, comorbid mood disorder and comorbid externalising disorder. A risk index (range 0–8) constructed from these variables had moderate predictive ability (AUC = 0.62–0.69) in this study. Children scoring high on this index (5–8) were approximately three times as likely to retain their primary anxiety disorder at follow-up as children scoring 2 or less. Conclusion. Significant genetic, demographic and clinical predictors of outcome following CBT for anxiety-disordered children were identified. Combining these predictors within a risk index could be used to identify which children are less likely to be diagnosis-free following CBT alone and thus require longer or enhanced treatment. The 'risk-index' approach represents one means of harnessing the translational potential of G×E data.
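The arithmetic of such a risk index is easy to illustrate on synthetic data (every number below is invented; the six predictors are treated as generic binary markers, not the study's actual codings or weights): binary predictors are summed into an integer score, and outcome rates are compared across score strata.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000  # synthetic sample for illustration only, not the study's data

# Six hypothetical binary predictors; the true outcome probability is
# made to rise with their sum so that the index carries signal.
X = rng.integers(0, 2, size=(n, 6))
score = X.sum(axis=1)                  # additive risk index (0-6 here)
p_retain = 0.15 + 0.10 * score         # retention probability grows with score
retained = rng.random(n) < p_retain

# Relative risk of retaining the diagnosis: high scorers vs. low scorers.
high = score >= 5
low = score <= 2
rr = retained[high].mean() / retained[low].mean()
print(round(rr, 1))   # roughly 2x under these invented numbers
```

The appeal of this construction in a clinical setting is its transparency: the score is a count a clinician can compute by hand, and thresholds (here 5+ vs. 2 or less) map directly onto stratified outcome rates.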
Abstract:
Existing research has given little attention to the relationship between cultural characteristics and consumers' internal beliefs, particularly in the pre-purchase stage, and to how this relationship affects consumers' purchase decisions. This paper considers the theory of cognitive dissonance and its extended model (the 3D-RAB) as a means to study the current distribution of consumers' pre-purchase cognitive dissonance, which allows us to investigate the effects of cultural characteristics on this distribution. Results revealed that the individualism-versus-collectivism and high power distance dimensions, from Hofstede's cultural model, influence consumers' pre-purchase cognitive dissonance. These dimensions should be considered in the design of e-commerce websites, by tailoring motivational/influence methods and techniques to reflect the targeted consumers' culture.
Abstract:
We use observations of N2O and mean age to identify realistic transport in models in order to explain their ozone predictions. The results are applied to 15 chemistry climate models (CCMs) participating in the 2010 World Meteorological Organization ozone assessment. Comparison of the observed and simulated N2O, mean age and their compact correlation identifies models with fast or slow circulations and reveals details of model ascent and tropical isolation. This process‐oriented diagnostic is more useful than mean age alone because it identifies models with compensating transport deficiencies that produce fortuitous agreement with mean age. The diagnosed model transport behavior is related to a model’s ability to produce realistic lower stratosphere (LS) O3 profiles. Models with the greatest tropical transport problems compare poorly with O3 observations. Models with the most realistic LS transport agree more closely with LS observations and each other. We incorporate the results of the chemistry evaluations in the Stratospheric Processes and their Role in Climate (SPARC) CCMVal Report to explain the range of CCM predictions for the return‐to‐1980 dates for global (60°S–60°N) and Antarctic column ozone. Antarctic O3 return dates are generally correlated with vortex Cly levels, and vortex Cly is generally correlated with the model’s circulation, although model Cl chemistry and conservation problems also have a significant effect on return date. In both regions, models with good LS transport and chemistry produce a smaller range of predictions for the return‐to‐1980 ozone values. This study suggests that the current range of predicted return dates is unnecessarily broad due to identifiable model deficiencies.
Abstract:
The behavior of the ensemble Kalman filter (EnKF) is examined in the context of a model that exhibits a nonlinear chaotic (slow) vortical mode coupled to a linear (fast) gravity wave of a given amplitude and frequency. It is shown that accurate recovery of both modes is enhanced when covariances between fast and slow normal-mode variables (which reflect the slaving relations inherent in balanced dynamics) are modeled correctly. More ensemble members are needed to recover the fast, linear gravity wave than the slow, vortical motion. Although the EnKF tends to diverge in the analysis of the gravity wave, the filter divergence is stable and does not lead to a great loss of accuracy. Consequently, provided the ensemble is large enough and observations are made that reflect both time scales, the EnKF is able to recover both time scales more accurately than optimal interpolation (OI), which uses a static error covariance matrix. For OI it is also found to be problematic to observe the state at a frequency that is a subharmonic of the gravity wave frequency, a problem that is in part overcome by the EnKF. However, error in the modeled gravity wave parameters can be detrimental to the performance of the EnKF and remove its implied advantages, suggesting that a modified algorithm or a method for accounting for model error is needed.
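A minimal stochastic-EnKF analysis step conveys the mechanism by which fast-slow covariances matter (a generic textbook sketch with invented numbers, not the paper's experimental setup): the ensemble's sample cross-covariance lets an observation of one variable correct an unobserved one as well.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, r_obs = 200, 0.01

# Truth and a biased forecast ensemble for a two-variable state; the
# off-diagonal forecast covariance mimics the slaving of fast to slow modes.
truth = np.array([1.0, -0.5])
P_f = np.array([[0.5, 0.3],
                [0.3, 0.4]])
ens = rng.multivariate_normal(truth + 0.8, P_f, size=n_ens).T  # shape (2, n_ens)

H = np.array([[1.0, 0.0]])                 # observe the first variable only
y = truth[0] + rng.normal(0.0, np.sqrt(r_obs))

# Kalman gain from the ensemble sample covariance; perturbed-obs update.
Pe = np.cov(ens)
K = Pe @ H.T / (H @ Pe @ H.T + r_obs)
y_pert = y + rng.normal(0.0, np.sqrt(r_obs), n_ens)
ens_a = ens + K @ (y_pert - H @ ens)

err_f = np.abs(ens.mean(axis=1) - truth)    # forecast-mean error
err_a = np.abs(ens_a.mean(axis=1) - truth)  # analysis-mean error
print(err_f, err_a)  # both components improve, including the unobserved one
```

If the off-diagonal entries of the sample covariance are badly estimated (e.g., with too few members), the update to the unobserved component degrades, which is the small-scale analogue of the ensemble-size sensitivity discussed above.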