939 results for Simulation experiments
Abstract:
Two experimental studies were conducted to examine whether the stress-buffering effects of behavioral control on work task responses varied as a function of procedural information. Study 1 manipulated low and high levels of task demands, behavioral control, and procedural information for 128 introductory psychology students completing an in-basket activity. ANOVA procedures revealed a significant three-way interaction among these variables in the prediction of subjective task performance and task satisfaction. It was found that procedural information buffered the negative effects of task demands on ratings of performance and satisfaction only under conditions of low behavioral control. This pattern of results suggests that procedural information may have a compensatory effect when the work environment is characterized by a combination of high task demands and low behavioral control. Study 2 (N=256) utilized simple and complex versions of the in-basket activity to examine the extent to which the interactive relationship among task demands, behavioral control, and procedural information varied as a function of task complexity. There was further support for the stress-buffering role of procedural information on work task responses under conditions of low behavioral control. This effect was, however, only present when the in-basket activity was characterized by high task complexity, suggesting that the interactive relationship among these variables may depend on the type of tasks performed at work.
Abstract:
As a new research method supplementing the existing qualitative and quantitative approaches, agent-based modelling and simulation (ABMS) may fit well within the entrepreneurship field because the core concepts and basic premises of entrepreneurship coincide with the characteristics of ABMS (McKelvey, 2004; Yang & Chandra, 2013). Agent-based simulation is a simulation method built on agent-based models, which are composed of heterogeneous agents and their behavioural rules. By repeatedly carrying out agent-based simulations on a computer, researchers can reproduce each agent's behaviour, the agents' interactive process, and the macroscopic phenomena that emerge over time. Using agent-based simulations, researchers may investigate the temporal or dynamic effects of each agent's behaviours.
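As a concrete illustration of the ideas in the abstract above, here is a minimal agent-based simulation sketch in Python. It is not drawn from the cited work: the threshold-based behavioural rule, the early-adopter seeding, and all parameter values are illustrative assumptions. Heterogeneous agents follow a simple local rule, and repeated stepping produces an emergent macroscopic adoption curve over time.

```python
import random

# Minimal agent-based model: heterogeneous agents, a simple local rule,
# and a macroscopic adoption curve that emerges from repeated stepping.
class Agent:
    def __init__(self, threshold):
        self.threshold = threshold   # heterogeneity: each agent differs
        self.adopted = False

    def step(self, adoption_rate):
        # Behavioural rule: adopt once the population-wide adoption
        # rate reaches this agent's personal threshold.
        if not self.adopted and adoption_rate >= self.threshold:
            self.adopted = True

def run(n_agents=1000, n_steps=30, seed=42):
    rng = random.Random(seed)
    agents = [Agent(rng.random() ** 2) for _ in range(n_agents)]
    for a in agents[:50]:            # a few early adopters seed the process
        a.adopted = True
    history = []
    for _ in range(n_steps):
        rate = sum(a.adopted for a in agents) / n_agents
        for a in agents:
            a.step(rate)
        history.append(rate)
    return history                   # the emergent macroscopic trajectory

if __name__ == "__main__":
    print(run())
```

Plotting `history` shows the macro-level S-shaped trajectory emerging from the agents' individual thresholds, the kind of micro-to-macro dynamic the abstract describes.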
Abstract:
Purpose Traditional construction planning relies upon the critical path method (CPM) and bar charts. Both of these methods suffer from visualization and timing issues that could be addressed by 4D technology specifically geared to meet the needs of the construction industry. This paper proposes a new construction planning approach based on simulation using a game engine. Design/methodology/approach A 4D automatic simulation tool was developed and a case study was carried out. The proposed tool was used to simulate and optimize the plans for the installation of a temporary platform for piling in a civil construction project in Hong Kong. The tool simulated the outcome of the construction process with three variables: (1) equipment, (2) site layout and (3) schedule. Through this, the construction team was able to repeatedly simulate a range of options. Findings The results indicate that the proposed approach can provide a user-friendly 4D simulation platform for the construction industry. The simulation can also identify the solution sought by the construction team. The paper also identifies directions for further development of 4D technology as an aid in construction planning and decision-making. Research limitations/implications The tests of the tool are limited to a single case study, and further research is needed to test the use of game engines for construction planning in different construction projects to verify the approach's effectiveness. Future research could also explore the use of alternative game engines and compare their performance and results. Originality/value The authors propose the use of a game engine to simulate the construction process based on resources, working space and the construction schedule. The developed tool can be used by end users without simulation experience.
Abstract:
Aim Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a "safe" academic environment. In radiotherapy training over the last decade or so, this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment; patient setup with lasers; and image-guided radiotherapy software. Results Evaluation of the impact of the virtual reality workflow system highlighted substantial time savings for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the "safe" environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time savings, embedding of a case-study-based approach, increased student confidence, and optimal use of the clinical environment. Ongoing work seeks to determine the impact of simulation on clinical skills.
Abstract:
In this paper, we introduce the Stochastic Adams-Bashforth (SAB) and Stochastic Adams-Moulton (SAM) methods, which extend the tau-leaping framework to make use of past information. Using the theta-trapezoidal tau-leap method of weak order two as a starting procedure, we show that the k-step SAB method with k ≥ 3 is of weak order three in the mean and correlation, while a predictor-corrector implementation of the SAM method is of weak order three in the mean but only order one in the correlation. These convergence results have been derived analytically for linear problems and successfully tested numerically for both linear and non-linear systems. A series of additional examples has been implemented to demonstrate the efficacy of this approach.
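For readers unfamiliar with the tau-leaping framework that the SAB and SAM methods build on, the following Python sketch shows a plain Euler tau-leap for a linear birth-death process. It illustrates only the basic one-step leap; the multistep Adams-type corrections described in the abstract are not implemented, and the rate constants and step size are illustrative assumptions.

```python
import numpy as np

# Euler tau-leap sketch for a linear birth-death process
# (X -> X+1 at rate k1*X, X -> X-1 at rate k2*X). Multistep methods
# such as SAB/SAM build Adams-type corrections on top of this leap.
def tau_leap(x0, k1, k2, tau, n_steps, rng):
    x = x0
    path = [x]
    for _ in range(n_steps):
        # Propensities of the two reaction channels at the current state.
        a_birth, a_death = k1 * x, k2 * x
        # Fire each channel a Poisson number of times over the leap.
        x += rng.poisson(a_birth * tau) - rng.poisson(a_death * tau)
        x = max(x, 0)  # guard against negative populations
        path.append(x)
    return np.array(path)

rng = np.random.default_rng(0)
print(tau_leap(x0=100, k1=0.1, k2=0.11, tau=0.5, n_steps=100, rng=rng)[-1])
```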
Abstract:
Background Biochemical systems with relatively low numbers of components must be simulated stochastically in order to capture their inherent noise. Although there has recently been considerable work on discrete stochastic solvers, there is still a need for numerical methods that are both fast and accurate. The Bulirsch-Stoer method is an established method for solving ordinary differential equations that possesses both of these qualities. Results In this paper, we present the Stochastic Bulirsch-Stoer method, a new numerical method for simulating discrete chemical reaction systems, inspired by its deterministic counterpart. It achieves excellent efficiency because it is based on an approach of high deterministic order, allowing for larger stepsizes and leading to fast simulations. We compare it to the Euler τ-leap, as well as two more recent τ-leap methods, on a number of example problems, and find that, as well as being very accurate, our method is the most robust, in terms of efficiency, of all the methods considered in this paper. The problems it is best suited for are those with larger populations that would be too slow to simulate using Gillespie's stochastic simulation algorithm. For such problems, it is likely to achieve a higher weak order in the moments. Conclusions The Stochastic Bulirsch-Stoer method is a novel stochastic solver that can be used for fast and accurate simulations. Crucially, compared to other similar methods, it better retains its high accuracy when the timesteps are increased. Thus the Stochastic Bulirsch-Stoer method is both computationally efficient and robust. These are key properties for any stochastic numerical method, since a typical study must run many thousands of simulations.
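As context for the comparison mentioned above, here is a compact Python sketch of Gillespie's stochastic simulation algorithm, the exact but comparatively slow baseline that leaping-type solvers aim to outperform at larger populations. It is written for a linear birth-death process, and the rate constants are illustrative assumptions.

```python
import numpy as np

# Gillespie's stochastic simulation algorithm (SSA) for a linear
# birth-death process: exact event-by-event simulation, which becomes
# slow when populations (and hence event rates) are large.
def gillespie(x0, k1, k2, t_end, rng):
    t, x = 0.0, x0
    while t < t_end and x > 0:
        a = np.array([k1 * x, k2 * x])   # channel propensities
        a0 = a.sum()
        t += rng.exponential(1.0 / a0)   # waiting time to next reaction
        if rng.random() < a[0] / a0:     # pick which channel fires
            x += 1
        else:
            x -= 1
    return x

rng = np.random.default_rng(1)
print(gillespie(x0=100, k1=0.1, k2=0.11, t_end=50.0, rng=rng))
```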
Abstract:
The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in traffic safety and efficiency. Autonomous vehicles are able to share information about the local traffic state in real time, which could result in a better response to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within a specific radio range and hence traffic stability. The impact of vehicle-to-vehicle cooperation on the onset of traffic congestion is investigated analytically and through simulation. A Next Generation Simulation (NGSIM) field dataset is used to calibrate the full velocity difference car-following model, and the MOBIL lane-changing model is implemented. The robustness of the calibration as well as the heterogeneity of the drivers is discussed. Assuming that congestion can be triggered either by the heterogeneity of drivers' behaviours or by abnormal lane-changing behaviours, the calibrated car-following model is used to assess the impact of a microscopic cooperative law on egoistic lane-changing behaviours. The cooperative law can help reduce and delay traffic congestion and can have a positive effect on safety indicators.
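The full velocity difference (FVD) car-following model mentioned above has the general form dv/dt = κ[V(Δx) − v] + λΔv, where V is an optimal-velocity function of the headway Δx and Δv is the leader's relative speed. The Python sketch below integrates this model on a ring road; the optimal-velocity function and all parameter values are illustrative assumptions, not the calibrated values from the paper.

```python
import numpy as np

# Full velocity difference (FVD) car-following sketch on a ring road:
#   dv/dt = kappa * (V(gap) - v) + lam * dv_leader
# Optimal-velocity function and parameters are illustrative only.
def optimal_velocity(gap, v_max=15.0, s0=2.0):
    return v_max * (np.tanh(0.1 * (gap - s0)) + np.tanh(0.1 * s0)) / 2.0

def step(pos, vel, road_len, kappa=0.6, lam=0.5, dt=0.1):
    gap = (np.roll(pos, -1) - pos) % road_len   # headway to the leader
    dv = np.roll(vel, -1) - vel                 # leader's relative speed
    acc = kappa * (optimal_velocity(gap) - vel) + lam * dv
    vel = np.clip(vel + acc * dt, 0.0, None)    # no reversing
    pos = (pos + vel * dt) % road_len
    return pos, vel

n, road_len = 20, 400.0
pos = np.linspace(0.0, road_len, n, endpoint=False)
vel = np.full(n, 5.0)
for _ in range(2000):
    pos, vel = step(pos, vel, road_len)
print(f"mean speed after 200 s: {vel.mean():.2f} m/s")
```

Perturbing one vehicle's speed in this kind of setup is the usual way to study the onset of stop-and-go waves, which is the instability mechanism the abstract's cooperative law is designed to damp.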
Abstract:
Take-it-or-leave-it offers are probably as old as mankind. Our objective here is, first, to provide a recollection, probably subjectively colored, of the initial ultimatum game experiment, its motivation and the immediate responses. Second, we discuss extensions of the standard ultimatum bargaining game in a unified framework, and, third, we offer a survey of the experimental ultimatum bargaining literature covering papers published since the turn of the century. The paper argues that the ultimatum game is a versatile tool for research on bargaining and on social preferences. Finally, we provide examples of open research questions and directions for future studies.
Abstract:
A simulation model (PCPF-B) was developed based on the PCPF-1 model to predict the runoff of pesticides from paddy plots to a drainage canal in a paddy block. The block-scale model comprises three modules: (1) a module for pesticide application, (2) a module for pesticide behavior in paddy fields, and (3) a module for pesticide concentration in the drainage canal. The PCPF-B model was first evaluated against published data for a single plot and was then applied to predict the concentration of bensulfuron-methyl in one paddy block in the Sakura river basin, Ibaraki, Japan, where a detailed field survey was conducted. The PCPF-B model simulated the behavior of bensulfuron-methyl in individual paddy plots well. It also reflected the runoff pattern of bensulfuron-methyl at the block outlet, although bensulfuron-methyl concentrations were overestimated owing to uncertainty in the water balance estimation. The application of water management practices such as a water-holding period and seepage control also affected the performance of the model. A probabilistic approach may be necessary for a comprehensive risk assessment in large-scale paddy areas.
Abstract:
The fate of two popular antibiotics, oxytetracycline and oxolinic acid, in a fish pond was simulated using a computational model. The VDC model, which is based on a model for predicting pesticide fate and transport in paddy fields, was modified to take into account the differences between the pond and the paddies, as well as those between fish and rice plant behaviors. The pond conditions were set following typical practice in South East Asian aquaculture. The two antibiotics were administered to the animals in the pond through medicated feed over a period of 5 days, as in actual practice. Concentrations of oxytetracycline in pond water were higher than those of oxolinic acid at the beginning of the simulation. The dissipation rate of oxytetracycline was also higher, as it is more readily available for degradation in the water. Over the long term, oxolinic acid was present at higher concentrations than oxytetracycline in pond water as well as in pond sediment. The simulated results are expected to be conservative and can be useful for lower-tier assessment of the exposure risk of veterinary medicines in the aquaculture industry, but more data are needed for complete validation of the model.
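The crossover described in the abstract, where the faster-dissipating compound starts higher in the water but ends lower, can be illustrated with simple first-order kinetics. The sketch below is a hedged illustration only; the initial concentrations and rate constants are invented for the example and are not the VDC model's calibrated values.

```python
import numpy as np

# First-order dissipation of two compounds in pond water: the one with
# the larger rate constant starts higher but is overtaken over time.
def concentration(c0, k, t):
    return c0 * np.exp(-k * t)  # first-order kinetics

t = np.linspace(0, 60, 7)                              # days after dosing
oxytetracycline = concentration(c0=1.0, k=0.20, t=t)   # dissipates faster
oxolinic_acid   = concentration(c0=0.6, k=0.05, t=t)   # dissipates slower
for day, c_otc, c_oxa in zip(t, oxytetracycline, oxolinic_acid):
    print(f"day {day:4.0f}: OTC {c_otc:.4f}  OXA {c_oxa:.4f}")
```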
Abstract:
BACKGROUND: Monitoring studies have revealed high concentrations of pesticides in the drainage canals of paddy fields. As an assessment tool, it is important to have a way to predict these concentrations under different management scenarios. A simulation model for predicting the pesticide concentration in a paddy block (PCPF-B) was evaluated and then used to assess the effect of water management practices for controlling pesticide runoff from paddy fields. RESULTS: The PCPF-B model achieved an acceptable performance. The model was applied in a constrained probabilistic approach using the Monte Carlo technique to evaluate the best management practices for reducing runoff of pretilachlor into the canal. The probabilistic model predictions, using actual pesticide-use data and hydrological data for the canal, showed that the water holding period (WHP) and the excess water storage depth (EWSD) effectively reduced the loss and concentration of pretilachlor from paddy fields to the drainage canal. The WHP also reduced the timespan of pesticide exposure in the drainage canal. CONCLUSIONS: It is recommended that: (1) the WHP be applied for as long as possible, but for at least 7 days, depending on the pesticide and field conditions; (2) an EWSD greater than 2 cm be maintained to store substantial rainfall in order to prevent paddy runoff, especially during the WHP.
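To make the probabilistic approach concrete, the following toy Monte Carlo sketch samples daily rainfall over the water-holding period (WHP) and scores pesticide loss as overtopping of the excess water storage depth (EWSD) plus a drainage release when the WHP ends. Every distribution and coefficient here is an illustrative assumption and not part of PCPF-B; the sketch only reproduces the qualitative conclusion that longer WHPs and larger EWSDs reduce losses.

```python
import numpy as np

rng = np.random.default_rng(7)

def expected_loss(whp_days, ewsd_cm, k_decay=0.3, n_trials=10_000):
    losses = np.empty(n_trials)
    for i in range(n_trials):
        mass, loss = 1.0, 0.0              # normalized applied mass
        for _ in range(whp_days):
            mass *= np.exp(-k_decay)       # daily in-field dissipation
            rain = rng.exponential(0.8)    # daily rainfall depth, cm
            if rain > ewsd_cm:             # storage overtopped by rain
                frac = 0.2 * (rain - ewsd_cm) / rain   # crude wash-off
                loss += mass * frac
                mass -= mass * frac
        loss += 0.5 * mass                 # drainage release after WHP
        losses[i] = loss
    return losses.mean()

for whp, ewsd in [(3, 1.0), (7, 1.0), (7, 2.0)]:
    print(f"WHP={whp} d, EWSD={ewsd} cm -> mean loss {expected_loss(whp, ewsd):.4f}")
```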
Abstract:
In this paper, we propose a new load distribution strategy called 'send-and-receive' for scheduling divisible loads in a linear network of processors with communication delay. This strategy is designed to utilize the network resources optimally and thereby minimize the processing time of the entire processing load. Closed-form expressions for the optimal sizes of the load fractions and the processing time are derived for the cases where the processing load originates at a processor located at the boundary or in the interior of the network. A condition on processor and link speeds is also derived to ensure that the processors are continuously engaged in the load distribution. This paper also presents a parallel implementation of the 'digital watermarking' problem on a personal computer-based Pentium Linear Network (PLN) topology. Experiments are carried out to study the performance of the proposed strategy, and the results are compared with those of other strategies found in the literature.
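Closed-form results like the ones above rest on the standard divisible-load optimality principle: choose load fractions so that all processors finish at the same instant. The Python sketch below solves this condition numerically for a simple star topology with sequential distribution; it is not the paper's send-and-receive strategy for linear networks, and the speed parameters are illustrative assumptions.

```python
import numpy as np

# Divisible-load principle: pick fractions alpha_i so every processor
# finishes at the same time T. Processor i receives its fraction after
# the earlier ones (communication cost z_j per unit load), then computes
# at cost w_i per unit load. Unknowns: alpha_1..alpha_n and T.
def optimal_fractions(w, z):
    n = len(w)
    A = np.zeros((n + 1, n + 1))
    b = np.zeros(n + 1)
    for i in range(n):
        A[i, : i + 1] = z[: i + 1]   # cumulative communication delay
        A[i, i] += w[i]              # own computation time
        A[i, n] = -1.0               # ... equals the common finish time T
    A[n, :n] = 1.0                   # fractions must sum to one
    b[n] = 1.0
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n]

alphas, T = optimal_fractions(w=np.array([2.0, 3.0, 4.0]),
                              z=np.array([0.5, 0.5, 0.5]))
print("fractions:", np.round(alphas, 4), "finish time:", round(T, 4))
```

Faster processors receive larger fractions, and adding communication delay shifts load toward processors served earlier, which is the trade-off the paper's closed-form expressions capture for linear networks.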
Abstract:
The main objective of statistical analysis of experimental investigations is to make predictions on the basis of mathematical equations while reducing the number of experiments required. Abrasive jet machining (AJM) is an unconventional and novel machining process wherein microabrasive particles are propelled at high velocities onto a workpiece. The resulting erosion can be used for cutting, etching, cleaning, deburring, drilling and polishing. In the study completed by the authors, statistical design of experiments was successfully employed to predict the rate of material removal by AJM. This paper discusses the details of such an approach and the findings.
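As a minimal illustration of the design-of-experiments approach the abstract describes, the sketch below fits a first-order regression model to a two-level full factorial over three coded AJM factors. The factor names and response values are fabricated for illustration and are not the authors' data.

```python
import numpy as np

# Two-level full factorial (2^3) over coded factors, with a first-order
# regression fit for the material removal rate (MRR).
levels = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                   [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1]])
# Hypothetical coded factors: x1 = jet pressure, x2 = stand-off
# distance, x3 = abrasive grain size. Responses below are made up.
mrr = np.array([4.1, 6.0, 4.8, 7.1, 5.2, 7.9, 6.1, 9.0])

X = np.column_stack([np.ones(8), levels])     # intercept + main effects
coef, *_ = np.linalg.lstsq(X, mrr, rcond=None)
print("b0..b3 =", np.round(coef, 3))
# The fitted equation b0 + b1*x1 + b2*x2 + b3*x3 can then predict MRR at
# factor settings that were never run, which is the point of the DOE.
```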
Abstract:
We report here on a series of laboratory experiments on plumes, undertaken with the object of simulating the effect of the heat release that occurs in clouds upon condensation of water vapor. The experimental technique used for this purpose relies on ohmic heating generated in an electrically conducting plume fluid subjected to a suitable alternating voltage across specified axial stations in the plume flow [Bhat et al., 1989]. The present series of experiments achieves a value of the Richardson number toward the lower end of the range that characterizes cumulus clouds. It is found that the buoyancy enhancement due to heating disrupts the eddy structures in the flow and reduces the dilution, owing to entrainment of ambient fluid, that would otherwise have occurred in the central region of the plume. Heating also reduces the spread rate of the plume, but as it accelerates the flow as well, the overall specific mass flux in the plume does not show a very significant change at the heat input employed in the experiment. However, there is some indication that the entrainment rate (proportional to the streamwise derivative of the mass flux) is slightly higher immediately after heat injection and slightly lower farther downstream. The measurements support a previous proposal for a cloud scenario [Bhat and Narasimha, 1996] and demonstrate how fresh insights into certain aspects of the fluid dynamics of clouds may be derived from the experimental techniques employed here.