939 results for Simulation experiments
Abstract:
This paper asks a new question: how can RFID technology be used to market products in supermarkets, and how can its performance, or return on investment (ROI), be measured? We address this question by proposing a simulation model whereby customers become aware of other customers' real-time shopping behavior and may hence be influenced by their purchases and purchase levels. The proposed model is orthogonal to the sales model and can have a similar effect: an increase in overall shopping volume. Managers often struggle to predict the ROI of purchasing such a technology; this simulation sets out to provide answers to questions such as the percentage increase in sales obtained by giving real-time purchase information to other customers. The simulation is also flexible enough to incorporate any given model of customer behavior tailored to a particular supermarket, setting, event or promotion. The results, although preliminary, are promising for the use of RFID technology in marketing products in supermarkets, and they suggest several dimensions along which customers can be influenced: feedback, real-time marketing, targeted advertising and on-demand promotions. Several other aspects are also discussed, including herd behavior, fake customers, privacy, the optimality of the sales-price margin, and the ROI of investing in RFID technology for marketing purposes. © 2010 Springer Science+Business Media B.V.
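A minimal sketch of the kind of simulation this abstract describes (all names and parameter values below are hypothetical, not taken from the paper): each customer buys with a baseline impulse probability that is boosted when recent purchases by other customers are visible, and the lift in total volume over a no-feedback baseline approximates the ROI question.

```python
import random

def simulate(n_customers=1000, base_p=0.05, boost=0.004, window=20,
             feedback=True, seed=1):
    """Expected purchases per customer, with or without real-time purchase feedback."""
    rng = random.Random(seed)
    recent, total = [], 0
    for _ in range(n_customers):
        # visible purchase activity raises the impulse-buy probability
        p = base_p + (boost * sum(recent) if feedback else 0.0)
        bought = rng.random() < min(p, 1.0)
        total += bought
        recent.append(bought)
        if len(recent) > window:      # only the last `window` customers are visible
            recent.pop(0)
    return total / n_customers

lift = simulate(feedback=True) / simulate(feedback=False) - 1
print(f"simulated sales lift from feedback: {lift:.1%}")
```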
Abstract:
This work proposes a supermarket optimization simulation model called Swarm-Moves, based on studies of self-organized complex systems, to identify parameters and parameter values that can influence customers to buy more on impulse in a given period of time. In the proposed model, customers are assumed to have trolleys equipped with technology such as RFID that can pass product information directly from the store to them in real time, and vice versa. Customers can therefore see other customers' purchase patterns while constantly informing the store of their own shopping behavior. This is easily achieved because the trolleys "know" what products they contain at any point. The Swarm-Moves simulation is a virtual supermarket providing a visual display on which to run and test the proposed model. The simulation is also flexible enough to incorporate any given model of customer behavior tailored to a particular supermarket, setting, event or promotion. The results, although preliminary, are promising for the use of RFID technology in marketing products in supermarkets, and they suggest several dimensions along which customers can be influenced: feedback, real-time marketing, targeted advertising and on-demand promotions. ©2009 IEEE.
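The information flow this abstract relies on, where the trolley always "knows" its contents, can be sketched as follows (a hypothetical illustration, not the Swarm-Moves code): RFID tag reads update the trolley state, the trolley reports changes to the store in real time, and the store broadcasts aggregate purchase activity back to other trolleys.

```python
from collections import Counter

class Store:
    def __init__(self):
        self.activity = Counter()               # product -> items currently in trolleys
    def report(self, product, delta):
        self.activity[product] += delta         # trolleys report adds/removes live
    def broadcast(self, top=3):
        return self.activity.most_common(top)   # "what others are buying right now"

class Trolley:
    def __init__(self, store):
        self.store, self.contents = store, Counter()
    def rfid_add(self, product):                # tag read: item placed in trolley
        self.contents[product] += 1
        self.store.report(product, +1)
    def rfid_remove(self, product):             # tag read: item taken back out
        if self.contents[product] > 0:
            self.contents[product] -= 1
            self.store.report(product, -1)

store = Store()
t1, t2 = Trolley(store), Trolley(store)
t1.rfid_add("chocolate"); t1.rfid_add("chocolate"); t2.rfid_add("wine")
print(store.broadcast())   # [('chocolate', 2), ('wine', 1)]
```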
Abstract:
Purpose - The purpose of this paper is to apply the lattice Boltzmann equation method (LBM) with a multiple relaxation time (MRT) model to investigate lid-driven flow in a three-dimensional (3D) rectangular cavity, and to compare the results with flow in an equivalent two-dimensional (2D) cavity. Design/methodology/approach - The second-order MRT model is implemented in a 3D LBM code. The flow structure in cavities of different aspect ratios (0.25-4) and Reynolds numbers (0.01-1000) is investigated. The LBM simulation results are compared with those from numerical solution of the Navier-Stokes (NS) equations and with available experimental data. Findings - The 3D simulations demonstrate that 2D models may predict the flow structure reasonably well at low Reynolds numbers, but significant differences from experimental data appear at high Reynolds numbers. This discrepancy between 2D and 3D results is attributed to the effect of the boundary layers near the side-walls in the transverse direction (in 3D), owing to which the vorticity in the core region is generally weakened. Secondly, owing to the vortex stretching effect present in 3D flow, the vorticity in the transverse plane intensifies whereas that in the lateral plane decays with increasing Reynolds number. However, on the symmetry plane, the variation of the flow structure with cavity aspect ratio is found to be qualitatively consistent with the results of 2D simulations. Secondary flow vortices whose axis is in the direction of the lid motion are observed; these are weak at low Reynolds numbers but become quite strong at high Reynolds numbers. Originality/value - The findings will be useful in the study of a variety of enclosed fluid flows.
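The core of an MRT scheme is a collision step performed in moment space, with each moment relaxed at its own rate. Below is a minimal D2Q9 sketch of that kernel only (2D rather than the paper's 3D model, with illustrative relaxation rates, and with streaming and boundary conditions omitted); computing the equilibrium moments as M·f_eq avoids hand-coding the moment equilibria.

```python
import numpy as np

# D2Q9 velocities (rest, axis, diagonal) and weights
e = np.array([(0,0),(1,0),(0,1),(-1,0),(0,-1),(1,1),(-1,1),(-1,-1),(1,-1)])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

# Lallemand-Luo transformation matrix; rows: rho, e, eps, jx, qx, jy, qy, pxx, pxy
esq = (e**2).sum(1)
M = np.array([np.ones(9),
              3*esq - 4,
              (9*esq**2 - 21*esq + 8) / 2,
              e[:,0], (3*esq - 5)*e[:,0],
              e[:,1], (3*esq - 5)*e[:,1],
              e[:,0]**2 - e[:,1]**2, e[:,0]*e[:,1]])
Minv = np.linalg.inv(M)

tau = 0.8   # shear relaxation time (sets the viscosity)
# one relaxation rate per moment; conserved moments (rho, jx, jy) relax at rate 0
S = np.diag([0.0, 1.1, 1.2, 0.0, 1.2, 0.0, 1.2, 1/tau, 1/tau])

def feq(rho, ux, uy):
    cu = 3*(e[:,0,None]*ux + e[:,1,None]*uy)
    return w[:,None]*rho*(1 + cu + 0.5*cu**2 - 1.5*(ux**2 + uy**2))

def mrt_collide(f, rho, ux, uy):
    # relax moments toward equilibrium: f* = f - M^-1 S M (f - feq)
    return f - Minv @ (S @ (M @ (f - feq(rho, ux, uy))))

# toy check on a handful of lattice nodes
rho, ux, uy = np.ones(4), np.full(4, 0.05), np.zeros(4)
f = feq(rho, ux, uy) * 1.01        # slightly off-equilibrium populations
print(mrt_collide(f, rho, ux, uy).shape)   # (9, 4)
```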
Abstract:
Scratch assays are difficult to reproduce. Here we identify a previously overlooked source of variability which could partially explain this difficulty. We analyse a suite of scratch assays in which we vary the initial degree of confluence (initial cell density). Our results indicate that the rate of re-colonisation is very sensitive to the initial density. To quantify the relative roles of cell migration and proliferation, we calibrate the solution of the Fisher–Kolmogorov model to cell density profiles to provide estimates of the cell diffusivity, D, and the cell proliferation rate, λ. This procedure indicates that the estimates of D and λ are very sensitive to the initial density. This dependence suggests that the Fisher–Kolmogorov model does not accurately represent the details of the collective cell spreading process, since this model assumes that D and λ are constants that ought to be independent of the initial density. Since higher initial cell density leads to enhanced spreading, we also calibrate the solution of the Porous–Fisher model to the data as this model assumes that the cell flux is an increasing function of the cell density. Estimates of D and λ associated with the Porous–Fisher model are less sensitive to the initial density, suggesting that the Porous–Fisher model provides a better description of the experiments.
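As a concrete reference for the calibration step, here is a minimal explicit finite-difference solver for the one-dimensional Fisher–Kolmogorov equation ∂u/∂t = D ∂²u/∂x² + λu(1 − u/K); a calibration would wrap this in a least-squares fit of D and λ to the measured density profiles, and the Porous–Fisher variant would replace the constant D with a density-dependent flux such as D(u/K). Grid sizes and parameter values below are illustrative only.

```python
import numpy as np

def fisher_kolmogorov(u0, D, lam, K, dx, dt, n_steps):
    """Explicit FTCS scheme with zero-flux boundaries; requires dt <= dx**2/(2*D)."""
    u = u0.copy()
    for _ in range(n_steps):
        lap = (np.roll(u, 1) - 2*u + np.roll(u, -1)) / dx**2
        lap[0]  = 2*(u[1]  - u[0])  / dx**2   # no-flux left boundary
        lap[-1] = 2*(u[-2] - u[-1]) / dx**2   # no-flux right boundary
        u = u + dt*(D*lap + lam*u*(1 - u/K))
    return u

x = np.linspace(0, 2000, 201)                       # position (microns), illustrative
u0 = np.where(np.abs(x - 1000) > 300, 0.5, 0.0)     # scratched-out central region
u = fisher_kolmogorov(u0, D=500.0, lam=0.05, K=1.0,
                      dx=x[1]-x[0], dt=0.05, n_steps=2000)
```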
Abstract:
We have evaluated techniques for estimating animal density through direct counts using line transects during 1988-92 in the tropical deciduous forests of Mudumalai Sanctuary in southern India for four species of large herbivorous mammals, namely chital (Axis axis), sambar (Cervus unicolor), Asian elephant (Elephas maximus) and gaur (Bos gaurus). Density estimates derived from the Fourier series and half-normal models consistently had the lowest coefficients of variation. These two models also generated similar mean density estimates. For the Fourier series estimator, appropriate cut-off widths for analysing line transect data for the four species are suggested. Grouping data into various distance classes did not produce any appreciable differences in estimates of mean density or their variances, although model fit was generally better when data were placed in fewer groups. The sampling effort needed to achieve a desired precision (coefficient of variation) in the density estimate is derived. A sampling effort of 800 km of transects returned a 10% coefficient of variation on the density estimate for chital; for the other species a higher effort was needed to achieve this level of precision. There was no statistically significant relationship between the detectability of a group and the size of the group for any species. Density estimates along roads were generally significantly different from those in the interior of the forest, indicating that road-side counts may not be appropriate for most species.
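For the half-normal model the maximum-likelihood machinery is compact enough to sketch: with perpendicular detection distances x, the MLE of the scale is σ̂² = Σx²/n, the effective strip width is σ̂√(π/2), and density follows from n detections over 2 × L × ESW. This is the generic textbook estimator, not the authors' code, and the data below are made up.

```python
import numpy as np

def half_normal_density(distances_m, transect_length_km, group_size=None):
    """Line-transect density with a half-normal detection function g(x)=exp(-x^2/2s^2)."""
    x = np.asarray(distances_m, dtype=float)
    sigma = np.sqrt(np.mean(x**2))            # MLE of the half-normal scale
    esw = sigma * np.sqrt(np.pi / 2)          # effective strip width, metres
    groups_per_m2 = len(x) / (2 * esw * transect_length_km * 1000)
    d = groups_per_m2 * 1e6                   # groups per km^2
    return d * np.mean(group_size) if group_size is not None else d

# illustrative data: 40 group detections over 25 km of transect, mean group size 6
print(half_normal_density([5, 12, 18, 30] * 10, 25.0, group_size=[6] * 40))
```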
Abstract:
This paper presents the results of a series of servo-controlled cyclic triaxial tests and numerical simulations, using the three-dimensional discrete element method (DEM), on the post-liquefaction undrained monotonic strength of granular materials. In the first test series, undrained monotonic tests were carried out after dissipating the excess pore water pressure developed during liquefaction. The influence of parameters such as the amplitude of axial strain, relative density and confining pressure prior to liquefaction on the post-liquefaction undrained response has been investigated. The results highlight an insignificant influence of the amplitude of axial strain and confining pressure, and a significant influence of relative density, on the post-liquefaction undrained monotonic stress-strain response. In the second series, undrained monotonic tests were carried out on similar triaxial samples without dissipating the excess pore water pressure developed during liquefaction. The results highlight that the amplitude of axial strain prior to liquefaction has a significant influence on the post-liquefaction undrained monotonic response. In addition, DEM simulations have been carried out on an assembly of spheres to simulate post-liquefaction behaviour. The simulations closely followed the experiments, with the objective of understanding the monotonic strength of liquefied samples at the grain scale. The numerical simulations using DEM captured qualitatively all the features of the post-liquefaction undrained monotonic response in a manner similar to the experiments. In addition, a detailed study of the evolution of micromechanical parameters, such as the average coordination number and the induced anisotropy coefficients, during post-liquefaction undrained monotonic loading is reported.
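Two of the micromechanical quantities tracked in the DEM part of the study can be stated concretely. A generic sketch (not the authors' code) computes the average coordination number from the contact count and a scalar anisotropy coefficient from the deviator of the contact fabric tensor ⟨n ⊗ n⟩:

```python
import numpy as np

def avg_coordination_number(n_particles, n_contacts):
    # each contact is shared by two particles
    return 2.0 * n_contacts / n_particles

def fabric_anisotropy(contact_normals):
    """Scalar anisotropy from the deviatoric part of the fabric tensor <n (x) n>."""
    n = np.asarray(contact_normals, dtype=float)
    n /= np.linalg.norm(n, axis=1, keepdims=True)   # ensure unit normals
    phi = n.T @ n / len(n)                          # 3x3 fabric tensor
    dev = phi - np.trace(phi) / 3.0 * np.eye(3)
    return np.sqrt(1.5 * np.sum(dev**2))            # von Mises-like invariant

rng = np.random.default_rng(0)
normals = rng.normal(size=(5000, 3))                # isotropic sample -> anisotropy ~ 0
print(avg_coordination_number(2000, 5200), fabric_anisotropy(normals))
```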
Abstract:
This article deals with a simulation-based study of the impact of projectiles on thin aluminium plates using LS-DYNA, modelling the plates with shell elements and the projectiles with solid elements. In order to establish the required modelling criterion in terms of element size for the aluminium plates, a convergence study of residual velocity has been carried out by varying the mesh density in the impact zone. Using the preferred material and meshing criteria arrived at here, extremely good predictions of the test residual velocities and ballistic limits given by Gupta et al. (2001) for thin aluminium plates have been obtained. The simulation-based pattern of failure, with localized bulging and a jagged edge of perforation, is similar to the perforation with petalling seen in tests. A number of simulation-based parametric studies have been carried out, and results consistent with published test data have been obtained. Despite the robust correlation achieved against published experimental results, it was considered prudent to conduct our own experiments for a final correlation with the present modelling procedure and analysis using the explicit LS-DYNA 970 solver. Hence, a sophisticated ballistic impact testing facility and a high-speed camera have been used to conduct additional tests on grade 1100 aluminium plates of 1 mm thickness with projectiles of four different nose shapes. Finally, using the developed numerical simulation procedure, an excellent correlation of residual velocity and failure modes with the corresponding test results has been obtained.
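When comparing simulated and measured residual velocities, a standard way to summarize a perforation dataset is the Lambert–Jonas (Recht–Ipson-type) relation v_r = a(v_i^p − v_bl^p)^(1/p), where v_bl is the ballistic limit. The curve-fitting sketch below is generic, not the paper's procedure, and the velocity pairs are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def lambert_jonas(vi, a, vbl, p):
    # residual velocity model; zero below the ballistic limit vbl
    return a * np.maximum(vi**p - vbl**p, 0.0)**(1.0/p)

# illustrative impact/residual velocity pairs (m/s), not measured data
vi = np.array([90, 110, 130, 160, 200, 250], dtype=float)
vr = np.array([ 0,  35,  70, 110, 160, 215], dtype=float)

popt, _ = curve_fit(lambert_jonas, vi, vr, p0=[1.0, 95.0, 2.0])
a, vbl, p = popt
print(f"ballistic limit ~ {vbl:.0f} m/s, a = {a:.2f}, p = {p:.2f}")
```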
Abstract:
Flexible objects such as a rope or a snake move in a way such that their axial length remains almost constant. To simulate the motion of such an object, one strategy is to discretize the object into a large number of small rigid links connected by joints. However, the resulting discretized system is highly redundant, and the joint rotations for a desired Cartesian motion of any point on the object cannot be solved uniquely. In this paper, we revisit an algorithm, based on the classical tractrix curve, to resolve the redundancy in such hyper-redundant systems. For a desired motion of the 'head' of a link, the 'tail' is moved along a tractrix, and recursively all links of the discretized object are moved along different tractrix curves. The algorithm is illustrated by simulations of a moving snake, the tying of knots with a rope, and the solution of the inverse kinematics of a planar hyper-redundant manipulator. The simulations show that the tractrix-based algorithm leads to a more 'natural' motion, since the motion is distributed uniformly along the entire object with displacements diminishing from the 'head' to the 'tail'.
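A discrete version of the head-tail resolution is compact: for each small head displacement, every link's tail is pulled toward its updated head while the link length is preserved. This constant-length follow-the-leader update approximates tractrix motion for small steps; it is not the paper's closed-form tractrix solution, and the chain below is a made-up example.

```python
import numpy as np

def drag_chain(joints, new_head, link_len):
    """Propagate a head displacement down a chain of rigid links.

    joints: (n, 2) array; joints[0] is the head, consecutive pairs form links.
    """
    q = joints.copy()
    q[0] = new_head
    for i in range(1, len(q)):
        d = q[i] - q[i-1]                                 # tail seen from updated head
        q[i] = q[i-1] + link_len * d / np.linalg.norm(d)  # keep link length fixed
    return q

# straight 10-link "rope"; drag the head upward in small steps
chain = np.column_stack([np.arange(11.0), np.zeros(11)])
for _ in range(50):
    chain = drag_chain(chain, chain[0] + np.array([0.0, 0.05]), link_len=1.0)
print(chain[:3].round(2))   # displacement dies off from head to tail
```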
Abstract:
In many parts of the world, uncontrolled fires in sparsely populated areas are a major concern, as they can quickly grow into large and destructive conflagrations. Detecting these fires has traditionally been a job for trained humans on the ground or in the air. In many cases, these manned solutions are simply not able to survey the amount of area necessary to maintain sufficient vigilance and coverage. This paper investigates the use of unmanned aerial systems (UAS) for automated wildfire detection. The proposed system uses low-cost, consumer-grade electronics and sensors combined with various airframes to create a system suitable for automatic detection of wildfires. The system employs automatic image processing techniques to analyze captured images and autonomously detect fire-related features such as fire lines, burnt regions, and flammable material. The image recognition algorithm is designed to cope with environmental occlusions such as shadows, smoke and obstructions. Once the fire is identified and classified, it is used to initialize a spatial/temporal fire simulation. This simulation is based on occupancy maps whose fidelity can be varied to include stochastic elements, various types of vegetation, weather conditions, and unique terrain. The simulations can be used to predict the effects of optimized firefighting methods, to prevent the further propagation of the fires, and to reduce the time to detection of wildfires, thereby minimizing the ensuing damage. This paper also documents experimental flight tests conducted in Brisbane, Australia, using a SenseFly Swinglet UAS, as well as modifications for a custom UAS.
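The occupancy-map fire model lends itself to a short cellular-automaton sketch: each cell is unburnt, burning, or burnt, and unburnt cells ignite stochastically from burning neighbours with a probability modulated by local fuel. The values and burnout rule below are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

UNBURNT, BURNING, BURNT = 0, 1, 2

def step(state, fuel, p_ignite=0.3, rng=np.random.default_rng(0)):
    """One update of a stochastic fire-spread occupancy map."""
    burning = (state == BURNING)
    # count burning 4-neighbours of each cell
    nb = sum(np.roll(burning, s, axis=a) for s in (1, -1) for a in (0, 1))
    p = 1 - (1 - p_ignite * fuel) ** nb           # more burning neighbours -> likelier
    ignite = (state == UNBURNT) & (rng.random(state.shape) < p)
    new = state.copy()
    new[burning] = BURNT                          # burning cells burn out in one step
    new[ignite] = BURNING
    return new

fuel = np.random.default_rng(1).uniform(0.2, 1.0, (50, 50))  # heterogeneous vegetation
state = np.zeros((50, 50), dtype=int)
state[25, 25] = BURNING                           # detected ignition point
for _ in range(30):
    state = step(state, fuel)
print((state == BURNT).sum(), "cells burnt")
```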
Abstract:
Importance of the field: The shift in focus from ligand-based design approaches to target-based discovery over the last two to three decades has been a major milestone in drug discovery research. The field is currently witnessing another major paradigm shift, leaning towards holistic systems-based approaches rather than reductionist single-molecule-based methods. The effect of this new trend is likely to be felt strongly in terms of new strategies for therapeutic intervention, new targets individually and in combination, and the design of specific and safer drugs. Computational modeling and simulation form important constituents of new-age biology because they are essential to comprehend the large-scale data generated by high-throughput experiments and to generate hypotheses, which are typically iterated with experimental validation. Areas covered in this review: This review focuses on the repertoire of systems-level computational approaches currently available for target identification. The review starts with a discussion of levels of abstraction of biological systems and describes the different modeling methodologies available for this purpose. It then focuses on how such modeling and simulation can be applied to drug target discovery. Finally, it discusses methods for studying other important issues, such as understanding targetability, identifying target combinations and predicting drug resistance, and for considering them during the target identification stage itself. What the reader will gain: The reader will get an account of the various approaches to target discovery and the need for systems approaches, followed by an overview of the different modeling and simulation approaches that have been developed, along with an idea of the promise and limitations of these approaches and perspectives for future development. Take home message: Systems thinking has now come of age, enabling a 'bird's eye view' of the biological systems under study while allowing us to 'zoom in', where necessary, for a detailed description of individual components. A number of different methods available for computational modeling and simulation of biological systems can be used effectively for drug target discovery.
Abstract:
This paper presents two simple simulation and modelling tools designed to aid in the safety assessment required for unmanned aircraft operations within unsegregated airspace. First, a fast pair-wise encounter generator is derived to simulate the See and Avoid environment. The utility of the encounter generator is demonstrated through the development of a hybrid database and a statistical performance evaluation of an autonomous See and Avoid decision and control strategy. Second, an unmanned aircraft mission generator is derived to help visualise the impact of multiple persistent unmanned operations on existing air traffic. The utility of the mission generator is demonstrated through an example analysis of a mixed airspace environment using real traffic data in Australia. These simulation and modelling approaches constitute a useful and extensible set of analysis tools that can be leveraged to help explore some of the more fundamental and challenging problems facing civilian unmanned aircraft system integration.
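A pair-wise encounter generator can be sketched generically: sample a conflict point, closing geometry, and speeds, then back-propagate both aircraft so that, absent any avoidance manoeuvre, they arrive at the conflict point simultaneously. The parameters and flat 2D geometry below are illustrative assumptions, not those of the paper.

```python
import numpy as np

def sample_encounter(rng, t_conflict=60.0, speed_range=(50.0, 80.0)):
    """Initial states (pos, vel) for two aircraft that meet after t_conflict seconds."""
    conflict = rng.uniform(-500, 500, size=2)       # conflict point (m, flat 2D world)
    headings = rng.uniform(0, 2*np.pi, size=2)      # independent random tracks
    speeds = rng.uniform(*speed_range, size=2)      # m/s
    vels = np.stack([s*np.array([np.cos(h), np.sin(h)])
                     for s, h in zip(speeds, headings)])
    starts = conflict - vels * t_conflict           # fly backwards from the conflict
    return starts, vels

rng = np.random.default_rng(42)
starts, vels = sample_encounter(rng)
# sanity check: both aircraft reach the conflict point at t = 60 s
assert np.allclose(starts[0] + vels[0]*60, starts[1] + vels[1]*60)
```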
Abstract:
We investigate the ability of a global atmospheric general circulation model (AGCM) to reproduce observed 20 year return values of the annual maximum daily precipitation totals over the continental United States as a function of horizontal resolution. We find that at the high resolutions enabled by contemporary supercomputers, the AGCM can produce values of comparable magnitude to high quality observations. However, at the resolutions typical of the coupled general circulation models used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, the precipitation return values are severely underestimated.
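The 20-year return value used in such comparisons is the standard generalized extreme value (GEV) quantile of the annual-maximum series. A generic sketch with scipy (synthetic data standing in for the study's observations) is:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# synthetic annual maxima of daily precipitation (mm/day) at one grid cell
annual_max = stats.genextreme.rvs(c=-0.1, loc=40, scale=10, size=50, random_state=rng)

# fit a GEV distribution to the annual-maximum series
c, loc, scale = stats.genextreme.fit(annual_max)

# 20-year return value = quantile exceeded on average once every 20 years
rv20 = stats.genextreme.ppf(1 - 1/20, c, loc=loc, scale=scale)
print(f"20-year return value: {rv20:.1f} mm/day")
```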
Abstract:
An adaptive drug delivery design is presented in this paper using neural networks for the effective treatment of infectious diseases. The generic mathematical model used describes the coupled evolution of the concentrations of pathogens, plasma cells and antibodies, together with a numerical value that indicates the relative condition of an organ damaged by the disease, under the influence of external drugs. From a system-theoretic point of view, the external drugs can be interpreted as control inputs, which can be designed based on control-theoretic concepts. In this study, assuming a set of nominal parameters in the mathematical model, a nonlinear controller (drug administration plan) is first designed based on the principle of dynamic inversion. This nominal drug administration plan was found to be effective in curing "nominal model patients" (patients whose immunological dynamics conform exactly to the mathematical model used for the control design). However, it was found to be ineffective in general in curing "realistic model patients" (patients whose immunological dynamics may have off-nominal parameter values and possibly unwanted inputs). Hence, to make the drug dosage design more effective for realistic model patients, a model-following adaptive control design is carried out next with the help of neural networks that are trained online. Simulation studies indicate that the adaptive controller proposed in this paper holds promise for killing the invading pathogens and healing the damaged organ even in the presence of parameter uncertainties and continued pathogen attack. Note that the computational requirements for computing the control are very minimal, and all associated computations (including the training of the neural networks) can be carried out online. However, the approach assumes that the required diagnosis process can be carried out at a sufficiently fast rate, so that all the states are available for control computation.
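Dynamic inversion itself is compact enough to illustrate on a scalar control-affine system ẋ = f(x) + g(x)u: choosing u = (v − f(x))/g(x) with v = ẋ_ref − k(x − x_ref) cancels the known dynamics and imposes stable first-order error dynamics. The toy functions below are generic stand-ins, not the paper's immunological model.

```python
import numpy as np

def f(x):  return -0.5*x + x**2     # nominal drift (toy stand-in for pathogen dynamics)
def g(x):  return 1.0 + 0.1*x**2    # control effectiveness (must stay nonzero)

def dynamic_inversion(x, x_ref, dx_ref, k=2.0):
    v = dx_ref - k*(x - x_ref)      # desired error dynamics: de/dt = -k e
    return (v - f(x)) / g(x)        # cancel f and g exactly (nominal model assumed)

x, dt = 3.0, 0.01
for _ in np.arange(0, 5, dt):
    x_ref, dx_ref = 0.0, 0.0        # drive the state (e.g. pathogen load) to zero
    u = dynamic_inversion(x, x_ref, dx_ref)
    x += dt * (f(x) + g(x)*u)       # "nominal patient": plant matches the model
print(round(x, 4))                  # ~0: tracking error decays like exp(-k t)
```

For a "realistic patient" whose f and g deviate from the nominal model, the cancellation is imperfect, which is exactly the gap the paper's online-trained neural network is meant to close.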
Abstract:
Bluetooth is a short-range radio technology operating in the unlicensed industrial-scientific-medical (ISM) band at 2.45 GHz. A piconet is basically a collection of slaves controlled by a master. A scatternet, on the other hand, is established by linking several piconets together in an ad hoc fashion to yield a global wireless ad hoc network. This paper proposes a scheduling policy that aims to achieve increased system throughput and reduced packet delays while providing reasonably good fairness among all traffic flows in Bluetooth piconets and scatternets. We propose a novel algorithm for scheduling slots to slaves in both piconets and scatternets using multi-layered parameterized policies. Our scheduling scheme works with real data and obtains an optimal feedback policy within prescribed parameterized classes of policies by using an efficient two-timescale simultaneous perturbation stochastic approximation (SPSA) algorithm. We show the convergence of our algorithm to an optimal multi-layered policy. We also propose novel polling schemes for intra- and inter-piconet scheduling that are seen to perform well. We present an extensive set of simulation results and performance comparisons with existing scheduling algorithms. Our results indicate that our proposed scheduling algorithm outperforms existing algorithms overall on a wide range of experiments for both piconets (Das et al. in INFOCOM, pp. 591-600, 2001; Lapeyrie and Turletti in INFOCOM conference proceedings, San Francisco, US, 2003; Shreedhar and Varghese in SIGCOMM, pp. 231-242, 1995) and scatternets (Har-Shai et al. in OPNETWORK, 2002; Saha and Matsumoto in AICT/ICIW, 2006; Tan and Guttag in The 27th annual IEEE conference on local computer networks (LCN), Tampa, 2002). Our studies also confirm that our proposed scheme achieves high throughput and low packet delays with reasonable fairness among all the connections.
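The SPSA step at the heart of such an optimisation needs only two (possibly noisy) performance evaluations per iteration, regardless of the parameter dimension. Below is a generic one-timescale sketch (the paper uses a two-timescale variant) with a toy objective standing in for simulated packet delay; all gains and names are illustrative.

```python
import numpy as np

def spsa(J, theta, n_iter=200, a=0.1, c=0.1, seed=0):
    """Minimize a noisy objective J using simultaneous perturbation gradients."""
    rng = np.random.default_rng(seed)
    for k in range(1, n_iter + 1):
        ak, ck = a / k**0.602, c / k**0.101                # standard SPSA gain schedules
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher perturbation
        # two-sided gradient estimate from just two function evaluations
        ghat = (J(theta + ck*delta) - J(theta - ck*delta)) / (2*ck*delta)
        theta = theta - ak * ghat
    return theta

rng = np.random.default_rng(1)
J = lambda th: np.sum((th - 2.0)**2) + 0.01*rng.normal()   # noisy quadratic
print(spsa(J, np.zeros(4)).round(2))                       # approximately [2 2 2 2]
```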