19 results for Easy Java Simulations
Abstract:
Background: The evolutionary advantages of selective attention are unclear. Since the study of selective attention began, it has been suggested that the nervous system processes only the most relevant stimuli because of its limited capacity [1]. An alternative proposal is that action planning requires the inhibition of irrelevant stimuli, which forces the nervous system to limit its processing [2]. An evolutionary approach might provide additional clues to clarify the role of selective attention.

Methods: We developed Artificial Life simulations in which animals were repeatedly presented with two objects, "left" and "right", each of which could be "food" or "non-food". The animals' neural networks (multilayer perceptrons) had two input nodes, one for each object, and two output nodes that determined whether the animal ate each object. The networks also had a variable number of hidden nodes, which determined whether or not they had enough capacity to process both stimuli (Table 1). The evolutionary relevance of the left and right food objects could also vary, depending on how much ingesting them increased the animal's fitness (Table 1). We compared sensory processing in animals with or without limited capacity, evolved in simulations in which the objects had the same or different relevances.

Table 1. Nine sets of simulations were performed, varying the values of the food objects and the number of hidden nodes in the neural networks. The values of the left and right food objects were swapped during the second half of the simulations. Non-food objects were always worth -3.

The evolution of the neural networks was simulated by a simple genetic algorithm. Fitness was a function of the number of food and non-food objects each animal ate, and the chromosomes determined the node biases and synaptic weights. During each simulation, 10 populations of 20 individuals each evolved in parallel for 20,000 generations; then the relevance of the food objects was swapped and the simulation was run for another 20,000 generations. The neural networks were evaluated by their ability to identify the two objects correctly. The detectability (d') of the left and right objects was calculated using Signal Detection Theory [3].

Results and conclusion: When both stimuli were equally relevant, networks with two hidden nodes processed only one stimulus and ignored the other. With four or eight hidden nodes, they could correctly identify both stimuli. When the stimuli had different relevances, the d' for the most relevant stimulus was higher than the d' for the least relevant stimulus, even when the networks had four or eight hidden nodes. We conclude that selection mechanisms arose in our simulations depending not only on the size of the neural networks but also on the stimuli's relevance for action.
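The detectability measure used in this abstract can be sketched in a few lines. Under standard Signal Detection Theory, d' is the z-score of the hit rate minus the z-score of the false-alarm rate; the function name and the rate correction below are illustrative assumptions, not the authors' code:

```python
# Illustrative sketch of a Signal Detection Theory d' computation,
# treating "ate a food object" as a hit and "ate a non-food object"
# as a false alarm. Not the authors' implementation.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), from raw trial counts."""
    z = NormalDist().inv_cdf
    # Log-linear correction (an assumption here) keeps z finite
    # when a rate would otherwise be exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return z(hit_rate) - z(fa_rate)

# An animal eating food on 90 of 100 food trials but non-food on only
# 10 of 100 non-food trials discriminates the object well (d' ~ 2.5):
print(round(d_prime(90, 10, 10, 90), 2))
```

A network that ignores one stimulus responds identically on its food and non-food trials, so hit rate equals false-alarm rate and d' is 0 for that object.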
Abstract:
Industrial recurrent event data, in which an event of interest can be observed more than once in a single sample unit, arise in several areas, such as engineering, manufacturing and industrial reliability. Such data provide information about the number of events, the time to their occurrence and also their costs. Nelson (1995) presents a methodology to obtain asymptotic confidence intervals for the cost and the number of cumulative recurrent events. Although this is a standard procedure, it may not perform well in some situations, in particular when the available sample size is small. In this context, computer-intensive methods such as the bootstrap can be used to construct confidence intervals. In this paper, we propose a technique based on the bootstrap method to obtain interval estimates for the cost and the number of cumulative events. One of the advantages of the proposed methodology is the possibility of applying it in several areas, as well as its easy computational implementation. In addition, according to some Monte Carlo simulations, it can be a better alternative to asymptotic-based methods for calculating confidence intervals. An example from the engineering area illustrates the methodology.
Abstract:
This work aims to study urban heat islands in the northern region of Paraná state, Brazil, and the influence of land use and urban settlements on the intensity and frequency of occurrence of these events. Through atmospheric modeling with the WRF/Chem model, two simulations were run with different land use files: one with the original land use and another obtained from a composition of MODIS-Landsat imagery. The simulations showed good skill compared to observed data. Urban areas presented higher temperatures. The Landsat-based land use represented urban heat islands (UHI) better: the gradient between urban and rural areas was well demonstrated, and the correlation coefficient was above 0.92. In both simulations, the model underestimated the maximum values and overestimated the minimum values compared with observed data.
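The kind of model-versus-observation comparison reported above (a correlation coefficient, plus under- and over-estimation of extremes) can be sketched generically. The temperature values below are invented for illustration and are not the study's data:

```python
# Minimal sketch of model evaluation against station observations:
# Pearson correlation and mean bias. All values are made up.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

observed = [22.1, 24.3, 27.8, 30.2, 28.5, 25.0]   # station temperatures (°C)
simulated = [22.8, 24.0, 26.9, 29.1, 28.0, 25.4]  # model output, damped extremes
bias = sum(s - o for s, o in zip(simulated, observed)) / len(observed)
print(round(pearson_r(observed, simulated), 3), round(bias, 2))
```

Note that a high correlation can coexist with systematic errors in the extremes, which is why bias-type metrics are usually reported alongside it.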
Abstract:
The objective of this work was to apply and provide a preliminary evaluation of the performance of the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) for the Londrina region. We performed comparisons with measurements obtained at meteorological stations. The model was configured to run with three domains of 27, 9 and 3 km grid resolution, using the ndown program; a simulation was also performed with the model configured to run with a single domain, using a land use file based on a classified MODIS image of the region. The emission files supplying the chemistry run were generated based on the work of Martins et al. (2012). The RADM2 chemical mechanism and the MADE/SORGAM modal aerosol model were used in the simulations. The results demonstrated that the model was able to coherently represent the formation and dispersion of pollution in the Metropolitan Region of Londrina, as well as the importance of using an appropriate land use file for the region.