879 results for Particle-based Model


Relevance: 80.00%

Abstract:

Simulations of a complete reflected shock tunnel facility have been performed with the aim of providing a better understanding of the flow through these facilities. In particular, the analysis is focused on the premature contamination of the test flow with the driver gas. The axisymmetric simulations model the full geometry of the shock tunnel, incorporate an iris-based model of the primary-diaphragm rupture mechanics and an ideal secondary diaphragm, and account for turbulence in the shock tube boundary layer with the Baldwin-Lomax eddy viscosity model. Two operating conditions were examined: one resulting in an over-tailored mode of operation and the other resulting in approximately tailored operation. The accuracy of the simulations is assessed through comparison with experimental measurements of static pressure, pitot pressure and stagnation temperature. It is shown that the widely accepted driver gas contamination mechanism, in which driver gas 'jets' along the walls through the action of the bifurcated foot of the reflected shock, does not directly transport the driver gas to the nozzle at these conditions. Instead, driver-gas-laden vortices are generated by the bifurcated reflected shock. These vortices prevent jetting of the driver gas along the walls and convect driver gas away from the shock tube wall and downstream into the nozzle. Additional vorticity generated by the interaction of the reflected shock and the contact surface enhances the process in the over-tailored case. However, the basic mechanism appears to operate in a similar way for both the over-tailored and the approximately tailored conditions.
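For context, the inner-layer form of the Baldwin-Lomax eddy-viscosity model mentioned above is (a standard formula from the turbulence-modelling literature, reproduced for reference rather than taken from the paper):

$$ \mu_{t,\mathrm{inner}} = \rho\, l^2 \lvert \omega \rvert, \qquad l = \kappa y \left( 1 - e^{-y^+/A^+} \right), $$

where $y$ is the distance from the wall, $\lvert\omega\rvert$ the local vorticity magnitude, $\kappa \approx 0.4$ the von Kármán constant and $A^+ \approx 26$ the van Driest damping constant; beyond a crossover distance an algebraic outer-layer expression takes over. Its appeal for large simulations like this one is that it requires no additional transport equations.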

Relevance: 80.00%

Abstract:

High-fidelity eye tracking is combined with a perceptual grouping task to provide insight into the likely mechanisms underlying the compensation of retinal image motion caused by movement of the eyes. The experiments describe the covert detection of minute temporal and spatial offsets incorporated into a test stimulus. Analysis of eye motion on individual trials indicates that the temporal offset sensitivity is actually due to motion of the eye inducing artificial spatial offsets in the briefly presented stimuli. The results have strong implications for two popular models of compensation for fixational eye movements, namely efference copy and image-based models. If an efference copy model is assumed, the results place constraints on the spatial accuracy and source of compensation. If an image-based model is assumed then limitations are placed on the integration time window over which motion estimates are calculated. (c) 2006 Elsevier Ltd. All rights reserved.
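To give a sense of scale for the mechanism proposed (the numbers here are illustrative, not taken from the paper): a fixational drift of speed $v$ converts a purely temporal offset $\Delta t$ between briefly presented stimulus elements into a retinal spatial offset

$$ \Delta x = v\, \Delta t, $$

so a drift of $1^\circ/\mathrm{s}$ acting across a $10\,\mathrm{ms}$ temporal offset displaces the image by $0.01^\circ \approx 0.6$ arcmin, which is on the scale of the minute spatial offsets such grouping tasks manipulate.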

Relevance: 80.00%

Abstract:

As field determinations take much effort, it would be useful to be able to predict easily the coefficients describing the functional response of free-living predators, the function relating food intake rate to the abundance of food organisms in the environment. As a means of easily parameterising an individual-based model of shorebird Charadriiformes populations, we attempted this for shorebirds eating macro-invertebrates. Intake rate is measured as the ash-free dry mass (AFDM) per second of active foraging; i.e. excluding time spent on digestive pauses and other activities, such as preening. The present and previous studies show that the general shape of the functional response in shorebirds eating approximately the same size of prey across the full range of prey density is a decelerating rise to a plateau, thus approximating the Holling type II ('disc equation') formulation. But field studies confirmed that the asymptote was not set by handling time, as assumed by the disc equation, because only about half the foraging time was spent in successfully or unsuccessfully attacking and handling prey, the rest being devoted to searching. A review of 30 functional responses showed that intake rate in free-living shorebirds varied independently of prey density over a wide range, with the asymptote being reached at very low prey densities (< 150 m⁻²). Accordingly, most of the many studies of shorebird intake rate have probably been conducted at or near the asymptote of the functional response, suggesting that equations that predict intake rate should also predict the asymptote. A multivariate analysis of 468 'spot' estimates of intake rates from 26 shorebirds identified ten variables, representing prey and shorebird characteristics, that accounted for 81% of the variance in logarithm-transformed intake rate. But four variables accounted for almost as much (77.3%), these being bird size, prey size, whether the bird was an oystercatcher Haematopus ostralegus eating mussels Mytilus edulis, and whether it was breeding. The four-variable equation under-predicted, on average, the observed 30 estimates of the asymptote by 11.6%, but this discrepancy was reduced to 0.2% when two suspect estimates from one early study in the 1960s were removed. The equation therefore predicted the observed asymptote very successfully in 93% of cases. We conclude that the asymptote can be reliably predicted from just four easily measured variables. Indeed, if the birds are not breeding and are not oystercatchers eating mussels, reliable predictions can be obtained using just two variables, bird and prey sizes. A multivariate analysis of 23 estimates of the half-asymptote constant suggested they were smaller when prey were small but greater when the birds were large, especially in oystercatchers. The resulting equation could be used to predict the half-asymptote constant, but its predictive power has yet to be tested. As well as predicting the asymptote of the functional response, the equations will enable research workers engaged in many areas of shorebird ecology and behaviour to estimate intake rate without the need for conventional time-consuming field studies, including species for which it has not yet proved possible to measure intake rate in the field.
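For reference, the Holling type II ('disc equation') form referred to above is the standard

$$ I(D) = \frac{a D}{1 + a h D}, $$

where $I$ is intake rate, $D$ prey density, $a$ the instantaneous search (attack) rate and $h$ the handling time per prey item. The asymptote is $1/h$ and the half-asymptote constant, the density at which intake reaches half its maximum, is $1/(ah)$; these are the two quantities the predictive equations discussed above target.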

Relevance: 80.00%

Abstract:

Traditional sensitivity and elasticity analyses of matrix population models have been used to inform management decisions, but they ignore the economic costs of manipulating vital rates. For example, the growth rate of a population is often most sensitive to changes in adult survival rate, but this does not mean that increasing that rate is the best option for managing the population because it may be much more expensive than other options. To explore how managers should optimize their manipulation of vital rates, we incorporated the cost of changing those rates into matrix population models. We derived analytic expressions for locations in parameter space where managers should shift between management of fecundity and survival, for the balance between fecundity and survival management at those boundaries, and for the allocation of management resources to sustain that optimal balance. For simple matrices, the optimal budget allocation can often be expressed as simple functions of vital rates and the relative costs of changing them. We applied our method to management of the Helmeted Honeyeater (Lichenostomus melanops cassidix; an endangered Australian bird) and the koala (Phascolarctos cinereus) as examples. Our method showed that cost-efficient management of the Helmeted Honeyeater should focus on increasing fecundity via nest protection, whereas optimal koala management should focus on manipulating both fecundity and survival simultaneously. These findings are contrary to the cost-negligent recommendations of elasticity analysis, which would suggest focusing on managing survival in both cases. A further investigation of Helmeted Honeyeater management options, based on an individual-based model incorporating density dependence, spatial structure, and environmental stochasticity, confirmed that fecundity management was the most cost-effective strategy. Our results demonstrate that decisions that ignore economic factors will reduce management efficiency.
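For reference, the classical sensitivity and elasticity formulas that the cost-aware analysis builds on are (standard matrix-model results, e.g. Caswell's, not specific to this paper):

$$ s_{ij} = \frac{\partial \lambda}{\partial a_{ij}} = \frac{v_i w_j}{\langle \mathbf{v}, \mathbf{w} \rangle}, \qquad e_{ij} = \frac{a_{ij}}{\lambda}\, s_{ij}, $$

where $\lambda$ is the dominant eigenvalue (population growth rate) of the projection matrix $\mathbf{A} = (a_{ij})$ and $\mathbf{w}$ and $\mathbf{v}$ are the corresponding right and left eigenvectors (stable stage distribution and reproductive values). Ranking management options by $s_{ij}$ or $e_{ij}$ alone implicitly assumes every $a_{ij}$ is equally cheap to change, which is exactly the assumption the study drops.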

Relevance: 80.00%

Abstract:

Three important goals in describing software design patterns are: generality, precision, and understandability. To address these goals, this paper presents an integrated approach to specifying patterns using Object-Z and UML. To achieve the generality goal, we adopt a role-based metamodeling approach to define patterns. With this approach, each pattern is defined as a pattern role model. To achieve precision, we formalize role concepts using Object-Z (a role metamodel) and use these concepts to define patterns (pattern role models). To achieve understandability, we represent the role metamodel and pattern role models visually using UML. Our pattern role models provide a precise basis for pattern-based model transformations or refactoring approaches.
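As a loose illustration of the role-based idea only (a hypothetical Python sketch; the paper's actual formalization is in Object-Z, and all names below are invented):

```python
from dataclasses import dataclass, field

@dataclass
class Role:
    """A named placeholder that a concrete model element may play."""
    name: str
    kind: str  # e.g. "class", "operation", "association"

@dataclass
class PatternRoleModel:
    """A pattern defined as a set of roles plus well-formedness constraints."""
    name: str
    roles: list = field(default_factory=list)
    constraints: list = field(default_factory=list)  # callables over a binding

    def conforms(self, binding: dict) -> bool:
        """A design conforms if every role is bound to some element and
        every constraint holds for that binding."""
        return (all(r.name in binding for r in self.roles)
                and all(check(binding) for check in self.constraints))

# Hypothetical use: a minimal Observer role model with one constraint.
observer = PatternRoleModel(
    name="Observer",
    roles=[Role("Subject", "class"), Role("Observer", "class")],
    constraints=[lambda b: b["Subject"] != b["Observer"]],
)
print(observer.conforms({"Subject": "Kettle", "Observer": "Display"}))  # True
```

The point of formalizing the role metamodel in Object-Z is precisely that such conformance conditions become provable properties rather than merely executable checks.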

Relevance: 80.00%

Abstract:

Knowledge maintenance is a major challenge for both knowledge management and the Semantic Web. Operating over the Semantic Web, there will be a network of collaborating agents, each with their own ontologies or knowledge bases. Change in the knowledge state of one agent may need to be propagated across a number of agents and their associated ontologies. The challenge is to decide how to propagate a change of knowledge state. The effects of a change in knowledge state cannot be known in advance, and so an agent cannot know who should be informed unless it adopts a simple ‘tell everyone – everything’ strategy. This situation is highly reminiscent of the classic Frame Problem in AI. We argue that for agent-based technologies to succeed, far greater attention must be given to creating an appropriate model for knowledge update. In a closed system, simple strategies are possible (e.g. ‘sleeping dog’ or ‘cheap test’ or even complete checking). However, in an open system where cause and effect are unpredictable, a coherent cost-benefit based model of agent interaction is essential. Otherwise, the effectiveness of every act of knowledge update/maintenance is brought into question.
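A minimal sketch of the cost-benefit style of update policy being argued for (hypothetical Python; the rule and names are assumptions, not the paper's model):

```python
def should_notify(p_affected: float, benefit: float, cost: float) -> bool:
    """Propagate a knowledge-state change to a peer only when the expected
    value of keeping it consistent outweighs the communication and
    re-checking cost -- the alternative to 'tell everyone everything'."""
    return p_affected * benefit > cost

def propagate(change, agents, estimate):
    """estimate(agent, change) -> (p_affected, benefit, cost) for that agent."""
    return [a for a in agents if should_notify(*estimate(a, change))]
```

The hard part, as the abstract notes, is that in an open system the inputs to `estimate` cannot be known in advance; this is what makes the problem reminiscent of the Frame Problem.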

Relevance: 80.00%

Abstract:

Cellular thiols are critical moieties in signal transduction and regulation of gene expression, and ultimately are determinants of specific protein activity. Whilst protein-bound thiols are the critical effector molecules, low molecular weight thiols, such as glutathione, play a central role in cytoprotection through (1) direct consumption of oxidants, (2) regeneration of protein thiols and (3) export of glutathione-containing mixed disulphides. The brain is particularly vulnerable to oxidative stress, as it consumes 20% of the total oxygen load, contains high concentrations of polyunsaturated fatty acids and iron in certain regions, and expresses low concentrations of enzymic antioxidants. There is substantial evidence for a role for oxidative stress in neurodegenerative disease, where excitotoxicity, redox cycling and mitochondrial dysfunction have been postulated to contribute to the enhanced oxidative load. Others have suggested that loss of important trophic factors may underlie neurodegeneration. However, the two are not mutually exclusive; in cell-based model systems, low molecular weight antioxidants have been shown to play an important neuroprotective role in vitro, and neurotrophic factors have been suggested to modulate glutathione levels. Glutathione levels are regulated by substrate availability, by synthetic and metabolic enzyme activity, and by the presence of other antioxidants, which, according to the redox potential, consume or regenerate GSH from its oxidised partner. We have therefore investigated the hypothesis that amyloid beta neurotoxicity is mediated by reactive oxygen species, and that trophic factor cytoprotection against oxidative stress is achieved through regulation of glutathione levels. Using PC12 cells as a model system, amyloid beta 25-35 caused a shift in DCF fluorescence after four hours in culture. This fluorescence shift was attenuated by both desferrioxamine and NGF. After four hours, cellular glutathione levels were depleted by as much as 75%; however, 24 hours following oxidant exposure, glutathione concentration was restored to twice the concentration seen in controls. NGF prevented the loss of viability seen after 24 hours of amyloid beta treatment and also protected glutathione levels. NGF decreased the total cellular glutathione concentration but did not affect expression of GCS. In conclusion, loss of glutathione precedes cell death in PC12 cells. However, at sublethal doses the surviving fraction responds to oxidative stress by increasing glutathione levels, and this is achieved, at least in part, at the gene level through upregulation of GCS. Whilst NGF does protect against oxidative toxicity, this is not achieved through upregulation of GCS or glutathione.

Relevance: 80.00%

Abstract:

This paper contributes to the literature on the intra-firm diffusion of innovations by investigating the factors that affect the firm's decision to adopt and use sets of complementary innovations. We define as complementary those innovations whose joint use generates superadditive gains, i.e. the gain from the joint adoption is higher than the sum of the gains derived from the adoption of each innovation in isolation. From a theoretical perspective, we present a simple decision model whereby the firm decides 'whether' and 'how much' to invest in each of the innovations under investigation, based upon the expected profit gain from each possible combination of adoption and use. The model shows how the extent of complementarity among the innovations can affect the firm's profit gains and therefore the likelihood that the firm will adopt these innovations jointly rather than individually. From an empirical perspective, we focus on four sets of management practices, namely operating (OMP), monitoring (MMP), targets (TMP) and incentives (IMP) management practices. We show that these sets of practices, although to different extents, are complementary to each other. Then, we construct a synthetic indicator of the depth of their use. The resulting intra-firm index is built to reflect not only the number of practices adopted but also the depth of their individual use and the extent of their complementarity. The empirical testing of the decision model is carried out using evidence from the adoption behaviour of a sample of 1,238 UK establishments present in the 2004 Workplace Employment Relations Survey (WERS). Our empirical results show that the intra-firm profitability-based model performs well in that it can explain more of the variability of joint adoption than models based upon the variability of adoption and use of individual practices. We also investigate whether a number of firm-specific and market characteristics, by affecting the size of the gains that the joint adoption of innovations can generate, drive the intensity of use of the four innovations. We find that establishment size, foreign ownership, exposure to an international market and the degree of homogeneity of the final product are important determinants of the intensity of the joint adoption of the four innovations. Most importantly, our results point out that the factors that the economics of innovation literature has shown to affect the intensity of use of a technological innovation also affect the intensity of use of sets of innovative management practices. However, they can explain only a small part of the diversity of their joint adoption and use by the firms in the sample.
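In symbols, the superadditivity condition used to define complementarity is the standard one (notation assumed here, not quoted from the paper): for two practices with adoption indicators $x_1, x_2 \in \{0, 1\}$ and profit function $\pi(x_1, x_2)$,

$$ \pi(1,1) - \pi(0,0) > \big[ \pi(1,0) - \pi(0,0) \big] + \big[ \pi(0,1) - \pi(0,0) \big], $$

i.e. the gain from adopting both practices jointly exceeds the sum of the gains from adopting each in isolation.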

Relevance: 80.00%

Abstract:

Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables which can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: The modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with ‘natural’ negative outputs and inputs. Journal of Operational Research Society 57 (11) 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of Operational Research Society 55 (10) 1111–1121]. A further example explores the advantages of using the new model.
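For orientation, the original CCR ratio model whose positivity assumption is being relaxed reads (standard formulation, reproduced for reference):

$$ \max_{u, v} \ \frac{\sum_r u_r y_{r0}}{\sum_i v_i x_{i0}} \quad \text{s.t.} \quad \frac{\sum_r u_r y_{rj}}{\sum_i v_i x_{ij}} \le 1 \ \ \forall j, \qquad u_r, v_i \ge 0, $$

where $x_{ij}$ and $y_{rj}$ are the inputs and outputs of unit $j$ and unit $0$ is the unit under evaluation. With negative entries the ratio loses its efficiency interpretation, which is why alternative treatments such as the semi-oriented radial measure are needed.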

Relevance: 80.00%

Abstract:

We develop a multi-agent-based model to simulate a population which comprises two ethnic groups and a peacekeeping force. We investigate the effects of different strategies for civilian movement on the resulting violence in this bi-communal population. Specifically, we compare and contrast random and race-based migration strategies. Race-based migration leads to the formation of clusters. Previous work in this area has shown that same-race clustering instigates violent behavior in otherwise passive segments of the population. Our findings confirm this. Furthermore, we show that in settings where only one of the two races adopts race-based migration, it is a winning strategy, especially in violently predisposed populations. On the other hand, in relatively peaceful settings clustering is a restricting factor which causes the race that adopts it to drift into annihilation. Finally, we show that when race-based migration is adopted as a strategy by both ethnic groups, it results in peaceful co-existence even in the most violently predisposed populations.
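A minimal sketch of the two movement strategies being compared (hypothetical Python; the grid interface and attribute names are assumptions):

```python
import random

def random_move(agent, empty_cells):
    """Random migration: relocate to any empty cell."""
    return random.choice(empty_cells)

def race_based_move(agent, empty_cells, neighbours_of):
    """Race-based migration: prefer the empty cell whose neighbourhood has
    the highest fraction of same-race agents (this is what drives the
    cluster formation described above)."""
    def same_race_fraction(cell):
        occupants = [a for a in neighbours_of(cell) if a is not None]
        if not occupants:
            return 0.0
        return sum(a.race == agent.race for a in occupants) / len(occupants)
    return max(empty_cells, key=same_race_fraction)
```

Here `neighbours_of` stands in for whatever neighbourhood lookup the grid provides.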

Relevance: 80.00%

Abstract:

We investigate the policies of (1) restricting social influence and (2) imposing curfews upon interacting citizens in a community. We compare and contrast their effects on the social order and the emerging levels of civil violence. Influence models have been used in the past in the context of decision making in a variety of application domains. The policy of curfews has been utilised with the aim of curbing social violence but little research has been done on its effectiveness. We develop a multi-agent-based model that is used to simulate a community of citizens and the police force that guards it. We find that restricting social influence does indeed pacify rebellious societies, but has the opposite effect on peaceful ones. On the other hand, our simple model indicates that restricting mobility through curfews has a pacifying effect across all types of society.
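A sketch of the kind of citizen rule such simulations typically use, in the spirit of Epstein-style civil-violence models (the functional form and parameter names are assumptions, not the paper's specification):

```python
def becomes_rebel(hardship, legitimacy, arrest_risk, threshold,
                  neighbour_anger=0.0, influence_weight=1.0):
    """A citizen turns violent when grievance, amplified by the visible
    anger of neighbours, outweighs the perceived risk of arrest.

    The two policies map naturally onto this rule: restricting social
    influence shrinks influence_weight, while a curfew shrinks the
    neighbourhood from which neighbour_anger and arrest_risk are sampled."""
    grievance = hardship * (1.0 - legitimacy)
    return grievance + influence_weight * neighbour_anger - arrest_risk > threshold
```

One reading of the opposing results above is that neighbour influence carries both contagion and calming effects, so suppressing it helps rebellious societies but hurts peaceful ones, whereas a curfew merely narrows exposure.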

Relevance: 80.00%

Abstract:

We overview our recent developments in the theory of dispersion-managed (DM) solitons within the context of optical applications. First, we present a class of localized solutions with a period multiple to that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced ordinary differential equation-based model, we discuss the key features of these structures, such as a smaller energy compared to traditional DM solitons with the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
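The underlying model is the nonlinear Schrödinger equation with periodically varying dispersion, which in standard dimensionless form reads (notation assumed):

$$ i\, \frac{\partial u}{\partial z} + \frac{d(z)}{2}\, \frac{\partial^2 u}{\partial t^2} + \lvert u \rvert^2 u = 0, \qquad d(z + L) = d(z), $$

where $u(z,t)$ is the pulse envelope, $z$ the propagation distance, $t$ retarded time and $d(z)$ the group-velocity dispersion with map period $L$. The standard DM soliton recovers its shape after each period $L$; the localized solutions described above instead recur only after an integer multiple of $L$.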

Relevance: 80.00%

Abstract:

We present a stochastic agent-based model for the distribution of personal incomes in a developing economy. We start with the assumption that incomes are determined both by individual labour and by stochastic effects of trading and investment. The income from personal effort alone is distributed about a mean, while the income from trade, which may be positive or negative, is proportional to the trader's income. These assumptions lead to a Langevin model with multiplicative noise, from which we derive a Fokker-Planck (FP) equation for the income probability density function (IPDF) and its variation in time. We find that high earners have a power-law income distribution, while the low-income groups have a Lévy IPDF. Comparing our analysis with Indian survey data (obtained from the World Bank website: http://go.worldbank.org/SWGZB45DN0) taken over many years, we obtain a near-perfect data collapse onto our model's equilibrium IPDF. Using survey data to relate the IPDF to actual food consumption, we define a poverty index (Sen A. K., Econometrica, 44 (1976) 219; Kakwani N. C., Econometrica, 48 (1980) 437), which is consistent with traditional indices but independent of an arbitrarily chosen "poverty line" and therefore less susceptible to manipulation. Copyright © EPLA, 2010.
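The structure described corresponds to a multiplicative-noise Langevin equation and its Fokker-Planck counterpart, shown here in generic Itô form for reference (the paper's exact coefficients may differ):

$$ \dot I = a(I) + b(I)\, \xi(t), \qquad \frac{\partial P(I,t)}{\partial t} = -\frac{\partial}{\partial I}\big[ a(I) P \big] + \frac{1}{2} \frac{\partial^2}{\partial I^2}\big[ b^2(I) P \big], $$

with $\xi(t)$ Gaussian white noise. Taking the trading/investment term proportional to income, $b(I) \propto I$, is the multiplicative ingredient that generically produces a power-law tail in the stationary distribution at high incomes.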

Relevance: 80.00%

Abstract:

This study investigates the critical role that opinion leaders (or influentials) play in the adoption process of new products. Recent research evidence indicates a limited effect of opinion leaders on diffusion processes, yet these studies take into account merely the network position of opinion leaders without addressing their influential power. Empirical findings of our study show that opinion leaders, in addition to having a more central network position, possess more accurate knowledge about a product and tend to be less susceptible to norms and more innovative. Experiments that address these attributes, using an agent-based model, demonstrate that opinion leaders increase the speed of the information stream and of the adoption process itself. Furthermore, they increase the maximum adoption percentage. These results indicate that targeting opinion leaders remains a valuable marketing strategy.
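A compact sketch of how the three reported attributes might enter an agent's adoption rule (hypothetical Python; the weighting scheme is an assumption, not the paper's model):

```python
def adopts(agent, neighbours) -> bool:
    """Threshold adoption: an agent adopts once the social signal from its
    neighbourhood clears its personal, norm-based threshold.

    Opinion leaders differ on the attributes reported above: more
    neighbours (central position), higher product_knowledge, lower
    susceptibility_to_norms and higher innovativeness -- so they both
    receive the signal earlier and act on it at a lower threshold."""
    if not neighbours:
        return False
    social_signal = sum(n.adopted for n in neighbours) / len(neighbours)
    threshold = agent.susceptibility_to_norms * (1.0 - agent.innovativeness)
    return agent.product_knowledge * social_signal > threshold
```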

Relevance: 80.00%

Abstract:

While conventional Data Envelopment Analysis (DEA) models set targets for each operational unit, this paper considers the problem of input/output reduction in a centralized decision-making environment. The purpose of this paper is to develop an approach to the input/output reduction problem that typically occurs in organizations with a centralized decision-making environment. This paper shows that DEA can make an important contribution to this problem and discusses how a DEA-based model can be used to determine an optimal input/output reduction plan. An application in the banking sector with limited IT investment shows the usefulness of the proposed method.
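Schematically, a centralized input-reduction model of this kind contracts the organization's total input consumption rather than each unit's separately; a simplified aggregate sketch, in the style of centralized DEA formulations such as Lozano and Villa's (the paper's actual model may differ), is

$$ \min_{\theta, \lambda} \ \theta \quad \text{s.t.} \quad \sum_j \lambda_j x_{ij} \le \theta \sum_j x_{ij} \ \ \forall i, \qquad \sum_j \lambda_j y_{rj} \ge \sum_j y_{rj} \ \ \forall r, \qquad \lambda_j \ge 0, $$

where $\theta$ is a common contraction factor on the total use of each input $i$ while aggregate outputs $r$ are preserved. A budget cap such as the IT-investment limitation mentioned above would enter as an additional constraint on the corresponding input total.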