920 results for deterministic bispectrum


Relevance: 10.00%

Abstract:

Ex vivo hematopoiesis is increasingly used for clinical applications. Models of ex vivo hematopoiesis are required to better understand the complex dynamics and to optimize hematopoietic culture processes. A general mathematical modeling framework is developed which uses traditional chemical engineering metaphors to describe the complex hematopoietic dynamics. Tanks and tubular reactors are used to describe the (pseudo-)stochastic and deterministic elements of hematopoiesis, respectively. Cells at any point in the differentiation process can belong to either an immobilized, inert phase (quiescent cells) or a mobile, active phase (cycling cells). The model describes five processes: (1) flow (differentiation), (2) autocatalytic formation (growth), (3) degradation (death), (4) phase transition from immobilized to mobile phase (quiescent to cycling transition), and (5) phase transition from mobile to immobilized phase (cycling to quiescent transition). The modeling framework is illustrated with an example concerning the effect of TGF-beta 1 on erythropoiesis. (C) 1998 Published by Elsevier Science Ltd. All rights reserved.
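
As a rough illustration only (not the paper's tank/tubular-reactor formulation, and with placeholder rate constants), the deterministic skeleton of such a two-phase, multi-stage model can be sketched as a small ODE system:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal deterministic caricature: three maturation stages, each split into a
# quiescent (Q, "immobilized") and a cycling (C, "mobile") compartment.
# All rate constants below are hypothetical placeholders, not values from the paper.
k_act, k_deact = 0.05, 0.02      # Q -> C and C -> Q phase transitions (1/h)
p, d, k_diff = 0.04, 0.01, 0.03  # growth, death, differentiation flow (1/h)
n_stages = 3

def rhs(t, y):
    Q, C = y[:n_stages], y[n_stages:]
    dQ = -k_act * Q + k_deact * C - d * Q
    dC = k_act * Q - k_deact * C + (p - d - k_diff) * C
    dC[1:] += k_diff * C[:-1]    # differentiation inflow from the previous stage
    return np.concatenate([dQ, dC])

y0 = np.array([1000.0, 0, 0, 100.0, 0, 0])   # cells start in the first stage
sol = solve_ivp(rhs, (0.0, 200.0), y0, t_eval=np.linspace(0.0, 200.0, 101))
```

The stochastic ("tank") elements of the framework would replace these smooth flows with random transitions of individual cells; the sketch only shows how the five processes fit together.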

Relevance: 10.00%

Abstract:

Realistic time frames in which management decisions are made often preclude the completion of the detailed analyses necessary for conservation planning. Under these circumstances, efficient alternatives may assist in approximating the results of more thorough studies that require extensive resources and time. We outline a set of concepts and formulas that may be used in lieu of detailed population viability analyses and habitat modeling exercises to estimate the protected areas required to provide desirable conservation outcomes for a suite of threatened plant species. We used expert judgment of parameters and an assessment, based on simple dynamic models, of the population size that results in a specified quasi-extinction risk. The area required to support a population of this size is then adjusted to take into account deterministic and stochastic human influences, including small-scale disturbance, deterministic trends such as habitat loss, and changes in population density through processes such as predation and competition. We set targets for different disturbance regimes and geographic regions. We applied our methods to Banksia cuneata, Boronia keysii, and Parsonsia dorrigoensis, resulting in target areas for conservation of 1102, 733, and 1084 ha, respectively. These results provide guidance on target areas and priorities for conservation strategies.
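
A minimal sketch of how such a calculation might be structured (the model form, parameter values, and adjustment factor below are hypothetical placeholders, not the formulas or data used in the study):

```python
import numpy as np

def quasiextinction_risk(n0, years=50, growth_sd=0.15, threshold=50,
                         n_sims=5000, seed=0):
    """Monte-Carlo estimate of the probability that a population starting at n0
    falls below `threshold` within `years`, under a simple stochastic
    exponential-growth model with lognormal environmental noise."""
    rng = np.random.default_rng(seed)
    log_growth = rng.normal(0.0, growth_sd, size=(n_sims, years))
    trajectories = n0 * np.exp(np.cumsum(log_growth, axis=1))
    return float(np.mean(trajectories.min(axis=1) < threshold))

def target_area(density_per_ha, risk_ceiling=0.05, disturbance_factor=1.5):
    """Smallest population whose quasi-extinction risk is below the ceiling,
    converted to area and inflated by a crude factor standing in for the
    deterministic and stochastic human influences mentioned above."""
    n0 = 100
    while quasiextinction_risk(n0) > risk_ceiling:
        n0 = int(n0 * 1.5)
    return disturbance_factor * n0 / density_per_ha

area_ha = target_area(density_per_ha=2.0)    # hectares, under these toy numbers
```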

Relevance: 10.00%

Abstract:

The majority of past and current individual-tree growth modelling methodologies have failed to characterise and incorporate structured stochastic components. Rather, they have relied on deterministic predictions or have added an unstructured random component to predictions. In particular, spatial stochastic structure has been neglected, despite being present in most applications of individual-tree growth models. Spatial stochastic structure (also called spatial dependence or spatial autocorrelation) eventuates when spatial influences such as competition and micro-site effects are not fully captured in models. Temporal stochastic structure (also called temporal dependence or temporal autocorrelation) eventuates when a sequence of measurements is taken on an individual tree over time, and variables explaining temporal variation in these measurements are not included in the model. Nested stochastic structure eventuates when measurements are combined across sampling units and differences among the sampling units are not fully captured in the model. This review examines spatial, temporal, and nested stochastic structure and instances where each has been characterised in the forest biometry and statistical literature. Methodologies for incorporating stochastic structure in growth model estimation and prediction are described. Benefits from incorporation of stochastic structure include valid statistical inference, improved estimation efficiency, and more realistic and theoretically sound predictions. It is proposed in this review that individual-tree modelling methodologies need to characterise and include structured stochasticity. Possibilities for future research are discussed. (C) 2001 Elsevier Science B.V. All rights reserved.
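
As a toy illustration of the temporal case (coefficients and error structure are hypothetical, not taken from the review), the sketch below simulates repeated increment measurements with AR(1) errors and shows that residuals from a purely deterministic fit remain autocorrelated:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trees, n_years, rho, sigma = 200, 10, 0.6, 0.3
a, b = 1.0, -0.02                     # hypothetical growth coefficients

diam = rng.uniform(10, 40, n_trees)   # initial diameters
err = np.zeros(n_trees)
rows = []                             # (diameter, increment) observations, year by year
for _ in range(n_years):
    err = rho * err + rng.normal(0, sigma, n_trees)   # AR(1) temporal structure
    inc = a + b * diam + err
    rows.append(np.column_stack([diam, inc]))
    diam = diam + inc
data = np.vstack(rows)

# Deterministic fit (ordinary least squares) that ignores the temporal dependence
X = np.column_stack([np.ones(len(data)), data[:, 0]])
beta, *_ = np.linalg.lstsq(X, data[:, 1], rcond=None)
resid = (data[:, 1] - X @ beta).reshape(n_years, n_trees)

# Lag-1 autocorrelation of within-tree residuals: clearly non-zero, so the usual
# independence-based standard errors would be too optimistic.
lag1 = np.corrcoef(resid[:-1].ravel(), resid[1:].ravel())[0, 1]
print(f"residual lag-1 autocorrelation: {lag1:.2f}")
```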

Relevance: 10.00%

Abstract:

The traditional theory of price index numbers is based on the law of one price. In the real world, however, we frequently observe an equilibrium price dispersion rather than a single equilibrium price. This article discusses the effects of price dispersion on two price indexes: the cost of living index and the consumer price index. With price dispersion and consumers searching for the lowest price, these indexes cannot be interpreted as deterministic indicators but rather as stochastic ones, and they can be biased if price dispersion is not taken into account. A measure of the bias of the consumer price index is proposed, and the article ends with an estimation of the bias based on data from the consumer price index calculated for the city of São Paulo, Brazil, from January 1988 through December 2004. The period analysed is particularly interesting because it covers two very different inflationary environments: high and volatile inflation with great price dispersion until July 1994, and low, relatively stable inflation with less dispersed prices from August 1994 onward.
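
A small numerical illustration of the qualitative point (not the bias measure proposed in the article; the lognormal dispersion and the assumption that consumers sample three stores are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n_stores, k_searched = 200, 3       # consumers buy at the cheapest of 3 sampled stores

for spread in (0.05, 0.30):         # low vs high price dispersion
    prices = 100 * np.exp(rng.normal(0, spread, n_stores))    # posted prices
    posted_mean = prices.mean()     # what an index built from posted prices records
    samples = rng.choice(prices, size=(100_000, k_searched))
    paid_mean = samples.min(axis=1).mean()                    # average price actually paid
    print(f"spread={spread:.2f}: posted mean {posted_mean:6.2f}, "
          f"paid mean {paid_mean:6.2f}, gap {posted_mean - paid_mean:5.2f}")
```

The gap between the average posted price and the average transaction price widens as dispersion increases, which is the sense in which an index that ignores search and dispersion can be biased.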

Relevance: 10.00%

Abstract:

We consider a kinetic Ising model which represents a generic agent-based model for various types of socio-economic systems. We study the case of a finite (and not necessarily large) number of agents N as well as the asymptotic case when the number of agents tends to infinity. The main ingredients are individual decision thresholds which are either fixed over time (corresponding to quenched disorder in the Ising model, leading to nonlinear deterministic dynamics which are generically non-ergodic) or which may change randomly over time (corresponding to annealed disorder, leading to ergodic dynamics). We address the question of how increasing the strength of annealed disorder relative to quenched disorder drives the system from non-ergodic behavior to ergodicity. Mathematically rigorous analysis provides an explicit and detailed picture for arbitrary realizations of the quenched initial thresholds, revealing an intriguing "jumpy" transition from non-ergodicity with many absorbing sets to ergodicity. For large N we find a critical strength of annealed randomness, above which the system becomes asymptotically ergodic. Our theoretical results suggest how to drive a system from an undesired socio-economic equilibrium (e.g. a high level of corruption) to a desirable one (a low level of corruption).
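
A toy threshold model in this spirit (an illustrative simplification, not the paper's exact dynamics) can contrast quenched and annealed disorder:

```python
import numpy as np

def simulate(n_agents=100, steps=20_000, p_annealed=0.0, seed=0):
    """Agent i adopts state +1 when the current fraction of +1 agents exceeds its
    personal threshold theta_i, otherwise -1. Quenched disorder: thresholds are
    drawn once and kept fixed; annealed disorder: with probability p_annealed an
    agent redraws its threshold before each update."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 1.0, n_agents)     # quenched thresholds
    state = rng.choice([-1, 1], n_agents)
    for _ in range(steps):
        i = rng.integers(n_agents)
        if rng.random() < p_annealed:
            theta[i] = rng.uniform(0.0, 1.0)    # annealed: threshold changes over time
        state[i] = 1 if np.mean(state == 1) > theta[i] else -1
    return float(np.mean(state == 1))

# With p_annealed = 0 the dynamics typically lock into a configuration determined by
# the initial thresholds and states; enough annealed randomness keeps the system moving.
quenched = [simulate(p_annealed=0.0, seed=s) for s in range(3)]
annealed = [simulate(p_annealed=0.5, seed=s) for s in range(3)]
```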

Relevance: 10.00%

Abstract:

Minimal perfect hash functions are used for memory efficient storage and fast retrieval of items from static sets. We present an infinite family of efficient and practical algorithms for generating order preserving minimal perfect hash functions. We show that almost all members of the family construct space and time optimal order preserving minimal perfect hash functions, and we identify the one with minimum constants. Members of the family generate a hash function in two steps. First a special kind of function into an r-graph is computed probabilistically. Then this function is refined deterministically to a minimal perfect hash function. We give strong theoretical evidence that the first step uses linear random time. The second step runs in linear deterministic time. The family not only has theoretical importance, but also offers the fastest known method for generating perfect hash functions.
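
For illustration, a sketch of the widely used r = 2 member of this family, in which each key becomes an edge of a random graph: the probabilistic step retries until the graph is acyclic, and the deterministic step labels vertices so that the hash of the i-th key is i. Function names and constants here are ours, not the paper's.

```python
import random
from collections import defaultdict

def build_order_preserving_mph(keys, c=2.1, max_tries=100):
    """Order-preserving minimal perfect hash for distinct `keys`:
    h(keys[i]) == i, with h(key) = (g[h1(key)] + g[h2(key)]) % n."""
    n = len(keys)
    m = max(int(c * n) + 1, 3)                   # vertices; c > 2 keeps acyclicity likely
    for _ in range(max_tries):
        s1, s2 = random.random(), random.random()
        h1 = lambda k, s=s1: hash((s, k)) % m    # salted built-in hash stands in for
        h2 = lambda k, s=s2: hash((s, k, 2)) % m # the paper's random hash functions
        adj = defaultdict(list)                  # vertex -> [(neighbour, key index)]
        ok = True
        for i, k in enumerate(keys):
            u, v = h1(k), h2(k)
            if u == v:                           # self-loop: graph cannot be acyclic
                ok = False
                break
            adj[u].append((v, i))
            adj[v].append((u, i))
        if not ok:
            continue
        g, acyclic = {}, True                    # deterministic labelling step
        for root in list(adj):
            if root in g or not acyclic:
                continue
            g[root] = 0
            stack = [(root, None)]
            while stack and acyclic:
                u, parent = stack.pop()
                skipped = False
                for v, val in adj[u]:
                    if (v, val) == parent and not skipped:
                        skipped = True           # ignore the edge we arrived by, once
                        continue
                    if v in g:
                        acyclic = False          # cycle: retry with new hash functions
                        break
                    g[v] = (val - g[u]) % n      # ensures g[u] + g[v] == val (mod n)
                    stack.append((v, (u, val)))
        if acyclic:
            return lambda k: (g[h1(k)] + g[h2(k)]) % n   # defined only on `keys`
    raise RuntimeError("no acyclic graph found; increase c or max_tries")

keys = ["alpha", "beta", "gamma", "delta"]
h = build_order_preserving_mph(keys)
assert [h(k) for k in keys] == [0, 1, 2, 3]
```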

Relevance: 10.00%

Abstract:

Philosophers have long been fascinated by the connection between cause and effect: are 'causes' things we can experience, or are they concepts provided by our minds? The study of causation goes back to Aristotle, but resurged with David Hume and Immanuel Kant, and is now one of the most important topics in metaphysics. Most of the recent work done in this area has attempted to place causation in a deterministic, scientific worldview. But what about the unpredictable and chancy world we actually live in: can one theory of causation cover all instances of cause and effect? Cause and Chance: Causation in an Indeterministic World is a collection of specially written papers by world-class metaphysicians. Its focus is the problem facing the 'reductionist' approach to causation: the attempt to cover all types of causation, deterministic and indeterministic, with one basic theory.

Relevance: 10.00%

Abstract:

In this paper we discuss implicit Taylor methods for stiff Ito stochastic differential equations. Based on the relationship between Ito stochastic integrals and backward stochastic integrals, we introduce three implicit Taylor methods: the implicit Euler-Taylor method with strong order 0.5, the implicit Milstein-Taylor method with strong order 1.0 and the implicit Taylor method with strong order 1.5. The mean-square stability properties of the implicit Euler-Taylor and Milstein-Taylor methods are much better than those of the corresponding semi-implicit Euler and Milstein methods and these two implicit methods can be used to solve stochastic differential equations which are stiff in both the deterministic and the stochastic components. Numerical results are reported to show the convergence properties and the stability properties of these three implicit Taylor methods. The stability analysis and numerical results show that the implicit Euler-Taylor and Milstein-Taylor methods are very promising methods for stiff stochastic differential equations.
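
For context, here is a minimal sketch of the standard drift-implicit (semi-implicit) Euler-Maruyama scheme, the baseline the abstract compares against, applied to a stiff linear test equation; the paper's implicit Euler-Taylor methods additionally treat the stochastic term implicitly via backward Ito integrals and are not reproduced here:

```python
import numpy as np

def drift_implicit_euler(a=-50.0, b=1.0, x0=1.0, T=1.0, n=200, seed=0):
    """Drift-implicit Euler-Maruyama for the stiff linear Ito SDE
    dX = a*X dt + b*X dW: the drift is taken at the new time level, the
    diffusion at the old one, so each step solves a (here linear) equation."""
    rng = np.random.default_rng(seed)
    h = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        dW = rng.normal(0.0, np.sqrt(h))
        # x[k+1] = x[k] + a*x[k+1]*h + b*x[k]*dW, solved for x[k+1]
        x[k + 1] = (x[k] + b * x[k] * dW) / (1.0 - a * h)
    return x

path = drift_implicit_euler()
```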

Relevance: 10.00%

Abstract:

In this paper we discuss implicit methods based on stiffly accurate Runge-Kutta methods and splitting techniques for solving Stratonovich stochastic differential equations (SDEs). Two splitting techniques, the balanced splitting technique and the deterministic splitting technique, are used in this paper. We construct a two-stage implicit Runge-Kutta method with strong order 1.0 which is corrected twice and requires no update. The stability properties and numerical results show that this approach is suitable for solving stiff SDEs. (C) 2001 Elsevier Science B.V. All rights reserved.

Relevance: 10.00%

Abstract:

A deterministic mathematical model which predicts the probability of developing a new drug-resistant parasite population within the human host is reported. The model incorporates the host's specific antibody response to PfEMP1, and also investigates the influence of chemotherapy on the probability of developing a viable drug-resistant parasite population within the host. Results indicate that early treatment, and a high antibody threshold coupled with a long lag time between antibody stimulation and activity, are risk factors which increase the likelihood of developing a viable drug-resistant parasite population. High parasite mutation rates and fast PfEMP1 var gene switching are also identified as risk factors. The model output allows the relative importance of the various risk factors, as well as the relationships between them, to be established, thereby increasing the understanding of the conditions which favour the development of a new drug-resistant parasite population.

Relevance: 10.00%

Abstract:

Objectives: This study examines human scalp electroencephalographic (EEG) data for evidence of non-linear interdependence between posterior channels. The spectral and phase properties of those epochs of EEG exhibiting non-linear interdependence are studied. Methods: Scalp EEG data were collected from 40 healthy subjects. A technique for the detection of non-linear interdependence was applied to 2.048 s segments of posterior bipolar electrode data. Amplitude-adjusted phase-randomized surrogate data were used to statistically determine which EEG epochs exhibited non-linear interdependence. Results: Statistically significant evidence of non-linear interactions was found in 2.9% (eyes open) to 4.8% (eyes closed) of the epochs. In the eyes-open recordings, these epochs exhibited a peak in the spectral and cross-spectral density functions at about 10 Hz. Two types of EEG epochs are evident in the eyes-closed recordings: one type exhibits a peak in the spectral density and cross-spectrum at 8 Hz, while the other has increased spectral and cross-spectral power across faster frequencies. Epochs identified as exhibiting non-linear interdependence display a tendency towards phase interdependencies across and between a broad range of frequencies. Conclusions: Non-linear interdependence is detectable in a small number of multichannel EEG epochs, and makes a contribution to the alpha rhythm. Non-linear interdependence produces spatially distributed activity that exhibits phase synchronization between oscillations present at different frequencies. The possible physiological significance of these findings is discussed with reference to the dynamical properties of neural systems and the role of synchronous activity in the neocortex. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
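
For reference, a minimal sketch of plain phase-randomized surrogates; the study used the amplitude-adjusted variant, which adds a rank-ordered rescaling onto the original amplitude distribution that is omitted here:

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Surrogate series with the same power spectrum as x but random Fourier
    phases, i.e. consistent with a linear stochastic process."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng() if rng is None else rng
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    Xs = np.abs(X) * np.exp(1j * phases)
    Xs[0] = X[0]                        # keep the mean
    if x.size % 2 == 0:
        Xs[-1] = np.abs(X[-1])          # keep the Nyquist term real
    return np.fft.irfft(Xs, n=x.size)
```

An epoch's interdependence statistic is then compared with the distribution of the same statistic over an ensemble of such surrogates; values falling outside that distribution indicate dynamics not accounted for by a linear stochastic process.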

Relevance: 10.00%

Abstract:

Why does species richness vary so greatly across lineages? Traditionally, variation in species richness has been attributed to deterministic processes, although it is equally plausible that it may result from purely stochastic processes. We show that, based on the best available phylogenetic hypothesis, the pattern of cladogenesis among agamid lizards is not consistent with a random model, with some lineages having more species, and others fewer species, than expected by chance. We then use phylogenetic comparative methods to test six types of deterministic explanation for variation in species richness: body size, life history, sexual selection, ecological generalism, range size and latitude. Of eight variables we tested, only sexual size dimorphism and sexual dichromatism predicted species richness. Increases in species richness are associated with increases in sexual dichromatism but reductions in sexual size dimorphism. Consistent with recent comparative studies, we find no evidence that species richness is associated with small body size or high fecundity. Equally, we find no evidence that species richness covaries with ecological generalism, latitude or range size.

Relevance: 10.00%

Abstract:

We analyze the sequences of round-off errors of the orbits of a discretized planar rotation, from a probabilistic angle. It was shown [Bosio & Vivaldi, 2000] that for a dense set of parameters, the discretized map can be embedded into an expanding p-adic dynamical system, which serves as a source of deterministic randomness. For each parameter value, these systems can generate infinitely many distinct pseudo-random sequences over a finite alphabet, whose average period is conjectured to grow exponentially with the bit-length of the initial condition (the seed). We study some properties of these symbolic sequences, deriving a central limit theorem for the deviations between round-off and exact orbits, and obtain bounds concerning repetitions of words. We also explore some asymptotic problems computationally, verifying, among other things, that the occurrence of words of a given length is consistent with that of an abstract Bernoulli sequence.
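
The abstract does not restate the map, but a commonly studied lattice discretization of a planar rotation (an assumption here, not taken from the text) is (x, y) -> (floor(lambda*x) - y, x) with lambda = 2*cos(2*pi*nu); the sketch below records the per-step round-off errors whose accumulated deviations the central limit theorem concerns:

```python
import math

def roundoff_errors(x0, y0, nu, n_steps):
    """Iterate the assumed discretized rotation on the integer lattice and return
    the round-off error committed at each step (each error lies in (-1, 0])."""
    lam = 2.0 * math.cos(2.0 * math.pi * nu)   # exact map (x, y) -> (lam*x - y, x)
    x, y = x0, y0
    errs = []
    for _ in range(n_steps):
        rounded = math.floor(lam * x)          # discretization keeps the orbit in Z^2
        errs.append(rounded - lam * x)
        x, y = rounded - y, x
    return errs

errs = roundoff_errors(x0=7, y0=3, nu=0.2, n_steps=1000)
```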

Relevance: 10.00%

Abstract:

The branch of engineering responsible for structural design is constantly searching for the solution that best satisfies several simultaneous criteria, such as aesthetics, cost, quality and weight, among others. In practice, one cannot claim that the best design was actually built, because designs are produced mainly on the basis of the designer's experience, without exhausting all possible alternatives. It is in this sense that optimization procedures become necessary in structural design. Given a stated objective, such as cost, it is possible to obtain the design that best satisfies that criterion. Some studies exist in this area, but further research is still needed. One advancing front in structural optimization is the design of columns according to ABNT NBR 6118:2014 covering a wider range of possible geometries. It is also necessary to determine which of the many optimization methods currently available is best suited to this type of problem. Accordingly, the present work provides the conceptual grounding on column design and optimization methods in its literature review, indicating the references and methods used in the optimized column design software, programmed with the aid of MATLAB and its toolboxes, using deterministic optimization methods. This research was carried out to obtain the degree of Master in Civil Engineering at the Universidade Federal do Espírito Santo.

Relevance: 10.00%

Abstract:

This paper explores the main determinants of the use of the cost accounting system (CAS) in Portuguese local government (PLG). Regression analysis is used to study the fit of a model of accounting changes in PLG, focused on cost accounting systems oriented to activities and outputs. Based on survey data gathered from PLG, we find that the use of this information in decision-making and external reporting is still a mirage. We obtain evidence about the influence of the internal organizational context (especially the lack of support and difficulties in the CAS implementation) on use for internal purposes, while the institutional environment (such as external pressures to implement the CAS) appears to be a stronger determinant of external use. The results reinforce the role of external reporting in legitimating the organization's activities to external stakeholders. On the other hand, some control variables (such as political competition, usefulness and experience) also show some explanatory power in the model. Some mixed results call for further research. Our empirical results contribute to understanding the importance of interconnecting the contingency and institutional approaches to gain a clear picture of cost accounting changes in the public sector.