915 results for deterministic fractals
Abstract:
We study the incentive to invest to improve marriage prospects, in a frictionless marriage market with non-transferable utility. Stochastic returns to investment eliminate the multiplicity of equilibria in models with deterministic returns, and a unique equilibrium exists under reasonable conditions. Equilibrium investment is efficient when the sexes are symmetric. However, when there is any asymmetry, including an unbalanced sex ratio, investments are generically excessive. For example, if there is an excess of boys, then there is parental over-investment in boys and under-investment in girls, and total investment will be excessive.
Abstract:
This paper is a contribution to the growing literature on constrained inefficiencies in economies with financial frictions. The purpose is to present two simple examples, inspired by the stochastic models in Gersbach-Rochet (2012) and Lorenzoni (2008), of deterministic environments in which such inefficiencies arise through credit constraints. Common to both examples is a pecuniary externality, which operates through an asset price. In the second example, a simple transfer between two groups of agents can bring about a Pareto improvement. In a first best economy, there are no pecuniary externalities because marginal productivities are equalised. But when agents face credit constraints, there is a wedge between their marginal productivities and those of the non-credit-constrained agents. The wedge is the source of the pecuniary externality: economies with these kinds of imperfections in credit markets are not second-best efficient. This is akin to the constrained inefficiency of an economy with incomplete markets, as in Geanakoplos and Polemarchakis (1986).
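To make the wedge concrete (a stylized sketch in notation of my own, not taken from either paper): suppose two groups of agents operate technologies $f_A$ and $f_B$ with capital $k_A$ and $k_B$. In a first-best allocation the planner equalises marginal products, $f_A'(k_A) = f_B'(k_B)$, so a marginal change in an asset price merely redistributes wealth and has no first-order efficiency effect. If group $A$ faces a binding credit constraint with shadow value $\mu > 0$, its choice instead satisfies $f_A'(k_A) = f_B'(k_B) + \mu$: marginal products are no longer equalised, and any transfer or policy that moves the asset price so as to relax the constraint reallocates capital toward the higher-marginal-product agents, a first-order gain that atomistic agents do not internalise, which is why such economies fail to be second-best efficient.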
Abstract:
There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be effective, practical and robust means for enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth. Our results therefore indicate that spectral balancing techniques in general and spectral blueing in particular represent simple, yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
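As a rough, self-contained illustration of the contrast between band-limited whitening and blueing discussed above (a sketch under my own assumptions about the pass band and target slope, not the processing flow used in the study), a trace's amplitude spectrum can be reshaped toward a flat or gently rising target while keeping the phase:

import numpy as np

def spectral_balance(trace, dt, f_lo, f_hi, slope_db_per_octave=0.0, eps=1e-6):
    # Band-limited spectral balancing of a single trace.
    # slope_db_per_octave = 0 -> whitening (flat target spectrum in the band);
    # slope_db_per_octave > 0 -> "blueing" (high frequencies gently boosted).
    n = len(trace)
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(n, d=dt)
    target = np.zeros_like(freqs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    octaves = np.log2(np.maximum(freqs[band], f_lo) / f_lo)
    target[band] = 10.0 ** (slope_db_per_octave * octaves / 20.0)
    amp = np.abs(spec)
    balanced = spec / (amp + eps) * target * amp.max()   # keep phase, reshape amplitude
    return np.fft.irfft(balanced, n=n)

# Example on a synthetic trace sampled at 0.5 ns (illustrative values only).
rng = np.random.default_rng(0)
trace = rng.standard_normal(1024)
whitened = spectral_balance(trace, dt=0.5e-9, f_lo=25e6, f_hi=250e6)
blued = spectral_balance(trace, dt=0.5e-9, f_lo=25e6, f_hi=250e6, slope_db_per_octave=3.0)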
Abstract:
To describe the collective behavior of large ensembles of neurons in a neuronal network, a kinetic theory description was developed in [13, 12], where a macroscopic representation of the network dynamics was derived directly from the microscopic dynamics of individual neurons, which are modeled by conductance-based, linear, integrate-and-fire point neurons. A diffusion approximation then led to a nonlinear Fokker-Planck equation for the probability density function of neuronal membrane potentials and synaptic conductances. In this work, we propose a deterministic numerical scheme for a Fokker-Planck model of an excitatory-only network. Our numerical solver allows us to obtain the time evolution of probability distribution functions, and thus the evolution of all possible macroscopic quantities that are given by suitable moments of the probability density function. We show that this deterministic scheme is capable of capturing the bistability of stationary states observed in Monte Carlo simulations. Moreover, the transient behavior of the firing rates computed from the Fokker-Planck equation is analyzed in this bistable situation, where a bifurcation scenario can be uncovered by increasing the strength of the excitatory coupling: asynchronous convergence towards stationary states, periodic synchronous solutions, or damped oscillatory convergence towards stationary states. Finally, the computation of moments of the probability distribution allows us to validate the applicability of a moment closure assumption used in [13] to further simplify the kinetic theory.
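As a toy sketch of what a deterministic (grid-based) scheme means here, the following solves a generic one-dimensional drift-diffusion equation dp/dt = -d/dv[a(v)p] + D d²p/dv² with explicit time stepping; the drift a(v), diffusion constant D and boundary treatment are placeholders of my own, not the conductance-based model of the paper:

import numpy as np

def fp_step(p, v, dt, dv, drift, D):
    # One explicit finite-difference step of dp/dt = -d/dv[a(v) p] + D d^2p/dv^2.
    flux = drift(v) * p
    dflux = (np.roll(flux, -1) - np.roll(flux, 1)) / (2.0 * dv)   # centered d/dv
    lap = (np.roll(p, -1) - 2.0 * p + np.roll(p, 1)) / dv**2      # centered d^2/dv^2
    p_new = p + dt * (-dflux + D * lap)
    p_new[0] = p_new[-1] = 0.0                                    # crude absorbing ends
    p_new = np.clip(p_new, 0.0, None)
    return p_new / (p_new.sum() * dv)                             # renormalise the density

v = np.linspace(-1.0, 1.0, 201)
dv = v[1] - v[0]
p = np.exp(-((v + 0.5) ** 2) / 0.01)
p /= p.sum() * dv
for _ in range(2000):
    p = fp_step(p, v, dt=1e-5, dv=dv, drift=lambda x: -x, D=0.05)
# p now approximates the stationary density; macroscopic quantities follow as
# moments, e.g. the mean "potential" is np.sum(v * p) * dv.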
Abstract:
Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship.

In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator has been thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10 x 10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distribution. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
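As a rough indication of how a risk model is applied to organ doses of the kind computed above (a hypothetical sketch: the organ doses, techniques and risk coefficients below are invented placeholders, not values from this work):

ORGAN_DOSE_GY = {                         # mean organ dose per treatment course (made up)
    "contralateral breast": {"hybrid IMRT": 0.3, "wedge-based 3D": 0.7},
    "ipsilateral lung":     {"hybrid IMRT": 1.1, "wedge-based 3D": 1.6},
}
EAR_PER_GY = {                            # excess absolute risk per Gy, per 10,000 person-years (made up)
    "contralateral breast": 5.0,
    "ipsilateral lung": 3.0,
}

for technique in ("hybrid IMRT", "wedge-based 3D"):
    total = sum(ORGAN_DOSE_GY[organ][technique] * EAR_PER_GY[organ]
                for organ in ORGAN_DOSE_GY)
    print(f"{technique}: {total:.1f} excess cases per 10,000 person-years")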
Abstract:
The 22q11.2 deletion syndrome (22q11DS) is a widely recognized genetic model allowing the study of neuroanatomical biomarkers that underlie the risk of developing schizophrenia. Recent advances in magnetic resonance image analysis enable the examination of structural connectivity integrity, which has scarcely been used in the 22q11DS field. This framework potentially provides evidence for the disconnectivity hypothesis of schizophrenia in this high-risk population. In the present study, we quantify whole-brain white matter connections in 22q11DS using deterministic tractography. Diffusion Tensor Imaging was acquired in 30 affected patients and 30 age- and gender-matched healthy participants. The Human Connectome technique was applied to register white matter streamlines with cortical anatomy. The number of fibers (streamlines) was used as a measure of connectivity for comparison between groups at the global, lobar and regional levels. All statistics were corrected for age and gender. Results showed a 10% reduction in the total number of fibers in patients compared to controls. After correcting for this global reduction, preserved connectivity was found within the right frontal and right parietal lobes. The relative increase in the number of fibers was located mainly in the right hemisphere. Conversely, an excessive reduction of connectivity was observed within and between limbic structures. Finally, a disproportionate reduction was shown at the level of fibers connecting the left fronto-temporal regions. We could therefore speculate that the observed disruption to fronto-temporal connectivity in individuals at risk of schizophrenia implies that fronto-temporal disconnectivity, frequently implicated in the pathogenesis of schizophrenia, could precede the onset of symptoms and, as such, constitutes a biomarker of the vulnerability to develop psychosis. By contrast, connectivity alterations in the limbic lobe play a role in a wide range of psychiatric disorders and therefore seem to be less specific in defining schizophrenia.
Abstract:
A collection of spherical obstacles in the unit ball in Euclidean space is said to be avoidable for Brownian motion if there is a positive probability that Brownian motion diffusing from some point in the ball will avoid all the obstacles and reach the boundary of the ball. The centres of the spherical obstacles are generated according to a Poisson point process, while the radius of an obstacle is a deterministic function. If avoidable configurations are generated with positive probability, Lundh calls this percolation diffusion. An integral condition for percolation diffusion is derived in terms of the intensity of the point process and the function that determines the radii of the obstacles.
Abstract:
In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently, Sly [8] (see also [1]) showed that this is optimal, in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree, then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
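To give a concrete feel for the tree recursion that underlies this kind of correlation-decay argument (a numerical illustration only, with my own choice of degree, not Weitz's actual algorithm): on the infinite d-regular tree with branching factor b = d - 1, the occupation ratio R = Pr[root occupied] / Pr[root unoccupied] of the hard-core model satisfies R = lambda / (1 + R)^b, and plain iteration of this map converges below the critical activity lambda_c = b^b / (b - 1)^(b+1) but settles into a two-cycle above it:

def iterate_ratio(lam, b, steps=200, r0=1.0):
    # Iterate the hard-core tree recursion R <- lam / (1 + R)**b.
    r = r0
    history = []
    for _ in range(steps):
        r = lam / (1.0 + r) ** b
        history.append(r)
    return history

d = 5                               # maximum degree (illustrative choice)
b = d - 1                           # branching factor of the d-regular tree
lam_c = b**b / (b - 1) ** (b + 1)   # critical activity for uniqueness

for lam in (0.5 * lam_c, 1.5 * lam_c):
    tail = iterate_ratio(lam, b)[-4:]
    print(f"lambda = {lam:.3f}:", ", ".join(f"{x:.4f}" for x in tail))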
Abstract:
Performance prediction and application behavior modeling have been the subject of extensive research that aims to estimate application performance with acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of Parallel Application Signatures, which consists of extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, an estimate of the application's total execution time can be made. One of the problems is that the performance of an application depends on the program workload. Each type of workload affects differently how an application performs on a given system, and so affects the signature's execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We create a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight as functions of the workload, in order to predict an application's performance on a target system for any type of workload within a predefined range. We validate our methodology using a synthetic program, benchmark applications, and well-known real scientific applications.
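A minimal sketch of the idea (the phase data, the polynomial model in the workload size n, and all names are hypothetical): fit each phase's measured execution time as a function of the workload, then predict total time as the weighted sum of the predicted phase times.

import numpy as np

# Measured phase execution times (seconds) at a few workload sizes n (made up).
measurements = {
    "phase_A": {"weight": 10, "n": [100, 200, 400], "t": [0.8, 1.7, 3.3]},
    "phase_B": {"weight": 3,  "n": [100, 200, 400], "t": [2.1, 7.9, 31.5]},
}

def predict_total_time(target_n, degree=2):
    # Total time ~ sum over phases of (regressed phase time at target_n) x weight.
    total = 0.0
    for m in measurements.values():
        coeffs = np.polyfit(m["n"], m["t"], degree)   # least-squares fit of time vs. n
        total += float(np.polyval(coeffs, target_n)) * m["weight"]
    return total

print(f"Predicted total execution time for n=300: {predict_total_time(300):.1f} s")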
Abstract:
Since the end of perestroika, a new identity-focused form of discourse has taken hold in linguistics in Russia, one that in its extreme forms rests on a strict determinism of thought by language. The funding organizations of scientific research back projects studying the relationship between Russian grammar and the "Russian national character". New objects of knowledge come to light: "the Russian linguistic image of the world", "linguistic personality", "culturology". This kind of discourse builds up an imaginary, comforting collective identity, which relies on the principles that 1) all the people who speak the same language think the same way; and 2) languages, hence collective kinds of thought, are hermetically closed to each other, and therefore untranslatable. This neo-Humboldtian trend in contemporary Russian linguistics is unaware of its historical origins: German Romanticism in its anti-Enlightenment strand, the evolutionist positivism of Auguste Comte, and the deterministic linguistics of Germany in the 1930s.
Abstract:
A version of Matheron’s discrete Gaussian model is applied to cell composition data. The examples are for map patterns of felsic metavolcanics in two different areas. Q-Q plots of the model for cell values representing the proportion of a 10 km x 10 km cell area underlain by this rock type are approximately linear, and the line of best fit can be used to estimate the parameters of the model. It is also shown that felsic metavolcanics in the Abitibi area of the Canadian Shield can be modeled as a fractal.
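A minimal sketch of the Q-Q-based estimation described above, with synthetic data and a probit transform chosen purely for illustration (not the transform or data of the study): if the transformed cell values are approximately Gaussian, their ordered values plotted against standard-normal quantiles lie near a straight line whose intercept and slope estimate the model's mean and standard deviation.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
# Fake cell proportions in (0, 1): proportion of each cell underlain by the rock type.
proportions = norm.cdf(rng.normal(loc=-0.8, scale=0.6, size=500))

y = norm.ppf(np.sort(proportions))                 # probit of the ordered cell values
k = np.arange(1, len(y) + 1)
theoretical = norm.ppf((k - 0.5) / len(y))         # standard-normal plotting positions

slope, intercept = np.polyfit(theoretical, y, 1)   # line of best fit on the Q-Q plot
print(f"estimated mean = {intercept:.2f}, sd = {slope:.2f}")   # close to -0.8 and 0.6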
Abstract:
MOTIVATION: Understanding gene regulation in biological processes and modeling the robustness of the underlying regulatory networks is an important problem that is currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to an over-representation of noise in GRNs and hence to non-correspondence with biological observations. RESULTS: In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. AVAILABILITY: Algorithms are made available under our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
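A toy contrast between the two noise models on a made-up three-gene network (the network, the probabilities, and the particular reading of a "function failure" are my own illustrations; see the paper and GenYsis for the actual SIF definition):

import random

# Boolean update functions over a state dict gene -> 0/1 (illustrative network).
functions = {
    "g0": lambda s: s["g2"],                  # g2 activates g0
    "g1": lambda s: s["g0"] and not s["g2"],  # g0 activates, g2 represses g1
    "g2": lambda s: not s["g1"],              # g1 represses g2
}

def step_sin(state, p):
    # Stochasticity in nodes (SIN): apply the deterministic update, then flip
    # every gene's value with probability p, whether or not it was stable.
    new = {g: int(f(state)) for g, f in functions.items()}
    return {g: v ^ 1 if random.random() < p else v for g, v in new.items()}

def step_sif(state, p):
    # Stochasticity in functions (one illustrative reading of SIF): with
    # probability p a gene's update function "fails" and the gene keeps its
    # previous value; otherwise the deterministic function output is applied.
    return {g: state[g] if random.random() < p else int(f(state))
            for g, f in functions.items()}

state = {"g0": 1, "g1": 0, "g2": 1}
for _ in range(5):
    state = step_sif(state, p=0.01)
print(state)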
Abstract:
The observation of a non-random phylogenetic distribution of traits in communities provides evidence for niche-based community assembly. Environment may influence the phylogenetic structure of communities because traits determining how species respond to prevailing conditions can be phylogenetically conserved. In this study, we investigate the variation of butterfly species richness and of phylogenetic α- and β-diversities along temperature and plant species richness gradients. Our study indicates that butterfly richness is independently positively correlated with temperature and plant species richness in the study area. However, the variation of phylogenetic α- and β-diversities is only correlated with temperature. The significant phylogenetic clustering at high elevation suggests that cold temperature filters butterfly lineages, leading to communities mostly composed of closely related species adapted to those climatic conditions. These results suggest that, in the colder and more severe conditions at high elevations, deterministic processes and not purely stochastic events drive the assembly of butterfly communities.
Abstract:
Peccata Mundi grew out of an initiative by Jaume Juher, head of R+D+I at the company Mas Parés, and the visual artist Jaume Xifra, who in 2004 decided to add a shared objective to the friendship that unites them: to bring together in a single project the professional challenges each had been pursuing separately in his own discipline, art and gastronomic research. Later, in 2005, the experts who currently make up the core working group joined the project: Josep Bel, an expert in sensory analysis and the application of aromas; David Juher, a mathematician and professor at the UdG; Xavier de Palau, an electronic musician; Clara Perxachs, a researcher into food culture; and Toni Botella, a chef. In the gastronomic-artistic experience Peccata Mundi, the participant tastes a series of dishes and wines and rates their perceptions by answering a questionnaire. The data from this questionnaire are used, through transformations governed by neurological, mathematical, anthropological and other criteria, to produce numerical data that serve as input to an application which uses them to generate a video with music lasting approximately 2 minutes. This video, consisting of moving fractal images and background music, the latter also generated using functions with chaotic behaviour, is the audiovisual portrait of the participant's sensory experience. The project consists of implementing all of the computing logistics of the Peccata Mundi sensory experience: designing the data-entry applications, handling the database, processing the questionnaire data, generating the video and the music, and producing the audiovisual file that the participant finally takes home recorded on DVD.
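A purely hypothetical sketch of the kind of mapping such a pipeline performs: a few questionnaire scores are turned into the parameter of a Julia-set frame (a deterministic fractal) and into the rate of a logistic map used as a chaotic melody generator. None of the constants or rules below reflect the project's actual neurological, mathematical or anthropological criteria.

import numpy as np

scores = [7, 3, 9, 5]                       # answers on a 0-10 scale (made up)

# Map scores to a Julia-set parameter c and to a logistic-map rate r.
c = complex(-1 + 1.2 * scores[0] / 10, -0.6 + 1.2 * scores[1] / 10)
r = 3.6 + 0.4 * scores[2] / 10              # chaotic regime of the logistic map

def julia_frame(c, size=200, iters=60):
    # Escape-time image of the Julia set for parameter c.
    ax = np.linspace(-1.5, 1.5, size)
    z = ax[None, :] + 1j * ax[:, None]
    counts = np.zeros(z.shape, dtype=int)
    for _ in range(iters):
        mask = np.abs(z) <= 2.0
        z[mask] = z[mask] ** 2 + c
        counts += mask
    return counts

def logistic_melody(r, x0=0.5, notes=32):
    # Pitches from iterates of the logistic map x -> r x (1 - x).
    x, pitches = x0, []
    for _ in range(notes):
        x = r * x * (1.0 - x)
        pitches.append(48 + int(24 * x))    # map to a two-octave MIDI range
    return pitches

frame = julia_frame(c)
melody = logistic_melody(r, x0=0.2 + 0.6 * scores[3] / 10)
print(frame.shape, melody[:8])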