911 results for Deterministic walkers


Relevance:

10.00%

Publisher:

Abstract:

The design of control, estimation, or diagnosis algorithms most often assumes that all available process variables represent the system state at the same instant of time. However, this is never true in current networked systems because of the unknown deterministic or stochastic transmission delays introduced by the communication network. During the diagnosis stage, this often generates false alarms: under nominal operation, the different transmission delays associated with the variables that appear in the residual computation produce discrepancies of the residuals from zero. A technique aiming at minimising the resulting false-alarm rate, based on the explicit modelling of communication delays and on their best-case estimation, is proposed.
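As a rough illustration of the effect described (not the paper's actual method), the sketch below shows how a residual computed against undelayed model output drifts from zero once a transmission delay appears, and how shifting the model output by a delay estimate removes the discrepancy; the ramp signal and the two-sample delay are hypothetical.

```python
def residual(received, model):
    """Element-wise residual between received measurements and model output."""
    return [r - m for r, m in zip(received, model)]

y_model = [0.5 * k for k in range(10)]                  # expected plant output
delay = 2                                               # assumed network delay (samples)
y_recv = [0.5 * max(k - delay, 0) for k in range(10)]   # what the diagnoser receives

r_naive = residual(y_recv, y_model)                     # compares mismatched instants
# Compensated: compare each received sample with the model output 'delay' steps earlier.
y_model_shifted = [0.0] * delay + y_model[:-delay]
r_comp = residual(y_recv, y_model_shifted)

# r_naive settles at -1.0 once the ramp starts; r_comp is identically zero.
```

The point is only that explicitly modelling the delay turns a systematic residual offset, a false-alarm source, back into a near-zero residual.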

Relevance:

10.00%

Publisher:

Abstract:

Ecosystems provide a multitude of resources and ecological services that are useful to humans. Biodiversity is an essential component of those ecosystems and sustains many services. To assure the permanence of ecosystem services for future generations, measures should be taken to conserve biodiversity. For this purpose, the acquisition of detailed information on how the biodiversity implicated in ecosystem function is distributed in space is essential. Species distribution models (SDMs) are empirical models relating field observations (presences or absences of a species) to environmental predictors, based on statistically derived response surfaces that fit the realized niche. These models result in spatial predictions indicating the locations of the most suitable environment for the species and may potentially be applied to predict the composition of communities and their functional properties. The main objective of this thesis was to provide more accurate projections of species and community distributions under current and future climate in mountains by considering not solely abiotic but also biotic drivers of species distribution. Mountain areas and alpine ecosystems are particularly sensitive to global changes and are also sources of essential ecosystem services. This thesis had three main goals: (i) a better ecological understanding of biotic interactions and how they shape the distribution of species and communities (in the western Swiss Alps), (ii) the development of a novel approach to the spatial modeling of biodiversity that can account for biotic interactions, and (iii) ecologically more realistic projections of future species distributions and of the future composition and structure of communities. Focusing on butterflies and bumblebees in interaction with the vegetation, I detected important biotic interactions for the species distribution and community composition of both plants and insects along environmental gradients. I identified the signature of environmental filtering processes at high elevation, confirming the suitability of SDMs for reproducing patterns of filtering. Using those case studies, I improved SDMs by incorporating biotic interactions and accounting for non-deterministic processes and uncertainty through a probability-based approach. This approach predicts not only the distribution of individual species but also that of entire communities, by stacking the projections (S-SDMs). I then used the improved models to forecast the distribution of species through past and future climate changes. SDM hindcasting allowed a better understanding of the spatial range dynamics of Trollius europaeus in Europe, which is at the origin of the species' intra-specific genetic diversity, and identified the risk of loss of this genetic diversity caused by climate change. By simulating the future distribution of all bumblebee species in the western Swiss Alps under nine climate change scenarios for the 21st century, I found that the functional diversity of this pollinator guild will be largely affected by climate change through the loss of high-elevation specialists. In turn, this will have important consequences for alpine plant pollination.
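The stacked-SDM idea mentioned above can be sketched briefly; the per-site species probabilities below are invented, and the thesis's actual models are far richer:

```python
# Probabilistic stacking (S-SDM) in miniature: instead of thresholding each
# species' suitability map and summing presences, sum the predicted occurrence
# probabilities directly, which keeps non-deterministic processes in the picture.
site_probs = {
    "site_A": [0.9, 0.6, 0.2],   # occurrence probabilities of species 1..3 (hypothetical)
    "site_B": [0.4, 0.4, 0.4],
}

def richness_threshold(probs, cutoff=0.5):
    """Binary stacking: count species whose probability exceeds a cutoff."""
    return sum(1 for p in probs if p >= cutoff)

def richness_probabilistic(probs):
    """Probabilistic stacking: expected richness is the sum of probabilities."""
    return sum(probs)

for site, probs in site_probs.items():
    print(site, richness_threshold(probs), round(richness_probabilistic(probs), 2))
```

Note how site_B gets a predicted richness of 0 under thresholding but an expected richness of 1.2 under probabilistic stacking: moderate probabilities are not discarded.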

Relevance:

10.00%

Publisher:

Abstract:

Realistic animation rendering is known to be an expensive processing task when physically based global illumination methods are used to improve illumination detail. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced into the computation of the animated sequence to select important frames. These are fully computed and used as a basis for interpolating the whole sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, suggesting that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
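The interpolation step can be sketched as follows; the Monte Carlo frame-selection criterion is omitted, and the key frames and per-patch radiosity values are invented:

```python
# Sketch: a few "important" frames get a full radiosity solve; the in-between
# frames are linearly interpolated per patch, exploiting temporal coherence.
def lerp(a, b, t):
    """Linear interpolation between two per-patch radiosity vectors."""
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

def interpolate_sequence(key_frames, n_frames):
    """key_frames: {frame_index: per-patch radiosity list}; returns all frames."""
    keys = sorted(key_frames)
    frames = []
    for f in range(n_frames):
        lo = max(k for k in keys if k <= f)
        hi = min(k for k in keys if k >= f)
        t = 0.0 if hi == lo else (f - lo) / (hi - lo)
        frames.append(lerp(key_frames[lo], key_frames[hi], t))
    return frames

# Two fully computed key frames for a 3-patch scene; frames 1-3 are interpolated.
keys = {0: [1.0, 0.5, 0.0], 4: [0.0, 0.5, 1.0]}
seq = interpolate_sequence(keys, 5)
```

Since the stored quantity is view-independent radiosity per patch, any camera path can be rendered from the interpolated sequence afterwards.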

Relevance:

10.00%

Publisher:

Abstract:

This paper questions practitioners' deterministic approaches in forensic identification and notes the limits of their conclusions, in order to encourage a discussion that questions current practices. To this end, a hypothetical discussion between an expert in dentistry and an enthusiastic member of a jury, eager to understand the scientific principles of evidence interpretation, is presented. This discussion leads us to regard any argument aiming at identification as probabilistic.
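A minimal numerical sketch of the probabilistic stance argued for here, using a likelihood-ratio update with invented numbers (not figures from the paper):

```python
# Evidence updates prior odds of identification via a likelihood ratio rather
# than yielding a categorical "match". All numbers below are hypothetical.
def posterior_odds(prior_odds, likelihood_ratio):
    return prior_odds * likelihood_ratio

prior = 1 / 1000          # prior odds that the remains belong to this person
lr = 500                  # likelihood ratio of the dental evidence
post = posterior_odds(prior, lr)
prob = post / (1 + post)  # convert odds back to a probability
```

Even strong evidence (LR = 500) combined with low prior odds leaves a posterior probability of about one third, never a deterministic identification.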

Relevance:

10.00%

Publisher:

Abstract:

Critical real-time embedded (CRTE) systems require safe and tight worst-case execution time (WCET) estimations to provide the required safety levels and keep costs low. However, CRTE systems require increasing performance to satisfy the needs of existing and new features. Such performance can only be achieved by means of more aggressive hardware architectures, which are much harder to analyze from a WCET perspective; the main features considered include cache memories and multi-core processors. Thus, although such features provide higher performance, current WCET analysis methods are unable to provide tight WCET estimations. In fact, WCET estimations become worse than for simpler and less powerful hardware. The main reason is that hardware behavior is deterministic but unknown and, therefore, worst-case behavior must be assumed most of the time, leading to large WCET estimations. The purpose of this project is to develop new hardware designs, together with WCET analysis tools, able to provide tight and safe WCET estimations. To do so, those pieces of hardware whose behavior is not easily analyzable, due to the lack of accurate information during WCET analysis, will be enhanced to produce probabilistically analyzable behavior. Thus, even if the worst-case behavior cannot be removed, its probability can be bounded, and hence a safe and tight WCET can be provided for a particular safety level, in line with the safety levels of the remaining components of the system. During the first year of the project we developed most of the evaluation infrastructure as well as the hardware techniques to analyze cache memories. During the second year those techniques were evaluated, and new purely-software techniques were developed.
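A toy sketch of the probabilistic-WCET idea, assuming a hypothetical task running on time-randomised hardware; it only illustrates stating a bound per exceedance probability, not the project's actual analysis:

```python
import random

# With time-randomised hardware, execution time becomes a random variable, so a
# timing bound can be stated per exceedance probability instead of assuming the
# single worst case on every access. Task and cache parameters are invented.
random.seed(1)

def run_once():
    # Hypothetical task: 100 memory accesses, each hitting a randomised cache
    # with probability 0.9 (cost 1 cycle) or missing (cost 10 cycles).
    return sum(1 if random.random() < 0.9 else 10 for _ in range(100))

times = sorted(run_once() for _ in range(10000))

def pwcet(times, exceedance):
    """Smallest observed bound whose empirical exceedance probability <= target."""
    idx = int(len(times) * (1 - exceedance))
    return times[min(idx, len(times) - 1)]

bound = pwcet(times, exceedance=1e-3)   # bound exceeded by <= 0.1% of runs
```

The absolute worst case (every access a miss) is never removed, but its probability is bounded, which is the property the project exploits to match a given safety level.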

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To set local dose reference levels (DRLs) that allow radiologists to control stochastic and deterministic effects. Methods and materials: Dose indicators for cerebral angiographies and hepatic embolizations were collected over 4 months and analyzed in our hospital. The data obtained with an image amplifier were compared with those obtained with a flat-panel detector, using the Mann-Whitney test. Results: For the 40 cerebral angiographies performed, the DRLs for DAP, fluoroscopy time, and number of images were, respectively, 166 Gy.cm2, 19 min, and 600. The maximum DAP was 490 Gy.cm2 (fluoroscopy time: 84 min). No significant difference in fluoroscopy time or DAP between the image amplifier and the flat-panel detector was observed (p = 0.88). The number of images was larger for the flat-panel detector (p = 0.004). The values obtained were slightly over the presently proposed DRLs: 150 Gy.cm2, 15 min, 400. Concerning the 13 hepatic embolizations, the DRLs for DAP, fluoroscopy time, and number of images were 315 Gy.cm2, 25 min, and 370. The maximum DAP delivered was 845 Gy.cm2 (fluoroscopy time of 48 min). No significant difference between the image amplifier and the flat-panel detector was observed (p = 0.005). These values were also slightly over the presently proposed DRLs: 300 Gy.cm2, 20 min, 200. Conclusion: These results show that the introduction of the flat-panel detector did not lead to an increase in patient dose. A DRL concerning the cumulative dose (which allows control of deterministic effects) should be introduced to give radiologists full control over the risks associated with ionizing radiation. Results of this ongoing study will be presented.
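For illustration, the reported cerebral-angiography indicators can be set against the proposed reference levels programmatically; the figures are those quoted above:

```python
# Compare local dose indicators for cerebral angiography with the proposed
# reference levels; values are taken from the abstract.
local = {"DAP_Gycm2": 166, "fluoro_min": 19, "images": 600}
drl   = {"DAP_Gycm2": 150, "fluoro_min": 15, "images": 400}

exceeded = {k: local[k] > drl[k] for k in drl}
# All three local indicators sit slightly above the proposed DRLs.
```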

Relevance:

10.00%

Publisher:

Abstract:

Background: The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods: The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the telephone follow-up interviews. A pilot study was carried out to test overall feasibility and to modify some procedures before the field work began. Results: After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and retain cohort members, we traced 92% of the CHIS participants, of whom 1,605 answered the follow-up questionnaire. Conclusion: The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study was useful for increasing efficiency in tracing and interviewing the respondents.
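The two-step linkage can be sketched as below; the registry, the string-similarity measure, and the 0.85 threshold are illustrative assumptions, not the study's actual parameters:

```python
# Deterministic-then-probabilistic record linkage in miniature: exact match on
# the full name first, then a fuzzy fallback scored on name similarity.
from difflib import SequenceMatcher

registry = {"maria garcia lopez": "alive", "joan puig serra": "moved"}  # hypothetical

def link(name, threshold=0.85):
    key = name.lower()
    if key in registry:                       # deterministic step: exact match
        return registry[key], 1.0
    # Probabilistic step: best fuzzy match on first name + surnames.
    best = max(registry, key=lambda r: SequenceMatcher(None, key, r).ratio())
    score = SequenceMatcher(None, key, best).ratio()
    return (registry[best], score) if score >= threshold else (None, score)

status, score = link("Maria Garcia Lopez")    # exact hit
status2, score2 = link("Maria Garci Lopez")   # typo, linked probabilistically
```

The deterministic pass is cheap and unambiguous; the probabilistic pass recovers records that small spelling variations would otherwise lose, which is what maximised tracing in the study.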

Relevance:

10.00%

Publisher:

Abstract:

The simultaneous use of multiple transmit and receive antennas can unleash very large capacity increases in rich multipath environments. Although such capacities can be approached by layered multi-antenna architectures with per-antenna rate control, the need for short-term feedback arises as a potential impediment, in particular as the number of antennas (and thus the number of rates to be controlled) increases. What we show, however, is that the need for short-term feedback in fact vanishes as the number of antennas and/or the diversity order increases. Specifically, the rate supported by each transmit antenna becomes deterministic and a sole function of the signal-to-noise ratio, the ratio of transmit to receive antennas, and the decoding order, all of which are either fixed or slowly varying. More generally, we illustrate, through this specific derivation, the relevance of some established random CDMA results to the single-user multi-antenna problem.
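A hedged numerical sketch of the hardening effect: here a single stream with N-fold receive diversity under Rayleigh fading stands in for one transmit antenna's layer (a simplification of the paper's setting), and the variance of the supported rate shrinks as the diversity order grows:

```python
import math, random

# As diversity order grows, the supported rate concentrates around a
# deterministic value, so short-term rate feedback loses its purpose.
random.seed(0)

def rate(n_div, snr=10.0):
    """Rate of one stream with n_div-fold diversity (MRC over Rayleigh fading)."""
    # |h_i|^2 for complex Gaussian h_i with unit mean power is Exp(1).
    gain = sum(random.gauss(0, math.sqrt(0.5)) ** 2 +
               random.gauss(0, math.sqrt(0.5)) ** 2 for _ in range(n_div))
    return math.log2(1 + snr * gain / n_div)

def rate_variance(n_div, trials=2000):
    rs = [rate(n_div) for _ in range(trials)]
    mean = sum(rs) / trials
    return sum((r - mean) ** 2 for r in rs) / trials

# Variance of the supported rate shrinks as the diversity order increases.
var_low, var_high = rate_variance(2), rate_variance(64)
```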

Relevance:

10.00%

Publisher:

Abstract:

There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e., alternative renditions of a previously recorded musical piece. For this purpose we propose a recurrence quantification analysis measure that allows tracking potentially curved and disrupted traces in cross recurrence plots. We apply this measure to cross recurrence plots constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with higher accuracy than previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Rössler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
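A cross recurrence plot in its simplest form can be sketched as follows; the descriptor series are invented scalars, and the curved-trace tracking measure itself is beyond this illustration:

```python
# CRP[i][j] = 1 when sample i of series x lies within a threshold of sample j
# of series y; a filled diagonal trace signals a shared underlying dynamic.
def cross_recurrence(x, y, eps):
    return [[1 if abs(xi - yj) <= eps else 0 for yj in y] for xi in x]

song  = [0.1, 0.5, 0.9, 0.5, 0.1]            # hypothetical descriptor series
cover = [0.12, 0.52, 0.88, 0.48, 0.08]       # same melody, slightly altered
crp = cross_recurrence(song, cover, eps=0.05)
diagonal = [crp[i][i] for i in range(len(song))]
```

In real cover-song pairs the trace is curved and broken by tempo changes and structural edits, which is precisely why the paper's measure tracks curved and disrupted traces rather than only the main diagonal.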

Relevance:

10.00%

Publisher:

Abstract:

T cells belong to two mutually exclusive lineages expressing either alpha beta or gamma delta T-cell receptors (TCRs). Although alpha beta and gamma delta cells are known to share a common precursor, the role of TCR rearrangement and specificity in the lineage commitment process is controversial. Instructive lineage commitment models endow the alpha beta or gamma delta TCR with a deterministic role in lineage choice, whereas separate lineage models invoke TCR-independent lineage commitment followed by TCR-dependent selection and maturation of alpha beta and gamma delta cells. Here we review the published data pertaining to the role of the TCR in alpha beta/gamma delta lineage commitment and provide some additional information obtained from recent intracellular TCR staining studies. We conclude that a variant of the separate lineage model is best able to accommodate all of the available experimental results.

Relevance:

10.00%

Publisher:

Abstract:

An easy-living home requires a full-sized bathroom on the main level. Family members will appreciate the extra space, and guests of all ages and abilities will feel more welcome. At a minimum, you'll need a five-foot circle of open floor space for maneuvering a wheelchair between bathroom fixtures. A small powder room won't work for guests who use walkers or wheelchairs. A shower stall with no curb to step over is more convenient than a tub for most guests. Make sure the doorway opening for the bathroom is at least 32 inches wide (preferably 36 inches). Universal design features such as these make homes better for everyone.

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model which underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with d ∈ [0, 1). This is an important test in many economic applications because I(d) processes with d < 1 are mean-reverting although, when 0.5 ≤ d < 1, they are, like I(1) processes, nonstationary. We show how simple the implementation of the FDF test is in these situations and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided in which the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories and on which there is some controversy.
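The building block of the FDF test, the fractional difference operator (1 - L)^d expanded binomially, can be sketched as follows; the series and values of d are chosen only to check the filter's limiting cases, and the test's regression and deterministic terms are not shown:

```python
# Fractional differencing via the binomial expansion of (1 - L)^d:
# pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j.
def frac_diff(y, d):
    n = len(y)
    pi = [1.0]
    for j in range(1, n):
        pi.append(pi[-1] * (j - 1 - d) / j)
    return [sum(pi[j] * y[t - j] for j in range(t + 1)) for t in range(n)]

y = [1.0, 2.0, 3.0, 4.0]
assert frac_diff(y, 0) == y                     # d = 0: series unchanged
assert frac_diff(y, 1)[1:] == [1.0, 1.0, 1.0]   # d = 1: ordinary first difference
half = frac_diff(y, 0.5)                        # an intermediate, long-memory case
```

The FDF statistic is then the t-ratio on the lagged d-difference in a Dickey-Fuller-style regression of the first difference of the series, with deterministic components added as the paper discusses.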

Relevance:

10.00%

Publisher:

Abstract:

Business organisations are excellent representations of what in physics and mathematics are designated "chaotic" systems. Because a culture of innovation will be vital for organisational survival in the 21st century, the present paper proposes that viewing organisations in terms of "complexity theory" may assist leaders in fine-tuning managerial philosophies that provide orderly management emphasizing stability within a culture of organised chaos, for it is on the "boundary of chaos" that the greatest creativity occurs. It is argued that 21st century companies, as chaotic social systems, will no longer be effectively managed by rigid objectives (MBO) nor by instructions (MBI). Their capacity for self-organisation will be derived essentially from how their members accept a shared set of values or principles for action (MBV). Complexity theory deals with systems that show complex structures in time or space, often hiding simple deterministic rules. This theory holds that once these rules are found, it is possible to make effective predictions and even to control the apparent complexity. The state of chaos that self-organises, thanks to the appearance of the "strange attractor", is the ideal basis for creativity and innovation in the company. In this self-organised state of chaos, members are not confined to narrow roles, and gradually develop their capacity for differentiation and relationships, growing continuously toward their maximum potential contribution to the efficiency of the organisation. In this way, values act as organisers or "attractors" of disorder, which in the theory of chaos are equations represented by unusually regular geometric configurations that predict the long-term behaviour of complex systems. In business organisations (as in all kinds of social systems) the starting principles end up as the final principles in the long term. An attractor is a model representation of the behavioral results of a system. 
The attractor is not a force of attraction or a goal-oriented presence in the system; it simply depicts where the system is headed based on its rules of motion. Thus, in a culture that cultivates or shares values of autonomy, responsibility, independence, innovation, creativity, and proaction, the risk of short-term chaos is mitigated by an overall long-term sense of direction. A more suitable approach to manage the internal and external complexities that organisations are currently confronting is to alter their dominant culture under the principles of MBV.
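The claim that complex structures can hide simple deterministic rules is classically illustrated by the logistic map; the organisational analogy is the text's, not the model's:

```python
# One deterministic one-line rule produces orderly behaviour for some
# parameter values and chaotic behaviour for others.
def logistic_orbit(r, x0=0.2, skip=500, keep=8):
    x = x0
    for _ in range(skip):          # discard the transient
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

stable = logistic_orbit(2.8)    # settles on a fixed point ("attractor")
chaotic = logistic_orbit(3.9)   # never settles: deterministic yet unpredictable
```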

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we examine the design of permit trading programs when the objective is to minimize the cost of achieving an ex ante pollution target, that is, one that is defined in expectation rather than an ex post deterministic value. We consider two potential sources of uncertainty, the presence of either of which can make our model appropriate: incomplete information on abatement costs and uncertain delivery coefficients. In such a setting, we find three distinct features that depart from the well-established results on permit trading: (1) the regulator’s information on firms’ abatement costs can matter; (2) the optimal permit cap is not necessarily equal to the ex ante pollution target; and (3) the optimal trading ratio is not necessarily equal to the delivery coefficient even when it is known with certainty. Intuitively, since the regulator is only required to meet a pollution target on average, she can set the trading ratio and total permit cap such that there will be more pollution when abatement costs are high and less pollution when abatement costs are low. Information on firms’ abatement costs is important in order for the regulator to induce the optimal alignment between pollution level and abatement costs.
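A made-up two-state example of the intuition above, in which the target is met only in expectation:

```python
# The regulator meets the pollution target ex ante (on average), tolerating
# more pollution when abatement is costly and less when it is cheap.
# All numbers are invented for illustration.
target = 100.0
states = [
    {"prob": 0.5, "cost": "high", "pollution": 120.0},
    {"prob": 0.5, "cost": "low",  "pollution": 80.0},
]
expected = sum(s["prob"] * s["pollution"] for s in states)
assert expected == target   # ex ante target met, ex post levels vary by state
```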

Relevance:

10.00%

Publisher:

Abstract:

Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It is applied in different fields, e.g., remote sensing, biometry, and speech recognition, but has seldom been applied to forensic case data. The intrinsic difficulty of using such data lies in its heterogeneity, which comes from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structures in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process; it allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed. A combinatorial approach was taken, using the presence and the absence of products. Methods from graph theory were used to extract patterns in data constituted by links between products and the place and date of seizure. A data mining process carried out using graph techniques is called "graph mining". Patterns were detected that had to be interpreted and compared with preliminary knowledge to establish their relevancy. The illicit drug profiling process is actually an intelligence process that uses preliminary illicit drug classes to classify new samples. Methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns. This new knowledge of a repeated structure may provide valuable complementary information for profiling and become a source of intelligence.
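A toy version of the presence/absence co-occurrence approach (with invented seizure data) might look like this:

```python
# Count how often pairs of cutting agents co-occur across seizures; pairs seen
# repeatedly are candidate links between cases (a shared source or batch).
from itertools import combinations
from collections import Counter

seizures = {   # hypothetical seizure -> set of cutting agents detected
    "s1": {"caffeine", "paracetamol"},
    "s2": {"caffeine", "paracetamol", "lidocaine"},
    "s3": {"lidocaine"},
    "s4": {"caffeine", "paracetamol"},
}

pair_counts = Counter()
for agents in seizures.values():
    pair_counts.update(combinations(sorted(agents), 2))

# Pairs present in several seizures become edges of a link graph to interpret.
frequent = [pair for pair, n in pair_counts.items() if n >= 2]
```

In the study's graph-mining framing, products, places, and dates become nodes, and repeated co-occurrences become the structures to be filtered and interpreted for intelligence purposes.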