902 results for "Deterministic imputation"


Relevance:

10.00%

Publisher:

Abstract:

The design of control, estimation, or diagnosis algorithms most often assumes that all available process variables represent the system state at the same instant of time. However, this is never true in current networked systems, because of the unknown deterministic or stochastic transmission delays introduced by the communication network. During the diagnosis stage, this will often generate false alarms: under nominal operation, the different transmission delays associated with the variables that appear in the computation form produce discrepancies of the residuals from zero. A technique aimed at minimising the resulting false-alarm rate, based on the explicit modelling of communication delays and on their best-case estimation, is proposed.
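As a toy illustration (not the paper's estimation method; the signals, delay and statistics below are invented), a residual computed while ignoring a transmission delay drifts from zero even under nominal operation, whereas modelling the delay removes the bias:

```python
import numpy as np

def residual(y, x, delay):
    """Residual between a measurement y and a delayed copy of signal x."""
    n = len(y)
    r = np.zeros(n)
    for k in range(delay, n):
        r[k] = y[k] - x[k - delay]
    return r

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200))   # slowly varying process variable
true_delay = 3
y = np.roll(x, true_delay)            # same variable seen through the network
y[:true_delay] = x[0]

naive = residual(y, x, 0)             # delay ignored -> systematic discrepancy
comp = residual(y, x, true_delay)     # delay modelled -> residual stays at zero

print(np.abs(naive[true_delay:]).mean() > np.abs(comp[true_delay:]).mean())  # True
```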

Relevance:

10.00%

Publisher:

Abstract:

Ecosystems provide a multitude of resources and ecological services that are useful to humans. Biodiversity is an essential component of those ecosystems and guarantees many services. To assure the permanence of ecosystem services for future generations, measures should be applied to conserve biodiversity. For this purpose, detailed information on how the biodiversity implicated in ecosystem function is distributed in space is essential. Species distribution models (SDMs) are empirical models relating field observations (presences or absences of a species) to environmental predictors, based on statistically derived response surfaces that fit the realized niche.
These models result in spatial predictions indicating the locations of the most suitable environment for the species, and may potentially be applied to predict the composition of communities and their functional properties. The main objective of this thesis was to provide more accurate projections of species and community distributions under current and future climate in mountains by considering not solely abiotic but also biotic drivers of species distribution. Mountain areas and alpine ecosystems are considered particularly sensitive to global changes and are also sources of essential ecosystem services. This thesis had three main goals: (i) a better ecological understanding of biotic interactions and how they shape the distribution of species and communities, (ii) the development of a novel approach to the spatial modeling of biodiversity that can account for biotic interactions, and (iii) ecologically more realistic projections of future species distributions and of the future composition and structure of communities. Focusing on butterflies and bumblebees in interaction with the vegetation, I detected biotic interactions that are important for the species distribution and community composition of both plants and insects along environmental gradients. I identified the signature of environmental filtering processes at high elevation, confirming the suitability of SDMs for reproducing patterns of filtering. Using those case studies, I improved SDMs by incorporating biotic interactions and by accounting for non-deterministic processes and uncertainty using a probability-based approach. This approach predicts not only the distribution of individual species but also that of entire communities, by stacking the projections (S-SDMs). I used the improved models to forecast the distribution of species through past and future climate changes. SDM hindcasting allowed a better understanding of the spatial range dynamics of Trollius europaeus in Europe, which are at the origin of the species' intra-specific genetic diversity, and identified the risk of loss of this genetic diversity caused by climate change.
By simulating the future distribution of all bumblebee species in the western Swiss Alps under nine climate change scenarios for the 21st century, I found that the functional diversity of this pollinator guild will be largely affected by climate change through the loss of high elevation specialists. In turn, this will have important consequences on alpine plant pollination.
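A minimal sketch of the probabilistic stacking idea behind S-SDMs, with made-up probabilities (an illustration of the general technique, not the thesis code): rather than thresholding each species' suitability map into presence/absence, the expected species richness of a site is the sum of the per-species occurrence probabilities.

```python
import numpy as np

def expected_richness(prob_maps):
    """prob_maps: array of shape (n_species, n_sites) of occurrence probabilities."""
    return prob_maps.sum(axis=0)

# three hypothetical species over four sites
probs = np.array([
    [0.9, 0.2, 0.1, 0.7],
    [0.8, 0.1, 0.3, 0.6],
    [0.4, 0.0, 0.2, 0.9],
])
print(expected_richness(probs))  # approximately [2.1 0.3 0.6 2.2]
```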

Relevance:

10.00%

Publisher:

Abstract:

There is almost no case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot co-exist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero to the value 360°. These are known as "essential zeros", but what can we do with "rounded zeros" that result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is a solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same situation occurs if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between copper and molybdenum values, but while copper will always be above the limit of detection, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values and establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.

Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
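The regression step can be sketched with synthetic grades (the data, detection limit and noise model are invented; only the procedure follows the abstract): fit a line between copper and the lower quartile of the observed molybdenum values, then impute each rounded zero from its copper value.

```python
import numpy as np

rng = np.random.default_rng(1)
cu = rng.uniform(0.1, 2.0, 100)                      # copper grades, always detected
mo_true = 0.05 * cu + rng.normal(0, 0.005, 100)      # correlated molybdenum grades
det_limit = 0.03
mo = np.where(mo_true >= det_limit, mo_true, 0.0)    # rounded zeros below detection

observed = mo > 0
q1 = np.quantile(mo[observed], 0.25)                 # lower quartile of real Mo values
low = observed & (mo <= q1)

# least-squares regression mo = a*cu + b on the lower-quartile pairs
a, b = np.polyfit(cu[low], mo[low], 1)

mo_imputed = mo.copy()
mo_imputed[~observed] = a * cu[~observed] + b        # one estimate per copper value

print(f"imputed {np.sum(~observed)} rounded zeros")
```

Note that the imputed values are not a single fixed replacement: each depends on the corresponding copper grade, which is the stated advantage of the method.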

Relevance:

10.00%

Publisher:

Abstract:

Realistic rendering of animation is known to be an expensive processing task when physically-based global illumination methods are used to improve illumination details. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced before the full computation of the animated sequence to select important frames; these are fully computed and used as a base for interpolating the whole sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique is an interesting alternative to deterministic methods for computing non-interactive radiosity animations of moderately complex scenarios.
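A hedged sketch of the interpolation step (frame indices, patch values and the linear blend are illustrative; the paper's actual interpolation scheme may differ): radiosity is fully solved only at selected key frames, and in-between frames are blended from the two enclosing key frames.

```python
import numpy as np

def interp_radiosity(key_frames, key_vals, frame):
    """Linearly interpolate per-patch radiosity between the enclosing key frames."""
    j = np.searchsorted(key_frames, frame, side="right")
    if j == 0:
        return key_vals[0]
    if j == len(key_frames):
        return key_vals[-1]
    f0, f1 = key_frames[j - 1], key_frames[j]
    w = (frame - f0) / (f1 - f0)
    return (1 - w) * key_vals[j - 1] + w * key_vals[j]

key_frames = np.array([0, 10])                   # fully computed key frames
key_vals = np.array([[1.0, 2.0], [3.0, 6.0]])    # radiosity of two patches
print(interp_radiosity(key_frames, key_vals, 5)) # [2. 4.]
```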

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Highly recurrent major depressive disorder (MDD) reportedly carries an increased risk of shifting to bipolar disorder; high recurrence frequency has therefore been cited as evidence of 'soft bipolarity'. We aimed to investigate the genetic underpinnings of total depressive episode count in recurrent MDD. METHODS: Our primary sample included 1966 MDD cases with a negative family history of bipolar disorder from the RADIANT studies. Total episode count was adjusted for gender, age, MDD duration, study and center before being tested for association with genotype in two separate genome-wide association analyses (GWAS): in the full set and in a subset of 1364 cases with a positive family history of MDD (FH+). We also calculated polygenic scores from the Psychiatric Genomics Consortium MDD and bipolar disorder studies. RESULTS: Episodicity (especially intermediate episode counts) was an independent index of MDD familial aggregation, replicating previous reports. The GWAS produced no genome-wide significant findings. The strongest signals were detected in the full set at MAGI1 (p=5.1×10(-7)), previously associated with bipolar disorder, and in the FH+ subset at STIM1 (p=3.9×10(-6) after imputation), a calcium channel signaling gene. However, these findings failed to replicate in an independent Munich cohort. In the full-set polygenic profile analyses, MDD polygenes predicted episodicity better than bipolar polygenes; in the FH+ subset, however, both polygenic scores performed similarly. LIMITATIONS: Episode count was self-reported and therefore subject to recall bias. CONCLUSIONS: Our findings lend preliminary support to the hypothesis that highly recurrent MDD with FH+ is part of a 'soft bipolar spectrum' but await replication in larger cohorts.
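The polygenic scores mentioned above follow a simple construction; as a hedged illustration with invented effect sizes and genotypes (not the study's pipeline), a score is a weighted sum of risk-allele counts across SNPs:

```python
import numpy as np

# per-SNP effect sizes taken from an external GWAS (hypothetical values)
weights = np.array([0.12, -0.05, 0.30])

# allele counts (0, 1 or 2) per individual at each SNP (hypothetical values)
genotypes = np.array([[0, 1, 2],
                      [2, 2, 0],
                      [1, 0, 1]])

scores = genotypes @ weights   # one polygenic score per individual
print(scores)                  # [0.55 0.14 0.42]
```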

Relevance:

10.00%

Publisher:

Abstract:

This paper questions the practitioners' deterministic approach(es) in forensic identification and notes the limits of their conclusions in order to encourage a discussion to question current practices. With this end in view, a hypothetical discussion between an expert in dentistry and an enthusiastic member of a jury, eager to understand the scientific principles of evidence interpretation, is presented. This discussion will lead us to regard any argument aiming at identification as probabilistic.
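The probabilistic alternative to a categorical identification can be made concrete with Bayes' rule in odds form (the numbers here are purely hypothetical, not from the paper):

```python
# posterior_odds = likelihood_ratio * prior_odds
prior_odds = 1 / 10_000          # e.g. one suspect among 10,000 plausible donors
likelihood_ratio = 100_000       # evidence 100,000x more probable under same-source
posterior_odds = likelihood_ratio * prior_odds
posterior_prob = posterior_odds / (1 + posterior_odds)
print(round(posterior_prob, 3))  # 0.909 -- strong support, but not certainty
```

Even a very large likelihood ratio leaves a posterior probability short of 1, which is the sense in which any identification argument remains probabilistic.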

Relevance:

10.00%

Publisher:

Abstract:

Multiple genome-wide association studies (GWAS) have been performed in HIV-1 infected individuals, identifying common genetic influences on viral control and disease course. Similarly, common genetic correlates of acquisition of HIV-1 after exposure have been interrogated using GWAS, although generally in small samples. Under the auspices of the International Collaboration for the Genomics of HIV, we have combined the genome-wide single nucleotide polymorphism (SNP) data collected by 25 cohorts, studies, or institutions on HIV-1 infected individuals and compared them to carefully matched population-level data sets (a list of all collaborators appears in Note S1 in Text S1). After imputation using the 1000 Genomes Project reference panel, we tested approximately 8 million common DNA variants (SNPs and indels) for association with HIV-1 acquisition in 6,334 infected patients and 7,247 population samples of European ancestry. Initial association testing identified the SNP rs4418214, the C allele of which is known to tag the HLA-B*57:01 and B*27:05 alleles, as genome-wide significant (p = 3.6×10(-11)). However, restricting analysis to individuals with a known date of seroconversion suggested that this association was due to the frailty bias in studies of lethal diseases. Further analyses, including testing recessive genetic models, testing for bulk effects of non-genome-wide significant variants, stratifying by sexual or parenteral transmission risk, and testing previously reported associations, showed no evidence for genetic influence on HIV-1 acquisition (with the exception of CCR5Δ32 homozygosity). Thus, these data suggest that genetic influences on HIV acquisition are either rare or have smaller effects than can be detected by this sample size.

Relevance:

10.00%

Publisher:

Abstract:

Critical real-time embedded (CRTE) systems require safe and tight worst-case execution time (WCET) estimations to provide the required safety levels and keep costs low. However, CRTE systems require increasing performance to satisfy the needs of existing and new features. Such performance can only be achieved by means of more aggressive hardware architectures, which are much harder to analyze from a WCET perspective. The main features considered include cache memories and multi-core processors. Thus, although such features provide higher performance, current WCET analysis methods are unable to provide tight WCET estimations. In fact, WCET estimations become worse than for simpler and less powerful hardware. The main reason is that hardware behavior is deterministic but unknown and, therefore, worst-case behavior must be assumed most of the time, leading to large WCET estimations. The purpose of this project is to develop new hardware designs together with WCET analysis tools able to provide tight and safe WCET estimations. To do so, those pieces of hardware whose behavior is not easily analyzable, due to the lack of accurate information during WCET analysis, will be enhanced to produce a probabilistically analyzable behavior. Thus, even if the worst-case behavior cannot be removed, its probability can be bounded, and hence a safe and tight WCET can be provided for a particular safety level, in line with the safety levels of the remaining components of the system. During the first year of the project we developed most of the evaluation infrastructure as well as the hardware techniques to analyze cache memories. During the second year those techniques were evaluated, and new purely-software techniques were developed.
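A minimal simulation of the probabilistic-WCET idea (the cache model, latencies and exceedance target are invented for illustration): with randomized hardware, execution time becomes a random variable, and a pWCET bound is a high quantile of its distribution rather than an absolute worst case.

```python
import numpy as np

rng = np.random.default_rng(0)
n_accesses, p_miss = 1000, 0.1        # memory accesses, per-access miss probability
hit_t, miss_t = 1, 10                 # cycles for a hit and a miss

# simulate execution time over many runs of a program with a randomized cache
misses = rng.binomial(n_accesses, p_miss, size=100_000)
times = misses * miss_t + (n_accesses - misses) * hit_t

pwcet = np.quantile(times, 1 - 1e-3)  # bound exceeded with probability <= 1e-3
print(pwcet >= times.mean())          # True: the bound sits above typical behavior
```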

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To set local dose reference levels (DRLs) that allow radiologists to control stochastic and deterministic effects. Methods and materials: Dose indicators for cerebral angiographies and hepatic embolizations were collected over 4 months and analyzed in our hospital. The data obtained with an image amplifier were compared with those obtained with a flat panel detector, using the Mann-Whitney test. Results: For the 40 cerebral angiographies performed, the DRLs for DAP, fluoroscopy time and number of images were respectively 166 Gy.cm2, 19 min and 600. The maximum DAP was 490 Gy.cm2 (fluoroscopy time: 84 min). No significant difference in fluoroscopy time and DAP between image amplifier and flat panel detector was observed (p = 0.88). The number of images was larger for the flat panel detector (p = 0.004). The values obtained were slightly over the presently proposed DRLs: 150 Gy.cm2, 15 min, 400. Concerning the 13 hepatic embolizations, the DRLs for DAP, fluoroscopy time and number of images were 315 Gy.cm2, 25 min and 370. The maximum DAP delivered was 845 Gy.cm2 (fluoroscopy time of 48 min). No significant difference between image amplifier and flat panel detector was observed (p = 0.005). The values obtained were also slightly over the presently proposed DRLs: 300 Gy.cm2, 20 min, 200. Conclusion: These results show that the introduction of the flat panel detector did not lead to an increase in patient dose. A DRL concerning the cumulative dose (which allows the deterministic effects to be controlled) should be introduced to give radiologists full control over the risks associated with ionizing radiation. Results of this ongoing study will be presented.

Relevance:

10.00%

Publisher:

Abstract:

Background: The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods: The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the phone follow-up interviews. A pilot study was carried out to test the overall feasibility and to modify some procedures before the field work began. Results: After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and to retain cohort members, we traced 92% of the CHIS participants. Of these, 1,605 subjects answered the follow-up questionnaire. Conclusion: The computerized record linkage maximized the success of the follow-up that was carried out 7 years after the baseline interview. The pilot study was useful to increase the efficiency in tracing and interviewing the respondents.
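The two-stage linkage can be sketched as below (the records, keys, similarity measure and threshold are invented; the study's actual matching rules are not described here): an exact-identifier pass first, then a name-similarity pass for the unmatched remainder.

```python
from difflib import SequenceMatcher

cohort = [
    {"id": 1, "key": "A123", "name": "maria garcia lopez"},
    {"id": 2, "key": "B456", "name": "joan puig ferrer"},
]
registry = [
    {"key": "A123", "name": "maria garcia lopez", "status": "alive"},
    {"key": "X999", "name": "joan puig ferer", "status": "moved"},  # typo in surname
]

def link(cohort, registry, threshold=0.85):
    by_key = {r["key"]: r for r in registry}
    matches = {}
    for person in cohort:
        # 1) deterministic pass: exact identifier match
        if person["key"] in by_key:
            matches[person["id"]] = by_key[person["key"]]
            continue
        # 2) probabilistic pass: best name similarity above a threshold
        sim = lambda r: SequenceMatcher(None, person["name"], r["name"]).ratio()
        best = max(registry, key=sim)
        if sim(best) >= threshold:
            matches[person["id"]] = best
    return matches

m = link(cohort, registry)
print(m[1]["status"], m[2]["status"])  # alive moved
```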

Relevance:

10.00%

Publisher:

Abstract:

The simultaneous use of multiple transmit and receive antennas can unleash very large capacity increases in rich multipath environments. Although such capacities can be approached by layered multi-antenna architectures with per-antenna rate control, the need for short-term feedback arises as a potential impediment, in particular as the number of antennas (and thus the number of rates to be controlled) increases. What we show, however, is that the need for short-term feedback in fact vanishes as the number of antennas and/or the diversity order increases. Specifically, the rate supported by each transmit antenna becomes deterministic and a sole function of the signal-to-noise ratio, the ratio of transmit to receive antennas, and the decoding order, all of which are either fixed or slowly varying. More generally, we illustrate, through this specific derivation, the relevance of some established random CDMA results to the single-user multi-antenna problem.

Relevance:

10.00%

Publisher:

Abstract:

There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose we propose a recurrence quantification analysis measure that allows tracking potentially curved and disrupted traces in cross recurrence plots. We apply this measure to cross recurrence plots constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with higher accuracy than previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Rössler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
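A cross recurrence plot can be sketched in a few lines for scalar series (this toy omits the delay-embedded state space and the musical descriptor extraction used by the authors): entry (i, j) is 1 when x[i] and y[j] are closer than a radius eps, and a near-diagonal trace signals that the two series follow the same trajectory.

```python
import numpy as np

def cross_recurrence(x, y, eps):
    d = np.abs(x[:, None] - y[None, :])   # all pairwise distances
    return (d < eps).astype(int)

t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t)
y = np.sin(t + 0.1)                       # a slightly shifted "cover" of x
crp = cross_recurrence(x, y, eps=0.1)

print(crp.trace())                        # 200: every diagonal entry is recurrent
```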

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The study aimed to compare the cost-effectiveness of concomitant and adjuvant temozolomide (TMZ) for the treatment of newly diagnosed glioblastoma multiforme versus initial radiotherapy alone from a public health care perspective. METHODS: The economic evaluation was performed alongside a randomized, multicenter, phase 3 trial. The primary endpoint of the trial was overall survival. Costs included all direct medical costs. Economic data were collected prospectively for a subgroup of 219 patients (38%). Unit costs for drugs, procedures, laboratory and imaging, radiotherapy, and hospital costs per day were collected from the official national reimbursement lists based on 2004. For the cost-effectiveness analysis, survival was expressed as 2.5-year restricted mean estimates. The incremental cost-effectiveness ratio (ICER) was constructed. Confidence intervals for the ICER were calculated using the Fieller method and bootstrapping. RESULTS: The difference in 2.5-year restricted mean survival between the treatment arms was 0.25 life-years and the ICER was €37,361 per life-year gained, with a 95% confidence interval (CI) ranging from €19,544 to €123,616. The area between the survival curves of the treatment arms suggests an increase of the overall survival gain for a longer follow-up. An extrapolation of the overall survival per treatment arm and imputation of costs for the extrapolated survival showed a substantial reduction in the ICER. CONCLUSIONS: The ICER of €37,361 per life-year gained is a conservative estimate. We concluded that despite the high TMZ acquisition costs, the costs per life-year gained are comparable to accepted first-line treatment with chemotherapy in patients with cancer.
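As a quick arithmetic check of the reported figures (the implied incremental cost per patient is derived here, not stated in the abstract): the ICER is the mean cost difference divided by the mean survival difference in life-years.

```python
# ICER = delta_cost / delta_effect, so the reported figures imply the cost gap
delta_effect = 0.25               # 2.5-year restricted mean survival gain (life-years)
icer = 37361.0                    # EUR per life-year gained, as reported
delta_cost = icer * delta_effect  # implied incremental cost per patient
print(round(delta_cost))          # 9340 EUR
```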

Relevance:

10.00%

Publisher:

Abstract:

T cells belong to two mutually exclusive lineages expressing either alpha beta or gamma delta T-cell receptors (TCR). Although alpha beta and gamma delta cells are known to share a common precursor the role of TCR rearrangement and specificity in the lineage commitment process is controversial. Instructive lineage commitment models endow the alpha beta or gamma delta TCR with a deterministic role in lineage choice, whereas separate lineage models invoke TCR-independent lineage commitment followed by TCR-dependent selection and maturation of alpha beta and gamma delta cells. Here we review the published data pertaining to the role of the TCR in alpha beta/gamma delta lineage commitment and provide some additional information obtained from recent intracellular TCR staining studies. We conclude that a variant of the separate lineage model is best able to accommodate all of the available experimental results.

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model which underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with d ∈ [0, 1). This is an important test in many economic applications because I(d) processes with d < 1 are mean-reverting although, when 0.5 ≤ d < 1, like I(1) processes, they are nonstationary. We show how simple the implementation of the FDF test is in these situations, and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided where the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there is some controversy.
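The fractional difference filter Delta^d at the heart of the FDF regression can be illustrated by its binomial expansion weights (a generic sketch of fractional differencing, not the authors' code): the weights satisfy pi_0 = 1 and pi_k = pi_{k-1} * (k - 1 - d) / k.

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n weights of the fractional difference filter (1 - L)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

# d = 1 recovers the ordinary first difference (1, -1, 0, 0, ...)
print(frac_diff_weights(1.0, 4))   # [ 1. -1.  0.  0.]
# an intermediate d in [0, 1) gives slowly decaying weights (long memory)
print(frac_diff_weights(0.4, 4))   # [ 1.    -0.4   -0.12  -0.064]
```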