46 results for false rejection


Relevance: 10.00%

Abstract:

Contemporary international migration is embedded in a process of global interconnection driven by the revolutions in transport and in information and communication technologies. One consequence of this global interconnection is that migrants have a greater capacity to process information, both before and after leaving. These changes could have unexpected implications for contemporary migration in terms of migrants' ability to make better-informed decisions, the reduction of uncertainty in migratory contexts, the blurring of the concept of distance, and the decision to emigrate to more distant places. This research matters because a lack of knowledge about this question could widen the gap between the goals of migration policies and their outcomes. The role played by information agents in migratory contexts could also change. In this scenario, for migration policies to be more effective, they will have to take into account the migrant population's greater capacity to process information and the information sources it trusts. This article shows that the equation "more information equals better informed" does not always hold. Even in the information age, unreliable sources, false expectations, information overload and rumours are still present in migratory contexts. Nevertheless, we argue that these unintended effects could be reduced by meeting four requirements of reliable information: that it be comprehensive, relevant, trusted and up to date.

Relevance: 10.00%

Abstract:

One of the most effective techniques for offering QoS routing is minimum interference routing. However, it is computationally expensive and is not oriented toward improving the network protection level. In order to provide better levels of protection, new minimum interference routing algorithms are necessary. Minimizing the failure recovery time is also a complex process involving several failure recovery phases, some of which, such as minimizing the failure notification time, depend entirely on correct route selection. The level of protection also involves other aspects, such as the amount of resources used; in this case, shared backup techniques should be considered. Minimum interference techniques should therefore also be modified to include resource sharing for protection among their objectives. These aspects are reviewed and analyzed in this article, and a new proposal combining minimum interference with fast protection using shared segment backups is introduced. Results show that the proposed method reduces both the request rejection ratio and the percentage of bandwidth allocated to backup paths in networks with low and medium protection requirements.
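To make the core idea concrete, here is a minimal sketch of interference-aware path selection, assuming a toy topology in networkx; the unit base cost, the criticality count and the alpha penalty are illustrative placeholders, not the algorithm proposed in the article.

    # Toy interference-aware routing: links that other ingress-egress
    # pairs depend on get a higher cost, so a new LSP avoids "critical"
    # links whenever an alternative exists.
    import networkx as nx

    def link_criticality(G, pairs):
        """Count how many other pairs' shortest paths cross each link."""
        crit = {tuple(sorted(e)): 0 for e in G.edges()}
        for s, t in pairs:
            path = nx.shortest_path(G, s, t)
            for u, v in zip(path, path[1:]):
                crit[tuple(sorted((u, v)))] += 1
        return crit

    def route_lsp(G, src, dst, other_pairs, alpha=1.0):
        """Path minimizing hop count plus an interference penalty."""
        crit = link_criticality(G, other_pairs)
        for u, v in G.edges():
            G[u][v]["cost"] = 1.0 + alpha * crit[tuple(sorted((u, v)))]
        return nx.shortest_path(G, src, dst, weight="cost")

    G = nx.Graph()
    G.add_edges_from([("a", "b"), ("b", "d"), ("a", "c"), ("c", "d")])
    print(route_lsp(G, "a", "d", other_pairs=[("b", "d")]))
    # -> ['a', 'c', 'd'], steering around the link that pair (b, d) needs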

Relevance: 10.00%

Abstract:

Detecting changes between images of the same scene taken at different times is of great interest for monitoring and understanding the environment. Change detection is widely used in on-land applications, but its algorithms require highly accurate geometric and photometric registration, a requirement that has precluded their use in underwater imagery in the past. In this paper, the change detection techniques currently available for on-land applications are analyzed, and a method to automatically detect changes in sequences of underwater images is proposed. Target application scenarios are habitat restoration sites, or area monitoring after sudden impacts from hurricanes or ship groundings. The method is based on the creation of a 3D terrain model from one image sequence over an area of interest. This model allows textured views to be synthesized that correspond to the same viewpoints as a second image sequence. The generated views are photometrically matched and corrected against the corresponding frames from the second sequence, and standard change detection techniques are then applied to find areas of difference. Additionally, the paper shows that false positives resulting from non-rigid objects can be detected by applying the same change detection method to the first sequence exclusively. The developed method was able to correctly find the changes between two challenging sequences of images of a coral reef taken one year apart and acquired with two different cameras.
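As a rough illustration of the last two stages (photometric correction followed by differencing), here is a minimal numpy sketch; the gain/bias matching, the fixed threshold and the synthetic frames are assumptions for the example, not the paper's pipeline.

    # Match a synthesized view's intensities to the real frame (gain +
    # bias), then threshold the absolute difference to flag changes.
    import numpy as np

    def photometric_match(synth, real):
        """Map synth to real's mean and standard deviation."""
        gain = real.std() / (synth.std() + 1e-9)
        return gain * synth + (real.mean() - gain * synth.mean())

    def change_mask(synth, real, thresh=0.15):
        """Pixels whose corrected difference exceeds the threshold."""
        return np.abs(photometric_match(synth, real) - real) > thresh

    rng = np.random.default_rng(0)
    view_t0 = rng.random((64, 64))        # stands in for a synthesized view
    view_t1 = 0.8 * view_t0 + 0.1         # same scene, new illumination
    view_t1[20:30, 20:30] = 1.0           # a genuine change
    print(int(change_mask(view_t0, view_t1).sum()), "changed pixels")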

Relevance: 10.00%

Abstract:

In this paper, a method for enhancing current QoS routing methods by means of QoS protection is presented. In an MPLS network, the segments (links) to be protected are predefined, and an LSP request involves, apart from establishing a working path, creating a specific type of backup path (local, reverse or global). Different QoS parameters, such as network load balancing, resource optimization and minimization of LSP request rejection, should be considered. QoS protection is defined as a function of QoS parameters such as packet loss, restoration time and resource optimization. A framework to add QoS protection to many of the current QoS routing algorithms is introduced, a backup decision module to select the most suitable protection method is formulated, and different case studies are analyzed.
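A backup decision module of this kind can be pictured as a small rule-based selector; the sketch below is an assumed, illustrative formulation (the thresholds, parameter names and qualitative trade-offs are not taken from the paper).

    # Illustrative backup-type selector: local backups recover fastest
    # and lose the fewest packets but consume the most resources; global
    # backups are the opposite; reverse backups sit in between.
    def select_backup(max_packet_loss, max_restoration_ms, spare_capacity):
        """Pick the most resource-frugal backup meeting the LSP's QoS."""
        if max_restoration_ms < 50 or max_packet_loss < 1e-3:
            return "local"
        if max_restoration_ms < 200 and spare_capacity > 0.3:
            return "reverse"
        return "global"

    print(select_backup(max_packet_loss=0.01, max_restoration_ms=30,
                        spare_capacity=0.5))   # -> local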

Relevance: 10.00%

Abstract:

In this paper, different recovery methods applied at different network layers and time scales are used to enhance network reliability. Each layer deploys its own fault management methods; however, current recovery methods are applied to only one specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments proves the efficiency of the proposed methods compared with previous ones, in terms of the resources used to protect the network, the failure recovery time and the request rejection ratio.
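The essence of a partial disjoint path is that the backup only has to avoid the working-path links that are not already protected elsewhere. Here is a minimal networkx sketch of that idea, with a made-up topology; it is not the paper's algorithm.

    # Backup path that must avoid only the UNPROTECTED working-path
    # links; links already protected at a lower layer may be reused.
    import networkx as nx

    def partial_disjoint_backup(G, working_path, protected_links):
        H = G.copy()
        for u, v in zip(working_path, working_path[1:]):
            if (u, v) not in protected_links and (v, u) not in protected_links:
                H.remove_edge(u, v)      # the backup must steer around it
        return nx.shortest_path(H, working_path[0], working_path[-1])

    G = nx.Graph()
    G.add_edges_from([("s", "a"), ("a", "t"), ("s", "b"), ("b", "t")])
    # Link (s, a) is already protected at a lower layer, (a, t) is not:
    print(partial_disjoint_backup(G, ["s", "a", "t"], {("s", "a")}))
    # -> ['s', 'b', 't']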

Relevance: 10.00%

Abstract:

This paper focuses on QoS routing with protection in an MPLS network over an optical layer; in this multi-layer scenario, each layer deploys its own fault management methods. A partially protected optical layer is proposed, with the rest of the network protected at the MPLS layer. New protection schemes that avoid protection duplication are proposed. Moreover, the paper introduces a new traffic classification based on the level of reliability, and the failure impact is evaluated in terms of recovery time depending on the traffic class. The proposed schemes also include a novel variation of minimum interference routing and shared segment backup computation. A complete set of experiments proves that the proposed schemes are more efficient than previous ones in terms of the resources used to protect the network, the failure impact and the request rejection ratio.
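A reliability-based traffic classification of this sort can be sketched as a table mapping each class to a recovery-time budget and a protection scheme; the class names, budgets and scheme assignments below are assumptions for illustration, not the paper's classification.

    # Each class tolerates a different failure impact (recovery time)
    # and is mapped to a correspondingly expensive protection scheme.
    TRAFFIC_CLASSES = {
        # class:    (recovery time achievable in ms, scheme)
        "critical": (50,   "optical-layer protection"),
        "high":     (200,  "MPLS local backup"),
        "normal":   (1000, "MPLS global backup"),
    }

    def assign_scheme(tolerated_recovery_ms):
        """Cheapest scheme whose recovery time fits the flow's tolerance."""
        for name, (ms, scheme) in reversed(list(TRAFFIC_CLASSES.items())):
            if ms <= tolerated_recovery_ms:
                return name, scheme
        return "unprotected", "restoration only (no pre-planned backup)"

    print(assign_scheme(300))   # -> ('high', 'MPLS local backup')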

Relevance: 10.00%

Abstract:

On Friday, May 16, the Ministry of Foreign Affairs of Cuba summoned the newly appointed chargé d'affaires of the European Commission in Havana and announced the withdrawal of its application for membership in the Cotonou Agreement of the African, Caribbean, and Pacific (ACP) countries, in effect renouncing the benefits of European development aid.[1] In a blistering note published in Granma, the official newspaper of the Cuban Communist Party, the government accused the European Commission of exerting undue pressure, of allegedly aligning itself with the policies of the United States, and of censuring the measures taken by Cuba during the previous weeks.[2] In reality, Cuba avoided an embarrassing flat rejection of its application. This was the anticlimactic ending to a long process that can be traced back to the end of the Cold War, a period in which Cuba had been testing alternative grounds to substitute for the overwhelming protection of the Soviet Union.

Relevance: 10.00%

Abstract:

The design of control, estimation or diagnosis algorithms most often assumes that all available process variables represent the system state at the same instant of time. However, this is never true in today's networked systems, because of the unknown deterministic or stochastic transmission delays introduced by the communication network. During the diagnosis stage, this will often generate false alarms: under nominal operation, the different transmission delays associated with the variables that enter the residual computation produce discrepancies of the residuals from zero. A technique aiming at minimising the resulting false alarm rate is proposed, based on the explicit modelling of the communication delays and on their best-case estimation.
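The effect, and the benefit of compensating for it, can be reproduced in a few lines of numpy; the signals, the delay value and the alarm threshold below are fabricated for illustration and are not from the paper.

    # A correlated sensor signal arrives 7 samples late: the naive
    # residual drifts from zero (false alarms), while shifting by the
    # estimated delay before differencing removes most of the bias.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(500)
    u = np.sin(2 * np.pi * t / 100)            # reference variable
    y = u + 0.01 * rng.standard_normal(500)    # fault-free measurement

    y_received = np.roll(y, 7)                 # network delay of 7 samples
    naive = y_received - u                     # residual ignoring the delay
    aligned = np.roll(y_received, -7) - u      # best-case delay compensated

    threshold = 0.1
    print("alarms, naive:  ", int((np.abs(naive) > threshold).sum()))
    print("alarms, aligned:", int((np.abs(aligned) > threshold).sum()))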

Relevance: 10.00%

Abstract:

Uncertainties that are not considered in the analytical model of the plant always dramatically decrease the performance of the fault detection task in practice. To cope with this prevalent problem, in this paper we develop a methodology using Modal Interval Analysis that takes those uncertainties in the plant model into account. A fault detection method based on this model is developed that is robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the major behavior of the fault, which can then be used for fault accommodation. The simulation results clearly demonstrate the capability of the proposed method to accomplish both tasks.
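The mechanism that suppresses false alarms can be illustrated with plain intervals (a simplification of Modal Interval Analysis): the uncertain model parameter is propagated as an interval, and a fault is flagged only when the measurement leaves the predicted envelope. The first-order model, bounds and measurements below are invented for the example.

    # Interval consistency test for the model y[k+1] = a*y[k] + u, where
    # a is known only to lie in [0.45, 0.55]: any measurement explainable
    # by SOME a in that range never raises an alarm.
    A_LO, A_HI = 0.45, 0.55

    def envelope(y, u):
        """Interval prediction of the next output."""
        products = (A_LO * y, A_HI * y)
        return min(products) + u, max(products) + u

    y, u = 1.0, 0.2
    for k, meas in enumerate([0.70, 0.55, 0.80]):   # last one is faulty
        lo, hi = envelope(y, u)
        verdict = "consistent" if lo <= meas <= hi else "FAULT"
        print(f"k={k}: predicted [{lo:.3f}, {hi:.3f}],"
              f" measured {meas:.2f} -> {verdict}")
        y = meas
    # k=0 and k=1 fall inside the envelope; k=2 falls outside.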

Relevance: 10.00%

Abstract:

Background: Searching for associations between genetic variants and complex diseases has been a very active area of research for over two decades. More than 51,000 potential associations have been studied and published, a figure that keeps increasing, especially with the recent explosion of array-based Genome-Wide Association Studies. Even if the number of true associations described so far is high, many of the putative risk variants detected so far have failed to be consistently replicated and are widely considered false positives. Here, we focus on the worldwide patterns of replicability of published association studies. Results: We report three main findings. First, contrary to previous results, genes associated with complex diseases present lower degrees of genetic differentiation among human populations than average genome-wide levels. Second, also contrary to previous results, the differences in replicability of disease-associated loci between Europeans and East Asians are highly correlated with genetic differentiation between these populations. Finally, highly replicated genes present increased levels of high-frequency derived alleles in European and Asian populations when compared to African populations. Conclusions: Our findings highlight the heterogeneous nature of the genetic etiology of complex disease, confirm the importance of the recent evolutionary history of our species in current patterns of disease susceptibility, and could cast doubt on the false-positive status of some associations that have failed to replicate across populations.
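The abstract does not name the differentiation statistic; a standard summary for this kind of analysis is Wright's F_ST, sketched here from per-population allele frequencies (equal population weights assumed, and not necessarily the estimator used in the study):

    # F_ST = (H_T - H_S) / H_T for one biallelic locus, where H_T is the
    # heterozygosity of the pooled population and H_S the average
    # within-population heterozygosity.
    def fst(freqs):
        p_bar = sum(freqs) / len(freqs)
        h_t = 2 * p_bar * (1 - p_bar)
        h_s = sum(2 * p * (1 - p) for p in freqs) / len(freqs)
        return (h_t - h_s) / h_t if h_t > 0 else 0.0

    print(round(fst([0.42, 0.45, 0.40]), 4))  # similar frequencies -> ~0.002
    print(round(fst([0.05, 0.50, 0.90]), 4))  # divergent frequencies -> ~0.48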

Relevance: 10.00%

Abstract:

Functional RNA structures play an important role both in the context of noncoding RNA transcripts and as regulatory elements in mRNAs. Here we present a computational study to detect functional RNA structures within the ENCODE regions of the human genome. Since structural RNAs in general lack characteristic signals in primary sequence, comparative approaches evaluating evolutionary conservation of structures are most promising. We have used three recently introduced programs based on either phylogenetic stochastic context-free grammars (EvoFold) or energy-directed folding (RNAz and AlifoldZ), yielding several thousand candidate structures (corresponding to ∼2.7% of the ENCODE regions). EvoFold has its highest sensitivity in highly conserved and relatively AU-rich regions, while RNAz favors slightly GC-rich regions, resulting in a relatively small overlap between methods. Comparison with the GENCODE annotation points to functional RNAs in all genomic contexts, with a slightly increased density in 3′-UTRs. While we estimate a significant false discovery rate of ∼50%–70%, many of the predictions can be further substantiated by additional criteria: 248 loci are predicted by both RNAz and EvoFold, and an additional 239 RNAz or EvoFold predictions are supported by the (more stringent) AlifoldZ algorithm. Five hundred seventy RNAz structure predictions fall into regions that show signs of selection pressure also at the sequence level (i.e., conserved elements). More than 700 predictions overlap with noncoding transcripts detected by oligonucleotide tiling arrays. One hundred seventy-five selected candidates were tested by RT-PCR in six tissues, and expression could be verified in 43 cases (24.6%).

Relevance: 10.00%

Abstract:

The completion of the sequencing of the mouse genome promises to help predict human genes with greater accuracy. While current ab initio gene prediction programs are remarkably sensitive (i.e., they predict at least a fragment of most genes), their specificity is often low, predicting a large number of false-positive genes in the human genome. Sequence conservation at the protein level with the mouse genome can help eliminate some of those false positives. Here we describe SGP2, a gene prediction program that combines ab initio gene prediction with TBLASTX searches between two genome sequences to provide both sensitive and specific gene predictions. The accuracy of SGP2 when used to predict genes by comparing the human and mouse genomes is assessed on a number of data sets, including single-gene data sets, the highly curated human chromosome 22 predictions, and entire-genome predictions from ENSEMBL. Results indicate that SGP2 outperforms purely ab initio gene prediction methods, and that it works about as well with 3x shotgun data as with fully assembled genomes. SGP2 provides high enough specificity that its predictions can be verified experimentally at a reasonable cost. SGP2 was used to generate a complete set of gene predictions for both human and mouse by comparing the two genomes. Our results suggest that another few thousand human and mouse genes currently not in ENSEMBL are worth verifying experimentally.
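The combination SGP2 performs can be caricatured in a few lines: an ab initio exon score is boosted wherever the candidate overlaps a cross-species similarity hit of the kind TBLASTX reports. The data structures and the flat bonus below are illustrative assumptions, not SGP2's actual scoring scheme.

    # Boost ab initio exon scores that are supported by conservation.
    def overlaps(exon, hits):
        s, e = exon
        return any(s < h_end and h_start < e for h_start, h_end in hits)

    def combined_score(exon, ab_initio_score, hits, bonus=2.0):
        return ab_initio_score + (bonus if overlaps(exon, hits) else 0.0)

    mouse_hits = [(1200, 1450), (3000, 3220)]    # TBLASTX-like alignments
    for exon, score in [((1250, 1400), 3.1),     # conserved candidate
                        ((5000, 5120), 3.4)]:    # unsupported candidate
        print(exon, "->", combined_score(exon, score, mouse_hits))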

Relevance: 10.00%

Abstract:

The recent availability of the chicken genome sequence poses the question of whether there are human protein-coding genes conserved in chicken that are currently not included in the human gene catalog. Here, we show, using comparative gene finding followed by experimental verification of exon pairs by RT–PCR, that the addition to the multi-exonic subset of this catalog could be as little as 0.2%, suggesting that we may be closing in on the human gene set. Our protocol, however, has two shortcomings: (i) the bioinformatic screening of the predicted genes, applied to filter out false positives, cannot handle intronless genes; and (ii) the experimental verification could fail to identify expression at a specific developmental time. This highlights the importance of developing methods that could provide a reliable estimate of the number of these two types of genes.

Relevance: 10.00%

Abstract:

In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs, which shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first is solved here by a piecewise-affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose an extension of the classical "quantized" Gestalt theory to continuous measurements, combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
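In computational Gestalt theory, "controlling the number of false detections" is done a contrario: a candidate structure is accepted only if its expected number of chance occurrences (the NFA) falls below 1. Here is a minimal sketch with an invented test statistic (k of n pixels agreeing with an affine model, each agreeing by chance with probability p); the numbers are illustrative only.

    # NFA = (number of tests) * P(at least k of n successes by chance).
    # Keep a detection only when its NFA is below 1.
    from math import comb

    def binom_tail(k, n, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                   for i in range(k, n + 1))

    def nfa(k, n, p, n_tests):
        return n_tests * binom_tail(k, n, p)

    # 90 of 100 pixels fit the affine model (chance p = 0.5), with
    # 10,000 candidate regions examined:
    print(nfa(90, 100, 0.5, 10_000))   # << 1 -> meaningful detection
    print(nfa(55, 100, 0.5, 10_000))   # >> 1 -> likely accidental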

Relevance: 10.00%

Abstract:

This paper revisits previous work by the authors on the relationship between non-quasi-competitiveness (the increase in price caused by an increase in the number of oligopolists) and the stability of equilibrium in the classical Cournot oligopoly model. Though it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, the authors in their previous work put forward a model in which a situation of monopoly changed to duopoly, losing quasi-competitiveness while maintaining the stability of the equilibrium. That model could not, at the time, be extended to any number of oligopolists. The present paper exhibits such an extension: an oligopoly model is shown in which the loss of quasi-competitiveness persists with as many firms in the market as one wishes, and in which the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is, for the first time, proved false.
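For reference, the textbook symmetric linear Cournot model is quasi-competitive, which is the benchmark the paper's counterexample departs from; the sketch below shows only that standard case (inverse demand p = a - Q, constant unit cost c), not the authors' model.

    # With n identical firms, linear demand p = a - Q and unit cost c,
    # the Cournot equilibrium price is p* = (a + n*c) / (n + 1), which
    # falls toward c as n grows: the quasi-competitive case.
    A, C = 10.0, 2.0

    def cournot_price(n):
        return (A + n * C) / (n + 1)

    for n in (1, 2, 5, 20, 100):
        print(f"{n:3d} firms -> p* = {cournot_price(n):.3f}")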