67 results for probability of error

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 100.00%

Abstract:

In networks with small buffers, such as networks based on optical packet switching (OPS), the convolution approach (CA) is one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other work oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic in bufferless networks, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) have a high computational cost when the number of connections is large. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with only a small stochastic error in the probability estimation.
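
The paper's UMCA and ISCA algorithms are not reproduced in the abstract; as a minimal sketch of the general idea, the snippet below uses plain Monte Carlo sampling to estimate the congestion probability of a bufferless link shared by heterogeneous on-off connections. The traffic mix, capacity and sample count are illustrative assumptions, not values from the paper.

```python
import random

# Illustrative heterogeneous mix: (peak_rate, activity_probability) per connection.
connections = [(10.0, 0.3)] * 20 + [(2.0, 0.6)] * 50   # hypothetical traffic classes
link_capacity = 180.0                                   # hypothetical capacity

def estimate_congestion_probability(connections, capacity, samples=100_000):
    """Plain Monte Carlo estimate of P(sum of active peak rates > capacity)
    for independent bufferless on-off sources."""
    hits = 0
    for _ in range(samples):
        load = sum(rate for rate, p_on in connections if random.random() < p_on)
        if load > capacity:
            hits += 1
    return hits / samples

if __name__ == "__main__":
    p = estimate_congestion_probability(connections, link_capacity)
    print(f"Estimated congestion probability: {p:.4e}")
```

The estimate carries a stochastic error that shrinks with the number of samples, which is the trade-off against exact convolution methods mentioned in the abstract.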

Relevance: 100.00%

Abstract:

We propose a new econometric estimation method for analyzing the probability of leaving unemployment using uncompleted spells from repeated cross-section data, which can be especially useful when panel data are not available. The proposed method-of-moments-based estimator has two important features: (1) it estimates the exit probability at the individual level, and (2) it does not rely on the stationarity assumption of the inflow composition. We illustrate and gauge the performance of the proposed estimator using the Spanish Labor Force Survey data, and analyze the changes in the distribution of unemployment between the 1980s and 1990s during a period of labor market reform. We find that the relative probability of leaving unemployment of the short-term unemployed versus the long-term unemployed becomes significantly higher in the 1990s.
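
The authors' estimator is precisely what avoids the stationarity assumption; the toy calculation below only illustrates the simpler baseline idea it improves upon: following a synthetic duration cohort across two consecutive cross-sections and reading off the exit probability from the drop in counts. The counts are made up for illustration.

```python
import numpy as np

# Hypothetical counts of unemployed workers by elapsed spell duration (months)
# observed in two consecutive monthly cross-sections of a labor force survey.
counts_t  = np.array([100, 80, 65, 55, 48])   # durations 1..5 at time t
counts_t1 = np.array([ 95, 70, 60, 50, 44])   # durations 1..5 at time t+1

# Under a stationary inflow, workers with duration d+1 at t+1 are the survivors
# of those with duration d at t, so: p_exit(d) = 1 - N_{t+1}(d+1) / N_t(d).
exit_prob = 1.0 - counts_t1[1:] / counts_t[:-1]
for d, p in enumerate(exit_prob, start=1):
    print(f"duration {d} month(s): exit probability = {p:.2f}")
```

When the inflow composition changes over time, as during the Spanish labor market reforms studied in the paper, this naive calculation is biased, which motivates an estimator that drops the stationarity assumption.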

Relevance: 100.00%

Abstract:

The economic literature on crime and punishment focuses on the trade-off between the probability and the severity of punishment, and suggests that detection probability and fines are substitutes. This paper shows that, in the presence of substantial underdeterrence caused by costly detection and punishment, these instruments may become complements. When offenders are poor, the deterrent value of monetary sanctions is low, so the government does not invest much in detection. When offenders are rich, however, the deterrent value of monetary sanctions is high, so it is more profitable to prosecute them.

Relevance: 100.00%

Abstract:

We present a heuristic method for learning error-correcting output code (ECOC) matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, optimal codeword separation is sacrificed in favor of maximum class discrimination within the partitions. The hierarchical partition set is created using a binary tree. As a result, a compact matrix with high discrimination power is obtained. Our method is validated on the UCI repository and applied to a real problem, the classification of traffic sign images.
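
The discriminative split criterion is the core of the paper and is not shown here; the sketch below only illustrates, with a placeholder split function, how a binary-tree partition of the classes translates into a compact ternary ECOC matrix: each internal node contributes one column with +1 for one side of the split, -1 for the other, and 0 for classes outside that node.

```python
import numpy as np

def tree_ecoc_matrix(classes, split_fn):
    """Build a ternary ECOC matrix from a recursive binary partition of `classes`.
    `split_fn(class_list)` returns (left, right); here it is a stand-in for the
    discriminative criterion optimized in the paper."""
    columns = []

    def recurse(node_classes):
        if len(node_classes) < 2:
            return
        left, right = split_fn(node_classes)
        col = {c: 0 for c in classes}      # classes outside the node stay at 0
        for c in left:
            col[c] = +1
        for c in right:
            col[c] = -1
        columns.append(col)
        recurse(left)
        recurse(right)

    recurse(list(classes))
    return np.array([[col[c] for col in columns] for c in classes])

# Placeholder split: cut the class list in half (a real split would maximize a
# discriminative criterion on training data, as described in the abstract).
naive_split = lambda cs: (cs[: len(cs) // 2], cs[len(cs) // 2:])
print(tree_ecoc_matrix(range(4), naive_split))
```

A full binary tree over N classes yields N-1 columns, which is why the resulting matrix is compact compared with dense or exhaustive ECOC designs.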

Relevance: 100.00%

Abstract:

Using event-related brain potentials, the time course of error detection and correction was studied in healthy human subjects. A feedforward model of error correction was used to predict the timing properties of the error and corrective movements. Analysis of the multichannel recordings focused on (1) the error-related negativity (ERN) seen immediately after errors in response- and stimulus-locked averages and (2) the lateralized readiness potential (LRP), which reflects motor preparation. Comparison of the onset and time course of the ERN and LRP components showed that signs of corrective activity preceded the ERN. Thus, error correction was implemented before, or at least in parallel with, the appearance of the ERN component. Also, the amplitude of the ERN component was increased for errors followed by fast corrective movements. The results are compatible with recent views that consider the ERN component the output of an evaluative system engaged in monitoring motor conflict.

Relevance: 100.00%

Abstract:

This project presents the most common channel distribution models that a signal may encounter during transmission. The concept of diversity in terrestrial wireless communications is then introduced and carried over to satellite communications. To analyze the quality of links with diversity, a simulator was built in Matlab that models the basic structure of a communication system (transmitter, channel and receiver). By simulating communications under the different diversity schemes, the quality of each link could be compared. The Alamouti scheme showed a robustness and a low probability of error that make it the best choice when designing a diversity system for satellite communications. It uses channel diversity to exploit every bit of received signal and thus decode the transmitted message.
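
The project's simulator was written in Matlab and is not reproduced here; as a rough Python sketch of the kind of comparison it describes, the snippet below estimates the bit error rate of a 2x1 Alamouti transmit-diversity link with BPSK over flat Rayleigh fading. The SNR definition and power normalization are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ber_alamouti_2x1(snr_db, n_bits=200_000):
    """Simulate BPSK over flat Rayleigh fading with 2x1 Alamouti transmit diversity."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0                              # BPSK symbols +/-1
    s1, s2 = s[0::2], s[1::2]                         # pairs sent over two symbol periods
    h1 = (rng.standard_normal(s1.size) + 1j * rng.standard_normal(s1.size)) / np.sqrt(2)
    h2 = (rng.standard_normal(s1.size) + 1j * rng.standard_normal(s1.size)) / np.sqrt(2)
    sigma = np.sqrt(1 / (2 * snr))
    noise = lambda: sigma * (rng.standard_normal(s1.size) + 1j * rng.standard_normal(s1.size))
    # Each antenna uses half the total power so the scheme is comparable to a 1x1 link.
    r1 = (h1 * s1 + h2 * s2) / np.sqrt(2) + noise()   # first symbol period
    r2 = (-h1 * s2 + h2 * s1) / np.sqrt(2) + noise()  # second period (real BPSK: conj = same)
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)      # Alamouti combining at the receiver
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    decided = np.empty_like(s)
    decided[0::2] = np.sign(s1_hat.real)
    decided[1::2] = np.sign(s2_hat.real)
    return np.mean(decided != s)

for snr_db in (0, 5, 10, 15):
    print(f"SNR {snr_db:2d} dB  BER = {ber_alamouti_2x1(snr_db):.4f}")
```

Running the same experiment with a single transmit antenna shows the steeper BER-versus-SNR slope (diversity gain) that makes Alamouti attractive for satellite links.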

Relevance: 100.00%

Abstract:

The weight evolution of 583 piglets during the lactation and transition periods was studied by means of a statistical analysis, evaluating the effect of supplementation with medium-chain fatty acids (MCFA) in piglets with low birth weight. Of the 375 piglets born with a birth weight (BW) <1250 g, 188 received 3 mL of MCFA every 24 h during the first 3 days of life; their mean weight at weaning (day 28) was lower than that of the control group (non-supplemented piglets) (-114.17 g). However, 106 of the 180 piglets born with a BW <1000 g were supplemented, and their mean weight at weaning and at the end of transition (day 63) was higher than that of the control group (weaning: +315.16 g; day 63: +775.47 g). Finally, the supplemented piglets with BW <800 g had the worst results: their mean weight difference at weaning was -177.58 g with respect to the control group. Therefore, this trial focused on the piglets with a BW between 800 and 999 g, because at weaning the supplemented group showed a considerable mean weight difference with respect to the control group: +511.58 g. Nevertheless, at an error probability below 0.05, there were no significant differences in any of the BW categories analyzed. Even so, it is worth highlighting the high degree of significance of MCFA supplementation in piglets with a BW between 800 and 999 g (P=0.059). On the other hand, the BW of the supplemented group with BW <1000 g was lower than that of the non-supplemented group with BW <1000 g; this BW difference was significant (P=0.004), and as a consequence the degree of significance of MCFA supplementation in piglets with a BW between 800 and 999 g was lower than expected. In addition, this trial included some general results and a simple survival analysis, although that was not its main objective.

Relevance: 90.00%

Abstract:

This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band within which the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility. Narrower bands should give less scope for the exchange rate to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model in which this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating) suggested by some authors.

Relevance: 90.00%

Abstract:

The aim of this paper is to analyze the causes leading to social exclusion dynamics. In particular, we wish to understand why an individual experiencing social exclusion today is much more likely to experience it again. There are two distinct processes that may generate persistence of social exclusion: heterogeneity (individuals differ in observed and/or unobserved adverse characteristics that are relevant for the chance of experiencing social exclusion and that persist over time) and true state dependence (experiencing social exclusion in a given period in itself increases the probability of undergoing social exclusion in subsequent periods). Distinguishing between the two processes is crucial, since their policy implications are very different.
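
The paper's econometric model is not shown here; as an illustration of why the distinction is hard to make from raw data, the simulation below (with made-up parameters) shows that both unobserved heterogeneity alone and true state dependence alone make last period's exclusion status predictive of this period's.

```python
import numpy as np

rng = np.random.default_rng(1)
n_people, n_periods = 5_000, 10

def simulate(heterogeneity_sd=0.0, state_dependence=0.0, base=-1.0):
    """Binary social-exclusion indicator driven by a person effect and/or the lagged state."""
    alpha = rng.normal(0.0, heterogeneity_sd, n_people)   # unobserved individual heterogeneity
    y = np.zeros((n_people, n_periods), dtype=int)
    for t in range(1, n_periods):
        logit = base + alpha + state_dependence * y[:, t - 1]
        p = 1 / (1 + np.exp(-logit))
        y[:, t] = rng.random(n_people) < p
    return y

for label, y in [("heterogeneity only", simulate(heterogeneity_sd=1.5)),
                 ("state dependence only", simulate(state_dependence=1.5))]:
    prev, curr = y[:, :-1].ravel(), y[:, 1:].ravel()
    ratio = curr[prev == 1].mean() / max(curr[prev == 0].mean(), 1e-9)
    print(f"{label}: P(excluded | excluded before) / P(excluded | not) = {ratio:.2f}")
```

Both scenarios produce a ratio well above one, so observed persistence alone cannot tell the two processes apart, which is the identification problem the paper addresses.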

Relevance: 90.00%

Abstract:

As computer chip implementation technologies evolve to deliver more performance, chips use smaller components, with a higher density of transistors, and operate at lower supply voltages. All these factors make chips less robust and increase the probability of a transient fault. A transient fault may occur once and never recur in the same way during a computer system's lifetime. Its consequences can differ: the operating system might abort execution if the change produced by the fault is detected through abnormal application behavior, but the biggest risk is that the fault produces an undetected data corruption that modifies the application's final result without any warning (for example, a bit flip in some crucial data). With the objective of researching transient faults in a computer system's processor registers and memory, we have developed an extension of COTSon, HP and AMD's joint full-system simulation environment. This extension allows the injection of faults that change a single bit in the processor registers and memory of the simulated computer. The developed fault injection system makes it possible to evaluate the effects of single-bit-flip transient faults on an application, analyze an application's robustness against such faults, and validate fault detection mechanisms and strategies.
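
The COTSon extension itself is not shown in the abstract; as a generic illustration of the basic operation such an injector performs, the snippet below flips one randomly chosen bit in a byte buffer, the way a transient fault would corrupt a register or memory word.

```python
import random

def inject_single_bit_flip(memory: bytearray, rng=random.Random(42)):
    """Flip one randomly chosen bit in `memory`, emulating a single-bit transient fault.
    Returns the (byte_index, bit_index) that was corrupted."""
    byte_index = rng.randrange(len(memory))
    bit_index = rng.randrange(8)
    memory[byte_index] ^= 1 << bit_index
    return byte_index, bit_index

# Example: corrupt a simulated 64-byte memory region and observe the change.
region = bytearray(64)
byte_i, bit_i = inject_single_bit_flip(region)
print(f"bit flipped at byte {byte_i}, bit {bit_i}; new value: {region[byte_i]:#04x}")
```

In a full-system simulator the same flip is applied to the simulated processor registers or guest memory at a chosen cycle, and the application's outcome (crash, detected error, silent data corruption, or no effect) is then classified.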

Relevance: 90.00%

Abstract:

The Pyrenean chamois (Rupicapra pyrenaica pyrenaica) is a mountain-dwelling ungulate with an extensive presence in open areas. Optimal group size results from the trade-off between the advantages (a reduction in the risk of predation) and disadvantages (competition between members of the herd) of group living. In addition, these advantages and disadvantages may vary depending on the position of each individual within the herd. Our objective was to study the effect of a central versus peripheral position in the herd on feeding and vigilance behavior in male and female Pyrenean chamois, and to ascertain whether a group size effect existed. We used focal animal sampling and recorded social interactions in which a focal animal was involved. In males, the vigilance rate was higher in the central part of the group than at the periphery, probably due to a higher density of animals in the central part of the herd and a higher probability of being disturbed by conspecifics. In females, the vigilance rate did not differ according to position in the herd. Females spent more time feeding than males, and males showed vigilance behavior more frequently than females. We did not observe a clear relationship between group size and vigilance behavior. The differences in vigilance behavior might be due to social interactions.

Relevance: 90.00%

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine the partition being refined so that the probability density comes to represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities, obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
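
A small numerical illustration of the construction described above may help: partitioning the support of a density assigns a probability to each interval, giving a composition in the simplex, which can then be expressed in Aitchison geometry through the centred log-ratio (clr) transform. The density and partition below are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

# Partition the support of a standard normal density into D = 6 intervals.
edges = np.array([-np.inf, -2.0, -1.0, 0.0, 1.0, 2.0, np.inf])
probs = np.diff(stats.norm.cdf(edges))      # composition: probability of each part
probs /= probs.sum()                        # closure (parts sum to 1)

# Centred log-ratio (clr) coordinates, the usual map into Aitchison geometry.
clr = np.log(probs) - np.log(probs).mean()

print("composition:    ", np.round(probs, 4))
print("clr coordinates:", np.round(clr, 4))
# Refining the partition (more, narrower intervals) approaches the 'continuous
# composition' of probabilities that the abstract generalizes to a Hilbert space.
```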

Relevance: 90.00%

Abstract:

In this paper a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, features that reduce the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
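
The paper's two-step algorithms are not reproduced here; the sketch below only illustrates the underlying routing-on-failure-probability idea: with independent link failures, weighting each link by -log(1 - p_fail) makes a shortest path a maximum-reliability working path, after which a link-disjoint backup can be computed for protection. It assumes the networkx package, and the topology and failure probabilities are made up.

```python
import math
import networkx as nx

# Hypothetical topology: each edge carries an independent failure probability.
edges = [("A", "B", 0.01), ("B", "D", 0.02), ("A", "C", 0.03),
         ("C", "D", 0.01), ("B", "C", 0.05)]
G = nx.Graph()
for u, v, p in edges:
    # Minimizing sum(-log(1 - p)) over a path maximizes its survival probability.
    G.add_edge(u, v, p_fail=p, weight=-math.log(1.0 - p))

# Step 1: most reliable working path.
working = nx.shortest_path(G, "A", "D", weight="weight")

# Step 2: remove the working path's links and find a link-disjoint backup path.
H = G.copy()
H.remove_edges_from(zip(working, working[1:]))
backup = nx.shortest_path(H, "A", "D", weight="weight")

print("working path:", working, " backup path:", backup)
```

Balancing such reliability-driven weights against load and resource usage is where the methodology's trade-off between failure impact and resource consumption comes in.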

Relevance: 90.00%

Abstract:

In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QoS requirements must be applied to all services. Link utilization therefore decreases, because an unnecessarily stringent QoS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). In that case, resources may not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated through the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferable.
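
As a minimal numerical sketch of the convolution approach referred to here (and in the first abstract above): each on-off connection contributes a two-point bandwidth distribution, the aggregate demand distribution is the convolution of these, and the probability of congestion is the tail mass above the VP capacity. The traffic mix and capacity are illustrative, chosen to match the Monte Carlo sketch after the first abstract.

```python
import numpy as np

def congestion_probability(connections, capacity, unit=1.0):
    """Convolution approach: exact distribution of aggregate demand for independent
    on-off sources given as (peak_rate, activity_probability) pairs."""
    n_bins = int(sum(rate for rate, _ in connections) / unit) + 1
    dist = np.zeros(n_bins)
    dist[0] = 1.0                                     # start from zero aggregate demand
    for rate, p_on in connections:
        shifted = np.zeros_like(dist)
        k = int(round(rate / unit))
        shifted[k:] = dist[:n_bins - k]
        dist = (1.0 - p_on) * dist + p_on * shifted   # convolve with the 2-point distribution
    return dist[int(capacity / unit) + 1:].sum()      # P(demand > capacity)

# Illustrative mix of two traffic classes sharing one VP.
mix = [(10.0, 0.3)] * 20 + [(2.0, 0.6)] * 50
print(f"Probability of congestion: {congestion_probability(mix, capacity=180.0):.4e}")
```

The result should roughly agree with the Monte Carlo estimate sketched earlier; the convolution is exact but its cost grows with the number of connections, which is the drawback the Monte Carlo mechanisms of the first paper aim to avoid.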