74 results for Discrete components
Abstract:
A recent study defines a new network plane: the knowledge plane. Incorporating the knowledge plane over the network provides more accurate information about the current and future network states. In this paper, the introduction and management of network reliability information in the knowledge plane is proposed in order to improve the quality of service with protection routing algorithms in GMPLS over WDM networks. Several experiments demonstrate the efficiency and scalability of the proposed scheme in terms of the percentage of resources used to protect the network.
Abstract:
This paper examines a dataset which is modeled well by the Poisson-Log Normal process and by this process mixed with Log Normal data, which are both turned into compositions. This generates compositional data that has zeros without any need for conditional models or assuming that there is missing or censored data that needs adjustment. It also enables us to model dependence on covariates and within the composition.
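As a rough illustration of this mechanism (not the paper's own code), a minimal Python/NumPy sketch: Poisson counts with log-normally distributed rates naturally contain zeros, and closing each row to unit sum yields compositions with zeros, with no censoring or imputation step. The dimensions and log-normal parameters (mu, sigma) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and log-normal parameters (not taken from the paper).
n_samples, n_parts = 200, 5
mu, sigma = 1.0, 1.5

# Poisson-Log Normal counts: Poisson draws whose rates are log-normal.
rates = rng.lognormal(mean=mu, sigma=sigma, size=(n_samples, n_parts))
counts = rng.poisson(rates)

# Closure: divide each row by its total to obtain a composition.
totals = counts.sum(axis=1, keepdims=True)
keep = totals[:, 0] > 0                      # drop the rare all-zero rows
compositions = counts[keep] / totals[keep]

print("fraction of zero parts:", (compositions == 0).mean())
```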
Abstract:
Process supervision is the activity focused on monitoring the process operation in order to deduce the conditions needed to maintain normality, including when faults are present. Depending on the number, distribution and heterogeneity of the variables, behaviour situations, sub-processes, etc. of a process, human operators and engineers cannot easily handle the information. This makes it necessary to automate supervision activities. Nevertheless, the difficulty of dealing with this information complicates the design and development of software applications. We present an approach called "integrated supervision systems". It proposes the coordination of multiple supervisors, each supervising a sub-process, whose interactions make it possible to supervise the global process.
Abstract:
A joint distribution of two discrete random variables with finite support can be displayed as a two-way table of probabilities adding to one. Assume that this table has n rows and m columns and all probabilities are non-null. This kind of table can be seen as an element in the simplex of n · m parts. In this context, the marginals are identified as compositional amalgams, and conditionals (rows or columns) as subcompositions. Also, simplicial perturbation appears as Bayes theorem. However, the Euclidean elements of the Aitchison geometry of the simplex can also be translated into the table of probabilities: subspaces, orthogonal projections, distances. Two important questions are addressed: a) given a table of probabilities, which is the nearest independent table to the initial one? b) which is the largest orthogonal projection of a row onto a column? or, equivalently, which is the information in a row explained by a column, thus explaining the interaction? To answer these questions three orthogonal decompositions are presented: (1) by columns and a row-wise geometric marginal, (2) by rows and a column-wise geometric marginal, (3) by independent two-way tables and fully dependent tables representing row-column interaction. An important result is that the nearest independent table is the product of the two (row and column)-wise geometric marginal tables. A corollary is that, in an independent table, the geometric marginals conform with the traditional (arithmetic) marginals. These decompositions can be compared with standard log-linear models.
Key words: balance, compositional data, simplex, Aitchison geometry, composition, orthonormal basis, arithmetic and geometric marginals, amalgam, dependence measure, contingency table
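A minimal numerical sketch of the result stated above, assuming the geometric marginal of a table is the closed vector of geometric means taken across rows or columns; the table itself is a made-up example.

```python
import numpy as np

def closure(x):
    """Normalize a positive array to sum to one (simplex closure)."""
    return x / x.sum()

def gmean(a, axis):
    """Geometric mean along an axis (all entries assumed strictly positive)."""
    return np.exp(np.log(a).mean(axis=axis))

# Hypothetical 3x4 table of strictly positive probabilities (illustrative only).
rng = np.random.default_rng(1)
P = closure(rng.uniform(0.5, 2.0, size=(3, 4)))

# Geometric marginals: closed geometric means taken across columns / across rows.
row_gmarg = closure(gmean(P, axis=1))   # one entry per row
col_gmarg = closure(gmean(P, axis=0))   # one entry per column

# Stated result: the nearest independent table (in the Aitchison sense) is the
# closed outer product of the two geometric marginal vectors.
P_indep = closure(np.outer(row_gmarg, col_gmarg))

# Corollary check: in an independent table, geometric and arithmetic marginals agree.
print(np.allclose(closure(gmean(P_indep, axis=1)), closure(P_indep.sum(axis=1))))
```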
Abstract:
The quality of a manufactured product is an important factor, both for consumers and for the regulatory bodies that define increasingly strict standards. Initiatives such as PAT (Process Analytical Technology) in the pharmaceutical sector respond to these needs. PAT favours the introduction of new analytical techniques that facilitate the in-/on-line monitoring and control of key parameters during production processes. In this respect, NIR-CI (Near Infrared Chemical Imaging) could be a very useful tool for improving quality in the pharmaceutical industry, since it takes the advantages of NIR as an analytical technique (fast, non-invasive, non-destructive) and applies them to the whole spatial surface of the sample. It is a technique capable of providing a large amount of information, both spectral and spatial, in a single image. The aim of this work is to evaluate the capability of the NIR-CI technique as a tool for controlling quality parameters of pharmaceutical tablets. Specifically, the concentration and the distribution of the components (active ingredient and excipients) of a pharmaceutical tablet with and without coating have been analysed quantitatively. In addition, the thickness of the coating film and its distribution over the tablet surface have also been determined. This information is obtained from hyperspectral NIR-CI images of the tablets. The PLS algorithm, as implemented in the Isys 5.0 and Unscrambler 9.8 software packages, was used to extract the data of interest. The Isys version makes it possible to determine the contribution of each pure component at each point of the image, using only the spectrum of the component under study. With the Unscrambler version, on the other hand, a calibration model is built which, from a set of reference samples, predicts the distribution of the coating thickness over the tablet surface.
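The following is not the workflow of Isys or Unscrambler, only a minimal Python sketch of the general idea the abstract describes: calibrating a PLS model on reference spectra and applying it pixel by pixel to a hyperspectral image to map a property (here, coating thickness) over the tablet surface. All array shapes, wavelengths and reference values are hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)

# Hypothetical hyperspectral image: 64x64 pixels, 200 NIR wavelengths.
image = rng.normal(size=(64, 64, 200))

# Hypothetical calibration set: reference spectra with known coating thickness (um).
X_ref = rng.normal(size=(30, 200))
y_ref = rng.uniform(5.0, 25.0, size=30)

# Fit a PLS calibration model on the reference spectra.
pls = PLSRegression(n_components=5)
pls.fit(X_ref, y_ref)

# Predict pixel by pixel and fold the result back into a 64x64 map, giving the
# spatial distribution of the predicted property over the tablet surface.
pixels = image.reshape(-1, 200)
thickness_map = pls.predict(pixels).reshape(64, 64)
print(thickness_map.shape)
```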
Abstract:
In this note, we consider claims problems with indivisible goods. Specifically, by recursively applying the P-rights lower bound (Jiménez-Gómez and Marco-Gil (2008)), we ensure the fulfillment of Weak Order Preservation, considered by many authors as a minimal requirement of fairness. Moreover, we retrieve the Discrete Constrained Equal Losses and the Discrete Constrained Equal Awards rules (Herrero and Martínez (2008)). Finally, by the recursive double imposition of a lower and an upper bound, we obtain the average between them. Keywords: Claims problems, Indivisibilities, Order Preservation, Constrained Egalitarian rules, Midpoint. JEL classification: C71, D63, D71.
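As a rough illustration of the kind of rule discussed (not the formal definition from Herrero and Martínez (2008)), a minimal Python sketch of one selection from a Discrete Constrained Equal Awards-style rule; the tie-breaking by claimant index is an illustrative assumption.

```python
def discrete_cea(claims, estate):
    """Hand out indivisible units one at a time, always to a claimant with the
    currently smallest award whose claim is not yet fully met. The actual rule
    is a correspondence; breaking ties by claimant index is illustrative only."""
    awards = [0] * len(claims)
    for _ in range(estate):
        eligible = [i for i, c in enumerate(claims) if awards[i] < c]
        if not eligible:              # estate exceeds the total of the claims
            break
        i = min(eligible, key=lambda j: awards[j])
        awards[i] += 1
    return awards

# Example: 7 indivisible units, claims (2, 3, 5) -> awards as equal as possible
# subject to no one receiving more than their claim.
print(discrete_cea([2, 3, 5], 7))   # [2, 3, 2] under this tie-breaking
```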
Abstract:
This paper studies the limits of discrete time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that for the "bad" ("good") news model the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of my knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
This paper addresses the application of a PCA analysis to categorical data prior to diagnosing a patient data set using a Case-Based Reasoning (CBR) system. The particularity is that standard PCA techniques are designed to deal with numerical attributes, but our medical data set contains many categorical attributes, so alternative methods such as RS-PCA are required. Thus, we propose to hybridize RS-PCA (Regular Simplex PCA) and a simple CBR. Results show that the hybrid system, when diagnosing a medical data set, produces results similar to those obtained with the original attributes. These results are quite promising since they allow diagnosis with less computational effort and memory storage.
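Not the paper's RS-PCA implementation, but a minimal sketch of the pipeline it outlines: categorical values placed at the vertices of a regular simplex (here via one-hot coordinates, a simple stand-in for the RS-PCA encoding), projected with PCA, and queried with nearest-neighbour retrieval as a bare-bones CBR step. Attribute names, values and diagnoses are invented.

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

# Hypothetical categorical patient records (attributes and values are made up).
cases = np.array([
    ["male",   "smoker",     "high"],
    ["female", "non-smoker", "low"],
    ["female", "smoker",     "high"],
    ["male",   "non-smoker", "medium"],
])
diagnoses = ["A", "B", "A", "B"]

# One-hot coordinates place the values of each attribute at the vertices of a
# regular simplex; a simple stand-in for the RS-PCA encoding step.
encoder = OneHotEncoder(sparse_output=False)
X = encoder.fit_transform(cases)

# Project onto a few principal components, then retrieve the nearest stored
# case and reuse its diagnosis: a bare-bones CBR retrieval step.
pca = PCA(n_components=3)
Z = pca.fit_transform(X)
retriever = NearestNeighbors(n_neighbors=1).fit(Z)

query = encoder.transform([["female", "smoker", "medium"]])
_, idx = retriever.kneighbors(pca.transform(query))
print("retrieved diagnosis:", diagnoses[idx[0][0]])
```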
Abstract:
We developed a procedure that combines three complementary computational methodologies to improve the theoretical description of the electronic structure of nickel oxide. The starting point is a Car-Parrinello molecular dynamics simulation to incorporate vibrorotational degrees of freedom into the material model. By means of complete active space self-consistent field second-order perturbation theory (CASPT2) calculations on embedded clusters extracted from the resulting trajectory, we describe localized spectroscopic phenomena on NiO with an efficient treatment of electron correlation. The inclusion of thermal motion into the theoretical description allows us to study electronic transitions that, otherwise, would be dipole forbidden in the ideal structure and results in a natural reproduction of the band broadening. Moreover, we improved the embedded cluster model by incorporating self-consistently at the complete active space self-consistent field (CASSCF) level a discrete (or direct) reaction field (DRF) in the cluster surroundings. The DRF approach offers an efficient treatment of electric response effects of the crystalline embedding to the electronic transitions localized in the cluster. We offer accurate theoretical estimates of the absorption spectrum and the density of states around the Fermi level of NiO, and a comprehensive explanation of the source of the broadening and the relaxation of the charge transfer states due to the adaptation of the environment.
Abstract:
A comparison of the local effects of the basis set superposition error (BSSE) on the electron densities and energy components of three representative H-bonded complexes was carried out. The electron densities were obtained with Hartree-Fock and density functional theory versions of the chemical Hamiltonian approach (CHA) methodology. It was shown that the effects of the BSSE were common to all complexes studied. The electron density difference maps and the chemical energy component analysis (CECA) confirmed that the local effects of the BSSE were different when diffuse functions were present in the calculations.
Abstract:
The article presents and discusses estimates of social and economic indicators for Italy's regions in benchmark years roughly from Unification to the present day: life expectancy, education, GDP per capita at purchasing power parity, and the new Human Development Index (HDI). A broad interpretative hypothesis, based on the distinction between passive and active modernization, is proposed to account for the evolution of regional imbalances over the long run. In the absence of active modernization, Southern Italy converged thanks to passive modernization, i.e., State intervention: however, this was more effective in life expectancy, less successful in education, and expensive and as a whole ineffective in GDP. As a consequence, convergence in the HDI occurred from the late XIX century to the 1970s, but came to a sudden halt in the last decades of the XX century.
Abstract:
In this work we describe the usage of bilinear statistical models as a means of factoring the shape variability into two components attributed to inter-subject variation and to the intrinsic dynamics of the human heart. We show that it is feasible to reconstruct the shape of the heart at discrete points in the cardiac cycle. Provided we are given a small number of shape instances representing the same heart at different points in the same cycle, we can use the bilinear model to establish this. Using a temporal and a spatial alignment step in the preprocessing of the shapes, around half of the reconstruction errors were on the order of the axial image resolution of 2 mm, and over 90% were within 3.5 mm. From this, we conclude that the dynamics were indeed separated from the inter-subject variability in our dataset.
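As an illustration of the bilinear idea (not the paper's exact model or data), a minimal sketch of an asymmetric bilinear factorization fitted by SVD, separating a subject-specific factor from a phase-specific factor; all dimensions and shape vectors are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: 6 subjects ("style"), 8 cardiac phases ("content"),
# each shape a vector of 30 landmark coordinates (values are synthetic).
n_subj, n_phase, dim, J = 6, 8, 30, 4
shapes = rng.normal(size=(n_subj, n_phase, dim))

# Asymmetric bilinear model: stack the per-subject shape vectors vertically and
# factor with an SVD, keeping J interaction terms.
Y = shapes.transpose(0, 2, 1).reshape(n_subj * dim, n_phase)   # (subj*dim, phase)
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
subject_maps = U[:, :J].reshape(n_subj, dim, J)                # per-subject factors
phase_codes = (np.diag(s[:J]) @ Vt[:J]).T                      # per-phase factors

# Reconstruct the shape of subject 2 at phase 5 from the factored model.
recon = subject_maps[2] @ phase_codes[5]
print(np.linalg.norm(recon - shapes[2, 5]))
```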
Abstract:
This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model which underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with d ∈ [0, 1). This is an important test in many economic applications because I(d) processes with d < 1 are mean-reverting although, when 0.5 ≤ d < 1, like I(1) processes, they are nonstationary. We show how simple the implementation of the FDF test is in these situations, and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided where the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there is some controversy.
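For context, a sketch of the FDF auxiliary regression in the form introduced by Dolado, Gonzalo and Mayoral, on which the test builds; the deterministic terms shown (a constant and a linear trend) are illustrative only, since how such components should enter is exactly what the paper examines.

```latex
% FDF auxiliary regression (sketch); mu and beta*t are illustrative deterministic terms.
\[
  \Delta y_t = \mu + \beta t + \phi\, \Delta^{d} y_{t-1} + \varepsilon_t ,
  \qquad
  H_0:\ \phi = 0 \ (y_t \sim I(1))
  \quad \text{vs.} \quad
  H_1:\ \phi < 0 \ (y_t \sim I(d),\ d < 1).
\]
```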
Abstract:
The first generation models of currency crises have often been criticized because they predict that, in the absence of very large triggering shocks, currency attacks should be predictable and lead to small devaluations. This paper shows that these features of first generation models are not robust to the inclusion of private information. In particular, this paper analyzes a generalization of the Krugman-Flood-Garber (KFG) model, which relaxes the assumption that all consumers are perfectly informed about the level of fundamentals. In this environment, the KFG equilibrium of zero devaluation is only one of many possible equilibria. In all the other equilibria, the lack of perfect information delays the attack on the currency past the point at which the shadow exchange rate equals the peg, giving rise to unpredictable and discrete devaluations.