112 results for CONSISTENCY
Abstract:
The disintegration of recovered paper is the first operation in the preparation of recycled pulp. The defibering process is known to follow first-order kinetics, from which the disintegration kinetic constant (KD) can be obtained in several ways. The constant can be derived from the Somerville index results (%ISV) or from the dissipated energy per volume unit (SS). The %ISV is related to the quantity of non-defibered paper, as a measure of the residual non-disintegrated fiber (percentage of flakes), and is expressed in disintegration time units. In this work, the disintegration kinetics of recycled coated paper was evaluated at a rotor speed of 20 rev/s and at different fiber consistencies (6, 8, 10, 12 and 14%). The experimental disintegration kinetic constant, KD, was obtained through the analysis of the Somerville index as a function of time; as consistency increased, the disintegration time was drastically reduced. The disintegration kinetic constant calculated from Rayleigh's dissipation function (modelled KD) showed a good correlation with the experimental values obtained from the evolution of either the Somerville index or the dissipated energy.
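The first-order model named in the abstract can be made concrete. Below is a minimal sketch, not taken from the paper: it fits S(t) = S0·exp(-KD·t) to Somerville index readings with SciPy to recover KD; the data points, time units and initial guesses are invented for illustration.

```python
# Minimal sketch (hypothetical data): estimating the disintegration
# kinetic constant KD from a first-order decay of the Somerville index.
import numpy as np
from scipy.optimize import curve_fit

def somerville(t, s0, kd):
    """First-order decay of the Somerville index (% flakes) with time."""
    return s0 * np.exp(-kd * t)

# Hypothetical record: disintegration time (min) vs. Somerville index (%).
t = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 30.0])
s = np.array([42.0, 18.5, 8.3, 3.9, 1.7, 0.4])

params, cov = curve_fit(somerville, t, s, p0=(40.0, 0.1))
s0_fit, kd_fit = params
print(f"KD = {kd_fit:.3f} min^-1 (S0 = {s0_fit:.1f} %)")
```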
Abstract:
In response to an increasing need for ever-shorter personality instruments, Gosling, Rentfrow, and Swann (2003) developed the Ten-Item Personality Inventory (TIPI), which measures the dimensions of the Five Factor Model (FFM) using 10 items (two for each dimension) and can be administered in about one minute. In two studies, using a multi-judge (self and observer) and multi-instrument design, we develop Spanish (Castilian) and Catalan versions of the TIPI and evaluate them in terms of internal consistency, test-retest reliability, convergent, discriminant, and content validity, as well as self-observer correlations. Test-retest correlations were strong, and convergence with the NEO-PI-R factors was significant. There were also strong correlations between observer ratings and the participants' self-ratings. Despite some inconsistencies with respect to the Agreeableness scale, the Catalan translation and both translations into Spanish of the original TIPI demonstrated sufficient psychometric properties to warrant use as a Five Factor personality measure when the use of longer instruments is not convenient or possible. Furthermore, as the first translation of a brief standard Big Five instrument into Catalan, this work should facilitate future research on personality in the Catalan-speaking population.
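As a concrete illustration of the reliability indices mentioned, here is a minimal sketch with invented responses; only the scoring rule (two items per dimension, one reverse-keyed on a 7-point scale, per Gosling et al., 2003) comes from the TIPI itself.

```python
# Minimal sketch (hypothetical data): test-retest reliability of one
# two-item TIPI dimension. Each dimension is scored as the mean of a
# direct item and a reverse-keyed item on a 7-point scale.
import numpy as np

def tipi_scale(direct, reverse, scale_max=7):
    """Score one TIPI dimension from its direct and reverse-keyed items."""
    return (direct + (scale_max + 1 - reverse)) / 2.0

# Invented responses of 6 participants at time 1 and time 2.
t1 = tipi_scale(np.array([5, 6, 3, 7, 4, 5]), np.array([2, 1, 5, 1, 4, 3]))
t2 = tipi_scale(np.array([5, 7, 3, 6, 4, 4]), np.array([3, 1, 4, 2, 4, 3]))

r = np.corrcoef(t1, t2)[0, 1]  # test-retest correlation
print(f"test-retest r = {r:.2f}")
```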
Abstract:
Daily precipitation is recorded as the total amount of water collected by a rain-gauge in 24 h. Events are modelled as a Poisson process and the 24 h precipitation by a Generalised Pareto Distribution (GPD) of excesses. Hazard assessment is complete when estimates of the Poisson rate and the distribution parameters, together with a measure of their uncertainty, are obtained. The shape parameter of the GPD determines the support of the variable: the Weibull domain of attraction (DA) corresponds to finite-support variables, as should be the case for natural phenomena. However, the Fréchet DA has been reported for daily precipitation, which implies an infinite support and a heavy-tailed distribution. Bayesian techniques are used to estimate the parameters. The approach is illustrated with precipitation data from the Eastern coast of the Iberian Peninsula, an area affected by severe convective precipitation. The estimated GPD is mainly in the Fréchet DA, which is incompatible with the common-sense assumption that precipitation is a bounded phenomenon. The bounded character of precipitation is then taken as an a priori hypothesis. Consistency of this hypothesis with the data is checked in two cases: using the raw data (in mm) and using log-transformed data. As expected, Bayesian model checking clearly rejects the model in the raw-data case. However, log-transformed data seem to be consistent with the model. This may be due to the adequacy of the log scale for representing positive measurements, for which relative differences are more meaningful than absolute ones.
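The peaks-over-threshold setting is easy to sketch. The snippet below fits a GPD to excesses of a simulated daily record and reads the domain of attraction off the sign of the shape parameter; it uses maximum likelihood as a stand-in for the Bayesian estimation used in the paper, and the data are synthetic.

```python
# Minimal sketch (synthetic data): peaks-over-threshold fit of a
# Generalised Pareto Distribution to daily precipitation excesses.
# Shape xi < 0 -> Weibull DA (bounded support); xi > 0 -> Frechet DA
# (heavy tail, infinite support).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
daily_mm = rng.gamma(shape=0.4, scale=8.0, size=5000)  # stand-in record

threshold = np.quantile(daily_mm, 0.95)
excesses = daily_mm[daily_mm > threshold] - threshold

xi, loc, sigma = genpareto.fit(excesses, floc=0.0)
da = "Frechet DA (heavy tail)" if xi > 0 else "Weibull DA (bounded)"
print(f"shape xi = {xi:.3f}  ->  {da}")
```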
Abstract:
On the domain of general assignment games (with possible reservation prices) the core is axiomatized as the unique solution satisfying two consistency principles: projection consistency and derived consistency. An axiomatic characterization of the nucleolus is also given, as the unique solution that satisfies derived consistency and equal maximum complaint between groups. As a consequence, we obtain a geometric characterization of the nucleolus. Maschler et al. (1979) provide a geometric characterization of the intersection of the kernel and the core of a coalitional game, showing that the allocations that lie in both sets are always the midpoint of a certain bargaining range between each pair of players. In the case of the assignment game, this means that the kernel can be determined as those core allocations where the maximum amount that can be transferred, without getting outside the core, from one agent to his/her optimally matched partner equals the maximum amount that he/she can receive from this partner, also remaining inside the core. We now prove that the nucleolus of the assignment game can be characterized by requiring that this bisection property be satisfied not only for optimally matched pairs but also for optimally matched coalitions. Key words: cooperative games, assignment game, core, nucleolus.
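The bisection property admits a compact statement; the notation below is ours, a sketch rather than the paper's formal definition. For a core allocation (u, v) of an assignment game with characteristic function w, and an optimally matched pair (i, j), the maximal core-preserving transfer is

```latex
\delta_{ij}(u,v) \;=\; \max\bigl\{\,\varepsilon \ge 0 \;:\;
  (u-\varepsilon e_i,\; v+\varepsilon e_j) \in \operatorname{Core}(w)\,\bigr\},
\qquad
\text{kernel: } \delta_{ij}(u,v) = \delta_{ji}(u,v)
\ \text{for every optimally matched pair } (i,j).
```

The paper's characterization strengthens this midpoint condition from optimally matched pairs to optimally matched coalitions, which singles out the nucleolus within the core.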
Abstract:
In this note we introduce the Lorenz stable set and provide an axiomatic characterization in terms of constrained egalitarianism and projection consistency. On the domain of all coalitional games, we find that this solution connects the weak constrained egalitarian solution (Dutta and Ray, 1989) with its strong counterpart (Dutta and Ray, 1991).
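A minimal sketch of the Lorenz-domination comparison underlying Lorenz-based solutions may help; the construction is ours, not the note's. Allocation x Lorenz-dominates y (at equal totals) when every partial sum of the increasingly ordered x is at least that of y, with at least one strict inequality.

```python
# Minimal sketch: Lorenz domination between two allocations with equal
# totals. x dominates y when every cumulative sum of the increasingly
# ordered x is >= that of y, with at least one strict inequality.
import numpy as np

def lorenz_dominates(x, y, tol=1e-12):
    cx = np.cumsum(np.sort(np.asarray(x, dtype=float)))
    cy = np.cumsum(np.sort(np.asarray(y, dtype=float)))
    if abs(cx[-1] - cy[-1]) > tol:   # comparable only at equal totals
        return False
    return bool(np.all(cx >= cy - tol) and np.any(cx > cy + tol))

print(lorenz_dominates([3, 3, 4], [2, 3, 5]))  # True: more egalitarian
print(lorenz_dominates([2, 3, 5], [3, 3, 4]))  # False
```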
Abstract:
The Quaternary Active Faults Database of Iberia (QAFI) is an initiative led by the Institute of Geology and Mines of Spain (IGME) to build a public repository of scientific data on faults with documented activity during the last 2.59 Ma (Quaternary). QAFI also addresses the need to transfer geologic knowledge to practitioners of seismic hazard and risk in Iberia by identifying and characterizing seismogenic fault sources. QAFI is populated with information freely provided by more than 40 Earth science researchers, storing to date a total of 262 records. In this article we describe the development and evolution of the database, as well as its internal architecture. Additionally, a first global analysis of the data is provided, with special focus on the fault length and slip-rate parameters. Finally, the completeness of the database and the internal consistency of the data are discussed. Even though QAFI v.2.0 is the most current resource for calculating fault-related seismic hazard in Iberia, the database is still incomplete and requires further review.
Abstract:
Identifiability of the so-called ω-slice algorithm is proven for ARMA linear systems. Although proofs were developed in the past for the simpler cases of MA and AR models, they were not extendable to general exponential linear systems. The results presented in this paper demonstrate unique features of the ω-slice method: unbiasedness and consistency when the order is overdetermined, regardless of the IIR or FIR nature of the underlying system, and numerical robustness.
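For readers unfamiliar with the setting, the sketch below simulates an ARMA system and fits it with a deliberately overdetermined order. It uses a standard maximum-likelihood estimator from statsmodels, not the ω-slice algorithm itself, and is only meant to fix the ARMA(p, q) notation; the true orders and coefficients are invented.

```python
# Context sketch: standard ML estimation of an ARMA model with an
# overdetermined order (p=2, q=2 for a true ARMA(1,1) system) -- the
# regime in which the paper proves the omega-slice estimates remain
# unbiased and consistent. This is NOT the omega-slice algorithm.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

rng = np.random.default_rng(0)
# True system: (1 - 0.7 L) y_t = (1 + 0.4 L) e_t
ar, ma = np.array([1.0, -0.7]), np.array([1.0, 0.4])
y = ArmaProcess(ar, ma).generate_sample(nsample=2000,
                                        distrvs=rng.standard_normal)

fit = ARIMA(y, order=(2, 0, 2)).fit()
print(fit.params)
```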
Abstract:
Context. The understanding of Galaxy evolution can be facilitated by the use of population synthesis models, which allow one to test hypotheses on the star formation history, stellar evolution, and the chemical and dynamical evolution of the Galaxy. Aims. The new version of the Besançon Galaxy Model (hereafter BGM) aims to provide a more flexible and powerful tool to investigate the Initial Mass Function (IMF) and Star Formation Rate (SFR) of the Galactic disc. Methods. We present a new strategy for the generation of thin-disc stars which treats the IMF, SFR and evolutionary tracks as free parameters. We have updated most of the ingredients of the star count production and, for the first time, binary stars are generated in a consistent way. The new scheme keeps the local dynamical self-consistency of Bienaymé et al. (1987). We then compare simulations from the new model with Tycho-2 data and the local luminosity function, as a first test to verify and constrain the new ingredients. The effects of changing thirteen different ingredients of the model are systematically studied. Results. For the first time, a full-sky comparison is performed between BGM and data. This strategy allows us to constrain the IMF slope at high masses, which is found to be close to 3.0, excluding a shallower slope such as Salpeter's. The SFR is found to be decreasing whatever IMF is assumed. The model is compatible with a local dark matter density of 0.011 M⊙ pc−3, implying that there is no compelling evidence for a significant amount of dark matter in the disc. While the model is fitted to Tycho-2 data, a magnitude-limited sample with V < 11, we check that it is still consistent with fainter stars. Conclusions. The new model constitutes a new basis for further comparisons with large-scale surveys and is being prepared to become a powerful tool for the analysis of the Gaia mission data.
Abstract:
The effects of both barley and Lolium rigidum densities on weed growth and spike production and on crop yield were examined in five field experiments carried out in the Mediterranean drylands of Spain and Western Australia. The aim was to check the consistency of the competitiveness of the crop in different environmental and management conditions. L. rigidum reduced barley yields in most of the experiments (between 0 and 85%), the number of ears per m2 being the most affected. It was found that increasing the barley seeding rate did not reduce the crop losses but did limit weed biomass (between 5 and 61%) and spike production (between 24 and 85%). The variability observed in crop yield losses between sites and seasons was related to rainfall at the beginning of the season. The most sensitive component of yield to weed competition was the number of ears per plant.
Abstract:
This article sets out to show how the study of Boris Vian's poems and songs only makes sense through an analysis of the whole. Behind the apparent dispersion of themes and styles, beneath the superficial, comic and at times even grotesque and banal appearance of his poems and songs, lies a deep, meticulously calculated unity and coherence. To express it, Vian flees the limitations of the logic inherited from Aristotle and opts for the collage technique, juxtaposing a plurality of situations and actions in which the real intermingles with the imaginary. The final result is the representation of a reality that is unique and indivisible yet also relative and singular for each individual, in which apparent antagonisms are revealed as complementary elements in the process of individualization and knowledge of the inner "self".
Abstract:
This article introduces a new interface for T-Coffee, a consistency-based multiple sequence alignment program. This interface provides easy and intuitive access to the most popular functionality of the package, including the default T-Coffee mode for protein and nucleic acid sequences, the M-Coffee mode that allows combining the output of any other aligners, and the template-based modes of T-Coffee that deliver high-accuracy alignments using structural or homology-derived templates. The three available template modes are Expresso, for the alignment of proteins with a known 3D structure; R-Coffee, to align RNA sequences with conserved secondary structures; and PSI-Coffee, to accurately align distantly related sequences using homology extension. The new server benefits from recent improvements to the T-Coffee algorithm, can align up to 150 sequences as long as 10 000 residues, and is available from both http://www.tcoffee.org and its main mirror http://tcoffee.crg.cat.
Abstract:
The Centre d'Art d'Època Moderna (CAEM) of the University of Lleida is a research centre specialized in the scientific study of pictorial heritage. It takes the direct observation of every piece of art as the methodological foundation into which the most traditional techniques of historic-artistic studies are integrated, such as comparative study and archival and documentary research. On the basis of a detailed register of this material evidence, it proposes the construction of historical and interpretative accounts of great consistency and credibility.
Abstract:
Recently, edge matching puzzles, an NP-complete problem, have received considerable attention from wide audiences thanks to cash-prize contests. We consider these competitions not only a challenge for SAT/CSP solving techniques but also an opportunity to showcase the advances of the SAT/CSP community to a general audience. This paper studies the NP-complete problem of edge matching puzzles, focusing on generation models for problem instances of variable hardness and on their resolution through the application of SAT and CSP techniques. On the generation side, we also identify the phase transition phenomena for each model. As solving methods, we employ both SAT solvers, through translation to a SAT formula, and two ad hoc CSP solvers that we have developed, with different levels of consistency, employing several generic and specialized heuristics. Finally, we conducted an extensive experimental investigation to identify the hardest generation models and the best-performing solving techniques.
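To give a flavour of the SAT translation, here is a minimal sketch of the placement core of such an encoding; it is our toy construction, not the paper's full model. Variable x[c][p] is true when piece p sits in cell c, and the clauses force a bijection between pieces and cells; rotations and the edge-colour matching clauses are omitted.

```python
# Minimal sketch: emit DIMACS CNF clauses forcing a bijection between
# n pieces and n cells of an edge-matching grid (toy placement core).
n = 4  # 2x2 puzzle: 4 cells, 4 pieces

def var(c, p):
    return c * n + p + 1  # DIMACS variables are 1-based

clauses = []
for c in range(n):
    clauses.append([var(c, p) for p in range(n)])      # >= 1 piece per cell
    for p in range(n):
        for q in range(p + 1, n):
            clauses.append([-var(c, p), -var(c, q)])   # <= 1 piece per cell
for p in range(n):
    for c in range(n):
        for d in range(c + 1, n):
            clauses.append([-var(c, p), -var(d, p)])   # each piece used once

print(f"p cnf {n * n} {len(clauses)}")
for cl in clauses:
    print(" ".join(map(str, cl)), "0")
```

The resulting CNF can be handed to any off-the-shelf SAT solver; the full encodings in the paper add edge-colour agreement between adjacent cells and rotation variables on top of this placement skeleton.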