977 results for Interval generalized set
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
The proposed game is a natural extension of the Shapley and Shubik Assignment Game to the case where each seller owns a set of different objects instead of a single indivisible object. We propose definitions of pairwise stability and group stability adapted to our framework, and we prove that both pairwise and group stable outcomes exist. We study the structure of the group stable set and finally prove that the set of group stable payoffs forms a complete lattice with one optimal group stable payoff for each side of the market.
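For reference, the stability notion being generalized can be stated in the classical one-seller-one-object setting of Shapley and Shubik (a standard textbook formulation, not this paper's multi-object definitions):

```latex
% Buyers i \in B, sellers j \in S, surplus a_{ij} \ge 0 from the trade (i,j).
% An outcome is a matching \mu together with payoffs (u, v); it is pairwise
% stable when matched pairs split their surplus exactly and no buyer-seller
% pair could both do strictly better by trading with each other:
\[
u_i + v_{\mu(i)} = a_{i\mu(i)} \quad \text{for matched pairs},
\qquad
u_i + v_j \ge a_{ij} \quad \forall (i,j) \in B \times S .
\]
```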
Abstract:
There has been recent interest in generalizations of classical factor models, in which the idiosyncratic factors are assumed to be orthogonal and identification restrictions are imposed on the cross-sectional and time dimensions. In this study, we describe and implement a Bayesian approach to generalized factor models. A flexible framework is developed to determine the variations attributed to common and idiosyncratic factors. We also propose a new methodology to select the (generalized) factor model that best fits a given set of data. Applying the proposed methodology to simulated data and foreign exchange rate data, we provide a comparative analysis of the classical and generalized factor models. We find that moving from the classical to the generalized model produces significant changes in the estimates of the covariance and correlation structures, while the changes in the estimates of the factor loadings and the variation attributed to common factors are less dramatic.
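As a point of reference (notation mine, not necessarily the paper's), the classical static factor model and the restriction that the generalized variant relaxes can be written as:

```latex
% x_t: p observables, f_t: k common factors (k << p),
% \Lambda: p x k loading matrix, \varepsilon_t: idiosyncratic component.
\[
x_t = \Lambda f_t + \varepsilon_t ,
\qquad
\operatorname{Cov}(x_t) = \Lambda \Lambda' + \Psi .
\]
% Classical case: \Psi diagonal (orthogonal idiosyncratic factors).
% Generalized case: this restriction is relaxed, so cross-sectional
% correlation among the components of \varepsilon_t must also be estimated.
```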
International consensus conference on PFAPA syndrome: Evaluation of a new set of diagnostic criteria
Abstract:
The PFAPA syndrome is characterized by periodic fever associated with pharyngitis, cervical adenitis and/or aphthous stomatitis, and belongs to the auto-inflammatory diseases. Diagnostic criteria are based on clinical features and the exclusion of other periodic fever syndromes. An analysis of a large cohort of patients has shown weaknesses in these criteria, and there is a lack of international consensus. An International Conference was held in Morges in November 2008 to propose a new set of classification criteria based on a consensus among experts in the field. We aimed to verify the applicability of the new set of classification criteria. 80 patients diagnosed with PFAPA syndrome at 3 pediatric rheumatology centers (Genoa, Lausanne and Geneva) were included in the study. A detailed description of the clinical and laboratory features was obtained. The new classification criteria and the current diagnostic criteria were applied to the patients. Only 43/80 patients (53.8%) fulfilled all criteria of the new classification. 31 patients were excluded because they did not meet one of the 7 diagnostic criteria, 8 because of 2 criteria, and one because of 3 criteria. When we applied the current criteria to the same patients, 11/80 patients (13%) needed to be excluded. 8/80 patients (10%) were excluded from both sets. Exclusion was related only to some of the criteria. Number of patients per unfulfilled criterion (new criteria/current criteria): age (1/6), symptoms between episodes (2/2), delayed growth (3/3), main symptoms (21/0), periodicity, length of fever, interval between episodes, and length of disease (19/0). Some of the new criteria were not easy to apply, as they were both very restrictive and required precise information from the patients. Our work has shown that the new set of classification criteria can be applied to patients suspected of PFAPA syndrome, but it seems to be more restrictive than the current diagnostic criteria. Further validation work is needed to determine whether these criteria allow good discrimination between PFAPA patients and other causes of recurrent fever syndromes.
Abstract:
INTRODUCTION: PFAPA syndrome is characterized by periodic fever associated with pharyngitis, cervical adenitis and/or aphthous stomatitis, and belongs to the auto-inflammatory diseases. Diagnostic criteria are based on clinical features and the exclusion of other periodic fever syndromes. An analysis of a large cohort of patients has shown weaknesses in these criteria, and there is a lack of international consensus. An International Conference was held in Morges in November 2008 to propose a new set of classification criteria based on a consensus among experts in the field.
OBJECTIVE: We aimed to verify the applicability of the new set of classification criteria.
PATIENTS & METHODS: 80 patients diagnosed with PFAPA syndrome at 3 pediatric rheumatology centers (Genoa, Lausanne and Geneva) were included in the study. A detailed description of the clinical and laboratory features was obtained. The new classification criteria and the current diagnostic criteria were applied to the patients.
RESULTS: Only 40/80 patients (50%) fulfilled all criteria of the new classification. 31 patients were excluded because they did not meet one of the 7 diagnostic criteria, 7 because of 2 criteria, and one because of 3 criteria. When we applied the current criteria to the same patients, 11/80 patients (13.7%) needed to be excluded. 8/80 patients (10%) were excluded from both sets. Exclusion was related only to some of the criteria. Number of patients per unfulfilled criterion (new criteria/current criteria): age (1/6), symptoms between episodes (2/2), delayed growth (4/1), main symptoms (21/0), periodicity, length of fever, interval between episodes, and length of disease (20/0). Some of the new criteria were not easy to apply, as they were both very restrictive and required precise information from the patients.
CONCLUSION: Our work has shown that the new set of classification criteria can be applied to patients suspected of PFAPA syndrome, but it seems to be more restrictive than the current diagnostic criteria. Further validation work is needed to determine whether this new set of classification criteria allows good discrimination between PFAPA patients and other causes of recurrent fever syndromes.
Abstract:
The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a Hermite-polynomial-based basis. To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. Thus we propose to use a weighted linear regression approach, where all polynomials up to order k are used as predictor variables and weights are proportional to the reference density. Finally, for the case of 2nd-order Hermite polynomials (normal reference) and 1st-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison among different rocks of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
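A minimal numerical sketch of the weighted-regression idea for the unbounded (Hermite) case; this is one reading of the abstract with an arbitrary example density, using numpy's probabilists' Hermite basis rather than the authors' exact normalization:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from scipy.stats import norm

# Target density f (example: a skewed normal mixture) and the
# standard-normal reference density g on a discretization grid.
x = np.linspace(-6, 6, 2001)
f = 0.7 * norm.pdf(x, -1.0, 0.8) + 0.3 * norm.pdf(x, 1.5, 1.2)
g = norm.pdf(x)

# Represent f by the log-ratio log(f/g) and project it onto the Hermite
# polynomials He_0..He_k by least squares weighted by the reference density,
# instead of discretizing the scalar-product integrals directly.
k = 4
y = np.log(f / g)
H = hermevander(x, k)              # design matrix: He_j(x_i)
w = np.sqrt(g)                     # sqrt-weights implement the weighting
coords, *_ = np.linalg.lstsq(H * w[:, None], y * w, rcond=None)
print(np.round(coords, 4))         # coordinates of f in the Hermite basis
```

The sqrt-weighting makes the ordinary least-squares fit equivalent to a regression weighted by the reference density, which avoids the accuracy loss of discretized scalar products mentioned above.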
Abstract:
This paper describes a new reliable method, based on modal interval analysis (MIA) and set inversion (SI) techniques, for the characterization of solution sets defined by quantified constraint satisfaction problems (QCSPs) over continuous domains. The presented methodology, called quantified set inversion (QSI), can be used on a wide range of engineering problems involving uncertain nonlinear models. Finally, an application to parameter identification is presented.
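Plain (unquantified) set inversion over boxes can be illustrated with a bare-bones SIVIA-style bisection; this generic sketch is not the paper's modal-interval QSI algorithm, and the ring constraint is an invented example:

```python
# Classify boxes as inside / outside / undetermined for the set
# S = {(x, y) : 1 <= x^2 + y^2 <= 4} by recursive bisection.

def sq_range(lo, hi):
    """Interval extension of t -> t^2 over [lo, hi]."""
    a, b = lo * lo, hi * hi
    return (0.0 if lo <= 0.0 <= hi else min(a, b)), max(a, b)

def sivia(box, eps, inside, boundary):
    (x0, x1), (y0, y1) = box
    sx, sy = sq_range(x0, x1), sq_range(y0, y1)
    lo, hi = sx[0] + sy[0], sx[1] + sy[1]   # range of x^2 + y^2 over the box
    if hi < 1.0 or lo > 4.0:                # certainly outside: discard
        return
    if lo >= 1.0 and hi <= 4.0:             # certainly inside
        inside.append(box)
        return
    if max(x1 - x0, y1 - y0) < eps:         # too small to decide
        boundary.append(box)
        return
    if x1 - x0 >= y1 - y0:                  # bisect along the widest side
        m = 0.5 * (x0 + x1)
        sivia(((x0, m), (y0, y1)), eps, inside, boundary)
        sivia(((m, x1), (y0, y1)), eps, inside, boundary)
    else:
        m = 0.5 * (y0 + y1)
        sivia(((x0, x1), (y0, m)), eps, inside, boundary)
        sivia(((x0, x1), (m, y1)), eps, inside, boundary)

inside, boundary = [], []
sivia(((-3.0, 3.0), (-3.0, 3.0)), 0.25, inside, boundary)
print(len(inside), "inner boxes,", len(boundary), "boundary boxes")
```

The paper's QSI builds on this kind of inner/outer box classification but additionally handles universally and existentially quantified constraints via modal intervals.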
Abstract:
Aim This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. Location State of Vaud, western Switzerland. Methods Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp). Results Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that perform better than models fitted with the original sample prevalence. This appeared to be mainly due to the impact of very low prevalence values on evaluation statistics. Removing zeroes beyond the range of presences on main environmental gradients changed the set of selected predictors, and potentially their response curve shape. Moreover, removing zeroes slightly improved model performance and stability when compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly. Even better models were obtained when including local spatial autocorrelation. A novel approach to include interactions proved to be an efficient way to account for interactions between all predictors at once. Main conclusions Models and spatial predictions of 18 forest communities were significantly improved by using either: (1) cross-validation as a model selection method, (2) weighted absences, (3) limited absences, (4) predictors accounting for spatial autocorrelation, or (5) a factor variable accounting for interactions between all predictors. The final choice of model strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice. However, one should not neglect to consider the shapes and interpretability of response curves, as well as the resulting spatial predictions in the final assessment.
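The absence-weighting in point (2) amounts to making presences and absences carry equal total weight; a minimal sketch of how such case weights could be computed (an illustration of the idea, not code from the grasp package):

```python
import numpy as np

def prevalence_weights(y):
    """Case weights that make the weighted prevalence of y (0/1) equal 0.5.

    Presences keep weight 1; each absence is reweighted by the
    presence:absence ratio, so both classes carry equal total weight.
    """
    y = np.asarray(y)
    n_pres, n_abs = y.sum(), (1 - y).sum()
    return np.where(y == 1, 1.0, n_pres / n_abs)

y = np.array([1, 0, 0, 0, 0, 0, 0, 1])       # raw prevalence 0.25
w = prevalence_weights(y)
print(w, (w * y).sum() / w.sum())             # weighted prevalence -> 0.5
```

The resulting vector would then be passed as observation weights when fitting the GAM.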
Abstract:
The Generalized Assignment Problem consists in assigning a set of tasks to a set of agents at minimum cost. Each agent has a limited amount of a single resource, and each task must be assigned to one and only one agent, consuming a certain amount of that agent's resource. We present new metaheuristics for the generalized assignment problem based on hybrid approaches. One metaheuristic is a MAX-MIN Ant System (MMAS), an improved version of the Ant System recently proposed by Stützle and Hoos for combinatorial optimization problems; it can be seen as an adaptive sampling algorithm that takes into account the experience gathered in earlier iterations of the algorithm. This heuristic is further combined with local search and tabu search heuristics to improve the search. A greedy randomized adaptive search procedure (GRASP) is also proposed. Several neighborhoods are studied, including one based on ejection chains that produces good moves without increasing the computational effort. We present computational results comparing their performance, followed by concluding remarks and ideas for future research on generalized assignment related problems.
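As a toy illustration of the GRASP side, here is a greedy randomized construction for a small GAP instance (invented data and a generic restricted-candidate-list scheme, not the paper's implementation):

```python
import random

# Toy GAP instance: COST[a][t] and DEMAND[a][t] of task t on agent a.
COST = [[4, 6, 3, 5], [7, 2, 5, 4], [5, 5, 6, 2]]
DEMAND = [[3, 4, 2, 3], [3, 3, 3, 2], [2, 4, 3, 3]]
CAPACITY = [6, 7, 6]

def grasp_construct(alpha=0.3, seed=0):
    """Build one feasible assignment task -> agent with a randomized greedy.

    At each step, the feasible (agent, task) pairs whose cost is within
    alpha of the best form the restricted candidate list, and one is drawn
    at random.  Returns None if the construction gets stuck.
    """
    rng = random.Random(seed)
    left = list(range(len(COST[0])))           # unassigned tasks
    cap = CAPACITY[:]                           # remaining capacities
    assign = {}
    while left:
        cands = [(COST[a][t], a, t) for t in left
                 for a in range(len(cap)) if DEMAND[a][t] <= cap[a]]
        if not cands:
            return None                         # infeasible partial solution
        best = min(c for c, _, _ in cands)
        worst = max(c for c, _, _ in cands)
        rcl = [(a, t) for c, a, t in cands if c <= best + alpha * (worst - best)]
        a, t = rng.choice(rcl)
        assign[t] = a
        cap[a] -= DEMAND[a][t]
        left.remove(t)
    return assign

sols = [sol for sol in (grasp_construct(seed=s) for s in range(50)) if sol]
best = min(sols, key=lambda s: sum(COST[a][t] for t, a in s.items()))
print(best, sum(COST[a][t] for t, a in best.items()))
```

A full GRASP would follow each construction with a local search phase, for instance over ejection-chain neighborhoods like those studied in the paper.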
Abstract:
The approximants to regular continued fractions constitute `best approximations' to the numbers they converge to, in two senses known as approximations of the first and of the second kind. This property of continued fractions provides a solution to Gosper's problem of the batting average: if the batting average of a baseball player is 0.334, what is the minimum number of times he has been at bat? In this paper, we tackle the inverse question: given a rational number P/Q, what is the set of all numbers for which P/Q is a `best approximation' of one or the other kind? We prove that in both cases these `Optimality Sets' are intervals, and we give a precise description of their endpoints.
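The batting-average puzzle reduces to finding the smallest-denominator fraction among all averages that round to 0.334; a standard continued-fraction-style recursion solves it (my sketch, not the paper's construction):

```python
import math
from fractions import Fraction

def simplest(lo, hi):
    """Smallest-denominator fraction in the closed interval [lo, hi]."""
    n = math.ceil(lo)
    if n <= hi:                       # an integer lies in the interval
        return Fraction(n)
    q = math.floor(lo)                # lo and hi share the same integer part
    # Strip q and invert the fractional parts (which flips the interval):
    # lo = q + 1/x with x ranging over [1/(hi - q), 1/(lo - q)].
    return q + 1 / simplest(1 / (hi - q), 1 / (lo - q))

# Averages that print as 0.334 lie in [0.3335, 0.3345); treating the
# interval as closed is harmless here since the right endpoint (denominator
# 2000) is not itself the smallest-denominator answer.
pq = simplest(Fraction(3335, 10000), Fraction(3345, 10000))
print(pq, float(pq))   # 96/287: at least 287 at-bats (96 hits)
```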
Abstract:
In many research areas (such as public health, environmental contamination, and others) one needs to use data to infer whether some proportion (%) of a population of interest lies below and/or above some threshold, through the computation of a tolerance interval. The idea is, once a threshold is given, to compute the tolerance interval or limit (which might be one- or two-sided) and then to check whether it satisfies the given threshold. Since in this work we deal with the computation of one-sided tolerance intervals, for the two-sided case we recommend, for instance, Krishnamoorthy and Mathew [5]. Krishnamoorthy and Mathew [4] performed the computation of upper tolerance limits in balanced and unbalanced one-way random effects models, whereas Fonseca et al. [3] did so based on similar ideas but in a two-way nested mixed or random effects model. In the random effects case, Fonseca et al. [3] computed such intervals only for balanced data, whereas in the mixed effects case they did so only for unbalanced data. For the computation of two-sided tolerance intervals in models with mixed and/or random effects we recommend, for instance, Sharma and Mathew [7]. The purpose of this paper is the computation of upper and lower tolerance intervals in a two-way nested mixed effects model with balanced data. For the case of unbalanced data, as mentioned above, Fonseca et al. [3] have already computed the upper tolerance interval. Hence, using the notions presented in Fonseca et al. [3] and Krishnamoorthy and Mathew [4], we present some results on the construction of one-sided tolerance intervals for the balanced case: first the construction for the upper limit, and then the construction for the lower one.
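For intuition, the simplest instance of a one-sided tolerance limit is the i.i.d. normal case, where an exact factor based on the noncentral t distribution is classical (a textbook sketch, far simpler than the nested mixed models treated in the paper):

```python
import numpy as np
from scipy.stats import nct, norm

def upper_tolerance_limit(x, p=0.90, conf=0.95):
    """One-sided upper (p, conf) tolerance limit for an i.i.d. normal sample.

    With confidence `conf`, at least a proportion `p` of the population
    lies below the returned limit: limit = mean + k * s, where the exact
    factor k comes from a noncentral t quantile.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    delta = norm.ppf(p) * np.sqrt(n)               # noncentrality parameter
    k = nct.ppf(conf, df=n - 1, nc=delta) / np.sqrt(n)
    return x.mean() + k * x.std(ddof=1)

rng = np.random.default_rng(0)
sample = rng.normal(10.0, 2.0, size=30)
print(upper_tolerance_limit(sample))  # compare against the given threshold
```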
Abstract:
In a previous paper a novel Generalized Multiobjective Multitree model (GMM-model) was proposed. This model considers, for the first time, multitree-multicast load balancing with splitting in a multiobjective context, whose mathematical solution is a whole Pareto-optimal set that can include more solutions than those previously reported in the literature surveyed. To solve the GMM-model, in this paper a multiobjective evolutionary algorithm (MOEA) inspired by the Strength Pareto Evolutionary Algorithm (SPEA) is proposed. Experimental results considering up to 11 different objectives are presented for the well-known NSF network, with two simultaneous data flows.
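At the core of any SPEA-inspired MOEA is Pareto dominance over objective vectors; a minimal nondominated filter looks as follows (a generic building block, not the paper's algorithm):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def nondominated(points):
    """Return the Pareto-optimal subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Toy population with three objectives (e.g. cost, delay, load imbalance).
pop = [(3, 5, 2), (2, 6, 3), (4, 4, 4), (5, 5, 5), (2, 5, 2)]
print(nondominated(pop))   # -> [(4, 4, 4), (2, 5, 2)]
```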
Abstract:
Positive-operator-valued measurements on a finite number N of identically prepared systems of arbitrary spin J are discussed. Pure states are characterized in terms of Bloch-like vectors restricted by an SU(2J+1) covariant constraint. This representation allows for a simple description of the equations to be fulfilled by optimal measurements. We explicitly find the minimal positive-operator-valued measurement for the N=2 case, a rigorous bound for N=3, and set up the analysis for arbitrary N.
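In standard notation (mine, not necessarily the paper's normalization), the objects involved are:

```latex
% A POVM is a set of positive operators resolving the identity:
\[
O_r \ge 0, \qquad \sum_r O_r = \mathbb{1},
\qquad p(r \mid \rho) = \operatorname{Tr}(\rho\, O_r).
\]
% Bloch-like representation of a spin-J density matrix, d = 2J + 1, with
% \lambda_i the d^2 - 1 generators of SU(d), Tr(\lambda_i \lambda_j) = 2\delta_{ij}:
\[
\rho = \frac{\mathbb{1}}{d} + \frac{1}{2} \sum_{i=1}^{d^2 - 1} b_i \lambda_i .
\]
% Purity (Tr \rho^2 = 1) fixes |b|^2 = 2(d-1)/d; pure states satisfy a
% further covariant constraint singling them out on that sphere.
```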
Abstract:
We prove for any pure three-quantum-bit state the existence of local bases which allow one to build a set of five orthogonal product states in terms of which the state can be written in a unique form. This leads to a canonical form which generalizes the two-quantum-bit Schmidt decomposition and is uniquely characterized by five entanglement parameters. It yields a complete classification of three-quantum-bit states, and shows that the right outcome of an adequate local measurement always erases all entanglement between the other two parties.
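The canonical form in question, as it is usually written in the three-qubit generalized-Schmidt-decomposition literature (the placement of the single phase is conventional), reads:

```latex
% Five product states, real amplitudes \lambda_i \ge 0, one phase \varphi:
\[
|\psi\rangle = \lambda_0 |000\rangle
             + \lambda_1 e^{i\varphi} |100\rangle
             + \lambda_2 |101\rangle
             + \lambda_3 |110\rangle
             + \lambda_4 |111\rangle ,
\qquad \sum_i \lambda_i^2 = 1 .
\]
% Normalization removes one of the six real parameters, leaving the five
% entanglement parameters quoted in the abstract.
```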