950 results for generalized assignment
Abstract:
This contribution compares existing and newly developed techniques for geometrically representing mean-variance-skewness portfolio frontiers, based on the rather widely adopted methodology of polynomial goal programming (PGP) on the one hand and the more recent approach based on the shortage function on the other. Moreover, we explain the workings of these different methodologies in detail and provide graphical illustrations. Inspired by these illustrations, we prove a generalization of the well-known two-fund separation theorem from traditional mean-variance portfolio theory.
Abstract:
Aims Food-deceptive pollination, in which plants do not offer any food reward to their pollinators, is common within the Orchidaceae. As food-deceptive orchids are poorer competitors for pollinator visitation than rewarding orchids, their occurrence in a given habitat may be more constrained than that of rewarding orchids. In particular, the success of deceptive orchids relies strongly on several biotic factors, such as interactions with co-flowering rewarding species and pollinators, which may vary with altitude and over time. Our study compares generalized food-deceptive (i.e. excluding sexually deceptive) and rewarding orchids to test (i) whether deceptive orchids flower earlier than their rewarding counterparts and (ii) whether the relative occurrence of deceptive orchids decreases with increasing altitude. Methods To compare the flowering phenology of rewarding and deceptive orchids, we analysed data compiled from the literature at the species level across the western Palaearctic. Since flowering phenology can be constrained by the latitudinal distribution of the species and by their phylogenetic relationships, we accounted for these factors in our analysis. To compare the altitudinal distribution of rewarding and deceptive orchids, we used field observations made over the entire Swiss territory and over two Swiss mountain ranges. Important Findings We found that deceptive orchid species start flowering earlier than rewarding orchids do, in accordance with the hypotheses of exploitation of naive pollinators and/or avoidance of competition with co-occurring rewarding species. Also, the relative frequency of deceptive orchids decreases with altitude, suggesting that deception may be less profitable at high altitudes than at low ones.
Abstract:
In this paper, we analyze the working experiences of female sports journalists in the French-speaking Swiss daily press. We draw on Bourdieu's theory of habitus and field to examine how structures of power shape these journalists' lives. Based on 27 semistructured interviews and observations in the field, we found that women journalists' work experiences depend on the relationship between their position in the field and their ethos and hexis. We identified three main strategies through which the women journalists negotiated their experiences: (1) conforming to the dominant male ethos, (2) threatening the orthodoxy, and (3) resisting while hijacking the assigned role.
Abstract:
Models of codon evolution have attracted particular interest because of their unique capability to detect selection forces and their high fit when applied to sequence evolution. We describe here a novel approach for modeling codon evolution, based on the Kronecker product of matrices. The 61 × 61 codon substitution rate matrix is created using the Kronecker product of three 4 × 4 nucleotide substitution matrices, the equilibrium frequencies of codons, and the selection rate parameter. The entries of the nucleotide substitution matrices and the selection rate are treated as parameters of the model, which are optimized by maximum likelihood. Our fully mechanistic model allows the instantaneous substitution matrix between codons to be fully estimated with only 19 parameters instead of 3,721, by exploiting the biological interdependence existing between positions within codons. We illustrate the properties of our model using computer simulations and assess its relevance by comparing the AICc measures of our model and of other models of codon evolution on simulations and a large range of empirical data sets. We show that our model fits most biological data better than current codon models do. Furthermore, the parameters in our model can be interpreted in a similar way as the exchangeability rates found in empirical codon models.
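The dimensional core of this construction can be sketched with NumPy. The toy nucleotide matrices, the base ordering, and the indexing below are illustrative assumptions for the sketch, not the model's fitted parameters:

```python
import numpy as np

# Three toy 4 x 4 per-position nucleotide matrices (in the actual model,
# their entries are free parameters estimated by maximum likelihood).
rng = np.random.default_rng(0)
Q1, Q2, Q3 = (rng.random((4, 4)) for _ in range(3))

# Kronecker product of the three matrices: 4 * 4 * 4 = 64 codon states.
Q64 = np.kron(np.kron(Q1, Q2), Q3)

# Dropping the three stop codons (TAA, TAG, TGA) leaves the 61 x 61
# sense-codon matrix referred to in the abstract.
bases = "TCAG"
codons = [a + b + c for a in bases for b in bases for c in bases]
sense = [i for i, c in enumerate(codons) if c not in ("TAA", "TAG", "TGA")]
Q61 = Q64[np.ix_(sense, sense)]
print(Q61.shape)  # (61, 61)
```

In the full model this matrix would still be combined with the codon equilibrium frequencies and the selection rate parameter before being normalized into a proper substitution rate matrix.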
Abstract:
A new graph-based construction of generalized low-density codes (GLD-Tanner) with binary BCH constituents is described. The proposed family of GLD codes is optimal on block erasure channels and quasi-optimal on block fading channels, where optimality is considered in the outage probability sense. A classical GLD code for ergodic channels (e.g., the AWGN channel, the i.i.d. Rayleigh fading channel, and the i.i.d. binary erasure channel) is built by connecting bit nodes and subcode nodes via a single random edge permutation. In the proposed construction of full-diversity GLD codes (referred to as root GLD), bit nodes are divided into 4 classes, subcodes are divided into 2 classes, and the two sides of the Tanner graph are linked via 4 random edge permutations. The study focuses on non-ergodic channels with two states and can easily be extended to channels with 3 states or more.
Abstract:
An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology and provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression (an alternative to stepwise selection of predictors) and methods for identifying interactions through the combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance the application of these models to ecological modeling.
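As a concrete illustration of the GLM machinery these papers build on, here is a minimal binomial GLM (logit link) fitted by iteratively reweighted least squares, the standard fitting algorithm behind GLM software. The presence/absence data and the elevation gradient are invented for the example:

```python
import numpy as np

def fit_logistic_glm(x, y, n_iter=25):
    """Fit a binomial GLM with logit link by iteratively reweighted
    least squares (IRLS). Returns (intercept, slope)."""
    X = np.column_stack([np.ones(len(x)), x])   # design matrix with intercept
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta                          # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))         # inverse logit (fitted probs)
        W = mu * (1.0 - mu)                     # binomial variance weights
        z = eta + (y - mu) / W                  # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Hypothetical data: species presence/absence along an elevation gradient.
rng = np.random.default_rng(1)
elev = rng.uniform(0, 1, 500)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 4.0 * elev)))  # assumed true response
y = rng.binomial(1, p_true)
beta = fit_logistic_glm(elev, y)   # estimates should be near (-2, 4)
```

A GAM would replace the single linear term in the design matrix with a smooth function of elevation, estimated nonparametrically.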
Abstract:
We study the earnings structure and the equilibrium assignment of workers when workers exert intra-firm spillovers on each other. We allow for arbitrary spillovers provided output depends on some aggregate index of workers' skill. Despite the possibility of increasing returns to skills, an equilibrium typically exists. We show that equilibrium will typically be segregated: the skill space can be partitioned into a set of segments, and any firm hires from only one segment. Next, we apply the model to analyze the effect of information technology on segmentation and the distribution of income. There are two types of human capital: productivity and creativity, i.e. the ability to produce ideas that may be duplicated over a network. Under plausible assumptions, inequality rises and then falls as network size increases, and the poorest workers cannot lose. We also analyze the impact of an improvement in worker quality and of increased international mobility of ideas.
Abstract:
We present strategies for chemical shift assignments of large proteins by magic-angle spinning solid-state NMR, using the 21-kDa disulfide-bond-forming enzyme DsbA as prototype. Previous studies have demonstrated that complete de novo assignments are possible for proteins up to approximately 17 kDa, and partial assignments have been performed for several larger proteins. Here we show that combinations of isotopic labeling strategies, high field correlation spectroscopy, and three-dimensional (3D) and four-dimensional (4D) backbone correlation experiments yield highly confident assignments for more than 90% of backbone resonances in DsbA. Samples were prepared as nanocrystalline precipitates by a dialysis procedure, resulting in heterogeneous linewidths below 0.2 ppm. Thus, high magnetic fields, selective decoupling pulse sequences, and sparse isotopic labeling all improved spectral resolution. Assignments by amino acid type were facilitated by particular combinations of pulse sequences and isotopic labeling; for example, transferred echo double resonance experiments enhanced sensitivity for Pro and Gly residues; [2-(13)C]glycerol labeling clarified Val, Ile, and Leu assignments; in-phase anti-phase correlation spectra enabled interpretation of otherwise crowded Glx/Asx side-chain regions; and 3D NCACX experiments on [2-(13)C]glycerol samples provided unique sets of aromatic (Phe, Tyr, and Trp) correlations. Together with high-sensitivity CANCOCA 4D experiments and CANCOCX 3D experiments, unambiguous backbone walks could be performed throughout the majority of the sequence. At 189 residues, DsbA represents the largest monomeric unit for which essentially complete solid-state NMR assignments have so far been achieved. These results will facilitate studies of nanocrystalline DsbA structure and dynamics and will enable analysis of its 41-kDa covalent complex with the membrane protein DsbB, for which we demonstrate a high-resolution two-dimensional (13)C-(13)C spectrum.
Abstract:
Asymptotic chi-squared test statistics for testing the equality of moment vectors are developed. The test statistics proposed are generalized Wald test statistics that specialize for different settings by inserting an appropriate asymptotic variance matrix of sample moments. Scaled test statistics are also considered for dealing with situations of non-iid sampling. The specialization will be carried out for testing the equality of multinomial populations, and the equality of variance and correlation matrices for both normal and non-normal data. When testing the equality of correlation matrices, a scaled version of the normal theory chi-squared statistic is proven to be an asymptotically exact chi-squared statistic in the case of elliptical data.
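In generic notation (ours, not necessarily the paper's), such a statistic takes the familiar Wald form: for sample moment vectors $\hat{m}_1$ and $\hat{m}_2$ and a consistent estimate $\hat{\Gamma}$ of the asymptotic covariance of their difference,

```latex
W \;=\; n\,(\hat{m}_1 - \hat{m}_2)^{\top}\,\hat{\Gamma}^{-1}\,(\hat{m}_1 - \hat{m}_2)
\;\xrightarrow{\;d\;}\; \chi^2_q ,
```

where $q$ is the dimension (or rank) of the moment vector. Inserting different estimators $\hat{\Gamma}$ (multinomial, normal-theory, or distribution-free) specializes the test to each setting, and multiplying $W$ by a suitable correction factor gives a scaled statistic for non-iid sampling.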
Abstract:
To report the case of a child with short absences and occasional myoclonias since infancy who was first diagnosed with an idiopathic generalized epilepsy, but was documented at follow-up to have a mild phenotype of glucose transporter type 1 deficiency syndrome (Glut-1 DS). Unlike other reported cases of Glut-1 DS with epilepsy, this child had normal development as well as normal head growth and neurological examination. Early onset of seizures and later-recognized episodes of mild confusion before meals, together with persistent atypical EEG features and unexpected learning difficulties, led to the diagnosis. Seizure control and neuropsychological improvements were obtained with a ketogenic diet.
Abstract:
Aim This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. Location State of Vaud, western Switzerland. Methods Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp). Results Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that performed better than models fitted with the original sample prevalence; this appeared to be mainly due to the impact of very low prevalence values on evaluation statistics. Removing zeroes beyond the range of presences on the main environmental gradients changed the set of selected predictors, and potentially their response curve shapes. Moreover, removing zeroes slightly improved model performance and stability when compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly. Even better models were obtained when including local spatial autocorrelation. A novel approach to include interactions proved to be an efficient way to account for interactions between all predictors at once.
Main conclusions Models and spatial predictions of 18 forest communities were significantly improved by using either: (1) cross-validation as a model selection method, (2) weighted absences, (3) limited absences, (4) predictors accounting for spatial autocorrelation, or (5) a factor variable accounting for interactions between all predictors. The final choice of model strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice. However, one should not neglect to consider the shapes and interpretability of response curves, as well as the resulting spatial predictions in the final assessment.
Abstract:
A method is offered that makes it possible to apply generalized canonical correlation analysis (CANCOR) to two or more matrices of different row and column order. The new method optimizes the generalized canonical correlation analysis objective by considering only the observed values. This is achieved by employing selection matrices. We present and discuss fit measures to assess the quality of the solutions. In a simulation study we assess the performance of our new method and compare it to an existing procedure called GENCOM, proposed by Green and Carroll. We find that our new method outperforms the GENCOM algorithm both with respect to model fit and recovery of the true structure. Moreover, as our new method does not require any type of iteration, it is easier to implement and requires less computation. We illustrate the method by means of an example concerning the relative positions of the political parties in the Netherlands based on provincial data.
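The selection-matrix device can be illustrated in a few lines of NumPy; the data matrix and the choice of missing row are invented for the example:

```python
import numpy as np

def selection_matrix(observed_rows, n):
    """Rows of the n x n identity corresponding to the observed objects:
    premultiplying a data matrix by it retains only those rows."""
    return np.eye(n)[observed_rows]

X = np.arange(12.0).reshape(4, 3)   # 4 objects x 3 variables; row 1 missing
S = selection_matrix([0, 2, 3], 4)  # selects the three observed rows

X_obs = S @ X                       # 3 x 3 matrix of observed values only
print(X_obs[0])                     # first observed row of X
```

Schematically (notation ours), a least-squares misfit term of the form ||S_k (X_k - G Y_k)||^2 is then evaluated over observed values only, which is what allows matrices of different row and column order to enter one common objective without iteration over imputed values.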
Abstract:
In this paper I explore the issue of nonlinearity (both in the data generation process and in the functional form that establishes the relationship between the parameters and the data) regarding the poor performance of the Generalized Method of Moments (GMM) in small samples. To this purpose I build a sequence of models, starting with a simple linear model and enlarging it progressively until I approximate a standard (nonlinear) neoclassical growth model. I then use simulation techniques to find the small-sample distribution of the GMM estimators in each of the models.
Abstract:
This paper presents a general equilibrium model of money demand where the velocity of money changes in response to endogenous fluctuations in the interest rate. The parameter space can be divided into two subsets: one where velocity is constant and equal to one, as in cash-in-advance models, and another where velocity fluctuates as in Baumol (1952). Despite its simplicity in terms of parameters to calibrate, the model performs surprisingly well. In particular, it approximates the variability of money velocity observed in the U.S. for the post-war period. The model is then used to analyze the welfare costs of inflation under uncertainty. This application quantifies the errors that arise from computing the costs of inflation with deterministic models. It turns out that the size of this difference is small, at least for the levels of uncertainty estimated for the U.S. economy.