989 results for multi-constraint assignment
Abstract:
We study the earnings structure and the equilibrium assignment of workers when workers exert intra-firm spillovers on each other. We allow for arbitrary spillovers provided output depends on some aggregate index of workers' skill. Despite the possibility of increasing returns to skills, equilibrium typically exists. We show that equilibrium will typically be segregated: the skill space can be partitioned into a set of segments, and any firm hires from only one segment. Next, we apply the model to analyze the effect of information technology on segmentation and the distribution of income. There are two types of human capital, productivity and creativity, i.e. the ability to produce ideas that may be duplicated over a network. Under plausible assumptions, inequality rises and then falls as network size increases, and the poorest workers cannot lose. We also analyze the impact of an improvement in worker quality and of an increased international mobility of ideas.
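As a purely illustrative instance of such an aggregate index (our notation, not necessarily the authors' specification), firm output could take the form
$$ y = F\Big(\sum_{i \in \text{firm}} \phi(s_i)\Big), $$
where $s_i$ is worker $i$'s skill, $\phi$ captures the spillover channel, and $F$ is increasing. Each worker's marginal product then depends on coworkers' skills through the index, and increasing returns to skill arise when $\phi$ is convex.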
Abstract:
The human brain displays heterogeneous organization in both structure and function. Here we develop a method to characterize brain regions and networks in terms of information-theoretic measures. We look at how these measures scale when larger spatial regions as well as larger connectome sub-networks are considered. This framework is applied to human brain fMRI recordings of resting-state activity and DSI-inferred structural connectivity. We find that strong functional coupling across large spatial distances distinguishes functional hubs from unimodal low-level areas, and that this long-range functional coupling correlates with structural long-range efficiency on the connectome. We also find a set of connectome regions that are both internally integrated and coupled to the rest of the brain, and which resemble previously reported resting-state networks. Finally, we argue that information-theoretic measures are useful for characterizing the functional organization of the brain at multiple scales.
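The abstract does not specify its estimators, but here is a minimal sketch of one way to quantify long-range functional coupling, assuming a bivariate-Gaussian approximation to mutual information; the function names and the 40 mm distance threshold are hypothetical, not taken from the paper:

    import numpy as np

    def gaussian_mi(x, y):
        # Mutual information (nats) under a bivariate-Gaussian assumption:
        # I = -0.5 * ln(1 - rho^2)
        r = np.corrcoef(x, y)[0, 1]
        return -0.5 * np.log(1.0 - r ** 2)

    def long_range_coupling(ts, coords, radius=40.0):
        # ts: (n_regions, n_timepoints) time series; coords: (n_regions, 3) positions (mm).
        # Returns each region's mean MI with all regions farther than `radius` away.
        dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        out = np.zeros(ts.shape[0])
        for i in range(ts.shape[0]):
            far = np.where(dist[i] > radius)[0]
            if far.size:
                out[i] = np.mean([gaussian_mi(ts[i], ts[j]) for j in far])
        return out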
Abstract:
Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviates from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis, and do not require the computation of sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables is presented.
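For orientation (our display; the paper's own conditions and multi-sample setting are more general), the contrast is between the fourth-order-moment estimate of $\Gamma$,
$$ \hat\Gamma = \frac{1}{n}\sum_{i=1}^{n} (d_i - \bar d)(d_i - \bar d)', \qquad d_i = \mathrm{vech}(z_i z_i'), $$
and a substitute built from cross-product moments alone; one familiar instance of the latter, for covariance structures under normality, is
$$ \Omega = 2\, D^{+} (\Sigma \otimes \Sigma)\, D^{+\prime}, $$
with $D^{+}$ the Moore-Penrose inverse of the duplication matrix.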
Abstract:
We present strategies for chemical shift assignments of large proteins by magic-angle spinning solid-state NMR, using the 21-kDa disulfide-bond-forming enzyme DsbA as prototype. Previous studies have demonstrated that complete de novo assignments are possible for proteins up to approximately 17 kDa, and partial assignments have been performed for several larger proteins. Here we show that combinations of isotopic labeling strategies, high field correlation spectroscopy, and three-dimensional (3D) and four-dimensional (4D) backbone correlation experiments yield highly confident assignments for more than 90% of backbone resonances in DsbA. Samples were prepared as nanocrystalline precipitates by a dialysis procedure, resulting in heterogeneous linewidths below 0.2 ppm. Thus, high magnetic fields, selective decoupling pulse sequences, and sparse isotopic labeling all improved spectral resolution. Assignments by amino acid type were facilitated by particular combinations of pulse sequences and isotopic labeling; for example, transferred echo double resonance experiments enhanced sensitivity for Pro and Gly residues; [2-(13)C]glycerol labeling clarified Val, Ile, and Leu assignments; in-phase anti-phase correlation spectra enabled interpretation of otherwise crowded Glx/Asx side-chain regions; and 3D NCACX experiments on [2-(13)C]glycerol samples provided unique sets of aromatic (Phe, Tyr, and Trp) correlations. Together with high-sensitivity CANCOCA 4D experiments and CANCOCX 3D experiments, unambiguous backbone walks could be performed throughout the majority of the sequence. At 189 residues, DsbA represents the largest monomeric unit for which essentially complete solid-state NMR assignments have so far been achieved. These results will facilitate studies of nanocrystalline DsbA structure and dynamics and will enable analysis of its 41-kDa covalent complex with the membrane protein DsbB, for which we demonstrate a high-resolution two-dimensional (13)C-(13)C spectrum.
Abstract:
In moment structure analysis with nonnormal data, asymptotically valid inferences require the computation of a consistent (under general distributional assumptions) estimate of the matrix $\Gamma$ of asymptotic variances of sample second-order moments. Such a consistent estimate involves the fourth-order sample moments of the data. In practice, the use of fourth-order moments leads to computational burden and lack of robustness against small samples. In this paper we show that, under certain assumptions, correct asymptotic inferences can be attained when $\Gamma$ is replaced by a matrix $\Omega$ that involves only the second-order moments of the data. The present paper extends, to the context of multi-sample analysis of second-order moment structures, results derived in the context of (single-sample) covariance structure analysis (Satorra and Bentler, 1990). The results apply to a variety of estimation methods and general types of statistics. An example involving a test of equality of means under covariance restrictions illustrates theoretical aspects of the paper.
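A minimal numerical sketch of the two matrices for a single sample, assuming centered data and taking the normal-theory form of $\Omega$ as the second-order-moment substitute (function names are ours; the paper's $\Omega$ is more general):

    import numpy as np

    def vech(a):
        # Half-vectorization: stack the lower triangle, including the diagonal.
        return a[np.tril_indices(a.shape[0])]

    def gamma_hat(z):
        # Fourth-order-moment estimate of Gamma, the asymptotic covariance
        # matrix of the sample second-order moments. z: (n, p) data.
        z = z - z.mean(axis=0)
        d = np.array([vech(np.outer(zi, zi)) for zi in z])
        return np.cov(d, rowvar=False, bias=True)

    def duplication_pinv(p):
        # Moore-Penrose inverse D+ of the duplication matrix: vech(A) = D+ vec(A).
        rows = list(zip(*np.tril_indices(p)))
        Dp = np.zeros((len(rows), p * p))
        for r, (i, j) in enumerate(rows):
            Dp[r, i * p + j] += 0.5
            Dp[r, j * p + i] += 0.5   # adds up to 1.0 on diagonal entries (i == j)
        return Dp

    def omega_hat(z):
        # Normal-theory Omega = 2 D+ (S kron S) D+', second-order moments only.
        z = z - z.mean(axis=0)
        S = np.cov(z, rowvar=False, bias=True)
        Dp = duplication_pinv(S.shape[0])
        return 2.0 * Dp @ np.kron(S, S) @ Dp.T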
Abstract:
Working Paper no longer available. Please contact the author.
Abstract:
We extend to score, Wald and difference test statistics the scaled and adjusted corrections to goodness-of-fit test statistics developed in Satorra and Bentler (1988a,b). The theory is framed in the general context of multisample analysis of moment structures, under general conditions on the distribution of observable variables. Computational issues, as well as the relation of the scaled and corrected statistics to the asymptotically robust ones, are discussed. A Monte Carlo study illustrates the comparative performance in finite samples of corrected score test statistics.
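As a reminder of the form these corrections take for the goodness-of-fit statistic $T$ in the covariance-structure case (our notation; the paper extends them to score, Wald and difference statistics), with $d$ degrees of freedom, $\hat U$ the residual weight matrix, and $\hat\Gamma$ the estimated asymptotic covariance matrix of the sample moments:
$$ \bar T = \frac{d}{\operatorname{tr}(\hat U \hat\Gamma)}\, T \quad (\text{scaled, referred to } \chi^2_d), \qquad \tilde T = \frac{d'}{\operatorname{tr}(\hat U \hat\Gamma)}\, T, \quad d' = \frac{[\operatorname{tr}(\hat U \hat\Gamma)]^2}{\operatorname{tr}[(\hat U \hat\Gamma)^2]}, $$
the adjusted statistic $\tilde T$ being referred to a chi-square with $d'$ degrees of freedom.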
Abstract:
The HACEK organisms (Haemophilus species, Aggregatibacter species, Cardiobacterium hominis, Eikenella corrodens, and Kingella species) are rare causes of infective endocarditis (IE). The objective of this study is to describe the clinical characteristics and outcomes of patients with HACEK endocarditis (HE) in a large multi-national cohort. Patients hospitalized with definite or possible infective endocarditis by the International Collaboration on Endocarditis Prospective Cohort Study in 64 hospitals from 28 countries were included, and the characteristics of HE patients were compared with those of IE due to other pathogens. Of 5591 patients enrolled, 77 (1.4%) had HE. HE was associated with a younger age (47 vs. 61 years; p<0.001), a higher prevalence of immunologic/vascular manifestations (32% vs. 20%; p<0.008) and stroke (25% vs. 17%; p = 0.05), but a lower prevalence of congestive heart failure (15% vs. 30%; p = 0.004), in-hospital death (4% vs. 18%; p = 0.001) or death after 1 year of follow-up (6% vs. 20%; p = 0.01) than IE due to other pathogens (n = 5514). On multivariable analysis, stroke was associated with mitral valve vegetations (OR 3.60; CI 1.34-9.65; p<0.01) and younger age (OR 0.62; CI 0.49-0.90; p<0.01). The overall outcome of HE was excellent, with in-hospital mortality (4%) significantly better than for non-HE (18%; p<0.001). Prosthetic valve endocarditis was more common in HE (35%) than non-HE (24%). The outcome of both prosthetic valve and native valve HE was excellent whether treated medically or with surgery. Current treatment is very successful for the management of both native valve and prosthetic valve HE, but further studies are needed to determine why HE has a predilection for younger people and a propensity to cause stroke. The small number of patients and the observational design limit inferences on treatment strategies. Self-selection of study sites limits epidemiological inferences.
Abstract:
BACKGROUND AND PURPOSE: Multi-phase postmortem CT angiography (MPMCTA) is increasingly being recognized as a valuable adjunct medicolegal tool to explore the vascular system. Adequate interpretation, however, requires knowledge about the most common technique-related artefacts. The purpose of this study was to identify and index the possible artefacts related to MPMCTA. MATERIAL AND METHODS: An experienced radiologist blinded to all clinical and forensic data retrospectively reviewed 49 MPMCTAs. Each angiographic phase, i.e. arterial, venous and dynamic, was analysed separately to identify phase-specific artefacts based on location and aspect. RESULTS: Incomplete contrast filling of the cerebral venous system was the most commonly encountered artefact, followed by contrast agent layering in the lumen of the thoracic aorta. Enhancement or so-called oedematization of the digestive system mucosa was also frequently observed. CONCLUSION: All MPMCTA artefacts observed and described here are reproducible and easily identifiable. Knowledge about these artefacts is important to avoid misinterpreting them as pathological findings.
Abstract:
Several studies have reported high performance of simple decision heuristics in multi-attribute decision making. In this paper, we focus on situations where attributes are binary and analyze the performance of Deterministic-Elimination-By-Aspects (DEBA) and similar decision heuristics. We consider non-increasing weights and two probabilistic models for the attribute values: one where attribute values are independent Bernoulli random variables; the other where they are binary random variables with inter-attribute positive correlations. Using these models, we show that the good performance of DEBA is explained by the presence of cumulative as opposed to simple dominance. We therefore introduce the concepts of cumulative dominance compliance and full cumulative dominance compliance and show that DEBA satisfies both properties. We derive a lower bound on the probability with which cumulative dominance compliant heuristics will choose a best alternative and show that, even with many attributes, this bound is not small. We also derive an upper bound for the expected loss of fully cumulative dominance compliant heuristics and show that this is moderate even when the number of attributes is large. Both bounds are independent of the values of the weights.
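A minimal sketch of DEBA for the binary case described here, assuming attributes are already ordered by non-increasing weight and that ties left after the last attribute are broken by listing order (both conventions are ours):

    def deba(alternatives):
        # Deterministic Elimination-By-Aspects for binary attributes.
        # alternatives: list of equal-length 0/1 tuples, columns ordered by
        # non-increasing weight. Returns the index of the chosen alternative.
        survivors = list(range(len(alternatives)))
        for a in range(len(alternatives[0])):
            passing = [i for i in survivors if alternatives[i][a] == 1]
            if passing:          # eliminate only when some survivor passes the aspect
                survivors = passing
            if len(survivors) == 1:
                break
        return survivors[0]      # remaining ties broken by listing order

    # deba([(1, 0, 1), (1, 1, 0), (0, 1, 1)]) returns 1:
    # attribute 0 eliminates alternative 2, attribute 1 eliminates alternative 0.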
Abstract:
The Generalized Assignment Problem consists in assigning a set of tasks to a set of agents at minimum cost. Each agent has a limited amount of a single resource, and each task must be assigned to one and only one agent, requiring a certain amount of the agent's resource. We present new metaheuristics for the generalized assignment problem based on hybrid approaches. One metaheuristic is a MAX-MIN Ant System (MMAS), an improved version of the Ant System recently proposed by Stutzle and Hoos for combinatorial optimization problems; it can be seen as an adaptive sampling algorithm that takes into consideration the experience gathered in earlier iterations of the algorithm. Moreover, this heuristic is combined with local search and tabu search heuristics to improve the search. A greedy randomized adaptive search procedure (GRASP) is also proposed. Several neighborhoods are studied, including one based on ejection chains that produces good moves without increasing the computational effort. We present computational results on the comparative performance, followed by concluding remarks and ideas on future research in generalized assignment related problems.
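A sketch of just the randomized construction phase of such a GRASP for the GAP (parameter names and the give-up-on-infeasibility convention are ours; the local search, tabu search and ejection-chain neighborhoods mentioned in the abstract are omitted):

    import random

    def grasp_construct(costs, demand, capacity, alpha=0.3, rng=random):
        # costs[i][j]: cost of task j on agent i; demand[i][j]: resource task j
        # consumes on agent i; capacity[i]: agent i's resource budget.
        # Returns {task: agent} or None when construction gets stuck.
        m, n = len(costs), len(costs[0])
        left = list(capacity)
        assign = {}
        for j in rng.sample(range(n), n):        # tasks in random order
            feasible = [i for i in range(m) if demand[i][j] <= left[i]]
            if not feasible:
                return None                      # a full GRASP would repair or restart
            lo = min(costs[i][j] for i in feasible)
            hi = max(costs[i][j] for i in feasible)
            rcl = [i for i in feasible if costs[i][j] <= lo + alpha * (hi - lo)]
            agent = rng.choice(rcl)              # greedy, randomized over the RCL
            assign[j] = agent
            left[agent] -= demand[agent][j]
        return assign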
Abstract:
This paper describes a methodology to estimate the coefficients, test specification hypotheses and conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics and time variations in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained from the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.
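One way to make the dimensionality-reducing prior concrete, in the spirit of this literature (the display is our illustration, not the paper's exact specification): stack all VAR coefficients at time $t$ into a vector $\delta_t$ and posit
$$ \delta_t = \Xi\,\theta_t + u_t, \qquad \theta_t = \theta_{t-1} + \eta_t, $$
where $\dim(\theta_t) \ll \dim(\delta_t)$, the columns of the loading matrix $\Xi$ pick out common, country-specific and variable-specific components, and the random walk on the low-dimensional factors $\theta_t$ is what puts structure on the time variations.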
Abstract:
The old, understudied electoral system composed of multi-member districts, open ballot and plurality rule is presented as the most remote scene of the origin of both political parties and new electoral systems. A survey of the uses of this set of electoral rules in different parts of the world during remote and recent periods shows how widespread it has been. A model of voting under this electoral system demonstrates that, while it can produce varied and pluralistic representation, it also provides incentives to form factional or partisan candidacies. Famous negative reactions to the emergence of factions and political parties during the 18th and 19th centuries are reinterpreted in this context. Many electoral rules and procedures invented since the second half of the 19th century, including the Australian ballot, single-member districts, limited and cumulative ballots, and proportional representation rules, derived from the search to reduce the effects of the originating multi-member district system in favor of a single-party sweep. The general relations between political parties and electoral systems are restated to account for the foundational stage discussed here.
Abstract:
In this paper we develop two models for an inventory system in which the distributor manages the inventory at the retailers' locations. These types of systems correspond to the Vendor Managed Inventory (VMI) systems described in the literature. Such systems are very common in many different types of industries, such as retailing and manufacturing, although they assume different characteristics. The objective of our model is to minimize the total inventory cost for the distributor in a multi-period, multi-retailer setting. The inventory system includes holding and stock-out costs, and we study the case where an additional fixed setup cost is charged per delivery. We construct a numerical experiment to analyze the model behavior and observe the impact of the characteristics of the model on the solutions.
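A minimal formulation consistent with this description (symbols are ours, not the paper's): with retailers $r$, periods $t$, delivery quantities $q_{rt}$, end-of-period inventory split into on-hand $I^{+}_{rt}$ and backlogged $I^{-}_{rt}$ parts, and a binary $y_{rt}$ flagging a delivery, the distributor solves
$$ \min \sum_{r}\sum_{t} \big( h_r I^{+}_{rt} + p_r I^{-}_{rt} + K y_{rt} \big) $$
subject to $I^{+}_{rt} - I^{-}_{rt} = I^{+}_{r,t-1} - I^{-}_{r,t-1} + q_{rt} - d_{rt}$, $q_{rt} \le M y_{rt}$, $q_{rt}, I^{+}_{rt}, I^{-}_{rt} \ge 0$ and $y_{rt} \in \{0,1\}$, where $h_r$, $p_r$ and $K$ are the holding, stock-out and fixed setup costs and $d_{rt}$ is demand.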