962 results for theory of constraints


Relevance: 90.00%

Abstract:

A summary of the constraints from the ATLAS experiment on R-parity-conserving supersymmetry is presented. Results from 22 separate ATLAS searches are considered, each based on analysis of up to 20.3 fb⁻¹ of proton-proton collision data at centre-of-mass energies of √s = 7 and 8 TeV at the Large Hadron Collider. The results are interpreted in the context of the 19-parameter phenomenological minimal supersymmetric standard model, in which the lightest supersymmetric particle is a neutralino, taking into account constraints from previous precision electroweak and flavour measurements as well as from dark matter related measurements. The results are presented in terms of constraints on supersymmetric particle masses and are compared to limits from simplified models. The impact of ATLAS searches on parameters such as the dark matter relic density, the couplings of the observed Higgs boson, and the degree of electroweak fine-tuning is also shown. Spectra for surviving supersymmetry model points with low fine-tunings are presented.
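
As an illustration of the dark-matter-related constraints mentioned above, scans of this kind typically require the lightest neutralino not to exceed the measured dark-matter abundance. In generic notation (the exact bound and its treatment in the ATLAS study may differ), this reads

\[
\Omega_{\tilde{\chi}_1^0} h^2 \;\le\; \Omega_{\mathrm{DM}} h^2 \approx 0.12 ,
\]

with under-abundant model points usually retained on the assumption that another component supplies the remaining dark matter.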

Relevance: 90.00%

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of core metabolic model; (7) generation of biomass composition reaction; (8) completion of draft metabolic model; (9) curation of metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
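
Because the finished reconstruction is used for constraint-based predictions such as flux balance analysis, a minimal FBA sketch may make the role of the constraints concrete. The toy network, reaction names, stoichiometry, and bounds below are invented for illustration and are not part of the chapter's ten-step pipeline:

import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions); steady state requires S @ v = 0.
# Columns (hypothetical): nutrient uptake, conversion, biomass formation.
S = np.array([
    [1, -1,  0],   # metabolite A: made by uptake, used by conversion
    [0,  1, -1],   # metabolite B: made by conversion, used by biomass
])

bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 flux units
c = np.array([0, 0, -1])                   # maximize biomass flux = minimize -v[2]

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
print("optimal biomass flux:", -res.fun)   # 10.0, limited by the uptake constraint

A genome-scale model is the same linear program with thousands of reactions, a biomass reaction assembled in step (7), and the regulatory constraints from step (10) imposed as additional flux bounds.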

Relevance: 90.00%

Abstract:

The paper develops a stability theory for the optimal value and the optimal set mapping of optimization problems posed in a Banach space. The problems considered in this paper have an arbitrary number of inequality constraints involving lower semicontinuous (not necessarily convex) functions and one closed abstract constraint set. The considered perturbations lead to problems of the same type as the nominal one (with the same space of variables and the same number of constraints), where the abstract constraint set can also be perturbed. The spaces of functions involved in the problems (objective and constraints) are equipped with the metric of uniform convergence on bounded sets, while the space of closed sets is, consistently, endowed with the Attouch-Wets topology. The paper examines, in a unified way, the lower and upper semicontinuity of the optimal value function, and the closedness, lower and upper semicontinuity (in the sense of Berge) of the optimal set mapping. This paper can be seen as a second part of the stability theory presented in [17], where we studied the stability of the feasible set mapping (completed here with the analysis of the Lipschitz-like property).
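
For orientation, and in notation assumed here rather than taken from the paper, the class of problems just described can be written as

\[
P:\qquad \inf_{x \in C} f(x) \quad \text{subject to} \quad f_t(x) \le 0,\; t \in T,
\]

where $C$ is the closed abstract constraint set, $T$ is an arbitrary index set, and $f$ and each $f_t$ are lower semicontinuous functions on the Banach space; perturbations act on $f$ and the $f_t$ (uniform convergence on bounded sets) and on $C$ (Attouch-Wets topology).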

Relevance: 90.00%

Abstract:

Trait decoupling, wherein evolutionary release of constraints permits specialization of formerly integrated structures, represents a major conceptual framework for interpreting patterns of organismal diversity. However, few empirical tests of this hypothesis exist. A central prediction, that the tempo of morphological evolution and ecological diversification should increase following decoupling events, remains inadequately tested. In damselfishes (Pomacentridae), a ceratomandibular ligament links the hyoid bar and lower jaws, coupling two main morphofunctional units directly involved in both feeding and sound production. Here, we test the decoupling hypothesis by examining the evolutionary consequences of the loss of the ceratomandibular ligament in multiple damselfish lineages. As predicted, we find that rates of morphological evolution of trophic structures increased following the loss of the ligament. However, this increase in evolutionary rate is not associated with an increase in trophic breadth, but rather with morphofunctional specialization for the capture of zooplanktonic prey. Lineages lacking the ceratomandibular ligament also show different acoustic signals (i.e. higher variation of pulse periods) from others, resulting in an increase in acoustic diversity across the family. Our results support the idea that trait decoupling can increase morphological and behavioural diversity through increased specialization rather than the generation of novel ecotypes.

Relevance: 90.00%

Abstract:

The prevalence of obesity is rising progressively, even among older age groups. By 2030-2035, over 20% of the adult US population and over 25% of Europeans will be aged 65 years and older. The predicted prevalence of obesity in Americans aged 60 years and older was 37% in 2010, and the predicted prevalence of obesity in Europe in 2015 varies between 20 and 30%, depending on the model used. This corresponds to 20.9 million obese people aged 60+ in the USA in 2010 and 32 million obese elders in the EU in 2015. Although cut-off values of BMI, waist circumference and percentage of fat mass have not been defined for the elderly (nor for the elderly of different ethnicities), several meta-analyses make clear that the mortality and morbidity associated with overweight and obesity only increase at a BMI above 30 kg/m². Treatment should therefore only be offered to patients who are obese rather than overweight and who also have functional impairments, metabolic complications or obesity-related diseases that can benefit from weight loss. Weight loss therapy should aim to minimize muscle and bone loss; vigilance regarding the development of sarcopenic obesity - a combination of an unhealthy excess of body fat with a detrimental loss of muscle and fat-free mass, including bone - is also important in the elderly, who are vulnerable to this outcome. Lifestyle intervention should be the first step and consists of a diet with a 500 kcal (2.1 MJ) energy deficit and an adequate intake of protein of high biological quality, together with calcium and vitamin D, behavioural therapy and multi-component exercise. Multi-component exercise includes flexibility training, balance training, aerobic exercise and resistance training. The adherence rate in most studies is around 75%. Knowledge of the constraints and modulators of physical inactivity should help to engage the elderly in physical activity. The role of pharmacotherapy and bariatric surgery in the elderly is largely unknown, as most studies have excluded people aged 65 years and older.
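
For reference, the figures quoted above follow from simple unit arithmetic; the short sketch below (with an invented example patient) shows the kcal-to-MJ conversion and the BMI threshold calculation:

# Daily energy deficit: 500 kcal expressed in megajoules (1 kcal = 4.184 kJ).
deficit_kcal = 500
deficit_mj = deficit_kcal * 4.184 / 1000
print(f"{deficit_mj:.1f} MJ")                  # ~2.1 MJ, as quoted above

# BMI = weight (kg) / height (m)^2; treatment is considered above 30 kg/m^2.
weight_kg, height_m = 95.0, 1.70               # hypothetical patient
bmi = weight_kg / height_m ** 2
print(f"BMI = {bmi:.1f} kg/m^2 -> obese: {bmi > 30}")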

Relevance: 90.00%

Abstract:

We discuss reality conditions and the relation between spacetime diffeomorphisms and gauge transformations in Ashtekar's complex formulation of general relativity. We produce a general theoretical framework for the stabilization algorithm for the reality conditions, which is different from Dirac's method of stabilization of constraints. We solve the problem of the projectability of the diffeomorphism transformations from configuration-velocity space to phase space, linking them to the reality conditions. We construct the complete set of canonical generators of the gauge group in the phase space which includes all the gauge variables. This result proves that the canonical formalism has all the gauge structure of the Lagrangian theory, including the time diffeomorphisms.
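
For contrast, Dirac's stabilization procedure, which the paper departs from, demands that each constraint $\phi_a \approx 0$ be preserved under time evolution generated by the total Hamiltonian $H_T$; in generic notation (not the paper's),

\[
\dot{\phi}_a \;=\; \{\phi_a, H_T\} \;\approx\; 0 ,
\]

a condition that either fixes Lagrange multipliers or produces secondary constraints. The reality conditions of the complex Ashtekar formulation are instead handled by the separate stabilization algorithm developed in the paper.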

Relevance: 90.00%

Abstract:

Polyploidization, which is expected to trigger major genomic reorganizations, occurs much less commonly in animals than in plants, possibly because of constraints imposed by sex-determination systems. We investigated the origins and consequences of allopolyploidization in Palearctic green toads (Bufo viridis subgroup) from Central Asia, with three ploidy levels and different modes of genome transmission (sexual versus clonal), to (i) establish a topology for the reticulate phylogeny in a species-rich radiation involving several closely related lineages and (ii) explore processes of genomic reorganization that may follow polyploidization. Sibship analyses based on 30 cross-amplifying microsatellite markers substantiated the maternal origins and revealed the paternal origins and relationships of subgenomes in allopolyploids. Analyses of the synteny of linkage groups identified three markers affected by translocation events, which occurred only within the paternally inherited subgenomes of allopolyploid toads and exclusively affected the linkage group that determines sex in several diploid species of the green toad radiation. Recombination rates did not differ between diploid and polyploid toad species, and were overall much reduced in males, independent of linkage group and ploidy levels. Clonally transmitted subgenomes in allotriploid toads provided support for strong genetic drift, presumably resulting from recombination arrest. The Palearctic green toad radiation seems to offer unique opportunities to investigate the consequences of polyploidization and clonal transmission on the dynamics of genomes in vertebrates.

Relevance: 90.00%

Abstract:

Over 70% of the total costs of an end product are consequences of decisions that are made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems. An appropriate algorithm, therefore, must be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program which reduce the modelling and optimisation time for structural design. The program needed an optimisation algorithm that is suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and the optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed. The programs combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and in the steel structures of flat and ridge roofs. This thesis demonstrates that the most time-consuming part of the work, modelling, is significantly reduced. Modelling errors are reduced and the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, was tested with optimisation cases on steel structures that include hundreds of constraints. It is seen that the tested algorithm can be used almost as a black box, without parameter settings and penalty factors for the constraints.
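
The abstract does not spell out the new selection rule, but a common way to remove constraint weight/penalty factors from an evolutionary algorithm is feasibility-based tournament selection (Deb's constraint-domination rules). The sketch below illustrates that general idea with invented candidate data; it is not the thesis's specific rule:

import random

def total_violation(constraints):
    # Sum of positive constraint values g_i(x) <= 0; zero means feasible.
    return sum(max(0.0, g) for g in constraints)

def better(a, b):
    # Feasibility-based comparison: no penalty weights needed.
    # Each candidate is a dict with 'objective' (minimized) and 'constraints'.
    va, vb = total_violation(a["constraints"]), total_violation(b["constraints"])
    if va == 0.0 and vb == 0.0:          # both feasible: compare objectives
        return a if a["objective"] <= b["objective"] else b
    if va == 0.0 or vb == 0.0:           # feasible candidate wins
        return a if va == 0.0 else b
    return a if va <= vb else b          # both infeasible: smaller violation wins

def tournament_select(population):
    # Binary tournament using the comparison rule above.
    return better(*random.sample(population, 2))

# Hypothetical candidates: mass to minimize, stress/deflection constraints g <= 0.
population = [
    {"objective": 120.0, "constraints": [-5.0, -0.1]},   # feasible
    {"objective": 100.0, "constraints": [ 2.0, -0.3]},   # infeasible
    {"objective": 140.0, "constraints": [-1.0, -2.0]},   # feasible but heavier
]
print(tournament_select(population))

In this scheme two feasible individuals are compared on objective value alone, a feasible individual always beats an infeasible one, and two infeasible individuals are ranked by total constraint violation, so no penalty weights are required.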

Relevance: 90.00%

Abstract:

This thesis describes an approach to overcoming the complexity of software product management (SPM) and consists of several studies that investigate the activities and roles in product management, as well as issues related to the adoption of software product management. The thesis focuses on organizations that have started the adoption of SPM but faced difficulties due to its complexity and fuzziness, and it suggests frameworks for overcoming these challenges using the principles of decomposition and iterative improvement. The research process consisted of three phases, each of which provided complementary results and empirical observations on the problem of overcoming the complexity of SPM. Overall, product management processes and practices in 13 companies were studied and analysed. Moreover, additional data were collected with a survey conducted worldwide. The collected data were analysed using grounded theory (GT) to identify possible ways to overcome the complexity of SPM. Complementary research methods, such as elements of the Theory of Constraints, were used for deeper data analysis. The results of the thesis indicate that the decomposition of SPM activities according to the specific characteristics of companies and roles is a useful approach for simplifying existing SPM frameworks. Companies would benefit from the results by adopting SPM activities more efficiently and effectively, and by spending fewer resources on adoption by concentrating on the most important SPM activities.

Relevance: 90.00%

Abstract:

The transformation from high level task specification to low level motion control is a fundamental issue in sensorimotor control in animals and robots. This thesis develops a control scheme called virtual model control which addresses this issue. Virtual model control is a motion control language which uses simulations of imagined mechanical components to create forces, which are applied through joint torques, thereby creating the illusion that the components are connected to the robot. Due to the intuitive nature of this technique, designing a virtual model controller requires the same skills as designing the mechanism itself. A high level control system can be cascaded with the low level virtual model controller to modulate the parameters of the virtual mechanisms. Discrete commands from the high level controller would then result in fluid motion. An extension of Gardner's Partitioned Actuator Set Control method is developed. This method allows for the specification of constraints on the generalized forces which each serial path of a parallel mechanism can apply. Virtual model control has been applied to a bipedal walking robot. A simple algorithm utilizing a simple set of virtual components has successfully compelled the robot to walk eight consecutive steps.
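
The central computation in virtual model control is mapping the force produced by a virtual component (for example, a virtual spring-damper pulling a point on the robot toward a desired position) into joint torques through the Jacobian transpose, tau = J(q)^T f. The planar two-link sketch below uses invented link lengths and gains to illustrate that mapping; it is not the thesis's controller:

import numpy as np

def jacobian_2link(q, l1=0.5, l2=0.4):
    # Jacobian of the end-point position of a planar 2-link arm w.r.t. joint angles.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1*s1 - l2*s12, -l2*s12],
                     [ l1*c1 + l2*c12,  l2*c12]])

def forward_kin(q, l1=0.5, l2=0.4):
    return np.array([l1*np.cos(q[0]) + l2*np.cos(q[0] + q[1]),
                     l1*np.sin(q[0]) + l2*np.sin(q[0] + q[1])])

def virtual_spring_damper_torque(q, dq, x_des, k=200.0, b=20.0):
    # Force of a virtual spring-damper attached to the end point, applied
    # through joint torques: tau = J(q)^T f.
    J = jacobian_2link(q)
    x = forward_kin(q)
    dx = J @ dq                          # end-point velocity
    f = k * (x_des - x) - b * dx         # virtual component force
    return J.T @ f

q, dq = np.array([0.3, 0.6]), np.array([0.0, 0.0])
print(virtual_spring_damper_torque(q, dq, x_des=np.array([0.6, 0.3])))

Cascading a higher-level controller then amounts to modulating parameters such as x_des, k and b over time rather than recomputing joint trajectories.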

Relevance: 90.00%

Abstract:

The formulation of four-dimensional variational data assimilation allows the incorporation of constraints into the cost function which need only be weakly satisfied. In this paper we investigate the value of imposing conservation properties as weak constraints. Using the example of the two-body problem of celestial mechanics, we compare weak constraints based on conservation laws with a constraint on the background state. We show how the imposition of conservation-based weak constraints changes the nature of the gradient equation. Assimilation experiments demonstrate how this can add extra information to the assimilation process, even when the underlying numerical model is conserving.
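
Schematically, and in generic notation rather than the paper's, a 4D-Var cost function augmented with a conservation-based weak constraint has the form

\[
J(x_0) = \tfrac{1}{2}(x_0 - x^b)^{\mathrm{T}} B^{-1}(x_0 - x^b)
+ \tfrac{1}{2}\sum_{i=0}^{N} \bigl(y_i - H_i(x_i)\bigr)^{\mathrm{T}} R_i^{-1} \bigl(y_i - H_i(x_i)\bigr)
+ \tfrac{\alpha}{2}\sum_{i=0}^{N} \bigl(c(x_i) - c(x_0)\bigr)^2 ,
\]

where $x^b$ is the background state, $H_i$ the observation operators, $c$ a conserved quantity such as the energy or angular momentum of the two-body system, and $\alpha$ a weight controlling how strongly conservation is imposed; it is this extra quadratic term that changes the gradient equation.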

Relevance: 90.00%

Abstract:

Adaptive methods which “equidistribute” a given positive weight function are now used fairly widely for selecting discrete meshes. The disadvantage of such schemes is that the resulting mesh may not be smoothly varying. In this paper a technique is developed for equidistributing a function subject to constraints on the ratios of adjacent steps in the mesh. Given a weight function $f \ge 0$ on an interval $[a,b]$ and constants $c$ and $K$, the method produces a mesh with points $x_0 = a$, $x_{j+1} = x_j + h_j$ for $j = 0, 1, \dots, n-1$, and $x_n = b$ such that
\[
\int_{x_j}^{x_{j+1}} f \le c \quad \text{and} \quad \frac{1}{K} \le \frac{h_{j+1}}{h_j} \le K \quad \text{for } j = 0, 1, \dots, n-1 .
\]
A theoretical analysis of the procedure is presented, and numerical algorithms for implementing the method are given. Examples show that the procedure is effective in practice. Other types of constraints on equidistributing meshes are also discussed. The principal application of the procedure is to the solution of boundary value problems, where the weight function is generally some error indicator, and accuracy and convergence properties may depend on the smoothness of the mesh. Other practical applications include the regrading of statistical data.
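
A minimal, unconstrained version of the equidistribution step (ignoring the adjacent-step ratio constraint, which is the paper's actual contribution) can be sketched as follows; the weight function and tolerance below are invented for the example:

import numpy as np

def equidistribute(f, a, b, c, n_sub=2000):
    # Place mesh points so that the integral of f over each step is at most c.
    # Plain equidistribution only; the ratio constraint 1/K <= h_{j+1}/h_j <= K
    # from the paper is NOT enforced here.
    t = np.linspace(a, b, n_sub)
    F = np.concatenate(([0.0], np.cumsum(0.5 * (f(t[1:]) + f(t[:-1])) * np.diff(t))))
    mesh, acc = [a], 0.0
    for i in range(1, n_sub):
        if F[i] - acc > c:          # adding t[i] would exceed the budget c
            mesh.append(t[i - 1])
            acc = F[i - 1]
    mesh.append(b)
    return np.array(mesh)

# Example: a weight that is large near x = 0 concentrates mesh points there.
f = lambda x: 1.0 + 50.0 * np.exp(-50.0 * x)
print(equidistribute(f, 0.0, 1.0, c=0.25))

Enforcing the step-ratio constraint on top of this, so that the mesh also varies smoothly, requires the redistribution analysed in the paper.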

Relevance: 90.00%

Abstract:

Using the eye-movement monitoring technique in two reading comprehension experiments, we investigated the timing of constraints on wh-dependencies (so-called ‘island’ constraints) in native and nonnative sentence processing. Our results show that both native and nonnative speakers of English are sensitive to extraction islands during processing, suggesting that memory storage limitations affect native and nonnative comprehenders in essentially the same way. Furthermore, our results show that the timing of island effects in native compared to nonnative sentence comprehension is affected differently by the type of cue (semantic fit versus filled gaps) signalling whether dependency formation is possible at a potential gap site. Whereas English native speakers showed immediate sensitivity to filled gaps but not to lack of semantic fit, proficient German-speaking learners of L2 English showed the opposite sensitivity pattern. This indicates that initial wh-dependency formation in nonnative processing is based on semantic feature-matching rather than being structurally mediated as in native comprehension.

Relevance: 90.00%

Abstract:

We report findings from psycholinguistic experiments investigating the detailed timing of processing morphologically complex words by proficient adult second (L2) language learners of English in comparison to adult native (L1) speakers of English. The first study employed the masked priming technique to investigate -ed forms with a group of advanced Arabic-speaking learners of English. The results replicate previously found L1/L2 differences in morphological priming, even though in the present experiment an extra temporal delay was offered after the presentation of the prime words. The second study examined the timing of constraints against inflected forms inside derived words in English using the eye-movement monitoring technique and an additional acceptability judgment task with highly advanced Dutch L2 learners of English in comparison to adult L1 English controls. Whilst offline the L2 learners performed native-like, the eye-movement data showed that their online processing was not affected by the morphological constraint against regular plurals inside derived words in the same way as in native speakers. Taken together, these findings indicate that L2 learners are not just slower than native speakers in processing morphologically complex words, but that the L2 comprehension system employs real-time grammatical analysis (in this case, morphological information) less than the L1 system.

Relevance: 90.00%

Abstract:

Computational formalisms have been pushing the boundaries of the field of computing for the last 80 years and much debate has surrounded what computing entails; what it is, and what it is not. This paper seeks to explore the boundaries of the ideas of computation and provide a framework for enabling a constructive discussion of computational ideas. First, a review of computing is given, ranging from Turing Machines to interactive computing. Then, a variety of natural physical systems are considered for their computational qualities. From this exploration, a framework is presented under which all dynamical systems can be considered as instances of the class of abstract computational platforms. An abstract computational platform is defined by both its intrinsic dynamics and how it allows computation that is meaningful to an external agent through the configuration of constraints upon those dynamics. It is asserted that a platform’s computational expressiveness is directly related to the freedom with which constraints can be placed. Finally, the requirements for a formal constraint description language are considered and it is proposed that Abstract State Machines may provide a reasonable basis for such a language.
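
Since the paper proposes Abstract State Machines as a possible basis for a constraint description language, a minimal illustration of the ASM execution model (guarded updates evaluated against the current state and fired simultaneously) may be helpful; this is a generic sketch, not the paper's formalism:

def asm_step(state, rules):
    # One ASM step: evaluate every guard on the *current* state,
    # collect the updates of the rules that fire, then apply them at once.
    updates = {}
    for guard, update in rules:
        if guard(state):
            updates.update(update(state))
    new_state = dict(state)
    new_state.update(updates)
    return new_state

# Hypothetical machine: a counter constrained to stop at a bound.
rules = [
    (lambda s: s["x"] < s["bound"], lambda s: {"x": s["x"] + 1}),
    (lambda s: s["x"] >= s["bound"], lambda s: {"halted": True}),
]
state = {"x": 0, "bound": 3, "halted": False}
while not state["halted"]:
    state = asm_step(state, rules)
print(state)   # {'x': 3, 'bound': 3, 'halted': True}

In the terms of the framework above, the guarded rules can be read as the kind of configured constraints that make a platform's intrinsic dynamics compute something meaningful to an external agent.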