993 results for Globular clusters: general


Relevance: 20.00%

Abstract:

The population structure of an organism reflects its evolutionary history and influences its evolutionary trajectory. It constrains the combination of genetic diversity and reveals patterns of past gene flow. Understanding it is a prerequisite for detecting genomic regions under selection, predicting the effect of population disturbances, or modeling gene flow. This paper examines the detailed global population structure of Arabidopsis thaliana. Using a set of 5,707 plants collected from around the globe and genotyped at 149 SNPs, we show that while A. thaliana as a species self-fertilizes 97% of the time, there is considerable variation among local groups. This level of outcrossing greatly limits observed heterozygosity but is sufficient to generate considerable local haplotypic diversity. We also find that in its native Eurasian range A. thaliana exhibits continuous isolation by distance at every geographic scale without natural breaks corresponding to classical notions of populations. By contrast, in North America, where it exists as an exotic species, A. thaliana exhibits little or no population structure at a continental scale but local isolation by distance that extends hundreds of km. This suggests a pattern for the development of isolation by distance that can establish itself shortly after an organism fills a new habitat range. It also raises questions about the general applicability of many standard population genetics models. Any model based on discrete clusters of interchangeable individuals will be an uneasy fit to organisms like A. thaliana which exhibit continuous isolation by distance on many scales.
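
A common way to quantify isolation by distance of this kind is to correlate pairwise geographic and genetic distances (the basis of a Mantel test). A minimal sketch on entirely synthetic data; the coordinates, gradient model, and allele-sharing distance below are illustrative assumptions, not the paper's data or method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: coordinates (km) and 0/1 genotypes for 40 plants at
# 149 SNPs, simulated so nearby plants tend to be more similar.
n, n_snps = 40, 149
coords = rng.uniform(0, 1000, size=(n, 2))
grad = rng.uniform(-1, 1, size=(n_snps, 2)) / 1000.0   # spatial gradients
p = 1 / (1 + np.exp(-(coords @ grad.T)))               # (n, n_snps) allele freqs
geno = (rng.uniform(size=(n, n_snps)) < p).astype(float)

# Pairwise geographic distance and a simple allele-mismatch genetic distance.
geo = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
gen = np.mean(geno[:, None, :] != geno[None, :, :], axis=-1)

# Correlate the upper triangles: a positive correlation is the signature of
# isolation by distance (a Mantel test would add a permutation p-value).
iu = np.triu_indices(n, k=1)
r = np.corrcoef(geo[iu], gen[iu])[0, 1]
print(f"distance-genetic correlation r = {r:.2f}")
```
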

Relevance: 20.00%

Abstract:

PURPOSE: A projection onto convex sets reconstruction of multiplexed sensitivity encoded MRI (POCSMUSE) is developed to reduce motion-related artifacts, including respiration artifacts in abdominal imaging and aliasing artifacts in interleaved diffusion-weighted imaging. THEORY: Images with reduced artifacts are reconstructed with an iterative projection onto convex sets (POCS) procedure that uses the coil sensitivity profile as a constraint. This method can be applied to data obtained with different pulse sequences and k-space trajectories. In addition, various constraints can be incorporated to stabilize the reconstruction of ill-conditioned matrices. METHODS: The POCSMUSE technique was applied to abdominal fast spin-echo imaging data, and its effectiveness in respiratory-triggered scans was evaluated. The POCSMUSE method was also applied to reduce aliasing artifacts due to shot-to-shot phase variations in interleaved diffusion-weighted imaging data corresponding to different k-space trajectories and matrix condition numbers. RESULTS: Experimental results show that the POCSMUSE technique can effectively reduce motion-related artifacts in data obtained with different pulse sequences, k-space trajectories and contrasts. CONCLUSION: POCSMUSE is a general post-processing algorithm for reduction of motion-related artifacts. It is compatible with different pulse sequences, and can also be used to further reduce residual artifacts in data produced by existing motion artifact reduction methods.
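
The core POCS iteration can be illustrated on a toy 1-D problem. This is only a sketch of alternating convex projections, with a known support mask standing in for the coil-sensitivity constraint and regular undersampling for the k-space trajectory; it is not the POCSMUSE reconstruction itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two convex sets:
#   C1 = signals consistent with the acquired k-space samples
#   C2 = signals supported on a known mask (stand-in for a sensitivity/phase
#        constraint)
n = 128
support = np.zeros(n, dtype=bool)
support[40:60] = True
x_true = np.zeros(n)
x_true[40:60] = rng.normal(size=20)

k_full = np.fft.fft(x_true)
sampled = np.zeros(n, dtype=bool)
sampled[::2] = True                      # keep every other k-space sample

x = np.zeros(n)
for _ in range(200):
    # Projection onto C1: enforce the acquired k-space data.
    k = np.fft.fft(x)
    k[sampled] = k_full[sampled]
    x = np.fft.ifft(k).real
    # Projection onto C2: enforce the support constraint, which removes the
    # aliasing replica created by the 2x undersampling.
    x = x * support

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative error after POCS: {err:.3e}")
```

Because the aliasing replica of the 2x-undersampled signal falls outside the support mask, the alternating projections converge geometrically here; ill-conditioned cases are exactly where the extra constraints mentioned in the abstract matter.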

Relevance: 20.00%

Abstract:

Scheduling a set of jobs over a collection of machines to optimize a certain quality-of-service measure is one of the most important research topics in both computer science theory and practice. In this thesis, we design algorithms that optimize flow-time (or delay) of jobs for scheduling problems that arise in a wide range of applications. We consider the classical model of unrelated machine scheduling and resolve several long-standing open problems; we introduce new models that capture the novel algorithmic challenges of scheduling jobs in data centers or large clusters; we study the effect of selfish behavior in distributed and decentralized environments; and we design algorithms that strive to balance energy consumption and performance.

The technically interesting aspect of our work is the surprising connections we establish between approximation and online algorithms, economics, game theory, and queuing theory. It is the interplay of ideas from these different areas that lies at the heart of most of the algorithms presented in this thesis.

The main contributions of the thesis can be placed in one of the following categories.

1. Classical Unrelated Machine Scheduling: We give the first polylogarithmic approximation algorithms for minimizing the average flow-time and minimizing the maximum flow-time in the offline setting. In the online and non-clairvoyant setting, we design the first non-clairvoyant algorithm for minimizing the weighted flow-time in the resource augmentation model. Our work introduces the iterated rounding technique for offline flow-time optimization, and gives the first framework for analyzing non-clairvoyant algorithms for unrelated machines.

2. Polytope Scheduling Problem: To capture the multidimensional nature of the scheduling problems that arise in practice, we introduce the Polytope Scheduling Problem (PSP). The PSP generalizes almost all classical scheduling models, and also captures hitherto unstudied scheduling problems such as routing multi-commodity flows, routing multicast (video-on-demand) trees, and multi-dimensional resource allocation. We design several competitive algorithms for the PSP and its variants for the objectives of minimizing flow-time and completion time. Our work establishes many interesting connections between scheduling and market equilibrium concepts, fairness and non-clairvoyant scheduling, and the queuing-theoretic notion of stability and resource augmentation analysis.

3. Energy Efficient Scheduling: We give the first non-clairvoyant algorithm for minimizing the total flow-time + energy in the online and resource augmentation model for the most general setting of unrelated machines.

4. Selfish Scheduling: We study the effect of selfish behavior in scheduling and routing problems. We define a fairness index for scheduling policies called bounded stretch, and show that for the objective of minimizing the average (weighted) completion time, policies with small stretch lead to equilibrium outcomes with a small price of anarchy. Our work gives the first linear/convex programming duality based framework for bounding the price of anarchy for general equilibrium concepts such as coarse correlated equilibrium.
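
For concreteness, flow-time and the gap between scheduling policies can be illustrated on a single machine. This is a toy setting only; the thesis results above concern unrelated machines, where these problems are far harder:

```python
import heapq

# Flow-time of a job = completion time - release time.  On one machine,
# SRPT (shortest remaining processing time) minimizes total flow-time,
# while FIFO can be much worse when a long job arrives first.

def total_flow_time_srpt(jobs):
    """jobs: list of (release, size); total flow-time under preemptive SRPT."""
    jobs = sorted(jobs)                      # by release time
    t, i, total = 0, 0, 0
    pq = []                                  # heap of (remaining, release)
    while i < len(jobs) or pq:
        if not pq:
            t = max(t, jobs[i][0])           # idle until next arrival
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(pq, (jobs[i][1], jobs[i][0]))
            i += 1
        rem, rel = heapq.heappop(pq)
        nxt = jobs[i][0] if i < len(jobs) else float("inf")
        run = min(rem, nxt - t)              # run until completion or arrival
        t += run
        if run < rem:
            heapq.heappush(pq, (rem - run, rel))
        else:
            total += t - rel
    return total

def total_flow_time_fifo(jobs):
    jobs = sorted(jobs)
    t, total = 0, 0
    for rel, size in jobs:
        t = max(t, rel) + size
        total += t - rel
    return total

jobs = [(0, 10), (1, 1), (2, 1), (3, 1)]
# SRPT preempts the long job for the three short ones: total flow-time 16;
# FIFO makes every short job wait behind it: total flow-time 40.
print(total_flow_time_srpt(jobs), total_flow_time_fifo(jobs))
```
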

Relevance: 20.00%

Abstract:

Our media is saturated with claims of "facts" made from data. Database research has in the past focused on how to answer queries, but has not devoted much attention to discerning more subtle qualities of the resulting claims, e.g., is a claim "cherry-picking"? This paper proposes a Query Response Surface (QRS) based framework that models claims based on structured data as parameterized queries. A key insight is that we can learn a lot about a claim by perturbing its parameters and seeing how its conclusion changes. This framework lets us formulate and tackle practical fact-checking tasks, such as reverse-engineering vague claims and countering questionable claims, as computational problems. Within the QRS framework we go one step further and propose a problem, along with efficient algorithms, for finding high-quality claims of a given form from data, i.e., raising good questions in the first place. This is achieved by using a limited number of high-valued claims to represent high-valued regions of the QRS. Besides the general-purpose high-quality claim-finding problem, lead-finding can be tailored toward specific claim quality measures, also defined within the QRS framework. An example of uniqueness-based lead-finding is presented for "one-of-the-few" claims, resulting in interpretable high-quality claims and an adjustable mechanism for ranking objects, e.g., NBA players, based on what claims can be made for them. Finally, we study the use of visualization as a powerful way of conveying the results of a large number of claims. An efficient two-stage sampling algorithm is proposed for generating the input of a 2D scatter plot with heatmap, evaluating only a limited amount of data while preserving the two essential visual features, namely outliers and clusters. For all of these problems, we present real-world examples and experiments that demonstrate the power of our model, the efficiency of our algorithms, and the usefulness of their results.
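
The parameter-perturbation insight can be sketched in a few lines. The time series, claim form, threshold, and robustness score below are hypothetical illustrations, not the paper's QRS machinery:

```python
import statistics

# A claim such as "the average over the last k months was below 4.8" is a
# parameterized query q(k).  Perturbing k around its stated value and
# checking how often the conclusion survives gives a crude robustness
# score; a claim that holds only at its stated parameter looks
# cherry-picked.  (All numbers here are made up.)
series = [5.6, 5.4, 5.1, 4.9, 4.8, 5.2, 5.3, 5.5, 5.0, 4.7, 4.6, 4.4]

def claim_holds(k, threshold=4.8):
    """q(k): mean of the last k observations is below the threshold."""
    return statistics.mean(series[-k:]) < threshold

stated_k = 3
neighborhood = range(max(1, stated_k - 2), stated_k + 3)     # k in {1..5}
robustness = sum(claim_holds(k) for k in neighborhood) / len(neighborhood)
print(f"claim holds at k={stated_k}: {claim_holds(stated_k)}; "
      f"robustness over the neighborhood: {robustness:.2f}")
```

Here the claim holds at its stated parameter and at most nearby ones (robustness 0.80), so it is fairly robust; a score near 1/|neighborhood| would flag likely cherry-picking.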

Relevance: 20.00%

Abstract:

© 2014, Springer Science+Business Media Dordrecht. The burgeoning literature on global value chains (GVCs) has recast our understanding of how industrial clusters are shaped by their ties to the international economy, but within this context, the role played by corporate social responsibility (CSR) continues to evolve. New research in the past decade allows us to better understand how CSR is linked to industrial clusters and GVCs. With geographic production and trade patterns in many industries becoming concentrated in the global South, lead firms in GVCs have been under growing pressure to link economic and social upgrading in more integrated forms of CSR. This is leading to a confluence of “private governance” (corporate codes of conduct and monitoring), “social governance” (civil society pressure on business from labor organizations and non-governmental organizations), and “public governance” (government policies to support gains by labor groups and environmental activists). This new form of “synergistic governance” is illustrated with evidence from recent studies of GVCs and industrial clusters, as well as advances in theorizing about new patterns of governance in GVCs and clusters.

Relevance: 20.00%

Abstract:

This work was carried out with smallholder farmers in the municipalities of Bernardo de Irigoyen and San Antonio, Departamento General Manuel Belgrano, Misiones Province, Argentina. The research evaluates the sustainability of two production systems on smallholder farms in that department. The motivation for the study lies in the economic, environmental, and social dimensions of smallholder livelihoods under each of the production systems practiced in the region. The objective is to evaluate the sustainability of the tobacco-growing and diversified systems on smallholder farms, addressing economic, environmental, and social questions through the application of indicators and a quantitative analysis of the profitability of each system. To determine the study sample, 30 productive units of each system (tobacco and diversified) were chosen according to previously defined criteria, and 10 definitive units of each system were then selected at random. The results showed that the tobacco system had positive sustainability values in the economic dimension and negative values in the environmental and social dimensions, whereas the diversified system reached its highest sustainability levels in the environmental and social dimensions. Moreover, considering ten years of production data, the diversified system proved more profitable than the tobacco system. Finally, the results showed that labor and inputs are the main production costs in both systems, and are even higher in the tobacco system, strongly affecting its profitability.
It is concluded that the tobacco system's profit margin shrinks over the years and, if this trend continues, within a few years it would no longer be profitable for the producer levels analyzed, the diversified system being more sustainable and profitable than the tobacco one.

Relevance: 20.00%

Abstract:

In its July 1989 issue (Volume 340), the leading British journal Nature published interesting results from a survey conducted simultaneously in the United States and in Britain to gauge the lay person's conception of science and its methods, as well as their interest in science and their level of knowledge about some of its achievements. The survey drew on random samples of about two thousand Americans and as many Britons, all over 18 years of age.

Relevance: 20.00%

Abstract:

Too often, validation of computer models is considered a "once and forget" task. In this paper a systematic and graduated approach to evacuation model validation is suggested.

Relevance: 20.00%

Abstract:

The key problems in discussing stochastic monotonicity and duality for continuous-time Markov chains are to give criteria for existence and uniqueness and to construct the associated monotone processes in terms of their infinitesimal q-matrices. In their recent paper, Chen and Zhang [6] discussed these problems under the condition that the given q-matrix Q is conservative. The aim of this paper is to generalize their results to a more general case, i.e., the given q-matrix Q is not necessarily conservative. New problems arise in removing the conservative assumption. The existence and uniqueness criteria for this general case are given in this paper. Another important problem, the construction of all stochastically monotone Q-processes, is also considered.
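
For orientation, one standard textbook form of the monotonicity condition and of Siegmund duality in the conservative case (the case this paper relaxes) reads as follows; the paper's precise conditions in the non-conservative setting differ:

```latex
% A conservative q-matrix Q = (q_{ij}) on E = \{0, 1, 2, \dots\} is
% stochastically monotone when, for all i < j and every level l with
% l \le i or l > j,
\sum_{k \ge l} q_{ik} \;\le\; \sum_{k \ge l} q_{jk} .
% Siegmund duality then links a monotone process X with its dual \hat{X} via
\mathbb{P}_i\bigl(X_t \le j\bigr) \;=\; \mathbb{P}_j\bigl(\hat{X}_t \ge i\bigr),
\qquad i, j \in E,\; t \ge 0 .
```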

Relevance: 20.00%

Abstract:

A comprehensive solution of solidification/melting processes requires the simultaneous representation of free surface fluid flow, heat transfer, phase change, nonlinear solid mechanics and, possibly, electromagnetics together with their interactions, in what is now known as multiphysics simulation. Such simulations are computationally intensive and the implementation of solution strategies for multiphysics calculations must embed their effective parallelization. For some years, together with our collaborators, we have been involved in the development of numerical software tools for multiphysics modeling on parallel cluster systems. This research has involved a combination of algorithmic procedures, parallel strategies and tools, plus the design of a computational modeling software environment and its deployment in a range of real world applications. One output from this research is the three-dimensional parallel multiphysics code, PHYSICA. In this paper we report on an assessment of its parallel scalability on a range of increasingly complex models drawn from actual industrial problems, on three contemporary parallel cluster systems.
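
Scalability assessments of this kind typically report speedup and parallel efficiency. A small bookkeeping sketch with hypothetical timings (not PHYSICA measurements), including a crude Amdahl-style serial-fraction estimate:

```python
# Speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p are the usual
# figures of merit; Amdahl's law T(p) = T(1) * (s + (1 - s)/p) gives a
# rough serial-fraction estimate s from each measurement.
timings = {1: 100.0, 2: 52.0, 4: 28.0, 8: 16.0, 16: 10.0}   # seconds (made up)

t1 = timings[1]
for p, tp in sorted(timings.items()):
    speedup = t1 / tp
    efficiency = speedup / p
    # Solve T(p)/T(1) = s + (1 - s)/p for the serial fraction s (p > 1).
    serial = (tp / t1 - 1.0 / p) / (1.0 - 1.0 / p) if p > 1 else 0.0
    print(f"p={p:2d}  speedup={speedup:5.2f}  "
          f"efficiency={efficiency:4.2f}  serial fraction ~ {serial:.3f}")
```

Falling efficiency at high core counts (here 0.63 at p=16) is the typical signature of communication overhead and residual serial work in multiphysics solvers.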

Relevance: 20.00%

Abstract:

We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
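
Reuter's result referred to above is usually stated as a uniqueness criterion; a standard formulation for the conservative case (the paper works with more general transition structures) is:

```latex
% For a conservative q-matrix Q on a countable state space E, the minimal
% Q-process is the unique Q-process if and only if, for some (equivalently,
% for every) \lambda > 0, the only solution of
\sum_{k \in E} q_{ik} x_k \;=\; \lambda x_i ,
\qquad 0 \le x_i \le 1, \quad i \in E,
% is the trivial one, x \equiv 0.
```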

Relevance: 20.00%

Abstract:

We extend the Harris regularity condition for the ordinary Markov branching process to the more general case of the non-linear Markov branching process. A regularity criterion that is very easy to check is obtained. In particular, we prove that a super-linear Markov branching process is regular if and only if the per capita offspring mean is less than or equal to 1, while a sub-linear Markov branching process is regular if the per capita offspring mean is finite. The Harris regularity condition then becomes a special case of our criterion.
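
The criterion stated above is easy to apply numerically. A sketch with a hypothetical offspring distribution; the distribution and helper function are illustrative, not from the paper:

```python
# Per capita offspring mean m = sum_k k * p_k.  By the criterion above, a
# super-linear Markov branching process is regular iff m <= 1, while a
# sub-linear one is regular whenever m is finite.
offspring = {0: 0.5, 1: 0.3, 2: 0.2}          # hypothetical p_k, sums to 1
m = sum(k * p for k, p in offspring.items())  # m = 0.7

def regular(mean, kind):
    """Apply the regularity criterion for the given branching regime."""
    if kind == "super-linear":
        return mean <= 1
    if kind == "sub-linear":
        return mean < float("inf")
    raise ValueError(kind)

print(f"m = {m:.2f}; super-linear regular: {regular(m, 'super-linear')}; "
      f"sub-linear regular: {regular(m, 'sub-linear')}")
```
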

Relevance: 20.00%

Abstract:

This paper introduces a characterization of the so-called most general temporal constraint (GTC), which guarantees the common-sense assertion that "the beginning of the effect cannot precede the beginning of the cause". The formalism is based on a general time theory which takes both points and intervals as primitive. It is shown that there are in fact 8 possible causal relationships which satisfy GTC, including cases where, on the one hand, effects start simultaneously with, during, immediately after, or some time after their causes, and, on the other hand, effects end before, simultaneously with, or after their causes. These causal relationships are versatile enough to subsume the representative ones in the literature.
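
The count of 8 relationships can be checked by brute force. This sketch enumerates qualitative start/end relations over sample integer time points; it is a simplification of the paper's point-and-interval formalism, with relation names chosen here for illustration:

```python
from itertools import product

# Cause interval (cs, ce), effect interval (es, ee).  GTC requires the
# effect's start not to precede the cause's start: es >= cs.
START = {"simultaneous":      lambda cs, ce, es: es == cs,
         "during":            lambda cs, ce, es: cs < es < ce,
         "immediately-after": lambda cs, ce, es: es == ce,
         "after":             lambda cs, ce, es: es > ce}
END = {"before":       lambda ce, ee: ee < ce,
       "simultaneous": lambda ce, ee: ee == ce,
       "after":        lambda ce, ee: ee > ce}

cs, ce = 0, 10
feasible = set()
for es, ee in product(range(0, 16), repeat=2):
    if not (es >= cs and ee > es):           # GTC plus a proper interval
        continue
    for (sname, sfun), (ename, efun) in product(START.items(), END.items()):
        if sfun(cs, ce, es) and efun(ce, ee):
            feasible.add((sname, ename))
print(len(feasible), sorted(feasible))       # 8 feasible combinations
```

Of the 4 x 3 nominal start/end combinations, four are ruled out because an effect starting at or after the cause's end cannot finish before it, leaving exactly the 8 relationships the paper identifies.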

Relevance: 20.00%

Abstract:

In advanced non-small cell lung cancer (NSCLC), platinum-based chemotherapy with second-generation drugs improves median survival (MS) to 8 months, with survival of 29% and 10% at 1 and 2 years. Platinum with a third-generation drug can improve survival further (BMJ 1995;311:899; Spiro et al. Thorax 2004;59:828, Big Lung Trial; N Engl J Med 2003;346:92, ECOG study). NICE now recommends chemotherapy with platinum and a third-generation drug as the first treatment modality for inoperable NSCLC. Methods: We audited survival of 176/461 consecutive patients referred for at least 3 courses of platinum with either gemcitabine or vinorelbine from July 2001 to December 2005. Minimum follow-up was 17 months. Chemotherapy was given on site. Radical radiotherapy for stage IIIA, palliative radiotherapy, and second-line drugs were given as felt appropriate. Results: 64% were male. 30 (17%) were <55 years; 66 (37.5%) were 55–65 years; 63 (35.8%) were 66–75 years; and 16 (9.1%) were >75 years. 5 (2.8%) were stage II; 46 (26%) stage IIIA; 68 (38%) stage IIIB; and 55 (30.8%) stage IV. 68 (38%) received 0–2 courses; 63 (36%) 3 courses; and 44 (25%) 4 or more.