890 results for Time Trade Off
Abstract:
John Warren and Chris Topping (2004). A trait-specific model of competition in a spatially structured plant community. Ecological Modelling, 180, pp. 477-485.
Abstract:
Coherent shared memory is a convenient, but inefficient, method of inter-process communication for parallel programs. By contrast, message passing can be less convenient, but more efficient. To get the benefits of both models, several non-coherent memory behaviors have recently been proposed in the literature. We present an implementation of Mermera, a shared memory system that supports both coherent and non-coherent behaviors in a manner that enables programmers to mix multiple behaviors in the same program [HS93]. A programmer can debug a Mermera program using coherent memory, and then improve its performance by selectively reducing the level of coherence in the parts that are critical to performance. Mermera permits a trade-off of coherence for performance. We analyze this trade-off through measurements of our implementation, and by an example that illustrates the style of programming needed to exploit non-coherence. We find that, even on a small network of workstations, the performance advantage of non-coherence is compelling. Raw non-coherent memory operations perform 20-40 times better than coherent memory operations. An example application program is shown to run 5-11 times faster when permitted to exploit non-coherence. We conclude by commenting on our use of the Isis Toolkit of multicast protocols in implementing Mermera.
Abstract:
We study properties of non-uniform reductions and related completeness notions. We strengthen several results of Hitchcock and Pavan and give a trade-off between the amount of advice needed for a reduction and its honesty on NEXP. We construct an oracle relative to which this trade-off is optimal. We show, in a more systematic study of non-uniform reductions, that among other things non-uniformity can be removed at the cost of more queries. In line with Post's program for complexity theory we connect such 'uniformization' properties to the separation of complexity classes.
Abstract:
The effectiveness of service provisioning in large-scale networks is highly dependent on the number and location of service facilities deployed at various hosts. The classical, centralized approach to determining the latter would amount to formulating and solving the uncapacitated k-median (UKM) problem (if the requested number of facilities is fixed), or the uncapacitated facility location (UFL) problem (if the number of facilities is also to be optimized). Clearly, such centralized approaches require knowledge of global topological and demand information, and thus do not scale and are not practical for large networks. The key question posed and answered in this paper is the following: "How can we determine in a distributed and scalable manner the number and location of service facilities?" We propose an innovative approach in which topology and demand information is limited to neighborhoods, or balls of small radius around selected facilities, whereas demand information is captured implicitly for the remaining (remote) clients outside these neighborhoods, by mapping them to clients on the edge of the neighborhood; the ball radius regulates the trade-off between scalability and performance. We develop a scalable, distributed approach that answers our key question through an iterative reoptimization of the location and the number of facilities within such balls. We show that even for small values of the radius (1 or 2), our distributed approach achieves performance under various synthetic and real Internet topologies that is comparable to that of optimal, centralized approaches requiring full topology and demand information.
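The iterative ball-based reoptimization described in this abstract can be sketched in miniature. The following is an illustrative toy, not the paper's algorithm: it keeps the number of facilities fixed (the UKM setting), evaluates the assignment cost globally for simplicity rather than mapping remote demand onto the ball's border, and all function names are hypothetical.

```python
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src in an unweighted graph (dict: node -> neighbors)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def total_cost(adj, facilities):
    """Sum over all clients of the distance to their nearest facility."""
    dists = [bfs_dist(adj, f) for f in facilities]
    return sum(min(d[u] for d in dists) for u in adj)

def rball_local_search(adj, facilities, r):
    """Repeatedly try to move each facility to a node within its radius-r
    ball whenever that lowers the assignment cost; stop at a local optimum."""
    facilities = list(facilities)
    improved = True
    while improved:
        improved = False
        for i, f in enumerate(facilities):
            ball = [u for u, d in bfs_dist(adj, f).items() if d <= r]
            best, best_cost = f, total_cost(adj, facilities)
            for cand in ball:
                trial = facilities[:i] + [cand] + facilities[i + 1:]
                c = total_cost(adj, trial)
                if c < best_cost:
                    best, best_cost = cand, c
            if best != f:
                facilities[i] = best
                improved = True
    return facilities

# A path graph 0-1-2-3-4-5-6: with r = 2, a single facility started at
# node 0 walks ball-by-ball to the 1-median of the path (node 3).
path = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 6] for i in range(7)}
print(rball_local_search(path, [0], r=2))  # -> [3]
```

Even though each move only inspects a small ball, repeated local moves let a facility migrate across the whole topology, which is the intuition behind using small radii (1 or 2) in the paper's distributed setting.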
Abstract:
In order to widely use Ge and III-V materials instead of Si in advanced CMOS technology, the process and integration of these materials has to be well established so that their high mobility benefit is not swamped by imperfect manufacturing procedures. In this dissertation, a number of key bottlenecks in the realization of Ge devices are investigated. We address the challenge of forming low resistivity contacts on n-type Ge, comparing conventional rapid thermal annealing (RTA) and advanced laser thermal annealing (LTA) techniques. LTA appears to be a feasible approach for realizing low resistivity contacts, with a remarkably sharp germanide-substrate interface and contact resistivity on the order of 10⁻⁷ Ω·cm². Furthermore, the influence of RTA and LTA on dopant activation and leakage current suppression in n+/p Ge junctions was compared. While providing a very high active carrier concentration (> 10²⁰ cm⁻³), LTA resulted in higher leakage current than RTA, which provided a lower carrier concentration (~10¹⁹ cm⁻³). This is an indication of a trade-off between high activation level and junction leakage current. A high ION/IOFF ratio of ~10⁷ was obtained, which to the best of our knowledge is the best value reported for n-type Ge so far. Simulations were carried out to investigate how target sputtering, dose retention, and damage formation are generated in thin-body semiconductors by energetic ion impacts, and how they depend on the physical properties of the target material. Solid phase epitaxy studies in wide and thin Ge fins confirmed the formation of twin boundary defects and random nucleation growth, as in Si, but here a 600 °C annealing temperature was found to be effective in reducing these defects. Finally, a non-destructive doping technique was successfully implemented to dope Ge nanowires, where nanowire resistivity was reduced by 5 orders of magnitude using a PH3-based in-diffusion process.
Abstract:
We investigated how queens share parentage (skew) in the Argentine ant, Linepithema humile, a social insect with multiple queens (polygyny). Overall, maternity of 546 male and female sexuals that mated successfully was determined with microsatellites in 26 colonies consisting of two queens and workers. The first main finding was that queens all contributed to sexual production. However, there was a significant departure from equal contribution to male and female sexual production in a notable proportion of colonies. Overall, reproductive skew for sexual (male and female) production was relatively low but higher than reproductive skew for egg production. The second interesting result was that there was a trade-off in the relative contribution of queens to male and female production. The queens contributing more to male production contributed significantly less to female sexual production. Finally, there was no significant association between colony productivity and the degree of reproductive skew. The relatively low reproductive skew is in line with predictions of the so-called concession models of reproductive skew because, in the Argentine ant, relatedness between queens is low and ecological constraints on dispersal nonexistent or weak. © 2001 The Association for the Study of Animal Behaviour.
Abstract:
Multiple functions of the beta2-adrenergic receptor (ADRB2) and angiotensin-converting enzyme (ACE) genes warrant studies of their associations with aging-related phenotypes. We focus on multimarker analyses and analyses of the effects of compound genotypes of two polymorphisms in the ADRB2 gene, rs1042713 and rs1042714, and 11 polymorphisms of the ACE gene, on the risk of an aging-associated phenotype, myocardial infarction (MI). We used the data from a genotyped sample of the Framingham Heart Study Offspring (FHSO) cohort (n = 1500) followed for about 36 years with six examinations. The ADRB2 rs1042714 (C-->G) polymorphism and two moderately correlated (r(2) = 0.77) ACE polymorphisms, rs4363 (A-->G) and rs12449782 (A-->G), were significantly associated with risks of MI in this aging cohort in multimarker models. Predominantly linked ACE genotypes exhibited opposite effects on MI risks, e.g., the AA (rs12449782) genotype had a detrimental effect, whereas the predominantly linked AA (rs4363) genotype exhibited a protective effect. This trade-off occurs as a result of the opposite effects of rare compound genotypes of the ACE polymorphisms with a single dose of the AG heterozygote. This genetic trade-off is further augmented by the selective modulating effect of the rs1042714 ADRB2 polymorphism. The associations were not altered by adjustment for common MI risk factors. The results suggest that effects of single specific genetic variants of the ADRB2 and ACE genes on MI can be readily altered by gene-gene and/or gene-environment interactions, especially in large heterogeneous samples. Multimarker genetic analyses should benefit studies of complex aging-associated phenotypes.
Abstract:
When solid material is removed in order to create flow channels in a load carrying structure, the strength of the structure decreases. On the other hand, a structure with channels is lighter and easier to transport as part of a vehicle. Here, we show that this trade-off can be used for benefit, to design a vascular mechanical structure. When the total amount of solid is fixed and the sizes, shapes, and positions of the channels can vary, it is possible to morph the flow architecture such that it endows the mechanical structure with maximum strength. The result is a multifunctional structure that offers not only mechanical strength but also new capabilities necessary for volumetric functionalities such as self-healing and self-cooling. We illustrate the generation of such designs for strength and fluid flow for several classes of vasculatures: parallel channels, trees with one, two, and three bifurcation levels. The flow regime in every channel is laminar and fully developed. In each case, we found that it is possible to select not only the channel dimensions but also their positions such that the entire structure offers more strength and less flow resistance when the total volume (or weight) and the total channel volume are fixed. We show that the minimized peak stress is smaller when the channel volume (φ) is smaller and the vasculature is more complex, i.e., with more levels of bifurcation. Diminishing returns are reached in both directions, decreasing φ and increasing complexity. For example, when φ=0.02 the minimized peak stress of a design with one bifurcation level is only 0.2% greater than the peak stress in the optimized vascular design with two levels of bifurcation. © 2010 American Institute of Physics.
Abstract:
In this paper, we propose a framework for robust optimization that relaxes the standard notion of robustness by allowing the decision maker to vary the protection level in a smooth way across the uncertainty set. We apply our approach to the problem of maximizing the expected value of a payoff function when the underlying distribution is ambiguous and therefore robustness is relevant. Our primary objective is to develop this framework and relate it to the standard notion of robustness, which deals with only a single guarantee across one uncertainty set. First, we show that our approach connects closely to the theory of convex risk measures. We show that the complexity of this approach is equivalent to that of solving a small number of standard robust problems. We then investigate the conservatism benefits and downside probability guarantees implied by this approach and compare to the standard robust approach. Finally, we illustrate the methodology on an asset allocation example consisting of historical market data over a 25-year investment horizon and find in every case we explore that relaxing standard robustness with soft robustness yields a seemingly favorable risk-return trade-off: each case results in a higher out-of-sample expected return for a relatively minor degradation of out-of-sample downside performance. © 2010 INFORMS.
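The building block the abstract relaxes can be illustrated numerically. Below is a toy sketch, under assumptions not taken from the paper: a discrete payoff distribution, an ambiguity set defined by total variation distance, and a greedy solution for the worst-case expectation. Sweeping the radius mimics, very schematically, how soft robustness trades nominal return against worst-case protection; all names are hypothetical.

```python
def worst_case_expectation(payoffs, probs, radius):
    """Worst-case expected payoff over all distributions within total
    variation distance `radius` of the nominal `probs`: adversarially
    shift up to `radius` probability mass from the best outcomes onto
    the single worst outcome."""
    worst = min(range(len(payoffs)), key=lambda i: payoffs[i])
    p = list(probs)
    budget = radius
    for i in sorted(range(len(payoffs)), key=lambda i: -payoffs[i]):
        if i == worst or budget <= 0:
            continue
        moved = min(p[i], budget)  # cannot remove more mass than is there
        p[i] -= moved
        p[worst] += moved
        budget -= moved
    return sum(pi * x for pi, x in zip(p, payoffs))

payoffs = [1.0, 2.0, 3.0]
nominal = [1 / 3, 1 / 3, 1 / 3]

# radius 0 recovers the nominal expectation; larger radii give the
# progressively more conservative guarantees of standard robustness.
# A soft-robust decision maker uses a family of such guarantees at once,
# one per nested ambiguity set, instead of a single radius.
for eps in (0.0, 1 / 3, 2 / 3):
    print(eps, worst_case_expectation(payoffs, nominal, eps))
```

The guarantee degrades smoothly as the radius grows (here from 2.0 down to 1.0), which is the risk-return dial the soft-robust framework exposes.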
Abstract:
Antigenically evolving pathogens such as influenza viruses are difficult to control owing to their ability to evade host immunity by producing immune escape variants. Experimental studies have repeatedly demonstrated that viral immune escape variants emerge more often from immunized hosts than from naive hosts. This empirical relationship between host immune status and within-host immune escape is not fully understood theoretically, nor has its impact on antigenic evolution at the population level been evaluated. Here, we show that this relationship can be understood as a trade-off between the probability that a new antigenic variant is produced and the level of viraemia it reaches within a host. Scaling up this intra-host level trade-off to a simple population level model, we obtain a distribution for variant persistence times that is consistent with influenza A/H3N2 antigenic variant data. At the within-host level, our results show that target cell limitation, or a functional equivalent, provides a parsimonious explanation for how host immune status drives the generation of immune escape mutants. At the population level, our analysis also offers an alternative explanation for the observed tempo of antigenic evolution, namely that the production rate of immune escape variants is driven by the accumulation of herd immunity. Overall, our results suggest that disease control strategies should be further assessed by considering the impact that increased immunity, through vaccination, has on the production of new antigenic variants.
Abstract:
In this Chapter we discuss the load-balancing issues arising in parallel mesh based computational mechanics codes for which the processor loading changes during the run. We briefly touch on geometric repartitioning ideas and then focus on different ways of using a graph to solve both the load-balancing problem and the optimisation problem, locally and globally. We also briefly discuss whether repartitioning is always valid. Sample illustrative results are presented, and we conclude that repartitioning is an attractive option if the load changes are not too dramatic, and that there is a certain trade-off between partition quality and the volume of data that the underlying application needs to migrate.
Abstract:
In this paper, we provide a unified approach to solving preemptive scheduling problems with uniform parallel machines and controllable processing times. We demonstrate that a single criterion problem of minimizing total compression cost subject to the constraint that all due dates should be met can be formulated in terms of maximizing a linear function over a generalized polymatroid. This justifies applicability of the greedy approach and allows us to develop fast algorithms for solving the problem with arbitrary release and due dates as well as its special case with zero release dates and a common due date. For the bicriteria counterpart of the latter problem we develop an efficient algorithm that constructs the trade-off curve for minimizing the compression cost and the makespan.
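The bicriteria trade-off curve mentioned above can be made concrete with a toy example. The sketch below is a drastically simplified, hypothetical single-machine analogue, not the paper's uniform-machine polymatroid algorithm: with one machine the makespan is just the sum of processing times, so each unit of compression shortens it by one, and buying the cheapest units first traces the optimal piecewise-linear curve of compression cost versus makespan.

```python
def compression_tradeoff_curve(jobs):
    """jobs: list of (processing_time, max_compression, unit_cost) tuples.
    Compress jobs in order of increasing unit cost and record the
    breakpoints (makespan, total_compression_cost) of the resulting
    piecewise-linear trade-off curve."""
    makespan = sum(p for p, _, _ in jobs)  # no compression yet
    cost = 0.0
    curve = [(makespan, cost)]
    for _, max_x, c in sorted(jobs, key=lambda j: j[2]):
        makespan -= max_x          # every compressed unit saves one time unit
        cost += c * max_x          # at that job's unit compression cost
        curve.append((makespan, cost))
    return curve

# Three jobs: (processing time, maximum compression, unit compression cost)
jobs = [(5, 2, 1.0), (4, 1, 3.0), (3, 1, 2.0)]
print(compression_tradeoff_curve(jobs))
# -> [(12, 0.0), (10, 2.0), (9, 4.0), (8, 7.0)]
```

The greedy order is optimal here because all compression units reduce the makespan equally, so a target makespan is reached most cheaply by the lowest-cost units; the paper's generalized polymatroid machinery is what justifies an analogous greedy step in the much harder uniform-machine preemptive setting.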
Abstract:
Here we describe a new trait-based model for cellular resource allocation that we use to investigate the relative importance of different drivers for small cell size in phytoplankton. Using the model, we show that increased investment in nonscalable structural components with decreasing cell size leads to a trade-off between cell size, nutrient and light affinity, and growth rate. Within the most extreme nutrient-limited, stratified environments, resource competition theory then predicts a trend toward larger minimum cell size with increasing depth. We demonstrate that this explains observed trends using a marine ecosystem model that represents selection and adaptation of a diverse community defined by traits for cell size and subcellular resource allocation. This framework for linking cellular physiology to environmental selection can be used to investigate the adaptive response of the marine microbial community to environmental conditions and the adaptive value of variations in cellular physiology.
Abstract:
Highlights
• We exposed meiofauna to 7 different large macrofauna species at high and low densities.
• Macrofauna presence altered nematode community structure and reduced their abundance.
• Macrofauna species had similar effects by reducing the few dominant nematode species.
• Meio–macrofauna resource competition and spatial segregation are the main drivers.
• Trawling effects on macrofauna affect nematode communities indirectly.
Diverse assemblages of infauna in sediments provide important physical and biogeochemical services, but are under increasing pressure by anthropogenic activities, such as benthic trawling. It is known that trawling disturbance has a substantial effect on the larger benthic fauna, with reductions in density and diversity, and changes in community structure, benthic biomass, production, and bioturbation and biogeochemical processes. Largely unknown, however, are the mechanisms by which the trawling impacts on the large benthic macro- and megafauna may influence the smaller meiofauna. To investigate this, a mesocosm experiment was conducted whereby benthic nematode communities from a non-trawled area were exposed to three different densities (absent, low, normal) of 7 large (> 10 mm) naturally co-occurring, bioturbating species which are potentially vulnerable to trawling disturbance. The results showed that total abundances of nematodes were lower if these large macrofauna species were present, but no clear nematode abundance effects could be assigned to the macrofauna density differences. Nematode community structure changed in response to macrofauna presence and density, mainly as a result of the reduced abundance of a few dominant nematode species. Any detectable effects seemed similar for nearly all macrofauna species treatments, supporting the idea that there may be a general indirect, macrofauna-mediated trawling impact on nematode communities.
Explanations for these results may be, firstly, competition for food resources, resulting in spatial segregation of the meio- and macrobenthic components, and secondly, that different densities of large macrofauna organisms may affect the nematode community structure through different intensities of bioturbatory disturbance or resource competition. These results suggest that the removal or reduced density of larger macrofauna species as a result of trawling disturbance may lead to increased nematode abundance. They also hint at the validity of interference competition between large macrofauna organisms and the smaller meiofauna, and of the energy equivalence hypothesis, in which a trade-off is observed between groups of organisms that depend on a common source of energy.
Mechanisms shaping size structure and functional diversity of phytoplankton communities in the ocean
Abstract:
The factors regulating phytoplankton community composition play a crucial role in structuring aquatic food webs. However, consensus is still lacking about the mechanisms underlying the observed biogeographical differences in cell size composition of phytoplankton communities. Here we use a trait-based model to disentangle these mechanisms in two contrasting regions of the Atlantic Ocean. In our model, the phytoplankton community can self-assemble based on a trade-off emerging from relationships between cell size and (1) nutrient uptake, (2) zooplankton grazing, and (3) phytoplankton sinking. Grazing 'pushes' the community towards larger cell sizes, whereas nutrient uptake and sinking 'pull' the community towards smaller cell sizes. We find that the stable environmental conditions of the tropics strongly balance these forces, leading to persistently small cell sizes and reduced size diversity. In contrast, the seasonality of the temperate region causes the community to regularly reorganize via shifts in species composition and to exhibit, on average, bigger cell sizes and higher size diversity than in the tropics. Our results highlight the importance of environmental variability as a key structuring mechanism of plankton communities in the ocean and call for a reassessment of the current understanding of phytoplankton diversity patterns across latitudinal gradients.